WorldWideScience

Sample records for proportional hazard model

  1. Proportional hazards models of infrastructure system recovery

    International Nuclear Information System (INIS)

    Barker, Kash; Baroud, Hiba

    2014-01-01

    As emphasis is being placed on a system's ability to withstand and to recover from a disruptive event, collectively referred to as dynamic resilience, there exists a need to quantify a system's ability to bounce back after a disruptive event. This work applies a statistical technique from biostatistics, the proportional hazards model, to describe (i) the instantaneous rate of recovery of an infrastructure system and (ii) the likelihood that recovery occurs prior to a given point in time. A major benefit of the proportional hazards model is its ability to describe a recovery event as a function of time as well as covariates describing the infrastructure system or disruptive event, among others, which can also vary with time. The proportional hazards approach is illustrated with a publicly available electric power outage data set.
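    The model in this record is the standard Cox form h(t|x) = h0(t)exp(x'β), applied so that the "event" is restoration of service rather than failure. A minimal sketch of such a fit, assuming Python's lifelines package and synthetic data with hypothetical covariates (the actual outage data set is not reproduced here):

    ```python
    # Minimal sketch: Cox PH fit to synthetic system-recovery data.
    # All covariate names are hypothetical stand-ins for the outage data.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(42)
    n = 500
    storm = rng.integers(0, 2, n)            # 1 = storm-caused outage
    crews = rng.uniform(1, 10, n)            # repair crews deployed
    lin = 0.15 * crews - 0.8 * storm         # log recovery-hazard contribution
    # Weibull(1.5) baseline recovery times, scaled by exp(-lin / shape)
    t = 24 * rng.weibull(1.5, n) * np.exp(-lin / 1.5)
    recovered = t < 72                       # administrative cut-off at 72 h
    df = pd.DataFrame({
        "duration": np.minimum(t, 72.0),
        "recovered": recovered.astype(int),
        "storm": storm,
        "crews": crews,
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="duration", event_col="recovered")
    cph.print_summary()  # exp(coef) > 1 => covariate speeds up recovery
    ```

    Because the modeled event is recovery, a hazard ratio exp(coef) above 1 marks a covariate associated with faster restoration.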

  2. Functional form diagnostics for Cox's proportional hazards model.

    Science.gov (United States)

    León, Larry F; Tsai, Chih-Ling

    2004-03-01

    We propose a new type of residual and an easily computed functional form test for the Cox proportional hazards model. The proposed test is a modification of the omnibus test for testing the overall fit of a parametric regression model, developed by Stute, González Manteiga, and Presedo Quindimil (1998, Journal of the American Statistical Association 93, 141-149), and is based on what we call censoring consistent residuals. In addition, we develop residual plots that can be used to identify the correct functional forms of covariates. We compare our test with the functional form test of Lin, Wei, and Ying (1993, Biometrika 80, 557-572) in a simulation study. The practical application of the proposed residuals and functional form test is illustrated using both a simulated data set and a real data set.

  3. Measures to assess the prognostic ability of the stratified Cox proportional hazards model

    DEFF Research Database (Denmark)

    The Fibrinogen Studies Collaboration (The Copenhagen City Heart Study); Tybjærg-Hansen, Anne

    2009-01-01

    Many measures have been proposed to summarize the prognostic ability of the Cox proportional hazards (CPH) survival model, although none is universally accepted for general use. By contrast, little work has been done to summarize the prognostic ability of the stratified CPH model; such measures...

  4. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    Science.gov (United States)

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.

  5. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
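    As a sketch of the kind of assessment studied above, the scaled-Schoenfeld-style test in lifelines can be run on a fitted Cox model; here the synthetic groups are given different Weibull shapes, a deliberate violation of proportional hazards. The package and test are assumptions of this illustration, not tools used in the paper:

    ```python
    # Minimal sketch: detecting a PH violation with a model-based test.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.statistics import proportional_hazard_test

    rng = np.random.default_rng(1)
    n = 1000
    x = rng.integers(0, 2, n)
    # Different Weibull shapes per group => hazards are not proportional
    t = np.where(x == 1, rng.weibull(1.5, n), rng.weibull(0.8, n))
    e = t < 2.0                               # administrative censoring
    df = pd.DataFrame({"T": np.minimum(t, 2.0), "E": e.astype(int), "x": x})

    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    res = proportional_hazard_test(cph, df, time_transform="rank")
    res.print_summary()    # a small p-value flags the PH violation
    ```

    lifelines also exposes a check_assumptions helper that wraps this test with diagnostic output.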

  6. Cox proportional hazards models have more statistical power than logistic regression models in cross-sectional genetic association studies

    NARCIS (Netherlands)

    van der Net, Jeroen B.; Janssens, A. Cecile J. W.; Eijkemans, Marinus J. C.; Kastelein, John J. P.; Sijbrands, Eric J. G.; Steyerberg, Ewout W.

    2008-01-01

    Cross-sectional genetic association studies can be analyzed using Cox proportional hazards models with age as time scale, if age at onset of disease is known for the cases and age at data collection is known for the controls. We assessed to what degree and under what conditions Cox proportional

  7. Proportional hazards model with varying coefficients for length-biased data.

    Science.gov (United States)

    Zhang, Feipeng; Chen, Xuerong; Zhou, Yong

    2014-01-01

    Length-biased data arise in many important applications including epidemiological cohort studies, cancer prevention trials and studies of labor economics. Such data are also often subject to right censoring due to loss to follow-up or the end of study. In this paper, we consider a proportional hazards model with varying coefficients for right-censored and length-biased data, which is used to study nonlinear interaction effects between covariates and an exposure variable. A local estimating equation method is proposed for the unknown coefficients and the intercept function in the model. The asymptotic properties of the proposed estimators are established using martingale theory and kernel smoothing techniques. Our simulation studies demonstrate that the proposed estimators have excellent finite-sample performance. The Channing House data are analyzed to demonstrate the applications of the proposed method.

  8. A Proportional Hazards Regression Model for the Subdistribution with Covariates-adjusted Censoring Weight for Competing Risks Data

    DEFF Research Database (Denmark)

    He, Peng; Eriksson, Frank; Scheike, Thomas H.

    2016-01-01

    With competing risks data, one often needs to assess the treatment and covariate effects on the cumulative incidence function. Fine and Gray proposed a proportional hazards regression model for the subdistribution of a competing risk with the assumption that the censoring distribution and the covariates are independent. Covariate-dependent censoring sometimes occurs in medical studies. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with proper adjustments for covariate-dependent censoring. We consider a covariate-adjusted weight function by fitting the Cox model for the censoring distribution and using the predictive probability for each individual. Our simulation study shows that the covariate-adjusted weight estimator is basically unbiased when the censoring time depends on the covariates, and the covariate-adjusted weight...

  9. Optimization of maintenance policy using the proportional hazard model

    Energy Technology Data Exchange (ETDEWEB)

    Samrout, M. [Information Sciences and Technologies Institute, University of Technology of Troyes, 10000 Troyes (France)], E-mail: mohamad.el_samrout@utt.fr; Chatelet, E. [Information Sciences and Technologies Institute, University of Technology of Troyes, 10000 Troyes (France)], E-mail: chatelt@utt.fr; Kouta, R. [M3M Laboratory, University of Technology of Belfort Montbeliard (France); Chebbo, N. [Industrial Systems Laboratory, IUT, Lebanese University (Lebanon)

    2009-01-15

    The evolution of system reliability depends on its structure as well as on the evolution of its components' reliability. The latter is a function of component age during a system's operating life. Component aging is strongly affected by maintenance activities performed on the system. In this work, we consider two categories of maintenance activities: corrective maintenance (CM) and preventive maintenance (PM). Maintenance actions are characterized by their ability to reduce this age. PM consists of actions applied to components while they are operating, whereas CM actions occur when the component breaks down. In this paper, we expound a new method to integrate the effect of CM when planning the PM policy. The proportional hazard function was used as a modeling tool for that purpose. Interesting results were obtained when comparing policies that take the CM effect into consideration with those that do not.

  10. The comparison of proportional hazards and accelerated failure time models in analyzing the first birth interval survival data

    Science.gov (United States)

    Faruk, Alfensi

    2018-03-01

    Survival analysis is a branch of statistics focused on the analysis of time-to-event data. In multivariate survival analysis, the proportional hazards (PH) model is the most popular model for analyzing the effects of several covariates on the survival time. However, the assumption of constant hazard ratios in the PH model is not always satisfied by the data. Violation of the PH assumption leads to misinterpretation of the estimation results and decreases the power of the related statistical tests. The accelerated failure time (AFT) models, on the other hand, do not make this assumption and can be used as an alternative to the PH model when it is violated. The objective of this research was to compare the performance of the PH model and the AFT models in analyzing the significant factors affecting the first birth interval (FBI) data in Indonesia. In this work, the discussion was limited to three AFT models, based on the Weibull, exponential, and log-normal distributions. Analysis using a graphical approach and a statistical test showed that non-proportional hazards exist in the FBI data set. Based on the Akaike information criterion (AIC), the log-normal AFT model was the most appropriate among the considered models. Results of the best fitted model (the log-normal AFT model) showed that covariates such as women's educational level, husband's educational level, contraceptive knowledge, access to mass media, wealth index, and employment status were among the factors affecting the FBI in Indonesia.
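    A minimal sketch of the AIC-based model comparison described above, assuming lifelines; the exponential AFT model (the fixed-shape special case of the Weibull) is omitted, and the FBI covariates are replaced by one hypothetical predictor:

    ```python
    # Minimal sketch: choosing between AFT families by AIC.
    import numpy as np
    import pandas as pd
    from lifelines import WeibullAFTFitter, LogNormalAFTFitter

    rng = np.random.default_rng(7)
    n = 800
    educ = rng.integers(0, 3, n)                  # hypothetical covariate
    t = rng.lognormal(mean=2.0 + 0.3 * educ, sigma=0.6, size=n)
    e = t < 30
    df = pd.DataFrame({"T": np.minimum(t, 30.0),
                       "E": e.astype(int), "educ": educ})

    candidates = {"weibull": WeibullAFTFitter(),
                  "lognormal": LogNormalAFTFitter()}
    for name, model in candidates.items():
        model.fit(df, duration_col="T", event_col="E")
        print(name, "AIC:", round(model.AIC_, 1))  # lowest AIC is preferred
    ```

    With log-normal data generation as above, the log-normal fitter should win the AIC comparison, mirroring the paper's finding for the FBI data.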

  11. ADVANCES IN RENEWAL DECISION-MAKING UTILISING THE PROPORTIONAL HAZARDS MODEL WITH VIBRATION COVARIATES

    Directory of Open Access Journals (Sweden)

    Pieter-Jan Vlok

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Increased competitiveness in the production world necessitates improved maintenance strategies to increase availabilities and drive down cost. The maintenance engineer is thus faced with the need to make more intelligent preventive renewal decisions. Two of the main techniques to achieve this are Condition Monitoring (such as vibration monitoring and oil analysis) and Statistical Failure Analysis (typically using probabilistic techniques). The present paper discusses these techniques, their uses and weaknesses, and then presents the Proportional Hazards Model as a solution to most of these weaknesses. It then goes on to compare the results of the different techniques in monetary terms, using a South African case study. This comparison shows clearly that the Proportional Hazards Model is superior to the present techniques and should be the preferred model for many actual maintenance situations.

    AFRIKAANSE OPSOMMING (translated): Increased levels of competition in the production environment necessitate improved maintenance strategies to raise equipment availability and minimise cost. Maintenance engineers must consequently make more intelligent preventive renewal decisions. Two prominent techniques for achieving this goal are condition monitoring (such as vibration monitoring or oil analysis) and statistical failure analysis (usually by means of probabilistic methods). In this article we consider both techniques, their uses and shortcomings, and then propose the Proportional Hazards Model as a solution to most of those shortcomings. The article also compares the different techniques in monetary terms using a South African case study. This comparison clearly shows that the Proportional Hazards Model holds greater promise than the current techniques, and that it should be the preferred solution in many real maintenance situations.

  12. Limitations of Cox Proportional Hazards Analysis in Mortality Prediction of Patients with Acute Coronary Syndrome

    Directory of Open Access Journals (Sweden)

    Babińska Magdalena

    2015-12-01

    Full Text Available The aim of this study was to evaluate the possibility of incorrect assessment of mortality risk factors in a group of patients affected by acute coronary syndrome, due to the lack of hazard proportionality in the Cox regression model. One hundred and fifty consecutive patients with acute coronary syndrome (ACS) and no age limit were enrolled. Univariable and multivariable Cox proportional hazards analyses were performed. The proportional hazards assumptions were verified using Schoenfeld residuals, the χ2 test, and the rank correlation coefficient t between residuals and time. In the total group of 150 patients, 33 (22.0%) deaths from any cause were registered in the follow-up time period of 64 months. The non-survivors were significantly older and had increased prevalence of diabetes and erythrocyturia, longer history of coronary artery disease, higher concentrations of serum creatinine, cystatin C, uric acid, glucose, C-reactive protein (CRP), homocysteine and N-terminal pro-B-type natriuretic peptide (NT-proBNP), and lower concentrations of serum sodium. No significant differences in echocardiography parameters were observed between groups. The following factors were risk factors for death and fulfilled the proportional hazards assumption in the univariable model: smoking, occurrence of diabetes and anaemia, duration of coronary artery disease, and abnormal serum concentrations of uric acid, sodium, homocysteine, cystatin C and NT-proBNP; in the multivariable model, the risk factors for death were smoking and elevated concentrations of homocysteine and NT-proBNP. The study has demonstrated that violation of the proportional hazards assumption in the Cox regression model may lead to a false model that does not include only time-independent predictive factors.

  13. Data-driven smooth tests of the proportional hazards assumption

    Czech Academy of Sciences Publication Activity Database

    Kraus, David

    2007-01-01

    Roč. 13, č. 1 (2007), s. 1-16 ISSN 1380-7870 R&D Projects: GA AV ČR(CZ) IAA101120604; GA ČR(CZ) GD201/05/H007 Institutional research plan: CEZ:AV0Z10750506 Keywords : Cox model * Neyman's smooth test * proportional hazards assumption * Schwarz's selection rule Subject RIV: BA - General Mathematics Impact factor: 0.491, year: 2007

  14. A flexible additive multiplicative hazard model

    DEFF Research Database (Denmark)

    Martinussen, T.; Scheike, T. H.

    2002-01-01

    Aalen's additive model; Counting process; Cox regression; Hazard model; Proportional excess hazard model; Time-varying effect

  15. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    Science.gov (United States)

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.
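    For intuition on interval-censored likelihoods, here is a minimal parametric sketch in plain numpy/scipy: each subject contributes S(L|x) − S(R|x) under a Weibull proportional hazards model. This is a simple fully parametric stand-in, not the paper's estimators (which avoid the baseline cumulative hazard altogether):

    ```python
    # Minimal sketch: Weibull PH likelihood with interval-censored data.
    # Each subject contributes S(L|x) - S(R|x); R = inf means right-censored.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    n = 600
    x = rng.normal(size=n)
    true_k, true_lam, true_beta = 1.4, 5.0, 0.6
    u = rng.uniform(size=n)
    t = true_lam * (-np.log(u) / np.exp(true_beta * x)) ** (1 / true_k)
    visits = np.arange(0.0, 20.0, 2.0)       # inspection every 2 time units
    L = np.array([visits[visits < ti].max(initial=0.0) for ti in t])
    R = np.array([visits[visits >= ti].min(initial=np.inf) for ti in t])

    def surv(t_, k, lam, beta):
        """S(t | x) = exp(-(t/lam)^k * exp(beta*x)) for the Weibull PH model."""
        return np.exp(-((t_ / lam) ** k) * np.exp(beta * x))

    def negloglik(theta):
        k, lam, beta = np.exp(theta[0]), np.exp(theta[1]), theta[2]
        s_left = np.where(L > 0, surv(np.maximum(L, 1e-12), k, lam, beta), 1.0)
        s_right = np.where(np.isfinite(R),
                           surv(np.minimum(R, 1e12), k, lam, beta), 0.0)
        return -np.log(np.clip(s_left - s_right, 1e-300, None)).sum()

    fit = minimize(negloglik, x0=np.zeros(3), method="Nelder-Mead")
    k_hat, lam_hat, beta_hat = np.exp(fit.x[0]), np.exp(fit.x[1]), fit.x[2]
    print(f"k = {k_hat:.2f}, lam = {lam_hat:.2f}, beta = {beta_hat:.2f}")
    ```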

  16. The Application of Extended Cox Proportional Hazard Method for Estimating Survival Time of Breast Cancer

    Science.gov (United States)

    Husain, Hartina; Astuti Thamrin, Sri; Tahir, Sulaiha; Mukhlisin, Ahmad; Mirna Apriani, M.

    2018-03-01

    Breast cancer is one of the leading causes of cancer death worldwide. This study aims to model the factors that affect the survival time and rate of cure of breast cancer patients. The extended Cox model, a modification of the Cox proportional hazards model for situations in which the proportional hazards assumption is not met, is used in this study. The maximum likelihood estimation approach is used to estimate the parameters of the model. This method is then applied to medical record data of breast cancer patients in 2011-2016, taken from Hasanuddin University Education Hospital. The results obtained indicate that the factors that affect the survival time of breast cancer patients are malignancy and leukocyte levels.

  17. Novel Harmonic Regularization Approach for Variable Selection in Cox’s Proportional Hazards Model

    Directory of Open Access Journals (Sweden)

    Ge-Jin Chu

    2014-01-01

    Full Text Available Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex penalties Lq (1/2 < q < 1), for variable selection in Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on the artificial datasets and four real microarray gene expression datasets, such as the real diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods.

  18. Further Results on Dynamic Additive Hazard Rate Model

    Directory of Open Access Journals (Sweden)

    Zhengcheng Zhang

    2014-01-01

    Full Text Available In the past, the proportional and additive hazard rate models have been investigated in the literature. Nanda and Das (2011) introduced and studied the dynamic proportional (reversed) hazard rate model. In this paper we study the dynamic additive hazard rate model, and investigate its aging properties for different aging classes. The closure of the model under some stochastic orders has also been investigated. Some examples are also given to illustrate different aging properties and stochastic comparisons of the model.

  19. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

    Science.gov (United States)

    Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

    2016-01-01

    Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome-underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
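    A minimal sketch of the product-of-coefficients logic on the log-survival-time scale, assuming lifelines and statsmodels in place of SAS LIFEREG; all variable names and effect sizes are hypothetical:

    ```python
    # Minimal sketch: product-of-coefficients mediation on the log-time scale.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import WeibullAFTFitter

    rng = np.random.default_rng(11)
    n = 1000
    trt = rng.integers(0, 2, n).astype(float)
    med = 0.5 * trt + rng.normal(size=n)          # mediator model: a = 0.5
    # Weibull AFT: log T = 1 + 0.3*trt + 0.4*med - 0.5*Gumbel  (b = 0.4)
    t = np.exp(1.0 + 0.3 * trt + 0.4 * med - 0.5 * rng.gumbel(size=n))
    cap = np.quantile(t, 0.8)                     # ~20% right censoring
    df = pd.DataFrame({"T": np.minimum(t, cap), "E": (t < cap).astype(int),
                       "trt": trt, "med": med})

    a = sm.OLS(med, sm.add_constant(trt)).fit().params[1]  # trt -> mediator
    aft = WeibullAFTFitter().fit(df, duration_col="T", event_col="E")
    b = aft.params_.loc[("lambda_", "med")]                # mediator -> log T
    print(f"indirect effect a*b = {a * b:.3f}")            # ~ 0.5 * 0.4 = 0.2
    print(f"direct effect = {aft.params_.loc[('lambda_', 'trt')]:.3f}")
    ```

    The AFT coefficients live on the log-time scale, which is what makes this simple product decomposition workable; the analogous product of Cox coefficients lacks this causal interpretation, as the abstract notes.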

  20. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    Science.gov (United States)

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906

  1. Mediation Analysis with Survival Outcomes: Accelerated Failure Time Versus Proportional Hazards Models

    Directory of Open Access Journals (Sweden)

    Lois A Gelfand

    2016-03-01

    Full Text Available Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome – underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.

  2. Statistical analysis of Caterpillar 793D haul truck engine data and through-life diagnostic information using the proportional hazards model

    Directory of Open Access Journals (Sweden)

    Carstens, W. A.

    2013-08-01

    Full Text Available Physical asset management (PAM) is of increasing concern for companies in industry today. A key performance area of PAM is asset care plans (ACPs), which consist of maintenance strategies such as usage based maintenance (UBM) and condition based maintenance (CBM). Data obtained from the South African mining industry was modelled using a CBM prognostic model called the proportional hazards model (PHM). Results indicated that the developed model produced estimates that were reasonable representations of reality. These findings provide an exciting basis for the development of future Weibull PHMs that could result in huge maintenance cost savings and reduced failure occurrences.

  3. Determining the effects of patient casemix on length of hospital stay: a proportional hazards frailty model approach.

    Science.gov (United States)

    Lee, A H; Yau, K K

    2001-01-01

    To identify factors associated with hospital length of stay (LOS) and to model variations in LOS within Diagnosis Related Groups (DRGs). A proportional hazards frailty modelling approach is proposed that accounts for patient transfers and the inherent correlation of patients clustered within hospitals. The investigation is based on patient discharge data extracted for a group of obstetrical DRGs. Application of the frailty approach has highlighted several significant factors after adjustment for patient casemix and random hospital effects. In particular, patients admitted for childbirth with private medical insurance coverage have higher risk of prolonged hospitalization compared to public patients. The determination of pertinent factors provides important information to hospital management and clinicians in assessing the risk of prolonged hospitalization. The analysis also enables the comparison of inter-hospital variations across adjacent DRGs.

  4. DeepSurv: personalized treatment recommender system using a Cox proportional hazards deep neural network.

    Science.gov (United States)

    Katzman, Jared L; Shaham, Uri; Cloninger, Alexander; Bates, Jonathan; Jiang, Tingting; Kluger, Yuval

    2018-02-26

    Medical practitioners use survival models to explore and understand the relationships between patients' covariates (e.g. clinical and genetic features) and the effectiveness of various treatment options. Standard survival models like the linear Cox proportional hazards model require extensive feature engineering or prior medical knowledge to model treatment interaction at an individual level. While nonlinear survival methods, such as neural networks and survival forests, can inherently model these high-level interaction terms, they have yet to be shown as effective treatment recommender systems. We introduce DeepSurv, a Cox proportional hazards deep neural network and state-of-the-art survival method for modeling interactions between a patient's covariates and treatment effectiveness in order to provide personalized treatment recommendations. We perform a number of experiments training DeepSurv on simulated and real survival data. We demonstrate that DeepSurv performs as well as or better than other state-of-the-art survival models and validate that DeepSurv successfully models increasingly complex relationships between a patient's covariates and their risk of failure. We then show how DeepSurv models the relationship between a patient's features and effectiveness of different treatment options to show how DeepSurv can be used to provide individual treatment recommendations. Finally, we train DeepSurv on real clinical studies to demonstrate how its personalized treatment recommendations would increase the survival time of a set of patients. The predictive and modeling capabilities of DeepSurv will enable medical researchers to use deep neural networks as a tool in their exploration, understanding, and prediction of the effects of a patient's characteristics on their risk of failure.
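    DeepSurv's training objective is the negative Cox partial log-likelihood evaluated on the network's predicted risk scores. A minimal sketch of that loss, assuming PyTorch; a one-parameter linear "network" stands in for the deep model so the recovered coefficient can be checked against the simulation truth (this illustrates the objective, not the authors' full architecture):

    ```python
    # Minimal sketch: negative Cox partial log-likelihood as a network loss.
    import torch

    def cox_ph_loss(risk, time, event):
        """risk: log-hazard scores (n,); event: 1 = observed failure."""
        order = torch.argsort(time, descending=True)  # build nested risk sets
        risk, event = risk[order], event[order]
        log_denom = torch.logcumsumexp(risk, dim=0)   # log sum over risk set
        return -((risk - log_denom) * event).sum() / event.sum()

    torch.manual_seed(0)
    n = 512
    x = torch.randn(n, 1)
    # True model: hazard = exp(0.8 * x); all failures observed (no censoring)
    time = torch.distributions.Exponential(torch.exp(0.8 * x.squeeze(1))).sample()
    event = torch.ones(n)

    w = torch.zeros(1, requires_grad=True)
    opt = torch.optim.Adam([w], lr=0.05)
    for _ in range(200):
        opt.zero_grad()
        loss = cox_ph_loss((x * w).squeeze(1), time, event)
        loss.backward()
        opt.step()
    print(f"estimated beta = {w.item():.2f}")         # should approach 0.8
    ```

    In a DeepSurv-style model, the linear score (x * w) would simply be replaced by the output of a multi-layer network.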

  5. Comparison of linear, skewed-linear, and proportional hazard models for the analysis of lambing interval in Ripollesa ewes.

    Science.gov (United States)

    Casellas, J; Bach, R

    2012-06-01

    Lambing interval is a relevant reproductive indicator for sheep populations under continuous mating systems, although there is a shortage of selection programs accounting for this trait in the sheep industry. Both the historical assumption of small genetic background and its unorthodox distribution pattern have limited its implementation as a breeding objective. In this manuscript, statistical performances of 3 alternative parametrizations [i.e., symmetric Gaussian mixed linear (GML) model, skew-Gaussian mixed linear (SGML) model, and piecewise Weibull proportional hazard (PWPH) model] have been compared to elucidate the preferred methodology to handle lambing interval data. More specifically, flock-by-flock analyses were performed on 31,986 lambing interval records (257.3 ± 0.2 d) from 6 purebred Ripollesa flocks. Model performances were compared in terms of deviance information criterion (DIC) and Bayes factor (BF). For all flocks, PWPH models were clearly preferred; they generated a reduction of 1,900 or more DIC units and provided BF estimates larger than 100 (i.e., PWPH models against linear models). These differences were reduced when comparing PWPH models with different number of change points for the baseline hazard function. In 4 flocks, only 2 change points were required to minimize the DIC, whereas 4 and 6 change points were needed for the 2 remaining flocks. These differences demonstrated a remarkable degree of heterogeneity across sheep flocks that must be properly accounted for in genetic evaluation models to avoid statistical biases and suboptimal genetic trends. Within this context, all 6 Ripollesa flocks revealed substantial genetic background for lambing interval with heritabilities ranging between 0.13 and 0.19. This study provides the first evidence of the suitability of PWPH models for lambing interval analysis, clearly discarding previous parametrizations focused on mixed linear models.

  6. Evaluation of protocol change in burn-care management using the Cox proportional hazards model with time-dependent covariates.

    Science.gov (United States)

    Ichida, J M; Wassell, J T; Keller, M D; Ayers, L W

    1993-02-01

    Survival analysis methods are valuable for detecting intervention effects because detailed information from patient records and sensitive outcome measures are used. The burn unit at a large university hospital replaced routine bathing with total body bathing using chlorhexidine gluconate for antimicrobial effect. A Cox proportional hazards model was used to analyse time from admission until either infection with Staphylococcus aureus or discharge for 155 patients, controlling for burn severity and two time-dependent covariates: days until first wound excision and days until first administration of prophylactic antibiotics. The risk of infection was 55 per cent higher in the historical control group, although not statistically significant. There was also some indication that early wound excision may be important as an infection-control measure for burn patients.
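    Time-dependent covariates such as "days until first wound excision" are handled by splitting each patient's follow-up into (start, stop] intervals. A minimal sketch, assuming lifelines' CoxTimeVaryingFitter and synthetic data in which a hypothetical excision indicator lowers the infection hazard from 0.25 to 0.10 per day:

    ```python
    # Minimal sketch: Cox regression with a time-dependent covariate in
    # (start, stop] format.
    import numpy as np
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    rng = np.random.default_rng(5)
    rows = []
    for pid in range(300):
        t_exc = rng.uniform(1, 5)               # day of first wound excision
        pre = rng.exponential(1 / 0.25)         # hazard 0.25/day pre-excision
        t_event = pre if pre < t_exc else t_exc + rng.exponential(1 / 0.10)
        t_end, died = min(t_event, 14.0), t_event <= 14.0
        if t_end <= t_exc:                      # resolved before excision
            rows.append((pid, 0.0, t_end, 0, int(died)))
        else:                                   # split follow-up at excision
            rows.append((pid, 0.0, t_exc, 0, 0))
            rows.append((pid, t_exc, t_end, 1, int(died)))
    long_df = pd.DataFrame(rows,
                           columns=["id", "start", "stop", "excised", "event"])

    ctv = CoxTimeVaryingFitter()
    ctv.fit(long_df, id_col="id", event_col="event",
            start_col="start", stop_col="stop")
    print(ctv.summary[["coef", "p"]])           # coef(excised) ~ log(0.10/0.25)
    ```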

  7. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

    Uganda, just like any other Sub-Saharan African country, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice in analysing data to understand factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, some covariates of interest which do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model; otherwise, using covariates that clearly violate the assumption would yield invalid results. Survival trees and random survival forests are increasingly becoming popular in analysing survival data, particularly in the case of large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have never been used in understanding factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. Thus the first part of the analysis is based on the classical Cox PH model and the second part on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, the sex of the child, and the number of births in the past year are strongly associated with under-five child mortality in Uganda, given that all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates that were originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates. These covariates include the number of children under the
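    A minimal sketch contrasting the two approaches on synthetic data with a deliberately nonlinear covariate effect, assuming the scikit-survival package; the DHS covariates are replaced by hypothetical ones, and the in-sample concordance shown here will flatter the forest:

    ```python
    # Minimal sketch: random survival forest vs. Cox PH on data with a
    # nonlinear covariate effect.
    import numpy as np
    from sksurv.ensemble import RandomSurvivalForest
    from sksurv.linear_model import CoxPHSurvivalAnalysis
    from sksurv.util import Surv

    rng = np.random.default_rng(9)
    n = 1000
    X = rng.normal(size=(n, 4))                  # hypothetical covariates
    lin = 0.5 * X[:, 0] + 0.7 * X[:, 1] ** 2     # quadratic term breaks linearity
    t = rng.exponential(np.exp(-lin))            # exponential PH survival times
    cap = np.quantile(t, 0.7)
    y = Surv.from_arrays(event=t < cap, time=np.minimum(t, cap))

    rsf = RandomSurvivalForest(n_estimators=200, random_state=0).fit(X, y)
    cox = CoxPHSurvivalAnalysis().fit(X, y)
    print("RSF c-index:", round(rsf.score(X, y), 3))  # captures nonlinearity
    print("Cox c-index:", round(cox.score(X, y), 3))  # linear model misses it
    ```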

  8. Augmenting the logrank test in the design of clinical trials in which non-proportional hazards of the treatment effect may be anticipated

    Directory of Open Access Journals (Sweden)

    Patrick Royston

    2016-02-01

    Full Text Available Background: Most randomized controlled trials with a time-to-event outcome are designed assuming proportional hazards (PH) of the treatment effect. The sample size calculation is based on a logrank test. However, non-proportional hazards are increasingly common. At analysis, the estimated hazard ratio with a confidence interval is usually presented. The estimate is often obtained from a Cox PH model with treatment as a covariate. If non-proportional hazards are present, the logrank and equivalent Cox tests may lose power. To safeguard power, we previously suggested a ‘joint test’ combining the Cox test with a test of non-proportional hazards. Unfortunately, a larger sample size is needed to preserve power under PH. Here, we describe a novel test that unites the Cox test with a permutation test based on restricted mean survival time. Methods: We propose a combined hypothesis test based on a permutation test of the difference in restricted mean survival time across time. The test involves the minimum of the Cox and permutation test P-values. We approximate its null distribution and correct it for correlation between the two P-values. Using extensive simulations, we assess the type 1 error and power of the combined test under several scenarios and compare with other tests. We investigate powering a trial using the combined test. Results: The type 1 error of the combined test is close to nominal. Power under proportional hazards is slightly lower than for the Cox test. Enhanced power is available when the treatment difference shows an ‘early effect’, an initial separation of survival curves which diminishes over time. The power is reduced under a ‘late effect’, when little or no difference in survival curves is seen for an initial period and then a late separation occurs. We propose a method of powering a trial using the combined test. The ‘insurance premium’ offered by the combined test to safeguard power under non
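    One ingredient of the combined test is a permutation test of the difference in restricted mean survival time (RMST). A self-contained numpy sketch of that component alone (not the full combined procedure with its correlation-corrected null distribution), under a hypothetical 'late effect' scenario:

    ```python
    # Minimal sketch: permutation test of the RMST difference between groups.
    import numpy as np

    def km_rmst(time, event, tau):
        """Area under the Kaplan-Meier survival curve up to tau."""
        order = np.argsort(time)
        time, event = time[order], event[order]
        at_risk, s, last_t, area = len(time), 1.0, 0.0, 0.0
        for t, d in zip(time, event):
            t_cap = min(t, tau)
            area += s * (t_cap - last_t)
            last_t = t_cap
            if t > tau:
                break
            if d:
                s *= 1.0 - 1.0 / at_risk
            at_risk -= 1
        return area + s * max(0.0, tau - last_t)

    def rmst_perm_test(time, event, group, tau, n_perm=1000, seed=0):
        rng = np.random.default_rng(seed)
        def diff(g):
            return (km_rmst(time[g == 1], event[g == 1], tau)
                    - km_rmst(time[g == 0], event[g == 0], tau))
        obs = diff(group)
        perm = np.array([diff(rng.permutation(group)) for _ in range(n_perm)])
        return obs, float((np.abs(perm) >= abs(obs)).mean())

    # Hypothetical late effect: no separation before t = 1, benefit afterwards
    rng = np.random.default_rng(2)
    n = 200
    g = np.repeat([0, 1], n // 2)
    raw = 1.0 + rng.exponential(np.where(g == 1, 2.0, 1.0))
    cens = 4.0                                   # administrative censoring time
    t, e = np.minimum(raw, cens), (raw < cens).astype(int)
    obs, p = rmst_perm_test(t, e, g, tau=cens)
    print(f"RMST difference = {obs:.2f} (p = {p:.3f})")
    ```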

  9. Comparing treatment effects after adjustment with multivariable Cox proportional hazards regression and propensity score methods

    NARCIS (Netherlands)

    Martens, Edwin P; de Boer, Anthonius; Pestman, Wiebe R; Belitser, Svetlana V; Stricker, Bruno H Ch; Klungel, Olaf H

    PURPOSE: To compare adjusted effects of drug treatment for hypertension on the risk of stroke from propensity score (PS) methods with a multivariable Cox proportional hazards (Cox PH) regression in an observational study with censored data. METHODS: From two prospective population-based cohort

  10. Predictors of time to relapse in amphetamine-type substance users in the matrix treatment program in Iran: a Cox proportional hazard model application.

    Science.gov (United States)

    Moeeni, Maryam; Razaghi, Emran M; Ponnet, Koen; Torabi, Fatemeh; Shafiee, Seyed Ali; Pashaei, Tahereh

    2016-07-26

    The aim of this study was to determine which predictors influence the risk of relapse among a cohort of amphetamine-type substance (ATS) users in Iran. A Cox proportional hazards model was conducted to determine factors associated with the relapse time in the Matrix treatment program provided by the Iranian National Center of Addiction Studies (INCAS) between March 2010 and October 2011. Participating in more treatment sessions was associated with a lower probability of relapse. On the other hand, patients with less family support, longer dependence on ATS, and those with an experience of casual sex and a history of criminal offenses were more likely to relapse. This study broadens our understanding of factors influencing the risk of relapse in ATS use among an Iranian sample. The findings can guide practitioners during the treatment program.

  11. Proportional and scale change models to project failures of mechanical components with applications to space station

    Science.gov (United States)

    Taneja, Vidya S.

    1996-01-01

    In this paper we develop the mathematical theory of proportional and scale change models to perform reliability analysis. The results obtained will be applied to the Reaction Control System (RCS) thruster valves on an orbiter. With the advent of extended EVA's associated with PROX OPS (ISSA & MIR), and docking, the loss of a thruster valve now takes on an expanded safety significance. Previous studies assume a homogeneous population of components with each component having the same failure rate. However, as various components experience different stresses and are exposed to different environments, their failure rates change with time. In this paper we model the reliability of the thruster valves by treating them as censored repairable systems. The model for each valve takes the form of a nonhomogeneous process whose intensity function is treated either as a proportional hazards model or as a scale change random effects hazard model. Each component has an associated z, an independent realization of the random variable Z from a distribution G(z). This unobserved quantity z can be used to describe heterogeneity systematically. For the various models, methods for estimating the model parameters using censored data are developed. Available field data (from previously flown flights) is from non-renewable systems. The estimated failure rate using such data will need to be modified for renewable systems such as thruster valves.

  12. A phenomenological SMA model for combined axial–torsional proportional/non-proportional loading conditions

    International Nuclear Information System (INIS)

    Bodaghi, M.; Damanpack, A.R.; Aghdam, M.M.; Shakeri, M.

    2013-01-01

    In this paper, a simple and robust phenomenological model for shape memory alloys (SMAs) is proposed to simulate main features of SMAs under uniaxial as well as biaxial combined axial–torsional proportional/non-proportional loadings. The constitutive model for polycrystalline SMAs is developed within the framework of continuum thermodynamics of irreversible processes. The model nominates the volume fractions of self-accommodated and oriented martensite as scalar internal variables and the preferred direction of oriented martensitic variants as directional internal variable. An algorithm is introduced to develop explicit relationships for the thermo-mechanical behavior of SMAs under uniaxial and biaxial combined axial–torsional proportional/non-proportional loading conditions and also thermal loading. It is shown that the model is able to simulate main aspects of SMAs including self-accommodation, martensitic transformation, orientation and reorientation of martensite, shape memory effect, ferro-elasticity and pseudo-elasticity. A description of the time-discrete counterpart of the proposed SMA model is presented. Experimental results of uniaxial tension and biaxial combined tension–torsion non-proportional tests are simulated and a good qualitative correlation between numerical and experimental responses is achieved. Due to simplicity and accuracy, the model is expected to be used in the future studies dealing with the analysis of SMA devices in which two stress components including one normal and one shear stress are dominant

  13. A study of the slope of Cox proportional hazard and Weibull models

    African Journals Online (AJOL)

    Adejumo & Ahmadu

    known and the hazard function is completely specified except for the values of the ... through the air when people who have an active TB infection, cough, sneeze ... The increase of. TB incidence is highest in Africa and Asia, areas with the highest ... further complicating treatment by increasing the length and cost of therapy.

  14. The role of social networks and media receptivity in predicting age of smoking initiation: a proportional hazards model of risk and protective factors.

    Science.gov (United States)

    Unger, J B; Chen, X

    1999-01-01

    The increasing prevalence of adolescent smoking demonstrates the need to identify factors associated with early smoking initiation. Previous studies have shown that smoking by social network members and receptivity to pro-tobacco marketing are associated with smoking among adolescents. It is not clear, however, whether these variables also are associated with the age of smoking initiation. Using data from 10,030 California adolescents, this study identified significant correlates of age of smoking initiation using bivariate methods and a multivariate proportional hazards model. Age of smoking initiation was earlier among those adolescents whose friends, siblings, or parents were smokers, and among those adolescents who had a favorite tobacco advertisement, had received tobacco promotional items, or would be willing to use tobacco promotional items. Results suggest that the smoking behavior of social network members and pro-tobacco media influences are important determinants of age of smoking initiation. Because early smoking initiation is associated with higher levels of addiction in adulthood, tobacco control programs should attempt to counter these influences.

  15. Shared and unshared exposure measurement error in occupational cohort studies and their effects on statistical inference in proportional hazards models

    Science.gov (United States)

    Laurier, Dominique; Rage, Estelle

    2018-01-01

    Exposure measurement error represents one of the most important sources of uncertainty in epidemiology. When exposure uncertainty is not or only poorly accounted for, it can lead to biased risk estimates and a distortion of the shape of the exposure-response relationship. In occupational cohort studies, the time-dependent nature of exposure and changes in the method of exposure assessment may create complex error structures. When a method of group-level exposure assessment is used, individual worker practices and the imprecision of the instrument used to measure the average exposure for a group of workers may give rise to errors that are shared between workers, within workers or both. In contrast to unshared measurement error, the effects of shared errors remain largely unknown. Moreover, exposure uncertainty and magnitude of exposure are typically highest for the earliest years of exposure. We conduct a simulation study based on exposure data of the French cohort of uranium miners to compare the effects of shared and unshared exposure uncertainty on risk estimation and on the shape of the exposure-response curve in proportional hazards models. Our results indicate that uncertainty components shared within workers cause more bias in risk estimation and a more severe attenuation of the exposure-response relationship than unshared exposure uncertainty or exposure uncertainty shared between individuals. These findings underline the importance of careful characterisation and modeling of exposure uncertainty in observational studies. PMID:29408862

  16. An estimating equation for parametric shared frailty models with marginal additive hazards

    DEFF Research Database (Denmark)

    Pipper, Christian Bressen; Martinussen, Torben

    2004-01-01

    Multivariate failure time data arise when data consist of clusters in which the failure times may be dependent. A popular approach to such data is the marginal proportional hazards model with estimation under the working independence assumption. In some contexts, however, it may be more reasonable...

  17. Assessing End-Of-Supply Risk of Spare Parts Using the Proportional Hazard Model

    NARCIS (Netherlands)

    X. Li (Xishu); R. Dekker (Rommert); C. Heij (Christiaan); M. Hekimoğlu (Mustafa)

    2016-01-01

    Operators of long field-life systems like airplanes are faced with hazards in the supply of spare parts. If the original manufacturers or suppliers of parts end their supply, this may have large impacts on operating costs of firms needing these parts. Existing end-of-supply evaluation

  18. Using Swiss Webster mice to model Fetal Alcohol Spectrum Disorders (FASD): An analysis of multilevel time-to-event data through mixed-effects Cox proportional hazards models.

    Science.gov (United States)

    Chi, Peter; Aras, Radha; Martin, Katie; Favero, Carlita

    2016-05-15

    Fetal Alcohol Spectrum Disorders (FASD) collectively describes the constellation of effects resulting from human alcohol consumption during pregnancy. Even with public awareness, the incidence of FASD is estimated to be upwards of 5% in the general population and is becoming a global health problem. The physical, cognitive, and behavioral impairments of FASD are recapitulated in animal models. Recently, rodent models utilizing voluntary drinking paradigms have been developed that accurately reflect moderate consumption, which makes up the majority of FASD cases. The range in severity of FASD characteristics reflects the frequency, dose, developmental timing, and individual susceptibility to alcohol exposure. As most rodent models of FASD use C57BL/6 mice, there is a need to expand the stocks of mice studied in order to more fully understand the complex neurobiology of this disorder. To that end, we allowed pregnant Swiss Webster mice to voluntarily drink ethanol via the drinking in the dark (DID) paradigm throughout their gestation period. Ethanol exposure did not alter gestational outcomes as determined by no significant differences in maternal weight gain, maternal liquid consumption, litter size, or pup weight at birth or weaning. Despite seemingly normal gestation, ethanol-exposed offspring exhibit significantly altered timing to achieve developmental milestones (surface righting, cliff aversion, and open field traversal), as analyzed through mixed-effects Cox proportional hazards models. These results confirm Swiss Webster mice as a viable option to study the incidence and causes of ethanol-induced neurobehavioral alterations during development. Future studies in our laboratory will investigate the brain regions and molecules responsible for these behavioral changes.

  19. A parsimonious model for the proportional control valve

    OpenAIRE

    Elmer, KF; Gentle, CR

    2001-01-01

    A generic non-linear dynamic model of a direct-acting electrohydraulic proportional solenoid valve is presented. The valve consists of two subsystems: a spool assembly and one or two unidirectional proportional solenoids. These two subsystems are modelled separately. The solenoid is modelled as a non-linear resistor-inductor combination, with inductance parameters that change with current. An innovative modelling method has been used to represent these components. The spool assembly is model...

  20. Global Drought Proportional Economic Loss Risk Deciles

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Drought Proportional Economic Loss Risk Deciles is a 2.5 minute grid of drought hazard economic loss as proportions of Gross Domestic Product (GDP) per...

  1. Confidence intervals for the first crossing point of two hazard functions.

    Science.gov (United States)

    Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng

    2009-12-01

    The phenomenon of crossing hazard rates is common in clinical trials with time to event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing hazards alternative. However, relatively few approaches are available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazard modeling with Box-Cox transformation of the time to event, a nonparametric procedure using the kernel smoothing estimate of the hazard ratio is proposed. The proposed procedure and the one based on Cox proportional hazard modeling with Box-Cox transformation of the time to event are both evaluated by Monte-Carlo simulations and applied to two clinical trial datasets.
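    A minimal numpy sketch of the kernel-smoothing idea: estimate each group's hazard by Gaussian smoothing of the Nelson-Aalen increments and report the first sign change of the difference. The Weibull shapes below are chosen so the true hazards cross near t ≈ 0.42 (this illustrates the point estimate only, not the paper's confidence interval construction):

    ```python
    # Minimal sketch: first crossing point of two kernel-smoothed hazards.
    import numpy as np

    def smoothed_hazard(time, event, grid, bw):
        """Gaussian-kernel smoothing of Nelson-Aalen increments d_i / n_i."""
        order = np.argsort(time)
        time, event = time[order], event[order]
        n = len(time)
        n_risk = n - np.arange(n)                  # at-risk count at each obs
        jumps = event / n_risk                     # Nelson-Aalen increments
        k = np.exp(-0.5 * ((grid[:, None] - time[None, :]) / bw) ** 2)
        return (k * jumps).sum(axis=1) / (bw * np.sqrt(2 * np.pi))

    rng = np.random.default_rng(4)
    n = 400
    # Group 0: increasing hazard; group 1: decreasing hazard => one crossing
    t0 = rng.weibull(2.0, n)
    t1 = rng.weibull(0.6, n)
    grid = np.linspace(0.05, 1.5, 300)
    h0 = smoothed_hazard(t0, np.ones(n), grid, bw=0.15)
    h1 = smoothed_hazard(t1, np.ones(n), grid, bw=0.15)
    sign = np.sign(h1 - h0)
    cross = grid[np.argmax(sign[1:] != sign[:-1]) + 1]  # first sign change
    print("estimated first crossing near t =", round(cross, 2))
    ```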

  2. Checking Fine and Gray subdistribution hazards model with cumulative sums of residuals

    DEFF Research Database (Denmark)

    Li, Jianing; Scheike, Thomas; Zhang, Mei Jie

    2015-01-01

    Recently, Fine and Gray (J Am Stat Assoc 94:496–509, 1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, failure of model adequacy could lead to severe bias in parameter estimation, and only a limited contribution has been made to check the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray's model. The proposed goodness-of-fit test procedures are based on the cumulative sums...

  3. A Model for Generating Multi-hazard Scenarios

    Science.gov (United States)

    Lo Jacomo, A.; Han, D.; Champneys, A.

    2017-12-01

    Communities in mountain areas are often subject to risk from multiple hazards, such as earthquakes, landslides, and floods. Each hazard has its own rate of onset, duration, and return period. Multiple hazards tend to complicate the combined risk due to their interactions. Prioritising interventions for minimising risk in this context is challenging. We developed a probabilistic multi-hazard model to help inform decision making in multi-hazard areas. The model is applied to a case study region in the Sichuan province in China, using information from satellite imagery and in-situ data. The model is not intended as a predictive model, but rather as a tool which takes stakeholder input and can be used to explore plausible hazard scenarios over time. By using a Monte Carlo framework and varying uncertain parameters for each of the hazards, the model can be used to explore the effect of different mitigation interventions aimed at reducing the disaster risk within an uncertain hazard context.

  4. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of the hazard modeling distribution as it approaches different distributions.

  5. A high-resolution global flood hazard model

    Science.gov (United States)

    Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul B.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-09-01

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ~90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ~1 km, mean absolute error in flooded fraction falls to ~5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.

  6. The proportional odds cumulative incidence model for competing risks

    DEFF Research Database (Denmark)

    Eriksson, Frank; Li, Jianing; Scheike, Thomas

    2015-01-01

    We suggest an estimator for the proportional odds cumulative incidence model for competing risks data. The key advantage of this model is that the regression parameters have the simple and useful odds ratio interpretation. The model has been considered by many authors, but it is rarely used in practice due to the lack of reliable estimation procedures. We suggest such procedures and show that their performance improves considerably on existing methods. We also suggest a goodness-of-fit test for the proportional odds assumption. We derive the large sample properties and provide estimators ...
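
    For reference, the proportional odds structure can be stated in its standard textbook form (not a formula quoted from the record), with F_1(t|X) the cumulative incidence of the cause of interest given covariates X:

      \log \frac{F_1(t \mid X)}{1 - F_1(t \mid X)} = \alpha(t) + X^{\top} \beta,

    so that each exp(beta_j) is a time-constant odds ratio, which is the interpretation the abstract highlights.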

  7. Household hazardous waste disposal to landfill: Using LandSim to model leachate migration

    International Nuclear Information System (INIS)

    Slack, Rebecca J.; Gronow, Jan R.; Hall, David H.; Voulvoulis, Nikolaos

    2007-01-01

    Municipal solid waste (MSW) landfill leachate contains a number of aquatic pollutants. A specific MSW stream often referred to as household hazardous waste (HHW) can be considered to contribute a large proportion of these pollutants. This paper describes the use of the LandSim (Landfill Performance Simulation) modelling program to assess the environmental consequences of leachate release from a generic MSW landfill in receipt of co-disposed HHW. Heavy metals and organic pollutants were found to migrate into the zones beneath a model landfill site over a 20,000-year period. Arsenic and chromium were found to exceed European Union and US-EPA drinking water standards at the unsaturated zone/aquifer interface, with levels of mercury and cadmium exceeding minimum reporting values (MRVs). The findings demonstrate the pollution potential arising from HHW disposal with MSW. - Aquatic pollutants linked to the disposal of household hazardous waste in municipal landfills have the potential to exist in soil and groundwater for many years

  8. Multistate cohort models with proportional transfer rates

    DEFF Research Database (Denmark)

    Schoen, Robert; Canudas-Romo, Vladimir

    2006-01-01

    We present a new, broadly applicable approach to summarizing the behavior of a cohort as it moves through a variety of statuses (or states). The approach is based on the assumption that all rates of transfer maintain a constant ratio to one another over age. We present closed-form expressions ... of transfer rates. The two living state case and hierarchical multistate models with any number of living states are analyzed in detail. Applying our approach to 1997 U.S. fertility data, we find that observed rates of parity progression are roughly proportional over age. Our proportional transfer rate approach provides trajectories by parity state and facilitates analyses of the implications of changes in parity rate levels and patterns. More women complete childbearing at parity 2 than at any other parity, and parity 2 would be the modal parity in models with total fertility rates (TFRs) of 1.40 to 2 ...

  9. Models for estimating the radiation hazards of uranium mines

    International Nuclear Information System (INIS)

    Wise, K.N.

    1982-01-01

    Hazards to the health of workers in uranium mines derive from the decay products of radon and from uranium and its descendants. Radon daughters in mine atmospheres are either attached to aerosols or exist as free atoms and their physical state determines in which part of the lung the daughters deposit. The factors which influence the proportions of radon daughters attached to aerosols, their deposition in the lung and the dose received by the cells in lung tissue are discussed. The estimation of dose to tissue from inhalation or ingestion of uranium and daughters is based on a different set of models which have been applied in recent ICRP reports. The models used to describe the deposition of particulates, their movement in the gut and their uptake by organs, which form the basis for future limits on the concentration of uranium and daughters in air or on their intake with food, are outlined

  10. Models for estimating the radiation hazards of uranium mines

    International Nuclear Information System (INIS)

    Wise, K.N.

    1990-01-01

    Hazards to the health of workers in uranium mines derive from the decay products of radon and from uranium and its descendants. Radon daughters in mine atmospheres are either attached to aerosols or exist as free atoms and their physical state determines in which part of the lung the daughters deposit. The factors which influence the proportions of radon daughters attached to aerosols, their deposition in the lung and the dose received by the cells in lung tissue are discussed. The estimation of dose to tissue from inhalation or ingestion of uranium and daughters is based on a different set of models which have been applied in recent ICRP reports. The models used to describe the deposition of particulates, their movement in the gut and their uptake by organs, which form the basis for future limits on the concentration of uranium and daughters in air or on their intake with food, are outlined. 34 refs., 12 tabs., 9 figs

  11. Artificial neural networks versus proportional hazards Cox models to predict 45-year all-cause mortality in the Italian Rural Areas of the Seven Countries Study

    Directory of Open Access Journals (Sweden)

    Puddu Paolo

    2012-07-01

    Full Text Available Abstract Background Projection pursuit regression, multilayer feed-forward networks, multivariate adaptive regression splines and trees (including survival trees) have challenged classic multivariable models such as the multiple logistic function, the proportional hazards life table Cox model (Cox), the Poisson model, and the Weibull life table model to perform multivariable predictions. However, only artificial neural networks (NN) have become popular in medical applications. Results We compared several Cox versus NN models in predicting 45-year all-cause mortality (45-ACM) by 18 risk factors selected a priori: age; father life status; mother life status; family history of cardiovascular diseases; job-related physical activity; cigarette smoking; body mass index (linear and quadratic terms); arm circumference; mean blood pressure; heart rate; forced expiratory volume; serum cholesterol; corneal arcus; diagnoses of cardiovascular diseases, cancer and diabetes; minor ECG abnormalities at rest. The data came from two Italian rural cohorts of the Seven Countries Study, made up of men aged 40 to 59 years, enrolled and first examined in 1960 in Italy. Cox models were estimated by: (a) forcing all factors; (b) a forward-stepwise; and (c) a backward-stepwise procedure. Observed cases of deaths and of survivors were computed in decile classes of estimated risk. Forced and stepwise NN were run and compared by C-statistics (ROC analysis) with the Cox models. Out of 1591 men, 1447 died. Model global accuracies were extremely high by all methods (ROCs > 0.810) but there was no clear-cut superiority of any model to predict 45-ACM. The highest ROCs (> 0.838) were observed by NN. There were inter-model variations in the selection of predictive covariates: whereas all models concurred to define the role of 10 covariates (mainly cardiovascular risk factors), family history, heart rate and minor ECG abnormalities were not contributors by Cox models but were so by forced NN. Forced expiratory volume and arm ...

  12. Estimation in the positive stable shared frailty Cox proportional hazards model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Pipper, Christian Bressen

    2005-01-01

    ... model in situations where the correlated survival data show a decreasing association with time. In this paper, we devise a likelihood-based estimation procedure for the positive stable shared frailty Cox model, which is expected to obtain high efficiency. The proposed estimator is provided with large ...

  13. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Full Text Available Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, and the best-fitting distribution differs across phases. Given the best hazard-based model for each incident time phase, predictions are reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
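
    A minimal sketch of fitting and comparing two of the accelerated failure time forms named above with the lifelines package; the data frame, its column names, and the synthetic durations are invented for illustration:

      import numpy as np
      import pandas as pd
      from lifelines import LogLogisticAFTFitter, WeibullAFTFitter

      rng = np.random.default_rng(0)
      n = 500
      df = pd.DataFrame({"lanes_blocked": rng.integers(1, 4, n),
                         "injury": rng.integers(0, 2, n)})
      # synthetic incident durations (minutes), longer when more lanes are blocked
      df["duration"] = rng.weibull(1.5, n) * 30 * df["lanes_blocked"]
      df["observed"] = 1          # every clearance time fully observed here

      # compare candidate AFT distributions by AIC, as the study does
      for Fitter in (WeibullAFTFitter, LogLogisticAFTFitter):
          aft = Fitter().fit(df, duration_col="duration", event_col="observed")
          print(Fitter.__name__, round(aft.AIC_, 1))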

  14. Education and risk of coronary heart disease: Assessment of mediation by behavioural risk factors using the additive hazards model

    DEFF Research Database (Denmark)

    Nordahl, H; Rod, NH; Frederiksen, BL

    2013-01-01

    ... seven Danish cohort studies were linked to registry data on education and incidence of CHD. Mediation by smoking, low physical activity, and body mass index (BMI) on the association between education and CHD was estimated by applying newly proposed methods for mediation based on the additive hazards model. ... (95 % CI: 12, 22) cases for women and 37 (95 % CI: 28, 46) for men could be ascribed to the pathway through smoking. Further, 39 (95 % CI: 30, 49) cases for women and 94 (95 % CI: 79, 110) cases for men could be ascribed to the pathway through BMI. The effects of low physical activity were negligible. Using contemporary methods for mediation, the additive hazards model, we indicated the absolute numbers of CHD cases prevented when modifying smoking and BMI. This study confirms previous claims based on the Cox proportional hazards model that behavioral risk factors partially mediate the effect of education on CHD ...
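
    The additive hazards model referred to above, in which covariate effects add to rather than multiply the baseline hazard, can be fitted for example with lifelines' Aalen additive estimator; this is a generic sketch on synthetic data, not the mediation method of the paper:

      import numpy as np
      import pandas as pd
      from lifelines import AalenAdditiveFitter

      rng = np.random.default_rng(2)
      n = 1000
      df = pd.DataFrame({"low_education": rng.integers(0, 2, n),
                         "smoker": rng.integers(0, 2, n)})
      # synthetic follow-up: baseline rate 0.02/yr plus additive covariate excesses
      rate = 0.02 + 0.01 * df["low_education"] + 0.02 * df["smoker"]
      df["time"] = rng.exponential(1.0 / rate)
      df["chd"] = (df["time"] < 20).astype(int)   # administrative censoring at 20 years
      df["time"] = df["time"].clip(upper=20)

      aaf = AalenAdditiveFitter().fit(df, duration_col="time", event_col="chd")
      print(aaf.cumulative_hazards_.tail(1))      # cumulative regression effects B(t)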

  15. An optimization model for transportation of hazardous materials

    International Nuclear Information System (INIS)

    Seyed-Hosseini, M.; Kheirkhah, A. S.

    2005-01-01

    In this paper, the optimal routing problem for the transportation of hazardous materials is studied. Routing to reduce the risk of transporting hazardous materials has been studied and formulated by many researchers, and several routing models have been presented to date. These models can be classified into two categories: models for routing a single movement and models for routing multiple movements. In this paper, according to the current rules and regulations for road transportation of hazardous materials in Iran, a routing problem is designed in which the routes for several independent movements are determined simultaneously. To examine the model, the problem of transporting two different hazardous materials in the road network of Mazandaran province in northern Iran is formulated and solved using an integer programming model
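
    A toy sketch of the integer-programming formulation, using PuLP; the candidate routes, risk and cost scores, and the budget constraint are all invented for illustration. Each shipment must select exactly one route, minimising total risk subject to a cost budget:

      import pulp

      # hypothetical candidate routes per shipment with (risk, cost) scores
      routes = {"shipment_A": {"r1": (10, 4), "r2": (6, 7)},
                "shipment_B": {"r1": (8, 3), "r3": (5, 9)}}
      BUDGET = 12   # total transport cost allowed

      prob = pulp.LpProblem("hazmat_routing", pulp.LpMinimize)
      x = {(s, r): pulp.LpVariable(f"x_{s}_{r}", cat="Binary")
           for s, opts in routes.items() for r in opts}

      prob += pulp.lpSum(routes[s][r][0] * x[s, r] for s, r in x)        # minimise risk
      for s, opts in routes.items():
          prob += pulp.lpSum(x[s, r] for r in opts) == 1                 # one route each
      prob += pulp.lpSum(routes[s][r][1] * x[s, r] for s, r in x) <= BUDGET

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      print([(s, r) for (s, r), var in x.items() if var.value() == 1])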

  16. The New Italian Seismic Hazard Model

    Science.gov (United States)

    Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.

    2017-12-01

    In 2015 the Seismic Hazard Center (Centro Pericolosità Sismica - CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community in elaborating a new reference seismic hazard model, mainly aimed at updating the seismic building code. The CPS designed a roadmap for releasing within three years a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with the experts on earthquake engineering who will then participate in the revision of the building code. The activities were organized in 6 tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task selected the most updated information about seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models were elaborated in terms of classic source areas, fault sources, and gridded seismicity, based on different approaches. The GMPEs task selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase was planned to design statistical procedures for testing the whole seismic hazard model, and single components such as the seismicity models and the GMPEs, against the available data. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy. We use a weighting scheme ...

  17. Hazard avoidance via descent images for safe landing

    Science.gov (United States)

    Yan, Ruicheng; Cao, Zhiguo; Zhu, Lei; Fang, Zhiwen

    2013-10-01

    In planetary or lunar landing missions, hazard avoidance is critical for landing safety. Therefore, it is very important to correctly detect hazards and effectively find a safe landing area during the last stage of descent. In this paper, we propose a passive-sensing-based HDA (hazard detection and avoidance) approach via descent images to lower the landing risk. In the hazard detection stage, a statistical probability model based on hazard similarity is adopted to evaluate the image and detect hazardous areas, so that a binary hazard image can be generated. Afterwards, a safety coefficient, which jointly utilizes the proportion of hazards in the local region and the hazard distribution within it, is proposed to find potential regions with fewer hazards in the binary hazard image. By using the safety coefficient in a coarse-to-fine procedure and combining it with the local ISD (intensity standard deviation) measure, the safe landing area is determined. The algorithm is evaluated and verified with many simulated descent downward-looking images rendered from lunar orbital satellite images.

  18. Automated economic analysis model for hazardous waste minimization

    International Nuclear Information System (INIS)

    Dharmavaram, S.; Mount, J.B.; Donahue, B.A.

    1990-01-01

    The US Army has established a policy of achieving a 50 percent reduction in hazardous waste generation by the end of 1992. To assist the Army in reaching this goal, the Environmental Division of the US Army Construction Engineering Research Laboratory (USACERL) designed the Economic Analysis Model for Hazardous Waste Minimization (EAHWM). The EAHWM was designed to allow the user to evaluate the life cycle costs for various techniques used in hazardous waste minimization and to compare them to the life cycle costs of current operating practices. The program was developed in C language on an IBM compatible PC and is consistent with other pertinent models for performing economic analyses. The potential hierarchical minimization categories used in EAHWM include source reduction, recovery and/or reuse, and treatment. Although treatment is no longer an acceptable minimization option, its use is widespread and has therefore been addressed in the model. The model allows for economic analysis for minimization of the Army's six most important hazardous waste streams. These include solvents, paint stripping wastes, metal plating wastes, industrial waste-sludges, used oils, and batteries and battery electrolytes. The EAHWM also includes a general application which can be used to calculate and compare the life cycle costs for minimization alternatives of any waste stream, hazardous or non-hazardous. The EAHWM has been fully tested and implemented in more than 60 Army installations in the United States

  19. Hazard identification based on plant functional modelling

    International Nuclear Information System (INIS)

    Rasmussen, B.; Whetton, C.

    1993-10-01

    A major objective of the present work is to provide means for representing a process plant as a socio-technical system, so as to allow hazard identification at a high level. The method includes technical, human and organisational aspects and is intended to be used for plant level hazard identification so as to identify critical areas and the need for further analysis using existing methods. The first part of the method is the preparation of a plant functional model where a set of plant functions link together hardware, software, operations, work organisation and other safety related aspects of the plant. The basic principle of the functional modelling is that any aspect of the plant can be represented by an object (in the sense that this term is used in computer science) based upon an Intent (or goal); associated with each Intent are Methods, by which the Intent is realized, and Constraints, which limit the Intent. The Methods and Constraints can themselves be treated as objects and decomposed into lower-level Intents (hence the procedure is known as functional decomposition) so giving rise to a hierarchical, object-oriented structure. The plant level hazard identification is carried out on the plant functional model using the Concept Hazard Analysis method. In this, the user will be supported by checklists and keywords and the analysis is structured by pre-defined worksheets. The preparation of the plant functional model and the performance of the hazard identification can be carried out manually or with computer support. (au) (4 tabs., 10 ills., 7 refs.)

  20. Discrimination of numerical proportions: A comparison of binomial and Gaussian models.

    Science.gov (United States)

    Raidvee, Aire; Lember, Jüri; Allik, Jüri

    2017-01-01

    Observers discriminated the numerical proportion of two sets of elements (N = 9, 13, 33, and 65) that differed either by color or orientation. According to the standard Thurstonian approach, the accuracy of proportion discrimination is determined by irreducible noise in the nervous system that stochastically transforms the number of presented visual elements onto a continuum of psychological states representing numerosity. As an alternative to this customary approach, we propose a Thurstonian-binomial model, which assumes discrete perceptual states, each of which is associated with a certain visual element. It is shown that the probability β with which each visual element can be noticed and registered by the perceptual system can explain data of numerical proportion discrimination at least as well as the continuous Thurstonian-Gaussian model, and better, if the greater parsimony of the Thurstonian-binomial model is taken into account using AIC model selection. We conclude that the Gaussian and binomial models represent two different fundamental principles (internal noise versus using only a fraction of the available information), which are both plausible descriptions of visual perception.
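
    A schematic sketch of the comparison on a simulated observer (not the authors' data): fit a one-parameter Thurstonian-Gaussian model and a one-parameter binomial-registration model to yes/no proportion judgments and compare them by AIC = 2k - 2 log L.

      import numpy as np
      from scipy import optimize, stats

      rng = np.random.default_rng(1)
      N = 13                                     # elements per display
      n_red = rng.integers(3, N - 2, size=400)   # varying red counts across trials

      def simulate(k, beta=0.7):
          """Binomial observer: each element registers with probability beta."""
          r, b = rng.binomial(k, beta), rng.binomial(N - k, beta)
          return int(r > b or (r == b and rng.random() < 0.5))

      resp = np.array([simulate(int(k)) for k in n_red])

      def p_red_binomial(k, beta):
          """P('more red' response): registered red exceeds registered blue, ties split."""
          pr = stats.binom.pmf(np.arange(k + 1), k, beta)          # rows: red registered
          pb = stats.binom.pmf(np.arange(N - k + 1), N - k, beta)  # cols: blue registered
          joint = np.outer(pr, pb)
          return np.tril(joint, -1).sum() + 0.5 * np.trace(joint)

      def nll(param, model):
          if model == "binomial":
              p = np.array([p_red_binomial(int(k), param[0]) for k in n_red])
          else:   # Thurstonian-Gaussian: noisy magnitudes compared, sd = param[0]
              p = stats.norm.cdf((2 * n_red - N) / (param[0] * np.sqrt(2.0)))
          p = np.clip(p, 1e-9, 1 - 1e-9)
          return -np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))

      for model, x0, bounds in (("binomial", [0.5], [(0.01, 0.99)]),
                                ("gaussian", [3.0], [(0.1, 50.0)])):
          fit = optimize.minimize(nll, np.array(x0), args=(model,), bounds=bounds)
          print(model, "AIC =", round(2 * 1 + 2 * fit.fun, 1))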

  1. Comparison of Cox and Gray's survival models in severe sepsis

    DEFF Research Database (Denmark)

    Kasal, Jan; Andersen, Zorana Jovanovic; Clermont, Gilles

    2004-01-01

    Although survival is traditionally modeled using Cox proportional hazards modeling, this approach may be inappropriate in sepsis, in which the proportional hazards assumption does not hold. Newer, more flexible models, such as Gray's model, may be more appropriate....

  2. Modeling lahar behavior and hazards

    Science.gov (United States)

    Manville, Vernon; Major, Jon J.; Fagents, Sarah A.

    2013-01-01

    Lahars are highly mobile mixtures of water and sediment of volcanic origin that are capable of traveling tens to >100 km at speeds exceeding tens of km per hour. Such flows are among the most serious ground-based hazards at many volcanoes because of their sudden onset, rapid advance rates, long runout distances, high energy, ability to transport large volumes of material, and tendency to flow along existing river channels where populations and infrastructure are commonly concentrated. They can grow in volume and peak discharge through erosion and incorporation of external sediment and/or water, inundate broad areas, and leave deposits many meters thick. Furthermore, lahars can recur for many years to decades after an initial volcanic eruption, as fresh pyroclastic material is eroded and redeposited during rainfall events, resulting in a spatially and temporally evolving hazard. Improving understanding of the behavior of these complex, gravitationally driven, multi-phase flows is key to mitigating the threat to communities at lahar-prone volcanoes. However, their complexity and evolving nature pose significant challenges to developing the models of flow behavior required for delineating their hazards and hazard zones.

  3. Household hazardous waste disposal to landfill: using LandSim to model leachate migration.

    Science.gov (United States)

    Slack, Rebecca J; Gronow, Jan R; Hall, David H; Voulvoulis, Nikolaos

    2007-03-01

    Municipal solid waste (MSW) landfill leachate contains a number of aquatic pollutants. A specific MSW stream often referred to as household hazardous waste (HHW) can be considered to contribute a large proportion of these pollutants. This paper describes the use of the LandSim (Landfill Performance Simulation) modelling program to assess the environmental consequences of leachate release from a generic MSW landfill in receipt of co-disposed HHW. Heavy metals and organic pollutants were found to migrate into the zones beneath a model landfill site over a 20,000-year period. Arsenic and chromium were found to exceed European Union and US-EPA drinking water standards at the unsaturated zone/aquifer interface, with levels of mercury and cadmium exceeding minimum reporting values (MRVs). The findings demonstrate the pollution potential arising from HHW disposal with MSW.

  4. Modeling Compound Flood Hazards in Coastal Embayments

    Science.gov (United States)

    Moftakhari, H.; Schubert, J. E.; AghaKouchak, A.; Luke, A.; Matthew, R.; Sanders, B. F.

    2017-12-01

    Coastal cities around the world are built on lowland topography adjacent to coastal embayments and river estuaries, where multiple factors threaten increasing flood hazards (e.g. sea level rise and river flooding). Quantitative risk assessment is required for administration of flood insurance programs and the design of cost-effective flood risk reduction measures. This demands a characterization of extreme water levels such as 100- and 500-year return period events. Furthermore, hydrodynamic flood models are routinely used to characterize localized flood level intensities (i.e., local depth and velocity) based on boundary forcing sampled from extreme value distributions. For example, extreme flood discharges in the U.S. are estimated from measured flood peaks using the Log-Pearson Type III distribution. However, configuring hydrodynamic models for coastal embayments is challenging because of compound extreme flood events: events caused by a combination of extreme sea levels, extreme river discharges, and possibly other factors such as extreme waves and precipitation causing pluvial flooding in urban developments. Here, we present an approach for flood risk assessment that coordinates multivariate extreme analysis with hydrodynamic modeling of coastal embayments. First, we evaluate the significance of the correlation structure between terrestrial freshwater inflow and oceanic variables; second, this correlation structure is described using copula functions in the unit joint probability domain; and third, we choose a series of compound design scenarios for hydrodynamic modeling based on their occurrence likelihood. The design scenarios include the most likely compound event (with the highest joint probability density), preferred marginal scenario, and reproduced time series of ensembles based on Monte Carlo sampling of the bivariate hazard domain. The comparison between resulting extreme water dynamics under the compound hazard scenarios explained above provides an insight into the ...
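
    A compact sketch of the second and third steps, assuming a Gaussian copula and invented marginal distributions; a real application would select the copula family and marginals by fitting them to site data:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      rho, n = 0.6, 10_000               # assumed discharge/sea-level dependence
      cov = [[1.0, rho], [rho, 1.0]]

      # Gaussian copula: correlated normal scores -> uniform ranks
      z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
      u = stats.norm.cdf(z)

      # push uniforms through assumed marginals for discharge (m3/s) and sea level (m)
      discharge = stats.genextreme(-0.1, loc=800, scale=200).ppf(u[:, 0])
      sea_level = stats.gumbel_r(loc=1.2, scale=0.25).ppf(u[:, 1])

      # "most likely" compound design event: highest density among sampled score pairs
      i = np.argmax(stats.multivariate_normal([0.0, 0.0], cov).pdf(z))
      print(f"design event: Q = {discharge[i]:.0f} m3/s, h = {sea_level[i]:.2f} m")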

  5. The 2014 United States National Seismic Hazard Model

    Science.gov (United States)

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Mueller, Charles; Haller, Kathleen; Frankel, Arthur; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen; Boyd, Oliver; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nicolas; Wheeler, Russell; Williams, Robert; Olsen, Anna H.

    2015-01-01

    New seismic hazard maps have been developed for the conterminous United States using the latest data, models, and methods available for assessing earthquake hazard. The hazard models incorporate new information on earthquake rupture behavior observed in recent earthquakes; fault studies that use both geologic and geodetic strain rate data; earthquake catalogs through 2012 that include new assessments of locations and magnitudes; earthquake adaptive smoothing models that more fully account for the spatial clustering of earthquakes; and 22 ground motion models, some of which consider more than double the shaking data applied previously. Alternative input models account for larger earthquakes, more complicated ruptures, and more varied ground shaking estimates than assumed in earlier models. The ground motions, for levels applied in building codes, differ from the previous version by less than ±10% over 60% of the country, but can differ by ±50% in localized areas. The models are incorporated in insurance rates, risk assessments, and as input into the U.S. building code provisions for earthquake ground shaking.

  6. Debris flow hazard modelling on medium scale: Valtellina di Tirano, Italy

    Directory of Open Access Journals (Sweden)

    J. Blahut

    2010-11-01

    Full Text Available Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out, incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for the runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). An estimation of the debris flow magnitude was neglected as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model, with a 10 m resolution, was used together with land use, geology and debris flow hazard initiation maps as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy-based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates the information about debris flow spreading direction probabilities, showing areas more likely to be affected by future debris flows. Limitations of the modelling arise ...

  7. Report 6: Guidance document. Man-made hazards and Accidental Aircraft Crash hazards modelling and implementation in extended PSA

    International Nuclear Information System (INIS)

    Kahia, S.; Brinkman, H.; Bareith, A.; Siklossy, T.; Vinot, T.; Mateescu, T.; Espargilliere, J.; Burgazzi, L.; Ivanov, I.; Bogdanov, D.; Groudev, P.; Ostapchuk, S.; Zhabin, O.; Stojka, T.; Alzbutas, R.; Kumar, M.; Nitoi, M.; Farcasiu, M.; Borysiewicz, M.; Kowal, K.; Potempski, S.

    2016-01-01

    The goal of this report is to provide guidance on practices to model man-made hazards (mainly external fires and explosions) and accidental aircraft crash hazards and to implement them in extended Level 1 PSA. This report is a joint deliverable of work package 21 (WP21) and work package 22 (WP22). The general objective of WP21 is to provide guidance on all of the individual hazards selected at the first ASAMPSA-E End Users Workshop (May 2014, Uppsala, Sweden). The objective of WP22 is to provide solutions for the different parts of a man-made hazards Level 1 PSA. This guidance focuses on man-made hazards, namely external fires and explosions, and accidental aircraft crash hazards. The guidance developed refers to existing guidance whenever possible. The initial part of the guidance (the WP21 part) reflects current practices to assess the frequencies for each type of hazard or combination of hazards (including correlated hazards) as initiating events for PSAs. The sources and quality of hazard data, the elements of hazard assessment methodologies and relevant examples are discussed. Classification and criteria to properly assess hazard combinations, as well as examples and methods for assessment of these combinations, are included in this guidance. The appendixes present additional material with examples of practical approaches to aircraft crash and man-made hazards. The following issues are addressed: 1) hazard assessment methodologies, including issues related to hazard combinations; 2) modelling equipment of safety-related SSCs; 3) HRA; 4) emergency response; 5) multi-unit issues. Recommendations, limitations, gaps identified in the existing methodologies and a list of open issues are included. At all stages of this guidance, and especially from an industrial end-user perspective, one must keep in mind that the development of man-made hazards probabilistic analysis must be conditioned on the ability to ultimately obtain a representative risk ...

  8. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    Science.gov (United States)

    Stirling, Mark; Bebbington, Mark; Brenna, Marco; Cronin, Shane; Christophersen, Annemarie; Deligne, Natalia; Hurst, Tony; Jolly, Art; Jolly, Gill; Kennedy, Ben; Kereszturi, Gabor; Lindsay, Jan; Neall, Vince; Procter, Jonathan; Rhoades, David; Scott, Brad; Shane, Phil; Smith, Ian; Smith, Richard; Wang, Ting; White, James D. L.; Wilson, Colin J. N.; Wilson, Tom

    2017-06-01

    We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g. short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  9. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    Directory of Open Access Journals (Sweden)

    Mark Stirling

    2017-06-01

    Full Text Available We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration, and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g., short-term forecasting. The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  10. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    Science.gov (United States)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.

  11. Flexible competing risks regression modeling and goodness-of-fit

    DEFF Research Database (Denmark)

    Scheike, Thomas; Zhang, Mei-Jie

    2008-01-01

    In this paper we consider different approaches for estimation and assessment of covariate effects for the cumulative incidence curve in the competing risks model. The classic approach is to model all cause-specific hazards and then estimate the cumulative incidence curve based on these cause ... models that is easy to fit and contains the Fine-Gray model as a special case. One advantage of this approach is that our regression modeling allows for non-proportional hazards. This leads to a new simple goodness-of-fit procedure for the proportional subdistribution hazards assumption that is very easy ... of the flexible regression models to analyze competing risks data when non-proportionality is present in the data.

  12. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    Science.gov (United States)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on the economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena span a range of complexities, from simplified physics-based conceptual models to highly coupled thermo-fluid-dynamical approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well-characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, the questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented with the aim to provide a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  13. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    Science.gov (United States)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. Clusters of large earthquakes are followed by periods of quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random occurrence of earthquakes in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider a series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and it can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard results are estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard of random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals has been calculated for illustrative purposes.
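
    In code, such a three-part hazard might be sketched as follows; the functional forms (exponential decay, logistic build-up) and all constants are purely illustrative stand-ins for the paper's calibrated model:

      import numpy as np

      def hazard(t, lam0=0.02, a=0.05, tau=40.0, b=0.04, t_next=150.0, s=25.0):
          """Illustrative hazard rate (events/year) at time t (years) after
          the last large-earthquake cluster."""
          waning = a * np.exp(-t / tau)                     # last cluster dying out
          building = b / (1.0 + np.exp(-(t - t_next) / s))  # next cluster approaching
          background = lam0                                 # random moderate seismicity
          return background + waning + building

      for t in (1, 50, 100, 200, 300):
          print(t, round(float(hazard(t)), 4))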

  14. Fuzzy portfolio model with fuzzy-input return rates and fuzzy-output proportions

    Science.gov (United States)

    Tsaur, Ruey-Chyn

    2015-02-01

    In the finance market, a short-term investment strategy is usually applied in portfolio selection in order to reduce investment risk; however, the economy is uncertain and the investment period is short. Further, an investor has incomplete information for selecting a portfolio with crisp proportions for each chosen security. In this paper we present a new method of constructing a fuzzy portfolio model for the parameters of fuzzy-input return rates and fuzzy-output proportions, based on possibilistic mean-standard deviation models. Furthermore, we consider both excess and shortage of investment in different economic periods by using a fuzzy constraint for the sum of the fuzzy proportions, and we also account for the risks of securities investment and the vagueness of incomplete information during periods of economic depression in the portfolio selection. Finally, we present a numerical example of a portfolio selection problem to illustrate the proposed model, and a sensitivity analysis is performed based on the results.

  15. Populational Growth Models Proportional to Beta Densities with Allee Effect

    Science.gov (United States)

    Aleixo, Sandra M.; Rocha, J. Leonel; Pestana, Dinis D.

    2009-05-01

    We consider population growth models with Allee effect, proportional to beta densities with shape parameters p and 2, where the dynamical complexity is related to the Malthusian parameter r. For p>2, these models exhibit a population dynamics with natural Allee effect. However, in the case of 1<p≤2, these models do not include this effect. In order to enforce it, we present some alternative models and investigate their dynamics, presenting some important results.
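
    One simple reading of such a model, sketched below with invented constants: iterate a map whose growth function is proportional to a beta(p, 2) density kernel, f(x) = r x^(p-1)(1-x), and observe that for p > 2 small populations shrink toward extinction (the Allee effect) while moderate ones persist.

      def step(x, r=5.0, p=3.0):
          """Growth function proportional to a beta(p, 2) density kernel."""
          return r * x ** (p - 1) * (1.0 - x)

      for x0 in (0.05, 0.40):           # small vs. moderate initial population
          x = x0
          for _ in range(50):
              x = step(x)
          print(f"x0={x0:.2f} -> x50={x:.4f}")   # ~0 (extinction) vs. ~0.72 (survival)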

  16. A New Seismic Hazard Model for Mainland China

    Science.gov (United States)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.

    2017-12-01

    We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. The resulting new seismic hazard maps can be used for seismic risk analysis and management.
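
    A sketch of the tapered Gutenberg-Richter (TGR) rate calculation used for each source zone, following the standard Kagan-style parameterization in moment space; the a- and b-values and the corner magnitude below are placeholders, not values from the model:

      import numpy as np

      def moment(m):
          """Seismic moment (N*m) from moment magnitude (Hanks-Kanamori)."""
          return 10.0 ** (1.5 * m + 9.05)

      def tgr_cumulative_rate(m, a=4.0, b=1.0, m_min=4.0, m_corner=8.0):
          """Annual rate of earthquakes with magnitude >= m under a tapered
          Gutenberg-Richter law; a, b and m_corner are placeholders."""
          n_min = 10.0 ** (a - b * m_min)          # plain GR cumulative rate at m_min
          beta = (2.0 / 3.0) * b                   # GR b-value mapped to moment space
          M, M_min, M_c = moment(m), moment(m_min), moment(m_corner)
          return n_min * (M_min / M) ** beta * np.exp((M_min - M) / M_c)

      for m in (5.0, 6.0, 7.0, 8.0, 8.5):
          print(m, f"{tgr_cumulative_rate(m):.3g}")

    The exponential taper suppresses rates near the corner magnitude, which is how the abstract's geodetic moment-rate constraint enters in place of an abrupt maximum magnitude.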

  17. Modeling and Hazard Analysis Using STPA

    Science.gov (United States)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describe most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis does. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis ...

  18. PHAZE, Parametric Hazard Function Estimation

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: PHAZE performs statistical inference calculations on a hazard function (also called a failure rate or intensity function) based on reported failure times of components that are repaired and restored to service. Three parametric models are allowed: the exponential, linear, and Weibull hazard models. The inference includes estimation (maximum likelihood estimators and confidence regions) of the parameters and of the hazard function itself, testing of hypotheses such as increasing failure rate, and checking of the model assumptions. 2 - Methods: PHAZE assumes that the failures of a component follow a time-dependent (or non-homogeneous) Poisson process and that the failure counts in non-overlapping time intervals are independent. Implicit in the independence property is the assumption that the component is restored to service immediately after any failure, with negligible repair time. The failures of one component are assumed to be independent of those of another component; a proportional hazards model is used. Data for a component are called time censored if the component is observed for a fixed time period, or plant records covering a fixed time period are examined, and the failure times are recorded. The number of these failures is random. Data are called failure censored if the component is kept in service until a predetermined number of failures has occurred, at which time the component is removed from service. In this case, the number of failures is fixed, but the end of the observation period equals the final failure time and is random. A typical PHAZE session consists of reading failure data from a file prepared previously, selecting one of the three models, and performing data analysis (i.e., performing the usual statistical inference about the parameters of the model, with special emphasis on the parameter(s) that determine whether the hazard function is increasing). The final goals of the inference are a point estimate ...
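
    The flavour of inference PHAZE performs can be illustrated for the Weibull (power-law) hazard model, λ(t) = (β/θ)(t/θ)^(β−1), with time-censored data, for which the maximum likelihood estimators have closed forms; this is only a sketch on simulated failure times, not PHAZE itself:

      import numpy as np

      rng = np.random.default_rng(3)
      beta_true, theta_true, T = 1.8, 4.0, 20.0   # hazard shape/scale, observation window

      # simulate a Poisson process with cumulative intensity W(t) = (t / theta)**beta:
      # given N(T) = n, the event times are iid with CDF (t / T)**beta on (0, T]
      n = rng.poisson((T / theta_true) ** beta_true)
      times = T * rng.random(n) ** (1.0 / beta_true)

      # closed-form maximum likelihood estimators for time-censored data
      beta_hat = n / np.sum(np.log(T / times))
      theta_hat = T / n ** (1.0 / beta_hat)
      print(f"n={n}  beta_hat={beta_hat:.2f} (>1 suggests increasing hazard)  "
            f"theta_hat={theta_hat:.2f}")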

  19. [Hazard function and life table: an introduction to the failure time analysis].

    Science.gov (United States)

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
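
    The standard relations tying the hazard to the life-table survivor function (a textbook note, not quoted from the article): with failure density f(t) and survivor function S(t),

      h(t) = \frac{f(t)}{S(t)} = -\frac{\mathrm{d}}{\mathrm{d}t} \log S(t),
      \qquad
      S(t) = \exp\!\left( -\int_0^t h(u)\, \mathrm{d}u \right),

    and the proportional hazards specification mentioned above takes h(t \mid x) = h_0(t) \exp(x^{\top} \beta), so covariates rescale a common baseline hazard h_0.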

  20. Quantitative occupational risk model: Single hazard

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Aneziris, O.N.; Bellamy, L.J.; Ale, B.J.M.; Oh, J.

    2017-01-01

    A model for the quantification of occupational risk of a worker exposed to a single hazard is presented. The model connects the working conditions and worker behaviour to the probability of an accident resulting in one of three types of consequence: recoverable injury, permanent injury and death. Working conditions and safety barriers in place to reduce the likelihood of an accident are included. Logical connections are modelled through an influence diagram. Quantification of the model is based on two sources of information: (a) the number of accidents observed over a period of time and (b) assessment of exposure data of activities and working conditions over the same period of time and the same working population. The effectiveness of risk-reducing measures affecting the working conditions, worker behaviour and/or safety barriers can be quantified through the effect of these measures on occupational risk. - Highlights: • Quantification of occupational risk from a single hazard. • Influence diagram connects working conditions, worker behaviour and safety barriers. • Necessary data include the number of accidents and the total exposure of workers. • Effectiveness of risk-reducing measures is quantified through the impact on the risk. • An example illustrates the methodology.

  1. The 2014 update to the National Seismic Hazard Model in California

    Science.gov (United States)

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  2. Integrate urban‐scale seismic hazard analyses with the U.S. National Seismic Hazard Model

    Science.gov (United States)

    Moschetti, Morgan P.; Luco, Nicolas; Frankel, Arthur; Petersen, Mark D.; Aagaard, Brad T.; Baltay, Annemarie S.; Blanpied, Michael; Boyd, Oliver; Briggs, Richard; Gold, Ryan D.; Graves, Robert; Hartzell, Stephen; Rezaeian, Sanaz; Stephenson, William J.; Wald, David J.; Williams, Robert A.; Withers, Kyle

    2018-01-01

    For more than 20 yrs, damage patterns and instrumental recordings have highlighted the influence of the local 3D geologic structure on earthquake ground motions (e.g., M 6.7 Northridge, California, Gao et al., 1996; M 6.9 Kobe, Japan, Kawase, 1996; M 6.8 Nisqually, Washington, Frankel, Carver, and Williams, 2002). Although this and other local-scale features are critical to improving seismic hazard forecasts, historically they have not been explicitly incorporated into the U.S. National Seismic Hazard Model (NSHM, national model and maps), primarily because the necessary basin maps and methodologies were not available at the national scale. Instead, ...

  3. Risk factors for the hazard of lameness in Danish Standardbred trotters

    DEFF Research Database (Denmark)

    Vigre, Håkan; Chriel, M.; Hesselholt, M.

    2002-01-01

    ... associations between the hazard of lameness and selected risk factors. The study population was dynamic and contained data on 265 Standardbred trotters monitored during 5 months in 1997 and 1998. The horses were ≥2 years old. Optimal training was defined as when the horse followed ... Lameness was the most-frequent cause of interruption of optimal training: 84 events in 69 horses (0.09 events per horse-month). Respiratory diseases (16 events) and muscular problems (seven events) were the second and third most-frequent causes of interrupted training. The effects of trainer, gender, age-group, time with a trainer, participation in races and current month on the hazard of lameness were estimated in a multivariable Cox proportional-hazards model. The effects of trainer, gender and age-group were modelled as time-independent. The effects of time with a trainer, participation in races and the current month were modelled ...
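
    A minimal sketch of the kind of Cox proportional-hazards fit described above, using the lifelines package on synthetic data; the covariates and effect sizes are invented (the study's actual covariates include trainer, gender, age group, time with trainer, race participation, and month):

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(5)
      n = 265
      df = pd.DataFrame({"age_group": rng.integers(0, 3, n),        # 0 = youngest
                         "races_per_month": rng.poisson(1.5, n)})
      # synthetic months-to-lameness with a higher hazard for older horses
      risk = np.exp(0.4 * df["age_group"] - 0.1 * df["races_per_month"])
      t = rng.exponential(6.0 / risk)
      df["months"] = np.minimum(t, 5.0)              # 5-month observation window
      df["lame"] = (t < 5.0).astype(int)             # censored if still sound at 5 months

      cph = CoxPHFitter().fit(df, duration_col="months", event_col="lame")
      cph.print_summary()        # exp(coef) gives the hazard ratios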

  4. Opinion: The use of natural hazard modeling for decision making under uncertainty

    Science.gov (United States)

    David E. Calkin; Mike Mentis

    2015-01-01

    Decision making to mitigate the effects of natural hazards is a complex undertaking fraught with uncertainty. Models to describe risks associated with natural hazards have proliferated in recent years. Concurrently, there is a growing body of work focused on developing best practices for natural hazard modeling and to create structured evaluation criteria for complex...

  5. Macro-level vulnerable road users crash analysis: A Bayesian joint modeling approach of frequency and proportion.

    Science.gov (United States)

    Cai, Qing; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2017-10-01

    This study aims at contributing to the literature on pedestrian and bicyclist safety by building on conventional count regression models to explore exogenous factors affecting pedestrian and bicyclist crashes at the macroscopic level. In traditional count models, the effects of exogenous factors on non-motorist crashes were investigated directly. However, vulnerable road users' crashes are collisions between vehicles and non-motorists, so exogenous factors can affect non-motorist crashes through both the non-motorists and the vehicle drivers. To accommodate the potentially different impacts of exogenous factors, we express non-motorist crash counts as the product of total crash counts and the proportion of non-motorist crashes, and formulate a joint model combining a negative binomial (NB) model and a logit model to handle the two parts, respectively. The formulated joint model is estimated using non-motorist crash data based on the Traffic Analysis Districts (TADs) in Florida. Meanwhile, the traditional NB model is also estimated and compared with the joint model. The result indicates that the joint model provides better data fit and can identify more significant variables. Subsequently, a novel joint screening method is suggested based on the proposed model to identify hot zones for non-motorist crashes. The hot zones of non-motorist crashes are identified and divided into three types: hot zones with a more dangerous driving environment only, hot zones with more hazardous walking and cycling conditions only, and hot zones with both. It is expected that the joint model and screening method can help decision makers, transportation officials, and community planners make more efficient treatments to proactively improve pedestrian and bicyclist safety. Published by Elsevier Ltd.

  6. Hazard assessment and risk management of offshore production chemicals

    International Nuclear Information System (INIS)

    Schobben, H.P.M.; Scholten, M.C.T.; Vik, E.A.; Bakke, S.

    1994-01-01

    There is a clear need for harmonization of the regulations with regard to the use and discharge of drilling and production chemicals in the North Sea. Therefore, the CHARM (Chemical Hazard Assessment and Risk Management) model was developed. Both government (of several countries) and industry (E and P companies and chemical suppliers) participated in the project. The CHARM model has been discussed and accepted by OSPARCON. The CHARM model consists of several modules. The model starts with a prescreening on the basis of hazardous properties like persistency, accumulation potential and the appearance on black lists. The core of the model consists of modules for hazard assessment and risk analysis. Hazard assessment covers a general environmental evaluation of a chemical on the basis of intrinsic properties of that chemical. Risk analysis covers a more specific evaluation of the environmental impact from the use of a production chemical, or a combination of chemicals, under actual conditions. In the risk management module the user is guided to reduce the total risk of all chemicals used on a platform by the definition of measures in the most cost-effective way. The model calculates the environmental impact for the marine environment. Three parts are distinguished: pelagic, benthic and food chain. Both hazard assessment and risk analysis are based on a proportional comparison of an estimated PEC with an estimated NEC. The PEC is estimated from the use, release, dilution and fate of the chemical and the NEC is estimated from the available toxicity data of the chemicals.
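
    At its core, both hazard assessment and risk analysis in CHARM reduce to comparing a predicted environmental concentration (PEC) against a no-effect concentration (NEC). A toy version of that comparison is sketched below; the dilution-based PEC formula is a deliberate simplification for illustration, not CHARM's actual fate model.

    ```python
    def pec(discharge_conc_mg_l: float, dilution_factor: float) -> float:
        """Crude predicted environmental concentration: discharge diluted near the platform."""
        return discharge_conc_mg_l / dilution_factor

    def hazard_quotient(pec_val: float, nec_val: float) -> float:
        """PEC/NEC ratio; values above 1 indicate potential environmental concern."""
        return pec_val / nec_val

    # Example: 50 mg/L in the discharge, 1000-fold dilution, NEC of 0.02 mg/L
    print(hazard_quotient(pec(50.0, 1000.0), 0.02))  # -> 2.5
    ```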

  7. Race and Beta-Blocker Survival Benefit in Patients With Heart Failure: An Investigation of Self-Reported Race and Proportion of African Genetic Ancestry.

    Science.gov (United States)

    Luzum, Jasmine A; Peterson, Edward; Li, Jia; She, Ruicong; Gui, Hongsheng; Liu, Bin; Spertus, John A; Pinto, Yigal M; Williams, L Keoki; Sabbah, Hani N; Lanfear, David E

    2018-05-08

    It remains unclear whether beta-blockade is similarly effective in black patients with heart failure and reduced ejection fraction as in white patients, but self-reported race is a complex social construct with both biological and environmental components. The objective of this study was to compare the reduction in mortality associated with beta-blocker exposure in heart failure and reduced ejection fraction patients by both self-reported race and by proportion African genetic ancestry. Insured patients with heart failure and reduced ejection fraction (n=1122) were included in a prospective registry at Henry Ford Health System. This included 575 self-reported blacks (129 deaths, 22%) and 547 self-reported whites (126 deaths, 23%) followed for a median 3.0 years. Beta-blocker exposure (BBexp) was calculated from pharmacy claims, and the proportion of African genetic ancestry was determined from genome-wide array data. Time-dependent Cox proportional hazards regression was used to separately test the association of BBexp with all-cause mortality by self-reported race or by proportion of African genetic ancestry. Both sets of models were evaluated unadjusted and then adjusted for baseline risk factors and beta-blocker propensity score. BBexp effect estimates were protective and of similar magnitude both by self-reported race and by African genetic ancestry (adjusted hazard ratio = 0.56 in blacks and adjusted hazard ratio = 0.48 in whites). The tests for interactions with BBexp for both self-reported race and for African genetic ancestry were not statistically significant in any model (P > 0.1 for all). Among black and white patients with heart failure and reduced ejection fraction, reduction in all-cause mortality associated with BBexp was similar, regardless of self-reported race or proportion African genetic ancestry. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  8. Bayes estimation of the general hazard rate model

    International Nuclear Information System (INIS)

    Sarhan, A.

    1999-01-01

    In reliability theory and life testing models, the life time distributions are often specified by choosing a relevant hazard rate function. Here a general hazard rate function h(t) = a + bt^(c-1), where c, a, b are constants greater than zero, is considered. The parameter c is assumed to be known. The Bayes estimators of (a,b) based on the data of type II/item-censored testing without replacement are obtained. A large simulation study using the Monte Carlo method is done to compare the performance of Bayes with regression estimators of (a,b). The criterion for comparison is made based on the Bayes risk associated with the respective estimator. Also, the influence of the number of failed items on the accuracy of the estimators (Bayes and regression) is investigated. Estimations for the parameters (a,b) of the linearly increasing hazard rate model h(t) = a + bt, where a, b are greater than zero, can be obtained as the special case c = 2.
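
    Writing the model in display form makes the special case explicit. The survival quantities below follow directly from the stated hazard by standard reliability identities (they are not taken from the paper itself):

    ```latex
    h(t) = a + b\,t^{c-1}, \qquad
    H(t) = \int_0^t h(u)\,du = a\,t + \frac{b}{c}\,t^{c}, \qquad
    S(t) = e^{-H(t)} = \exp\!\Big(-a\,t - \frac{b}{c}\,t^{c}\Big).
    ```

    For c = 2 this reduces to the linearly increasing hazard h(t) = a + bt with S(t) = exp(-at - bt^2/2).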

  9. Modelling Multi Hazard Mapping in Semarang City Using GIS-Fuzzy Method

    Science.gov (United States)

    Nugraha, A. L.; Awaluddin, M.; Sasmito, B.

    2018-02-01

    One important aspect of disaster mitigation planning is hazard mapping. Hazard mapping can provide spatial information on the distribution of locations that are threatened by disaster. Semarang City, the capital of Central Java Province, is one of the cities with high natural disaster intensity. The natural disasters that frequently strike Semarang City are tidal floods, river floods, landslides, and droughts. Therefore, Semarang City needs spatial information from multi-hazard mapping to support disaster mitigation planning. Multi-hazard maps can be modelled from parameters such as slope, rainfall, land use, and soil type. This modelling is done using a GIS method with scoring and overlay techniques. However, the accuracy of the modelling is better when the GIS method is combined with fuzzy logic techniques, which provide a good classification for determining disaster threats. The GIS-Fuzzy method builds a multi-hazard map of Semarang City with good accuracy and an appropriate spread of threat classes, thereby providing disaster information for the city's mitigation planning. From the multi-hazard modelling using GIS-Fuzzy, the membership type with the best accuracy is the Gaussian membership, with the smallest RMSE (0.404) and the largest VAF value (72.909%) among the membership types tested.
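
    The Gaussian membership function and a gamma-style fuzzy overlay used in such GIS-fuzzy workflows are compact enough to state directly. The sketch below shows their standard forms (Bonham-Carter-style fuzzy overlay); the layer values, centres, spreads, and gamma value are placeholders, not the parameters calibrated in the study.

    ```python
    import numpy as np

    def gauss_mf(x, c, sigma):
        """Gaussian fuzzy membership centred at c with spread sigma."""
        return np.exp(-0.5 * ((x - c) / sigma) ** 2)

    def fuzzy_gamma(memberships, gamma=0.9):
        """Gamma operator: compromise between fuzzy algebraic product and sum."""
        m = np.stack(memberships)
        prod = np.prod(m, axis=0)                  # fuzzy algebraic product
        alg_sum = 1.0 - np.prod(1.0 - m, axis=0)   # fuzzy algebraic sum
        return prod ** (1.0 - gamma) * alg_sum ** gamma

    # Placeholder raster cells rescaled to [0, 1] hazard memberships
    slope = gauss_mf(np.array([5.0, 25.0, 40.0]), c=45.0, sigma=15.0)
    rain  = gauss_mf(np.array([80.0, 150.0, 220.0]), c=250.0, sigma=80.0)

    hazard = fuzzy_gamma([slope, rain], gamma=0.9)  # combined hazard membership
    ```

    A fuzzy-OR overlay, by contrast, is simply the elementwise maximum of the input memberships.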

  10. Analysis of time to event outcomes in randomized controlled trials by generalized additive models.

    Directory of Open Access Journals (Sweden)

    Christos Argyropoulos

    Full Text Available Randomized Controlled Trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazard model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking. By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAM) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks or even differences in restricted mean survival time between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO (a long-duration study conducted under evolving standards of care on a heterogeneous patient population). PGAM can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, PGAMs supported not only unadjusted (overall) treatment effect analyses but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial results under proportional and
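
    The core trick described here (splitting follow-up time and fitting a Poisson model for the log hazard with a flexible function of time plus a log person-time offset) can be imitated with ordinary spline Poisson regression. The sketch below uses statsmodels on synthetic pre-split data; it is an unpenalized stand-in for the paper's PGAM, and all column names and data are placeholders.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Assumed long format after time-splitting: one row per subject-interval with
    #   t     - interval midpoint, pt - person-time at risk in the interval,
    #   event - 0/1 event indicator, arm - treatment arm (0/1)
    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "t":   rng.uniform(0, 5, n),
        "pt":  rng.uniform(0.1, 1.0, n),
        "arm": rng.integers(0, 2, n),
    })
    df["event"] = rng.binomial(1, 0.1 * df["pt"])

    # Poisson model for the log hazard: spline in time + treatment effect
    fit = smf.glm("event ~ bs(t, df=4) + arm", data=df,
                  family=sm.families.Poisson(),
                  offset=np.log(df["pt"])).fit()

    print(np.exp(fit.params["arm"]))  # hazard ratio for treatment (constant here)
    ```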

  11. Global river flood hazard maps: hydraulic modelling methods and appropriate uses

    Science.gov (United States)

    Townend, Samuel; Smith, Helen; Molloy, James

    2014-05-01

    Flood hazard is not well understood or documented in many parts of the world. Consequently, the (re-)insurance sector now needs to better understand where the potential for considerable river flooding aligns with significant exposure. For example, international manufacturing companies are often attracted to countries with emerging economies, meaning that events such as the 2011 Thailand floods have resulted in many multinational businesses with assets in these regions incurring large, unexpected losses. This contribution addresses and critically evaluates the hydraulic methods employed to develop a consistent global-scale set of river flood hazard maps, used to fill the knowledge gap outlined above. The basis of the modelling approach is an innovative, bespoke 1D/2D hydraulic model (RFlow) which has been used to model a global river network of over 5.3 million kilometres. Estimated flood peaks at each of these model nodes are determined using an empirically based rainfall-runoff approach linking design rainfall to design river flood magnitudes. The hydraulic model is used to determine extents and depths of floodplain inundation following river bank overflow. From this, deterministic flood hazard maps are calculated for several design return periods between 20 and 1,500 years. Firstly, we will discuss the rationale behind the appropriate hydraulic modelling methods and inputs chosen to produce a consistent, global-scale river flood hazard map. This will highlight how a model designed to work with global datasets can be more favourable for hydraulic modelling at the global scale and why innovative techniques customised for broad-scale use are preferable to modifying existing hydraulic models. Similarly, the advantages and disadvantages of both 1D and 2D modelling will be explored and balanced against the time, computer and human resources available, particularly when using a Digital Surface Model at 30m resolution. Finally, we will suggest some

  12. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  13. Competing Risks Copula Models for Unemployment Duration

    DEFF Research Database (Denmark)

    Lo, Simon M. S.; Stephan, Gesine; Wilke, Ralf

    2017-01-01

    The copula graphic estimator (CGE) for competing risks models has received little attention in empirical research, despite having been developed into a comprehensive research method. In this paper, we bridge the gap between theoretical developments and applied research by considering a general...... class of competing risks copula models, which nests popular models such as the Cox proportional hazards model, the semiparametric multivariate mixed proportional hazards model (MMPHM), and the CGE as special cases. Analyzing the effects of a German Hartz reform on unemployment duration, we illustrate...

  14. Converging or Crossing Curves: Untie the Gordian Knot or Cut it? Appropriate Statistics for Non-Proportional Hazards in Decitabine DACO-016 Study (AML).

    Science.gov (United States)

    Tomeczkowski, Jörg; Lange, Ansgar; Güntert, Andreas; Thilakarathne, Pushpike; Diels, Joris; Xiu, Liang; De Porre, Peter; Tapprich, Christoph

    2015-09-01

    Among patients with acute myeloid leukemia (AML), the DACO-016 randomized study showed reduction in mortality for decitabine [Dacogen(®) (DAC), Eisai Inc., Woodcliff Lake, NJ, USA] compared with treatment choice (TC): at primary analysis the hazard ratio (HR) was 0.85 (95% confidence interval 0.69-1.04; stratified log-rank P = 0.108). With two interim analyses, two-sided alpha was adjusted to 0.0462. With 1-year additional follow-up the HR reached 0.82 (nominal P = 0.0373). These data resulted in approval of DAC in the European Union, though not in the United States. Though pre-specified, the log-rank test could be considered suboptimal for assessing the observed survival difference because of the non-proportional-hazards nature of the survival curves. We applied the Wilcoxon test as a sensitivity analysis. Patients were randomized to DAC (N = 242) or TC (N = 243). One-hundred and eight (44.4%) patients in the TC arm and 91 (37.6%) patients in the DAC arm selectively crossed over to subsequent disease-modifying therapies at progression, which might impact the survival beyond the median, with resultant converging curves (and disproportional hazards). The stratified Wilcoxon test showed a significant improvement in median (CI 95%) overall survival with DAC [7.7 (6.2; 9.2) months] versus TC [5.0 (4.3; 6.3) months; P = 0.0458]. The Wilcoxon test thus indicated a significant increase in survival for DAC versus TC, in contrast to the log-rank test. Janssen-Cilag GmbH.
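
    For survival curves that separate early and converge later, an early-weighted test can be run directly in recent versions of lifelines alongside the standard log-rank test. The snippet below is a generic illustration with placeholder arrays, not a reanalysis of the DACO-016 data.

    ```python
    from lifelines.statistics import logrank_test

    # Durations and event indicators for the two arms (placeholder data)
    T_dac, E_dac = [7.7, 9.1, 4.2, 12.0, 6.5], [1, 1, 1, 0, 1]
    T_tc,  E_tc  = [5.0, 4.3, 6.3, 3.1, 8.0], [1, 1, 0, 1, 1]

    # Standard (unweighted) log-rank test
    lr = logrank_test(T_dac, T_tc, event_observed_A=E_dac, event_observed_B=E_tc)

    # Wilcoxon-type weighting emphasises early differences between the curves
    wx = logrank_test(T_dac, T_tc, event_observed_A=E_dac, event_observed_B=E_tc,
                      weightings="wilcoxon")

    print(lr.p_value, wx.p_value)
    ```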

  15. Bayesian Predictive Inference of a Proportion Under a Twofold Small-Area Model

    Directory of Open Access Journals (Sweden)

    Nandram Balgobin

    2016-03-01

    Full Text Available We extend the twofold small-area model of Stukel and Rao (1997; 1999) to accommodate binary data. An example is the Third International Mathematics and Science Study (TIMSS), in which pass-fail data for mathematics of students from US schools (clusters) are available at the third grade by regions and communities (small areas). We compare the finite population proportions of these small areas. We present a hierarchical Bayesian model in which the first-stage binary responses have independent Bernoulli distributions, and each subsequent stage is modeled using a beta distribution, which is parameterized by its mean and a correlation coefficient. This twofold small-area model has an intracluster correlation at the first stage and an intercluster correlation at the second stage. The final-stage mean and all correlations are assumed to be noninformative independent random variables. We show how to infer the finite population proportion of each area. We have applied our models to synthetic TIMSS data to show that the twofold model is preferred over a onefold small-area model that ignores the clustering within areas. We further compare these models using a simulation study, which shows that the intracluster correlation is particularly important.
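
    One way to read the mean-correlation parameterization used at each beta stage: a Beta(α, β) distribution with mean μ and intraclass correlation ρ can be written with α = μ(1-ρ)/ρ and β = (1-μ)(1-ρ)/ρ. A schematic of the resulting hierarchy follows (the notation is mine, not the authors'):

    ```latex
    y_{ijk} \mid p_{ij} \sim \text{Bernoulli}(p_{ij}), \qquad
    p_{ij} \mid \mu_i \sim \text{Beta}\!\Big(\mu_i \tfrac{1-\rho_1}{\rho_1},\; (1-\mu_i)\tfrac{1-\rho_1}{\rho_1}\Big), \qquad
    \mu_i \mid \theta \sim \text{Beta}\!\Big(\theta \tfrac{1-\rho_2}{\rho_2},\; (1-\theta)\tfrac{1-\rho_2}{\rho_2}\Big),
    ```

    where i indexes small areas, j clusters (schools), and k students, and ρ1 and ρ2 play the roles of the intracluster and intercluster correlations.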

  16. Religiousness and hazardous alcohol use: a conditional indirect effects model.

    Science.gov (United States)

    Jankowski, Peter J; Hardy, Sam A; Zamboanga, Byron L; Ham, Lindsay S

    2013-08-01

    The current study examined a conditional indirect effects model of the association between religiousness and adolescents' hazardous alcohol use. In doing so, we responded to the need to include both mediators and moderators, and the need for theoretically informed models when examining religiousness and adolescents' alcohol use. The sample consisted of 383 adolescents, aged 15-18, who completed an online questionnaire. Results of structural equation modeling supported the proposed model. Religiousness was indirectly associated with hazardous alcohol use through both positive alcohol expectancy outcomes and negative alcohol expectancy valuations. Significant moderating effects for alcohol expectancy valuations on the association between alcohol expectancies and alcohol use were also found. The effects for alcohol expectancy valuations confirm valuations as a distinct construct to that of alcohol expectancy outcomes, and offer support for the protective role of internalized religiousness on adolescents' hazardous alcohol use as a function of expectancy valuations. Copyright © 2013 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  17. Rockfall hazard analysis using LiDAR and spatial modeling

    Science.gov (United States)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.

  18. A nonparametric mixture model for cure rate estimation.

    Science.gov (United States)

    Peng, Y; Dear, K B

    2000-03-01

    Nonparametric methods have attracted less attention than their parametric counterparts for cure rate analysis. In this paper, we study a general nonparametric mixture model. The proportional hazards assumption is employed in modeling the effect of covariates on the failure time of patients who are not cured. The EM algorithm, the marginal likelihood approach, and multiple imputations are employed to estimate parameters of interest in the model. This model extends models and improves estimation methods proposed by other researchers. It also extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion. The model and its estimation method are investigated by simulations. An application to breast cancer data, including comparisons with previous analyses using a parametric model and an existing nonparametric model by other researchers, confirms the conclusions from the parametric model but not those from the existing nonparametric model.
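
    The population survival function implied by such a mixture cure model has a standard form; with a logistic link for the cure fraction and proportional hazards for the uncured, it reads (standard formulation, not copied from the paper):

    ```latex
    S_{\text{pop}}(t \mid x, z) = \pi(z) + \{1 - \pi(z)\}\, S_u(t \mid x), \qquad
    \pi(z) = \frac{\exp(\gamma' z)}{1 + \exp(\gamma' z)}, \qquad
    S_u(t \mid x) = S_0(t)^{\exp(\beta' x)},
    ```

    where π(z) is the cured (event-free) proportion and S_0 is the baseline survival of the uncured patients, left unspecified in the nonparametric approach.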

  19. Modeling of Marine Natural Hazards in the Lesser Antilles

    Science.gov (United States)

    Zahibo, Narcisse; Nikolkina, Irina; Pelinovsky, Efim

    2010-05-01

    The Caribbean Sea countries are often affected by various marine natural hazards: hurricanes and cyclones, tsunamis and flooding. The historical data of marine natural hazards for the Lesser Antilles and specially, for Guadeloupe are presented briefly. Numerical simulation of several historical tsunamis in the Caribbean Sea (1755 Lisbon trans-Atlantic tsunami, 1867 Virgin Island earthquake tsunami, 2003 Montserrat volcano tsunami) are performed within the framework of the nonlinear-shallow theory. Numerical results demonstrate the importance of the real bathymetry variability with respect to the direction of propagation of tsunami wave and its characteristics. The prognostic tsunami wave height distribution along the Caribbean Coast is computed using various forms of seismic and hydrodynamics sources. These results are used to estimate the far-field potential for tsunami hazards at coastal locations in the Caribbean Sea. The nonlinear shallow-water theory is also applied to model storm surges induced by tropical cyclones, in particular, cyclones "Lilli" in 2002 and "Dean" in 2007. Obtained results are compared with observed data. The numerical models have been tested against known analytical solutions of the nonlinear shallow-water wave equations. Obtained results are described in details in [1-7]. References [1] N. Zahibo and E. Pelinovsky, Natural Hazards and Earth System Sciences, 1, 221 (2001). [2] N. Zahibo, E. Pelinovsky, A. Yalciner, A. Kurkin, A. Koselkov and A. Zaitsev, Oceanologica Acta, 26, 609 (2003). [3] N. Zahibo, E. Pelinovsky, A. Kurkin and A. Kozelkov, Science Tsunami Hazards. 21, 202 (2003). [4] E. Pelinovsky, N. Zahibo, P. Dunkley, M. Edmonds, R. Herd, T. Talipova, A. Kozelkov and I. Nikolkina, Science of Tsunami Hazards, 22, 44 (2004). [5] N. Zahibo, E. Pelinovsky, E. Okal, A. Yalciner, C. Kharif, T. Talipova and A. Kozelkov, Science of Tsunami Hazards, 23, 25 (2005). [6] N. Zahibo, E. Pelinovsky, T. Talipova, A. Rabinovich, A. Kurkin and I

  20. Decision-Tree Models of Categorization Response Times, Choice Proportions, and Typicality Judgments

    Science.gov (United States)

    Lafond, Daniel; Lacouture, Yves; Cohen, Andrew L.

    2009-01-01

    The authors present 3 decision-tree models of categorization adapted from T. Trabasso, H. Rollins, and E. Shaughnessy (1971) and use them to provide a quantitative account of categorization response times, choice proportions, and typicality judgments at the individual-participant level. In Experiment 1, the decision-tree models were fit to…

  1. Geospatial subsidence hazard modelling at Sterkfontein Caves ...

    African Journals Online (AJOL)

    The geo-hazard subsidence model includes historic subsidence occurrences, terrain (water flow) and water accumulation. Water accumulating on the surface will percolate and reduce the strength of the soil mass, possibly inducing subsidence. Areas for further geotechnical investigation are identified, demonstrating that a ...

  2. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    Science.gov (United States)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently on-going initiatives like the creation of a suite of tools for the creation of PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.

  3. Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    In natural hazard risk assessment, a resolution mismatch between hazard data and aggregated exposure data is often observed. A possible solution to this issue is the disaggregation of exposure data to match the spatial resolution of hazard data. Disaggregation models available in literature...... disaggregation model that considers the uncertainty in the disaggregation, taking basis in the scaled Dirichlet distribution. The proposed probabilistic disaggregation model is applied to a portfolio of residential buildings in the Canton Bern, Switzerland, subject to flood risk. Thereby, the model is verified...... are usually deterministic and make use of auxiliary indicator, such as land cover, to spatially distribute exposures. As the dependence between auxiliary indicator and disaggregated number of exposures is generally imperfect, uncertainty arises in disaggregation. This paper therefore proposes a probabilistic...

  4. A conflict model for the international hazardous waste disposal dispute

    International Nuclear Information System (INIS)

    Hu Kaixian; Hipel, Keith W.; Fang, Liping

    2009-01-01

    A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.

  5. A conflict model for the international hazardous waste disposal dispute.

    Science.gov (United States)

    Hu, Kaixian; Hipel, Keith W; Fang, Liping

    2009-12-15

    A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.

  6. Multivariate Models for Prediction of Human Skin Sensitization Hazard

    Science.gov (United States)

    Strickland, Judy; Zang, Qingda; Paris, Michael; Lehmann, David M.; Allen, David; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Casey, Warren; Kleinstreuer, Nicole

    2016-01-01

    One of ICCVAM’s top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays—the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT), and KeratinoSens™ assay—six physicochemical properties, and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression (LR) and support vector machine (SVM), to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three LR and three SVM) with the highest accuracy (92%) used: (1) DPRA, h-CLAT, and read-across; (2) DPRA, h-CLAT, read-across, and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens, and log P. The models performed better at predicting human skin sensitization hazard than the murine local lymph node assay (accuracy = 88%), any of the alternative methods alone (accuracy = 63–79%), or test batteries combining data from the individual methods (accuracy = 75%). These results suggest that computational methods are promising tools to effectively identify potential human skin sensitizers without animal testing. PMID:27480324
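
    The two machine learning approaches named here are available off the shelf. The sketch below shows the general pattern used in such studies (train on one set of substances, evaluate on an external held-out set), with random placeholder features standing in for the DPRA, h-CLAT, KeratinoSens, read-across, and physicochemical variables.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(42)
    X_train, y_train = rng.normal(size=(72, 5)), rng.integers(0, 2, 72)  # 72 substances
    X_test,  y_test  = rng.normal(size=(24, 5)), rng.integers(0, 2, 24)  # external set

    for name, clf in [("LR",  LogisticRegression(max_iter=1000)),
                      ("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf")))]:
        clf.fit(X_train, y_train)
        print(name, accuracy_score(y_test, clf.predict(X_test)))
    ```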

  7. Phylogenetic tree reconstruction accuracy and model fit when proportions of variable sites change across the tree.

    Science.gov (United States)

    Shavit Grievink, Liat; Penny, David; Hendy, Michael D; Holland, Barbara R

    2010-05-01

    Commonly used phylogenetic models assume a homogeneous process through time in all parts of the tree. However, it is known that these models can be too simplistic as they do not account for nonhomogeneous lineage-specific properties. In particular, it is now widely recognized that as constraints on sequences evolve, the proportion and positions of variable sites can vary between lineages causing heterotachy. The extent to which this model misspecification affects tree reconstruction is still unknown. Here, we evaluate the effect of changes in the proportions and positions of variable sites on model fit and tree estimation. We consider 5 current models of nucleotide sequence evolution in a Bayesian Markov chain Monte Carlo framework as well as maximum parsimony (MP). We show that for a tree with 4 lineages where 2 nonsister taxa undergo a change in the proportion of variable sites tree reconstruction under the best-fitting model, which is chosen using a relative test, often results in the wrong tree. In this case, we found that an absolute test of model fit is a better predictor of tree estimation accuracy. We also found further evidence that MP is not immune to heterotachy. In addition, we show that increased sampling of taxa that have undergone a change in proportion and positions of variable sites is critical for accurate tree reconstruction.

  8. Time-predictable model application in probabilistic seismic hazard analysis of faults in Taiwan

    Directory of Open Access Journals (Sweden)

    Yu-Wen Chang

    2017-01-01

    Full Text Available Given the probability distribution function relating the recurrence interval to the occurrence time of the previous event on a fault, a time-dependent model of a particular fault for seismic hazard assessment was developed that takes into account the cyclic rupture characteristics of the active fault over a lifetime extending up to the present time. The Gutenberg and Richter (1944) exponential frequency-magnitude relation is used to describe the earthquake recurrence rate for a regional source. It serves as a reference for developing a composite procedure that models the occurrence rate of large earthquakes on a fault when activity information is scarce. The time-dependent model was used to describe the characteristic behavior of each fault. The seismic hazard contributions from all sources, including both time-dependent and time-independent models, were then added together to obtain the annual total lifetime hazard curves. The effects of time-dependent and time-independent fault models [e.g., Brownian passage time (BPT) and Poisson, respectively] on the hazard calculations are also discussed. The results for the proposed fault model show that the seismic demands of near-fault areas are lower than the current hazard estimates when the time-dependent model is used on those faults, particularly for faults where the elapsed time since the last event (such as on the Chelungpu fault) is short.
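
    For reference, the Brownian passage time (inverse Gaussian) renewal model commonly used in such time-dependent hazard assessments has the density below (standard form, with mean recurrence interval μ and aperiodicity α; not reproduced from the paper):

    ```latex
    f(t \mid \mu, \alpha) = \sqrt{\frac{\mu}{2\pi\,\alpha^{2}\,t^{3}}}\;
    \exp\!\left\{-\frac{(t-\mu)^{2}}{2\,\alpha^{2}\,\mu\,t}\right\},
    ```

    and the conditional probability of rupture in the next ΔT years, given elapsed time T_e since the last event, is P = [F(T_e + ΔT) - F(T_e)] / [1 - F(T_e)], where F is the corresponding cumulative distribution function.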

  9. COMPARISON of FUZZY-BASED MODELS in LANDSLIDE HAZARD MAPPING

    Directory of Open Access Journals (Sweden)

    N. Mijani

    2017-09-01

    Full Text Available Landslide is one of the main geomorphic processes affecting development prospects in mountainous areas and causing disastrous accidents. Landslides depend on several uncertain criteria such as altitude, slope, aspect, land use, vegetation density, precipitation, distance from the river, and distance from the road network. This research aims to compare and evaluate different fuzzy-based models, including the Fuzzy Analytic Hierarchy Process (Fuzzy-AHP), Fuzzy Gamma, and Fuzzy-OR. The main contribution of this paper is to identify the comprehensive criteria causing landslide hazard, considering their uncertainties, and to compare different fuzzy-based models. The evaluation is quantified using the Density Ratio (DR) and Quality Sum (QS). The proposed methodology was implemented in Sari, a city in Iran that has faced multiple landslide accidents in recent years due to its particular environmental conditions. The accuracy assessment demonstrated that the Fuzzy-AHP model has higher accuracy than the other two models for landslide hazard zonation. The accuracy of the zoning obtained from the Fuzzy-AHP model is 0.92 and 0.45 based on the Precision (P) and QS indicators, respectively. Based on the obtained landslide hazard maps, Fuzzy-AHP, Fuzzy Gamma, and Fuzzy-OR cover 13, 26, and 35 percent of the study area, respectively, with a very high risk level. Based on these findings, the Fuzzy-AHP model was selected as the most appropriate method for landslide zoning in the city of Sari, with the Fuzzy Gamma method ranking second by a small margin.

  10. Analyzing Right-Censored Length-Biased Data with Additive Hazards Model

    Institute of Scientific and Technical Information of China (English)

    Mu ZHAO; Cun-jie LIN; Yong ZHOU

    2017-01-01

    Length-biased data are often encountered in observational studies, when the survival times are left-truncated and right-censored and the truncation times follow a uniform distribution. In this article, we propose to analyze such data with the additive hazards model, which specifies that the hazard function is the sum of an arbitrary baseline hazard function and a regression function of covariates. We develop estimating equation approaches to estimate the regression parameters. The resultant estimators are shown to be consistent and asymptotically normal. Some simulation studies and a real data example are used to evaluate the finite sample properties of the proposed estimators.
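
    The contrast with the Cox model is easiest to see side by side (standard notation, not the paper's):

    ```latex
    \text{additive hazards:}\quad \lambda(t \mid Z) = \lambda_0(t) + \beta' Z(t),
    \qquad
    \text{Cox PH:}\quad \lambda(t \mid Z) = \lambda_0(t)\,\exp\{\beta' Z(t)\},
    ```

    so covariates shift the hazard by an absolute amount in the former and scale it multiplicatively in the latter.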

  11. Modelling human interactions in the assessment of man-made hazards

    International Nuclear Information System (INIS)

    Nitoi, M.; Farcasiu, M.; Apostol, M.

    2016-01-01

    The human reliability assessment tools are not currently capable of adequately modelling the human ability to adapt, to innovate and to manage under extreme situations. The paper presents the results obtained by the ICN PSA team in the frame of the FP7 Advanced Safety Assessment Methodologies: extended PSA (ASAMPSA_E) project regarding the investigation of conducting HRA for man-made hazards. The paper proposes a four-step methodology for the assessment of human interactions in external events (definition and modelling of human interactions; quantification of human failure events; recovery analysis; review). The most relevant factors with respect to HRA for man-made hazards (response execution complexity; existence of procedures with respect to the scenario in question; time available for action; timing of cues; accessibility of equipment; harsh environmental conditions) are presented and discussed thoroughly. The challenges identified in relation to man-made hazards HRA are highlighted. (authors)

  12. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    International Nuclear Information System (INIS)

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on models that must be used in these efforts. To identify what models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study

  13. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on models that must be used in these efforts. To identify what models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  14. Flood hazard mapping of Palembang City by using 2D model

    Science.gov (United States)

    Farid, Mohammad; Marlina, Ayu; Kusuma, Muhammad Syahril Badri

    2017-11-01

    Palembang, the capital city of South Sumatera Province, is one of the metropolitan cities in Indonesia that is flooded almost every year. Flooding in the city is closely related to the Musi River Basin. According to the Indonesia National Agency of Disaster Management (BNPB), the level of flood hazard is high. Many natural factors cause flooding in the city, such as high-intensity rainfall, inadequate drainage capacity, and backwater flow due to spring tide. Furthermore, anthropogenic factors such as population increase, land cover/use change, and garbage problems make the flood problem worse. The objective of this study is to develop a flood hazard map of Palembang City using a two-dimensional model. HEC-RAS 5.0 is used as the modelling tool and is verified with field observation data. There are 21 sub-catchments of the Musi River Basin in the flood simulation. The level of flood hazard refers to Head Regulation of BNPB number 2 of 2012 regarding the general guideline of disaster risk assessment. The result for the 25-year return period flood shows that, with 112.47 km2 of inundated area, 14 sub-catchments are categorized at a high hazard level. It is expected that the hazard map can be used for risk assessment.

  15. A clinical trial design using the concept of proportional time using the generalized gamma ratio distribution.

    Science.gov (United States)

    Phadnis, Milind A; Wetmore, James B; Mayo, Matthew S

    2017-11-20

    Traditional methods of sample size and power calculations in clinical trials with a time-to-event end point are based on the logrank test (and its variations), Cox proportional hazards (PH) assumption, or comparison of means of 2 exponential distributions. Of these, sample size calculation based on PH assumption is likely the most common and allows adjusting for the effect of one or more covariates. However, when designing a trial, there are situations when the assumption of PH may not be appropriate. Additionally, when it is known that there is a rapid decline in the survival curve for a control group, such as from previously conducted observational studies, a design based on the PH assumption may confer only a minor statistical improvement for the treatment group that is neither clinically nor practically meaningful. For such scenarios, a clinical trial design that focuses on improvement in patient longevity is proposed, based on the concept of proportional time using the generalized gamma ratio distribution. Simulations are conducted to evaluate the performance of the proportional time method and to identify the situations in which such a design will be beneficial as compared to the standard design using a PH assumption, piecewise exponential hazards assumption, and specific cases of a cure rate model. A practical example in which hemorrhagic stroke patients are randomized to 1 of 2 arms in a putative clinical trial demonstrates the usefulness of this approach by drastically reducing the number of patients needed for study enrollment. Copyright © 2017 John Wiley & Sons, Ltd.
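
    The proportional time idea can be stated compactly: instead of scaling hazards, the treatment scales event times themselves, so that survival quantiles are stretched by a constant factor (schematic notation, not the authors' exact formulation):

    ```latex
    S_1(t) = S_0(t/\lambda) \quad \Longleftrightarrow \quad T_1 \stackrel{d}{=} \lambda\, T_0,
    ```

    with λ > 1 indicating a proportional gain in longevity for the treatment arm; the generalized gamma family is what makes the distribution of the ratio of such times tractable in this design.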

  16. A Partial Proportional Odds Model for Pedestrian Crashes at Mid-Blocks in Melbourne Metropolitan Area

    Directory of Open Access Journals (Sweden)

    Toran Pour Alireza

    2016-01-01

    Full Text Available Pedestrian crashes accounted for 11% of all reported traffic crashes in the Melbourne metropolitan area between 2004 and 2013. There are very limited studies on pedestrian accidents at mid-blocks. Mid-block crashes account for about 46% of the total pedestrian crashes in the Melbourne metropolitan area. Meanwhile, about 50% of all pedestrian fatalities occur at mid-blocks. In this research, a Partial Proportional Odds (PPO) model is applied to examine vehicle-pedestrian crash severity at mid-blocks in the Melbourne metropolitan area. The PPO model is a logistic regression model that allows covariates meeting the proportional odds assumption to affect different crash severity levels with the same magnitude, whereas covariates that do not meet the assumption can have different effects on different severity levels. In this research, vehicle-pedestrian crashes at mid-blocks are analysed for the first time. In addition, factors such as the distance of crashes to public transport stops, average road slope, and some social characteristics are considered in the model for the first time. Results of the PPO model show that speed limit, light condition, pedestrian age and gender, and vehicle type are the most significant factors influencing vehicle-pedestrian crash severity at mid-blocks.
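
    In gologit-style notation, the partial proportional odds model can be sketched as follows (generic form; the split of covariates into the two groups is illustrative):

    ```latex
    P(Y > j \mid x_1, x_2) =
    \frac{\exp(\alpha_j + x_1'\beta + x_2'\gamma_j)}{1 + \exp(\alpha_j + x_1'\beta + x_2'\gamma_j)},
    \qquad j = 1, \dots, J-1,
    ```

    where x_1 collects covariates satisfying the proportional odds assumption (a single β across severity levels) and x_2 those allowed severity-specific effects γ_j.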

  17. Bayesian inference method for stochastic damage accumulation modeling

    International Nuclear Information System (INIS)

    Jiang, Xiaomo; Yuan, Yong; Liu, Xian

    2013-01-01

    Damage accumulation based reliability model plays an increasingly important role in successful realization of condition based maintenance for complicated engineering systems. This paper developed a Bayesian framework to establish stochastic damage accumulation model from historical inspection data, considering data uncertainty. Proportional hazards modeling technique is developed to model the nonlinear effect of multiple influencing factors on system reliability. Different from other hazard modeling techniques such as normal linear regression model, the approach does not require any distribution assumption for the hazard model, and can be applied for a wide variety of distribution models. A Bayesian network is created to represent the nonlinear proportional hazards models and to estimate model parameters by Bayesian inference with Markov Chain Monte Carlo simulation. Both qualitative and quantitative approaches are developed to assess the validity of the established damage accumulation model. Anderson–Darling goodness-of-fit test is employed to perform the normality test, and Box–Cox transformation approach is utilized to convert the non-normality data into normal distribution for hypothesis testing in quantitative model validation. The methodology is illustrated with the seepage data collected from real-world subway tunnels.

  18. Traffic Incident Clearance Time and Arrival Time Prediction Based on Hazard Models

    Directory of Open Access Journals (Sweden)

    Yang beibei Ji

    2014-01-01

    Full Text Available Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data were obtained from the Queensland Department of Transport and Main Roads' STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best-fitting distributions are derived for both clearance and arrival time for three types of incident: crash, stationary vehicle, and hazard. The results show that the Gamma, Log-logistic, and Weibull distributions are the best fits for crash, stationary vehicle, and hazard incidents, respectively. The significant impact factors are given for crash clearance time and arrival time. The quantitative influences for crash and hazard incidents are presented for both clearance and arrival. The model accuracy is analyzed at the end.
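
    Selecting a best-fitting duration distribution of this kind is typically done by comparing information criteria across candidate parametric fits. The generic lifelines sketch below uses placeholder clearance times, not the SIMS data.

    ```python
    from lifelines import WeibullFitter, LogLogisticFitter, LogNormalFitter

    durations = [12, 25, 31, 44, 58, 63, 75, 90, 120, 150]  # minutes, placeholder
    observed  = [1, 1, 1, 1, 0, 1, 1, 1, 0, 1]              # 0 = right-censored

    for fitter in (WeibullFitter(), LogLogisticFitter(), LogNormalFitter()):
        fitter.fit(durations, event_observed=observed)
        print(type(fitter).__name__, round(fitter.AIC_, 1))  # smaller AIC = better fit
    ```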

  19. Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios

    OpenAIRE

    Custer, Rocco; Nishijima, Kazuyoshi

    2012-01-01

    In natural hazard risk assessment, a resolution mismatch between hazard data and aggregated exposure data is often observed. A possible solution to this issue is the disaggregation of exposure data to match the spatial resolution of hazard data. Disaggregation models available in literature are usually deterministic and make use of auxiliary indicator, such as land cover, to spatially distribute exposures. As the dependence between auxiliary indicator and disaggregated number of exposures is ...

  20. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.

  1. Modelling multi-hazard hurricane damages on an urbanized coast with a Bayesian Network approach

    Science.gov (United States)

    van Verseveld, H.C.W.; Van Dongeren, A. R.; Plant, Nathaniel G.; Jäger, W.S.; den Heijer, C.

    2015-01-01

    Hurricane flood impacts to residential buildings in coastal zones are caused by a number of hazards, such as inundation, overflow currents, erosion, and wave attack. However, traditional hurricane damage models typically make use of stage-damage functions, where the stage is related to flooding depth only. Moreover, these models are deterministic and do not consider the large amount of uncertainty associated with both the processes themselves and with the predictions. This uncertainty becomes increasingly important when multiple hazards (flooding, wave attack, erosion, etc.) are considered simultaneously. This paper focusses on establishing relationships between observed damage and multiple hazard indicators in order to make better probabilistic predictions. The concept consists of (1) determining Local Hazard Indicators (LHIs) from a hindcasted storm with use of a nearshore morphodynamic model, XBeach, and (2) coupling these LHIs and building characteristics to the observed damages. We chose a Bayesian Network approach in order to make this coupling and used the LHIs ‘Inundation depth’, ‘Flow velocity’, ‘Wave attack’, and ‘Scour depth’ to represent flooding, current, wave impacts, and erosion-related hazards. The coupled hazard model was tested against four thousand damage observations from a case site at the Rockaway Peninsula, NY, that was impacted by Hurricane Sandy in late October, 2012. The model was able to accurately distinguish ‘Minor damage’ from all other outcomes 95% of the time and could distinguish areas that were affected by the storm, but not severely damaged, 68% of the time. For the most heavily damaged buildings (‘Major Damage’ and ‘Destroyed’), projections of the expected damage underestimated the observed damage. The model demonstrated that including multiple hazards doubled the prediction skill, with Log-Likelihood Ratio test (a measure of improved accuracy and reduction in uncertainty) scores between 0.02 and 0

  2. Impossibility Theorem in Proportional Representation Problem

    International Nuclear Information System (INIS)

    Karpov, Alexander

    2010-01-01

    The study examines the general axiomatics of Balinski and Young and analyzes existing proportional representation methods using this approach. The second part of the paper provides new axiomatics based on rational choice models. The new system of axioms is applied to study known proportional representation systems. It is shown that there is no proportional representation method satisfying a minimal set of the axioms (monotonicity and neutrality).

  3. Ground motion models used in the 2014 U.S. National Seismic Hazard Maps

    Science.gov (United States)

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.

    2015-01-01

    The National Seismic Hazard Maps (NSHMs) are an important component of seismic design regulations in the United States. This paper compares hazard using the new suite of ground motion models (GMMs) relative to hazard using the suite of GMMs applied in the previous version of the maps. The new source characterization models are used for both cases. A previous paper (Rezaeian et al. 2014) discussed the five NGA-West2 GMMs used for shallow crustal earthquakes in the Western United States (WUS), which are also summarized here. Our focus in this paper is on GMMs for earthquakes in stable continental regions in the Central and Eastern United States (CEUS), as well as subduction interface and deep intraslab earthquakes. We consider building code hazard levels for peak ground acceleration (PGA), 0.2-s, and 1.0-s spectral accelerations (SAs) on uniform firm-rock site conditions. The GMM modifications in the updated version of the maps created changes in hazard within 5% to 20% in WUS; decreases within 5% to 20% in CEUS; changes within 5% to 15% for subduction interface earthquakes; and changes involving decreases of up to 50% and increases of up to 30% for deep intraslab earthquakes for most U.S. sites. These modifications were combined with changes resulting from modifications in the source characterization models to obtain the new hazard maps.

  4. Time-aggregation effects on the baseline of continuous-time and discrete-time hazard models

    NARCIS (Netherlands)

    ter Hofstede, F.; Wedel, M.

    In this study we reinvestigate the effect of time-aggregation for discrete- and continuous-time hazard models. We reanalyze the results of a previous Monte Carlo study by ter Hofstede and Wedel (1998), in which the effects of time-aggregation on the parameter estimates of hazard models were

  5. Efficient pan-European river flood hazard modelling through a combination of statistical and physical models

    NARCIS (Netherlands)

    Paprotny, D.; Morales Napoles, O.; Jonkman, S.N.

    2017-01-01

    Flood hazard is currently being researched on continental and global scales, using models of increasing complexity. In this paper we investigate a different, simplified approach, which combines statistical and physical models in place of conventional rainfall-run-off models to carry out flood

  6. Taxonomic analysis of perceived risk: modeling individual and group perceptions within homogeneous hazard domains

    International Nuclear Information System (INIS)

    Kraus, N.N.; Slovic, P.

    1988-01-01

    Previous studies of risk perception have typically focused on the mean judgments of a group of people regarding the riskiness (or safety) of a diverse set of hazardous activities, substances, and technologies. This paper reports the results of two studies that take a different path. Study 1 investigated whether models within a single technological domain were similar to previous models based on group means and diverse hazards. Study 2 created a group taxonomy of perceived risk for only one technological domain, railroads, and examined whether the structure of that taxonomy corresponded with taxonomies derived from prior studies of diverse hazards. Results from Study 1 indicated that the importance of various risk characteristics in determining perceived risk differed across individuals and across hazards, but not so much as to invalidate the results of earlier studies based on group means and diverse hazards. In Study 2, the detailed analysis of railroad hazards produced a structure that had both important similarities to, and dissimilarities from, the structure obtained in prior research with diverse hazard domains. The data also indicated that railroad hazards are really quite diverse, with some approaching nuclear reactors in their perceived seriousness. These results suggest that information about the diversity of perceptions within a single domain of hazards could provide valuable input to risk-management decisions

  7. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest.

    Science.gov (United States)

    Suh, Jangwon; Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon

    2017-11-27

    In this study, current geographic information system (GIS)-based methods and their application for the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. The various types of mining-induced hazard were classified into two or three subtopics according to the steps involved in the reclamation procedure, or elements of the hazard of interest. Because GIS is well suited to handling geospatial data in relation to mining-induced hazards, the application and feasibility of GIS-based modeling and assessment of mining-induced hazards within the mining industry could be expanded further.

  8. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest

    Science.gov (United States)

    Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon

    2017-01-01

    In this study, current geographic information system (GIS)-based methods and their application for the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. The various types of mining-induced hazard were classified into two or three subtopics according to the steps involved in the reclamation procedure, or elements of the hazard of interest. Because GIS is well suited to handling geospatial data in relation to mining-induced hazards, the application and feasibility of GIS-based modeling and assessment of mining-induced hazards within the mining industry could be expanded further. PMID:29186922

  9. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest

    Directory of Open Access Journals (Sweden)

    Jangwon Suh

    2017-11-01

    Full Text Available In this study, current geographic information system (GIS)-based methods and their application for the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. The various types of mining-induced hazard were classified into two or three subtopics according to the steps involved in the reclamation procedure, or elements of the hazard of interest. Because GIS is well suited to handling geospatial data in relation to mining-induced hazards, the application and feasibility of GIS-based modeling and assessment of mining-induced hazards within the mining industry could be expanded further.

  10. Complete hazard ranking to analyze right-censored data: An ALS survival study.

    Science.gov (United States)

    Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang

    2017-12-01

    Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners, including Gaussian process regression, Lasso, and random forest, on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset in comparison to the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.
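
    The transformation idea can be sketched compactly. The snippet below is a simplified stand-in, not the authors' exact GuanRank algorithm: each subject receives a hazard rank in [0, 1] built from pairwise survival comparisons, and an off-the-shelf regressor is then trained on the ranks. All data and variable names are invented for illustration.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def hazard_ranks(time, event):
        """Rank subjects by how many others are known to have failed earlier.

        Censored pairs that cannot be ordered are simply ignored in this
        simplified version; the published method treats them more carefully.
        """
        n = len(time)
        score = np.zeros(n)
        for i in range(n):
            # subjects that definitely failed before subject i was observed
            score[i] = np.sum((time < time[i]) & (event == 1))
        return score / max(score.max(), 1.0)   # normalize to [0, 1]

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                # covariates
    t = rng.exponential(scale=np.exp(X[:, 0]))   # survival times
    e = rng.integers(0, 2, size=200)             # 1 = event, 0 = censored

    y = hazard_ranks(t, e)                       # linear space of hazard ranks
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    print(model.predict(X[:5]))                  # predicted hazard ranks
    ```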

  11. Cognitive and Metacognitive Aspects of Proportional Reasoning

    Science.gov (United States)

    Modestou, Modestina; Gagatsis, Athanasios

    2010-01-01

    In this study we attempt to propose a new model of proportional reasoning based on both bibliographical and research data. This is accomplished with the help of three written tests involving analogical, proportional, and non-proportional situations that were administered to pupils from grades 7 to 9. The results suggest the existence of a…

  12. [Hazard evaluation modeling of particulate matters emitted by coal-fired boilers and case analysis].

    Science.gov (United States)

    Shi, Yan-Ting; Du, Qian; Gao, Jian-Min; Bian, Xin; Wang, Zhi-Pu; Dong, He-Ming; Han, Qiang; Cao, Yang

    2014-02-01

    In order to evaluate the hazard of PM2.5 emitted by various boilers, in this paper, segmentation of particulate matters with sizes below 2.5 μm was performed based on their formation mechanisms and their hazard level to human beings and the environment. Meanwhile, taking into account the mass concentration, number concentration, enrichment factor of Hg, and content of Hg element in different coal ashes, a comprehensive model for evaluating the hazard of PM2.5 emitted by coal-fired boilers was established. Finally, using field experimental data from the previous literature, a case analysis of the evaluation model was conducted, and the concept of a hazard reduction coefficient was proposed, which can be used to evaluate the performance of dust removers.

  13. Building a risk-targeted regional seismic hazard model for South-East Asia

    Science.gov (United States)

    Woessner, J.; Nyst, M.; Seyhan, E.

    2015-12-01

    The last decade has tragically shown the social and economic vulnerability of countries in South-East Asia to earthquake hazard and risk. While many disaster mitigation programs and initiatives to improve societal earthquake resilience are under way with the focus on saving lives and livelihoods, the risk management sector is challenged to develop appropriate models to cope with the economic consequences and impact on the insurance business. We present the source model and ground motion model components suitable for a South-East Asia earthquake risk model covering Indonesia, Malaysia, the Philippines and the Indochina countries. The source model builds upon refined modelling approaches to characterize 1) seismic activity on crustal faults from geologic and geodetic data, 2) seismicity along the interface of subduction zones and within the slabs, and 3) earthquakes not occurring on mapped fault structures. We elaborate on building a self-consistent rate model for the hazardous crustal fault systems (e.g. the Sumatra fault zone, the Philippine fault zone) as well as the subduction zones, and showcase some characteristics and sensitivities due to existing uncertainties in the rate and hazard space using a well-selected suite of ground motion prediction equations. Finally, we analyze the source model by quantifying the contribution by source type (e.g., subduction zone, crustal fault) to typical risk metrics (e.g., return-period losses, average annual loss) and reviewing their relative impact on various lines of business.

  14. Analogical proportions: another logical view

    Science.gov (United States)

    Prade, Henri; Richard, Gilles

    This paper investigates the logical formalization of a restricted form of analogical reasoning based on analogical proportions, i.e. statements of the form a is to b as c is to d. Starting from a naive set-theoretic interpretation, we highlight the existence of two noticeable companion proportions: one states that a is to b the converse of what c is to d (reverse analogy), while the other, called paralogical proportion, expresses that what a and b have in common, c and d have it also. We identify the characteristic postulates of the three types of proportions and examine their consequences from an abstract viewpoint. We further study the properties of the set-theoretic interpretation and of the Boolean logic interpretation, and we shed further light on the role of permutations in the modeling of the three types of proportions. Finally, we address the use of these proportions as a basis for inference in a propositional setting, and relate it to more general schemes of analogical reasoning. The differences between analogy, reverse analogy, and paralogy are further emphasized in a three-valued setting, which is also briefly presented.
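
    The Boolean reading of these proportions can be made concrete in a few lines. The sketch below follows the Boolean encodings commonly used in this literature; the exact postulates in the paper may differ in detail.

    ```python
    from itertools import product

    def analogy(a, b, c, d):
        # "a is to b as c is to d": a differs from b exactly as c differs from d
        return (a and not b) == (c and not d) and (not a and b) == (not c and d)

    def reverse_analogy(a, b, c, d):
        # "a is to b the converse of what c is to d": swap the roles of c and d
        return analogy(a, b, d, c)

    def paralogy(a, b, c, d):
        # "what a and b have in common, c and d have it also"
        return (a and b) == (c and d) and (not a and not b) == (not c and not d)

    valid = [p for p in product([False, True], repeat=4) if analogy(*p)]
    print(len(valid))   # 6 of the 16 Boolean 4-tuples satisfy the analogical proportion
    ```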

  15. A Mathematical Model for the Industrial Hazardous Waste Location-Routing Problem

    Directory of Open Access Journals (Sweden)

    Omid Boyer

    2013-01-01

    Full Text Available Technological progress has caused industrial hazardous waste to increase worldwide. Management of hazardous waste is a significant issue due to the risk it imposes on the environment and on human life. This risk can result from the location of undesirable facilities and also from the routing of hazardous waste. In this paper a bi-objective mixed-integer programming model for industrial hazardous waste location-routing is developed. The first objective is minimization of total cost, including transportation cost, operation cost, initial investment cost, and cost savings from selling recycled waste. The second objective is minimization of transportation risk, measured as the risk of population exposure within a bandwidth along the route. This model can help decision makers to locate treatment, recycling, and disposal centers simultaneously and also to route waste between these facilities considering risk and cost criteria. The results of the solved problem show the conflict between the two objectives: it is possible to decrease the cost value by marginally increasing the transportation risk value and vice versa. A weighted-sum method is utilized to combine the two objective functions into a single objective function. The problem is solved with GAMS using the CPLEX solver and is applied to Markazi province in Iran.
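
    The weighted-sum step can be illustrated with a toy example. The candidate routes, costs, and risks below are invented (the paper solves a full mixed-integer program in GAMS with the CPLEX solver); the point is only how two normalized objectives collapse into one score.

    ```python
    # candidate facility/route combinations with their two objective values
    candidates = [
        {"route": "A", "cost": 120.0, "risk": 0.9},
        {"route": "B", "cost": 150.0, "risk": 0.4},
        {"route": "C", "cost": 100.0, "risk": 1.5},
    ]

    def weighted_sum(c, w_cost=0.5, w_risk=0.5, cost_scale=150.0, risk_scale=1.5):
        # normalize both objectives before weighting so their scales are comparable
        return w_cost * c["cost"] / cost_scale + w_risk * c["risk"] / risk_scale

    best = min(candidates, key=weighted_sum)
    print(best["route"])   # sweeping w_cost/w_risk traces out the cost-risk trade-off
    ```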

  16. The additive hazards model with high-dimensional regressors

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas

    2009-01-01

    This paper considers estimation and prediction in the Aalen additive hazards model in the case where the covariate vector is high-dimensional such as gene expression measurements. Some form of dimension reduction of the covariate space is needed to obtain useful statistical analyses. We study...... model. A standard PLS algorithm can also be constructed, but it turns out that the resulting predictor can only be related to the original covariates via time-dependent coefficients. The methods are applied to a breast cancer data set with gene expression recordings and to the well known primary biliary...

  17. Complete hazard ranking to analyze right-censored data: An ALS survival study.

    Directory of Open Access Journals (Sweden)

    Zhengnan Huang

    2017-12-01

    Full Text Available Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners, including Gaussian process regression, Lasso, and random forest, on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset in comparison to the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.

  18. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  19. Report 3: Guidance document on practices to model and implement Extreme Weather hazards in extended PSA

    International Nuclear Information System (INIS)

    Alzbutas, R.; Ostapchuk, S.; Borysiewicz, M.; Decker, K.; Kumar, Manorma; Haeggstroem, A.; Nitoi, M.; Groudev, P.; Parey, S.; Potempski, S.; Raimond, E.; Siklossy, T.

    2016-01-01

    The goal of this report is to provide guidance on practices to model extreme weather hazards and implement them in extended level 1 PSA. This report is a joint deliverable of work package 21 (WP21) and work package 22 (WP22). The general objective of WP21 is to provide guidance on all of the individual hazards selected at the End Users Workshop. This guidance focuses on extreme weather hazards, namely extreme wind, extreme temperature and snow pack. Other hazards, however, are considered in cases where they are correlated/associated with the hazard under discussion. The guidance developed refers to existing guidance whenever possible. As recommended by end users, this guidance covers the development of integrated and/or separate extreme weather PSA models. (authors)

  20. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    Science.gov (United States)

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazards warning systems.

  1. Defaultable Game Options in a Hazard Process Model

    Directory of Open Access Journals (Sweden)

    Tomasz R. Bielecki

    2009-01-01

    Full Text Available The valuation and hedging of defaultable game options is studied in a hazard process model of credit risk. A convenient pricing formula with respect to a reference filtration is derived. A connection of arbitrage prices with a suitable notion of hedging is obtained. The main result shows that the arbitrage prices are the minimal superhedging prices with sigma-martingale cost under a risk-neutral measure.

  2. Relating arithmetical techniques of proportion to geometry

    DEFF Research Database (Denmark)

    Wijayanti, Dyana

    2015-01-01

    The purpose of this study is to investigate how textbooks introduce and treat the theme of proportion in geometry (similarity) and arithmetic (ratio and proportion), and how these themes are linked to each other in the books. To pursue this aim, we use the anthropological theory of the didactic....... Considering 6 common Indonesian textbooks in use, we describe how proportion is explained and appears in examples and exercises, using an explicit reference model of the mathematical organizations of both themes. We also identify how the proportion themes of the geometry and arithmetic domains are linked. Our...

  3. Introduction: Hazard mapping

    Science.gov (United States)

    Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M

    2014-01-01

    Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.

  4. An estimator of the survival function based on the semi-Markov model under dependent censorship.

    Science.gov (United States)

    Lee, Seung-Yeoun; Tsai, Wei-Yann

    2005-06-01

    Lee and Wolfe (Biometrics vol. 54 pp. 1176-1178, 1998) proposed a two-stage sampling design for testing the assumption of independent censoring, which involves further follow-up of a subset of lost-to-follow-up censored subjects. They also proposed an adjusted estimator of the survivor function for a proportional hazards model under the dependent censoring model. In this paper, a new estimator of the survivor function is proposed for the semi-Markov model under dependent censorship on the basis of the two-stage sampling data. The consistency and the asymptotic distribution of the proposed estimator are derived. The estimation procedure is illustrated with an example from a lung cancer clinical trial, and simulation results are reported for the mean squared errors of the estimators under a proportional hazards model and two different nonproportional hazards models.

  5. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameter that controls earthquake-activity rates, compared with previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight to each of the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  6. A Dirichlet process mixture of generalized Dirichlet distributions for proportional data modeling.

    Science.gov (United States)

    Bouguila, Nizar; Ziou, Djemel

    2010-01-01

    In this paper, we propose a clustering algorithm based on both Dirichlet processes and the generalized Dirichlet distribution, which has been shown to be very flexible for proportional data modeling. Our approach can be viewed as an extension of the finite generalized Dirichlet mixture model to the infinite case. The extension is based on nonparametric Bayesian analysis. This clustering algorithm does not require the number of mixture components to be specified in advance and estimates it in a principled manner. Our approach is Bayesian and relies on estimating the posterior distribution of clusterings using a Gibbs sampler. Through applications involving real-data classification and image-database categorization using visual words, we show that clustering via infinite mixture models offers a more powerful and robust performance than classic finite mixtures.
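
    As a rough, runnable illustration of the infinite-mixture idea, the sketch below uses scikit-learn's variational Gaussian mixture with a Dirichlet-process prior, which likewise infers the number of effective components from the data. It is not the authors' Gibbs-sampled mixture of generalized Dirichlet distributions, and the data are synthetic.

    ```python
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(1)
    # toy proportional data: compositions summing to 1; the redundant last
    # coordinate is dropped so the covariance matrices stay non-singular
    X = rng.dirichlet(alpha=[2.0, 5.0, 3.0], size=300)[:, :2]

    dpgmm = BayesianGaussianMixture(
        n_components=10,    # truncation level, not the final number of clusters
        weight_concentration_prior_type="dirichlet_process",
        random_state=0,
    ).fit(X)

    # components with non-negligible weight are the clusters actually "used"
    print(np.sum(dpgmm.weights_ > 0.01))
    ```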

  7. The Framework of a Coastal Hazards Model - A Tool for Predicting the Impact of Severe Storms

    Science.gov (United States)

    Barnard, Patrick L.; O'Reilly, Bill; van Ormondt, Maarten; Elias, Edwin; Ruggiero, Peter; Erikson, Li H.; Hapke, Cheryl; Collins, Brian D.; Guza, Robert T.; Adams, Peter N.; Thomas, Julie

    2009-01-01

    The U.S. Geological Survey (USGS) Multi-Hazards Demonstration Project in Southern California (Jones and others, 2007) is a five-year project (FY2007-FY2011) integrating multiple USGS research activities with the needs of external partners, such as emergency managers and land-use planners, to produce products and information that can be used to create more disaster-resilient communities. The hazards being evaluated include earthquakes, landslides, floods, tsunamis, wildfires, and coastal hazards. For the Coastal Hazards Task of the Multi-Hazards Demonstration Project in Southern California, the USGS is leading the development of a modeling system for forecasting the impact of winter storms threatening the entire Southern California shoreline from Pt. Conception to the Mexican border. The modeling system, run in real-time or with prescribed scenarios, will incorporate atmospheric information (that is, wind and pressure fields) with a suite of state-of-the-art physical process models (that is, tide, surge, and wave) to enable detailed prediction of currents, wave height, wave runup, and total water levels. Additional research-grade predictions of coastal flooding, inundation, erosion, and cliff failure will also be performed. Initial model testing, performance evaluation, and product development will be focused on a severe winter-storm scenario developed in collaboration with the Winter Storm Working Group of the USGS Multi-Hazards Demonstration Project in Southern California. Additional offline model runs and products will include coastal-hazard hindcasts of selected historical winter storms, as well as additional severe winter-storm simulations based on statistical analyses of historical wave and water-level data. The coastal-hazards model design will also be appropriate for simulating the impact of storms under various sea level rise and climate-change scenarios. The operational capabilities of this modeling system are designed to provide emergency planners with

  8. Hazard Ranking Method for Populations Exposed to Arsenic in Private Water Supplies: Relation to Bedrock Geology.

    Science.gov (United States)

    Crabbe, Helen; Fletcher, Tony; Close, Rebecca; Watts, Michael J; Ander, E Louise; Smedley, Pauline L; Verlander, Neville Q; Gregory, Martin; Middleton, Daniel R S; Polya, David A; Studden, Mike; Leonardi, Giovanni S

    2017-12-01

    Approximately one million people in the UK are served by private water supplies (PWS) where connection to the main municipal water supply system is not practical or where PWS is the preferred option. Chronic exposure to contaminants in PWS may have adverse effects on health. South West England is an area with elevated arsenic concentrations in groundwater, and over 9000 domestic dwellings here are supplied by PWS. There remains uncertainty as to the extent of the population exposed to arsenic (As) and the factors predicting such exposure. We describe a hazard assessment model based on simplified geology with the potential to predict exposure to As in PWS. Households with a recorded PWS in Cornwall were recruited to take part in a water sampling programme from 2011 to 2013. Bedrock geologies were aggregated and classified into nine Simplified Bedrock Geological Categories (SBGC), plus a cross-cutting "mineralized" area. PWS were sampled by random selection within SBGCs, and some 508 households volunteered for the study. Transformations of the data were explored to estimate the distribution of As concentrations for PWS by SBGC. Using the distribution per SBGC, we predict the proportion of dwellings that would be affected by high concentrations and rank the geologies according to hazard. Within most SBGCs, As concentrations were found to have log-normal distributions. Across these areas, the proportion of dwellings predicted to have drinking water over the prescribed concentration value (PCV) for As ranged from 0% to 20%. From these results, a pilot predictive model was developed to calculate the proportion of PWS above the PCV for As; the resulting hazard ranking supports local decision making and prioritization. With further development and testing, this can help local authorities predict the number of dwellings that might fail the PCV for As, based on bedrock geology. The model presented here for Cornwall could be applied in areas with similar geologies. Application of the method
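
    The exceedance calculation at the core of the model can be sketched as follows. The concentration values and the 10 μg/L limit below are illustrative assumptions, not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    # arsenic concentrations (ug/L) measured in PWS within one geological category
    conc = np.array([0.4, 1.1, 2.5, 0.8, 6.0, 3.2, 12.0, 0.6, 1.9, 4.4])

    # a log-normal fit is equivalent to a normal fit on the log scale
    mu, sigma = stats.norm.fit(np.log(conc))

    pcv = 10.0   # prescribed concentration value (ug/L), assumed for illustration
    p_exceed = stats.norm.sf(np.log(pcv), loc=mu, scale=sigma)
    print(f"predicted proportion of supplies above the PCV: {p_exceed:.1%}")
    ```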

  9. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
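
    The mechanics can be illustrated with a small sketch: with POT magnitudes following a generalized Pareto distribution whose scale is assumed to drift upward over time (the nonstationary case), the yearly exceedance probability, the reliability, and the expected waiting time to the first exceedance follow directly. All parameter values and the trend below are invented.

    ```python
    import numpy as np
    from scipy import stats

    design_event = 50.0                     # critical POT magnitude of interest
    years = np.arange(1, 101)
    shape = 0.1                             # GPD shape parameter (xi)
    scale = 10.0 * (1.0 + 0.01 * years)     # assumed upward trend in the GPD scale

    # yearly probability that the POT magnitude exceeds the design event
    p_exceed = stats.genpareto.sf(design_event, c=shape, scale=scale)

    # reliability: probability of no exceedance through year n
    reliability = np.cumprod(1.0 - p_exceed)

    # expected failure time E[T] = sum over n of P(T > n), truncated at the horizon
    print(f"P(exceed) in year 1:   {p_exceed[0]:.4f}")
    print(f"reliability at 50 yr:  {reliability[49]:.3f}")
    print(f"mean time to failure ~ {1.0 + reliability.sum():.1f} years (truncated)")
    ```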

  10. Modeling survival: application of the Andersen-Gill model to Yellowstone grizzly bears

    Science.gov (United States)

    Johnson, Christopher J.; Boyce, Mark S.; Schwartz, Charles C.; Haroldson, Mark A.

    2004-01-01

     Wildlife ecologists often use the Kaplan-Meier procedure or Cox proportional hazards model to estimate survival rates, distributions, and magnitude of risk factors. The Andersen-Gill formulation (A-G) of the Cox proportional hazards model has seen limited application to mark-resight data but has a number of advantages, including the ability to accommodate left-censored data, time-varying covariates, multiple events, and discontinuous intervals of risks. We introduce the A-G model including structure of data, interpretation of results, and assessment of assumptions. We then apply the model to 22 years of radiotelemetry data for grizzly bears (Ursus arctos) of the Greater Yellowstone Grizzly Bear Recovery Zone in Montana, Idaho, and Wyoming, USA. We used Akaike's Information Criterion (AICc) and multi-model inference to assess a number of potentially useful predictive models relative to explanatory covariates for demography, human disturbance, and habitat. Using the most parsimonious models, we generated risk ratios, hypothetical survival curves, and a map of the spatial distribution of high-risk areas across the recovery zone. Our results were in agreement with past studies of mortality factors for Yellowstone grizzly bears. Holding other covariates constant, mortality was highest for bears that were subjected to repeated management actions and inhabited areas with high road densities outside Yellowstone National Park. Hazard models developed with covariates descriptive of foraging habitats were not the most parsimonious, but they suggested that high-elevation areas offered lower risks of mortality when compared to agricultural areas.
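
    The counting-process data layout behind the A-G formulation can be sketched with the CoxTimeVaryingFitter from the Python lifelines package. The rows below are invented, not the grizzly-bear telemetry: each row is one (start, stop] interval of risk for one animal, so a covariate such as road density can change within an animal's record.

    ```python
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    df = pd.DataFrame({
        "id":           [1, 1, 2, 2, 3],
        "start":        [0, 12, 0, 6, 0],
        "stop":         [12, 30, 6, 18, 24],
        "road_density": [0.2, 0.8, 0.5, 1.4, 0.3],   # time-varying covariate
        "event":        [0, 1, 0, 0, 1],             # 1 = mortality in the interval
    })

    ctv = CoxTimeVaryingFitter()
    ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()   # exp(coef) is the hazard ratio for road_density
    ```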

  11. Developments in consequence modelling of accidental releases of hazardous materials

    NARCIS (Netherlands)

    Boot, H.

    2012-01-01

    The modelling of consequences of releases of hazardous materials in the Netherlands has mainly been based on the “Yellow Book”. Although there is no updated version of this official publication, new insights have been developed during the last decades. This article will give an overview of new

  12. Proportionality lost - proportionality regained?

    DEFF Research Database (Denmark)

    Werlauff, Erik

    2010-01-01

    In recent years, the European Court of Justice (the ECJ) seems to have accepted restrictions on the freedom of establishment and other basic freedoms, despite the fact that a more thorough proportionality test would have revealed that the restriction in question did not pass the 'rule of reason' ...

  13. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest

    OpenAIRE

    Suh, Jangwon; Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon

    2017-01-01

    In this study, current geographic information system (GIS)-based methods and their application for the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. The various types of mining-induced hazard were classified into two or t...

  14. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    Science.gov (United States)

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and
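
    The last step, combining rupture probabilities with per-rupture ground-motion exceedance into a site hazard curve, can be sketched compactly. The three parametric scenarios below are invented stand-ins for CyberShake's hundreds of thousands of simulated rupture variations.

    ```python
    import numpy as np
    from scipy import stats

    im_levels = np.linspace(0.05, 1.0, 20)   # spectral acceleration levels (g)

    # per-rupture annual probability and an assumed lognormal IM distribution
    ruptures = [
        {"p_annual": 1e-3, "median": 0.30, "beta": 0.6},
        {"p_annual": 5e-4, "median": 0.55, "beta": 0.5},
        {"p_annual": 2e-3, "median": 0.15, "beta": 0.7},
    ]

    # annual rate of exceeding each IM level, summed over rupture scenarios
    rate = np.zeros_like(im_levels)
    for r in ruptures:
        p_exc = stats.lognorm.sf(im_levels, s=r["beta"], scale=r["median"])
        rate += r["p_annual"] * p_exc

    # Poissonian probability of exceedance in a 50-year window
    p50 = 1.0 - np.exp(-rate * 50.0)
    print(np.round(p50, 4))
    ```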

  15. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    Science.gov (United States)

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

    The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered, since mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application to an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take the uncertainty into account in the analysis. In this perspective, a new hazard modeling method is developed and integrated in a program named ALICE. This program integrates mechanical stability analysis through GIS software, taking data uncertainty into account. The method proposes a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, an expert approach is still necessary to finalize the maps, as it is the only way to take into account some influential factors in slope stability, such as the heterogeneity of the geological formations or the effects of anthropogenic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios into the ALICE program, and especially their precipitation component, with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data, and so forth, the program classifies hazard zones according to geotechnical properties and hydrological contexts varying in time. This communication, realized within the framework of the Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development.

  16. Modeling emergency evacuation for major hazard industrial sites

    International Nuclear Information System (INIS)

    Georgiadou, Paraskevi S.; Papazoglou, Ioannis A.; Kiranoudis, Chris T.; Markatos, Nikolaos C.

    2007-01-01

    A model providing the temporal and spatial distribution of the population under evacuation around a major hazard facility is developed. A discrete-state stochastic Markov process simulates the movement of the evacuees. The area around the hazardous facility is divided into nodes connected among themselves by links representing the road system of the area. Transitions from node to node are simulated as a random process where the probability of transition depends on the dynamically changing states of the destination and origin nodes and on the link between them. Solution of the Markov process provides the expected distribution of the evacuees over the nodes of the area as a function of time. A Monte Carlo solution of the model provides in addition a sample of actual trajectories of the evacuees. This information, coupled with an accident analysis providing the spatial and temporal distribution of the extreme phenomenon following an accident, determines a sample of the actual doses received by the evacuees. Both the average dose and the actual distribution of doses are then used as measures in evaluating alternative emergency response strategies. It is shown that in some cases estimating the health consequences by the average dose might be either too conservative or too non-conservative relative to the estimate corresponding to the distribution of the received dose, and hence not a suitable measure for evaluating alternative evacuation strategies.
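
    The core of the model can be sketched as a small Monte Carlo simulation: nodes represent areas around the facility, a row-stochastic matrix carries the transition probabilities, and sampling yields individual evacuee trajectories. The 4-node network below is invented, and unlike the paper's model the probabilities here do not react to the changing node states.

    ```python
    import numpy as np

    # P[i, j]: probability of moving from node i to node j in one time step;
    # node 3 is the safe destination and is absorbing
    P = np.array([
        [0.5, 0.3, 0.2, 0.0],
        [0.0, 0.4, 0.3, 0.3],
        [0.0, 0.0, 0.5, 0.5],
        [0.0, 0.0, 0.0, 1.0],
    ])

    rng = np.random.default_rng(0)

    def trajectory(start, steps=20):
        """Sample one evacuee's path through the node network."""
        path = [start]
        for _ in range(steps):
            path.append(rng.choice(4, p=P[path[-1]]))
        return path

    # expected distribution of evacuees over the nodes as a function of time
    dist = np.array([1.0, 0.0, 0.0, 0.0])   # everyone starts at node 0
    for _ in range(10):
        dist = dist @ P
    print(dist)            # fraction of evacuees per node after 10 steps
    print(trajectory(0))   # one sampled Monte Carlo trajectory
    ```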

  17. Analyzing sickness absence with statistical models for survival data

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Andersen, Per Kragh; Smith-Hansen, Lars

    2007-01-01

    OBJECTIVES: Sickness absence is the outcome in many epidemiologic studies and is often based on summary measures such as the number of sickness absences per year. In this study the use of modern statistical methods was examined by making better use of the available information. Since sickness absence data deal with events occurring over time, the use of statistical models for survival data has been reviewed, and the use of frailty models has been proposed for the analysis of such data. METHODS: Three methods for analyzing data on sickness absences were compared using a simulation study involving the following: (i) Poisson regression using a single outcome variable (number of sickness absences), (ii) analysis of time to first event using the Cox proportional hazards model, and (iii) frailty models, which are random-effects proportional hazards models. Data from a study of the relation

  18. The effects of building design on hazard of first service in Norwegian dairy cows.

    Science.gov (United States)

    Martin, A D; Kielland, C; Nelson, S T; Østerås, O

    2015-12-01

    Reproductive inefficiency is one of the major production and economic constraints on modern dairy farms. The environment affects the onset of ovarian activity in a cow postcalving and influences estrus behavior, which in turn affects a stockperson's ability to inseminate her at the correct time. This study used survival analysis to investigate the effects of building design and animal factors on the postpartum hazard of first service (HFS) in freestall-housed Norwegian Red cows. The study was performed on 232 Norwegian dairy farms between 2004 and 2007. Data were obtained through on-farm measurements and by accessing the Norwegian Dairy Herd Recording System. The final data set contained data on 38,436 calvings and 27,127 services. Univariate Cox proportional hazards analyses showed that herd size and milk yield were positively associated with HFS. Total free accessible area and free accessible area available per cow-year were positively associated with the HFS, as was the number of freestalls available per cow. Cows housed on slatted floors had a lower HFS than those housed on solid floors. Conversely, cows housed on rubber floors had a higher HFS than cows on concrete floors. Dead-end alleyways reduced the hazard of AI after calving. A multivariable Cox proportional hazards model, accounting for herd management by including a frailty term for herd, showed relationships between the hazard of postpartum service and explanatory variables. Animals in herds with more than 50 cows had a higher HFS [hazard ratio (HR)=3.0] compared with those in smaller herds. The HFS was also higher (HR=4.3) if more than 8.8 m² of space was available per cow-year compared with herds in which animals had less space. The HFS after calving increased with parity (parity 2 HR=0.5, parity ≥3 HR=1.7), and was reduced if a lactation began with dystocia (HR=0.82) or the cow was a breed other than Norwegian Red (HR=0.2). The frailty term, herd, was large and highly significant indicating a significant

  19. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    Science.gov (United States)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability
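
    The proposed hazard interaction matrix is straightforward to represent as a data structure. In the sketch below, entry (i, j) records the type of influence hazard i exerts on hazard j; the hazards and relationships shown are illustrative examples only, and a cascade corresponds to a path through the matrix.

    ```python
    hazards = ["earthquake", "landslide", "flood", "wildfire"]
    TRIGGERS, INCREASES_P, NONE = "T", "P", "."   # triggering / increased probability

    interaction = {
        ("earthquake", "landslide"): TRIGGERS,    # shaking triggers slope failures
        ("wildfire", "landslide"): INCREASES_P,   # burnt slopes fail more easily
        ("landslide", "flood"): TRIGGERS,         # landslide dams can breach
    }

    # print the matrix: rows are the primary hazard, columns the secondary hazard
    print(" " * 12 + "".join(f"{h[:10]:>11}" for h in hazards))
    for hi in hazards:
        cells = "".join(f"{interaction.get((hi, hj), NONE):>11}" for hj in hazards)
        print(f"{hi:>12}{cells}")

    # e.g. the cascade earthquake -> landslide -> flood is the chain of two "T" entries
    ```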

  20. Guidance document on practices to model and implement Earthquake hazards in extended PSA (final version). Volume 1

    International Nuclear Information System (INIS)

    Decker, K.; Hirata, K.; Groudev, P.

    2016-01-01

    The current report provides guidance for the assessment of seismo-tectonic hazards in level 1 and 2 PSA. The objective is to review existing guidance, identify methodological challenges, and propose novel guidance on key issues. Guidance for the assessment of vibratory ground motion and fault capability comprises the following:
    - listings of data required for the hazard assessment and methods to estimate data quality and completeness;
    - in-depth discussion of key input parameters required for hazard models;
    - discussions on commonly applied hazard assessment methodologies;
    - references to recent advances of science and technology.
    Guidance on the assessment of correlated or coincident hazards comprises chapters on:
    - screening of correlated hazards;
    - assessment of correlated hazards (natural and man-made);
    - assessment of coincident hazards. (authors)

  1. Multiple Landslide-Hazard Scenarios Modeled for the Oakland-Berkeley Area, Northern California

    Science.gov (United States)

    Pike, Richard J.; Graymer, Russell W.

    2008-01-01

    With the exception of Los Angeles, perhaps no urban area in the United States is more at risk from landsliding, triggered by either precipitation or earthquake, than the San Francisco Bay region of northern California. By January each year, seasonal winter storms usually bring moisture levels of San Francisco Bay region hillsides to the point of saturation, after which additional heavy rainfall may induce landslides of various types and levels of severity. In addition, movement at any time along one of several active faults in the area may generate an earthquake large enough to trigger landslides. The danger to life and property rises each year as local populations continue to expand and more hillsides are graded for development of residential housing and its supporting infrastructure. The chapters in the text consist of:
    * Introduction by Russell W. Graymer
    * Chapter 1 Rainfall Thresholds for Landslide Activity, San Francisco Bay Region, Northern California by Raymond C. Wilson
    * Chapter 2 Susceptibility to Deep-Seated Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike and Steven Sobieszczyk
    * Chapter 3 Susceptibility to Shallow Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Kevin M. Schmidt and Steven Sobieszczyk
    * Chapter 4 Landslide Hazard Modeled for the Cities of Oakland, Piedmont, and Berkeley, Northern California, from a M=7.1 Scenario Earthquake on the Hayward Fault Zone by Scott B. Miles and David K. Keefer
    * Chapter 5 Synthesis of Landslide-Hazard Scenarios Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike
    The plates consist of:
    * Plate 1 Susceptibility to Deep-Seated Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike, Russell W. Graymer, Sebastian Roberts, Naomi B. Kalman, and Steven Sobieszczyk
    * Plate 2 Susceptibility to Shallow Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Kevin M. Schmidt and Steven

  2. Nonlinear kinematic hardening under non-proportional loading

    International Nuclear Information System (INIS)

    Ottosen, N.S.

    1979-07-01

    Within the framework of conventional plasticity theory, it is first determined under which conditions Melan-Prager's and Ziegler's kinematic hardening rules result in identical material behaviour. Next, assuming initial isotropy and adopting the von Mises yield criterion, a nonlinear kinematic hardening function is proposed for the prediction of metal behaviour. The model assumes that hardening at a specific stress point depends on the direction of the new incremental loading. Hereby a realistic response is obtained for general reversed loading, and a smooth behaviour is assured, even when loading deviates more and more from proportional loading and ultimately results in reversed loading. The predictions of the proposed model for non-proportional loading under plane stress conditions are compared with those of the classical linear kinematic model, the isotropic model, and published experimental data. Finally, the limitations of the proposed model are discussed. (author)
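
    The report's model addresses general multiaxial, non-proportional loading; as a much-reduced illustration of what nonlinear kinematic hardening means, the sketch below implements a one-dimensional Armstrong-Frederick-type return mapping (not the report's model), in which the backstress saturates rather than growing without bound as in the linear Melan-Prager rule. All material parameters are invented.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    E, sy = 200e3, 200.0    # Young's modulus and yield stress [MPa]
    C, gam = 20e3, 100.0    # hardening modulus and dynamic-recovery parameter

    def stress_path(strains):
        eps_p, alpha = 0.0, 0.0         # plastic strain and backstress
        sig_out = []
        for eps in strains:
            sig_tr = E * (eps - eps_p)  # elastic trial stress
            f = abs(sig_tr - alpha) - sy
            if f > 0.0:                 # plastic step: return mapping
                s = np.sign(sig_tr - alpha)
                def res(dp):            # consistency with implicit backstress update
                    a_new = (alpha + C * dp * s) / (1.0 + gam * dp)
                    return abs(sig_tr - E * dp * s - a_new) - sy
                dp = brentq(res, 0.0, 2.0 * f / E)
                alpha = (alpha + C * dp * s) / (1.0 + gam * dp)
                eps_p += dp * s
            sig_out.append(E * (eps - eps_p))
        return sig_out

    # strain-controlled reversed loading: ramp up, then reverse
    path = np.concatenate([np.linspace(0, 0.02, 100), np.linspace(0.02, -0.02, 200)])
    sig = stress_path(path)
    print(f"peak stress {max(sig):.0f} MPa, reverse minimum {min(sig):.0f} MPa")
    ```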

  3. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    Science.gov (United States)

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near‐surface shear‐wave velocity model in a 1D equivalent‐linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm‐rock‐site condition, the new probabilistic seismic‐hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess‐covered till and drift deposits) show up to twice the ground‐motion values for peak ground acceleration (PGA), and similar ground‐motion values for 1.0 s spectral acceleration (SA). Probabilistic ground‐motion levels for lowland alluvial floodplain sites (generally the 20–40‐m‐thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground‐motion levels for PGA, and up to three times the ground‐motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%‐in‐50‐year probabilistic ground‐shaking model. The liquefaction hazard ranges from low in the uplands to high (more than 60% of the area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated

  4. Predicting water main failures using Bayesian model averaging and survival modelling approach

    International Nuclear Information System (INIS)

    Kabir, Golam; Tesfamariam, Solomon; Sadiq, Rehan

    2015-01-01

    To develop an effective preventive or proactive repair and replacement action plan, water utilities often rely on water main failure prediction models. However, in predicting the failure of water mains, uncertainty is inherent regardless of the quality and quantity of data used in the model. To improve the understanding of water main failure, a Bayesian framework is developed for predicting the failure of water mains considering uncertainties. In this study, the Bayesian model averaging (BMA) method is presented to identify the influential pipe-dependent and time-dependent covariates considering model uncertainties, whereas the Bayesian Weibull Proportional Hazard Model (BWPHM) is applied to develop the survival curves and to predict the failure rates of water mains. To validate the proposed framework, it is implemented to predict the failure of cast iron (CI) and ductile iron (DI) pipes of the water distribution network of the City of Calgary, Alberta, Canada. Results indicate that the predicted 95% uncertainty bounds of the proposed BWPHMs effectively capture the observed breaks for both CI and DI water mains. Moreover, the proposed BWPHMs perform better than the Cox Proportional Hazard Model (Cox-PHM), as they consider a Weibull distribution for the baseline hazard function and account for model uncertainties. - Highlights: • Prioritize rehabilitation and replacement (R/R) strategies of water mains. • Consider the uncertainties for the failure prediction. • Improve the prediction capability of the water main failure models. • Identify the influential and appropriate covariates for different models. • Determine the effects of the covariates on failure
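
    A minimal sketch of the Weibull survival-model building block, without the Bayesian model-averaging layer, using the Python lifelines package: the Weibull AFT parameterization fitted here is a reparameterization of a Weibull proportional hazards model. The covariates and values are invented, not the Calgary network data.

    ```python
    import pandas as pd
    from lifelines import WeibullAFTFitter

    df = pd.DataFrame({
        "years_to_failure": [12.0, 30.0, 45.0, 8.0, 22.0, 40.0, 15.0, 33.0],
        "failed":           [1, 1, 0, 1, 1, 0, 1, 0],   # 0 = still in service
        "diameter_mm":      [150, 200, 300, 100, 150, 250, 100, 300],
        "soil_corrosivity": [0.8, 0.3, 0.2, 0.9, 0.6, 0.1, 0.7, 0.2],
    })

    aft = WeibullAFTFitter()
    aft.fit(df, duration_col="years_to_failure", event_col="failed")
    aft.print_summary()   # covariate effects on the expected pipe lifetime
    ```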

  5. GIS and RS-based modelling of potential natural hazard areas in Pehchevo municipality, Republic of Macedonia

    Directory of Open Access Journals (Sweden)

    Milevski Ivica

    2013-01-01

    Full Text Available In this paper, one approach to Geographic Information System (GIS) and Remote Sensing (RS) based assessment of potential natural hazard areas (excess erosion, landslides, flash floods and fires) is presented. For that purpose the Pehchevo Municipality in the easternmost part of the Republic of Macedonia is selected as a case study area, because of the high local impact of natural hazards on the environment, the socio-demographic situation and the local economy. First of all, the most relevant static factors for each type of natural hazard are selected (topography, land cover, anthropogenic objects and infrastructure). With GIS and satellite imagery, a multi-layer calculation is performed based on available traditional equations, clustering or discretization procedures. In this way, suitable, relatively "static" natural hazard maps (models) are produced. Then, dynamic (mostly climate-related) factors are included in the previous models, resulting in appropriate scenarios correlated with different amounts of precipitation, temperature, wind direction etc. Finally, the GIS-based scenarios are evaluated and tested with field checks or very fine resolution Google Earth imagery, showing good accuracy. Further development of such GIS models in connection with automatic remote meteorological stations and dynamic satellite imagery (like MODIS) will provide timely warning of coming natural hazards, avoiding potential damage or even casualties.

  6. Extended cox regression model: The choice of timefunction

    Science.gov (United States)

    Isik, Hatice; Tutkun, Nihal Ata; Karasoy, Durdu

    2017-07-01

    The Cox regression model (CRM), which takes into account the effect of censored observations, is one of the most widely applied models in survival analysis for evaluating the effects of covariates. Proportional hazards (PH), which requires a constant hazard ratio over time, is the key assumption of the CRM. The extended CRM provides a test of the PH assumption, by including a time-dependent covariate, or an alternative model in the case of non-proportional hazards. In this study, different types of real data sets are used to choose the time function, and the differences between time functions are analyzed and discussed.
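
    The choice of time function maps directly onto the time transform g(t) used when testing the PH assumption. A minimal sketch with the Python lifelines package on synthetic data: the same test is run with several candidate time functions, which can lead to different conclusions about proportionality.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.statistics import proportional_hazard_test

    rng = np.random.default_rng(0)
    n = 300
    x = rng.normal(size=n)
    df = pd.DataFrame({
        "T": rng.exponential(scale=np.exp(-0.5 * x)),   # survival times
        "E": rng.integers(0, 2, size=n),                # 1 = event observed
        "x": x,
    })

    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    for g in ["identity", "log", "rank", "km"]:         # candidate time functions g(t)
        res = proportional_hazard_test(cph, df, time_transform=g)
        print(g, res.p_value)
    ```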

  7. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    Science.gov (United States)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk; the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment) in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeological parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by

  8. A decision model for the risk management of hazardous processes

    International Nuclear Information System (INIS)

    Holmberg, J.E.

    1997-03-01

    In this study, a decision model for the risk management of hazardous processes is formulated as an optimisation problem of a point process. In the approach, the decisions made by the management are divided into three categories: (1) planned process lifetime, (2) selection of the design, and (3) operational decisions. These three control methods play quite different roles in practical risk management, which is also reflected in our approach. The optimisation of the process lifetime is related to the licensing problem of the process. It provides a boundary condition for a feasible utility function that is used as the actual objective function, i.e., maximizing the process lifetime utility. By design modifications, the management can affect the inherent accident hazard rate of the process. This is usually a discrete optimisation task. The study particularly concentrates upon the optimisation of operational strategies given a certain design and licensing time. This is done with a dynamic risk model (marked point process model) representing the stochastic process of events observable or unobservable to the decision maker. An optimal long-term control variable guiding the selection of operational alternatives in short-term problems is studied. The optimisation problem is solved by the stochastic quasi-gradient procedure. The approach is illustrated by a case study. (23 refs.)

  9. Proportional reasoning

    DEFF Research Database (Denmark)

    Dole, Shelley; Hilton, Annette; Hilton, Geoff

    2015-01-01

    Proportional reasoning is widely acknowledged as a key to success in school mathematics, yet students’ continual difficulties with proportion-related tasks are well documented. This paper draws on a large research study that aimed to support 4th to 9th grade teachers to design and implement tasks...

  10. MCNP modelling of the wall effects observed in tissue-equivalent proportional counters.

    Science.gov (United States)

    Hoff, J L; Townsend, L W

    2002-01-01

    Tissue-equivalent proportional counters (TEPCs) utilise tissue-equivalent materials to represent homogeneous microscopic volumes of human tissue. Although both the walls and the gas simulate the same medium, they respond to radiation differently. Density differences between the two materials cause distortions, or wall effects, in measurements, with the most dominant effect caused by delta rays. This study uses a Monte Carlo transport code, MCNP, to simulate the transport of secondary electrons within a TEPC. The Rudd model, a singly differential cross section with no dependence on electron direction, is used to describe the energy spectrum obtained from the impact of two iron beams on water. Based on the models used in this study, a wall-less TEPC had a higher lineal energy (keV.micron-1) as a function of impact parameter than a solid-wall TEPC for the iron beams under consideration. An important conclusion of this study is that MCNP has the ability to model the wall effects observed in TEPCs.
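
    As background to the quantity reported above, lineal energy is the quotient of the energy imparted to the site and the mean chord length of that site; for a convex spherical site the Cauchy mean chord length is two-thirds of the diameter. A small illustrative calculation (not tied to the record's MCNP simulations):

        def lineal_energy(energy_deposit_keV, site_diameter_um):
            """Lineal energy y = epsilon / l_bar, with the Cauchy mean
            chord length l_bar = (2/3) * d for a convex spherical site."""
            mean_chord = (2.0 / 3.0) * site_diameter_um
            return energy_deposit_keV / mean_chord   # keV/micron

        # e.g. 1 keV deposited in a 1-micron tissue-equivalent sphere:
        print(lineal_energy(1.0, 1.0))   # ~1.5 keV/micron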

  11. Testing the Conditional Mean Function of Autoregressive Conditional Duration Models

    DEFF Research Database (Denmark)

    Hautsch, Nikolaus

    This paper proposes a dynamic proportional hazard (PH) model with non-specified baseline hazard for the modelling of autoregressive duration processes. A categorization of the durations allows us to reformulate the PH model as an ordered response model based on extreme value distributed errors, which may be subject to censoring structures. In an empirical study based on financial transaction data we present an application of the model to estimate conditional asset price change probabilities. Evaluating the forecasting properties of the model, it is shown that the proposed approach is a promising competitor...

  12. Butt rot defect and potential hazard in lodgepole pine on selected California recreational areas

    Science.gov (United States)

    Lee A. Paine

    1966-01-01

    Within the area sampled, potentially hazardous lodgepole pines were common on recreational sites. The incidence of decayed and mechanically weak trees was correlated with fire damage. Two-thirds of the fire-scarred trees were decayed; one-third were rated potentially hazardous. Fire scars occurred roughly in proportion to the level of recreational use of the plots.

  13. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study.
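
    The weighted partial likelihood idea can be prototyped with standard survival software. Below is a minimal sketch using lifelines, where the data frame and the inverse selection-probability weights in the 'ipw' column are entirely hypothetical; a robust (sandwich) variance is requested because of the weighting. This is an illustration of the general technique, not the authors' estimator.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical ODS-style data with inverse selection-probability weights
        df = pd.DataFrame({
            'time':  [5.0, 8.2, 3.1, 9.4, 2.7, 7.5],
            'event': [1, 0, 1, 0, 1, 1],
            'expos': [0.9, 0.1, 1.3, 0.2, 1.1, 0.7],
            'ipw':   [2.5, 1.0, 2.5, 1.0, 2.5, 2.5],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col='time', event_col='event',
                weights_col='ipw', robust=True)
        cph.print_summary()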

  14. Dynamic modelling of a forward osmosis-nanofiltration integrated process for treating hazardous wastewater.

    Science.gov (United States)

    Pal, Parimal; Das, Pallabi; Chakrabortty, Sankha; Thakura, Ritwik

    2016-11-01

    Dynamic modelling and simulation of a complete nanofiltration-forward osmosis integrated system was carried out, along with an economic evaluation, to pave the way for scale-up of such a system for treating hazardous pharmaceutical wastes. The system, operated in a closed loop, not only protects surface water from the onslaught of hazardous industrial wastewater but also saves on the cost of fresh water by making wastewater recyclable at an affordable price. The success of the dynamic model in capturing the relevant transport phenomena is reflected in the high overall correlation coefficient (R 2  > 0.98) and low relative error, with the forward osmosis loop operating at a reasonably high flux of 56-58 l per square metre per hour.

  15. Conceptual geoinformation model of natural hazards risk assessment

    Science.gov (United States)

    Kulygin, Valerii

    2016-04-01

    Natural hazards are the major threat to safe interactions between nature and society. The assessment of natural hazard impacts and their consequences is important in spatial planning and resource management. Today there is a challenge to advance our understanding of how socio-economic and climate changes will affect the frequency and magnitude of hydro-meteorological hazards and associated risks. However, the impacts of different types of natural hazards on various marine and coastal economic activities are not of the same type. In this study, a conceptual geomodel of risk assessment is presented to highlight the differentiation by type of economic activity in extreme event risk assessment. The marine and coastal ecosystems are considered as the objects of management, on the one hand, and as the place of origin of natural hazards, on the other. One of the key elements in describing such systems is the spatial characterization of their components. Assessment of ecosystem state is based on ecosystem indicators (indexes), which are used to identify changes over time. The scenario approach is utilized to account for spatio-temporal dynamics and uncertainty factors. Two types of scenarios are considered: scenarios of the use of ecosystem services by economic activities, and scenarios of extreme events and related hazards. The reported study was funded by RFBR, according to research project No. 16-35-60043 mol_a_dk.

  16. An Optimal Allocation Model of Public Transit Mode Proportion for the Low-Carbon Transportation

    Directory of Open Access Journals (Sweden)

    Linjun Lu

    2015-01-01

    Full Text Available Public transit has been widely recognized as a potential way to develop low-carbon transportation. In this paper, an optimal allocation model of public transit mode proportion (MPMP) has been built to achieve low-carbon public transit. Optimal ratios of passenger traffic for rail, bus, and taxi are derived by running the model with typical data. With different values of traffic demand, construction cost, travel time, and accessibility, MPMP can generate corresponding optimal ratios, supporting decision-impact analysis and decision makers. Instead of treating public transit as a single united system, it is separated into units in this paper. Shanghai is used to test the model's validity and practicality.

  17. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    Science.gov (United States)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, and catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical facility for better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors in seismic hazard modelling. Examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently being used for the assessment and mapping of regional seismic hazard, makes the best use of recent advancements in computer software and hardware, and is well structured to be implemented using conventional GIS tools.
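
    To make the cell-based idea concrete, a toy hazard computation over raster source cells can be written in a few lines: each cell contributes its annual event rate times the probability that its median ground motion, with lognormal scatter, exceeds a threshold at the site. The attenuation relation, cell rates and magnitudes below are invented placeholders, not values from the paper.

        import numpy as np
        from scipy.stats import norm

        def exceedance_rate(site, cells, rates, mags, ln_threshold):
            """Annual rate of ln(ground motion) > ln_threshold at `site`,
            summed over raster source cells (toy attenuation law)."""
            total = 0.0
            for (cx, cy), rate, m in zip(cells, rates, mags):
                r = np.hypot(site[0] - cx, site[1] - cy) + 1.0      # km
                mu = -1.0 + 0.9 * m - 1.3 * np.log(r)               # toy median
                total += rate * norm.sf(ln_threshold, loc=mu, scale=0.6)
            return total

        cells = [(0.0, 0.0), (30.0, 10.0)]   # cell centroids (km)
        rates = [0.02, 0.01]                 # annual event rates per cell
        mags  = [6.0, 6.5]                   # characteristic magnitudes
        print(exceedance_rate((10.0, 5.0), cells, rates, mags, np.log(0.2)))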

  18. The Origins of Scintillator Non-Proportionality

    Science.gov (United States)

    Moses, W. W.; Bizarri, G. A.; Williams, R. T.; Payne, S. A.; Vasil'ev, A. N.; Singh, J.; Li, Q.; Grim, J. Q.; Choong, W.-S.

    2012-10-01

    Recent years have seen significant advances in both theoretically understanding and mathematically modeling the underlying causes of scintillator non-proportionality. The core cause is that the interaction of radiation with matter invariably leads to a non-uniform ionization density in the scintillator, coupled with the fact that the light yield depends on the ionization density. The mechanisms that lead to the luminescence dependence on ionization density are incompletely understood, but several important features have been identified, notably Auger-like processes (where two carriers of excitation interact with each other, causing one to de-excite non-radiatively), the inability of excitation carriers to recombine (caused either by trapping or physical separation), and the carrier mobility. This paper reviews the present understanding of the fundamental origins of scintillator non-proportionality, specifically the various theories that have been used to explain non-proportionality.

  19. The unconvincing product - Consumer versus expert hazard identification: A mental models study of novel foods

    DEFF Research Database (Denmark)

    Hagemann, Kit; Scholderer, Joachim

    The study compares consumers' and experts' understanding of benefits and risks associated with three novel foods (a potato, rice and functional food ingredients) using a relatively new methodology for the study of risk perception called mental models. Mental models focus on the way people conceptualise hazardous processes and allow researchers to pit a normative analysis (expert mental models) against a descriptive analysis (consumer mental models). Expert models were elicited by means of a three-wave Delphi procedure from altogether 24 international experts, and consumer models from in-depth interviews with Danish consumers. The results revealed that consumers' and experts' mental models differed in scope. Experts focused on the types of hazards for which risk assessments can be conducted under current legal frameworks, whereas consumers were concerned about issues that lay outside the scope of current legislation. Experts...

  20. ASCHFLOW - A dynamic landslide run-out model for medium scale hazard analysis

    Czech Academy of Sciences Publication Activity Database

    Quan Luna, B.; Blahůt, Jan; van Asch, T.W.J.; van Westen, C.J.; Kappes, M.

    2016-01-01

    Vol. 3, 12 December (2016), article no. 29. E-ISSN 2197-8670. Institutional support: RVO:67985891. Keywords: landslides * run-out models * medium scale hazard analysis * quantitative risk assessment. Subject RIV: DE - Earth Magnetism, Geodesy, Geography

  1. Agent-based simulation for human-induced hazard analysis.

    Science.gov (United States)

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes appear to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.
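
    Since the validation above rests on the attack magnitudes being power-law distributed, a quick check one might run on a simulated catalog is the standard maximum-likelihood estimate of the power-law exponent (the Hill/Clauset estimator for continuous data, used here as an approximation); the sample values below are made up for illustration.

        import numpy as np

        def powerlaw_mle_alpha(x, xmin):
            """Approximate MLE of the power-law exponent for x >= xmin:
            alpha = 1 + n / sum(log(x / xmin))."""
            x = np.asarray(x, dtype=float)
            x = x[x >= xmin]
            return 1.0 + x.size / np.sum(np.log(x / xmin))

        # Hypothetical attack-magnitude samples (e.g. casualties per event):
        magnitudes = [3, 5, 2, 40, 7, 120, 4, 9, 2, 15, 6, 3]
        print(powerlaw_mle_alpha(magnitudes, xmin=2))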

  2. Seismic hazard in the Intermountain West

    Science.gov (United States)

    Haller, Kathleen; Moschetti, Morgan P.; Mueller, Charles; Rezaeian, Sanaz; Petersen, Mark D.; Zeng, Yuehua

    2015-01-01

    The 2014 national seismic-hazard model for the conterminous United States incorporates new scientific results and important model adjustments. The current model includes updates to the historical catalog, which is spatially smoothed using both fixed-length and adaptive-length smoothing kernels. Fault-source characterization improved by adding faults, revising rates of activity, and incorporating new results from combined inversions of geologic and geodetic data. The update also includes a new suite of published ground motion models. Changes in probabilistic ground motion are generally less than 10% in most of the Intermountain West compared to the prior assessment, and ground-motion hazard in four Intermountain West cities illustrates the range and magnitude of change in the region. Seismic hazard at reference sites in Boise and Reno increased as much as 10%, whereas hazard in Salt Lake City decreased 5–6%. The largest change was in Las Vegas, where hazard increased 32–35%.

  3. Large scale debris-flow hazard assessment: a geotechnical approach and GIS modelling

    Directory of Open Access Journals (Sweden)

    G. Delmonaco

    2003-01-01

    Full Text Available A deterministic distributed model has been developed for large-scale debris-flow hazard analysis in the basin of the River Vezza (Tuscany Region, Italy). This area (51.6 km2) was affected by over 250 landslides, classified as debris/earth flows, mainly involving the metamorphic geological formations outcropping in the area and triggered by the pluviometric event of 19 June 1996. In recent decades, landslide hazard and risk analysis has been favoured by the development of GIS techniques permitting the generalisation, synthesis and modelling of stability conditions at a large scale of investigation (>1:10 000). In this work, the main results derived from the application of a geotechnical model coupled with a hydrological model for debris-flow hazard analysis are reported. The analysis was developed through the following steps: a landslide inventory map derived from aerial photo interpretation and direct field survey; generation of a database and digital maps; elaboration of a DTM and derived themes (i.e. slope angle map); definition of a superficial soil thickness map; geotechnical soil characterisation through back-analysis on test slopes and laboratory tests; inference of the influence of precipitation, for distinct return times, on ponding time and pore pressure generation; implementation of a slope stability model (infinite slope model); and generalisation of the safety factor for estimated rainfall events with different return times. This approach has allowed the identification of potential source areas of debris-flow triggering for precipitation events with estimated return times of 10, 50, 75 and 100 years. The model shows a dramatic decrease of safety conditions for the simulation related to a 75-year return time rainfall event, corresponding to an estimated cumulative daily intensity of 280-330 mm. This value can be considered the hydrological triggering
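
    The infinite slope model mentioned in this record has a standard effective-stress safety-factor formula, FS = [c' + (gamma z cos^2(beta) - u) tan(phi')] / (gamma z sin(beta) cos(beta)), with pore pressure u = gamma_w m z cos^2(beta) for a water-table ratio m. A minimal sketch with illustrative parameter values (not those calibrated for the Vezza basin):

        import numpy as np

        def infinite_slope_fs(c_eff, phi_eff_deg, gamma, z, beta_deg,
                              m=0.0, gamma_w=9.81):
            """Factor of safety of an infinite slope (effective-stress form).
            c_eff [kPa], gamma [kN/m3], z = soil thickness [m], beta = slope
            angle, m = fraction of z below the water table (0 dry, 1 saturated)."""
            beta = np.radians(beta_deg)
            phi = np.radians(phi_eff_deg)
            u = gamma_w * m * z * np.cos(beta) ** 2   # pore pressure on slip plane
            resisting = c_eff + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
            driving = gamma * z * np.sin(beta) * np.cos(beta)
            return resisting / driving

        print(infinite_slope_fs(c_eff=5.0, phi_eff_deg=32.0, gamma=18.0,
                                z=1.5, beta_deg=35.0, m=0.8))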

  4. A Bayesian sampling strategy for hazardous waste site characterization

    International Nuclear Information System (INIS)

    Skalski, J.R.

    1987-12-01

    Prior knowledge based on historical records or physical evidence often suggests the existence of a hazardous waste site. Initial surveys may provide additional or even conflicting evidence of site contamination. This article presents a Bayes sampling strategy that allocates sampling at a site using this prior knowledge. This sampling strategy minimizes the environmental risks of missing chemical or radionuclide hot spots at a waste site. The environmental risk is shown to be proportional to the size of the undetected hot spot or inversely proportional to the probability of hot spot detection. 12 refs., 2 figs
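
    The stated proportionality can be illustrated with the common area-ratio approximation for the probability that a square sampling grid of spacing G intersects a circular hot spot of radius r; the cap at 1 and the restriction to r < G/2 are simplifying assumptions of this sketch, not part of the cited report.

        import math

        def detection_probability(radius, grid_spacing):
            """Approximate chance a square grid hits a circular hot spot:
            the area ratio pi*r^2 / G^2, capped at 1 (valid for r < G/2)."""
            return min(1.0, math.pi * radius ** 2 / grid_spacing ** 2)

        def relative_risk(radius, grid_spacing):
            """Risk taken as inversely proportional to detection probability."""
            return 1.0 / detection_probability(radius, grid_spacing)

        for g in (5.0, 10.0, 20.0):   # grid spacings in metres
            print(g, relative_risk(radius=2.0, grid_spacing=g))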

  5. Worklife expectancy in a cohort of Danish employees aged 55-65 years - comparing a multi-state Cox proportional hazard approach with conventional multi-state life tables.

    Science.gov (United States)

    Pedersen, Jacob; Bjorner, Jakob Bue

    2017-11-15

    Work life expectancy (WLE) expresses the expected time a person will remain in the labor market until he or she retires. This paper compares a life table approach to estimating WLE to an approach based on multi-state proportional hazards models. The two methods are used to estimate WLE in Danish members and non-members of an early retirement pensioning (ERP) scheme according to levels of health. In 2008, data on self-rated health (SRH) was collected from 5212 employees 55-65 years of age. Data on previous and subsequent long-term sickness absence, unemployment, returning to work, and disability pension was collected from national registers. WLE was estimated from multi-state life tables and through multi-state models. Results from the multi-state model approach agreed with the life table approach but provided narrower confidence intervals for small groups. The shortest WLE was seen for employees with poor SRH and ERP membership while the longest WLE was seen for those with good SRH and no ERP membership. Employees aged 55-56 years with poor SRH but no ERP membership had shorter WLE than employees with good SRH and ERP membership. Relative WLE reversed for the two groups after age 57. At age 55, employees with poor SRH could be expected to spend approximately 12 months on long-term sick leave and 9-10 months unemployed before they retired - regardless of ERP membership. ERP members with poor SRH could be expected to spend 4.6 years working, while non-members could be expected to spend 7.1 years working. WLE estimated through multi-state models provided an effective way to summarize complex data on labor market affiliation. WLE differed noticeably between members and non-members of the ERP scheme. It has been hypothesized that while ERP membership would prompt some employees to retire earlier than they would have done otherwise, this effect would be partly offset by reduced time spent on long-term sick leave or unemployment. Our data showed no indication of
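
    The multi-state machinery behind such WLE estimates can be illustrated with an absorbing Markov chain: with Q the monthly transition matrix among the transient states (work, sickness absence, unemployment), the fundamental matrix N = (I - Q)^(-1) gives the expected months spent in each state before absorption into retirement. The transition probabilities below are invented for illustration and are not the study's estimates.

        import numpy as np

        # Transient states: 0=work, 1=long-term sickness absence, 2=unemployed.
        # Hypothetical monthly transition probabilities among transient states;
        # each row's shortfall from 1 is the monthly absorption (retirement etc.).
        Q = np.array([
            [0.96, 0.02, 0.01],   # from work
            [0.30, 0.65, 0.02],   # from sickness absence
            [0.15, 0.02, 0.80],   # from unemployment
        ])

        N = np.linalg.inv(np.eye(3) - Q)   # expected visits (months) per state
        months = N[0]                      # starting from work
        print("WLE (years in work):", months[0] / 12.0)
        print("Expected months sick:", months[1])
        print("Expected months unemployed:", months[2])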

  6. Genetic analysis of days from calving to first insemination and days open in Danish Holsteins using different models and censoring scenarios

    DEFF Research Database (Denmark)

    Hou, Y; Madsen, P; Labouriau, R

    2009-01-01

    Five models that dealt with censored records in different ways were compared: 1) a conventional linear model (LM) in which a penalty of 21 d was added to censored records; 2) a bivariate threshold-linear model (TLM), which included a threshold model for the censoring status (0, 1) of the observations and a linear model for ICF or DO without any penalty on censored records; 3) a right-censored linear model (CLM); 4) a Weibull proportional hazard model (SMW); and 5) a Cox proportional hazard model (SMC) constructed with a piecewise constant baseline hazard function. The variance components for ICF and DO estimated from LM and TLM were similar, whereas CLM gave higher estimates of both additive genetic and residual components. Estimates of heritability from models LM, TLM, and CLM were very similar (0.102 to 0.108 for ICF, and 0.066 to 0.069 for DO). Heritabilities estimated using model SMW were 0.213 for ICF and 0.121 for DO...

  7. Physics-Based Simulations of Natural Hazards

    Science.gov (United States)

    Schultz, Kasey William

    Earthquakes and tsunamis are some of the most damaging natural disasters that we face. Just two recent events, the 2004 Indian Ocean earthquake and tsunami and the 2010 Haiti earthquake, claimed more than 400,000 lives. Despite their catastrophic impacts on society, our ability to predict these natural disasters is still very limited. The main challenge in studying the earthquake cycle is the non-linear and multi-scale nature of fault networks. Earthquakes are governed by physics across many orders of magnitude of spatial and temporal scales: from the scale of tectonic plates and their evolution over millions of years, down to the scale of rock fracturing over milliseconds to minutes at the sub-centimeter scale during an earthquake. Despite these challenges, there are useful patterns in earthquake occurrence. One such pattern, the frequency-magnitude relation, relates the number of large earthquakes to small earthquakes and forms the basis for assessing earthquake hazard. However, the utility of these relations is proportional to the length of our earthquake records, and typical records span at most a few hundred years. Utilizing physics-based interactions and techniques from statistical physics, earthquake simulations provide rich earthquake catalogs allowing us to measure otherwise unobservable statistics. In this dissertation I will discuss five applications of physics-based simulations of natural hazards, utilizing an earthquake simulator called Virtual Quake. The first is an overview of computing earthquake probabilities from simulations, focusing on the California fault system. The second uses simulations to help guide satellite-based earthquake monitoring methods. The third presents a new friction model for Virtual Quake and describes how we tune simulations to match reality. The fourth describes the process of turning Virtual Quake into an open source research tool. This section then focuses on a resulting collaboration using Virtual Quake for a detailed
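
    The frequency-magnitude relation referred to here is the Gutenberg-Richter law, log10 N(>=M) = a - b*M. A one-liner makes the scaling explicit; the a and b values below are illustrative, not fitted to any catalog.

        def gutenberg_richter_rate(m, a=4.0, b=1.0):
            """Annual number of earthquakes with magnitude >= m,
            from log10 N = a - b*m."""
            return 10.0 ** (a - b * m)

        for m in (5.0, 6.0, 7.0):
            print(m, gutenberg_richter_rate(m))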

  8. Development of a Probabilistic Tornado Wind Hazard Model for the Continental United States Volume I: Main Report

    International Nuclear Information System (INIS)

    Boissonnade, A; Hossain, Q; Kimball, J

    2000-01-01

    Since the mid-l980's, assessment of the wind and tornado risks at the Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526; Subsequent to the publication of UCRL53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels, at a given location, based on the methodology in UCRL-53526, are different than those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States

  9. Probabilistic seismic hazard assessment. Gentilly 2

    International Nuclear Information System (INIS)

    1996-03-01

    Results of this probabilistic seismic hazard assessment were determined using a suite of conservative assumptions. The intent of this study was to perform a limited hazard assessment that incorporated a range of technically defensible input parameters. To best achieve this goal, input selected for the hazard assessment tended to be conservative with respect to the selection of attenuation models and seismicity parameters. Seismic hazard estimates at Gentilly 2 were most affected by the selection of the attenuation model; alternative definitions of seismic source zones had a relatively small impact on seismic hazard. A St. Lawrence Rift model including a maximum magnitude of 7.2 mb in the zone containing the site had little effect on the hazard estimate relative to other seismic source zonation models. Mean annual probabilities of exceeding the design peak ground acceleration and the design response spectrum for the Gentilly 2 site were computed to lie in the range of 0.001 to 0.0001. This hazard result falls well within the range determined to be acceptable for nuclear reactor sites located throughout the eastern United States. (author) 34 refs., 6 tabs., 28 figs

  10. [Application of occupational hazard risk index model in occupational health risk assessment in a decorative coating manufacturing enterprises].

    Science.gov (United States)

    He, P L; Zhao, C X; Dong, Q Y; Hao, S B; Xu, P; Zhang, J; Li, J G

    2018-01-20

    Objective: To evaluate the occupational health risk of decorative coating manufacturing enterprises and to explore the applicability of the occupational hazard risk index model in health risk assessment, so as to provide a basis for the health management of enterprises. Methods: A decorative coating manufacturing enterprise in Hebei Province was chosen as the research object. According to the types of occupational hazards and contact patterns, the occupational hazard risk index model was used to evaluate the risk of occupational hazards in the key positions of the enterprise, and the results were verified against workplace measurements and occupational health examinations. Results: The positions of oil-based painters, water-based painters, filling workers and packers exposed to noise were rated as moderate harm, and the positions of colorant workers exposed to chromic acid salts and oil-based painters exposed to butyl acetate were rated as mild harm. Other positions were rated harmless. The abnormal rate for noise exposure in the physical examination results was 6.25%; no abnormality was found for the other risk factors. Conclusion: The occupational hazard risk index model can be used in the occupational health risk assessment of decorative coating manufacturing enterprises; noise was the key hazard among the occupational hazards in this enterprise.

  11. Report 2: Guidance document on practices to model and implement external flooding hazards in extended PSA

    International Nuclear Information System (INIS)

    Rebour, V.; Georgescu, G.; Leteinturier, D.; Raimond, E.; La Rovere, S.; Bernadara, P.; Vasseur, D.; Brinkman, H.; Groudev, P.; Ivanov, I.; Turschmann, M.; Sperbeck, S.; Potempski, S.; Hirata, K.; Kumar, Manorma

    2016-01-01

    This report provides a review of existing practices to model and implement external flooding hazards in existing level 1 PSA. The objective is to identify good practices on the modelling of initiating events (internal and external hazards) with a perspective of development of extended PSA and implementation of external events modelling in extended L1 PSA, its limitations/difficulties as far as possible. The views presented in this report are based on the ASAMPSA-E partners' experience and available publications. The report includes discussions on the following issues: - how to structure a L1 PSA for external flooding events, - information needed from geosciences in terms of hazards modelling and to build relevant modelling for PSA, - how to define and model the impact of each flooding event on SSCs with distinction between the flooding protective structures and devices and the effect of protection failures on other SSCs, - how to identify and model the common cause failures in one reactor or between several reactors, - how to apply HRA methodology for external flooding events, - how to credit additional emergency response (post-Fukushima measures like mobile equipment), - how to address the specific issues of L2 PSA, - how to perform and present risk quantification. (authors)

  12. Independent screening for single-index hazard rate models with ultrahigh dimensional features

    DEFF Research Database (Denmark)

    Gorst-Rasmussen, Anders; Scheike, Thomas

    2013-01-01

    The proposed method can be viewed as the natural survival equivalent of correlation screening. We state conditions under which the method admits the sure screening property within a class of single-index hazard rate models with ultrahigh dimensional features and describe the generally detrimental effect of censoring...

  13. Non-proportional odds multivariate logistic regression of ordinal family data.

    Science.gov (United States)

    Zaloumis, Sophie G; Scurrah, Katrina J; Harrap, Stephen B; Ellis, Justine A; Gurrin, Lyle C

    2015-03-01

    Methods to examine whether genetic and/or environmental sources can account for the residual variation in ordinal family data usually assume proportional odds. However, standard software to fit the non-proportional odds model to ordinal family data is limited because the correlation structure of family data is more complex than for other types of clustered data. To perform these analyses we propose the non-proportional odds multivariate logistic regression model and take a simulation-based approach to model fitting using Markov chain Monte Carlo methods, such as partially collapsed Gibbs sampling and the Metropolis algorithm. We applied the proposed methodology to male pattern baldness data from the Victorian Family Heart Study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Seismic hazard in the eastern United States

    Science.gov (United States)

    Mueller, Charles; Boyd, Oliver; Petersen, Mark D.; Moschetti, Morgan P.; Rezaeian, Sanaz; Shumway, Allison

    2015-01-01

    The U.S. Geological Survey seismic hazard maps for the central and eastern United States were updated in 2014. We analyze results and changes for the eastern part of the region. Ratio maps are presented, along with tables of ground motions and deaggregations for selected cities. The Charleston fault model was revised, and a new fault source for Charlevoix was added. Background seismicity sources utilized an updated catalog, revised completeness and recurrence models, and a new adaptive smoothing procedure. Maximum-magnitude models and ground motion models were also updated. Broad, regional hazard reductions of 5%–20% are mostly attributed to new ground motion models with stronger near-source attenuation. The revised Charleston fault geometry redistributes local hazard, and the new Charlevoix source increases hazard in northern New England. Strong increases in mid- to high-frequency hazard at some locations—for example, southern New Hampshire, central Virginia, and eastern Tennessee—are attributed to updated catalogs and/or smoothing.

  15. An Uncertain Wage Contract Model with Adverse Selection and Moral Hazard

    Directory of Open Access Journals (Sweden)

    Xiulan Wang

    2014-01-01

    it can be characterized as an uncertain variable. Moreover, the employee's effort is unobservable to the employer, and the employee can select her effort level to maximize her utility. Thus, an uncertain wage contract model with adverse selection and moral hazard is established to maximize the employer's expected profit. The model analysis mainly focuses on the equivalent form of the proposed wage contract model and the optimal solution to this form. The optimal solution indicates that both the employee's effort level and the wage increase with the employee's ability. Lastly, a numerical example is given to illustrate the effectiveness of the proposed model.

  16. Model of predicting proportion of diesel fuel and engine oil in diesel ...

    African Journals Online (AJOL)

    The viscosity of diesel-adulterated SAE 40 engine oil at varying proportions of the mixture is presented. Regression, variation-of-intercept and power-parameter methods are used to develop polynomial and power-law functions for predicting the proportion of either diesel or engine oil in diesel-adulterated SAE 40 engine oil ...

  17. Linear non-threshold (LNT) radiation hazards model and its evaluation

    International Nuclear Information System (INIS)

    Min Rui

    2011-01-01

    In order to introduce the linear non-threshold (LNT) model used in studies of the dose effect of radiation hazards and to evaluate its application, a comprehensive literature analysis was made. The results show that the LNT model describes biological effects more accurately at high doses than at low doses. The repairable-conditionally repairable model of cell radiation effects fits cell survival curves well across the high, medium and low absorbed dose ranges. There are still many uncertainties in the assessment model of effective dose from internal radiation based on the LNT assumptions and individual mean organ equivalent dose, and it is necessary to establish gender-specific voxel human models that take gender differences into account. Overall, the various models have both advantages and disadvantages; until a new theory and model are established, the LNT model remains the most scientifically defensible choice. (author)
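
    For concreteness, the LNT assumption makes excess risk strictly proportional to dose, whereas the widely used linear-quadratic description of cell survival adds a quadratic term. The sketch below contrasts the two; all coefficients are illustrative placeholders, not values from the reviewed literature.

        import numpy as np

        def lnt_excess_risk(dose, alpha=0.05):
            """Linear non-threshold: excess risk proportional to dose."""
            return alpha * dose

        def lq_cell_survival(dose, alpha=0.2, beta=0.02):
            """Linear-quadratic cell survival, a common alternative at the
            cellular level: S(D) = exp(-(alpha*D + beta*D^2))."""
            return np.exp(-(alpha * dose + beta * dose ** 2))

        for d in (0.1, 1.0, 5.0):   # dose in Gy (illustrative coefficients)
            print(d, lnt_excess_risk(d), lq_cell_survival(d))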

  18. General methods for analyzing bounded proportion data

    OpenAIRE

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing a proportion response variable when the response variable Y can take values between zero and one, inclusive of zero and/or one. The models are the inflated GAMLSS model and the generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  19. Tornado hazard model with the variation effects of tornado intensity along the path length

    International Nuclear Information System (INIS)

    Hirakuchi, Hiromaru; Nohara, Daisuke; Sugimoto, Soichiro; Eguchi, Yuzuru; Hattori, Yasuo

    2015-01-01

    Most Japanese tornadoes have been reported near the coastline, where all Japanese nuclear power plants are located. Japanese electric power companies are required to assess tornado risks to the plants under a new regulation issued in 2013. The new regulatory guide exemplifies a tornado hazard model which cannot consider the variation of tornado intensity along the path length and consequently produces conservative risk estimates. The guide also recommends, as a region of interest, the long narrow strip along the coastline with a width of 5-10 km, although the model tends to estimate inadequate wind speeds there due to its limits of application. The purpose of this study is to propose a new tornado hazard model which can be applied to the long narrow strip area. The new model can also consider the variation of tornado intensity along the path length and across the path width. (author)

  20. Preparing a seismic hazard model for Switzerland: the view from PEGASOS Expert Group 3 (EG1c)

    Energy Technology Data Exchange (ETDEWEB)

    Musson, R. M. W. [British Geological Survey, West Mains Road, Edinburgh, EH9 3LA (United Kingdom); Sellami, S. [Swiss Seismological Service, ETH-Hoenggerberg, Zuerich (Switzerland); Bruestle, W. [Regierungspraesidium Freiburg, Abt. 9: Landesamt fuer Geologie, Rohstoffe und Bergbau, Ref. 98: Landeserdbebendienst, Freiburg im Breisgau (Germany)

    2009-05-15

    The seismic hazard model used in the PEGASOS project for assessing earthquake hazard at four NPP sites was a composite of four sub-models, each produced by a team of three experts. In this paper, one of these models is described in detail by the authors. A criticism sometimes levelled at probabilistic seismic hazard studies is that the process by which seismic source zones are arrived at is obscure, subjective and inconsistent. Here, we attempt to recount the stages by which the model evolved, and the decisions made along the way. In particular, a macro-to-micro approach was used, in which three main stages can be described. The first was the characterisation of the overall kinematic model, the 'big picture' of regional seismogenesis. Secondly, this was refined to a more detailed seismotectonic model. Lastly, this was used as the basis of individual sources, for which parameters can be assessed. Some basic questions had also to be answered about aspects of the approach to modelling to be used: for instance, is spatial smoothing an appropriate tool to apply? Should individual fault sources be modelled in an intra-plate environment? Also, the extent to which alternative modelling decisions should be expressed in a logic tree structure has to be considered. (author)

  1. Preparing a seismic hazard model for Switzerland: the view from PEGASOS Expert Group 3 (EG1c)

    International Nuclear Information System (INIS)

    Musson, R. M. W.; Sellami, S.; Bruestle, W.

    2009-01-01

    The seismic hazard model used in the PEGASOS project for assessing earthquake hazard at four NPP sites was a composite of four sub-models, each produced by a team of three experts. In this paper, one of these models is described in detail by the authors. A criticism sometimes levelled at probabilistic seismic hazard studies is that the process by which seismic source zones are arrived at is obscure, subjective and inconsistent. Here, we attempt to recount the stages by which the model evolved, and the decisions made along the way. In particular, a macro-to-micro approach was used, in which three main stages can be described. The first was the characterisation of the overall kinematic model, the 'big picture' of regional seismogenesis. Secondly, this was refined to a more detailed seismotectonic model. Lastly, this was used as the basis of individual sources, for which parameters can be assessed. Some basic questions had also to be answered about aspects of the approach to modelling to be used: for instance, is spatial smoothing an appropriate tool to apply? Should individual fault sources be modelled in an intra-plate environment? Also, the extent to which alternative modelling decisions should be expressed in a logic tree structure has to be considered. (author)

  2. Multi-hazard risk assessment of the Republic of Mauritius

    Science.gov (United States)

    Mysiak, Jaroslav; Galli, Alberto; Amadio, Mattia; Teatini, Chiara

    2013-04-01

    The Republic of Mauritius (ROM) is a small island developing state (SIDS), part of the Mascarene Islands in the West Indian Ocean, comprising Mauritius, Rodrigues, Agalega and St. Brandon islands and several islets. ROM is exposed to many natural hazards, notably cyclones, tsunamis, torrential precipitation, landslides and droughts, and is highly vulnerable to sea level rise (SLR) driven by human-induced climate change. The multi-hazard risk assessment presented in this paper is aimed at identifying the areas prone to flood, inundation and landslide hazard, and at informing the development of a strategy for disaster risk reduction (DRR) and climate change adaptation (CCA). The climate risk analysis, a central component of the work, is one of the first comprehensive climate modelling studies conducted for the country. Climate change may lift the temperature by 1-2 degrees Celsius by 2060-2070 and sizably increase the intensity and frequency of extreme precipitation events. According to the IPCC Fourth Assessment Report (AR4), the expected SLR ranges between 16 and 49 cm. Individually or in combination, the inland flood, coastal inundation and landslide hazards affect a large proportion of the country. Sea level rise and the changes in precipitation regimes will amplify existing vulnerabilities and create new ones. The paper outlines an action plan for disaster risk reduction that takes into account the likely effects of climate change. The action plan calls on the government to establish a National Platform for Disaster Risk Reduction as recommended by the Hyogo Framework for Action (HFA) 2005-2015. It consists of nine recommendations which, if put in practice, will significantly reduce the annual damage from natural hazards and produce additional (ancillary) benefits in economic, social and environmental terms.

  3. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    International Nuclear Information System (INIS)

    Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey

    2015-01-01

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships

  4. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Harper, Bryan [Oregon State University (United States); Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan [Pacific Northwest National Laboratory (United States); Tang, Kaizhi [Intelligent Automation, Inc. (United States); Heredia-Langner, Alejandro [Pacific Northwest National Laboratory (United States); Lins, Roberto [CPqAM, Oswaldo Cruz Foundation, FIOCRUZ-PE (Brazil); Harper, Stacey, E-mail: stacey.harper@oregonstate.edu [Oregon State University (United States)

    2015-06-15

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships.

  5. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    Science.gov (United States)

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.

  6. Worklife expectancy in a cohort of Danish employees aged 55–65 years - comparing a multi-state Cox proportional hazard approach with conventional multi-state life tables

    Directory of Open Access Journals (Sweden)

    Jacob Pedersen

    2017-11-01

    Full Text Available Abstract Background Work life expectancy (WLE expresses the expected time a person will remain in the labor market until he or she retires. This paper compares a life table approach to estimating WLE to an approach based on multi-state proportional hazards models. The two methods are used to estimate WLE in Danish members and non-members of an early retirement pensioning (ERP scheme according to levels of health. Methods In 2008, data on self-rated health (SRH was collected from 5212 employees 55–65 years of age. Data on previous and subsequent long-term sickness absence, unemployment, returning to work, and disability pension was collected from national registers. WLE was estimated from multi-state life tables and through multi-state models. Results Results from the multi-state model approach agreed with the life table approach but provided narrower confidence intervals for small groups. The shortest WLE was seen for employees with poor SRH and ERP membership while the longest WLE was seen for those with good SRH and no ERP membership. Employees aged 55–56 years with poor SRH but no ERP membership had shorter WLE than employees with good SRH and ERP membership. Relative WLE reversed for the two groups after age 57. At age 55, employees with poor SRH could be expected to spend approximately 12 months on long-term sick leave and 9–10 months unemployed before they retired – regardless of ERP membership. ERP members with poor SRH could be expected to spend 4.6 years working, while non-members could be expected to spend 7.1 years working. Conclusion WLE estimated through multi-state models provided an effective way to summarize complex data on labor market affiliation. WLE differed noticeably between members and non-members of the ERP scheme. It has been hypothesized that while ERP membership would prompt some employees to retire earlier than they would have done otherwise, this effect would be partly offset by reduced time spent on

  7. User's manual of a computer code for seismic hazard evaluation for assessing the threat to a facility by fault model. SHEAT-FM

    International Nuclear Information System (INIS)

    Sugino, Hideharu; Onizawa, Kunio; Suzuki, Masahide

    2005-09-01

    To establish a reliability evaluation method for aged structural components, we developed a probabilistic seismic hazard evaluation code, SHEAT-FM (Seismic Hazard Evaluation for Assessing the Threat to a facility site - Fault Model), using a seismic motion prediction method based on a fault model. In order to improve the seismic hazard evaluation, this code takes the latest knowledge in the field of earthquake engineering into account; for example, it incorporates the group delay time of observed records and an update process model of active faults. This report describes the user's guide of SHEAT-FM, including an outline of the seismic hazard evaluation, the specification of input data, a sample problem for a model site, system information and the execution method. (author)

  8. Three-part joint modeling methods for complex functional data mixed with zero-and-one-inflated proportions and zero-inflated continuous outcomes with skewness.

    Science.gov (United States)

    Li, Haocheng; Staudenmayer, John; Wang, Tianying; Keadle, Sarah Kozey; Carroll, Raymond J

    2018-02-20

    We take a functional data approach to longitudinal studies with complex bivariate outcomes. This work is motivated by data from a physical activity study that measured 2 responses over time in 5-minute intervals. One response is the proportion of time active in each interval, a continuous proportion with excess zeros and ones. The other response, energy expenditure rate in the interval, is a continuous variable with excess zeros and skewness. This outcome is complex because there are 3 possible activity patterns in each interval (inactive, partially active, and completely active), and those patterns, which are observed, induce both nonrandom and random associations between the responses. More specifically, the inactive pattern requires a zero value in both the proportion for active behavior and the energy expenditure rate; a partially active pattern means that the proportion of activity is strictly between zero and one and that the energy expenditure rate is greater than zero and likely to be moderate; and the completely active pattern means that the proportion of activity is exactly one and the energy expenditure rate is greater than zero and likely to be higher. To address these challenges, we propose a 3-part functional data joint modeling approach. The first part is a continuation-ratio model for the 3 ordered activity patterns. The second part models the proportions when they are in the interval (0,1). The last component specifies the skewed continuous energy expenditure rate with Box-Cox transformations when it is greater than zero. In this 3-part model, the regression structures are specified as smooth curves measured at various time points with random effects that have a correlation structure. The smoothed random curves for each variable are summarized using a few important principal components, and the association of the 3 longitudinal components is modeled through the association of the principal component scores. The difficulties in
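
    The continuation-ratio decomposition in the first part of the model can be written in a few lines: one logit governs the probability of the inactive pattern, and a second governs the conditional probability of the partially active pattern given some activity. The sketch below uses arbitrary linear-predictor values and is not the authors' fitted model.

        import numpy as np

        def expit(z):
            return 1.0 / (1.0 + np.exp(-z))

        def continuation_ratio_probs(eta1, eta2):
            """Category probabilities for the ordered patterns
            (inactive, partially active, completely active):
            eta1 models P(inactive); eta2 models P(partial | not inactive)."""
            p_inactive = expit(eta1)
            p_partial = (1.0 - p_inactive) * expit(eta2)
            p_complete = 1.0 - p_inactive - p_partial
            return p_inactive, p_partial, p_complete

        print(continuation_ratio_probs(eta1=-0.5, eta2=1.0))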

  9. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    Science.gov (United States)

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.

  10. LAV@HAZARD: a Web-GIS Framework for Real-Time Forecasting of Lava Flow Hazards

    Science.gov (United States)

    Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.

    2014-12-01

    Crucial to lava flow hazard assessment is the development of tools for real-time prediction of flow paths, flow advance rates, and final flow lengths. Accurate prediction of flow paths and advance rates requires not only rapid assessment of eruption conditions (especially effusion rate) but also improved models of lava flow emplacement. Here we present the LAV@HAZARD web-GIS framework, which combines spaceborne remote sensing techniques and numerical simulations for real-time forecasting of lava flow hazards. By using satellite-derived discharge rates to drive a lava flow emplacement model, LAV@HAZARD allows timely definition of parameters and maps essential for hazard assessment, including the propagation time of lava flows and the maximum run-out distance. We take advantage of the flexibility of the HOTSAT thermal monitoring system to process satellite images coming from sensors with different spatial, temporal and spectral resolutions. HOTSAT was designed to ingest infrared satellite data acquired by the MODIS and SEVIRI sensors to output hot spot location, lava thermal flux and discharge rate. We use LAV@HAZARD to merge this output with the MAGFLOW physics-based model to simulate lava flow paths and to update, in a timely manner, flow simulations. Thus, any significant changes in lava discharge rate are included in the predictions. A significant benefit in terms of computational speed was obtained thanks to the parallel implementation of MAGFLOW on graphic processing units (GPUs). All this useful information has been gathered into the LAV@HAZARD platform which, due to the high degree of interactivity, allows generation of easily readable maps and a fast way to explore alternative scenarios. We will describe and demonstrate the operation of this framework using a variety of case studies pertaining to Mt Etna, Sicily. Although this study was conducted on Mt Etna, the approach used is designed to be applicable to other volcanic areas around the world.

  11. Earthquake hazard assessment in the Zagros Orogenic Belt of Iran using a fuzzy rule-based model

    Science.gov (United States)

    Farahi Ghasre Aboonasr, Sedigheh; Zamani, Ahmad; Razavipour, Fatemeh; Boostani, Reza

    2017-08-01

    Producing accurate seismic hazard maps and predicting hazardous areas is necessary for risk mitigation strategies. In this paper, a fuzzy logic inference system is utilized to estimate the earthquake potential and seismic zoning of the Zagros Orogenic Belt. In addition to their interpretability, fuzzy predictors can capture both nonlinearity and chaotic behavior of data where the number of observations is limited. The earthquake pattern in the Zagros has been assessed for intervals of 10 and 50 years using the fuzzy rule-based model. The Molchan statistical procedure has been used to show that our forecasting model is reliable. The earthquake hazard maps for this area reveal some remarkable features that cannot be observed on conventional maps. Regarding our achievements, some areas in the southern (Bandar Abbas), southwestern (Bandar Kangan) and western (Kermanshah) parts of Iran display high earthquake severity even though they are geographically far apart.
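
    A toy illustration of rule-based fuzzy inference of this kind; the membership functions, rules, and output levels below are invented for the example and are not the authors' rule base:

        import numpy as np

        def tri(x, a, b, c):
            # Triangular membership with support [a, c] and peak at b.
            return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def hazard_score(rate, mag):
            # Two illustrative Mamdani-style rules:
            #   IF rate is high AND magnitude is high THEN hazard is high
            #   IF rate is low  OR  magnitude is low  THEN hazard is low
            rate_high, rate_low = tri(rate, 0.5, 1.2, 2.0), tri(rate, -1.0, 0.0, 0.8)
            mag_high, mag_low = tri(mag, 5.5, 7.0, 8.5), tri(mag, 3.0, 4.0, 6.0)
            w_high = min(rate_high, mag_high)   # fuzzy AND -> min
            w_low = max(rate_low, mag_low)      # fuzzy OR  -> max
            # Weighted-average defuzzification over representative levels.
            return (0.2 * w_low + 0.9 * w_high) / (w_low + w_high + 1e-12)

        print(round(hazard_score(rate=1.1, mag=7.2), 2))  # more activity -> higher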

  12. BEHAVIORAL HAZARD IN HEALTH INSURANCE*

    Science.gov (United States)

    Baicker, Katherine; Mullainathan, Sendhil; Schwartzstein, Joshua

    2015-01-01

    A fundamental implication of standard moral hazard models is overuse of low-value medical care because copays are lower than costs. In these models, the demand curve alone can be used to make welfare statements, a fact relied on by much empirical work. There is ample evidence, though, that people misuse care for a different reason: mistakes, or “behavioral hazard.” Much high-value care is underused even when patient costs are low, and some useless care is bought even when patients face the full cost. In the presence of behavioral hazard, welfare calculations using only the demand curve can be off by orders of magnitude or even be the wrong sign. We derive optimal copay formulas that incorporate both moral and behavioral hazard, providing a theoretical foundation for value-based insurance design and a way to interpret behavioral “nudges.” Once behavioral hazard is taken into account, health insurance can do more than just provide financial protection—it can also improve health care efficiency. PMID:23930294

  13. Environmental justice implications of industrial hazardous waste generation in India: a national scale analysis

    Science.gov (United States)

    Basu, Pratyusha; Chakraborty, Jayajit

    2016-12-01

    While rising air and water pollution have become issues of widespread public concern in India, the relationship between spatial distribution of environmental pollution and social disadvantage has received less attention. This lack of attention becomes particularly relevant in the context of industrial pollution, as India continues to pursue industrial development policies without sufficient regard to its adverse social impacts. This letter examines industrial pollution in India from an environmental justice (EJ) perspective by presenting a national scale study of social inequities in the distribution of industrial hazardous waste generation. Our analysis connects district-level data from the 2009 National Inventory of Hazardous Waste Generating Industries with variables representing urbanization, social disadvantage, and socioeconomic status from the 2011 Census of India. Our results indicate that more urbanized and densely populated districts with a higher proportion of socially and economically disadvantaged residents are significantly more likely to generate hazardous waste. The quantity of hazardous waste generated is significantly higher in more urbanized but sparsely populated districts with a higher proportion of economically disadvantaged households, after accounting for other relevant explanatory factors such as literacy and social disadvantage. These findings underscore the growing need to incorporate EJ considerations in future industrial development and waste management in India.

  14. A structure for models of hazardous materials with complex behavior

    International Nuclear Information System (INIS)

    Rodean, H.C.

    1991-01-01

    Most atmospheric dispersion models used to assess the environmental consequences of accidental releases of hazardous chemicals do not have the capability to simulate the pertinent chemical and physical processes associated with the release of the material and its mixing with the atmosphere. The purpose of this paper is to present a materials sub-model with the flexibility to simulate the chemical and physical behaviour of a variety of materials released into the atmosphere. The model, which is based on thermodynamic equilibrium, incorporates the ideal gas law, temperature-dependent vapor pressure equations, temperature-dependent dissociation reactions, and reactions with atmospheric water vapor. The model equations, written in terms of pressure ratios and dimensionless parameters, are used to construct equilibrium diagrams with temperature and the mass fraction of the material in the mixture as coordinates. The model's versatility is demonstrated by its application to the release of UF6 and N2O4, two materials with very different physical and chemical properties. (author)

  15. A novel concurrent pictorial choice model of mood-induced relapse in hazardous drinkers.

    Science.gov (United States)

    Hardy, Lorna; Hogarth, Lee

    2017-12-01

    This study tested whether a novel concurrent pictorial choice procedure, inspired by animal self-administration models, is sensitive to the motivational effect of negative mood induction on alcohol-seeking in hazardous drinkers. Forty-eight hazardous drinkers (scoring ≥7 on the Alcohol Use Disorders Inventory) recruited from the community completed measures of alcohol dependence, depression, and drinking coping motives. Baseline alcohol-seeking was measured by percent choice to enlarge alcohol- versus food-related thumbnail images in two-alternative forced-choice trials. Negative and positive mood was then induced in succession by means of self-referential affective statements and music, and percent alcohol choice was measured after each induction in the same way as baseline. Baseline alcohol choice correlated with alcohol dependence severity, r = .42, p = .003, drinking coping motives (in two questionnaires, r = .33, p = .02 and r = .46, p = .001), and depression symptoms, r = .31, p = .03. Alcohol choice was increased by negative mood over baseline; this mood-induced increase in alcohol choice was not related to gender, alcohol dependence, drinking to cope, or depression symptoms (ps ≥ .37). The concurrent pictorial choice measure is a sensitive index of the relative value of alcohol, and provides an accessible experimental model to study negative mood-induced relapse mechanisms in hazardous drinkers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for the regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134
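
    The weighting idea can be sketched with standard survival software. A minimal example using the lifelines package, with made-up data and illustrative inverse-selection-probability weights standing in for the paper's estimating-equation weights:

        import pandas as pd
        from lifelines import CoxPHFitter

        # Toy cohort: T = follow-up time, E = failure indicator, x = covariate,
        # w = inverse of the (illustrative) probability of being sampled under
        # the outcome-dependent design.
        df = pd.DataFrame({
            "T": [5.0, 8.2, 3.1, 9.4, 2.2, 7.7, 4.0, 6.5],
            "E": [1, 0, 1, 0, 1, 0, 1, 0],
            "x": [1.2, 0.4, 2.1, 1.3, 0.2, 0.5, 1.1, 0.9],
            "w": [1.0, 4.0, 1.0, 4.0, 1.0, 4.0, 1.0, 4.0],
        })

        cph = CoxPHFitter()
        # robust=True requests a sandwich variance, appropriate with weights.
        cph.fit(df, duration_col="T", event_col="E", weights_col="w", robust=True)
        cph.print_summary()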

  17. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    Science.gov (United States)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    Reinsurance companies report a strong increase in natural-hazard-related losses, both insured and economic, over the last decades on a global scale. This ongoing trend can be described as a product of the dynamics in both the natural sphere and the anthroposphere. To analyze the potential impact of natural hazard processes on a certain insurance portfolio or on society in general, reinsurance companies and risk management consultants have developed loss models. However, those models generally do not fit the scale-dependent demands of regional analyses, as would be appropriate (i) for analyses at the scale of a specific province or (ii) for portfolio analyses of regional insurance companies. Moreover, the scientific basis of most of the models is not transparently documented, so scientific evaluation of their methodological concepts is not possible (black box). This is contrary to the scientific principles of transparency and traceability. Analyses must consider these circumstances adequately, especially in mountain regions like the European Alps with their inherent (i) specific small-scale characteristics, (ii) relatively high process dynamics in general, (iii) occurrence of gravitative mass movements, which are related to high relief energy and thus exist only in mountain regions, (iv) small proportion of the area of permanent settlement relative to the overall area, (v) high value concentration in the valley floors, and (vi) exposure of important infrastructure and lifelines, among others. Therefore, risk-based analyses methodically estimate the potential consequences of hazard processes on the built environment, standardized with the risk components (i) hazard, (ii) elements at risk, and (iii) vulnerability. However, most research and progress have been made in the field of hazard analyses, whereas the other two components are not developed accordingly. Since these three general components are influencing factors without any

  18. Application of decision tree model for the ground subsidence hazard mapping near abandoned underground coal mines.

    Science.gov (United States)

    Lee, Saro; Park, Inhye

    2013-09-30

    Subsidence of ground caused by underground mines poses hazards to human life and property. This study analyzed the hazard of ground subsidence using factors that can affect ground subsidence and a decision tree approach in a geographic information system (GIS). The study area was Taebaek, Gangwon-do, Korea, where many abandoned underground coal mines exist. Spatial data on topography, geology, and various ground-engineering properties of the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 50/50 for training and validation of the models. A data-mining classification technique was applied to the GSH mapping, and decision trees were constructed using the chi-squared automatic interaction detector (CHAID) and the quick, unbiased, and efficient statistical tree (QUEST) algorithms. The frequency ratio model was also applied to the GSH mapping for comparison with a probabilistic model. The resulting GSH maps were validated using area-under-the-curve (AUC) analysis with the subsidence area data that had not been used for training the model. The highest accuracy was achieved by the decision tree model using the CHAID algorithm (94.01%), compared with the QUEST algorithm (90.37%) and the frequency ratio model (86.70%). These accuracies are higher than previously reported results for decision trees. Decision tree methods can therefore be used efficiently for GSH analysis and might be widely used for prediction of various spatial events. Copyright © 2013. Published by Elsevier Ltd.
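
    CHAID and QUEST are not available in common open-source libraries, but the train/validate/AUC workflow can be sketched with a CART tree as a stand-in; the factor columns and data below are synthetic:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 1000
        # Synthetic raster-cell factors (e.g., slope, depth to workings,
        # distance to mine entrance); the label marks observed subsidence.
        X = rng.normal(size=(n, 3))
        y = (0.8 * X[:, 0] - 1.1 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

        # 50/50 split for training and validation, as in the study design.
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                                  random_state=0)
        tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, tree.predict_proba(X_te)[:, 1])
        print(f"AUC on held-out cells: {auc:.3f}")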

  19. A model used to derive hazardous waste concentration limits aiming at the reduction of toxic and hazardous wastes. Applications to illustrate the discharge of secondary categories types B and C

    International Nuclear Information System (INIS)

    Paris, P.

    1989-11-01

    This report describes a model which may be used to derive hazardous waste concentration limits in order to prevent groundwater pollution from landfill disposal. First, the leachate concentration limits are determined, taking into account the attenuation capacity of the landfill site as a whole; waste concentration limits are then derived by an elution model which assumes a constant ratio between liquid- and solid-phase concentrations. In the example, two types of landfill have been considered, and in each case concentration limits have been calculated for some hazardous substances and compared with the corresponding regulatory limits. (author)
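
    The elution step reduces to simple arithmetic: with a constant liquid-solid partition ratio Kd, an allowable leachate concentration maps directly to a waste concentration limit. A back-of-the-envelope sketch with illustrative numbers:

        def waste_limit(c_leachate_mg_per_L, kd_L_per_kg):
            # Waste concentration limit (mg/kg) from an allowable leachate
            # concentration (mg/L), assuming C_solid = Kd * C_liquid.
            return c_leachate_mg_per_L * kd_L_per_kg

        # Illustrative numbers only: 0.05 mg/L allowable in leachate and
        # Kd = 100 L/kg for the site as a whole -> 5 mg/kg limit in the waste.
        print(waste_limit(0.05, 100.0))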

  20. Combining computational models for landslide hazard assessment of Guantánamo province, Cuba

    NARCIS (Netherlands)

    Castellanos Abella, E.A.

    2008-01-01

    As part of the Cuban system for landslide disaster management, a methodology was developed for regional scale landslide hazard assessment, which is a combination of different models. The method was applied in Guantánamo province at 1:100 000 scale. The analysis started with an extensive aerial

  1. Semi-parametric proportional intensity models robustness for right-censored recurrent failure data

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, S.T. [College of Engineering, University of Oklahoma, 202 West Boyd St., Room 107, Norman, OK 73019 (United States); Landers, T.L. [College of Engineering, University of Oklahoma, 202 West Boyd St., Room 107, Norman, OK 73019 (United States)]. E-mail: landers@ou.edu; Rhoads, T.R. [College of Engineering, University of Oklahoma, 202 West Boyd St., Room 107, Norman, OK 73019 (United States)

    2005-10-01

    This paper reports the robustness of four proportional intensity (PI) models: Prentice-Williams-Peterson-gap time (PWP-GT), PWP-total time (PWP-TT), Andersen-Gill (AG), and Wei-Lin-Weissfeld (WLW), for right-censored recurrent failure event data. The results are beneficial to practitioners in anticipating the more favorable engineering application domains and selecting appropriate PI models. The PWP-GT and AG prove to be models of choice over ranges of sample sizes, shape parameters, and censoring severity. At the smaller sample size (U=60), where there are 30 per class for a two-level covariate, the PWP-GT performs well for moderate right-censoring (P_c ≤ 0.8), where 80% of the units have some censoring, and for moderately decreasing, constant, and moderately increasing rates of occurrence of failures (power-law NHPP shape parameter in the range 0.8 ≤ δ ≤ 1.8). For the large sample size (U=180), the PWP-GT performs well for severe right-censoring (0.8 ≤ P_c ≤ 1.0), while the AG model proves to outperform the PWP-TT and WLW for stationary processes (HPP) across a wide range of right-censorship (0.0 ≤ P_c ≤ 1.0) and for sample sizes of 60 or more.

  2. Three-dimensional displays for natural hazards analysis, using classified Landsat Thematic Mapper digital data and large-scale digital elevation models

    Science.gov (United States)

    Butler, David R.; Walsh, Stephen J.; Brown, Daniel G.

    1991-01-01

    Methods are described for using Landsat Thematic Mapper digital data and digital elevation models for the display of natural hazard sites in a mountainous region of northwestern Montana, USA. Hazard zones can be easily identified on the three-dimensional images. Proximity of facilities such as highways and building locations to hazard sites can also be easily displayed. A temporal sequence of Landsat TM (or similar) satellite data sets could also be used to display landscape changes associated with dynamic natural hazard processes.

  3. TRENT2D WG: a smart web infrastructure for debris-flow modelling and hazard assessment

    Science.gov (United States)

    Zorzi, Nadia; Rosatti, Giorgio; Zugliani, Daniel; Rizzi, Alessandro; Piffer, Stefano

    2016-04-01

    Mountain regions are naturally exposed to geomorphic flows, which involve large amounts of sediment and induce significant morphological modifications. The physical complexity of this class of phenomena represents a challenging issue for modelling, leading to elaborate theoretical frameworks and sophisticated numerical techniques. In general, geomorphic-flow models have proved to be valid tools in hazard assessment and management. However, model complexity seems to be one of the main obstacles to the diffusion of advanced modelling tools among practitioners and stakeholders, although the EU Flood Directive (2007/60/EC) requires risk management and assessment to be based on "best practices and best available technologies". Furthermore, several cutting-edge models are not particularly user-friendly, and multiple stand-alone software packages are needed to pre- and post-process modelling data. For all these reasons, users often resort to quicker and rougher approaches, possibly leading to unreliable results. Therefore, some effort seems necessary to overcome these drawbacks, with the purpose of supporting and encouraging a widespread diffusion of the most reliable, although sophisticated, modelling tools. With this aim, this work presents TRENT2D WG, a new smart modelling solution for the state-of-the-art model TRENT2D (Armanini et al., 2009, Rosatti and Begnudelli, 2013), which simulates debris flows and hyperconcentrated flows adopting a two-phase description over a mobile bed. TRENT2D WG is a web infrastructure combining the advantages offered by the Software-as-a-Service (SaaS) delivery model and by WebGIS technology, and hosting a complete and user-friendly working environment for modelling. In order to develop TRENT2D WG, the model TRENT2D was converted into a service and exposed on a cloud server, transferring computational burdens from the user hardware to a high-performing server and reducing computational time. Then, the system was equipped with an

  4. VHub - Cyberinfrastructure for volcano eruption and hazards modeling and simulation

    Science.gov (United States)

    Valentine, G. A.; Jones, M. D.; Bursik, M. I.; Calder, E. S.; Gallo, S. M.; Connor, C.; Carn, S. A.; Rose, W. I.; Moore-Russo, D. A.; Renschler, C. S.; Pitman, B.; Sheridan, M. F.

    2009-12-01

    Volcanic risk is increasing as populations grow in active volcanic regions, and as national economies become increasingly intertwined. In addition to their significance to risk, volcanic eruption processes form a class of multiphase fluid dynamics with rich physics on many length and time scales. Risk significance, physics complexity, and the coupling of models to complex dynamic spatial datasets all demand the development of advanced computational techniques and interdisciplinary approaches to understand and forecast eruption dynamics. Innovative cyberinfrastructure is needed to enable global collaboration and novel scientific creativity, while simultaneously enabling computational thinking in real-world risk mitigation decisions - an environment where quality control, documentation, and traceability are key factors. Supported by NSF, we are developing a virtual organization, referred to as VHub, to address this need. Overarching goals of the VHub project are: Dissemination. Make advanced modeling and simulation capabilities and key data sets readily available to researchers, students, and practitioners around the world. Collaboration. Provide a mechanism for participants not only to be users but also co-developers of modeling capabilities, and contributors of experimental and observational data sets for use in modeling and simulation, in a collaborative environment that reaches far beyond local work groups. Comparison. Facilitate comparison between different models in order to provide the practitioners with guidance for choosing the "right" model, depending upon the intended use, and provide a platform for multi-model analysis of specific problems and incorporation into probabilistic assessments. Application. Greatly accelerate access and application of a wide range of modeling tools and related data sets to agencies around the world that are charged with hazard planning, mitigation, and response. Education. Provide resources that will promote the training of the

  5. Molten salt hazardous waste disposal process utilizing gas/liquid contact for salt recovery

    International Nuclear Information System (INIS)

    Grantham, L.F.; McKenzie, D.E.

    1984-01-01

    The products of a molten salt combustion of hazardous wastes are converted into a cooled gas, which can be filtered to remove hazardous particulate material, and a dry flowable mixture of salts, which can be recycled for use in the molten salt combustion, by means of gas/liquid contact between the gaseous products of combustion of the hazardous waste and a solution produced by quenching the spent melt from such molten salt combustion. The process results in maximizing the proportion of useful materials recovered from the molten salt combustion and minimizing the volume of material which must be discarded. In a preferred embodiment a spray dryer treatment is used to achieve the desired gas/liquid contact

  6. A "mental models" approach to the communication of subsurface hydrology and hazards

    Science.gov (United States)

    Gibson, Hazel; Stewart, Iain S.; Pahl, Sabine; Stokes, Alison

    2016-05-01

    Communicating information about geological and hydrological hazards relies on appropriately worded communications targeted at the needs of the audience. But what are these needs, and how does the geoscientist discern them? This paper adopts a psychological "mental models" approach to assess the public perception of the geological subsurface, presenting the results of attitudinal studies and surveys in three communities in the south-west of England. The findings reveal important preconceptions and misconceptions regarding the impact of hydrological systems and hazards on the geological subsurface, notably in terms of the persistent conceptualisation of underground rivers and the inferred relations between flooding and human activity. The study demonstrates how such mental models can provide geoscientists with empirical, detailed and generalised data on perceptions surrounding an issue, as well as reveal unexpected outliers in perception that they may not have considered relevant, but which nevertheless may locally influence communication. Using this approach, geoscientists can develop information messages that more directly engage local concerns and create open engagement pathways based on dialogue, which in turn allow both geoscience "experts" and local "non-experts" to come together and understand each other more effectively.

  7. Proportionality for Military Leaders

    National Research Council Canada - National Science Library

    Brown, Gary D

    2000-01-01

    Especially lacking is a realization that there are four distinct types of proportionality. In determining whether a particular resort to war is just, national leaders must consider the proportionality of the conflict (i.e...

  8. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    Science.gov (United States)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.

  9. Modeling Flood Hazard Zones at the Sub-District Level with the Rational Model Integrated with GIS and Remote Sensing Approaches

    Directory of Open Access Journals (Sweden)

    Daniel Asare-Kyei

    2015-07-01

    Full Text Available Robust risk assessment requires accurate flood intensity area mapping to allow for the identification of populations and elements at risk. However, available flood maps in West Africa lack spatial variability while global datasets have resolutions too coarse to be relevant for local scale risk assessment. Consequently, local disaster managers are forced to use traditional methods such as watermarks on buildings and media reports to identify flood hazard areas. In this study, remote sensing and Geographic Information System (GIS techniques were combined with hydrological and statistical models to delineate the spatial limits of flood hazard zones in selected communities in Ghana, Burkina Faso and Benin. The approach involves estimating peak runoff concentrations at different elevations and then applying statistical methods to develop a Flood Hazard Index (FHI. Results show that about half of the study areas fall into high intensity flood zones. Empirical validation using statistical confusion matrix and the principles of Participatory GIS show that flood hazard areas could be mapped at an accuracy ranging from 77% to 81%. This was supported with local expert knowledge which accurately classified 79% of communities deemed to be highly susceptible to flood hazard. The results will assist disaster managers to reduce the risk to flood disasters at the community level where risk outcomes are first materialized.
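
    The peak-runoff step rests on the rational formula, Q = 0.278 C i A in SI units (C runoff coefficient, i rainfall intensity in mm/h, A area in km2). A sketch with invented values, followed by a min-max rescaling into a 0-1 index analogous to the FHI:

        import numpy as np

        def peak_runoff(c, i_mm_per_hr, area_km2):
            # Rational formula in SI form: Q (m^3/s) = 0.278 * C * i * A.
            return 0.278 * c * i_mm_per_hr * area_km2

        # Invented runoff coefficients and sub-areas for elevation bands of one
        # community, under a design storm of 80 mm/h.
        c = np.array([0.30, 0.45, 0.60, 0.75])
        a = np.array([2.0, 1.5, 1.0, 0.5])
        q = peak_runoff(c, 80.0, a)

        fhi = (q - q.min()) / (q.max() - q.min())  # 0-1 hazard index per band
        print(np.round(fhi, 2))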

  10. On adjustment for auxiliary covariates in additive hazard models for the analysis of randomized experiments

    DEFF Research Database (Denmark)

    Vansteelandt, S.; Martinussen, Torben; Tchetgen, E. J Tchetgen

    2014-01-01

    We consider additive hazard models (Aalen, 1989) for the effect of a randomized treatment on a survival outcome, adjusting for auxiliary baseline covariates. We demonstrate that the Aalen least-squares estimator of the treatment effect parameter is asymptotically unbiased, even when the hazard's dependence on time or on the auxiliary covariates is misspecified, and even away from the null hypothesis of no treatment effect. We furthermore show that adjustment for auxiliary baseline covariates does not change the asymptotic variance of the estimator of the effect of a randomized treatment. We conclude that, in view of its robustness against model misspecification, Aalen least-squares estimation is attractive for evaluating treatment effects on a survival outcome in randomized experiments, and the primary reasons to consider baseline covariate adjustment in such settings could be interest in subgroup effects.
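
    Aalen's additive model is available in standard software; a minimal sketch with synthetic randomized-treatment data, using the lifelines implementation:

        import numpy as np
        import pandas as pd
        from lifelines import AalenAdditiveFitter

        rng = np.random.default_rng(1)
        n = 500
        treat = rng.integers(0, 2, n)            # randomized treatment arm
        x = rng.normal(size=n)                   # auxiliary baseline covariate
        rate = 0.10 + 0.05 * treat + 0.02 * np.abs(x)
        T = rng.exponential(1 / rate)
        df = pd.DataFrame({"T": np.minimum(T, 10.0),
                           "E": (T < 10.0).astype(int),
                           "treat": treat, "x": x})

        aaf = AalenAdditiveFitter(fit_intercept=True)
        aaf.fit(df, duration_col="T", event_col="E")
        # Cumulative regression functions B(t); the 'treat' column tracks the
        # (possibly time-varying) additive treatment effect on the hazard.
        print(aaf.cumulative_hazards_.tail())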

  11. Landslides Hazard Assessment Using Different Approaches

    Directory of Open Access Journals (Sweden)

    Coman Cristina

    2017-06-01

    Full Text Available Romania is one of the European countries with a high frequency of landslide occurrence. Landslide hazard maps are designed by considering the interaction of several factors which, by their joint action, may affect the equilibrium state of natural slopes. The aim of this paper is landslide hazard assessment using the methodology provided by the Romanian national legislation and a very widely used statistical method. The final results of these two analyses are quantitative or semi-quantitative landslide hazard maps, created in a geographic information system environment. The database used for this purpose includes: geological and hydrogeological data, a digital terrain model, hydrological data, land use, seismic action, anthropic action, and an inventory of active landslides. The GIS landslide hazard models were built for the geographical area of Iasi city, located in the north-east of Romania.

  12. Multi-hazard risk analysis related to hurricanes

    Science.gov (United States)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards is investigated. In particular, the Weather Research and Forecasting model (WRF), with Geophysical Fluid Dynamics Laboratory (GFDL)'s hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (AD-CIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. The estimation of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is

  13. Semiparametric accelerated failure time cure rate mixture models with competing risks.

    Science.gov (United States)

    Choi, Sangbum; Zhu, Liang; Huang, Xuelin

    2018-01-15

    Modern medical treatments have substantially improved survival rates for many chronic diseases and have generated considerable interest in developing cure fraction models for survival data with a non-ignorable cured proportion. Statistical analysis of such data may be further complicated by competing risks that involve multiple types of endpoints. Regression analysis of competing risks is typically undertaken via a proportional hazards model adapted on cause-specific hazard or subdistribution hazard. In this article, we propose an alternative approach that treats competing events as distinct outcomes in a mixture. We consider semiparametric accelerated failure time models for the cause-conditional survival function that are combined through a multinomial logistic model within the cure-mixture modeling framework. The cure-mixture approach to competing risks provides a means to determine the overall effect of a treatment and insights into how this treatment modifies the components of the mixture in the presence of a cure fraction. The regression and nonparametric parameters are estimated by a nonparametric kernel-based maximum likelihood estimation method. Variance estimation is achieved through resampling methods for the kernel-smoothed likelihood function. Simulation studies show that the procedures work well in practical settings. Application to a sarcoma study demonstrates the use of the proposed method for competing risk data with a cure fraction. Copyright © 2017 John Wiley & Sons, Ltd.
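
    The mixture structure can be written compactly: the overall survival function is a multinomial-logit-weighted sum of cause-conditional survival functions plus a cured fraction. The sketch below uses illustrative Weibull components in place of the paper's semiparametric AFT components:

        import numpy as np

        def mixing_probs(x, alphas):
            # Multinomial-logit proportions over K competing endpoints, with
            # the cured class as reference. alphas: one (a0, a1) per endpoint.
            scores = np.array([np.exp(a0 + a1 * x) for a0, a1 in alphas])
            denom = 1.0 + scores.sum()
            return np.append(scores / denom, 1.0 / denom)  # last entry = cured

        def overall_survival(t, x):
            # Illustrative Weibull cause-conditional survival functions
            # standing in for the paper's semiparametric AFT components.
            s1 = np.exp(-((t / 3.0) ** 1.5))   # endpoint 1 (e.g., disease death)
            s2 = np.exp(-((t / 8.0) ** 1.2))   # endpoint 2 (e.g., other causes)
            p1, p2, p_cure = mixing_probs(x, alphas=[(0.2, 0.5), (-0.3, 0.1)])
            return p1 * s1 + p2 * s2 + p_cure  # cured subjects never fail

        print(round(overall_survival(t=5.0, x=1.0), 3))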

  14. A Novel Load Capacity Model with a Tunable Proportion of Load Redistribution against Cascading Failures

    Directory of Open Access Journals (Sweden)

    Zhen-Hao Zhang

    2018-01-01

    Full Text Available Defence against cascading failures is of great theoretical and practical significance. A novel load capacity model with a tunable proportion is proposed. We take degree and clustering coefficient into account to redistribute the loads of broken nodes. The redistribution is local: the loads of broken nodes are allocated to their nearest neighbours. Our model has been applied to artificial networks as well as two real networks. Simulation results show that networks become more vulnerable and more sensitive to intentional attacks as the average degree decreases. In addition, the critical threshold from the collapsed to the intact state is affected by the tunable parameter. We can adjust the tunable parameter to obtain the optimal critical threshold and make the systems more robust against cascading failures.
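
    A rough sketch of such a cascade with local, degree-and-clustering-weighted redistribution; the mixing parameter beta plays the role of the tunable proportion, and all numerical choices are illustrative:

        import networkx as nx

        def fail_node(G, load, capacity, v, beta=0.5):
            # Remove v and redistribute its load to its nearest neighbours with
            # weight w_u ~ beta*degree(u) + (1-beta)*clustering(u); beta is the
            # tunable proportion. Returns the newly overloaded neighbours.
            nbrs = list(G.neighbors(v))
            if nbrs:
                clust = nx.clustering(G, nbrs)
                w = {u: beta * G.degree(u) + (1 - beta) * clust[u] for u in nbrs}
                total = sum(w.values()) or 1.0
                for u in nbrs:
                    load[u] += load[v] * w[u] / total
            G.remove_node(v)
            del load[v]
            return [u for u in nbrs if load[u] > capacity[u]]

        G = nx.barabasi_albert_graph(200, 3, seed=42)
        load = {v: float(G.degree(v)) for v in G}    # initial load ~ degree
        capacity = {v: 1.5 * load[v] for v in G}     # tolerance factor 1.5
        queue = fail_node(G, load, capacity, v=0)    # attack one hub
        while queue:
            v = queue.pop()
            if v in load:                            # may already have failed
                queue += fail_node(G, load, capacity, v)
        print(f"{G.number_of_nodes()} of 200 nodes survive the cascade")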

  15. Seismic hazard, risk, and design for South America

    Science.gov (United States)

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 yrs. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. Resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from effects of future earthquake strong ground shaking. National models should be developed by scientists and engineers in each country using the best

  16. Assessment of Debris Flow Potential Hazardous Zones Using Numerical Models in the Mountain Foothills of Santiago, Chile

    Science.gov (United States)

    Celis, C.; Sepulveda, S. A.; Castruccio, A.; Lara, M.

    2017-12-01

    Debris and mudflows are some of the main geological hazards in the mountain foothills of Central Chile. The risk of flows triggered in the basins of ravines that drain the Andean frontal range into the capital city, Santiago, increases with time due to accelerated urban expansion. Susceptibility assessments were made by several authors to detect the main active ravines in the area. Macul and San Ramon ravines have a high to medium debris flow susceptibility, whereas Lo Cañas, Apoquindo and Las Vizcachas ravines have a medium to low debris flow susceptibility. This study emphasizes delimiting the potential hazardous zones using the numerical simulation program RAMMS-Debris Flows with the Voellmy model approach, and the debris-flow model LAHARZ. This is carried out by back-calculating the frictional parameters in the depositional zone with a known event, the debris and mudflows in Macul and San Ramon ravines on May 3rd, 1993, for the RAMMS approach. For the same scenario, we calibrate the coefficients of the LAHARZ model to match the conditions of the mountain foothills of Santiago. We use the information obtained for every main ravine in the study area, mainly for the similarity in slopes and material transported. Simulations were made for the worst-case scenario, caused by the combination of intense rainfall storms, a high 0°C isotherm level and material availability in the basins where the flows are triggered. The results show that the runout distances are well simulated, therefore a debris-flow hazard map could be developed with these models. Correlation issues concerning the run-up, deposit thickness and transversal areas are reported. Hence, the models do not represent entirely the complexity of the phenomenon, but they are a reliable approximation for preliminary hazard maps.

  17. Seismic hazard analysis. A methodology for the Eastern United States

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D L

    1980-08-01

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions, and associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of the various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)

  18. Equality of Shapley value and fair proportion index in phylogenetic trees.

    Science.gov (United States)

    Fuchs, Michael; Jin, Emma Yu

    2015-11-01

    The Shapley value and the fair proportion index of phylogenetic trees have been introduced recently for the purpose of making conservation decisions in genetics. Moreover, also very recently, Hartmann (J Math Biol 67:1163-1170, 2013) has presented data which shows that there is a strong correlation between a slightly modified version of the Shapley value (which we call the modified Shapley value) and the fair proportion index. He gave an explanation of this correlation by showing that the contribution of both indices to an edge of the tree becomes identical as the number of taxa tends to infinity. In this note, we show that the Shapley value and the fair proportion index are in fact the same. Moreover, we also consider the modified Shapley value and show that its covariance with the fair proportion index in random phylogenetic trees under the Yule-Harding model and uniform model is indeed close to one.
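
    The fair proportion index is easy to compute directly: each edge's length is divided evenly among the leaves below it, and a leaf's index is the sum of its shares. By the equality shown in the note, the same numbers are the Shapley values. A toy example:

        # Toy rooted phylogeny: 'tree' maps a node to its children; 'edge_len'
        # gives the length of the edge above each non-root node.
        tree = {"root": ["A1", "B"], "B": ["A2", "A3"]}
        edge_len = {"A1": 3.0, "B": 1.0, "A2": 2.0, "A3": 2.5}

        def leaves_below(v):
            kids = tree.get(v, [])
            return [v] if not kids else [lf for k in kids for lf in leaves_below(k)]

        fp = {leaf: 0.0 for leaf in leaves_below("root")}
        for v, length in edge_len.items():
            below = leaves_below(v)
            for leaf in below:
                fp[leaf] += length / len(below)

        print(fp)   # A1: 3.0, A2: 2.0 + 1.0/2 = 2.5, A3: 2.5 + 0.5 = 3.0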

  19. Tsunami-hazard assessment based on subaquatic slope-failure susceptibility and tsunami-inundation modeling

    Science.gov (United States)

    Anselmetti, Flavio; Hilbe, Michael; Strupler, Michael; Baumgartner, Christoph; Bolz, Markus; Braschler, Urs; Eberli, Josef; Liniger, Markus; Scheiwiller, Peter; Strasser, Michael

    2015-04-01

    Due to their smaller dimensions and confined bathymetry, lakes act as model oceans that may be used as analogues for the much larger oceans and their margins. Numerous studies in the perialpine lakes of Central Europe have shown that their shores were repeatedly struck by several-meters-high tsunami waves, which were caused by subaquatic slides usually triggered by earthquake shaking. A profound knowledge of these hazards, their intensities and recurrence rates is needed in order to perform thorough tsunami-hazard assessment for the usually densely populated lake shores. In this context, we present results of a study combining i) basinwide slope-stability analysis of subaquatic sediment-charged slopes with ii) identification of scenarios for subaquatic slides triggered by seismic shaking, iii) forward modeling of resulting tsunami waves and iv) mapping of intensity of onshore inundation in populated areas. Sedimentological, stratigraphical and geotechnical knowledge of the potentially unstable sediment drape on the slopes is required for slope-stability assessment. Together with critical ground accelerations calculated from already failed slopes and paleoseismic recurrence rates, scenarios for subaquatic sediment slides are established. Following a previously used approach, the slides are modeled as a Bingham plastic on a 2D grid. The effect on the water column and wave propagation are simulated using the shallow-water equations (GeoClaw code), which also provide data for tsunami inundation, including flow depth, flow velocity and momentum as key variables. Combining these parameters leads to so called «intensity maps» for flooding that provide a link to the established hazard mapping framework, which so far does not include these phenomena. The current versions of these maps consider a 'worst case' deterministic earthquake scenario, however, similar maps can be calculated using probabilistic earthquake recurrence rates, which are expressed in variable amounts of

  20. Survival analysis of a treatment data for cancer of the larynx

    International Nuclear Information System (INIS)

    Khan, K.

    2002-01-01

    In this paper a survival analysis of survival times for cancer of the larynx is carried out. The Cox regression model is fitted under the assumption of proportional hazards. A model is selected after inclusion and exclusion of candidate factors and variables as explanatory variables. The assumption of proportional hazards is tested in the manner suggested by Harrell (1986) and is supported by these tests. However, the plot of Schoenfeld residuals against dose gives slight evidence against the validity of the proportional hazards assumption; the assumption appears to be satisfied for the variable time. The martingale residuals suggest no pattern for the variable age. The functional form of dose is not linear; hence a quadratic term in dose is used as an explanatory variable. A comparison of logistic regression analysis and survival analysis is also made in this paper. It can be concluded that the Cox proportional hazards model is a better model than the logistic model, as it is more parsimonious and utilizes more information. (author)
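
    A comparable workflow in modern software: fit a Cox model with a quadratic dose term, then test the proportional hazards assumption via scaled Schoenfeld residuals. The data below are simulated, not the paper's:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.statistics import proportional_hazard_test

        rng = np.random.default_rng(2)
        n = 200
        dose = rng.uniform(40, 70, n)          # hypothetical dose values
        age = rng.uniform(50, 80, n)
        rate = 0.02 + 0.0005 * (dose - 55) ** 2
        T = rng.exponential(1 / rate)
        df = pd.DataFrame({"T": np.minimum(T, 60), "E": (T < 60).astype(int),
                           "dose_c": dose - 55, "dose_c2": (dose - 55) ** 2,
                           "age": age})

        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        cph.print_summary()
        # Scaled Schoenfeld residual test of proportional hazards, in the same
        # spirit as the Harrell-style checks described in the abstract.
        print(proportional_hazard_test(cph, df, time_transform="rank").summary)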

  1. Toward risk assessment 2.0: Safety supervisory control and model-based hazard monitoring for risk-informed safety interventions

    International Nuclear Information System (INIS)

    Favarò, Francesca M.; Saleh, Joseph H.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a staple in the engineering risk community, and it has become to some extent synonymous with the entire quantitative risk assessment undertaking. Limitations of PRA continue to occupy researchers, and workarounds are often proposed. After a brief review of this literature, we propose to address some of PRA's limitations by developing a novel framework and analytical tools for model-based system safety, or safety supervisory control, to guide safety interventions and support a dynamic approach to risk assessment and accident prevention. Our work shifts the emphasis from the pervading probabilistic mindset in risk assessment toward the notions of danger indices and hazard temporal contingency. The framework and tools here developed are grounded in Control Theory and make use of the state-space formalism in modeling dynamical systems. We show that the use of state variables enables the definition of metrics for accident escalation, termed hazard levels or danger indices, which measure the “proximity” of the system state to adverse events, and we illustrate the development of such indices. Monitoring of the hazard levels provides diagnostic information to support both on-line and off-line safety interventions. For example, we show how the application of the proposed tools to a rejected takeoff scenario provides new insight to support pilots’ go/no-go decisions. Furthermore, we augment the traditional state-space equations with a hazard equation and use the latter to estimate the times at which critical thresholds for the hazard level are (b)reached. This estimation process provides important prognostic information and produces a proxy for a time-to-accident metric or advance notice for an impending adverse event. The ability to estimate these two hazard coordinates, danger index and time-to-accident, offers many possibilities for informing system control strategies and improving accident prevention and risk mitigation
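
    A minimal numeric sketch of the idea, assuming a linear state-space model, a made-up critical threshold, and linear extrapolation for the time-to-threshold proxy:

        import numpy as np

        A = np.array([[1.0, 0.1],
                      [0.0, 1.0]])       # position/velocity dynamics, dt = 0.1
        b = np.array([0.0, 0.5])         # input vector (constant unit input)
        x = np.array([0.0, 0.0])         # state: [position, velocity]
        x_crit = 10.0                    # made-up critical position threshold

        for k in range(40):
            x = A @ x + b
            danger = x[0] / x_crit       # danger index: 0 safe, 1 at threshold
            # Prognostic proxy: steps until the threshold is (b)reached,
            # extrapolating the current velocity.
            steps_left = (x_crit - x[0]) / max(0.1 * x[1], 1e-9)
            if danger >= 0.8:
                print(f"step {k}: danger={danger:.2f}, ~{steps_left:.0f} steps left")
                break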

  2. Fatigue damage assessment under multi-axial non-proportional cyclic loading

    International Nuclear Information System (INIS)

    Mohta, Keshav; Gupta, Suneel K.; Jadhav, P.A.; Bhasin, V.; Vijayan, P.K.

    2016-01-01

    Detailed fatigue analysis is carried out for Class I Nuclear Power Plant (NPP) components to rule out fatigue failure during their design lifetime. The ASME Boiler and Pressure Vessel Code, Section III NB, has provided two schemes for fatigue assessment: one for fixed principal directions (proportional) loading and the other for varying principal directions (non-proportional) loading conditions. Recent literature on multi-axial fatigue tests has revealed lower fatigue lives under non-proportional loading conditions. In an attempt to understand the loading parameters lowering the fatigue life, a finite element based study has been carried out. Here, fatigue damage in a tube has been correlated with the applied axial-to-shear strain ratio and the phase difference between them. The FE analysis has used the Chaboche nonlinear kinematic hardening rule to model the material's realistic cyclic plastic deformation behavior. The ASME alternating stress intensity (based on linear elastic FEA) and the plastic strain energy dissipation (based on elastic-plastic FEA) have been considered to assess the per-cycle fatigue damage. The study has revealed that the ASME criterion predicts a lower alternating stress intensity (fatigue damage parameter S_alt) for some cases of non-proportional loading than that predicted for the corresponding proportional loading case. However, the actual fatigue damage is higher in non-proportional loading than in the corresponding proportional loading case. Further, the fatigue damage of an NPP component under realistic multi-axial cyclic loading conditions has been assessed using some popular critical-plane-based models vis-à-vis the ASME Sec. III criteria. (author)

  3. A methodology for physically based rockfall hazard assessment

    Directory of Open Access Journals (Sweden)

    G. B. Crosta

    2003-01-01

    Full Text Available Rockfall hazard assessment is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. The mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities with minimal runout. Rockfall hazard assessment involves complex definitions for "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at regional and local scales, both along linear features and within exposed areas. An objective approach based on three-dimensional matrices providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrices has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to constrain assessment criteria that are as objective as possible.

  4. Natural phenomena hazards project for Department of Energy sites

    International Nuclear Information System (INIS)

    Coats, D.W.

    1985-01-01

    Lawrence Livermore National Laboratory (LLNL) has developed seismic and wind hazard models for the Office of Nuclear Safety (ONS), Department of Energy (DOE). The work is part of a three-phase effort aimed at establishing uniform building design criteria for seismic and wind hazards at DOE sites throughout the United States. In Phase 1, LLNL gathered information on the sites and their critical facilities, including nuclear reactors, fuel-reprocessing plants, high-level waste storage and treatment facilities, and special nuclear material facilities. In Phase 2, development of seismic and wind hazard models was initiated. These hazard models express the annual probability that the site will experience an earthquake or wind speed greater than some specified magnitude. In the final phase, it is anticipated that the DOE will use the hazard models to establish uniform criteria for the design and evaluation of critical facilities. 13 references, 2 figures, 1 table

  5. Lindley frailty model for a class of compound Poisson processes

    Science.gov (United States)

    Kadilar, Gamze Özel; Ata, Nihal

    2013-10-01

    The Lindley distribution gains importance in survival analysis because of its similarity to the exponential distribution and its allowance for different shapes of the hazard function. Frailty models provide an alternative to the proportional hazards model in which misspecified or omitted covariates are described by an unobservable random variable. Although the frailty distribution is generally assumed to be continuous, in some circumstances it is appropriate to consider discrete frailty distributions. In this paper, frailty models with a discrete compound Poisson process for Lindley-distributed failure times are introduced. Survival functions are derived, and maximum likelihood estimation procedures for the parameters are studied. Then, the fit of the models to an earthquake data set from Turkey is examined.
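
    The Lindley hazard is available in closed form, h(t) = θ²(1+t)/(1+θ+θt), which rises from θ²/(1+θ) toward θ rather than staying constant as in the exponential case; a quick check:

        import numpy as np

        def lindley_hazard(t, theta):
            # h(t) = theta^2 * (1 + t) / (1 + theta + theta * t)
            return theta**2 * (1 + t) / (1 + theta + theta * t)

        t = np.linspace(0, 10, 5)
        print(np.round(lindley_hazard(t, theta=0.5), 3))  # increases toward 0.5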

  6. Human hazards

    International Nuclear Information System (INIS)

    Delpla, M.; Vignes, S.; Wolber, G.

    1976-01-01

    Among the health hazards from ionizing radiation, a distinction is made between observed, likely, and theoretical risks. Theoretical risks, derived by extrapolating observations of sublethal exposures down to low doses, may frighten; however, they have nothing in common with reality, as shown, for instance, by the study of carcinogenesis risks at Nagasaki. By extrapolation to low doses, geneticists derive theoretical mutation risks from the observation of certain especially deleterious characters in the progeny of parents exposed to sublethal doses. One cannot agree when, by calculation, they express a population exposure as a shift of its genetic balance with an increase in the proportion of disabled individuals. As a matter of fact, experimental exposure of successive generations of laboratory animals shows no accumulation of deleterious genes, sublethal doses excepted. Large nuclear plants should not be burdened with dreadful health-based charges, whereas small sources have all too often been shown to give rise to lethal risks [fr

  7. The mediation proportion: a structural equation approach for estimating the proportion of exposure effect on outcome explained by an intermediate variable

    DEFF Research Database (Denmark)

    Ditlevsen, Susanne; Christensen, Ulla; Lynch, John

    2005-01-01

    It is often of interest to assess how much of the effect of an exposure on a response is mediated through an intermediate variable. However, systematic approaches are lacking, other than assessment of a surrogate marker for the endpoint of a clinical trial. We review a measure of "proportion...... of several intermediate variables. Binary or categorical variables can be included directly through threshold models. We call this measure the mediation proportion, that is, the part of an exposure effect on outcome explained by a third, intermediate variable. Two examples illustrate the approach. The first...... example is a randomized clinical trial of the effects of interferon-alpha on visual acuity in patients with age-related macular degeneration. In this example, the exposure, mediator and response are all binary. The second example is a common problem in social epidemiology: to find the proportion...
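
    A minimal linear-model analogue of the mediation proportion can be computed with the classic "difference method": fit the outcome on the exposure with and without the mediator and compare coefficients. The sketch below uses simulated data and plain least squares; the variable names and effect sizes are invented, and this is a stand-in for the paper's structural equation approach, not its method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
exposure = rng.normal(size=n)
mediator = 0.8 * exposure + rng.normal(size=n)                    # X -> M
outcome  = 0.5 * exposure + 0.6 * mediator + rng.normal(size=n)   # X, M -> Y

def ols_coefs(y, columns):
    """Least-squares coefficients, intercept first."""
    A = np.column_stack([np.ones(len(y))] + list(columns))
    return np.linalg.lstsq(A, y, rcond=None)[0]

total  = ols_coefs(outcome, [exposure])[1]             # c: total effect
direct = ols_coefs(outcome, [exposure, mediator])[1]   # c': direct effect

# Mediation proportion: share of the total effect removed by adjustment.
print("mediation proportion ~", (total - direct) / total)   # ~0.49 here
```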

  8. Earthquake Hazard and Risk in Alaska

    Science.gov (United States)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including: extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increases the number of crustal faults from ten in 2007 to 91 faults in the 2015 model. This includes the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth. In this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), while large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as highly to the overall risk. We will review these recurrence rates, and will present the results and impact for Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact model changes have on Alaska earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the
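
    The Gutenberg-Richter relation mentioned above is easy to illustrate: annual exceedance rates follow log10 N(M) = a - b*M. The a- and b-values in this Python sketch are arbitrary, not those of the Alaska model.

```python
import numpy as np

# Gutenberg-Richter recurrence: N(M) is the annual rate of earthquakes of
# magnitude >= M. Illustrative a and b values only.
a, b = 5.0, 1.0
magnitudes = np.arange(5.0, 8.5, 0.5)

annual_rate = 10.0 ** (a - b * magnitudes)   # events per year with mag >= M
return_period = 1.0 / annual_rate            # mean years between such events

for m, r, t in zip(magnitudes, annual_rate, return_period):
    print(f"M>={m:.1f}: {r:.4f}/yr, return period ~ {t:,.0f} yr")
```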

  9. Considerations in comparing the U.S. Geological Survey one‐year induced‐seismicity hazard models with “Did You Feel It?” and instrumental data

    Science.gov (United States)

    White, Isabel; Liu, Taojun; Luco, Nicolas; Liel, Abbie

    2017-01-01

    The recent steep increase in seismicity rates in Oklahoma, southern Kansas, and other parts of the central United States led the U.S. Geological Survey (USGS) to develop, for the first time, a probabilistic seismic hazard forecast for one year (2016) that incorporates induced seismicity. In this study, we explore a process to ground‐truth the hazard model by comparing it with two databases of observations: modified Mercalli intensity (MMI) data from the “Did You Feel It?” (DYFI) system and peak ground acceleration (PGA) values from instrumental data. Because the 2016 hazard model was heavily based on earthquake catalogs from 2014 to 2015, this initial comparison utilized observations from these years. Annualized exceedance rates were calculated with the DYFI and instrumental data for direct comparison with the model. These comparisons required assessment of the options for converting hazard model results and instrumental data from PGA to MMI for comparison with the DYFI data. In addition, to account for known differences that affect the comparisons, the instrumental PGA and DYFI data were declustered, and the hazard model was adjusted for local site conditions. With these adjustments, examples at sites with the most data show reasonable agreement in the exceedance rates. However, the comparisons were complicated by the spatial and temporal completeness of the instrumental and DYFI observations. Furthermore, most of the DYFI responses are in the MMI II–IV range, whereas the hazard model is oriented toward forecasts at higher ground‐motion intensities, usually above about MMI IV. Nevertheless, the study demonstrates some of the issues that arise in making these comparisons, thereby informing future efforts to ground‐truth and improve hazard modeling for induced‐seismicity applications.
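
    The annualized exceedance rates referred to above amount to simple counting: the number of observations above a ground-motion threshold divided by the observation window. The sketch below uses invented PGA observations and thresholds, not the DYFI or instrumental data of the study.

```python
import numpy as np

# Hypothetical PGA observations at one site over a two-year window
observed_pga_g = np.array([0.002, 0.004, 0.010, 0.015, 0.031, 0.062, 0.090])
observation_years = 2.0                            # e.g. the 2014-2015 window

thresholds = np.array([0.005, 0.01, 0.05, 0.10])   # PGA levels, in g

# Empirical annualized exceedance rate per threshold, for direct comparison
# with a one-year hazard forecast.
for t in thresholds:
    rate = (observed_pga_g > t).sum() / observation_years
    print(f"PGA > {t:.3f} g: {rate:.1f} exceedances/yr")
```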

  10. Comparison of hypertabastic survival model with other unimodal hazard rate functions using a goodness-of-fit test.

    Science.gov (United States)

    Tahir, M Ramzan; Tran, Quang X; Nikulin, Mikhail S

    2017-05-30

    We studied the problem of testing a hypothesized distribution in survival regression models when the data are right censored and survival times are influenced by covariates. A modified chi-squared type test, known as the Nikulin-Rao-Robson statistic, is applied for the comparison of accelerated failure time models. This statistic is used to test the goodness-of-fit of the hypertabastic survival model and four other unimodal hazard rate functions. The results of the simulation study showed that the hypertabastic distribution can be used as an alternative to the log-logistic and log-normal distributions. In statistical modeling, because of the flexible shape of its hazard functions, this distribution can also be used as a competitor of the Birnbaum-Saunders and inverse Gaussian distributions. The results for the real data application are shown. Copyright © 2017 John Wiley & Sons, Ltd.

  11. An advanced model for spreading and evaporation of accidentally released hazardous liquids on land

    NARCIS (Netherlands)

    Trijssenaar-Buhre, I.J.M.; Sterkenburg, R.P.; Wijnant-Timmerman, S.I.

    2009-01-01

    Pool evaporation modelling is an important element in consequence assessment of accidentally released hazardous liquids. The evaporation rate determines the amount of toxic or flammable gas released into the atmosphere and is an important factor for the size of a pool fire. In this paper a

  12. An advanced model for spreading and evaporation of accidentally released hazardous liquids on land

    NARCIS (Netherlands)

    Trijssenaar-Buhre, I.J.M.; Wijnant-Timmerman, S.L.

    2008-01-01

    Pool evaporation modelling is an important element in consequence assessment of accidentally released hazardous liquids. The evaporation rate determines the amount of toxic or flammable gas released into the atmosphere and is an important factor for the size of a pool fire. In this paper a

  13. Level-Dependent Nonlinear Hearing Protector Model in the Auditory Hazard Assessment Algorithm for Humans

    Science.gov (United States)

    2015-04-01

    HPD model. In an article on measuring HPD attenuation, Berger (1986) points out that Real Ear Attenuation at Threshold (REAT) tests are...men. Audiology. 1991;30:345–356. Fedele P, Binseel M, Kalb J, Price GR. Using the auditory hazard assessment algorithm for humans (AHAAH) with

  14. Eastern US seismic hazard characterization update

    International Nuclear Information System (INIS)

    Savy, J.B.; Boissonnade, A.C.; Mensing, R.W.; Short, C.M.

    1993-06-01

    In January 1989, LLNL published the results of a multi-year project, funded by NRC, on estimating seismic hazard at nuclear plant sites east of the Rockies. The goal of this study was twofold: to develop a good central estimate (median) of the seismic hazard and to characterize the uncertainty in the estimates of this hazard. In 1989, LLNL was asked by DOE to develop site-specific estimates of the seismic hazard at the Savannah River Site (SRS) in South Carolina as part of the New Production Reactor (NPR) project. For the purpose of the NPR, a complete review of the methodology and of the data acquisition process was performed. Work done under the NPR project has shown that first-order improvement in the estimates of the uncertainty (i.e., lower mean hazard values) could easily be achieved by updating the modeling of the seismicity and ground motion attenuation uncertainty. To this effect, NRC sponsored LLNL to perform a re-elicitation to update the seismicity and ground motion experts' inputs and to revise methods to combine seismicity and ground motion inputs in the seismic hazard analysis for nuclear power plant sites east of the Rocky Mountains. The objective of the recent study was to include the first-order improvements that reflect the latest knowledge in seismicity and ground motion modeling and to produce an update of all the hazard results produced in the 1989 study. In particular, it had been demonstrated that eliciting seismicity information in terms of rates of earthquakes rather than a- and b-values, and changing the elicitation format to a one-on-one interview, improved our ability to express the uncertainty of earthquake rates of occurrence at large magnitudes. Thus, NRC sponsored this update study to refine the model of uncertainty and to re-elicit the experts' interpretations of the zonation and seismicity, as well as the ground motion models, based on the current state of knowledge

  15. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    Science.gov (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified in two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinions and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and are commonly used due to the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and new heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis to assess landslide hazard over large regions. However, they depend on the qualitative and quantitative data, scale, types of movements and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application has been developed in El Salvador, a country of Central America where earthquake-triggered landslides are common phenomena. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 2001 January 13th earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example, slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. The results of the landslide susceptibility analysis are checked using landslide
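
    The logistic-regression branch of such a comparison can be sketched in a few lines: landslide occurrence per terrain cell is regressed on physical predictors, and the fitted probabilities serve as a susceptibility map. The data, predictor set and coefficients below are simulated for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
slope     = rng.uniform(0, 60, n)      # degrees
elevation = rng.uniform(0, 2000, n)    # m
roughness = rng.uniform(0, 1, n)       # dimensionless index

# Synthetic "truth": steeper, rougher cells fail more often
logit = -4.0 + 0.08 * slope + 2.0 * roughness
landslide = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([slope, elevation, roughness])
model = LogisticRegression(max_iter=1000).fit(X, landslide)

# Susceptibility = predicted probability of slope failure per cell
susceptibility = model.predict_proba(X)[:, 1]
print("mean susceptibility:", susceptibility.mean().round(3))
```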

  16. A transparent and data-driven global tectonic regionalization model for seismic hazard assessment

    Science.gov (United States)

    Chen, Yen-Shin; Weatherill, Graeme; Pagani, Marco; Cotton, Fabrice

    2018-05-01

    A key concept that is common to many assumptions inherent within seismic hazard assessment is that of tectonic similarity. This recognizes that certain regions of the globe may display similar geophysical characteristics, such as in the attenuation of seismic waves, the magnitude scaling properties of seismogenic sources or the seismic coupling of the lithosphere. Previous attempts at tectonic regionalization, particularly within a seismic hazard assessment context, have often been based on expert judgements; in most of these cases, the process for delineating tectonic regions is neither reproducible nor consistent from location to location. In this work, the regionalization process is implemented in a scheme that is reproducible, comprehensible from a geophysical rationale, and revisable when new relevant data are published. A spatial classification scheme is developed based on fuzzy logic, enabling the quantification of concepts that are approximate rather than precise. Using the proposed methodology, we obtain a transparent and data-driven global tectonic regionalization model for seismic hazard applications, as well as the subjective probabilities (e.g. degree of being active/degree of being cratonic) that indicate the degree to which a site belongs in a tectonic category.
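
    The fuzzy-logic idea can be illustrated with simple membership functions: each site receives a degree of membership in every tectonic category instead of a hard label. The membership functions and the strain-rate proxy below are invented, not the datasets or functions used in the paper.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: rises on [a,b], flat on [b,c], falls on [c,d]."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

# Hypothetical crustal strain rates for four sites (nanostrain/yr)
strain_rate = np.array([0.1, 1.0, 10.0, 100.0])
log_sr = np.log10(strain_rate)

# Degrees of membership in two tectonic categories (parameters invented)
degree_cratonic = trapezoid(log_sr, -2.0, -1.5, 0.0, 0.8)
degree_active   = trapezoid(log_sr, 0.0, 0.8, 3.0, 4.0)

for s, c, a in zip(strain_rate, degree_cratonic, degree_active):
    print(f"{s:7.1f} nanostrain/yr: cratonic {c:.2f}, active {a:.2f}")
```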

  17. Use of a Bayesian isotope mixing model to estimate proportional contributions of multiple nitrate sources in surface water

    International Nuclear Information System (INIS)

    Xue Dongmei; De Baets, Bernard; Van Cleemput, Oswald; Hennessy, Carmel; Berglund, Michael; Boeckx, Pascal

    2012-01-01

    To identify different NO₃⁻ sources in surface water and to estimate their proportional contribution to the nitrate mixture in surface water, a dual isotope approach and a Bayesian isotope mixing model have been applied to six different surface waters affected by agriculture, greenhouses in an agricultural area, and households. Annual mean δ¹⁵N–NO₃⁻ values were between 8.0 and 19.4‰, while annual mean δ¹⁸O–NO₃⁻ values ranged from 4.5 to 30.7‰. SIAR was used to estimate the proportional contribution of five potential NO₃⁻ sources (NO₃⁻ in precipitation, NO₃⁻ fertilizer, NH₄⁺ in fertilizer and rain, soil N, and manure and sewage). SIAR showed that "manure and sewage" contributed the most, "soil N", "NO₃⁻ fertilizer" and "NH₄⁺ in fertilizer and rain" contributed intermediate amounts, and "NO₃⁻ in precipitation" contributed the least. The SIAR output can be considered a "fingerprint" of the NO₃⁻ source contributions. However, the wide range of isotope values observed in surface water and in the NO₃⁻ sources limits its applicability. - Highlights: ► The dual isotope approach (δ¹⁵N– and δ¹⁸O–NO₃⁻) identifies dominant nitrate sources in 6 surface waters. ► The SIAR model estimates proportional contributions for 5 nitrate sources. ► SIAR is a reliable approach to assess temporal and spatial variations of different NO₃⁻ sources. ► The wide range of isotope values observed in surface water and in the nitrate sources limits its applicability. - This paper successfully applied a dual isotope approach and a Bayesian isotopic mixing model to identify and quantify 5 potential nitrate sources in surface water.

  18. Development and validation of a risk model for prediction of hazardous alcohol consumption in general practice attendees: the predictAL study.

    Science.gov (United States)

    King, Michael; Marston, Louise; Švab, Igor; Maaroos, Heidi-Ingrid; Geerlings, Mirjam I; Xavier, Miguel; Benjamin, Vicente; Torres-Gonzalez, Francisco; Bellon-Saameno, Juan Angel; Rotar, Danica; Aluoja, Anu; Saldivia, Sandra; Correa, Bernardo; Nazareth, Irwin

    2011-01-01

    Little is known about the risk of progression to hazardous alcohol use in people currently drinking at safe limits. We aimed to develop a prediction model (predictAL) for the development of hazardous drinking in safe drinkers. A prospective cohort study of adult general practice attendees in six European countries and Chile followed up over 6 months. We recruited 10,045 attendees between April 2003 and February 2005. 6193 European and 2462 Chilean attendees recorded AUDIT scores below 8 in men and 5 in women at recruitment and were used in modelling risk. 38 risk factors were measured to construct a risk model for the development of hazardous drinking using stepwise logistic regression. The model was corrected for overfitting and tested in an external population. The main outcome was hazardous drinking defined by an AUDIT score ≥8 in men and ≥5 in women. 69.0% of attendees were recruited, of whom 89.5% participated again after six months. The risk factors in the final predictAL model were sex, age, country, baseline AUDIT score, panic syndrome and lifetime alcohol problem. The predictAL model's average c-index across all six European countries was 0.839 (95% CI 0.805, 0.873). The Hedge's g effect size for the difference in log odds of predicted probability between safe drinkers in Europe who subsequently developed hazardous alcohol use and those who did not was 1.38 (95% CI 1.25, 1.51). External validation of the algorithm in Chilean safe drinkers resulted in a c-index of 0.781 (95% CI 0.717, 0.846) and Hedge's g of 0.68 (95% CI 0.57, 0.78). The predictAL risk model for development of hazardous consumption in safe drinkers compares favourably with risk algorithms for disorders in other medical settings and can be a useful first step in prevention of alcohol misuse.
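
    For a binary outcome such as progression to hazardous drinking, the c-index reported above is simply the area under the ROC curve: the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case. A minimal sketch on simulated risks:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

# Toy data: 0/1 outcome after follow-up, plus a predicted risk that is
# informative but noisy (all values invented).
outcome = rng.integers(0, 2, 500)
predicted_risk = np.clip(0.3 * outcome + 0.7 * rng.random(500), 0.0, 1.0)

print("c-index (AUC):", roc_auc_score(outcome, predicted_risk).round(3))
```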

  19. Development and validation of a risk model for prediction of hazardous alcohol consumption in general practice attendees: the predictAL study.

    Directory of Open Access Journals (Sweden)

    Michael King

    Full Text Available Little is known about the risk of progression to hazardous alcohol use in people currently drinking at safe limits. We aimed to develop a prediction model (predictAL) for the development of hazardous drinking in safe drinkers. A prospective cohort study of adult general practice attendees in six European countries and Chile followed up over 6 months. We recruited 10,045 attendees between April 2003 and February 2005. 6193 European and 2462 Chilean attendees recorded AUDIT scores below 8 in men and 5 in women at recruitment and were used in modelling risk. 38 risk factors were measured to construct a risk model for the development of hazardous drinking using stepwise logistic regression. The model was corrected for overfitting and tested in an external population. The main outcome was hazardous drinking defined by an AUDIT score ≥8 in men and ≥5 in women. 69.0% of attendees were recruited, of whom 89.5% participated again after six months. The risk factors in the final predictAL model were sex, age, country, baseline AUDIT score, panic syndrome and lifetime alcohol problem. The predictAL model's average c-index across all six European countries was 0.839 (95% CI 0.805, 0.873). The Hedge's g effect size for the difference in log odds of predicted probability between safe drinkers in Europe who subsequently developed hazardous alcohol use and those who did not was 1.38 (95% CI 1.25, 1.51). External validation of the algorithm in Chilean safe drinkers resulted in a c-index of 0.781 (95% CI 0.717, 0.846) and Hedge's g of 0.68 (95% CI 0.57, 0.78). The predictAL risk model for development of hazardous consumption in safe drinkers compares favourably with risk algorithms for disorders in other medical settings and can be a useful first step in prevention of alcohol misuse.

  20. Growth Optimal Portfolio Selection Under Proportional Transaction Costs with Obligatory Diversification

    International Nuclear Information System (INIS)

    Duncan, T.; Pasik Duncan, B.; Stettner, L.

    2011-01-01

    A continuous-time long-run growth optimal or optimal logarithmic utility portfolio with proportional transaction costs, consisting of a fixed proportional cost and a cost proportional to the volume of the transaction, is considered. The asset prices are modeled as the exponential of a diffusion with jumps whose parameters depend on a finite state Markov process of economic factors. An obligatory portfolio diversification is introduced, according to which it is required to invest at least a fixed small portion of the wealth in each asset.

  1. Introducing Geoscience Students to Numerical Modeling of Volcanic Hazards: The example of Tephra2 on VHub.org

    Directory of Open Access Journals (Sweden)

    Leah M. Courtland

    2012-07-01

    Full Text Available The Tephra2 numerical model for tephra fallout from explosive volcanic eruptions is specifically designed to enable students to probe ideas in model literacy, including code validation and verification, the role of simplifying assumptions, and the concepts of uncertainty and forecasting. This numerical model is implemented on the VHub.org website, a venture in cyberinfrastructure that brings together volcanological models and educational materials. The VHub.org resource provides students with the ability to explore and execute sophisticated numerical models like Tephra2. We present a strategy for using this model to introduce university students to key concepts in the use and evaluation of Tephra2 for probabilistic forecasting of volcanic hazards. Through this critical examination students are encouraged to develop a deeper understanding of the applicability and limitations of hazard models. Although the model and applications are intended for use in both introductory and advanced geoscience courses, they could easily be adapted to work in other disciplines, such as astronomy, physics, computational methods, data analysis, or computer science.

  2. Application Of Shared Gamma And Inverse-Gaussian Frailty Models ...

    African Journals Online (AJOL)

    Shared Gamma and Inverse-Gaussian frailty models are used to analyze the survival times of patients who are clustered according to cancer/tumor types under a parametric proportional hazards framework. The result of the ... However, no evidence is strong enough for preference of either the Gamma or the Inverse-Gaussian frailty.
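
    To make the frailty mechanism concrete, the sketch below evaluates the standard closed-form marginal survival under a shared gamma frailty (mean 1, variance θ) on top of a proportional hazards model; the baseline hazard and parameter values are arbitrary illustrations, not fitted values from the paper.

```python
import numpy as np

# With frailty Z ~ Gamma(mean 1, variance theta) acting multiplicatively on
# the hazard, the marginal survival is
#   S(t) = (1 + theta * H0(t) * exp(x'beta)) ** (-1/theta),
# where H0 is the baseline cumulative hazard.

def marginal_survival_gamma(t, theta, baseline_rate=0.1, lin_pred=0.0):
    H = baseline_rate * t * np.exp(lin_pred)   # exponential baseline, arbitrary
    return (1.0 + theta * H) ** (-1.0 / theta)

t = np.linspace(0.0, 30.0, 4)
print(marginal_survival_gamma(t, theta=0.5))   # with frailty heterogeneity
print(np.exp(-0.1 * t))                        # no-frailty reference
```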

  3. The anchor-based minimal important change, based on receiver operating characteristic analysis or predictive modeling, may need to be adjusted for the proportion of improved patients.

    Science.gov (United States)

    Terluin, Berend; Eekhout, Iris; Terwee, Caroline B

    2017-03-01

    Patients have their individual minimal important changes (iMICs) as their personal benchmarks to determine whether a perceived health-related quality of life (HRQOL) change constitutes a (minimally) important change for them. We denote the mean iMIC in a group of patients as the "genuine MIC" (gMIC). The aims of this paper are (1) to examine the relationship between the gMIC and the anchor-based minimal important change (MIC), determined by receiver operating characteristic analysis or by predictive modeling; (2) to examine the impact of the proportion of improved patients on these MICs; and (3) to explore the possibility of adjusting the MIC for the influence of the proportion of improved patients. Multiple simulations of patient samples involved in anchor-based MIC studies with different characteristics of HRQOL (change) scores and distributions of iMICs. In addition, a real data set is analyzed for illustration. The receiver operating characteristic-based and predictive modeling MICs equal the gMIC when the proportion of improved patients equals 0.5. The MIC is estimated higher than the gMIC when the proportion improved is greater than 0.5, and the MIC is estimated lower than the gMIC when the proportion improved is less than 0.5. Using an equation including the predictive modeling MIC, the log-odds of improvement, the standard deviation of the HRQOL change score, and the correlation between the HRQOL change score and the anchor results in an adjusted MIC reflecting the gMIC irrespective of the proportion of improved patients. Adjusting the predictive modeling MIC for the proportion of improved patients assures that the adjusted MIC reflects the gMIC. We assumed normal distributions and global perceived change scores that were independent of the follow-up score. Additionally, floor and ceiling effects were not taken into account. Copyright © 2017 Elsevier Inc. All rights reserved.
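
    The ROC-based MIC referred to above is the change-score cutpoint that best discriminates anchor-improved from not-improved patients (Youden criterion). A sketch on simulated data, in which roughly half the patients improve so, per the paper's result, the estimate should sit near the gMIC (5 in this toy population):

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)

# Simulated anchor (improved yes/no) and HRQOL change scores; all
# distribution parameters are invented for illustration.
improved = rng.integers(0, 2, 1000)
change = np.where(improved == 1,
                  rng.normal(8, 4, 1000),   # improved patients
                  rng.normal(1, 4, 1000))   # not-improved patients

fpr, tpr, thresholds = roc_curve(improved, change)
mic_roc = thresholds[np.argmax(tpr - fpr)]  # Youden-optimal cutpoint
print("ROC-based MIC ~", round(float(mic_roc), 2))
```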

  4. Hierarchical Neural Regression Models for Customer Churn Prediction

    Directory of Open Access Journals (Sweden)

    Golshan Mohammadi

    2013-01-01

    Full Text Available As customers are the main assets of each industry, customer churn prediction is becoming a major task for companies to remain in competition with competitors. In the literature, the better applicability and efficiency of hierarchical data mining techniques has been reported. This paper considers three hierarchical models by combining four different data mining techniques for churn prediction, which are back-propagation artificial neural networks (ANN), self-organizing maps (SOM), alpha-cut fuzzy c-means (α-FCM), and the Cox proportional hazards regression model. The hierarchical models are ANN + ANN + Cox, SOM + ANN + Cox, and α-FCM + ANN + Cox. In particular, the first component of the models aims to cluster data into churner and non-churner groups and also filter out unrepresentative data or outliers. Then, the clustered data are used as the outputs to assign customers to churner and non-churner groups by the second technique. Finally, the correctly classified data are used to create the Cox proportional hazards model. To evaluate the performance of the hierarchical models, an Iranian mobile dataset is considered. The experimental results show that the hierarchical models outperform the single Cox regression baseline model in terms of prediction accuracy, Type I and II errors, RMSE, and MAD metrics. In addition, the α-FCM + ANN + Cox model performs significantly better than the two other hierarchical models.

  5. The 2018 and 2020 Updates of the U.S. National Seismic Hazard Models

    Science.gov (United States)

    Petersen, M. D.

    2017-12-01

    During 2018 the USGS will update the 2014 National Seismic Hazard Models by incorporating new seismicity models, ground motion models, site factors, fault inputs, and by improving weights to ground motion models using empirical and other data. We will update the earthquake catalog for the U.S. and introduce new rate models. Additional fault data will be used to improve rate estimates on active faults. New ground motion models (GMMs) and site factors for Vs30 have been released by the Pacific Earthquake Engineering Research Center (PEER) and we will consider these in assessing ground motions in craton and extended margin regions of the central and eastern U.S. The USGS will also include basin-depth terms for selected urban areas of the western United States to improve long-period shaking assessments, using published depth estimates to 1.0 and 2.5 km/s shear wave velocities. We will produce hazard maps for input into the building codes that span a broad range of periods (0.1 to 5 s) and site classes (shear wave velocity from 2000 m/s to 200 m/s in the upper 30 m of the crust, Vs30). In the 2020 update we plan on including: a new national crustal model that defines basin depths required in the latest GMMs, new 3-D ground motion simulations for several urban areas, new magnitude-area equations, and new fault geodetic and geologic strain rate models. The USGS will also consider new 3-D ground motion simulations for inclusion in these long-period maps. These new models are being evaluated and will be discussed at one or more regional and topical workshops held at the beginning of 2018.

  6. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning.

    Directory of Open Access Journals (Sweden)

    Abe D Hofman

    Full Text Available We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning, and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and an information-integration perspective. These models are applied to two different datasets: a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online dataset, in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion on sequential rule-based and information-integration perspectives of cognitive development.
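
    The information-integration perspective can be caricatured in a few lines: a response driven by a weighted combination of the weight and distance dimensions rather than by discrete rules. The dimension weights below are invented, and this is only a conceptual stand-in for the paper's Weighted Sum Model, not its statistical formulation.

```python
# Toy integration rule for one balance-scale item: evidence is a weighted
# sum of the weight and distance differences between the two sides.
def predict_side(w_left, d_left, w_right, d_right, a=1.0, b=1.0):
    """a and b are hypothetical dimension weights; sign decides the response."""
    evidence = a * (w_right - w_left) + b * (d_right - d_left)
    if evidence == 0:
        return "balance"
    return "right" if evidence > 0 else "left"

# 3 weights at distance 2 vs. 2 weights at distance 4: torque says "right"
print(predict_side(w_left=3, d_left=2, w_right=2, d_right=4))
```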

  7. Seismic hazard assessment of Iran

    Directory of Open Access Journals (Sweden)

    M. Ghafory-Ashtiany

    1999-06-01

    Full Text Available The development of the new seismic hazard map of Iran is based on probabilistic seismic hazard computation using historical earthquake data, geology, tectonics, fault activity and seismic source models in Iran. These maps have been prepared to indicate the earthquake hazard of Iran in the form of iso-acceleration contour lines and seismic hazard zoning, by using current probabilistic procedures. They display the probabilistic estimates of Peak Ground Acceleration (PGA) for return periods of 75 and 475 years. The maps have been divided into intervals of 0.25 degrees in both latitudinal and longitudinal directions to calculate the peak ground acceleration values at each grid point and draw the seismic hazard curves. The results presented in this study will provide the basis for the preparation of seismic risk maps, the estimation of earthquake insurance premiums, and the preliminary site evaluation of critical facilities.
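
    The two return periods map to exceedance probabilities through the usual Poisson assumption, as the sketch below shows; the 475-year return period corresponds to the familiar 10% chance of exceedance in 50 years.

```python
import numpy as np

# Under a Poisson occurrence model, the probability of at least one
# exceedance in t years for a return period T is 1 - exp(-t/T).
def exceedance_probability(t_years, return_period):
    return 1.0 - np.exp(-t_years / return_period)

for T in (75, 475):
    p = exceedance_probability(50, T)
    print(f"T = {T} yr: P(exceedance in 50 yr) = {p:.1%}")
```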

  8. Tsunami hazard map in eastern Bali

    International Nuclear Information System (INIS)

    Afif, Haunan; Cipta, Athanasius

    2015-01-01

    Bali is a popular tourist destination for both Indonesian and foreign visitors. However, Bali is located close to the collision zone between the Indo-Australian Plate and the Eurasian Plate in the south and a back-arc thrust off the northern coast of Bali, leaving Bali prone to earthquakes and tsunamis. A tsunami hazard map is needed for a better understanding of the hazard level in a particular area, and tsunami modeling is one of the most reliable techniques to produce such a map. Tsunami modeling was conducted using TUNAMI N2 and set up for two tsunami source scenarios: the subduction zone in the south of Bali and the back thrust in the north of Bali. The tsunami hazard zone is divided into 3 zones: the first is a high hazard zone with inundation heights of more than 3 m; the second is a moderate hazard zone with inundation heights of 1 to 3 m; and the third is a low tsunami hazard zone with tsunami inundation heights of less than 1 m. The two scenarios showed that the southern region has a greater potential tsunami impact than the northern areas. This is clearly shown in the distribution of the inundated area in the south of Bali, which, including the islands of Nusa Penida, Nusa Lembongan and Nusa Ceningan, is wider than on the northern coast of Bali, although the northern region of Nusa Penida Island is more inundated due to the coastal topography.

  9. Tsunami hazard map in eastern Bali

    Energy Technology Data Exchange (ETDEWEB)

    Afif, Haunan, E-mail: afif@vsi.esdm.go.id [Geological Agency, Bandung (Indonesia); Cipta, Athanasius [Geological Agency, Bandung (Indonesia); Australian National University, Canberra (Australia)

    2015-04-24

    Bali is a popular tourist destination for both Indonesian and foreign visitors. However, Bali is located close to the collision zone between the Indo-Australian Plate and the Eurasian Plate in the south and a back-arc thrust off the northern coast of Bali, leaving Bali prone to earthquakes and tsunamis. A tsunami hazard map is needed for a better understanding of the hazard level in a particular area, and tsunami modeling is one of the most reliable techniques to produce such a map. Tsunami modeling was conducted using TUNAMI N2 and set up for two tsunami source scenarios: the subduction zone in the south of Bali and the back thrust in the north of Bali. The tsunami hazard zone is divided into 3 zones: the first is a high hazard zone with inundation heights of more than 3 m; the second is a moderate hazard zone with inundation heights of 1 to 3 m; and the third is a low tsunami hazard zone with tsunami inundation heights of less than 1 m. The two scenarios showed that the southern region has a greater potential tsunami impact than the northern areas. This is clearly shown in the distribution of the inundated area in the south of Bali, which, including the islands of Nusa Penida, Nusa Lembongan and Nusa Ceningan, is wider than on the northern coast of Bali, although the northern region of Nusa Penida Island is more inundated due to the coastal topography.

  10. Tsunami hazard map in eastern Bali

    Science.gov (United States)

    Afif, Haunan; Cipta, Athanasius

    2015-04-01

    Bali is a popular tourist destination for both Indonesian and foreign visitors. However, Bali is located close to the collision zone between the Indo-Australian Plate and the Eurasian Plate in the south and a back-arc thrust off the northern coast of Bali, leaving Bali prone to earthquakes and tsunamis. A tsunami hazard map is needed for a better understanding of the hazard level in a particular area, and tsunami modeling is one of the most reliable techniques to produce such a map. Tsunami modeling was conducted using TUNAMI N2 and set up for two tsunami source scenarios: the subduction zone in the south of Bali and the back thrust in the north of Bali. The tsunami hazard zone is divided into 3 zones: the first is a high hazard zone with inundation heights of more than 3 m; the second is a moderate hazard zone with inundation heights of 1 to 3 m; and the third is a low tsunami hazard zone with tsunami inundation heights of less than 1 m. The two scenarios showed that the southern region has a greater potential tsunami impact than the northern areas. This is clearly shown in the distribution of the inundated area in the south of Bali, which, including the islands of Nusa Penida, Nusa Lembongan and Nusa Ceningan, is wider than on the northern coast of Bali, although the northern region of Nusa Penida Island is more inundated due to the coastal topography.

  11. Estimating hurricane hazards using a GIS system

    Directory of Open Access Journals (Sweden)

    A. Taramelli

    2008-08-01

    Full Text Available This paper develops a GIS-based integrated approach to the multi-hazard model method, with reference to hurricanes. This approach has three components: data integration, hazard assessment and score calculation to estimate elements at risk such as affected area and affected population. First, spatial data integration issues within a GIS environment, such as geographical scales and data models, are addressed. In particular, the integration of physical parameters and population data is achieved by linking remotely sensed data with a high-resolution population distribution in GIS. In order to assess the number of affected people from heterogeneous data sources, the selection of spatial analysis units is fundamental. Second, specific multi-hazard tasks, such as hazard behaviour simulation and elements-at-risk assessment, are composed in order to understand complex hazards and provide support for decision making. Finally, the paper concludes that the integrated approach herein presented can be used to assist emergency management of hurricane consequences, in theory and in practice.
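
    The exposure step, estimating affected area and population, reduces to a raster overlay once the hazard footprint and the population layer share the same analysis units. A toy sketch with invented grids and a hurricane-force wind threshold:

```python
import numpy as np

# Hypothetical co-registered 5x5 grids: people per cell and modelled
# maximum sustained wind speed per cell.
population = np.random.default_rng(4).integers(0, 500, (5, 5))   # people/cell
wind_speed = np.random.default_rng(5).uniform(20, 80, (5, 5))    # m/s

# Cells above ~33 m/s (roughly hurricane force) count as affected.
affected_mask = wind_speed > 33.0
affected_area_cells = affected_mask.sum()
affected_population = population[affected_mask].sum()

print(affected_area_cells, "cells affected,", affected_population, "people affected")
```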

  12. Use of agent-based modelling in emergency management under a range of flood hazards

    Directory of Open Access Journals (Sweden)

    Tagg Andrew

    2016-01-01

    Full Text Available The Life Safety Model (LSM was developed some 15 years ago, originally for dam break assessments and for informing reservoir evacuation and emergency plans. Alongside other technological developments, the model has evolved into a very useful agent-based tool, with many applications for a range of hazards and receptor behaviour. HR Wallingford became involved in its use in 2006, and is now responsible for its technical development and commercialisation. Over the past 10 years the model has been applied to a range of flood hazards, including coastal surge, river flood, dam failure and tsunami, and has been verified against historical events. Commercial software licences are being used in Canada, Italy, Malaysia and Australia. A core group of LSM users and analysts has been specifying and delivering a programme of model enhancements. These include improvements to traffic behaviour at intersections, new algorithms for sheltering in high-rise buildings, and the addition of monitoring points to allow detailed analysis of vehicle and pedestrian movement. Following user feedback, the ability of LSM to handle large model ‘worlds’ and hydrodynamic meshes has been improved. Recent developments include new documentation, performance enhancements, better logging of run-time events and bug fixes. This paper describes some of the recent developments and summarises some of the case study applications, including dam failure analysis in Japan and mass evacuation simulation in England.

  13. Modelling of rational economic proportions of the balance sheet structure of the petrochemical enterprises

    Directory of Open Access Journals (Sweden)

    G. S. Tsvetkova

    2016-01-01

    Full Text Available The paper provides an assessment of the balance sheet structure of rival companies of the petrochemical complex of the Russian Federation. J. Aubert-Kriye's method is chosen as the main methodological tool. The method is demonstrated in practice using the petrochemical enterprises PJSC “Sibur”, PJSC “Nizhnekamskneftekhim” and JSC “Sterlitamak Petrochemical Plant”. The analysis of balance sheets showed that the enterprises have elements of an irrational structure. “Sibur” is marked by a low share of owner's equity and a high share of long-term liabilities. “Nizhnekamskneftekhim” is characterized by a high share of owner's equity, which is used for the development of the company and is more expensive in comparison with liabilities. “Sterlitamak Petrochemical Plant” has excessive liquidity ratios, which indicates an accumulation of cash and its diversion into receivables. At the same time, ongoing investment in equipment upgrades and capacity expansion makes it necessary to maintain a rational balance sheet structure at the enterprises of the petrochemical complex. Using “Nizhnekamskneftekhim” as an example, a rational balance sheet structure of the company is modeled. The sequence of calculations included diagnostics of the structural distribution of current assets and sources of funds; determination of the structure of the financial and active elements of the entity; and establishment of permissible limits of change of the basic proportions and ratios by the criteria of solvency and financial stability. Modeling of the structure of liabilities and current assets on the basis of J. Aubert-Kriye's method showed that the economic indicators of “Nizhnekamskneftekhim” can be improved. Further determination of the range of tolerance for the elements of liabilities and current assets will make it possible to maintain a balance of economic proportions and

  14. Earthquake Hazard and Risk in New Zealand

    Science.gov (United States)

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

    To quantify risk in New Zealand we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), will update several key source parameters. These updates include: implementation of a new set of crustal faults including multi-segment ruptures, updating the subduction zone geometry and recurrence rate, and implementing new background rates and a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 from the 2002 model to the 2012 model, which now includes over 500 individual fault sources. This includes the addition of many offshore faults in the northern, east-central, and southwest regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard changes in our updated model with those from the previous version. Changes between the two maps are discussed, as well as the drivers for these changes. We examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city with the majority of the risk exposure in the country (Auckland) lies in the region of lowest hazard, where little is known about the location of faults and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. Thus small changes to the background rates

  15. Hazards to nuclear plants from surface traffic accidents

    International Nuclear Information System (INIS)

    Hornyik, K.

    1975-01-01

    Analytic models have been developed for evaluating hazards to nuclear plants from hazardous-materials accidents in the vicinity of the plant. In particular, these models permit the evaluation of hazards from such accidents occurring on surface traffic routes near the plant. The analysis uses statistical information on accident rates, traffic frequency, and cargo-size distribution along with parameters describing properties of the hazardous cargo, plant design, and atmospheric conditions, to arrive at a conservative estimate of the annual probability of a catastrophic event. Two of the major effects associated with hazardous-materials accidents, explosion and release of toxic vapors, are treated by a common formalism which can be readily applied to any given case by means of a graphic procedure. As an example, for a typical case it is found that railroad shipments of chlorine in 55-ton tank cars constitute a greater hazard to a nearby nuclear plant than equally frequent rail shipments of explosives in amounts of 10 tons. 11 references. (U.S.)

  16. High-Dimensional Additive Hazards Regression for Oral Squamous Cell Carcinoma Using Microarray Data: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Omid Hamidi

    2014-01-01

    Full Text Available Microarray technology results in high-dimensional, low-sample-size data sets. Therefore, fitting sparse models is essential because only a small number of influential genes can reliably be identified. A number of variable selection approaches have been proposed for high-dimensional time-to-event data based on Cox proportional hazards where censoring is present. The present study applied three sparse variable selection techniques, the Lasso, smoothly clipped absolute deviation, and smooth integration of counting and absolute deviation, to gene expression survival time data using the additive risk model, which is adopted when the absolute effects of multiple predictors on the hazard function are of interest. The performances of the techniques were evaluated by time-dependent ROC curves and bootstrap .632+ prediction error curves. The genes selected by all methods were highly significant (P<0.001). The Lasso showed the maximum median area under the ROC curve over time (0.95) and smoothly clipped absolute deviation showed the lowest prediction error (0.105). It was observed that the genes selected by all methods improved the prediction of a purely clinical model, indicating the valuable information contained in the microarray features. It was concluded that the approaches used can satisfactorily predict survival based on selected gene expression measurements.
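
    The paper's penalized fits use the additive risk model; as a widely available stand-in, the sketch below runs an L1-penalized Cox regression with the lifelines library on simulated "gene" data, which performs the same kind of sparse selection on censored survival times. Data, dimensions and the penalty strength are all invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n, p = 200, 20                                   # tiny stand-in for microarray data
X = rng.normal(size=(n, p))
risk = 0.9 * X[:, 0] - 0.7 * X[:, 1]             # only 2 informative "genes"

time = rng.exponential(np.exp(-risk))            # shorter times at higher risk
event = rng.random(n) < 0.7                      # ~30% right censoring

df = pd.DataFrame(X, columns=[f"gene_{i}" for i in range(p)])
df["time"] = np.where(event, time, time * rng.random(n))  # censor early
df["event"] = event

# Pure lasso penalty (l1_ratio=1.0) shrinks uninformative coefficients
# toward zero; the threshold below is a loose post-hoc selection rule.
cph = CoxPHFitter(penalizer=0.5, l1_ratio=1.0)
cph.fit(df, duration_col="time", event_col="event")
print(cph.params_[cph.params_.abs() > 0.05])
```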

  17. STakeholder-Objective Risk Model (STORM): Determiningthe aggregated risk of multiple contaminant hazards in groundwater well catchments

    DEFF Research Database (Denmark)

    Enzenhoefer, R.; Binning, Philip John; Nowak, W.

    2015-01-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any......-pathway-receptor concept, mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired
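
    The aggregation logic, occurrence probability times impact summed over stochastically occurring spills, can be sketched with a small Monte Carlo loop; the spill probabilities and impact figures below are invented and stand in for the catchment-specific flow-and-transport results.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sim = 10_000

# Three hypothetical land-use hazards in a well catchment: annual spill
# probabilities and per-spill impact (a vulnerability-times-value proxy).
spill_prob = np.array([0.02, 0.005, 0.01])
impact     = np.array([0.10, 0.60, 0.25])

# Monte Carlo over one year: which hazards spill, and what do they add up to?
spills = rng.random((n_sim, 3)) < spill_prob
total_impact = (spills * impact).sum(axis=1)

print("P(any contamination)  :", (total_impact > 0).mean())
print("P(impact exceeds 0.3) :", (total_impact > 0.3).mean())
print("expected annual impact:", total_impact.mean().round(4))
```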

  18. Modeling retrospective attribution of responsibility to hazard-managing institutions: an example involving a food contamination incident.

    Science.gov (United States)

    Johnson, Branden B; Hallman, William K; Cuite, Cara L

    2015-03-01

    Perceptions of institutions that manage hazards are important because they can affect how the public responds to hazard events. Antecedents of trust judgments have received far more attention than antecedents of attributions of responsibility for hazard events. We build upon a model of retrospective attribution of responsibility to individuals to examine these relationships regarding five classes of institutions that bear responsibility for food safety: producers (e.g., farmers), processors (e.g., packaging firms), watchdogs (e.g., government agencies), sellers (e.g., supermarkets), and preparers (e.g., restaurants). A nationally representative sample of 1,200 American adults completed an Internet-based survey in which a hypothetical scenario involving contamination of diverse foods with Salmonella served as the stimulus event. Perceived competence and good intentions of the institution moderately decreased attributions of responsibility. A stronger factor was whether an institution was deemed (potentially) aware of the contamination and free to act to prevent or mitigate it. Responsibility was rated higher the more aware and free the institution. This initial model for attributions of responsibility to impersonal institutions (as opposed to individual responsibility) merits further development. © 2014 Society for Risk Analysis.

  19. Bibliography - Existing Guidance for External Hazard Modelling

    International Nuclear Information System (INIS)

    Decker, Kurt

    2015-01-01

    The bibliography of deliverable D21.1 includes existing international and national guidance documents and standards on external hazard assessment, together with a selection of recent scientific papers which are regarded to provide useful information on the state of the art of external event modelling. The literature database is subdivided into International Standards, National Standards, and Science Papers. The deliverable is treated as a 'living document' which is regularly updated as necessary during the lifetime of ASAMPSA-E. The database currently contains about 140 papers. Most of the articles are available as full-text versions in PDF format. The deliverable is available as an EndNote X4 database and as text files. The database includes the following information: Reference, Key words, Abstract (if available), PDF file of the original paper (if available), Notes (comments by the ASAMPSA-E consortium, if available). The database is stored at the ASAMPSA-E FTP server hosted by IRSN. PDF files of original papers are accessible through the EndNote software

  20. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    Science.gov (United States)

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  1. Playing against nature: improving earthquake hazard mitigation

    Science.gov (United States)

    Stein, S. A.; Stein, J.

    2012-12-01

    The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion damage. Hence if and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's is too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Thus society needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation costs. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the
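
    The framework's core is a one-dimensional optimization: pick the mitigation level n that minimizes total cost Q(n) = C(n) + E[D(n)], mitigation cost plus expected damage. The sketch below uses stylized cost and expected-damage curves, not calibrated values from the study.

```python
import numpy as np

# Candidate mitigation levels, e.g. seawall height in metres (hypothetical)
n = np.linspace(0.0, 10.0, 1001)

mitigation_cost = 50.0 * n                     # C(n): cost grows with level
expected_damage = 2000.0 * np.exp(-0.6 * n)    # E[D(n)]: damage falls as hazard is blocked

total_cost = mitigation_cost + expected_damage  # Q(n)
n_star = n[np.argmin(total_cost)]
print(f"optimal mitigation level ~ {n_star:.2f}, total cost {total_cost.min():.0f}")
```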

  2. Tsunami Hazard Preventing Based Land Use Planning Model Using GIS Techniques in Muang Krabi, Thailand

    Directory of Open Access Journals (Sweden)

    Abdul Salam Soomro

    2012-10-01

    Full Text Available The terrible tsunami disaster of 26 December 2004 hit Krabi, one of the most fascinating ecotourism provinces of southern Thailand, along with other regions such as Phangnga and Phuket, devastating human lives, coastal communications and economic activities. This research study aimed to generate a tsunami hazard prevention based land use planning model using GIS (Geographical Information Systems), based on a hazard suitability analysis approach. Different triggering factors, e.g. elevation, proximity to the shoreline, population density, mangrove, forest, stream and road, have been used based on the land use zoning criteria. Those criteria have been weighted using the Saaty scale of importance, one of the mathematical techniques. The model has been classified according to the land suitability classification. The various techniques of GIS, namely subsetting, spatial analysis, map difference and data conversion, have been used. The model has been generated with five categories, namely high, moderate, low, very low and not suitable regions, illustrated with their appropriate definitions for the decision makers to redevelop the region.

  3. Tsunami hazard preventing based land use planning model using GIS technique in Muang Krabi, Thailand

    International Nuclear Information System (INIS)

    Soormo, A.S.

    2012-01-01

    The terrible tsunami disaster of 26 December 2004 hit Krabi, one of the most fascinating ecotourism provinces of southern Thailand, along with other regions such as Phangnga and Phuket, devastating human lives, coastal communications and economic activities. This research study aimed to generate a tsunami hazard prevention based land use planning model using GIS (Geographical Information Systems), based on a hazard suitability analysis approach. Different triggering factors, e.g. elevation, proximity to the shoreline, population density, mangrove, forest, stream and road, have been used based on the land use zoning criteria. Those criteria have been weighted using the Saaty scale of importance, one of the mathematical techniques. The model has been classified according to the land suitability classification. The various techniques of GIS, namely subsetting, spatial analysis, map difference and data conversion, have been used. The model has been generated with five categories, namely high, moderate, low, very low and not suitable regions, illustrated with their appropriate definitions for the decision makers to redevelop the region. (author)
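
    The Saaty-scale weighting used in both records is the analytic hierarchy process (AHP) step: factor weights are the normalized principal eigenvector of a pairwise comparison matrix. The 3x3 matrix below compares three hypothetical factors, e.g. elevation, proximity to shoreline and population density; the comparison values are invented.

```python
import numpy as np

# Pairwise comparisons on the Saaty 1-9 scale (A[i, j] = importance of
# factor i relative to factor j; reciprocal below the diagonal).
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()           # normalize so the weights sum to 1

print("factor weights:", weights.round(3))
```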

  4. Stochastic modeling of a hazard detection and avoidance maneuver—The planetary landing case

    International Nuclear Information System (INIS)

    Witte, Lars

    2013-01-01

    Hazard Detection and Avoidance (HDA) functionality, that is, the ability to recognize and avoid potentially hazardous terrain features, is regarded as an enabling technology for upcoming robotic planetary landing missions. In the run-up to any landing mission, the landing site safety assessment is an important task in the systems and mission engineering process. To contribute to this task, this paper presents a mathematical framework to consider the HDA strategy and system constraints in this mission engineering aspect. The HDA maneuver is therefore modeled as a stochastic decision process based on Markov chains that maps an initial dispersion at an arrival gate to a new dispersion pattern affected by the divert decision-making and system constraints. The implications for an efficient numerical implementation are addressed. An example case study is given to demonstrate the implementation and use of the proposed scheme
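
    The Markov-chain mapping can be sketched directly: a probability vector over discretized terrain cells at the arrival gate is multiplied by a transition matrix encoding the divert logic, giving the post-HDA dispersion. The three-cell world and all probabilities below are invented.

```python
import numpy as np

# Cells: 0 = safe plain, 1 = rocky field, 2 = crater rim
initial_dispersion = np.array([0.5, 0.3, 0.2])   # arrival-gate probabilities

# transition[i, j] = P(final cell j | arrival above cell i), shaped by the
# divert decision-making and by system constraints (fuel, attitude, timing).
transition = np.array([
    [0.95, 0.04, 0.01],   # already safe: mostly stay
    [0.70, 0.25, 0.05],   # rocky: divert to the plain when possible
    [0.55, 0.15, 0.30],   # crater rim: diverting is harder
])

final_dispersion = initial_dispersion @ transition
print("final dispersion:", final_dispersion.round(3))
print("P(safe landing):", final_dispersion[0].round(3))
```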

  5. Seismic hazard map of the western hemisphere

    Science.gov (United States)

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($25 billion), and 1995 Kobe, Japan (>$100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the
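
    As an aside on the mapping convention: under a time-independent (Poisson) occurrence assumption, a 10% chance of exceedance in 50 years corresponds to a mean return period of roughly 475 years, which is why such maps are often labelled "475-year" maps. A one-line check:

        import math

        # P(at least one exceedance in t years) = 1 - exp(-t / T_return)
        p, t = 0.10, 50.0
        T_return = -t / math.log(1.0 - p)
        print(round(T_return))  # ~475 years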

  6. Seismic hazard map of the western hemisphere

    Directory of Open Access Journals (Sweden)

    J. G. Tanner

    1999-06-01

    Full Text Available Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($25 billion), and 1995 Kobe, Japan (>$100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions

  7. Flow-R, a model for susceptibility mapping of debris flows and other gravitational hazards at a regional scale

    Directory of Open Access Journals (Sweden)

    P. Horton

    2013-04-01

    Full Text Available The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org) and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, thus producing more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Amongst the possible datasets, the DEM is the only one that is really needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time
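
    For flavour, the multiple-flow-direction rule that the spreading builds on (Holmgren's algorithm) routes flow to downslope neighbours in proportion to tan(slope)^x, with larger exponents approaching single-direction routing. The gradients and exponent below are invented, and Flow-R's published modifications (e.g. persistence weighting) are not reproduced.

        import numpy as np

        # Slope gradients (tan of slope angle) from a cell to its 8 neighbours;
        # non-positive values are upslope and receive no flow (values invented).
        tan_beta = np.array([0.30, 0.05, 0.0, -0.10, 0.22, 0.0, 0.12, -0.05])
        x = 4.0  # Holmgren exponent: x = 1 gives strong dispersion, large x -> D8-like

        downslope = np.clip(tan_beta, 0.0, None) ** x
        p = downslope / downslope.sum()   # proportion of flow sent to each neighbour
        print(p.round(3))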

  8. Model-free approach to the estimation of radiation hazards. I. Theory

    International Nuclear Information System (INIS)

    Zaider, M.; Brenner, D.J.

    1986-01-01

    The experience of the Japanese atomic bomb survivors constitutes to date the major data base for evaluating the effects of low doses of ionizing radiation on human populations. Although numerous analyses have been performed and published concerning this experience, it is clear that no consensus has emerged as to the conclusions that may be drawn to assist in setting realistic radiation protection guidelines. In part this is an inherent consequence of the rather limited amount of data available. In this paper the authors address an equally important problem; namely, the use of arbitrary parametric risk models which have little theoretical foundation, yet almost totally determine the final conclusions drawn. They propose the use of a model-free approach to the estimation of radiation hazards.

  9. Evaluation of MEDALUS model for desertification hazard zonation using GIS; study area: Iyzad Khast plain, Iran.

    Science.gov (United States)

    Farajzadeh, Manuchehr; Egbal, Mahbobeh Nik

    2007-08-15

    In this study, the MEDALUS model is used along with GIS mapping techniques to assess the desertification hazard for a province of Iran. After creating a desertification database including 20 parameters, the first step consisted of developing maps of the four indices of the MEDALUS model: climate, soil, vegetation and land use. Since these parameters had mostly been defined for the Mediterranean region, the next step added further indicators such as groundwater and wind erosion. All of the layers were then weighted by the environmental conditions present in the area (following the same MEDALUS framework) before a desertification map was prepared. The comparison of the two maps based on the original and modified MEDALUS models indicates that the addition of more regionally-specific parameters into the model allows for a more accurate representation of desertification processes across the Iyzad Khast plain. The major factors affecting desertification in the area are climate, wind erosion, poor land quality management, vegetation degradation and the salinization of soil and water resources.
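
    In the standard MEDALUS formulation, the environmentally sensitive area index of a map cell is the geometric mean of its quality-layer indices, so adding regional layers (here groundwater and wind erosion) simply extends the product. The cell values below are invented for illustration.

        import numpy as np

        # Quality indices for one map cell, each scaled from 1 (best) to 2 (worst):
        # climate, soil, vegetation, land use, plus the added groundwater and
        # wind-erosion layers (values invented).
        layers = np.array([1.4, 1.7, 1.2, 1.5, 1.6, 1.8])

        # Geometric mean of the n layers, as in the MEDALUS framework.
        esai = layers.prod() ** (1.0 / layers.size)
        print(round(esai, 3))  # higher values indicate higher desertification hazard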

  10. Assessment of Meteorological Drought Hazard Area using GIS in ...

    African Journals Online (AJOL)

    Michael Horsfall

    The purpose of this study was to make a model of the meteorological drought hazard area using GIS. ... overlaying different hazard indicator maps in the GIS, deploying the new model. The final ..... Northeast Thailand Project Bangkok. Min. of.

  11. Socio-economic vulnerability to natural hazards - proposal for an indicator-based model

    Science.gov (United States)

    Eidsvig, U.; McLean, A.; Vangelsten, B. V.; Kalsnes, B.; Ciurean, R. L.; Argyroudis, S.; Winter, M.; Corominas, J.; Mavrouli, O. C.; Fotopoulou, S.; Pitilakis, K.; Baills, A.; Malet, J. P.

    2012-04-01

    Vulnerability assessment, with respect to natural hazards, is a complex process that must consider multiple dimensions of vulnerability, including both physical and social factors. Physical vulnerability refers to conditions of physical assets, and may be modeled by the intensity and magnitude of the hazard, the degree of physical protection provided by the natural and built environment, and the physical robustness of the exposed elements. Social vulnerability refers to the underlying factors leading to the inability of people, organizations, and societies to withstand impacts from natural hazards. Social vulnerability models can be used in combination with physical vulnerability models to estimate both direct losses, i.e. losses that occur during and immediately after the impact, and indirect losses, i.e. long-term effects of the event. The direct impact of a landslide typically includes casualties and damage to buildings and infrastructure, while indirect losses may include, for example, business closures or limitations in public services. The direct losses are often assessed using physical vulnerability indicators (e.g. construction material, height of buildings), while indirect losses are mainly assessed using social indicators (e.g. economic resources, demographic conditions). Within the EC-FP7 SafeLand research project, an indicator-based method was proposed to assess relative socio-economic vulnerability to landslides. The indicators represent the underlying factors which influence a community's ability to prepare for, deal with, and recover from the damage associated with landslides. The proposed model includes indicators representing demographic, economic and social characteristics as well as indicators representing the degree of preparedness and recovery capacity. Although the model focuses primarily on the indirect losses, it could easily be extended to include more physical indicators which account for the direct losses. Each indicator is individually
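
    A common construction for such an indicator-based index, sketched with invented numbers: indicators are min-max normalized, directions are aligned so that larger always means more vulnerable, and the results are aggregated with expert weights. This illustrates the general approach rather than the SafeLand model itself.

        import numpy as np

        # Hypothetical community indicators: % elderly, unemployment rate (%),
        # and a preparedness score in [0, 1] (higher preparedness = less vulnerable).
        raw = np.array([18.0, 7.5, 0.6])
        lo  = np.array([ 5.0,  2.0, 0.0])   # assumed minima over all communities
        hi  = np.array([30.0, 15.0, 1.0])   # assumed maxima over all communities

        z = (raw - lo) / (hi - lo)          # min-max normalization to [0, 1]
        z[2] = 1.0 - z[2]                   # invert indicators where high = less vulnerable

        weights = np.array([0.4, 0.3, 0.3])     # expert-assigned, summing to 1
        print(round(float(weights @ z), 3))     # relative vulnerability index in [0, 1]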

  12. On the modeling of uplink inter-cell interference based on proportional fair scheduling

    KAUST Repository

    Tabassum, Hina

    2012-10-03

    We derive a semi-analytical expression for the uplink inter-cell interference (ICI) assuming proportional fair scheduling (with a maximum normalized signal-to-noise ratio (SNR) criterion) deployed in the cellular network. The derived expression can be customized for different models of channel statistics that can capture path loss, shadowing, and fading. Firstly, we derive an expression for the distribution of the locations of the allocated user in a given cell. Then, we derive the distribution and moment generating function of the uplink ICI from one interfering cell. Finally, we determine the moment generating function of the cumulative uplink ICI from all interfering cells. The derived expression is utilized to evaluate important network performance metrics such as outage probability and fairness among users. The accuracy of the derived expressions is verified by comparing the obtained results to Monte Carlo simulations. © 2012 IEEE.
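
    A brief Monte Carlo sketch of the scheduling criterion itself: in each slot the scheduler serves the user with the largest SNR normalized by its own mean, which equalizes slot shares across users with unequal average SNRs. Rayleigh fading and all parameters are assumptions for illustration; the record's semi-analytical ICI derivation is not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)
        n_users, n_slots = 4, 10000
        mean_snr = np.array([1.0, 2.0, 4.0, 8.0])   # per-user average SNR (assumed)

        # Rayleigh fading: instantaneous SNR is exponentially distributed.
        snr = rng.exponential(mean_snr, size=(n_slots, n_users))

        # Proportional fair selection via the maximum normalized-SNR criterion.
        chosen = np.argmax(snr / mean_snr, axis=1)

        # Each user gets ~25% of slots despite the 8x spread in average SNR.
        print(np.bincount(chosen, minlength=n_users) / n_slots)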

  13. On the modeling of uplink inter-cell interference based on proportional fair scheduling

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2012-01-01

    We derive a semi-analytical expression for the uplink inter-cell interference (ICI) assuming proportional fair scheduling (with a maximum normalized signal-to-noise ratio (SNR) criterion) deployed in the cellular network. The derived expression can be customized for different models of channel statistics that can capture path loss, shadowing, and fading. Firstly, we derive an expression for the distribution of the locations of the allocated user in a given cell. Then, we derive the distribution and moment generating function of the uplink ICI from one interfering cell. Finally, we determine the moment generating function of the cumulative uplink ICI from all interfering cells. The derived expression is utilized to evaluate important network performance metrics such as outage probability and fairness among users. The accuracy of the derived expressions is verified by comparing the obtained results to Monte Carlo simulations. © 2012 IEEE.

  14. Digital proportional multi-resonant current controller for improving grid-connected photovoltaic systems

    NARCIS (Netherlands)

    Almeida, de P.M.; Barbosa, P.G.; Oliveira, J.G.; Duarte, J.L.; Ribeiro, P.F.

    2015-01-01

    This paper presents the modelling and design steps of a digital proportional multi-resonant controller used in a grid-connected photovoltaic (PV) system. It is shown that the use of only one Proportional-Resonant (PR) compensator, tuned to the system fundamental frequency, may have its effectiveness

  15. Divine proportions in attractive and nonattractive faces.

    Science.gov (United States)

    Pancherz, Hans; Knapp, Verena; Erbe, Christina; Heiss, Anja Melina

    2010-01-01

    To test Ricketts' 1982 hypothesis that facial beauty is measurable by comparing attractive and nonattractive faces of females and males with respect to the presence of the divine proportions. The analysis of frontal view facial photos of 90 cover models (50 females, 40 males) from famous fashion magazines and of 34 attractive (29 females, five males) and 34 nonattractive (13 females, 21 males) persons selected from a group of former orthodontic patients was carried out in this study. Based on Ricketts' method, five transverse and seven vertical facial reference distances were measured and compared with the corresponding calculated divine distances expressed in phi-relationships (φ = 1.618). Furthermore, transverse and vertical facial disproportion indices were created. For both the models and patients, all the reference distances varied largely from respective divine values. The average deviations ranged from 0.3% to 7.8% in the female groups of models and attractive patients with no difference between them. In the male groups of models and attractive patients, the average deviations ranged from 0.2% to 11.2%. When comparing attractive and nonattractive female, as well as male, patients, deviations from the divine values for all variables were larger in the nonattractive sample. Attractive individuals have facial proportions closer to the divine values than nonattractive ones. In accordance with the hypothesis of Ricketts, facial beauty is measurable to some degree. COPYRIGHT © 2009 BY QUINTESSENCE PUBLISHING CO, INC.
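
    The deviation measure behind such comparisons is straightforward: a measured distance is compared with its "divine" counterpart, obtained by scaling a reference distance by φ, and the difference is expressed as a percentage. The measurements below are invented.

        # Percent deviation of a measured facial distance from its divine value
        # (reference and measured distances are invented for illustration).
        PHI = 1.618

        reference_mm = 62.0             # some transverse reference distance
        measured_mm = 97.5              # distance predicted to equal reference * phi
        divine_mm = reference_mm * PHI  # ~100.3 mm

        deviation_pct = 100.0 * abs(measured_mm - divine_mm) / divine_mm
        print(round(deviation_pct, 1))  # ~2.8% deviation from the divine proportion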

  16. Recent developments in health risks modeling techniques applied to hazardous waste site assessment and remediation

    International Nuclear Information System (INIS)

    Mendez, W.M. Jr.

    1990-01-01

    Remediation of hazardous and mixed waste sites is often driven by assessments of the human health risks posed by exposures to hazardous substances released from these sites. The methods used to assess potential health risk involve, either implicitly or explicitly, models for pollutant releases, transport, human exposure and intake, and for characterizing health effects. Because knowledge about pollutant fate and transport processes at most waste sites is quite limited, and data costs are quite high, most of the models currently used to assess risk, and endorsed by regulatory agencies, are quite simple. The models employ many simplifying assumptions about pollutant fate and distribution in the environment, about human pollutant intake, and about toxicologic responses to pollutant exposures. An important consequence of data scarcity and model simplification is that risk estimates are quite uncertain, and estimating the magnitude of the uncertainty associated with risk assessments has been very difficult. A number of methods have been developed to address the issue of uncertainty in risk assessments in a manner that realistically reflects uncertainty in model specification and data limitations. These methods include the definition of multiple exposure scenarios, sensitivity analyses, and explicit probabilistic modeling of uncertainty. Recent developments in this area will be discussed, along with their possible impacts on remediation programs, and the remaining obstacles to their wider use and acceptance by the scientific and regulatory communities

  17. Proportional odds model applied to mapping of disease resistance genes in plants

    Directory of Open Access Journals (Sweden)

    Maria Helena Spyrides-Cunha

    2000-03-01

    Full Text Available Molecular markers have been used extensively to map quantitative trait loci (QTL) controlling disease resistance in plants. Mapping is usually done by establishing a statistical association between molecular marker genotypes and quantitative variations in disease resistance. However, most statistical approaches require a continuous distribution of the response variable, a requirement not always met, since evaluation of disease resistance is often done using visual ratings based on an ordinal scale of disease severity. This paper discusses the application of the proportional odds model to the mapping of disease resistance genes in plants amenable to expression as ordinal data. The model was used to map two QTL for resistance of maize to Puccinia sorghi. The microsatellite markers bngl166 and bngl669, located on chromosomes 2 and 8, respectively, were used to genotype F2 individuals from a segregating population. Genotypes at each marker locus were then compared by assessing disease severity, on an ordinal severity scale, in F3 plants derived from the selfing of each genotyped F2 plant. The residual deviance and the chi-square score statistic indicated a good fit of the model to the data, and the odds had a constant proportionality at each threshold. Single-marker analyses detected significant differences among marker genotypes at both marker loci, indicating that these markers were linked to disease resistance QTL. The inclusion of an interaction term after single-marker analysis provided strong evidence of an epistatic interaction between the two QTL. These results indicate that the proportional odds model can be used as an alternative to traditional methods in cases where the response variable consists of an ordinal scale, thus eliminating the problems of heteroscedasticity, non-linearity, and non-normality of residuals often associated with this type of data.
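
    As a sketch of the underlying machinery, the cumulative-logit likelihood can be coded directly: every cumulative probability P(y <= k | x) shares a single slope, which is precisely the proportional-odds assumption the abstract refers to. The toy genotype/severity data below are invented and do not reproduce the study's F2/F3 design.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        # Toy data: marker genotype x (0/1/2) and ordinal severity y in {0, 1, 2}.
        x = np.array([0, 0, 1, 1, 2, 2, 0, 1, 2, 2], dtype=float)
        y = np.array([0, 1, 1, 2, 2, 2, 0, 0, 1, 2])

        def nll(params):
            a1, d, beta = params
            a = np.array([a1, a1 + np.exp(d)])      # ordered thresholds a1 < a2
            eta = beta * x
            cum = expit(a[None, :] - eta[:, None])  # P(y <= k | x), common slope beta
            p = np.column_stack([cum[:, 0], cum[:, 1] - cum[:, 0], 1.0 - cum[:, 1]])
            return -np.sum(np.log(p[np.arange(len(y)), y]))

        fit = minimize(nll, x0=np.zeros(3))
        print(fit.x)   # two threshold parameters and the common log-odds slope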

  18. GIS-modelling of the spatial variability of flash flood hazard in Abu Dabbab catchment, Red Sea Region, Egypt

    Directory of Open Access Journals (Sweden)

    Islam Abou El-Magd

    2010-06-01

    Full Text Available In the mountainous area of the Red Sea region in southeastern Egypt, the development of new mining activities and/or domestic infrastructure requires reliable and accurate information about natural hazards, particularly flash floods. This paper presents the assessment of flash flood hazards in the Abu Dabbab drainage basin. Remotely sensed data were used to delineate the alluvial active channels, which were integrated with morphometric parameters extracted from digital elevation models (DEM) into geographical information systems (GIS) to construct a hydrological model that provides estimates of the amount of surface runoff as well as the magnitude of flash floods. The peak discharge varies from one cross-section to another along the main channel. Under a consistent 10 mm rainfall event, the selected cross-section in the middle of the main channel is prone to a maximum water depth of 80 cm, which decreases to nearly 30 cm at the outlet due to transmission loss. The estimated spatial variability of flow parameters within the catchment at the different confluences of the constituent sub-catchments can be used in planning engineering foundations and linear infrastructure with the least flash flood hazard. Such information would, indeed, help decision makers and planners minimize such hazards.

  19. Modeling fault rupture hazard for the proposed repository at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Coppersmith, K.J.; Youngs, R.R.

    1992-01-01

    In this paper, as part of the Electric Power Research Institute's High Level Waste program, the authors develop a preliminary probabilistic model for assessing the hazard of fault rupture to the proposed high level waste repository at Yucca Mountain. The model is composed of two parts: the earthquake occurrence model, which describes the three-dimensional geometry of earthquake sources and the earthquake recurrence characteristics for all sources in the site vicinity; and the rupture model, which describes the probability of coseismic fault rupture of various lengths and amounts of displacement within the repository horizon 350 m below the surface. The latter uses empirical data from normal-faulting earthquakes to relate the rupture dimensions and fault displacement amounts to the magnitude of the earthquake. Using a simulation procedure, the authors allow for earthquake occurrence on all of the earthquake sources in the site vicinity, model the location and displacement due to primary faults, and model the occurrence of secondary faulting in conjunction with primary faulting.

  20. Modeling the bathtub shape hazard rate function in terms of reliability

    International Nuclear Information System (INIS)

    Wang, K.S.; Hsu, F.S.; Liu, P.P.

    2002-01-01

    In this paper, a general form of bathtub-shaped hazard rate function is proposed in terms of reliability. The degradation of system reliability comes from different failure mechanisms, in particular those related to (1) random failures, (2) cumulative damage, (3) man-machine interference, and (4) adaptation. The first mechanism refers to unpredictable failures modeled as a Poisson process, i.e. it is represented by a constant. Cumulative damage emphasizes failures owing to strength deterioration, whereby the possibility of the system sustaining the normal operating load decreases with time; it depends on the failure probability, 1-R. This representation captures the memory characteristics of the second failure cause. Man-machine interference may have a positive effect on the failure rate due to learning and correction, or a negative one resulting from inappropriate human habits in system operations; it is suggested that this term is correlated with the reliability, R, as well as with the failure probability. Adaptation concerns the continuous adjustment between mating subsystems: when a new system is put on duty, some hidden defects are exposed and eventually disappear, so the reliability decays with a decreasing failure rate, which is expressed as a power of reliability. Each of these phenomena brings about failures independently and is described by an additive term in the hazard rate function h(R); the overall failure behavior, governed by a number of parameters, is found by fitting the evidence data. The proposed model is meaningful in capturing the physical phenomena occurring during the system lifetime and provides for simpler and more effective parameter fitting than the usually adopted 'bathtub' procedures. Five examples of different types of failure mechanisms are used in the validation of the proposed model. Satisfactory results are found from the comparisons.
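
    One illustrative reading of the four additive terms, with all constants assumed: constant random failures, a (1-R) cumulative-damage term, an R(1-R) man-machine interference term, and a power of R for adaptation. Since h(t) = -R'(t)/R(t), integrating dR/dt = -h(R)R traces the hazard over time; the sketch shows the general bathtub behaviour, not the paper's fitted model.

        import numpy as np

        # Assumed additive hazard rate as a function of reliability R.
        c0, c1, c2, c3, m = 0.01, 0.05, 0.03, 0.20, 8.0
        h = lambda R: c0 + c1 * (1 - R) + c2 * R * (1 - R) + c3 * R**m

        # Forward-Euler integration of dR/dt = -h(R) * R with R(0) = 1.
        dt, T = 0.01, 60.0
        ts = np.arange(0.0, T, dt)
        R = np.empty_like(ts)
        R[0] = 1.0
        for i in range(1, len(ts)):
            R[i] = R[i - 1] - h(R[i - 1]) * R[i - 1] * dt

        # Hazard falls as the adaptation term R**m dies out, then rises again
        # as cumulative damage (1 - R) dominates: a bathtub shape over time.
        print(h(R[0]), h(R[len(ts) // 2]), h(R[-1]))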

  1. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

    DEFF Research Database (Denmark)

    Nielsen, Jan; Parner, Erik

    2010-01-01

    In this paper, we model multivariate time-to-event data by composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied on two types of family studies using the gamma...

  2. Seismic rupture modelling, strong motion prediction and seismic hazard assessment: fundamental and applied approaches

    International Nuclear Information System (INIS)

    Berge-Thierry, C.

    2007-05-01

    The defence to obtain the 'Habilitation a Diriger des Recherches' is a synthesis of the research work performed since the end of my PhD thesis in 1997. This synthesis covers the two years spent as a post-doctoral researcher at the Bureau d'Evaluation des Risques Sismiques at the Institut de Protection (BERSSIN), and the seven consecutive years as seismologist and head of the BERSSIN team. This work and the research project are presented in the framework of the seismic risk topic, and particularly with respect to seismic hazard assessment. Seismic risk combines seismic hazard and vulnerability. Vulnerability combines the strength of building structures and the human and economic consequences in case of structural failure. Seismic hazard is usually defined in terms of plausible seismic motion (soil acceleration or velocity) at a site for a given time period. Whether for the regulatory context or for specific structures (conventional or high-risk constructions), seismic hazard assessment needs: to identify and locate the seismic sources (zones or faults), to characterize their activity, and to evaluate the seismic motion the structure has to resist (including site effects). I specialized in numerical strong-motion prediction using high-frequency seismic source modelling, and being part of the IRSN allowed me to work quickly across the different tasks of seismic hazard assessment. Thanks to this expert practice and to participation in the evolution of regulations (nuclear power plants, conventional and chemical structures), I have also been able to work on empirical strong-motion prediction, including site effects. Specific questions related to the interface between seismologists and structural engineers are also presented, especially the quantification of uncertainties. This is part of the research work initiated to improve the selection of the input ground motion when designing or verifying the stability of structures. (author)

  3. Transportation of Hazardous Materials Emergency Preparedness Hazards Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Blanchard, A.

    2000-02-28

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program.

  4. Transportation of Hazardous Materials Emergency Preparedness Hazards Assessment

    International Nuclear Information System (INIS)

    Blanchard, A.

    2000-01-01

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program

  6. Come rain or shine: Multi-model Projections of Climate Hazards affecting Transportation in the South Central United States

    Science.gov (United States)

    Mullens, E.; Mcpherson, R. A.

    2016-12-01

    This work develops detailed trends in climate hazards affecting the Department of Transportation's Region 6, in the South Central U.S. First, a survey was developed to gather information from the transportation community regarding weather and climate hazards in the region, identifying key phenomena and thresholds to evaluate. Statistically downscaled datasets were obtained from the Multivariate Adaptive Constructed Analogues (MACA) project and the Asynchronous Regional Regression Model (ARRM), for a total of 21 model projections, two coupled model intercomparisons (CMIP3 and CMIP5), and four emissions pathways (A1Fi, B1, RCP8.5, RCP4.5). Specific hazards investigated include winter weather, freeze-thaw cycles, hot and cold extremes, and heavy precipitation. Projections for each of these variables were calculated for the region, utilizing spatial mapping and time series analysis at the climate division level. The results indicate that cold-season phenomena such as winter weather, freeze-thaw, and cold extremes decrease in intensity and frequency, particularly under the higher emissions pathways. Nonetheless, the specific model and downscaling method yield variability in magnitudes, with the most notable decreasing trends late in the 21st century. Hot days show a pronounced increase, particularly with greater emissions, producing annual mean 100°F day frequencies by the late 21st century analogous to the 2011 heatwave over the central Southern Plains. Heavy precipitation, evidenced by return period estimates and counts over thresholds, also shows notable increasing trends, particularly between the recent past and the mid-21st century. Conversely, mean precipitation does not show significant trends and is regionally variable. Precipitation hazards (e.g., winter weather, extremes) diverge between downscaling methods and their associated model samples much more substantially than temperature, suggesting that the choice of global model and downscaled data is particularly

  7. Constant Proportion Debt Obligations (CPDOs)

    DEFF Research Database (Denmark)

    Cont, Rama; Jessen, Cathrine

    2012-01-01

    Constant Proportion Debt Obligations (CPDOs) are structured credit derivatives that generate high coupon payments by dynamically leveraging a position in an underlying portfolio of investment-grade index default swaps. CPDO coupons and principal notes received high initial credit ratings from the major rating agencies, based on complex models for the joint transition of ratings and spreads for all names in the underlying portfolio. We propose a parsimonious model for analysing the performance of CPDO strategies using a top-down approach that captures the essential risk factors of the CPDO. Our analysis shows that the default probability can be made arbitrarily small—and thus the credit rating arbitrarily high—by increasing leverage, but the ratings obtained strongly depend on assumptions on the credit environment (high spread or low spread). More importantly, CPDO loss distributions are found to exhibit a wide range of tail risk measures...

  8. Multi scenario seismic hazard assessment for Egypt

    Science.gov (United States)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-05-01

    Egypt is located in the northeastern corner of Africa within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions, so it is important to evaluate the seismic hazard to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic tree formulation to compute the regional hazard and develop a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods for 100- and 475-year return periods for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the west to the eastern part of the country. Uniform hazard spectra are estimated at several important cities distributed all over Egypt. Deaggregation of the seismic hazard is performed at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.
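
    A minimal sketch of the logic-tree combination step: each branch (a choice of seismotectonic model and attenuation relationship) produces an annual exceedance-rate curve, and the branch weights give the mean hazard curve. All rates and weights below are invented.

        import numpy as np

        pga = np.array([0.05, 0.1, 0.2, 0.4])     # ground-motion levels (g)

        # Annual exceedance rates from three hypothetical logic-tree branches.
        branches = np.array([
            [2e-2, 8e-3, 2e-3, 3e-4],
            [3e-2, 1e-2, 3e-3, 5e-4],
            [1e-2, 5e-3, 1e-3, 1e-4],
        ])
        weights = np.array([0.5, 0.3, 0.2])       # branch weights, summing to 1

        mean_rate = weights @ branches            # weighted mean hazard curve
        p50 = 1.0 - np.exp(-mean_rate * 50.0)     # 50-year exceedance probability
        for g, p in zip(pga, p50):
            print(f"PGA {g:.2f} g: {p:.1%} in 50 years")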

  9. Issues in testing the new national seismic hazard model for Italy

    Science.gov (United States)

    Stein, S.; Peresan, A.; Kossobokov, V. G.; Brooks, E. M.; Spencer, B. D.

    2016-12-01

    It is important to bear in mind that we know little about how well earthquake hazard maps describe the shaking that will actually occur in the future, and that we have no agreed way of assessing how well a map performed in the past and, thus, whether one map performs better than another. Moreover, we should not forget that different maps can be useful for different end users, who may have different cost-and-benefit strategies. Thus, regardless of the specific tests we choose to use, the adopted testing approach should have several key features: We should assess map performance using all the available instrumental, paleoseismological, and historical intensity data. Instrumental data alone span a period much too short to capture the largest earthquakes - and thus the strongest shaking - expected from most faults. We should investigate what causes systematic misfit, if any, between the longest record we have - historical intensity data available for the Italian territory from 217 B.C. to 2002 A.D. - and a given hazard map. We should compare how seismic hazard maps have developed over time. How do the most recent maps for Italy compare to earlier ones? It is important to understand local divergences that show how the models have developed toward the most recent one. The temporal succession of maps is important: we have to learn from previous errors. We should use the many different tests that have been proposed. All are worth trying, because different metrics of performance show different aspects of how a hazard map performs and can be used. We should compare other maps to the ones we are testing. Maps can be made using a wide variety of assumptions, which will lead to different predicted shaking. It is possible that maps derived by other approaches may perform better. Although current Italian codes are based on probabilistic maps, it is important from both a scientific and a societal perspective to look at all options, including deterministic scenario-based ones. Comparing what works

  10. Computer Simulation of Atoms Nuclei Structure Using Information Coefficients of Proportionality

    OpenAIRE

    Labushev, Mikhail M.

    2012-01-01

    The latest research on the proportionality of the atomic weights of chemical elements has made it possible to obtain 3 x 3 matrices for the calculation of information coefficients of proportionality Ip that can be used for 3D modeling of the structure of the atomic nucleus. The results of computer simulation show the high potential of nucleus-structure research for characterizing the chemical and physical properties of atoms.

  11. The median hazard ratio: a useful measure of variance and general contextual effects in multilevel survival analysis.

    Science.gov (United States)

    Austin, Peter C; Wagner, Philippe; Merlo, Juan

    2017-03-15

    Multilevel data occur frequently in many research areas, such as health services research and epidemiology. A suitable way to analyze such data is through the use of multilevel regression models (MLRM). MLRM incorporate cluster-specific random effects which allow one to partition the total individual variance into between-cluster variation and between-individual variation. Statistically, MLRM account for the dependency of the data within clusters and provide correct estimates of uncertainty around regression coefficients. Substantively, the magnitude of the effect of clustering provides a measure of the General Contextual Effect (GCE). When outcomes are binary, the GCE can also be quantified by measures of heterogeneity like the Median Odds Ratio (MOR) calculated from a multilevel logistic regression model. Time-to-event outcomes within a multilevel structure occur commonly in epidemiological and medical research. However, the Median Hazard Ratio (MHR), which corresponds to the MOR in multilevel (i.e., 'frailty') Cox proportional hazards regression, is rarely used. Analogously to the MOR, the MHR is the median relative change in the hazard of the occurrence of the outcome when comparing identical subjects from two randomly selected different clusters that are ordered by risk. We illustrate the application and interpretation of the MHR in a case study analyzing the hazard of mortality in patients hospitalized for acute myocardial infarction at hospitals in Ontario, Canada. We provide R code for computing the MHR. The MHR is a useful and intuitive measure for expressing cluster heterogeneity in the outcome and, thereby, estimating general contextual effects in multilevel survival analysis. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
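
    For a frailty Cox model whose cluster random effects on the log-hazard scale are normal with variance sigma^2, the MHR has the closed form exp(sqrt(2*sigma^2) * z_0.75), directly analogous to the MOR. The paper provides R code; a Python equivalent is sketched below with an assumed variance.

        import math
        from scipy.stats import norm

        sigma2 = 0.25  # assumed between-hospital variance of the random effects
        mhr = math.exp(math.sqrt(2.0 * sigma2) * norm.ppf(0.75))
        print(round(mhr, 2))  # median hazard ratio between two random clusters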

  12. Applying the Land Use Portfolio Model to Estimate Natural-Hazard Loss and Risk - A Hypothetical Demonstration for Ventura County, California

    Science.gov (United States)

    Dinitz, Laura B.

    2008-01-01

    With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS

  13. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    Science.gov (United States)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and have comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and the coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zoning appears strongly site-dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and distant north Algerian sources becomes significant from SA(0.5 s) onwards.

  14. Estimation of Lithofacies Proportions Using Well and Well Test Data

    Directory of Open Access Journals (Sweden)

    Blanc G.

    2006-12-01

    Full Text Available A crucial step in the two geostatistical methods commonly used for modeling heterogeneous reservoirs, sequential indicator simulation and truncated Gaussian simulation, is the estimation of local lithofacies proportions (or probability density functions). Well-test derived permeabilities show good correlation with lithofacies proportions around wells. Integrating well and well-test data when estimating lithofacies proportions could therefore permit the building of more realistic models of reservoir heterogeneity. However, this integration is difficult because of the different natures and measurement scales of these two types of data. This paper presents a two-step approach to integrating well and well-test data into heterogeneous reservoir modeling. First, lithofacies proportions in well-test investigation areas are estimated using a new kriging algorithm called KISCA. KISCA consists in jointly kriging the proportions of all lithofacies in a well-test investigation area so that the corresponding well-test derived permeability is respected through a weighted power averaging of lithofacies permeabilities. For multiple well-tests, an iterative process is used in KISCA to account for their interaction. Afterwards, the estimated proportions are combined with lithofacies indicators at wells for estimating proportion (or probability density) functions over the entire reservoir field using a classical kriging method. Some numerical examples were considered to test the proposed method for estimating lithofacies proportions. In addition, a synthetic lithofacies reservoir model was generated and a well-test simulation was performed. The comparison between the experimental and estimated proportions in the well-test investigation area demonstrates the validity of the proposed method.
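
    The constraint that KISCA enforces can be sketched as a weighted power average: the kriged facies proportions must reproduce the well-test permeability through k_eff = (sum_i p_i * k_i^omega)^(1/omega). The exponent and all values below are illustrative assumptions.

        import numpy as np

        # Facies permeabilities (mD) and local proportions in the well-test
        # investigation area (values invented).
        k = np.array([500.0, 50.0, 5.0])   # e.g. channel sand, levee, shale
        p = np.array([0.45, 0.35, 0.20])   # proportions summing to 1

        omega = 0.6  # power exponent: omega = 1 arithmetic, omega -> 0 geometric
        k_eff = (p @ k**omega) ** (1.0 / omega)
        print(round(k_eff, 1))  # to be matched to the well-test derived permeability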

  15. Modeling hazardous mass flows Geoflows09: Mathematical and computational aspects of modeling hazardous geophysical mass flows; Seattle, Washington, 9–11 March 2009

    Science.gov (United States)

    Iverson, Richard M.; LeVeque, Randall J.

    2009-01-01

    A recent workshop at the University of Washington focused on mathematical and computational aspects of modeling the dynamics of dense, gravity-driven mass movements such as rock avalanches and debris flows. About 30 participants came from seven countries and brought diverse backgrounds in geophysics; geology; physics; applied and computational mathematics; and civil, mechanical, and geotechnical engineering. The workshop was cosponsored by the U.S. Geological Survey Volcano Hazards Program, by the U.S. National Science Foundation through a Vertical Integration of Research and Education (VIGRE) in the Mathematical Sciences grant to the University of Washington, and by the Pacific Institute for the Mathematical Sciences. It began with a day of lectures open to the academic community at large and concluded with 2 days of focused discussions and collaborative work among the participants.

  16. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    Science.gov (United States)

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two

  17. LAV@HAZARD: a web-GIS interface for volcanic hazard assessment

    Directory of Open Access Journals (Sweden)

    Giovanni Gallo

    2011-12-01

    Full Text Available Satellite data, radiative power of hot spots as measured with remote sensing, historical records, on-site geological surveys, digital elevation model data, and simulation results together provide a massive data source to investigate the behavior of active volcanoes like Mount Etna (Sicily, Italy) over recent times. The integration of these heterogeneous data into a coherent visualization framework is important for their practical exploitation. It is crucial to fill the gap between experimental and numerical data and the direct human perception of their meaning. Indeed, the people in charge of safety planning of an area need to be able to quickly assess hazards and other relevant issues even during critical situations. With this in mind, we developed LAV@HAZARD, a web-based geographic information system that provides an interface for the collection of all of the products coming from the LAVA project research activities. LAV@HAZARD is based on the Google Maps application programming interface, a choice motivated by its ease of use and the user-friendly interactive environment it provides. In particular, the web structure consists of four modules: satellite applications (time-space evolution of hot spots, radiant flux and effusion rate), hazard map visualization, a database of ca. 30,000 lava-flow simulations, and real-time scenario forecasting by MAGFLOW on the Compute Unified Device Architecture (CUDA).

  18. Occupational, social, and relationship hazards and psychological distress among low-income workers: implications of the 'inverse hazard law'.

    Science.gov (United States)

    Krieger, Nancy; Kaddour, Afamia; Koenen, Karestan; Kosheleva, Anna; Chen, Jarvis T; Waterman, Pamela D; Barbeau, Elizabeth M

    2011-03-01

    Few studies have simultaneously included exposure information on occupational hazards, relationship hazards (eg, intimate partner violence) and social hazards (eg, poverty and racial discrimination), especially among low-income multiracial/ethnic populations. A cross-sectional study (2003-2004) of 1202 workers employed at 14 worksites in the greater Boston area of Massachusetts investigated the independent and joint association of occupational, social and relationship hazards with psychological distress (K6 scale). Among this low-income cohort (45% were below the US poverty line), exposure to occupational, social and relationship hazards, per the 'inverse hazard law,' was high: 82% exposed to at least one occupational hazard, 79% to at least one social hazard, and 32% of men and 34% of women, respectively, stated they had been the perpetrator or target of intimate partner violence (IPV). Fully 15.4% had clinically significant psychological distress scores (K6 score ≥ 13). All three types of hazards, and also poverty, were independently associated with increased risk of psychological distress. In models including all three hazards, however, significant associations with psychological distress occurred among men and women for workplace abuse and high exposure to racial discrimination only; among men, for IPV; and among women, for high exposure to occupational hazards, poverty and smoking. Reckoning with the joint and embodied reality of diverse types of hazards involving how people live and work is necessary for understanding determinants of health status.

  19. The Principle of Proportionality

    DEFF Research Database (Denmark)

    Bennedsen, Morten; Meisner Nielsen, Kasper

    2005-01-01

    Recent policy initiatives within the harmonization of European company laws have promoted a so-called "principle of proportionality" through proposals that regulate mechanisms opposing a proportional distribution of ownership and control. We scrutinize the foundation for these initiatives in relation to the process of harmonization of the European capital markets. JEL classifications: G30, G32, G34 and G38. Keywords: Ownership Structure, Dual Class Shares, Pyramids, EU company laws.

  20. Development of evaluation method for software hazard identification techniques

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-01-01

    This research evaluated currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis; it then determined evaluation indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With the proposed method, analysts can evaluate various combinations of software hazard identification techniques for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive, due to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for revealing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; however, their disadvantages are completeness, complexity, and implementation cost. This evaluation method can serve as a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results could change; however, insight into the software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)

  1. KSC VAB Aeroacoustic Hazard Assessment

    Science.gov (United States)

    Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.

    2010-01-01

    NASA Kennedy Space Center (KSC) carried out an analysis of the effects of aeroacoustics produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace who modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the source of ignition and plume can be severe; acoustic hazards in the far-field are significantly lower.

  2. Hazards assessment for the Hazardous Waste Storage Facility

    International Nuclear Information System (INIS)

    Knudsen, J.K.; Calley, M.B.

    1994-04-01

    This report documents the hazards assessment for the Hazardous Waste Storage Facility (HWSF) located at the Idaho National Engineering Laboratory. The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding HWSF, the buildings and structures at HWSF, and the processes used at HWSF are described in this report. All nonradiological hazardous materials at the HWSF were identified (radiological hazardous materials are not stored at HWSF) and screened against threshold quantities according to DOE Order 5500.3A guidance. Two of the identified hazardous materials exceeded their specified threshold quantity. This report discusses the potential release scenarios and consequences associated with an accidental release for each of the two identified hazardous materials, lead and mercury. Emergency considerations, such as emergency planning zones, emergency classes, protective actions, and emergency action levels, are also discussed based on the analysis of potential consequences. Evaluation of the potential consequences indicated that the highest emergency class for operational emergencies at the HWSF would be a Site Area Emergency

  3. Measurements and models for hazardous chemical and mixed wastes. 1998 annual progress report

    International Nuclear Information System (INIS)

    Holcomb, C.; Louie, B.; Mullins, M.E.; Outcalt, S.L.; Rogers, T.N.; Watts, L.

    1998-01-01

    'Aqueous waste of various chemical compositions constitutes a significant fraction of the total waste produced by industry in the US. A large quantity of the waste generated by the US chemical process industry is wastewater. In addition, the majority of the waste inventory at DOE sites previously used for nuclear weapons production is aqueous waste. Large quantities of additional aqueous waste are expected to be generated during the clean-up of those sites. In order to effectively treat, safely handle, and properly dispose of these wastes, accurate and comprehensive knowledge of basic thermophysical property information is paramount. This knowledge will lead to huge savings by aiding in the design and optimization of treatment and disposal processes. The main objectives of this project are: (1) develop and validate models that accurately predict the phase equilibria and thermodynamic properties of hazardous aqueous systems necessary for the safe handling and successful design of separation and treatment processes for hazardous chemical and mixed wastes; and (2) accurately measure the phase equilibria and thermodynamic properties of a representative system (water + acetone + isopropyl alcohol + sodium nitrate) over the applicable ranges of temperature, pressure, and composition to provide the pure component, binary, ternary, and quaternary experimental data required for model development. As of May 1998, nine months into the first year of a three-year project, the authors have made significant progress in the database development, have begun testing the models, and have been performance testing the apparatus on the pure components.'

  4. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    Energy Technology Data Exchange (ETDEWEB)

    (NOEMAIL), R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provide increased confidence in this hazard recommendation. The alternate GMAMs are EPRI (2004), USGS (2002) and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997), which were based on the LLNL (1993) and EPRI (1988) PSHAs.
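
    The GMAM weighting described above amounts to a weighted average of the individual hazard curves. A minimal sketch, assuming illustrative annual exceedance frequencies (not SRS results) and using only the published weights of 0.6, 0.3, and 0.1:

      # Weighted combination of hazard curves from three ground motion
      # attenuation models; the exceedance frequencies are made up.
      import numpy as np

      pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])            # g
      curves = {   # annual frequency of exceedance per GMAM (illustrative)
          "EPRI (2004)":  np.array([2e-3, 8e-4, 2e-4, 4e-5, 5e-6]),
          "USGS (2002)":  np.array([3e-3, 1e-3, 3e-4, 6e-5, 8e-6]),
          "Silva (2004)": np.array([1e-3, 5e-4, 1e-4, 2e-5, 3e-6]),
      }
      weights = {"EPRI (2004)": 0.6, "USGS (2002)": 0.3, "Silva (2004)": 0.1}

      combined = sum(weights[k] * curves[k] for k in curves)
      for a, f in zip(pga, combined):
          print(f"PGA {a:4.2f} g : {f:.2e} /yr")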

  5. Landslide hazard assessment along a mountain highway in the Indian Himalayan Region (IHR) using remote sensing and computational models

    Science.gov (United States)

    Krishna, Akhouri P.; Kumar, Santosh

    2013-10-01

    Landslide hazard assessments using computational models, such as artificial neural network (ANN) and frequency ratio (FR) models, were carried out covering one of the important mountain highways in the Central Himalaya of the Indian Himalayan Region (IHR). Landslide influencing factors were either calculated or extracted from spatial databases, including recent remote sensing data of LANDSAT TM, the CARTOSAT digital elevation model (DEM) and the Tropical Rainfall Measuring Mission (TRMM) satellite for rainfall data. ANN was implemented using the multi-layered feed-forward architecture with different input, output and hidden layers. This model, based on the back-propagation algorithm, derived weights for all possible parameters of landslides and the causative factors considered. The training sites for landslide-prone and non-prone areas were identified and verified through details gathered from remote sensing and other sources. FR models are based on observed relationships between the distribution of landslides and each landslide-related factor. FR model implementation proved useful for assessing the spatial relationships between landslide locations and the factors contributing to their occurrence. These computational models generated respective susceptibility maps of landslide hazard for the study area. This further allowed the simulation of landslide hazard maps on a medium scale using a GIS platform and remote sensing data. Upon validation and accuracy checks, it was observed that both models produced good results, with FR having some edge over ANN-based mapping. Such statistical and functional models led to a better understanding of the relationships between the landslides and preparatory factors, as well as ensuring lower levels of subjectivity compared to qualitative approaches.
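
    The frequency ratio used above has a compact definition: for each class of a causative factor, FR is the landslide share of that class divided by its area share, and a susceptibility index sums the FRs over factors. A minimal sketch on synthetic rasters (the data are random placeholders, not the study's):

      # Frequency ratio per factor class on synthetic rasters.
      import numpy as np

      def frequency_ratio(factor_classes, landslide_mask):
          """FR = (landslide share of a class) / (area share of the class)."""
          fr = {}
          total_cells = factor_classes.size
          total_slides = landslide_mask.sum()
          for c in np.unique(factor_classes):
              in_class = factor_classes == c
              slide_share = landslide_mask[in_class].sum() / total_slides
              area_share = in_class.sum() / total_cells
              fr[int(c)] = float(slide_share / area_share)
          return fr

      rng = np.random.default_rng(0)
      slope_class = rng.integers(1, 4, size=(100, 100))    # classes 1..3
      landslides = rng.random((100, 100)) < 0.02           # toy inventory
      print(frequency_ratio(slope_class, landslides))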

  6. Macroeconomic Proportions and Correlations

    Directory of Open Access Journals (Sweden)

    Constantin Mitrut

    2006-04-01

    The work focuses on the main proportions and correlations established between the major macroeconomic indicators. This is the general frame for the analysis of the relations between the Gross Domestic Product growth rate and the unemployment rate; the interaction between the inflation rate and the unemployment rate; and the connection between the GDP growth rate and the inflation rate. Within the analysis, particular attention is paid to "the basic relationship of the economic growth" by emphasizing the possibilities of a factorial analysis of macroeconomic development, mainly as far as the Gross Domestic Product is concerned. At this point, the authors introduce the mathematical relations used for modeling the macroeconomic correlations, hence the strictness of the analysis being performed.

  7. The effects of hazardous working conditions on burnout in Macau nurses

    OpenAIRE

    Sydney X. Hu; Andrew L. Luk; Graeme D. Smith

    2015-01-01

    Objective: To examine the effects of various hazardous factors in working environments on burnout in a cohort of clinical nurses in Macau. Methods: A cross-sectional survey was used to examine specific workplace hazards for burnout in qualified nurses (n = 424) in Macau. Structural equation modeling (SEM) was used to analyze relationships between specific hazards and manifestations of burnout. Results: In the final model, workplace hazards accounted for 73% of the variance of burnout…

  8. Assessing the effect of quantitative and qualitative predictors on gastric cancer individuals survival using hierarchical artificial neural network models.

    Science.gov (United States)

    Amiri, Zohreh; Mohammad, Kazem; Mahmoudi, Mahmood; Parsaeian, Mahbubeh; Zeraati, Hojjat

    2013-01-01

    There are numerous unanswered questions in the application of artificial neural network models for the analysis of survival data. In most studies, independent variables have been studied as qualitative dichotomous variables, and the results of using discrete and continuous quantitative, ordinal, or multinomial categorical predictive variables in these models are not well understood in comparison to conventional models. This study was designed and conducted to examine the application of these models for determining the survival of gastric cancer patients, in comparison to the Cox proportional hazards model. We studied the postoperative survival of 330 gastric cancer patients who underwent surgery at a surgical unit of the Iran Cancer Institute over a five-year period. Covariates of age, gender, history of substance abuse, cancer site, type of pathology, presence of metastasis, stage, and number of complementary treatments were entered into the models, and survival probabilities were calculated at 6, 12, 18, 24, 36, 48, and 60 months using the Cox proportional hazards and neural network models. We estimated the coefficients of the Cox model and the weights in the neural network (with 3, 5, and 7 nodes in the hidden layer) in the training group, and used them to derive predictions in the study group. Predictions with these two methods were compared with those of the Kaplan-Meier product limit estimator as the gold standard. Comparisons were performed with the Friedman and Kruskal-Wallis tests. Survival probabilities at different times were determined using the Cox proportional hazards model and a neural network with three nodes in the hidden layer; the ratios of standard errors with these two methods to the Kaplan-Meier method were 1.1593 and 1.0071, respectively, revealing a significant difference between the Cox and Kaplan-Meier estimates but not between the neural network and the standard (Kaplan-Meier), as well as better accuracy for the neural network (with 3 nodes in the hidden layer).
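
    For readers who want to reproduce the conventional arm of such a comparison, the Cox fit and the survival probabilities at the listed follow-up times can be obtained with the lifelines package. The ten-patient data frame below is fabricated for illustration, and a small ridge penalty is added only to keep the toy fit stable; nothing here is from the study's data.

      # Cox proportional hazards fit and survival predictions with lifelines.
      import pandas as pd
      from lifelines import CoxPHFitter

      df = pd.DataFrame({                      # fabricated example data
          "months":     [6, 12, 18, 24, 36, 48, 60, 30, 9, 54],
          "death":      [1, 1, 0, 1, 0, 1, 1, 0, 1, 0],
          "age":        [55, 62, 48, 70, 66, 59, 45, 73, 68, 51],
          "metastasis": [1, 0, 0, 1, 0, 1, 0, 1, 1, 0],
      })

      cph = CoxPHFitter(penalizer=0.1)         # ridge penalty for stability
      cph.fit(df, duration_col="months", event_col="death")

      # Survival probabilities at the evaluation times used in the study.
      times = [6, 12, 18, 24, 36, 48, 60]
      surv = cph.predict_survival_function(df[["age", "metastasis"]], times=times)
      print(surv.round(3))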

  9. Using Multi-Scenario Tsunami Modelling Results combined with Probabilistic Analyses to provide Hazard Information for the South-WestCoast of Indonesia

    Science.gov (United States)

    Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.

    2009-04-01

    Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent seaquakes and their subsequent tsunamis in December 2004 and July 2006, it is expected that tsunamis are also likely to occur in the near future due to increased tectonic tension leading to abrupt vertical seafloor alterations after a century of relative tectonic silence. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. In terms of a tsunami impact, the hazard assessment is mostly covered by numerical modelling, because the model results normally offer the most precise database for a hazard analysis, as they include spatially distributed data and their influence on the hydraulic dynamics. Generally, a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is mostly chosen by a worst-case approach: the location and magnitude which are likely to occur and which are assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable, because many tsunami sources can cause an impact at the coast and must be considered. Thus a multi-scenario tsunami model approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian early warning system, many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and with different magnitudes along the Sunda Trench.

  10. Development of seismic hazard analysis in Japan

    International Nuclear Information System (INIS)

    Itoh, T.; Ishii, K.; Ishikawa, Y.; Okumura, T.

    1987-01-01

    In recent years, seismic risk assessments of nuclear power plants have been conducted increasingly in various countries, particularly in the United States, to evaluate probabilistically the safety of existing plants under earthquake loading. The first step of a seismic risk assessment is the seismic hazard analysis, in which the relationship between the maximum earthquake ground motions at the plant site and their annual probability of exceedance, i.e. the seismic hazard curve, is estimated. In this paper, seismic hazard curves are evaluated and examined for several different sites in Japan, based on a historical earthquake record model in which seismic sources are modeled as area sources. A new evaluation method is also proposed to compute the response spectra of the earthquake ground motions in connection with estimating the probabilistic structural response. Finally, the numerical result of a probabilistic risk assessment for a base-isolated three-story RC structure, in which the frequency of seismically induced structural failure is evaluated by combining the seismic hazard analysis, is described briefly.
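
    The hazard curve estimated in such analyses follows the standard Cornell-McGuire aggregation over seismic sources, written here in its generic textbook form rather than the paper's notation:

      \lambda(A > a) = \sum_i \nu_i \iint P(A > a \mid m, r)\, f_{M_i}(m)\, f_{R_i}(r)\, dm\, dr

    where \nu_i is the activity rate of source i and f_{M_i}, f_{R_i} are its magnitude and distance densities; under the Poisson assumption the annual probability of exceedance is 1 - e^{-\lambda(A > a)}.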

  11. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. The distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen's seismic hazard are the events from the West Arabian Shield seismic zone.

  12. Hazardous materials management and compliance training

    International Nuclear Information System (INIS)

    Dalton, T.F.

    1991-01-01

    OSHA training for hazardous waste site workers is required by the Superfund Amendments and Reauthorization Act of 1986 (SARA). In December 1986, a series of regulations was promulgated by OSHA on an interim basis calling for the training of workers engaged in hazardous waste operations. Subsequent to these interim regulations, final rules were promulgated, and these final rules on hazardous waste operations and emergency response became effective on March 6, 1990. OSHA has conducted hearings on the accreditation of training programs. OSHA would like to follow the accreditation process under the AHERA regulations for asbestos, in which the model plan for accreditation of asbestos abatement training was included in Section 206 of Title II of the Toxic Substances Control Act (TSCA). OSHA proposed on January 26, 1990, to perform the accreditation of training programs for hazardous waste operations, and that proposal suggested following a model plan similar to the one used for AHERA. OSHA did not propose to accredit training programs for workers engaged in emergency response. These new regulations pose a significant problem to the various contractors and emergency responders who deal with hazardous materials spill response, cleanup and site remediation, since these programs have expanded so quickly that many people are not familiar with which particular segment of the training they are required to have and whether or not programs that have yet to be accredited are satisfactory for this type of training. Title III of SARA stipulates a training program for first responders, which includes local emergency response organizations such as firemen and policemen. The purpose of this paper is to discuss the needs of workers at hazardous waste site remediation projects and of workers who are dealing with hazardous substances, spill response and cleanup.

  13. Evaluating Middle Years Students' Proportional Reasoning

    Science.gov (United States)

    Hilton, Annette; Dole, Shelley; Hilton, Geoff; Goos, Merrilyn; O'Brien, Mia

    2012-01-01

    Proportional reasoning is a key aspect of numeracy that is not always developed naturally by students. Understanding the types of proportional reasoning that students apply to different problem types is a useful first step to identifying ways to support teachers and students to develop proportional reasoning in the classroom. This paper describes…

  14. Identification of natural hazards and classification of urban areas by TOPSIS model (case study: Bandar Abbas city, Iran)

    Directory of Open Access Journals (Sweden)

    Rasool Mahdavi Najafabadi

    2016-01-01

    In this paper, among multi-criteria models for complex decision-making and multiple-attribute models for assigning the most preferable choice, the technique for order preference by similarity to ideal solution (TOPSIS) is applied. The main objective of this research is to identify potential natural hazards in Bandar Abbas city, Iran, using the TOPSIS model, which is based on an analytical hierarchy process structure. A set of 12 relevant geomorphologic parameters, including earthquake frequency, distance from the earthquake epicentre, number of faults, flood, talus creep, landslide, land subsidence, tide, hurricane and tidal wave, dust storms with an external source, wind erosion and sea level fluctuations, is considered to quantify the inputs of the model. The outputs of this study indicate that one region, among the three assessed regions, has the maximum potential occurrence of natural hazards, while it has been urbanized at a greater rate compared to the other regions. Furthermore, based on the Delphi method, the earthquake frequency and the landslide are the most and the least dangerous phenomena, respectively.
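
    TOPSIS itself is short enough to sketch: normalize the decision matrix, weight it, and score each alternative by its relative closeness to the ideal solution. The three-region matrix and criterion weights below are placeholders, not the study's data; with hazard-increasing criteria, a higher closeness score means a more hazardous region.

      # TOPSIS ranking of alternatives (rows) against criteria (columns).
      import numpy as np

      def topsis(matrix, weights, benefit):
          """Vector-normalize, weight, and score by closeness to the ideal."""
          m = matrix / np.linalg.norm(matrix, axis=0)    # vector normalization
          v = m * weights                                # weighted matrix
          ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
          anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
          d_pos = np.linalg.norm(v - ideal, axis=1)      # distance to ideal
          d_neg = np.linalg.norm(v - anti, axis=1)       # distance to anti-ideal
          return d_neg / (d_pos + d_neg)                 # closeness in [0, 1]

      # Three regions scored on three illustrative hazard criteria.
      x = np.array([[7.0, 0.3, 12.0],
                    [4.0, 0.8, 30.0],
                    [6.0, 0.5,  8.0]])
      w = np.array([0.5, 0.3, 0.2])                      # assumed weights
      print(topsis(x, w, benefit=np.array([True, True, True])))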

  15. Implications of different digital elevation models and preprocessing techniques to delineate debris flow inundation hazard zones in El Salvador

    Science.gov (United States)

    Anderson, E. R.; Griffin, R.; Irwin, D.

    2013-12-01

    Heavy rains and steep, volcanic slopes in El Salvador cause numerous landslides every year, posing a persistent threat to the population, economy and environment. Although potential debris inundation hazard zones have been delineated using digital elevation models (DEMs), some disparities exist between the simulated zones and the actual affected areas. Moreover, these hazard zones have only been identified for volcanic lahars and not for the shallow landslides that occur nearly every year, despite the availability of tools to delineate a variety of landslide types (e.g., the USGS-developed LAHARZ software). Limitations in DEM spatial resolution, the age of the data, and hydrological preprocessing techniques can contribute to inaccurate hazard zone definitions. This study investigates the impacts of using different elevation models and pit-filling techniques on the final debris hazard zone delineations, in an effort to determine which combination of methods most closely agrees with observed landslide events. In particular, a national DEM digitized from topographic sheets from the 1970s and 1980s provides an elevation product at a 10-meter resolution. Both natural and anthropogenic modifications of the terrain limit the accuracy of current landslide hazard assessments derived from this source. Global products from the Shuttle Radar Topography Mission (SRTM) and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global DEM (ASTER GDEM) offer more recent data but at the cost of spatial resolution. New data derived from the NASA Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) in 2013 provide the opportunity to update hazard zones at a higher spatial resolution (approximately 6 meters). Hydrological filling of sinks or pits for current hazard zone simulation has previously been achieved through ArcInfo Spatial Analyst. Such hydrological processing typically only fills pits and can lead to drastic modifications of the original elevation values.

  16. High resolution global flood hazard map from physically-based hydrologic and hydraulic models.

    Science.gov (United States)

    Begnudelli, L.; Kaheil, Y.; McCollum, J.

    2017-12-01

    The global flood map published online at http://www.fmglobal.com/research-and-resources/global-flood-map at 90 m resolution is being used worldwide to understand flood risk exposure, exercise certain measures of mitigation, and/or transfer the residual risk financially through flood insurance programs. The modeling system is based on a physically-based hydrologic model to simulate river discharges and a 2D shallow-water hydrodynamic model to simulate inundation. The model can be applied to large-scale flood hazard mapping thanks to several solutions that maximize its efficiency and the use of parallel computing. The hydrologic component of the modeling system is the Hillslope River Routing (HRR) hydrologic model. HRR simulates hydrological processes using a Green-Ampt parameterization, and is calibrated against observed discharge data from several publicly-available datasets. For inundation mapping, we use a 2D finite-volume shallow-water model with wetting/drying. We introduce here a grid Up-Scaling Technique (UST) for hydraulic modeling to perform simulations at higher resolution at global scale with relatively short computational times. A 30 m SRTM is now available worldwide, along with higher-accuracy and/or higher-resolution local Digital Elevation Models (DEMs) in many countries and regions. UST consists of aggregating computational cells, thus forming a coarser grid, while retaining the topographic information from the original full-resolution mesh. The full-resolution topography is used for building relationships between volume and free surface elevation inside cells and for computing inter-cell fluxes. This approach almost achieves the computational speed typical of coarse grids while preserving, to a significant extent, the accuracy offered by the much higher-resolution DEM. The simulations are carried out along each river of the network by forcing the hydraulic model with the streamflow hydrographs generated by HRR.

  17. Trimming a hazard logic tree with a new model-order-reduction technique

    Science.gov (United States)

    Porter, Keith; Field, Edward; Milner, Kevin R

    2017-01-01

    The size of the logic tree within the Uniform California Earthquake Rupture Forecast Version 3, Time-Dependent (UCERF3-TD) model can challenge risk analyses of large portfolios. An insurer or catastrophe risk modeler concerned with losses to a California portfolio might have to evaluate a portfolio 57,600 times to estimate risk in light of the hazard possibility space. Which branches of the logic tree matter most, and which can one ignore? We employed two model-order-reduction techniques to simplify the model. We sought a subset of parameters that must vary, and the specific fixed values for the remaining parameters, to produce approximately the same loss distribution as the original model. The techniques are (1) a tornado-diagram approach we employed previously for UCERF2, and (2) an apparently novel probabilistic sensitivity approach that seems better suited to functions of nominal random variables. The new approach produces a reduced-order model with only 60 of the original 57,600 leaves. One can use the results to reduce computational effort in loss analyses by orders of magnitude.
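
    The tornado-diagram half of the approach can be sketched independently of UCERF3: hold all branch choices at a baseline, swing each one across its range, and rank parameters by the induced swing in the output. The loss_model function, parameter names, and ranges below are stand-ins for illustration, not the UCERF3-TD logic tree.

      # Tornado-style sensitivity: one-at-a-time parameter swings.
      def loss_model(params):
          # Toy stand-in for a portfolio loss metric.
          return (params["slip_rate"] * 2.0
                  + params["mag_scaling"] * 0.5
                  + params["gmpe"] * 1.0)

      baseline = {"slip_rate": 1.0, "mag_scaling": 1.0, "gmpe": 1.0}
      ranges = {"slip_rate": (0.8, 1.2),
                "mag_scaling": (0.5, 1.5),
                "gmpe": (0.7, 1.4)}

      swings = {}
      for name, (lo, hi) in ranges.items():
          outs = [loss_model(dict(baseline, **{name: v})) for v in (lo, hi)]
          swings[name] = max(outs) - min(outs)

      # Parameters with small swings are candidates for fixing at baseline.
      for name, s in sorted(swings.items(), key=lambda kv: -kv[1]):
          print(f"{name:12s} swing = {s:.2f}")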

  18. Small area estimation of proportions with different levels of auxiliary data.

    Science.gov (United States)

    Chandra, Hukum; Kumar, Sushil; Aditya, Kaustav

    2018-03-01

    Binary data are often of interest in many small area applications. The use of standard small area estimation methods based on linear mixed models becomes problematic for such data. An empirical plug-in predictor (EPP) under a unit-level generalized linear mixed model with logit link function is often used for the estimation of a small area proportion. However, this EPP requires the availability of unit-level population information for auxiliary data that may not always be accessible. As a consequence, in many practical situations this EPP approach cannot be applied. Based on the level of auxiliary information available, different small area predictors for the estimation of proportions are proposed. Analytic and bootstrap approaches to estimating the mean squared error of the proposed small area predictors are also developed. Monte Carlo simulations based on both simulated and real data show that the proposed small area predictors work well for generating small area estimates of proportions and represent a practical alternative to the above approach. The developed predictor is applied to generate estimates of the proportions of indebted farm households at district level using debt investment survey data from India.
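
    The EPP referred to above has a compact generic form under the unit-level logistic mixed model (standard notation, not the authors'): with \operatorname{logit}(p_{ij}) = x_{ij}^\top \beta + u_i and u_i \sim N(0, \sigma_u^2), the plug-in predictor of the proportion in area i is

      \hat{p}_i = \frac{1}{N_i} \sum_{j=1}^{N_i} \operatorname{expit}(x_{ij}^\top \hat{\beta} + \hat{u}_i)

    which makes explicit why unit-level auxiliary values x_{ij} are needed for every population unit; the predictors proposed in the paper relax exactly this data requirement.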

  19. A Monte Carlo study of time-aggregation in continuous-time and discrete-time parametric hazard models.

    NARCIS (Netherlands)

    Hofstede, ter F.; Wedel, M.

    1998-01-01

    This study investigates the effects of time aggregation in discrete and continuous-time hazard models. A Monte Carlo study is conducted in which data are generated according to various continuous and discrete-time processes, and aggregated into daily, weekly and monthly intervals.

  20. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Science.gov (United States)

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. Other than microbial hazards, the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points: the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step; the critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm, which is high enough to control Clostridium botulinum but does not pose a chemical hazard to the consumer. The physical hazard from metal clips can be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham can be reduced in the fermentation process; the critical limit of the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.

  1. Optical fusions and proportional syntheses

    Science.gov (United States)

    Albert-Vanel, Michel

    2002-06-01

    A tragic error is being made in the literature on color when dealing with optical fusions: they are still considered to be additive in nature, whereas experience shows somewhat different results. The goal of this presentation is to show that fusions are, in fact, 'proportional' in nature, tending to be additive or subtractive depending on each individual case. Pointillist paintings done in the manner of Seurat, or the spinning-disc experiment, can highlight this intermediate proportional sector. Let us therefore examine more closely what occurs in fact, by reviewing additive, subtractive and proportional syntheses.

  2. Processing LiDAR Data to Predict Natural Hazards

    Science.gov (United States)

    Fairweather, Ian; Crabtree, Robert; Hager, Stacey

    2008-01-01

    ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.

  3. Job Hazards Analysis Among A Group Of Surgeons At Zagazig ...

    African Journals Online (AJOL)

    ... 75% respectively. Conclusion: The job hazards analysis model was effective in the assessment, evaluation and management of occupational hazards concerning surgeons and should be considered as part of a hospital-wide quality and safety program. Key Words: Job Hazard Analysis, Risk Management, Occupational Health Safety.

  4. A hydro-sedimentary modeling system for flash flood propagation and hazard estimation under different agricultural practices

    Science.gov (United States)

    Kourgialas, N. N.; Karatzas, G. P.

    2014-03-01

    A modeling system for the estimation of flash flood flow velocity and sediment transport is developed in this study. The system comprises three components: (a) a modeling framework based on the hydrological model HSPF, (b) the hydrodynamic module of the hydraulic model MIKE 11 (quasi-2-D), and (c) the advection-dispersion module of MIKE 11 as a sediment transport model. An important parameter in hydraulic modeling is the Manning's coefficient, an indicator of the channel resistance which is directly dependent on riparian vegetation changes. Riparian vegetation's effect on flood propagation parameters such as water depth (inundation), discharge, flow velocity, and sediment transport load is investigated in this study. Based on the obtained results, when the weed-cutting percentage is increased, the flood wave depth decreases while flow discharge, velocity and sediment transport load increase. The proposed modeling system is used to evaluate and illustrate the flood hazard for different riparian vegetation cutting scenarios. For the estimation of flood hazard, a combination of the flood propagation characteristics of water depth, flow velocity and sediment load was used. Next, a well-balanced selection of the most appropriate agricultural cutting practices of riparian vegetation was performed. Ultimately, the model results obtained for different agricultural cutting practice scenarios can be employed to create flood protection measures for flood-prone areas. The proposed methodology was applied to the downstream part of a small Mediterranean river basin in Crete, Greece.
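
    The link drawn above between riparian vegetation and channel resistance runs through Manning's formula, which gives the cross-section mean velocity in SI units as

      V = \frac{1}{n} R_h^{2/3} S^{1/2}

    where n is Manning's roughness coefficient, R_h the hydraulic radius, and S the energy slope. Cutting riparian vegetation lowers n and therefore raises V at a given depth, consistent with the scenario behaviour reported in the abstract (this is the standard open-channel relation, not a formula quoted from the paper).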

  5. SHC, Seismic Hazard Assessment for Eastern US

    International Nuclear Information System (INIS)

    Savy, J.; Davis, B.

    2001-01-01

    1 - Description of program or function: SHC was developed as part of the Eastern United States (EUS) Seismic Hazard Characterization (SHC) Project to design an SHC methodology for the region east of the Rocky Mountains in a form suitable for probabilistic risk assessment, and to apply that methodology to 69 site locations, some of them with local soil conditions. The method developed uses expert opinions to obtain the input to the analysis. SHC contains four modules which calculate the seismic hazard at a site located in a region of diffuse seismicity, where the seismicity is modeled by area sources. SHC integrates the opinions of 11 seismicity and five ground-motion experts. The PRDS module generates the discrete probability density function of the distances to the site for the various seismic source zones. These probability distributions are used by the COMAP module to generate the set of all alternative maps and the discrete probability density of the seismic zonation maps for each expert. The third module, ALEAS, uses these maps and their weights to calculate the best-estimate and constant-percentile hazard distribution resulting from the choice of a given seismicity expert for all ground-motion experts. This module can be used alone to perform a seismic hazard analysis as well as in conjunction with the other modules. The fourth module, COMB, combines the best-estimate and constant-percentile hazard over all seismicity experts, using the set of weights calculated by ALEAS, to produce the final probability distribution of the hazard for the site under consideration, so that the hazard analysis can be performed for any location in the EUS. Local geological site characteristics are incorporated in a generic fashion, and the data are developed in a generic manner. 2 - Method of solution: SHC uses a seismic-source approach utilizing statistical and geological evidence to define geographical regions with homogeneous Poisson activity throughout each zone.

  6. Evaluating a multi-criteria model for hazard assessment in urban design. The Porto Marghera case study

    International Nuclear Information System (INIS)

    Luria, Paolo; Aspinall, Peter A.

    2003-01-01

    The aim of this paper is to describe a new approach to major industrial hazard assessment, which has been recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, the quantitative risk analysis (QRA), only provided a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach, the Analytic Hierarchy Process, which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, of the present situation and of three other future scenarios, and the use of this information as indirect quantitative measures, which could be aggregated to obtain the global risk rate. This approach is in line with the main concepts proposed by the latest European directive on major hazard accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
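
    The Analytic Hierarchy Process step can be illustrated compactly: priorities are the principal eigenvector of a pairwise-comparison matrix, checked with Saaty's consistency ratio. The 3x3 judgment matrix below is a made-up example, not the Porto Marghera panel's judgments.

      # AHP priorities from a pairwise-comparison matrix, with consistency check.
      import numpy as np

      A = np.array([[1.0, 3.0, 5.0],     # illustrative pairwise judgments
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)        # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                       # priority vector

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)   # consistency index
      cr = ci / 0.58                         # Saaty's random index for n = 3
      print("priorities:", w.round(3), " consistency ratio:", round(cr, 3))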

  7. Seismic hazard map of North and Central America and the Caribbean

    Directory of Open Access Journals (Sweden)

    K. M. Shedlock

    1999-06-01

    Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of North and Central America and the Caribbean is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful regional seismic hazard framework and serve as a resource for any national or regional agency to conduct further detailed studies applicable to their needs. This seismic hazard map depicts peak ground acceleration (PGA) with a 10% chance of exceedance in 50 years. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter, because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of North and Central America and the Caribbean depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one- to two-story buildings). The highest seismic hazard values in the region generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes.

  8. Preliminary deformation model for National Seismic Hazard map of Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Meilano, Irwan; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z. [Geodesy Research Division, Faculty of Earth Science and Technology, Institute of Technology Bandung (Indonesia); Susilo,; Efendi, Joni [Agency for Geospatial Information (BIG) (Indonesia)

    2015-04-24

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed from block rotation and strain accumulation functions in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth or in combination with creep in a shallower part. This research shows that rigid body motion dominates the deformation pattern, with magnitudes of more than 15 mm/year, except in narrow areas near subduction zones and active faults, where significant deformation reaches 25 mm/year.

  9. Ensemble of ground subsidence hazard maps using fuzzy logic

    Science.gov (United States)

    Park, Inhye; Lee, Jiyeong; Saro, Lee

    2014-06-01

    Hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok, Korea, were constructed using fuzzy ensemble techniques and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, groundwater, and ground subsidence maps. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 70/30 for training and validation of the models. The relationships between the detected ground-subsidence area and the factors were identified and quantified by frequency ratio (FR), logistic regression (LR) and artificial neural network (ANN) models. The relationships were used as factor ratings in the overlay analysis to create ground-subsidence hazard indexes and maps. The three GSH maps were then used as new input factors and integrated using fuzzy-ensemble methods to make better hazard maps. All of the hazard maps were validated by comparison with known subsidence areas that were not used directly in the analysis. As a result, the ensemble model was found to be more effective in terms of prediction accuracy than the individual models.
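
    A common fuzzy-ensemble operator for integrating such hazard-index maps is the fuzzy gamma operation, a compromise between the fuzzy algebraic sum and the fuzzy algebraic product. The sketch below combines three random stand-in rasters; the gamma value and the data are illustrative, and the abstract does not state which fuzzy operator the study actually used.

      # Fuzzy gamma overlay of three hazard-index maps rescaled to [0, 1].
      import numpy as np

      rng = np.random.default_rng(1)
      fr_map, lr_map, ann_map = (rng.random((50, 50)) for _ in range(3))

      def fuzzy_gamma(layers, gamma=0.9):
          stack = np.stack(layers)
          f_sum = 1.0 - np.prod(1.0 - stack, axis=0)   # fuzzy algebraic sum
          f_prod = np.prod(stack, axis=0)              # fuzzy algebraic product
          return f_sum**gamma * f_prod**(1.0 - gamma)

      ensemble = fuzzy_gamma([fr_map, lr_map, ann_map])
      print(round(float(ensemble.mean()), 3), round(float(ensemble.max()), 3))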

  10. Issues of tsunami hazard maps revealed by the 2011 Tohoku tsunami

    Science.gov (United States)

    Sugimoto, M.

    2013-12-01

    The 2011 Tohoku tsunami placed on tsunami scientists in Japan the responsibility for selecting people's tsunami evacuation places. Many adults died outside the tsunami hazard zones shown on tsunami hazard maps, while in Kamaishi city students achieved a near-miraculous evacuation on their own judgment. The tsunami hazard maps were based on numerical models assuming magnitudes smaller than the actual magnitude 9. How can we bridge the gap between hazard maps and future disasters? We have to discuss whether tsunami numerical models are good enough to contribute to tsunami hazard maps, and how hazard maps should be improved. Tsunami hazard maps should be revised to include the possibility of uplift or subsidence after earthquakes, as well as social information; the ground sank 1.14 m below sea level in Ayukawa town, Tohoku. Research by the Ministry of Land, Infrastructure, Transport and Tourism shows that only around 10% of people in Japan know about tsunami hazard maps, although people do know their evacuation places (buildings) through drills experienced once a year. We need a wider spread of tsunami hazard information together with the contingency of science (see the disaster handbook material's URL at the bottom). A California Emergency Management Agency (CEMA) team practically shows one good practice and solution. I followed their field trip on Catalina Island, California, in September 2011. The team members are multidisciplinary specialists: a geologist, a GIS specialist, oceanographers from USC (tsunami numerical modelers) and a private company, a local policeman, a disaster manager, a local authority, and so on. They check the field based on their own specialties and conduct on-the-spot inspections of ambiguous locations where the tsunami numerical model and the real field conditions disagree, since the data always become older. They pay attention not only to topographical conditions but also to social conditions: vulnerable people, elementary schools, and so on.

  11. On-board adaptive model for state of charge estimation of lithium-ion batteries based on Kalman filter with proportional integral-based error adjustment

    Science.gov (United States)

    Wei, Jingwen; Dong, Guangzhong; Chen, Zonghai

    2017-10-01

    With the rapid development of battery-powered electric vehicles, the lithium-ion battery plays a critical role in the reliability of the vehicle system. In order to provide timely management and protection for battery systems, it is necessary to develop a reliable battery model and accurate battery parameter estimation to describe battery dynamic behaviors. Therefore, this paper focuses on an on-board adaptive model for state-of-charge (SOC) estimation of lithium-ion batteries. Firstly, a first-order equivalent circuit battery model is employed to describe the battery's dynamic characteristics. Secondly, the recursive least squares algorithm and an off-line identification method are used to provide good initial values of the model parameters, to ensure filter stability and reduce the convergence time. Thirdly, an extended Kalman filter (EKF) is applied to estimate the battery SOC and model parameters on-line. Considering that the EKF is essentially a first-order Taylor approximation of the battery model, which contains inevitable model errors, a proportional integral-based error adjustment technique is employed to improve the performance of the EKF method and correct the model parameters. Finally, experimental results on lithium-ion batteries indicate that the proposed EKF with proportional integral-based error adjustment can provide a robust and accurate battery model and on-line parameter estimation.
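
    A scalar caricature of such an estimator conveys the idea: coulomb-counting prediction, an EKF voltage update, and an additional proportional-integral term driven by the innovation. Everything numeric below (the OCV curve, capacity, noise levels, and PI gains) is assumed for the sketch and is not from the paper; this is a loose sketch of the idea, not the paper's implementation.

      # Scalar EKF for SOC with an added proportional-integral innovation term.
      import numpy as np

      CAP = 2.0 * 3600.0      # assumed cell capacity, coulombs (2 Ah)
      R0 = 0.05               # assumed ohmic resistance, ohm
      DT = 1.0                # time step, s

      def ocv(soc):           # assumed linear open-circuit-voltage curve
          return 3.2 + 0.9 * soc

      rng = np.random.default_rng(0)
      soc_true, soc_hat = 0.8, 0.9          # estimator starts with a bias
      P, Qn, Rn = 1e-2, 1e-7, 1e-3          # variance, process/measurement noise
      kp, ki, acc = 0.05, 0.005, 0.0        # PI gains and innovation integral

      for _ in range(600):
          i = 1.0                           # constant 1 A discharge
          soc_true -= i * DT / CAP
          v_meas = ocv(soc_true) - R0 * i + rng.normal(0.0, 0.002)

          soc_hat -= i * DT / CAP           # EKF predict (coulomb counting)
          P += Qn
          H = 0.9                           # d(OCV)/d(SOC) for the linear curve
          K = P * H / (H * P * H + Rn)      # Kalman gain
          e = v_meas - (ocv(soc_hat) - R0 * i)   # voltage innovation
          acc += e * DT
          soc_hat += K * e + kp * e + ki * acc   # Kalman update + PI adjustment
          P *= 1.0 - K * H

      print(f"true SOC {soc_true:.3f}, estimated SOC {soc_hat:.3f}")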

  12. Proportioning of U3O8 powder

    International Nuclear Information System (INIS)

    Cermak, V.; Markvart, M.; Novy, P.; Vanka, M.

    1989-01-01

    The tests are briefly described for proportioning U3O8 powder of a granulometric grain size range of 0-160 μm using a vertical screw, a horizontal dual screw and a vibration dispenser, with a view to proportioning the very fine U3O8 powder fractions produced in the oxidation of UO2 fuel pellets. In the tests, the evenness of proportioning was assessed by the percentage value of the proportioning rate spread measured at one-minute intervals at a proportioning rate of 1-3 kg/h. In feeding the U3O8 into a flame fluorator, it is advantageous to monitor the continuity of the powder column being proportioned and to assess it radiometrically by the value of the proportioning rate spread at very short intervals (0.1 s). (author). 10 figs., 1 tab., 12 refs

  13. Proportion and clinical features of never-smokers with non-small cell lung cancer.

    Science.gov (United States)

    Cho, Jaeyoung; Choi, Sun Mi; Lee, Jinwoo; Lee, Chang-Hoon; Lee, Sang-Min; Kim, Dong-Wan; Yim, Jae-Joon; Kim, Young Tae; Yoo, Chul-Gyu; Kim, Young Whan; Han, Sung Koo; Park, Young Sik

    2017-02-08

    The proportion of never-smokers with non-small cell lung cancer (NSCLC) is increasing, but that in Korea has not been well addressed in a large population. We aimed to evaluate the proportion and clinical features of never-smokers with NSCLC in a large single institution. We analyzed clinical data of 1860 consecutive patients who were newly diagnosed with NSCLC between June 2011 and December 2014. Of the 1860 NSCLC patients, 707 (38.0%) were never-smokers. The proportions of women (83.7% vs. 5.6%) and adenocarcinoma (89.8% vs. 44.9%) were higher among never-smokers than among ever-smokers. Never-smokers were diagnosed at a significantly younger median age than ever-smokers (65 vs. 68 years). Epidermal growth factor receptor mutations were significantly more frequent in never-smokers (57.8% vs. 24.4%), whereas Kirsten rat sarcoma viral oncogene homolog mutations (5.8% vs. 9.6%, P = 0.021) were less frequently encountered in never-smokers than in ever-smokers. Never-smokers showed longer survival after adjusting for the favorable effects of younger age, female sex, adenocarcinoma histology, better performance status, early stage disease, being asymptomatic at diagnosis, received antitumor treatment, and the presence of driver mutations (hazard ratio, 0.624; 95% confidence interval, 0.460-0.848; P = 0.003). More than one-third of the Korean patients with NSCLC were never-smokers. NSCLC in never-smokers had different clinical characteristics and major driver mutations, and resulted in longer overall survival compared with NSCLC in ever-smokers.

  14. Modeling and Prediction of Wildfire Hazard in Southern California, Integration of Models with Imaging Spectrometry

    Science.gov (United States)

    Roberts, Dar A.; Church, Richard; Ustin, Susan L.; Brass, James A. (Technical Monitor)

    2001-01-01

    Large urban wildfires throughout southern California have caused billions of dollars of damage and significant loss of life over the last few decades. Rapid urban growth along the wildland interface, high fuel loads and a potential increase in the frequency of large fires due to climatic change suggest that the problem will worsen in the future. Improved fire spread prediction and reduced uncertainty in assessing fire hazard would be significant, both economically and socially. Current problems in the modeling of fire spread include the role of plant community differences, spatial heterogeneity in fuels, and spatio-temporal changes in fuels. In this research, we evaluated the potential of Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Airborne Synthetic Aperture Radar (AIRSAR) data for providing improved maps of wildfire fuel properties. The analysis concentrated on two areas of southern California, the Santa Monica Mountains and the Santa Barbara Front Range. Wildfire fuel information can be divided into four basic categories: fuel type, fuel load (live green and woody biomass), fuel moisture, and fuel condition (live vs. senesced fuels). To map fuel type, AVIRIS data were used to map vegetation species using Multiple Endmember Spectral Mixture Analysis (MESMA) and binary decision trees. Live green biomass and canopy moisture were mapped using AVIRIS through analysis of the 980 nm liquid water absorption feature and compared to alternate measures of moisture and field measurements. Woody biomass was mapped using L- and P-band cross-polarimetric data acquired in 1998 and 1999. Fuel condition was mapped using spectral mixture analysis to map green vegetation (green leaves), nonphotosynthetic vegetation (NPV; stems, wood and litter), shade, and soil. Summaries describing the potential of hyperspectral and SAR data for fuel mapping are provided by Roberts et al. and Dennison et al.

  15. Macroeconomic Proportions and Correlations

    Directory of Open Access Journals (Sweden)

    Constantin Anghelache

    2006-02-01

    The work focuses on the main proportions and correlations established between the major macroeconomic indicators. This is the general frame for the analysis of the relations between the Gross Domestic Product growth rate and the unemployment rate; the interaction between the inflation rate and the unemployment rate; and the connection between the GDP growth rate and the inflation rate. Within the analysis, particular attention is paid to "the basic relationship of the economic growth" by emphasizing the possibilities of a factorial analysis of macroeconomic development, mainly as far as the Gross Domestic Product is concerned. At this point, the authors introduce the mathematical relations used for modeling the macroeconomic correlations, hence the strictness of the analysis being performed.

  16. Proportional Symbol Mapping in R

    Directory of Open Access Journals (Sweden)

    Susumu Tanimura

    2006-01-01

    Visualization of spatial data on a map aids not only in data exploration but also in communication, imparting spatial conceptions or ideas to others. Although recent cartographic functions in R are rapidly becoming richer, proportional symbol mapping, which is one of the common mapping approaches, has not been packaged thus far. Based on the theories of proportional symbol mapping developed in cartography, the authors developed functions for proportional symbol mapping using R, including mathematical and perceptual scaling. An example of these functions demonstrated the new expressive power and options available in R, particularly for the visualization of conceptual point data.
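
    The two scalings named in the abstract differ only in an exponent: mathematical scaling keeps the symbol area proportional to the data value (radius proportional to value^0.5), while perceptual (Flannery) scaling uses an exponent of roughly 0.5716 to compensate for viewers' under-estimation of large circles. A short matplotlib sketch of the two rules (in Python rather than R, and with made-up values):

      # Mathematical vs. perceptual (Flannery) proportional symbol scaling.
      import numpy as np
      import matplotlib.pyplot as plt

      values = np.array([10.0, 50.0, 100.0, 400.0, 900.0])
      x = np.arange(values.size)
      r_max = 30.0                                         # largest radius, points

      r_math = r_max * (values / values.max()) ** 0.5      # area-true scaling
      r_perc = r_max * (values / values.max()) ** 0.5716   # Flannery scaling

      fig, ax = plt.subplots()
      ax.scatter(x, np.zeros_like(x), s=r_math**2, alpha=0.4, label="mathematical")
      ax.scatter(x, np.ones_like(x), s=r_perc**2, alpha=0.4, label="perceptual")
      ax.set_yticks([0, 1])
      ax.set_yticklabels(["area-true", "Flannery"])
      ax.set_ylim(-1, 2)
      ax.legend(loc="upper left")
      plt.show()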

  17. A balanced hazard ratio for risk group evaluation from survival data.

    Science.gov (United States)

    Branders, Samuel; Dupont, Pierre

    2015-07-30

    Common clinical studies assess the quality of prognostic factors, such as gene expression signatures, clinical variables or environmental factors, and cluster patients into various risk groups. Typical examples include cancer clinical trials where patients are clustered into high- or low-risk groups. Whenever applied to survival data analysis, such groups are intended to represent patients with similar survival odds and to select the most appropriate therapy accordingly. The relevance of such risk groups, and of the related prognostic factors, is typically assessed through the computation of a hazard ratio. We first stress three limitations of assessing risk groups through the hazard ratio: (1) it may promote the definition of arbitrarily unbalanced risk groups; (2) an apparently optimal group hazard ratio can be largely inconsistent with the p-value commonly associated with it; and (3) some marginal changes between risk group proportions may lead to highly different hazard ratio values. Those issues could lead to inappropriate comparisons between various prognostic factors. Next, we propose the balanced hazard ratio to solve those issues. This new performance metric keeps an intuitive interpretation and is as simple to compute. We also show how the balanced hazard ratio leads to a natural cut-off choice to define risk groups from continuous risk scores. The proposed methodology is validated through controlled experiments for which a prescribed cut-off value is defined by design. Further results are also reported on several cancer prognosis studies, and the proposed methodology could be applied more generally to assess the quality of any prognostic marker.

  18. A model for the operation of helium-filled proportional counter at low temperatures near 4.2 K

    International Nuclear Information System (INIS)

    Masaoka, Sei; Katano, Rintaro; Kishimoto, Shunji; Isozumi, Yasuhito

    2000-01-01

    In order to understand the operation of the helium-filled proportional counter (HFPC) from the standpoint of fundamental atomic and molecular processes, we have surveyed previous work on collision processes in discharged helium gas. By analyzing the gas gain curves, after-pulses and discharge currents experimentally observed at 4.2 K, the electron avalanche and the secondary electron emission from the cathode have been related to the collision processes in helium. A simplified model for HFPC operation at low temperatures near 4.2 K has been constructed with the related processes.

  19. Covariate selection for the semiparametric additive risk model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas

    2009-01-01

    This paper considers covariate selection for the additive hazards model. This model is particularly simple to study theoretically and its practical implementation has several major advantages over the similar methodology for the proportional hazards model. One complication compared … and study their large sample properties for the situation where the number of covariates p is smaller than the number of observations. We also show that the adaptive Lasso has the oracle property. In many practical situations, it is more relevant to tackle the situation with large p compared with the number … of observations. We do this by studying the properties of the so-called Dantzig selector in the setting of the additive risk model. Specifically, we establish a bound on how close the solution is to a true sparse signal in the case where the number of covariates is large. In a simulation study, we also compare …
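
    For contrast with the proportional hazards specification that dominates these records, the two regression structures can be written side by side (a standard textbook statement, with time-varying coefficients suppressed):

      proportional hazards:  \lambda(t \mid Z) = \lambda_0(t) \exp(\beta^\top Z)
      additive hazards:      \lambda(t \mid Z) = \lambda_0(t) + \beta^\top Z

    Covariates act on the hazard difference rather than the hazard ratio in the additive form, which is part of what makes the model simple to study theoretically.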

  20. Prognostic model for patients treated for colorectal adenomas with regard to development of recurrent adenomas and carcinoma

    DEFF Research Database (Denmark)

    Jensen, P; Krogsgaard, M R; Christiansen, J

    1996-01-01

    -80. INTERVENTIONS: All patients were followed up by rectoscopy and double contrast barium enema. The survival data were analysed by Cox's proportional hazards model. MAIN OUTCOME MEASURES: Variables of significant prognostic importance for recurrence of adenomas and the development of cancer were identified...

  1. Regression analysis of informative current status data with the additive hazards model.

    Science.gov (United States)

    Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo

    2015-04-01

    This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models when the censoring is noninformative, and there also exists a large literature on parametric analysis of informative current status data in the context of tumorigenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented in which a copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved, and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.

  2. Asian Americans and disproportionate exposure to carcinogenic hazardous air pollutants: A national study.

    Science.gov (United States)

    Grineski, Sara E; Collins, Timothy W; Morales, Danielle X

    2017-07-01

    Studies have demonstrated disparate exposures to carcinogenic hazardous air pollutants (HAPs) in neighborhoods with high densities of Black and Hispanic residents in the US. Asians are the fastest growing racial/ethnic group in the US, yet they have been underemphasized in previous studies of environmental health and injustice. This cross-sectional study investigated possible disparities in residential exposure to carcinogenic HAPs among Asian Americans, including Asian American subgroups, in the US (including all 50 states and the District of Columbia, n = 71,208 US census tracts) using National Air Toxics Assessment and US Census data. In an unadjusted analysis, Chinese and Korean Americans experience the highest mean cancer risks from HAPs, followed by Blacks. The aggregated Asian category ranks just below Blacks and above Hispanics in terms of carcinogenic HAP risk. Multivariate models adjusting for socioeconomic status, population density, urban location, and geographic clustering show that an increase in the proportion of Asian residents in census tracts is associated with significantly greater cancer risk from HAPs. Neighborhoods with higher proportions (as opposed to lower proportions) of Chinese, Korean, and South Asian residents have significantly greater cancer risk burdens relative to Whites. Tracts with higher concentrations of Asians speaking a non-English language and Asians who are US-born have significantly greater cancer risk burdens. Asian Americans experience substantial residential exposure to carcinogenic HAPs in US census tracts and in the US more generally. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Identifying model pollutants to investigate biodegradation of hazardous XOCs in WWTPs

    Energy Technology Data Exchange (ETDEWEB)

    Press-Kristensen, Kaare; Ledin, Anna; Schmidt, Jens Ejbye; Henze, Mogens [Department of Environment and Resources, Technical University of Denmark Building 115, 2800 Lyngby (Denmark)

    2007-02-01

    Xenobiotic organic compounds (XOCs) in wastewater treatment plant (WWTP) effluents might cause toxic effects in ecosystems. Several investigations have emphasized biodegradation as an important removal mechanism to reduce pollution with XOCs from WWTP effluents. The aim of the study was to design a screening tool to identify and select hazardous model pollutants for the further investigation of biodegradation in WWTPs. The screening tool consists of three criteria: the XOC is present in WWTP effluents, the XOC constitutes an intolerable risk in drinking water or the environment, and the XOC is expected to be biodegradable in WWTPs. The screening tool was tested on bisphenol A (BPA), carbamazepine (CBZ), di(2-ethylhexyl)phthalate (DEHP), 17β-estradiol (E2), estrone (E1), 17α-ethinylestradiol (EE2), ibuprofen, naproxen, nonylphenol (NP), and octylphenol (OP). BPA, DEHP, E2, E1, EE2, and NP passed all criteria in the screening tool and were selected as model pollutants. OP did not pass all criteria and was rejected as a model pollutant. CBZ, ibuprofen, and naproxen were not finally evaluated due to insufficient data. (author)

  4. A Real-Time Construction Safety Monitoring System for Hazardous Gas Integrating Wireless Sensor Network and Building Information Modeling Technologies.

    Science.gov (United States)

    Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng

    2018-02-02

    In recent years, many studies have focused on applying advanced technology to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates a database and geometry into a digital model that provides visualization throughout the construction lifecycle. This paper integrates BIM and WSN into a single system which enables a construction site to visually monitor safety status via a spatial, colored interface and to remove hazardous gas automatically. Wireless sensor nodes were placed on an underground construction site to collect hazardous gas levels and environmental conditions (temperature and humidity); in any region where an abnormal status is detected, the BIM model highlights the region, and an on-site alarm and ventilator start automatically to warn workers and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications.
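
    The alerting logic described in this record reduces to a threshold check over periodic sensor readings. The Python sketch below is purely illustrative and is not the authors' implementation; the node identifiers, gas threshold and actuator hook are invented.

        # Minimal sketch of threshold-based gas alerting (all values hypothetical).
        CO_THRESHOLD_PPM = 35.0

        readings = {            # latest reading per sensor node (node id -> ppm)
            "node-01": 12.4,
            "node-02": 48.9,    # abnormal region
            "node-03": 20.1,
        }

        def trigger_alarm(node_id):
            # In the described system this would also highlight the region in the
            # BIM model and switch on the on-site ventilator.
            print(f"[ALARM] hazardous gas near {node_id}")

        for node_id, ppm in readings.items():
            if ppm > CO_THRESHOLD_PPM:
                trigger_alarm(node_id)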

  5. Seismic hazard assessment of the Hanford region, Eastern Washington State

    International Nuclear Information System (INIS)

    Youngs, R.R.; Coppersmith, K.J.; Power, M.S.; Swan, F.H. III

    1985-01-01

    A probabilistic seismic hazard assessment was made for a site within the Hanford region of eastern Washington state, which is characterized as an intraplate region having a relatively low rate of seismic activity. Probabilistic procedures, such as logic trees, were utilized to account for the uncertainties in identifying and characterizing the potential seismic sources in the region. Logic trees provide a convenient, flexible means of assessing the values and relative likelihoods of input parameters to the hazard model that may be dependent upon each other. Uncertainties accounted for in this way include the tectonic model, segmentation, capability, fault geometry, maximum earthquake magnitude, and earthquake recurrence rate. The computed hazard results are expressed as a distribution from which confidence levels are assessed. Analysis of the results shows the contributions to the total hazard from various seismic sources and various earthquake magnitudes. In addition, the contributions of uncertainties in the various source parameters to the uncertainty in the computed hazard are assessed. For this study, the major contributions to uncertainty in the computed hazard are due to uncertainties in the applicable tectonic model and the earthquake recurrence rate. This analysis serves to illustrate some of the probabilistic tools that are available for conducting seismic hazard assessments and for analyzing the results of these studies. 5 references, 7 figures
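
    The logic-tree step described above combines hazard curves computed under alternative parameter choices, weighting each branch by its judged likelihood. The short Python sketch below illustrates only that weighted-combination step with invented branch weights and toy hazard curves; it is not the study's actual model.

        import numpy as np

        # Toy annual exceedance curves under two tectonic models and two
        # recurrence-rate branches (all values illustrative).
        pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5])   # peak ground acceleration (g)
        branch_curves = {
            ("model_A", "low_rate"):  np.array([1e-2, 3e-3, 1e-3, 4e-4, 1e-4]),
            ("model_A", "high_rate"): np.array([3e-2, 1e-2, 3e-3, 1e-3, 4e-4]),
            ("model_B", "low_rate"):  np.array([8e-3, 2e-3, 8e-4, 3e-4, 8e-5]),
            ("model_B", "high_rate"): np.array([2e-2, 8e-3, 2e-3, 8e-4, 3e-4]),
        }
        # Branch weights encode the judged relative likelihood of each combination.
        weights = {
            ("model_A", "low_rate"):  0.3,
            ("model_A", "high_rate"): 0.3,
            ("model_B", "low_rate"):  0.2,
            ("model_B", "high_rate"): 0.2,
        }

        mean_curve = sum(weights[b] * branch_curves[b] for b in branch_curves)
        for a, p in zip(pga, mean_curve):
            print(f"P(PGA > {a:.1f} g) per year = {p:.2e}")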

  6. Landslide Hazard Assessment and Mapping in the Guil Catchment (Queyras, Southern French Alps): From Landslide Inventory to Susceptibility Modelling

    Science.gov (United States)

    Roulleau, Louise; Bétard, François; Carlier, Benoît; Lissak, Candide; Fort, Monique

    2016-04-01

    Landslides are common natural hazards in the Southern French Alps, where they may affect human lives and cause severe damage to infrastructure. As a part of the SAMCO research project dedicated to risk evaluation in mountain areas, this study focuses on the Guil river catchment (317 km2), Queyras, to assess landslide hazard, which has been poorly studied until now. In that area, landslides are mainly occasional, low-amplitude phenomena, with limited direct impacts when compared to other hazards such as floods or snow avalanches. However, when interacting with floods during extreme rainfall events, landslides may have indirect consequences of greater importance because of strong hillslope-channel connectivity along the Guil River and its tributaries (i.e. positive feedbacks). This specific morphodynamic functioning reinforces the need for a better understanding of landslide hazards and their spatial distribution at the catchment scale to protect the local population from disasters of multi-hazard origin. The aim of this study is to produce landslide susceptibility mapping at 1:50 000 scale as a first step towards a global estimation of landslide hazard and risk. The three main methodologies used for assessing landslide susceptibility are qualitative (i.e. expert opinion), deterministic (i.e. physics-based models) and statistical methods (i.e. probabilistic models). Due to the rapid development of geographical information systems (GIS) during the last two decades, statistical methods are today widely used because they offer greater objectivity and reproducibility at large scales. Among them, multivariate analyses are considered the most robust techniques, especially the logistic regression method commonly used in landslide susceptibility mapping. However, this method, like others, is strongly dependent on the accuracy of the input data to avoid significant errors in the final results. In particular, a complete and accurate landslide inventory is required before the modelling

  7. The proportion of prostate biopsy tissue with Gleason pattern 4 or 5 predicts for biochemical and clinical outcome after radiotherapy for prostate cancer

    International Nuclear Information System (INIS)

    D'Ambrosio, David J.; Hanlon, Alexandra L.; Al-Saleem, Tahseen; Feigenberg, Steven J.; Horwitz, Eric M.; Uzzo, Robert G.; Pollack, Alan; Buyyounouski, Mark K.

    2007-01-01

    Purpose: To investigate the prognostic utility of the proportion of prostate biopsy tissue containing Gleason pattern 4 or 5 (GP4/5) after definitive radiotherapy (RT) for prostate cancer. Methods and Materials: A total of 568 patients with T1c-3 Nx/0 prostate cancer who received three-dimensional conformal RT alone between May 1989 and August 2001 were studied. There were 161 men with Gleason score 7-10 disease. The GP4/5 was defined as the percentage of biopsy tissue containing Gleason pattern 4 or 5. A Cox proportional hazards model was used for univariate and multivariate analyses (MVA) of biochemical failure (BF) (American Society for Therapeutic Radiology and Oncology definition) and distant metastasis (DM). A recursive partitioning analysis was performed using the results of the MVA to identify a cutpoint for GP4/5. Results: The median follow-up was 46 (range, 13-114) months and the median RT dose was 76 (range, 65-82) Gy. On MVA, increasing initial prostate-specific antigen (p = 0.0248), decreasing RT dose (continuous, p = 0.0022), T stage (T1/2 vs. T3) (p = 0.0136), and GP4/5 (continuous, p < 0.0001) were significant predictors of BF in a model also containing Gleason score. GP4/5 was the only significant predictor of DM in the same model (p < 0.0001). Conclusion: The GP4/5 in prostate biopsy specimens is a predictor of BF and DM after RT, independent of Gleason score. This parameter should be reported by the pathologist when reviewing prostatic biopsy specimens.

  8. Urban-hazard risk analysis: mapping of heat-related risks in the elderly in major Italian cities.

    Science.gov (United States)

    Morabito, Marco; Crisci, Alfonso; Gioli, Beniamino; Gualtieri, Giovanni; Toscano, Piero; Di Stefano, Valentina; Orlandini, Simone; Gensini, Gian Franco

    2015-01-01

    Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related elderly risks. Development of high-resolution, heat-related urban risk maps regarding the elderly population (aged ≥ 65). A long time-series (2001-2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, was downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: 1) the Linear Regression Model (LRM); 2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source), and processed together using "Crichton's Risk Triangle" hazard-risk methodology to obtain a Heat-related Elderly Risk Index (HERI). The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city-centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in coastal rather than inland cities. This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street-level detail. This information could support public health operators and facilitate coordination for heat

  9. Urban-hazard risk analysis: mapping of heat-related risks in the elderly in major Italian cities.

    Directory of Open Access Journals (Sweden)

    Marco Morabito

    Full Text Available Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related elderly risks. Development of high-resolution, heat-related urban risk maps regarding the elderly population (aged ≥ 65). A long time-series (2001-2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, was downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: (1) the Linear Regression Model (LRM); (2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source), and processed together using "Crichton's Risk Triangle" hazard-risk methodology for obtaining a Heat-related Elderly Risk Index (HERI). The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city-centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in the coastal rather than inland cities. This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street level details. This information could support public health operators and facilitate coordination for heat

  10. An evaluation of three representative multimedia models used to support cleanup decision-making at hazardous, mixed, and radioactive waste sites

    International Nuclear Information System (INIS)

    Moskowitz, P.D.; Pardi, R.; Fthenakis, V.M.; Holtzman, S.

    1996-01-01

    The decision process involved in cleaning up sites contaminated with hazardous, mixed, and radioactive materials is often supported by results obtained from computer models. These results provide limits within which a decision-maker can judge the importance of individual transport and fate processes, and the likely outcome of alternative cleanup strategies. The transport of hazardous materials may occur predominantly through one particular pathway but, more often, actual or potential transport must be evaluated across several pathways and media. Multimedia models are designed to simulate the transport of contaminants from a source to a receptor through more than one environmental pathway. Three such multimedia models are reviewed here: MEPAS, MMSOILS, and PRESTO-EPA-CPG. The reviews are based on documentation provided with the software, on published reviews, on personal interviews with the model developers, and on model summaries extracted from computer databases and expert systems. The three models are reviewed within the context of specific media components: air, surface water, ground water, and food chain. Additional sections evaluate the way that these three models calculate human exposure and dose and how they report uncertainty. Special emphasis is placed on how each model handles radionuclide transport within specific media. For the purpose of simulating the transport, fate, and effects of radioactive contaminants through more than one pathway, both MEPAS and PRESTO-EPA-CPG are adequate for screening studies; MMSOILS only handles nonradioactive substances and must be modified before it can be used in these same applications. Of the three models, MEPAS is the most versatile, especially if the user needs to model the transport, fate, and effects of hazardous and radioactive contaminants. 44 refs., 2 tabs

  11. Proportioning of light weight concrete

    DEFF Research Database (Denmark)

    Palmus, Lars

    1996-01-01

    Development of a method to determine the proportions of the raw materials in lightweight concrete made with light expanded clay aggregate. The method is based on composite theory.

  12. Bayesian network learning for natural hazard assessments

    Science.gov (United States)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature as well as lacking knowledge about their driving forces and potential effects make their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the involved uncertainties, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) procedures. Within the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks to diverse natural hazard and vulnerability studies. The great potential of Bayesian networks was already shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow one to learn about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables.
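
    As a toy illustration of the inference step described above, the sketch below builds a three-node discrete Bayesian network with the pgmpy library (class names as in recent pgmpy releases) and queries a conditional distribution. The variables, states and probabilities are invented and are not taken from the NatRiskChange studies.

        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        # Toy network: precaution and hazard intensity jointly drive damage.
        model = BayesianNetwork([("precaution", "damage"), ("intensity", "damage")])

        cpd_prec = TabularCPD("precaution", 2, [[0.6], [0.4]])   # none / taken
        cpd_int  = TabularCPD("intensity", 2, [[0.7], [0.3]])    # low / high
        cpd_dmg  = TabularCPD(
            "damage", 2,
            # P(damage | precaution, intensity); one column per parent state combination
            [[0.9, 0.5, 0.95, 0.7],    # damage = low
             [0.1, 0.5, 0.05, 0.3]],   # damage = high
            evidence=["precaution", "intensity"], evidence_card=[2, 2],
        )
        model.add_cpds(cpd_prec, cpd_int, cpd_dmg)
        assert model.check_model()

        # Effect of precautionary measures given a high-intensity event.
        infer = VariableElimination(model)
        print(infer.query(["damage"], evidence={"precaution": 1, "intensity": 1}))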

  13. Probabilistic seismic hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D. [New Brunswick Power Corp., Point Lepreau Generating Station, Lepreau, New Brunswick (Canada); Lavine, A. [AMEC Foster Wheeler Environment and Infrastructure Americas, Oakland, California (United States); Egan, J. [SAGE Engineers, Oakland, California (United States)

    2015-09-15

    A Probabilistic Seismic Hazard Assessment (PSHA) has been performed for the Point Lepreau Generating Station (PLGS). The objective is to provide characterization of the earthquake ground shaking that will be used to evaluate seismic safety. The assessment is based on the current state of knowledge of the informed scientific and engineering community regarding earthquake hazards in the site region, and includes two primary components: a seismic source model and a ground motion model. This paper provides the methodology and results of the PLGS PSHA. The implications of the updated hazard information for site safety are discussed in a separate paper. (author)

  14. Probabilistic seismic hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D., E-mail: dmullin@nbpower.com [New Brunswick Power Corporation, Point Lepreau Generating Station, Point Lepreau, NB (Canada); Lavine, A., E-mail: alexis.lavine@amecfw.com [AMEC Foster Wheeler Environment & Infrastructure Americas, Oakland, CA (United States); Egan, J., E-mail: jegan@sageengineers.com [SAGE Engineers, Oakland, CA (United States)

    2015-07-01

    A Probabilistic Seismic Hazard Assessment (PSHA) has been performed for the Point Lepreau Generating Station (PLGS). The objective is to provide characterization of the earthquake ground shaking that will be used to evaluate seismic safety. The assessment is based on the current state of knowledge of the informed scientific and engineering community regarding earthquake hazards in the site region, and includes two primary components: a seismic source model and a ground motion model. This paper provides the methodology and results of the PLGS PSHA. The implications of the updated hazard information for site safety are discussed in a separate paper. (author)

  15. Volcanic Hazard Assessments for Nuclear Installations: Methods and Examples in Site Evaluation

    International Nuclear Information System (INIS)

    2016-07-01

    To provide guidance on the protection of nuclear installations against the effects of volcanoes, the IAEA published in 2012 IAEA Safety Standards Series No. SSG-21, Volcanic Hazards in Site Evaluation for Nuclear Installations. SSG-21 addresses hazards relating to volcanic phenomena, and provides recommendations and general guidance for evaluation of these hazards. Unlike seismic hazard assessments, models for volcanic hazard assessment have not undergone decades of review, evaluation and testing for suitability in evaluating hazards at proposed nuclear installations. Currently in volcanology, scientific developments and detailed methodologies to model volcanic phenomena are evolving rapidly. This publication provides information on detailed methodologies and examples in the application of volcanic hazard assessment to site evaluation for nuclear installations, thereby addressing the recommendations in SSG-21. Although SSG-21 develops a logical framework for conducting a volcanic hazard assessment, this publication demonstrates the practicability of evaluating the recommendations in SSG-21 through a systematic volcanic hazard assessment and examples from Member States. The results of this hazard assessment can be used to derive the appropriate design bases and operational considerations for specific nuclear installations.

  16. Identifying nonproportional covariates in the Cox model

    Czech Academy of Sciences Publication Activity Database

    Kraus, David

    2008-01-01

    Roč. 37, č. 4 (2008), s. 617-625 ISSN 0361-0926 R&D Projects: GA AV ČR(CZ) IAA101120604; GA MŠk(CZ) 1M06047; GA ČR(CZ) GD201/05/H007 Institutional research plan: CEZ:AV0Z10750506 Keywords : Cox model * goodness of fit * proportional hazards assumption * time-varying coefficients Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.324, year: 2008
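
    The proportional hazards assumption examined in this work can be checked in practice with scaled Schoenfeld residuals. The sketch below uses the lifelines package and its bundled example data set; it is a generic diagnostic of the same assumption, not the specific procedure developed in the paper.

        from lifelines import CoxPHFitter
        from lifelines.datasets import load_rossi

        # Fit a Cox model on the bundled recidivism example data.
        df = load_rossi()
        cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

        # Tests each covariate against the proportional hazards assumption and
        # flags likely violations; failing covariates are candidates for
        # time-varying coefficients.
        cph.check_assumptions(df, p_value_threshold=0.05)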

  17. A semiparametric hazard model of activity timing and sequencing decisions during visits to theme parks using experimental design data

    NARCIS (Netherlands)

    Kemperman, A.D.A.M.; Borgers, A.W.J.; Timmermans, H.J.P.

    2002-01-01

    In this study we introduce a semiparametric hazard-based duration model to predict the timing and sequence of theme park visitors' activity choice behavior. The model is estimated on the basis of observations of consumer choices in various hypothetical theme parks. These parks are constructed by

  18. Seismic hazard estimation based on the distributed seismicity in northern China

    Science.gov (United States)

    Yang, Yong; Shi, Bao-Ping; Sun, Liang

    2008-03-01

    In this paper, we have proposed an alternative seismic hazard modeling approach using distributed seismicity. The distributed seismicity model does not need delineation of seismic source zones and simplifies the methodology of probabilistic seismic hazard analysis. Based on the catalogue of devastating earthquakes, we established three seismicity models, derived the distribution of the a-value in northern China by using a Gaussian smoothing function, and calculated peak ground acceleration distributions for this area with 2%, 5% and 10% probability of exceedance in a 50-year period using three attenuation models, respectively. In general, the peak ground motion distribution patterns are consistent with the current seismic hazard map of China, but in some specific seismic zones, including Shanxi Province and the Shijiazhuang area, our results indicate somewhat higher peak ground motions and zonation characteristics that agree with the seismicity distribution patterns in these areas. Hazard curves have been developed for Beijing, Tianjin, Taiyuan, Tangshan, and Ji'nan, the metropolitan cities in northern China. The results show that Tangshan, Taiyuan and Beijing have a higher seismic hazard than the other cities mentioned above.
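
    The a-value smoothing step mentioned above can be pictured in a few lines of Python: gridded counts of catalogued epicentres are convolved with a Gaussian kernel to produce a spatially smoothed activity-rate field. The grid, counts and kernel width below are invented for illustration.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        # Hypothetical grid of earthquake epicentre counts per cell.
        rng = np.random.default_rng(0)
        counts = rng.poisson(lam=0.5, size=(20, 30)).astype(float)

        # Smooth the raw counts with a Gaussian kernel; sigma (in cells) plays
        # the role of the correlation distance in distributed-seismicity models.
        smoothed = gaussian_filter(counts, sigma=2.0)

        # The smoothed activity rate per cell can then feed the hazard integration.
        print(smoothed.round(2))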

  19. CalTOX, a multimedia total exposure model for hazardous-waste sites

    International Nuclear Information System (INIS)

    McKone, T.E.

    1993-06-01

    CalTOX has been developed as a spreadsheet model to assist in health-risk assessments that address contaminated soils and the contamination of adjacent air, surface water, sediments, and ground water. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify and reduce uncertainty in multimedia, multiple-pathway exposure models. This report provides an overview of the CalTOX model components, lists the objectives of the model, describes the philosophy under which the model was developed, identifies the chemical classes for which the model can be used, and describes critical sensitivities and uncertainties. The multimedia transport and transformation model is a dynamic model that can be used to assess time-varying concentrations of contaminants introduced initially to soil layers or for contaminants released continuously to air or water. This model assists the user in examining how chemical and landscape properties impact both the ultimate route and quantity of human contact. Multimedia, multiple-pathway exposure models are used in the CalTOX model to estimate average daily potential doses within a human population in the vicinity of a hazardous substance release site. The exposure models encompass twenty-three exposure pathways. The exposure assessment process consists of relating contaminant concentrations in the multimedia model compartments to contaminant concentrations in the media with which a human population has contact (personal air, tap water, foods, household dusts, soils, etc.). The average daily dose is the product of the exposure concentrations in these contact media and an intake or uptake factor that relates the concentrations to the distributions of potential dose within the population.

  20. Numerical Modelling of Extreme Natural Hazards in the Russian Seas

    Science.gov (United States)

    Arkhipkin, Victor; Dobrolyubov, Sergey; Korablina, Anastasia; Myslenkov, Stanislav; Surkova, Galina

    2017-04-01

    Storm surges and extreme waves are severe natural sea hazards. Due to the almost complete lack of natural observations of these phenomena in the Russian seas (Caspian, Black, Azov, Baltic, White, Barents, Okhotsk, Kara), especially regarding their formation, development and decay, they have been studied using numerical simulation. To calculate the parameters of wind waves for the seas listed above, except the Barents Sea, the spectral model SWAN was applied. For the Barents and Kara seas we used the WAVEWATCH III model. Formation and development of storm surges were studied using the ADCIRC model. The input data for the models were bottom topography, wind, atmospheric pressure and ice cover. In modeling surges in the White and Barents seas, tidal level fluctuations were used; they were calculated from 16 harmonic constants obtained from the global tidal atlas FES2004. Wind, atmospheric pressure and ice cover were taken from the NCEP/NCAR reanalysis for the period from 1948 to 2010, and the NCEP/CFSR reanalysis for the period from 1979 to 2015. In the modeling we used both regular and unstructured grids. The wave climate of the Caspian, Black, Azov, Baltic and White seas was obtained, and the extreme wave height possible once in 100 years was calculated. The statistics of storm surges for the White, Barents and Azov seas were evaluated, and the contributions of wind and atmospheric pressure to the formation of surges were estimated. A technique for the climatic forecasting of the frequency of storm synoptic situations was developed and applied to every sea. The research was carried out with the financial support of the RFBR (grant 16-08-00829).

  1. Assessment of volcanic hazards, vulnerability, risk and uncertainty (Invited)

    Science.gov (United States)

    Sparks, R. S.

    2009-12-01

    A volcanic hazard is any phenomenon that threatens communities. These hazards include volcanic events like pyroclastic flows, explosions, ash fall and lavas, and secondary effects such as lahars and landslides. Volcanic hazards are described by the physical characteristics of the phenomena, by the assessment of the areas that they are likely to affect, and by the magnitude-dependent return period of events. Volcanic hazard maps are generated by mapping past volcanic events and by modelling the hazardous processes. Both these methods have their strengths and limitations, and a robust map should use both approaches in combination. Past records, studied through stratigraphy, the distribution of deposits and age dating, are typically incomplete and may be biased. Very significant volcanic hazards, such as surge clouds and volcanic blasts, are for example not well preserved in the geological record. Models of volcanic processes are very useful to help identify hazardous areas that do not have any geological evidence. They are, however, limited by simplifications and incomplete understanding of the physics. Many practical volcanic hazard mapping tools are also very empirical. Hazard maps are typically abstracted into hazard zone maps, which are sometimes called threat or risk maps. Their aim is to identify areas at high levels of threat, and the boundaries between zones may take account of other factors such as roads, escape routes during evacuation, and infrastructure. These boundaries may change with time due to new knowledge on the hazards or changes in volcanic activity levels. Alternatively they may remain static, but the implications of the zones may change as volcanic activity changes. Zone maps are used for planning purposes and for management of volcanic crises. Volcanic hazard maps are depictions of the likelihood of future volcanic phenomena affecting places and people. Volcanic phenomena are naturally variable, often complex and not fully understood. There are

  2. Development of multiwire proportional chambers

    CERN Multimedia

    Charpak, G

    1969-01-01

    It has happened quite often in the history of science that theoreticians, confronted with some major difficulty, have successfully gone back thirty years to look at ideas that had then been thrown overboard. But it is rare that experimentalists go back thirty years to look again at equipment which had become out-dated. This is what Charpak and his colleagues did to emerge with the 'multiwire proportional chamber', which has several new features making it a very useful addition to the armoury of particle detectors. In the 1930s, ion chambers, Geiger-Müller counters and proportional counters were vital pieces of equipment in nuclear physics research. Other types of detectors have since largely replaced them, but now the proportional counter, in a new array, is making a comeback.

  3. Managing patients with concerns about workplace reproductive hazards.

    Science.gov (United States)

    Frazier, L M; Jones, T L

    2000-01-01

    To find out who uses an occupational reproductive consultation service, what proportion of patients have different types of workplace exposures, and what hypotheses can be generated about barriers to implementing medically necessary job modifications to promote reproductive health. A case series study was conducted by reviewing medical records at two occupational health clinics. 51 patients (1 man and 50 women) were seen, 10 of whom wished to discuss a future pregnancy and 41 of whom were pregnant. Pregnant women worked with a mean of 15.5 different chemicals, and patients were also concerned about ionizing radiation, biological hazards, electromagnetic fields, and ultraviolet light. Pregnant women made clinic visits at a mean gestational age of 10.9 weeks. Only one man used the service, suggesting a lack of knowledge about possible paternal contributions to adverse reproductive outcomes. Many pregnant women visited the clinic too late to prevent harm from exposure to some teratogens, so preconception counseling may be of benefit. Cases are presented that illustrate ways in which the primary care provider can assist the patient who may be exposed to reproductive hazards.

  4. Sustainability-Based Flood Hazard Mapping of the Swannanoa River Watershed

    Directory of Open Access Journals (Sweden)

    Ebrahim Ahmadisharaf

    2017-09-01

    Full Text Available An integrated framework is presented for sustainability-based flood hazard mapping of the Swannanoa River watershed in the state of North Carolina, U.S. The framework uses a hydrologic model for rainfall-runoff transformation, a two-dimensional unsteady hydraulic model for flood simulation, and a GIS-based multi-criteria decision-making technique for flood hazard mapping. Economic, social, and environmental flood hazards are taken into account. The importance of each hazard is quantified through a survey of experts. Utilizing the proposed framework, sustainability-based flood hazard mapping is performed for the 100-year design event, and the overall flood hazard is provided for each geographic location. The sensitivity of the overall hazard with respect to the weights of the three hazard components was also investigated. While the conventional flood management approach is to assess the environmental impacts of mitigation measures after a set of feasible options is selected, the presented framework incorporates the environmental impacts into the analysis concurrently with the economic and social influences. Thereby, it provides a more sustainable perspective of flood management and can greatly help decision-makers to make better-informed decisions by clearly understanding the impacts of flooding on the economy, society and environment.
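
    The multi-criteria combination step described above amounts to a weighted overlay of the component hazard layers. Below is a minimal Python sketch with invented expert weights and toy raster values; it is not the authors' GIS workflow.

        import numpy as np

        # Toy rasters of normalized (0-1) component flood hazards for one area.
        economic      = np.array([[0.2, 0.5, 0.9], [0.1, 0.6, 0.8]])
        social        = np.array([[0.3, 0.4, 0.7], [0.2, 0.5, 0.9]])
        environmental = np.array([[0.1, 0.3, 0.6], [0.2, 0.4, 0.7]])

        # Hypothetical importance weights elicited from experts (sum to 1).
        w_econ, w_soc, w_env = 0.5, 0.3, 0.2

        # Overall flood hazard per cell for the design event.
        overall = w_econ * economic + w_soc * social + w_env * environmental
        print(overall.round(2))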

  5. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    Science.gov (United States)

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.

  6. Methods for developing seismic and extreme wind-hazard models for evaluating critical structures and equipment at US Department of Energy facilities and commercial plutonium facilities in the United States

    International Nuclear Information System (INIS)

    Coats, D.W.; Murray, R.C.; Bernreuter, D.L.

    1981-01-01

    Lawrence Livermore National Laboratory (LLNL) is developing seismic and wind hazard models for the US Department of Energy (DOE). The work is part of a three-phase effort to establish building design criteria developed with a uniform methodology for seismic and wind hazards at the various DOE sites throughout the United States. In Phase 1, LLNL gathered information on the sites and their critical facilities, including nuclear reactors, fuel-reprocessing plants, high-level waste storage and treatment facilities, and special nuclear material facilities. Phase 2 - development of seismic and wind hazard models - is discussed in this paper, which summarizes the methodologies used by seismic and extreme-wind experts and gives sample hazard curves for the first sites to be modeled. These hazard models express the annual probability that the site will experience an earthquake (or windspeed) greater than some specified magnitude. In the final phase, the DOE will use the hazard models and LLNL-recommended uniform design criteria to evaluate critical facilities. The methodology presented in this paper was also used for a related LLNL study involving the seismic assessment of six commercial plutonium fabrication plants licensed by the US Nuclear Regulatory Commission (NRC). Details and results of this reassessment are documented in reference

  7. Regional-scale GIS-models for assessment of hazards from glacier lake outbursts: evaluation and application in the Swiss Alps

    Directory of Open Access Journals (Sweden)

    C. Huggel

    2003-01-01

    Full Text Available Debris flows triggered by glacier lake outbursts have repeatedly caused disasters in various high-mountain regions of the world. Accelerated change of glacial and periglacial environments due to atmospheric warming and increased anthropogenic development in most of these areas raise the need for an adequate hazard assessment and corresponding modelling. The purpose of this paper is to provide a modelling approach which takes into account the current evolution of the glacial environment and permits a robust first-order assessment of hazards from glacier-lake outbursts. Two topography-based GIS models simulating debris flows related to outbursts from glacier lakes are presented and applied to two lake outburst events in the southern Swiss Alps. The models are based on information about glacier lakes derived from remote sensing data, and on digital elevation models (DEM). Hydrological flow routing is used to simulate the debris flow resulting from the lake outburst, whereby a multiple- and a single-flow-direction approach are applied. Debris-flow propagation is given as probability-related values indicating the hazard potential of a certain location. The debris flow runout distance is calculated on the basis of empirical data on average slope trajectory. The results show that the multiple-flow-direction approach generally yields a more detailed propagation. The single-flow-direction approach, however, is more robust against DEM artifacts and, hence, better suited for process automation. The model is tested with three differently generated DEMs (including aero-photogrammetry- and satellite-image-derived ones). Potential application of the respective DEMs is discussed with a special focus on satellite-derived DEMs for use in remote high-mountain areas.

  8. Wind hazard assessment for Point Lepreau Generating Station

    International Nuclear Information System (INIS)

    Mullin, D.; Moland, M.; Sciaudone, J.C.; Twisdale, L.A.; Vickery, P.J.; Mizzen, D.R.

    2015-01-01

    In response to the CNSC Fukushima Action Plan, NB Power has embarked on a wind hazard assessment for the Point Lepreau Generating Station site that incorporates the latest up to date wind information and modeling. The objective was to provide characterization of the wind hazard from all potential sources and estimate wind-driven missile fragilities and wind pressure fragilities for various structures, systems and components that would provide input to a possible high wind Probabilistic Safety Assessment. The paper will discuss the overall methodology used to assess hazards related to tornadoes, hurricanes and straight-line winds, and site walk-down and hazard/fragility results. (author)

  9. Hazards and hazard combinations relevant for the safety of nuclear power plants

    Science.gov (United States)

    Decker, Kurt; Brinkman, Hans; Raimond, Emmanuel

    2017-04-01

    The potential of the contemporaneous impact of different, yet causally related, hazardous events and event cascades on nuclear power plants is a major contributor to the overall risk of nuclear installations. In the aftermath of the Fukushima accident, which was caused by a combination of severe ground shaking by an earthquake, an earthquake-triggered tsunami and the disconnection of the plants from the electrical grid by a seismically induced landslide, hazard combinations and hazard cascades moved into the focus of nuclear safety research. We therefore developed an exhaustive list of external hazards and hazard combinations which pose potential threats to nuclear installations in the framework of the European project ASAMPSAE (Advanced Safety Assessment: Extended PSA). The project gathers 31 partners from Europe, North America and Japan. The list comprises exhaustive lists of natural hazards and external man-made hazards, and a cross-correlation matrix of these hazards. The hazard list is regarded as comprehensive, including all types of hazards that were previously cited in documents by the IAEA, the Western European Nuclear Regulators Association (WENRA), and others. 73 natural hazards and 24 man-made external hazards are included. Natural hazards are grouped into seismotectonic hazards, flooding and hydrological hazards, extreme values of meteorological phenomena, rare meteorological phenomena, biological hazards / infestation, geological hazards, and forest fire / wild fire. The list of external man-made hazards includes industry accidents, military accidents, transportation accidents, pipeline accidents and other man-made external events. The large number of different hazards results in the extremely large number of 5151 theoretically possible hazard combinations (not considering hazard cascades). In principle all of these combinations can occur by random coincidence, except for 82 hazard combinations that - depending on the time scale - are mutually

  10. Hazardous Waste

    Science.gov (United States)

    ... chemicals can still harm human health and the environment. When you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint thinner. U.S. residents ...

  11. Current Status and Application of Hazard Definition Technology

    Science.gov (United States)

    Greene, George C.

    1997-01-01

    Research was performed to define wake non-encounter and hazard criteria, to provide requirements for sensors, and to obtain input from the user community. This research includes: validating wake encounter simulation models, establishing a metric to quantify the upset potential of a wake encounter, applying the hazard metric and simulation models to the commercial fleet to develop candidate acceptable encounter limits, and applying the technology to near-term problems to evaluate its current status. The following lessons were learned from this project: technology is not adequate to determine absolute spacing requirements; time, not distance, determines the duration of the wake hazard; optimum standards depend on the traffic; wing span is an important parameter for characterizing both generator and follower; and short-span "biz jets" are easily rolled.

  12. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  13. Integrating Volcanic Hazard Data in a Systematic Approach to Develop Volcanic Hazard Maps in the Lesser Antilles

    Directory of Open Access Journals (Sweden)

    Jan M. Lindsay

    2018-04-01

    Full Text Available We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick 'em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia, and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenario(s) of likely future activity, for which scenario-based hazard maps were generated. For the most likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past ~10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. We hope that

  14. Partitioning into hazard subregions for regional peaks-over-threshold modeling of heavy precipitation

    Science.gov (United States)

    Carreau, J.; Naveau, P.; Neppel, L.

    2017-05-01

    The French Mediterranean is subject to intense precipitation events occurring mostly in autumn. These can potentially cause flash floods, the main natural danger in the area. The distribution of these events follows specific spatial patterns, i.e., some sites are more likely to be affected than others. The peaks-over-threshold approach consists in modeling extremes, such as heavy precipitation, by the generalized Pareto (GP) distribution. The shape parameter of the GP controls the probability of extreme events and can be related to the hazard level of a given site. When interpolating across a region, the shape parameter should reproduce the observed spatial patterns of the probability of heavy precipitation. However, the shape parameter estimators have high uncertainty which might hide the underlying spatial variability. As a compromise, we choose to let the shape parameter vary in a moderate fashion. More precisely, we assume that the region of interest can be partitioned into subregions with constant hazard level. We formalize the model as a conditional mixture of GP distributions. We develop a two-step inference strategy based on probability weighted moments and put forward a cross-validation procedure to select the number of subregions. A synthetic data study reveals that the inference strategy is consistent and not very sensitive to the selected number of subregions. An application on daily precipitation data from the French Mediterranean shows that the conditional mixture of GPs outperforms two interpolation approaches (with constant or smoothly varying shape parameter).
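
    As a pointer to the peaks-over-threshold machinery used here, the sketch below fits a generalized Pareto distribution to threshold exceedances with SciPy. The synthetic rainfall series and the threshold choice are illustrative only; the paper's conditional mixture of GPs and its probability-weighted-moments inference are not reproduced.

        import numpy as np
        from scipy.stats import genpareto

        # Synthetic daily precipitation series (mm), for illustration only.
        rng = np.random.default_rng(1)
        precip = rng.gamma(shape=0.8, scale=6.0, size=10_000)

        threshold = np.quantile(precip, 0.95)            # peaks-over-threshold cutoff
        excesses = precip[precip > threshold] - threshold

        # Fit the GP distribution to the excesses; the shape parameter c
        # controls the probability of extreme events (the hazard level).
        c, loc, scale = genpareto.fit(excesses, floc=0.0)
        print(f"shape = {c:.3f}, scale = {scale:.2f}")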

  15. GPS Imaging of Time-Variable Earthquake Hazard: The Hilton Creek Fault, Long Valley California

    Science.gov (United States)

    Hammond, W. C.; Blewitt, G.

    2016-12-01

    The Hilton Creek Fault, in Long Valley, California, is a down-to-the-east normal fault that bounds the eastern edge of the Sierra Nevada/Great Valley microplate and lies half inside and half outside the magmatically active caldera. Despite the dense coverage with GPS networks, the rapid and time-variable surface deformation attributable to sporadic magmatic inflation beneath the resurgent dome makes it difficult to use traditional geodetic methods to estimate the slip rate of the fault. While geologic studies identify cumulative offset, constrain the timing of past earthquakes, and constrain a Quaternary slip rate to within 1-5 mm/yr, it is not currently possible to use geologic data to evaluate how the potential for slip correlates with transient caldera inflation. To estimate the time-variable seismic hazard of the fault, we estimate its instantaneous slip rate from GPS data using a new set of algorithms for robust estimation of velocity and strain rate fields and fault slip rates. From the GPS time series, we use the robust MIDAS algorithm to obtain time series of velocity that are highly insensitive to the effects of seasonality, outliers and steps in the data. We then use robust imaging of the velocity field to estimate a gridded time-variable velocity field. Then we estimate the fault slip rate at each time using a new technique that forms ad-hoc block representations that honor fault geometries, network complexity and connectivity, but does not require labor-intensive drawing of block boundaries. The results are compared to other slip rate estimates that have implications for hazard over different time scales. Time-invariant long-term seismic hazard is proportional to the long-term slip rate accessible from geologic data. Contemporary time-invariant hazard, however, may differ from the long-term rate, and is estimated from the geodetic velocity field that has been corrected for the effects of magmatic inflation in the caldera using a published model of a dipping ellipsoidal

  16. Hazard perception in traffic. [previously known as: Hazard perception.]

    NARCIS (Netherlands)

    2008-01-01

    Hazard perception is an essential part of the driving task. There are clear indications that insufficient skills in perceiving hazards play an important role in the occurrence of crashes, especially those involving novice drivers. Proper hazard perception not only consists of scanning and perceiving

  17. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Antoun, T; Harris, D; Lay, T; Myers, S C; Pasyanos, M E; Richards, P; Rodgers, A J; Walter, W R; Zucca, J J

    2008-02-11

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes a path by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas.

  18. Proportional-Integral-Resonant AC Current Controller

    Directory of Open Access Journals (Sweden)

    STOJIC, D.

    2017-02-01

    Full Text Available In this paper an improved stationary-frame AC current controller based on the proportional-integral-resonant control action (PIR) is proposed. Namely, the novel two-parameter PIR controller is applied in the stationary-frame AC current control, accompanied by the corresponding parameter-tuning procedure. In this way, the proportional-resonant (PR) controller, common in stationary-frame AC current control, is extended by the integral (I) action in order to enable tracking of the DC component of the AC current and to compensate DC disturbances caused by voltage source inverter (VSI) nonidealities and by nonlinear loads. The proposed controller parameter-tuning procedure is based on the three-phase back-EMF-type load, which corresponds to a wide range of AC power converter applications, such as AC motor drives, uninterruptible power supplies, and active filters. While PIR controllers commonly have three parameters, the novel controller has two. Also, the provided parameter-tuning procedure needs only one parameter to be tuned in relation to the load and power converter model parameters, since the second controller parameter is directly derived from the required controller bandwidth value. The dynamic performance of the proposed controller is verified by means of simulation and experimental runs.
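    A minimal discrete-time sketch of a PIR control law on a toy RL load may help fix ideas; the gains, load parameters, reference and forward-Euler discretization below are all assumptions for illustration, and the paper's two-parameter tuning procedure is not reproduced:

```python
import numpy as np

def simulate_pir(kp, ki, kr, w0, dt=1e-4, t_end=0.1):
    """Toy stationary-frame PIR current loop on an RL load (no back-EMF)."""
    R, L = 1.0, 1e-2                              # assumed load parameters
    t = np.arange(0.0, t_end, dt)
    ref = 5.0 * np.sin(2 * np.pi * 50.0 * t) + 0.5  # AC reference + DC component
    i, integ, x1, x2 = 0.0, 0.0, 0.0, 0.0
    out = np.zeros_like(t)
    for k in range(t.size):
        e = ref[k] - i
        integ += e * dt                           # I action: handles the DC part
        # resonant term R(s) = kr*s/(s^2 + w0^2), forward-Euler state update
        x1, x2 = x1 + dt * x2, x2 + dt * (e - w0**2 * x1)
        u = kp * e + ki * integ + kr * x2         # PIR control law
        i += dt * (u - R * i) / L                 # RL plant step
        out[k] = i
    return t, ref, out

t, ref, i = simulate_pir(kp=20.0, ki=200.0, kr=2000.0, w0=2 * np.pi * 50.0)
rms = np.sqrt(np.mean((ref[-200:] - i[-200:])**2))  # error over the last 50 Hz cycle
print("steady-state RMS tracking error:", round(float(rms), 4))
```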

  19. Kalman-predictive-proportional-integral-derivative (KPPID)

    International Nuclear Information System (INIS)

    Fluerasu, A.; Sutton, M.

    2004-01-01

    With third generation synchrotron X-ray sources, it is possible to acquire detailed structural information about the system under study with time resolution orders of magnitude faster than was possible a few years ago. These advances have generated many new challenges for changing and controlling the state of the system on very short time scales, in a uniform and controlled manner. For our particular X-ray experiments on crystallization or order-disorder phase transitions in metallic alloys, we need to change the sample temperature by hundreds of degrees as fast as possible while avoiding over- or undershooting. To achieve this, we designed and implemented a computer-controlled temperature tracking system which combines standard Proportional-Integral-Derivative (PID) feedback, thermal modeling and finite difference thermal calculations (feedforward), and Kalman filtering of the temperature readings in order to reduce the noise. The resulting Kalman-Predictive-Proportional-Integral-Derivative (KPPID) algorithm allows us to obtain accurate control, to minimize the response time and to avoid over/undershooting, even in systems with inherently noisy temperature readings and time delays. The KPPID temperature controller was successfully implemented at the Advanced Photon Source at Argonne National Laboratory and was used to perform coherent and time-resolved X-ray diffraction experiments.
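    A minimal sketch of the core idea, filtering the noisy reading with a scalar Kalman filter before applying the PID law; the thermal-model feedforward term of the full KPPID is omitted, and the plant, gains and noise levels below are made-up assumptions:

```python
import numpy as np

def kppid_step(z, state, setpoint, dt,
               kp=2.0, ki=0.05, kd=0.1, q=1e-2, r=1.0):
    """One iteration: scalar Kalman filter on reading z, then PID.
    state = (x, p, integ, prev_x): estimate, variance, integral, last estimate."""
    x, p, integ, prev_x = state
    p = p + q                            # predict (random-walk model, variance q)
    k = p / (p + r)                      # Kalman gain, measurement variance r
    x = x + k * (z - x)                  # update with the noisy reading
    p = (1.0 - k) * p
    e = setpoint - x                     # PID acts on the *filtered* temperature
    integ += e * dt
    deriv = -(x - prev_x) / dt           # derivative on (filtered) measurement
    u = kp * e + ki * integ + kd * deriv
    return u, (x, p, integ, x)

# toy closed loop: first-order thermal plant with a noisy sensor
rng = np.random.default_rng(0)
T, dt, state = 20.0, 0.1, (20.0, 1.0, 0.0, 20.0)
for _ in range(600):
    z = T + rng.normal(0.0, 1.0)             # noisy temperature reading
    u, state = kppid_step(z, state, setpoint=300.0, dt=dt)
    T += dt * (u - 0.05 * (T - 20.0))        # lumped heating/cooling dynamics
print("temperature after 60 s of control:", round(T, 1))
```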

  20. An Optimization Approach for Hazardous Waste Management Problem under Stochastic Programming

    International Nuclear Information System (INIS)

    Abass, S.A.; Abdallah, A.S.; Gomaa, M.A.

    2008-01-01

    Hazardous waste is waste that, due to its nature and quantity, is potentially hazardous to human health and/or the environment. This kind of waste requires special disposal techniques to eliminate or reduce the hazard. The hazardous waste management (HWM) problem is basically concerned with the disposal method. In this paper we focus on incineration as an effective method of disposing of the waste. For this type of disposal, there are air pollution standards imposed by the government. We propose an optimization model that satisfies the air pollution standards, based on the model of Emek and Kara, using random variable coefficients in the constraints
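    Models of this kind are typically chance-constrained programs; a minimal sketch of a deterministic equivalent under independent, normally distributed emission factors follows, with made-up data, and it is not Emek and Kara's actual formulation:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# hypothetical data: 3 incinerators, disposal cost and emission factors
c = np.array([40.0, 55.0, 70.0])       # cost ($/ton)
mu = np.array([1.2, 0.8, 0.5])         # mean emission (kg pollutant/ton)
sd = np.array([0.3, 0.2, 0.1])         # std dev of the emission factors
W, E_max, alpha = 100.0, 110.0, 0.05   # tons of waste, emission cap, risk level

z = norm.ppf(1.0 - alpha)              # quantile for the deterministic equivalent

cons = [
    # chance constraint Pr(sum a_i x_i <= E_max) >= 1 - alpha, a_i ~ N(mu_i, sd_i^2)
    {"type": "ineq",
     "fun": lambda x: E_max - (mu @ x + z * np.sqrt((sd**2) @ (x**2)))},
    # all waste must be disposed of
    {"type": "eq", "fun": lambda x: x.sum() - W},
]
res = minimize(lambda x: c @ x, x0=np.full(3, W / 3),
               bounds=[(0, None)] * 3, constraints=cons, method="SLSQP")
print("allocation (tons):", res.x.round(1), " cost:", round(res.fun, 1))
```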

  1. Maximum likelihood estimation for Cox's regression model under nested case-control sampling

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder; Juul, Anders

    2004-01-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model. The MLE is computed by the EM-algorithm, which is easy to implement in the proportional hazards setting. Standard errors are estimated by a numerical profile likelihood approach based on EM aided differentiation. The work was motivated by a nested case-control study that hypothesized that insulin-like growth factor I was associated with ischemic heart disease. The study was based on a population of 3784 Danes and 231 cases of ischemic heart disease where controls were matched on age and gender. We illustrate the use of the MLE for these data and show how the maximum likelihood framework can be used...
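    The EM-based MLE itself is beyond a short sketch, but the underlying Cox model fit is easy to illustrate. A minimal sketch on simulated cohort data using the third-party lifelines package; the biomarker, effect size and censoring scheme are assumptions, not the study's data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# simulated cohort: does a standardized biomarker affect the hazard?
rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
T = rng.exponential(1.0 / (0.05 * np.exp(0.5 * x)))  # true log-hazard ratio = 0.5
C = rng.exponential(20.0, size=n)                    # independent censoring
df = pd.DataFrame({"time": np.minimum(T, C),
                   "event": (T <= C).astype(int),
                   "biomarker": x})

# Cox proportional hazards fit via the partial likelihood
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])
```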

  2. Do French macroseismic intensity observations agree with expectations from the European Seismic Hazard Model 2013?

    Science.gov (United States)

    Rey, Julien; Beauval, Céline; Douglas, John

    2018-02-01

    Probabilistic seismic hazard assessments are the basis of modern seismic design codes. To test fully a seismic hazard curve at the return periods of interest for engineering would require many thousands of years' worth of ground-motion recordings. Because strong-motion networks are often only a few decades old (e.g. in mainland France the first accelerometric network dates from the mid-1990s), data from such sensors can be used to test hazard estimates only at very short return periods. In this article, several hundreds of years of macroseismic intensity observations for mainland France are interpolated using a robust kriging-with-a-trend technique to establish the earthquake history of every French mainland municipality. At 24 selected cities representative of the French seismic context, the number of exceedances of intensities IV, V and VI is determined over time windows considered complete. After converting these intensities to peak ground accelerations using the global conversion equation of Caprio et al. (Ground motion to intensity conversion equations (GMICEs): a global relationship and evaluation of regional dependency, Bulletin of the Seismological Society of America 105:1476-1490, 2015), these exceedances are compared with those predicted by the European Seismic Hazard Model 2013 (ESHM13). In half of the cities, the number of observed exceedances for low intensities (IV and V) is within the range of predictions of ESHM13. In the other half of the cities, the number of observed exceedances is higher than the predictions of ESHM13. For intensity VI, the match is closer, but the comparison is less meaningful due to a scarcity of data. According to this study, the ESHM13 underestimates hazard in roughly half of France, even when taking into account the uncertainty in the conversion from intensity to acceleration. However, these results are valid only for the acceleration range tested in this study (0.01 to 0.09 g).

  4. FLOOD HAZARD MAP IN THE CITY OF BATNA (ALGERIA) BY HYDRAULIC MODELING APPROACH

    Directory of Open Access Journals (Sweden)

    Guellouh SAMI

    2016-06-01

    Full Text Available In the light of the global climatic changes that appear to influence the frequency and intensity of floods, and whose damages are still growing, understanding the hydrological processes, their spatiotemporal setting and their extreme forms has become a paramount concern to local communities in forecasting terms. The aim of this study is to map the flood hazard using a hydraulic modeling method. Using a Geographic Information System (GIS) allows us to perform a more detailed spatial analysis of the extent of the flooding risk, by applying hydraulic modeling programs at different flood frequencies. Based on the results of this analysis, decision makers can implement a strategy of risk management related to the river overflowing through the city of Batna.

  5. Adaptive bayesian analysis for binomial proportions

    CSIR Research Space (South Africa)

    Das, Sonali

    2008-10-01

    Full Text Available ...of testing the proportion of some trait. For example, say we are interested in inferring the effectiveness of a certain intervention teaching strategy by comparing the proportion of ‘proficient’ teachers before and after the intervention. The number...
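    A minimal sketch of the conjugate Beta-Binomial computation behind such a before/after comparison; the counts and the flat prior below are assumptions for illustration:

```python
import numpy as np
from scipy.stats import beta

# hypothetical counts: proficient teachers before and after an intervention
k_pre, n_pre = 18, 60
k_post, n_post = 33, 60

# conjugate Beta(1, 1) prior gives a Beta(k + 1, n - k + 1) posterior
draws = 100_000
p_pre = beta.rvs(k_pre + 1, n_pre - k_pre + 1, size=draws, random_state=1)
p_post = beta.rvs(k_post + 1, n_post - k_post + 1, size=draws, random_state=2)

# posterior probability that the proportion improved
print("Pr(p_post > p_pre | data) =", np.mean(p_post > p_pre))
```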

  6. SHEAT: a computer code for probabilistic seismic hazard analysis, user's manual

    International Nuclear Information System (INIS)

    Ebisawa, Katsumi; Kondo, Masaaki; Abe, Kiyoharu; Tanaka, Toshiaki; Takani, Michio.

    1994-08-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. Seismic hazard is defined as an annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site. With the SHEAT code, seismic hazard is calculated in the following two steps: (1) Modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modelled based on the historical earthquake records, active fault data and expert judgement. (2) Calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model, taking into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT code. It includes: (1) outlines of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code; (2) functions of the subprograms and the analytical models in them; (3) guidance on input and output data; and (4) sample run results. The code has been widely used at JAERI to analyze seismic hazard at various nuclear power plant sites in Japan. (author)
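    The two-step hazard integration lends itself to a compact sketch: sum, over postulated sources, the rate times the probability that the ground motion exceeds each level. The point sources and attenuation coefficients below are invented for illustration and are not SHEAT's models or data:

```python
import numpy as np
from scipy.stats import norm

def hazard_curve(levels_g, sources):
    """Annual exceedance frequency at a site, summed over point sources.
    Each source is a tuple (annual rate, magnitude M, distance R in km)."""
    freq = np.zeros_like(levels_g)
    for rate, M, R in sources:
        # illustrative attenuation: ln PGA = a + b*M - c*ln(R + 10), lognormal scatter
        ln_med = -3.5 + 0.9 * M - 1.0 * np.log(R + 10.0)
        sigma = 0.6
        # P(PGA > level) from the lognormal ground-motion distribution
        freq += rate * norm.sf((np.log(levels_g) - ln_med) / sigma)
    return freq

levels = np.array([0.05, 0.1, 0.2, 0.4])          # PGA levels in g
sources = [(0.02, 7.0, 30.0), (0.1, 5.5, 15.0)]   # made-up seismicity model
for g, f in zip(levels, hazard_curve(levels, sources)):
    print(f"PGA > {g:.2f} g : {f:.2e} /yr")
```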

  7. Icon arrays help younger children's proportional reasoning.

    Science.gov (United States)

    Ruggeri, Azzurra; Vagharchakian, Laurianne; Xu, Fei

    2018-06-01

    We investigated the effects of two context variables, presentation format (icon arrays or numerical frequencies) and time limitation (limited or unlimited time), on the proportional reasoning abilities of children aged 7 and 10 years, as well as adults. Participants had to select, between two sets of tokens, the one that offered the highest likelihood of drawing a gold token, that is, the set of elements with the greater proportion of gold tokens. Results show that participants performed better in the unlimited time condition. Moreover, besides a general developmental improvement in accuracy, our results show that younger children performed better when proportions were presented as icon arrays, whereas older children and adults were similarly accurate in the two presentation format conditions. Statement of contribution What is already known on this subject? There is a developmental improvement in proportional reasoning accuracy. Icon arrays facilitate reasoning in adults with low numeracy. What does this study add? Participants were more accurate when they were given more time to make the proportional judgement. Younger children's proportional reasoning was more accurate when they were presented with icon arrays. Proportional reasoning abilities correlate with working memory, approximate number system, and subitizing skills. © 2018 The British Psychological Society.

  8. A Remote Sensing-Based Tool for Assessing Rainfall-Driven Hazards

    Science.gov (United States)

    Wright, Daniel B.; Mantilla, Ricardo; Peters-Lidard, Christa D.

    2018-01-01

    RainyDay is a Python-based platform that couples rainfall remote sensing data with Stochastic Storm Transposition (SST) for modeling rainfall-driven hazards such as floods and landslides. SST effectively lengthens the extreme rainfall record through temporal resampling and spatial transposition of observed storms from the surrounding region to create many extreme rainfall scenarios. Intensity-Duration-Frequency (IDF) curves are often used for hazard modeling but require long records to describe the distribution of rainfall depth and duration and do not provide information regarding rainfall space-time structure, limiting their usefulness to small scales. In contrast, RainyDay can be used for many hazard applications with 1-2 decades of data, and output rainfall scenarios incorporate detailed space-time structure from remote sensing. Thanks to global satellite coverage, RainyDay can be used in inaccessible areas and developing countries lacking ground measurements, though results are impacted by remote sensing errors. RainyDay can be useful for hazard modeling under nonstationary conditions. PMID:29657544
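    A toy sketch of the SST mechanics: resample observed storms in time, transpose them in space, and read annual maxima off at a target cell. The catalog below is random stand-in data, and the wrap-around spatial shift is a simplification, not RainyDay's implementation:

```python
import numpy as np

rng = np.random.default_rng(11)
# toy storm catalog: ~15 years of observed storms, each a 32x32 rainfall grid
catalog = rng.gamma(2.0, 15.0, size=(300, 32, 32))   # mm per storm

def sst_annual_max(catalog, storms_per_year=20, n_years=1000):
    """Toy Stochastic Storm Transposition: temporal resampling plus
    random spatial transposition; returns annual maxima at a target cell."""
    n, ny, nx = catalog.shape
    target = (16, 16)
    out = np.empty(n_years)
    for y in range(n_years):
        idx = rng.integers(0, n, size=rng.poisson(storms_per_year))
        best = 0.0
        for i in idx:
            dy, dx = rng.integers(-8, 9, size=2)      # spatial transposition
            shifted = np.roll(np.roll(catalog[i], dy, axis=0), dx, axis=1)
            best = max(best, shifted[target])
        out[y] = best
    return out

amax = sst_annual_max(catalog)
print("estimated 100-yr storm depth (mm):", round(np.quantile(amax, 0.99), 1))
```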

  9. Evaluation of seismic hazard at the northwestern part of Egypt

    Science.gov (United States)

    Ezzelarab, M.; Shokry, M. M. F.; Mohamed, A. M. E.; Helal, A. M. A.; Mohamed, Abuoelela A.; El-Hadidy, M. S.

    2016-01-01

    The objective of this study is to evaluate the seismic hazard in northwestern Egypt using the probabilistic seismic hazard assessment approach. The probabilistic approach was carried out based on a recent data set to take into account the historical seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a proposed seismic source model was presented. The doubly-truncated exponential model was adopted for calculation of the recurrence parameters. Ground-motion prediction equations recently recommended by experts, developed from earthquake data obtained in tectonic environments similar to those in and around the studied area, were weighted and used for the assessment of seismic hazard within a logic tree framework. Considering a grid of 0.2° × 0.2° covering the study area, seismic hazard curves were calculated for every node. Hazard maps at bedrock conditions were produced for peak ground acceleration, in addition to six spectral periods (0.1, 0.2, 0.3, 1.0, 2.0 and 3.0 s), for return periods of 72, 475 and 2475 years. The uniform hazard spectra of two selected rock sites at Alexandria and Mersa Matruh Cities were provided. Finally, the hazard curves were de-aggregated to determine the sources that contribute most to the hazard level of 10% probability of exceedance in 50 years for the selected sites.

  10. Fuzzy-logic assessment of failure hazard in pipelines due to mining activity

    Directory of Open Access Journals (Sweden)

    A. A. Malinowska

    2015-11-01

    Full Text Available The present research is aimed at a critical analysis of a method presently used for evaluating failure hazard in linear objects in mining areas. A fuzzy model of the failure hazard of a linear object was created on the basis of the experience gathered so far. The rules of a Mamdani fuzzy model have been used in the analyses. Finally, the scaled model was integrated with a Geographic Information System (GIS), which was used to evaluate failure hazard in a water pipeline in a mining area.
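    A hand-rolled two-rule Mamdani sketch illustrates the min/max inference and centroid defuzzification typical of such models; the inputs, universes and membership functions below are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function (works on scalars or arrays)."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def failure_hazard(subsidence_mm, pipe_age_yr):
    """Two-rule Mamdani sketch returning a hazard index on a 0-100 scale."""
    z = np.linspace(0.0, 100.0, 501)                # output universe
    # rule strengths: max = OR, min = AND (hypothetical memberships)
    s_high = max(tri(subsidence_mm, 20, 100, 180),  # subsidence is high, OR
                 tri(pipe_age_yr, 20, 80, 140))     # the pipe is old
    s_low = min(tri(subsidence_mm, -1, 0, 50),      # subsidence is low, AND
                tri(pipe_age_yr, -1, 0, 40))        # the pipe is new
    # Mamdani implication clips each output set; aggregation takes the max
    agg = np.maximum(np.minimum(s_high, tri(z, 50, 100, 200)),
                     np.minimum(s_low, tri(z, -1, 0, 50))) 
    return np.sum(z * agg) / np.sum(agg)            # centroid defuzzification

print("hazard index:", round(failure_hazard(subsidence_mm=60, pipe_age_yr=55), 1))
```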

  11. Identification of Potential Hazard using Hazard Identification and Risk Assessment

    Science.gov (United States)

    Sari, R. M.; Syahputri, K.; Rizkya, I.; Siregar, I.

    2017-03-01

    This research was conducted in a paper production company whose paper products are used as cigarette paper. Throughout the production process, the company provides machines and equipment that are operated by workers. During operations, all workers are potentially exposed to injury; this is known as a potential hazard. Hazard identification and risk assessment is one part of a safety and health program in the risk management stage. It is very important as part of efforts to prevent occupational injuries and diseases resulting from work. The problem addressed by this research is that the potential hazards and risks faced by workers during the production process had not been identified. The purpose of this study was to identify the potential hazards by using hazard identification and risk assessment methods. Risk assessment was done using severity criteria and the probability of an accident. According to the research there are 23 potential hazards with varying severity and probability. A Risk Assessment Code (RAC) was then determined for each potential hazard, yielding 3 extreme risks, 10 high risks, 6 medium risks and 3 low risks. We successfully identified the potential hazards using the RAC.

  12. Electronics for proportional drift tubes

    International Nuclear Information System (INIS)

    Fremont, G.; Friend, B.; Mess, K.H.; Schmidt-Parzefall, W.; Tarle, J.C.; Verweij, H.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration); Geske, K.; Riege, H.; Schuett, J.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration); Semenov, Y.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration)

    1980-01-01

    An electronic system for the read-out of a large number of proportional drift tubes (16,000) has been designed. The system measures the deposited charge and the drift time of the charge from a particle traversing a proportional drift tube. A second event can be accepted during the read-out of the system. Up to 40 typical events can be collected and buffered before a data transfer to a computer is necessary. (orig.)

  13. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    Science.gov (United States)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
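    A toy sketch of the Monte Carlo idea: sample uncertain discharge and roughness, convert to stage with Manning's equation for a wide channel, and report per-cell inundation probabilities rather than a binary in/out boundary. All distributions and numbers below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 10_000

# uncertain inputs: unit discharge (m^2/s) and Manning roughness coefficient
q = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=n_sims)
n = rng.uniform(0.03, 0.06, size=n_sims)
S = 0.001                                    # channel slope (assumed known)

# Manning's equation for a wide rectangular channel: h = (q*n / sqrt(S))^(3/5)
depth = (q * n / np.sqrt(S)) ** 0.6

# probability of inundation for floodplain cells at given elevations
cell_elev = np.array([1.5, 2.0, 2.5, 3.0])   # m above channel bed
for e in cell_elev:
    print(f"elev {e:.1f} m: P(inundated) = {np.mean(depth > e):.2f}")
```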

  14. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    International Nuclear Information System (INIS)

    Savy, J.

    1989-01-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as the "guidelines" hereafter, are based on the best information at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at the DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site, and the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs

  16. Multi Hazard Assessment: The Azores Archipelagos (PT) case

    Science.gov (United States)

    Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos

    2016-04-01

    The COPERNICUS EMS Risk & Recovery Mapping (RRM) activity offers services to support efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelago case was realized in the context of the FWC 259811 Copernicus EMS RRM, and provides potential impact information for a number of natural disasters. The analysis identified population and assets at risk (infrastructures and environment). The risk assessment was based on hazard and vulnerability of structural elements, road network characteristics, etc. Integration of the different hazards and risks was accounted for in establishing the necessary first response/first aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information, common to the assessment of all of the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for. The risk was established by intersecting hazard and vulnerability (risk-specific lookup table). Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground motion model, and a mathematical model to calculate seismic hazard. Lava eruption areas and a volcanic-activity-related coefficient were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based lava flow hazard numerical model (Gestur Leó Gislason, 2013). The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100 year extreme monthly rainfall

  17. Down to Earth with an electric hazard from space

    Science.gov (United States)

    Love, Jeffrey J.; Bedrosian, Paul A.; Schultz, Adam

    2017-01-01

    In reaching across traditional disciplinary boundaries, solid-Earth geophysicists and space physicists are forging new collaborations to map magnetic-storm hazards for electric-power grids. Future progress in evaluating storm-time geoelectric hazards will come primarily through monitoring, surveys, and modeling of related data.

  18. Success in transmitting hazard science

    Science.gov (United States)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in

  19. Volcanology and hazards of phreatomagmatic basaltic eruptions

    DEFF Research Database (Denmark)

    Schmith, Johanne

    Iceland is one of the most active terrestrial volcanic regions on Earth, with an average of more than 20 eruptions per century. Around 80% of all events are tephra-generating explosive eruptions, but less than 10% of all known tephra layers have been mapped. Recent hazard assessment models show that the two key parameters for hazard assessment modeling are total grain size distribution (TGSD) and eruptive style. These two parameters have been determined for even fewer eruptive events in Iceland. One of the most hazardous volcanoes in Iceland is Katla, and no data set of TGSD or other eruptive parameters exists. Katla has not erupted for 99 years, but at least 2 of the 20 eruptions since the settlement of Iceland in 871 have reached Northern Europe as visible tephra fall. These eruptions occurred in 1755 and 1625 and remain enigmatic both in terms of actual size and eruption dynamics. This work...

  20. Proportional Reasoning and the Visually Impaired

    Science.gov (United States)

    Hilton, Geoff; Hilton, Annette; Dole, Shelley L.; Goos, Merrilyn; O'Brien, Mia

    2012-01-01

    Proportional reasoning is an important aspect of formal thinking that is acquired during the developmental years that approximate the middle years of schooling. Students who fail to acquire sound proportional reasoning often experience difficulties in subjects that require quantitative thinking, such as science, technology, engineering, and…

  1. On the Importance of Accounting for Competing Risks in Pediatric Brain Cancer: II. Regression Modeling and Sample Size

    International Nuclear Information System (INIS)

    Tai, Bee-Choo; Grundy, Richard; Machin, David

    2011-01-01

    Purpose: To accurately model the cumulative need for radiotherapy in trials designed to delay or avoid irradiation among children with malignant brain tumor, it is crucial to account for competing events and evaluate how each contributes to the timing of irradiation. An appropriate choice of statistical model is also important for adequate determination of sample size. Methods and Materials: We describe the statistical modeling of competing events (A, radiotherapy after progression; B, no radiotherapy after progression; and C, elective radiotherapy) using proportional cause-specific and subdistribution hazard functions. The procedures of sample size estimation based on each method are outlined. These are illustrated by use of data comparing children with ependymoma and other malignant brain tumors. The results from these two approaches are compared. Results: The cause-specific hazard analysis showed a reduction in hazards among infants with ependymoma for all event types, including Event A (adjusted cause-specific hazard ratio, 0.76; 95% confidence interval, 0.45-1.28). Conversely, the subdistribution hazard analysis suggested an increase in hazard for Event A (adjusted subdistribution hazard ratio, 1.35; 95% confidence interval, 0.80-2.30), but the reduction in hazards for Events B and C remained. Analysis based on subdistribution hazard requires a larger sample size than the cause-specific hazard approach. Conclusions: Notable differences in effect estimates and anticipated sample size were observed between methods when the main event showed a beneficial effect whereas the competing events showed an adverse effect on the cumulative incidence. The subdistribution hazard is the most appropriate for modeling treatment when its effects on both the main and competing events are of interest.
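    A minimal sketch of the cause-specific side of such an analysis, using the third-party lifelines package on simulated data: competing events are censored at their occurrence time when estimating the cause-specific hazard for event A. A subdistribution (Fine-Gray) fit would need additional weighting not shown here, and the covariate effect, event rates and censoring below are assumptions:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# simulated trial: event_type 1 = event of interest (A), 2 = competing
# events (B/C), 0 = censored; 'ependymoma' is the binary covariate
rng = np.random.default_rng(7)
n = 400
ependymoma = rng.integers(0, 2, size=n)
t1 = rng.exponential(1.0 / (0.10 * np.exp(-0.3 * ependymoma)))  # cause A
t2 = rng.exponential(1.0 / 0.08, size=n)                        # competing causes
c = rng.uniform(5.0, 15.0, size=n)                              # censoring times
time = np.minimum.reduce([t1, t2, c])
event_type = np.select([t1 == time, t2 == time], [1, 2], default=0)
df = pd.DataFrame({"time": time, "event_type": event_type,
                   "ependymoma": ependymoma})

# cause-specific hazard for event A: competing events are treated as censored
df_a = df.assign(event=(df.event_type == 1).astype(int))[
    ["time", "event", "ependymoma"]]
cph = CoxPHFitter().fit(df_a, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])
```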

  2. Modeling hydrologic and geomorphic hazards across post-fire landscapes using a self-organizing map approach

    Science.gov (United States)

    Friedel, Michael J.

    2011-01-01

    Few studies attempt to model the range of possible post-fire hydrologic and geomorphic hazards because of the sparseness of data and the coupled, nonlinear, spatial, and temporal relationships among landscape variables. In this study, a type of unsupervised artificial neural network, called a self-organized map (SOM), is trained using data from 540 burned basins in the western United States. The sparsely populated data set includes variables from independent numerical landscape categories (climate, land surface form, geologic texture, and post-fire condition), independent landscape classes (bedrock geology and state), and dependent initiation processes (runoff, landslide, and runoff and landslide combination) and responses (debris flows, floods, and no events). Pattern analysis of the SOM-based component planes is used to identify and interpret relations among the variables. Application of the Davies-Bouldin criteria following k-means clustering of the SOM neurons identified eight conceptual regional models for focusing future research and empirical model development. A split-sample validation on 60 independent basins (not included in the training) indicates that simultaneous predictions of initiation process and response types are at least 78% accurate. As climate shifts from wet to dry conditions, forecasts across the burned landscape reveal a decreasing trend in the total number of debris flow, flood, and runoff events with considerable variability among individual basins. These findings suggest the SOM may be useful in forecasting real-time post-fire hazards, and long-term post-recovery processes and effects of climate change scenarios.
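    A minimal sketch of the SOM-then-cluster workflow, using the third-party minisom package and scikit-learn; the random data stands in for the 540-basin landscape variables, and the grid size and training settings are assumptions:

```python
import numpy as np
from minisom import MiniSom
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score

# stand-in for the 540-basin dataset: rows = basins, cols = scaled variables
rng = np.random.default_rng(3)
X = rng.normal(size=(540, 8))

# train an unsupervised self-organizing map on the basin variables
som = MiniSom(10, 10, X.shape[1], sigma=1.5, learning_rate=0.5, random_seed=3)
som.train_random(X, num_iteration=5000)

# cluster the trained SOM neurons; choose k by the Davies-Bouldin criterion
weights = som.get_weights().reshape(-1, X.shape[1])
for k in range(4, 12):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(weights)
    # lower Davies-Bouldin score indicates better-separated clusters
    print(k, round(davies_bouldin_score(weights, labels), 3))
```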

  3. A taylor series approach to survival analysis

    International Nuclear Information System (INIS)

    Brodsky, J.B.; Groer, P.G.

    1984-09-01

    A method of survival analysis using hazard functions is developed. The method uses the well known mathematical theory for Taylor Series. Hypothesis tests of the adequacy of many statistical models, including proportional hazards and linear and/or quadratic dose responses, are obtained. A partial analysis of leukemia mortality in the Life Span Study cohort is used as an example. Furthermore, a relatively robust estimation procedure for the proportional hazards model is proposed. (author)
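    The general idea admits a short worked equation: expanding the log hazard in dose about zero nests the proportional hazards and linear/quadratic dose-response models as truncations, so adequacy tests reduce to tests on higher-order coefficients. A sketch of this idea, not necessarily the paper's exact formulation:

```latex
% Expand the log hazard in dose d about d = 0:
%   \log h(t; d) = \log h_0(t) + \beta_1 d + \beta_2 d^2 + \cdots
% Truncating after the linear term recovers the proportional hazards model
% h(t; d) = h_0(t) e^{\beta_1 d}; retaining the quadratic term gives a
% linear-quadratic dose response. Adequacy of the truncated model is then
% tested via H_0 : \beta_2 = \beta_3 = \cdots = 0.
\[
  h(t; d) = h_0(t)\,\exp\bigl(\beta_1 d + \beta_2 d^2 + \cdots\bigr)
\]
```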

  4. Proportional gas scintillation detectors and their applications

    International Nuclear Information System (INIS)

    Petr, I.

    1978-01-01

    The principle of a gas proportional scintillation detector and its function are described. The dependence of the energy resolution of Si(Li) and xenon proportional detectors on the input window size is given. A typical design is shown of a xenon detector used for X-ray spectrometry at energies of 277 eV to 5.898 keV and at a gas pressure of 98 to 270 kPa. Gas proportional scintillation detectors show considerably better energy resolution than common proportional counters, and even better resolution than semiconductor Si(Li) detectors at low X-radiation energies. For detection areas smaller than 25 mm², Si(Li) detectors show better resolution, especially at higher X-radiation energies. For window areas of 25 to 190 mm² the two types of detectors are equal; for window areas exceeding 190 mm² the proportional scintillation detector has the higher energy resolution. (B.S.)

  5. Modeling of hazardous air pollutant removal in the pulsed corona discharge

    International Nuclear Information System (INIS)

    Derakhshesh, Marzie; Abedi, Jalal; Omidyeganeh, Mohammad

    2009-01-01

    This study investigated the effects of two parts of the performance equation of the pulsed corona reactor, an atmospheric-pressure non-thermal plasma processing tool for eliminating pollutant streams. First, the effect of axial dispersion in the diffusion term, and then the effect of different orders of the reaction in the decomposition rate term, were considered. The mathematical model was developed to predict the effluent concentration of the pulsed corona reactor using a mass balance, considering axial dispersion, linear velocity and the decomposition rate of the pollutant. The steady-state form of this equation was subsequently solved assuming different reaction orders. For the derivation of the performance equation of the reactor, it was assumed that the decomposition rate of the pollutant was directly proportional to the discharge power and the concentration of the pollutant. The results were validated and compared with another predictive model using its experimental data. The model developed in this study was also validated against two other experimental data sets in the literature for N2O
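    For a first-order decomposition rate, the steady-state performance equation with axial dispersion has a classical closed form (the Wehner-Wilhelm solution under Danckwerts boundary conditions). A sketch of that generic plug-flow-with-dispersion result, not necessarily the exact equation solved in the paper:

```python
import numpy as np

def effluent_ratio(k, tau, Pe):
    """Exit concentration C/C0 for a first-order reaction (rate k, residence
    time tau) in a tubular reactor with axial dispersion, Pe = u*L/D,
    per the Wehner-Wilhelm solution with Danckwerts boundary conditions."""
    a = np.sqrt(1.0 + 4.0 * k * tau / Pe)
    num = 4.0 * a * np.exp(Pe / 2.0)
    den = ((1.0 + a)**2 * np.exp(a * Pe / 2.0)
           - (1.0 - a)**2 * np.exp(-a * Pe / 2.0))
    return num / den

# limiting checks: large Pe -> plug flow exp(-k*tau); small Pe -> mixed flow
print(effluent_ratio(k=2.0, tau=1.0, Pe=200.0), np.exp(-2.0))   # nearly equal
print(effluent_ratio(k=2.0, tau=1.0, Pe=0.01), 1.0 / (1.0 + 2.0))  # nearly equal
```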

  6. Climate change beliefs and hazard mitigation behaviors: Homeowners and wildfire risk

    Science.gov (United States)

    Hannah Brenkert-Smith; James R. Meldrum; Patricia A. Champ

    2015-01-01

    Downscaled climate models provide projections of how climate change may exacerbate the local impacts of natural hazards. The extent to which people facing exacerbated hazard conditions understand or respond to climate-related changes to local hazards has been largely overlooked. In this article, we examine the relationships among climate change beliefs, environmental...

  7. Moral hazard in the principal-agent relationship [Morální hazard ve vztahu nájemce a zmocněnce]

    OpenAIRE

    Zatlukal, Marek

    2012-01-01

    This paper introduces the reader to the issues of moral hazard in a principal-agent setting, with a primary focus on incentive-pay models of moral hazard. Firstly, through an introduction and analysis of various microeconomic models designed to alleviate the problems of moral hazard, and secondly, through an analysis of these models in the context of a specific company, the aim of this thesis is to offer a comprehensive understanding of the specific problems caused by moral hazard in the ...

  8. An enhanced fire hazard assessment model and validation experiments for vertical cable trays

    International Nuclear Information System (INIS)

    Li, Lu; Huang, Xianjia; Bi, Kun; Liu, Xiaoshuang

    2016-01-01

    Highlights: • An enhanced model was developed for vertical cable fire hazard assessment in NPPs. • Validation experiments on vertical cable tray fires were conducted. • The capability of the model for cable trays with different cable spacing was tested. - Abstract: The model, referred to as FLASH-CAT (Flame Spread over Horizontal Cable Trays), was developed to estimate the heat release rate of vertical cable tray fires. The focus of this work is to investigate the application of an enhanced model to single vertical cable tray fires with different cable spacing. Experiments on vertical cable tray fires with three typical cable spacings were conducted. The histories of mass loss rate and flame length were recorded during the cable fires. From the experimental results, it is found that the spacing between cable lines intensifies the cable combustion and accelerates the flame spread. The predictions of the enhanced model show good agreement with the experimental data. At the same time, it is shown that the enhanced model is capable of predicting the different behaviors of cable fires with different cable spacing by adjusting the flame spread speed only.

  10. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
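    A minimal sketch of the Monte Carlo idea, with a Dirichlet posterior over vote shares and a simplified largest-averages (D'Hondt) seat allocation; Brazil's actual rules add an electoral quotient step not modelled here, and the counts below are made up:

```python
import numpy as np

def dhondt(votes, seats):
    """Largest-averages (D'Hondt) seat allocation."""
    alloc = np.zeros(len(votes), dtype=int)
    for _ in range(seats):
        alloc[np.argmax(votes / (alloc + 1))] += 1
    return alloc

# hypothetical poll: raw counts for 4 parties in one district, 10 seats
counts = np.array([420, 310, 150, 60])
rng = np.random.default_rng(0)
draws = rng.dirichlet(counts + 1, size=20_000)   # posterior vote shares
                                                 # under a flat Dirichlet prior
got_seat = np.array([dhondt(p, 10) > 0 for p in draws])
for i, pr in enumerate(np.mean(got_seat, axis=0)):
    print(f"party {i + 1}: Pr(at least one seat) = {pr:.3f}")
```

The smallest party sits near the effective threshold, so its probability of representation is the interesting output: a single point estimate of vote share would hide that uncertainty.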

  11. PROPORTIONS AND HUMAN SCALE IN DAMASCENE COURTYARD HOUSES

    Directory of Open Access Journals (Sweden)

    M. Salim Ferwati

    2008-03-01

    Full Text Available Interior designers, architects, landscape architects, and even urban designers agree that the environment, as a form of non-verbal communication, has a symbolic dimension to it. As for its aesthetic dimension, it seems that beauty is related to a certain proportion, partially and as a whole. A suitable proportion leaves a good impression upon the beholder, especially when it matches human proportion. That, in fact, was the underlying belief of Le Corbusier, from which he developed his Modulor concept. The study searches for a modular, or proportion, system that governs the design of the traditional Damascene house. Geometrical and mathematical examination of 28 traditional houses found that certain proportional relationships existed; however, these proportional relationships were not fixed ones. The study relied on analyzing the Iwan elevation as well as the inner courtyard proportion in relation to the building area. Charts, diagrams and tables were produced to summarize the results.

  12. Seismic hazard in the Nation's breadbasket

    Science.gov (United States)

    Boyd, Oliver; Haller, Kathleen; Luco, Nicolas; Moschetti, Morgan P.; Mueller, Charles; Petersen, Mark D.; Rezaeian, Sanaz; Rubinstein, Justin L.

    2015-01-01

    The USGS National Seismic Hazard Maps were updated in 2014 and included several important changes for the central United States (CUS). Background seismicity sources were improved using a new moment-magnitude-based catalog; a new adaptive, nearest-neighbor smoothing kernel was implemented; and maximum magnitudes for background sources were updated. Areal source zones developed by the Central and Eastern United States Seismic Source Characterization for Nuclear Facilities project were simplified and adopted. The weighting scheme for ground motion models was updated, giving more weight to models with a faster attenuation with distance compared to the previous maps. Overall, hazard changes (2% probability of exceedance in 50 years, across a range of ground-motion frequencies) were smaller than 10% in most of the CUS relative to the 2008 USGS maps despite new ground motion models and their assigned logic tree weights that reduced the probabilistic ground motions by 5–20%.

  13. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    Science.gov (United States)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  14. Building 894 hazards assessment document

    International Nuclear Information System (INIS)

    Banda, Z.; Williams, M.

    1996-07-01

    The Department of Energy Order 5500.3A requires facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with Building 894. The entire inventory was subjected to the screening criteria for potential airborne impact to onsite and offsite individuals out of which 9 chemicals were kept for further evaluation. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance at which a postulated facility event will produce consequences exceeding the Early Severe Health Effects threshold is 130 meters. The highest emergency classification is a General Emergency. The Emergency Planning Zone is a nominal 130 meter area that conforms to DOE boundaries and physical/jurisdictional boundaries such as fence lines and streets

  15. Building 6630 hazards assessment document

    International Nuclear Information System (INIS)

    Williams, M.; Banda, Z.

    1996-10-01

    The Department of Energy Order 5500.3A requires facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with Building 6630. The entire inventory was subjected to the screening criteria for potential airborne impact to onsite and offsite individuals out of which one chemical was kept for further evaluation. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the chemical release site, the atmospheric conditions, and the circumstances of the release. The greatest distance at which a postulated facility event will produce consequences exceeding the Early Severe Health Effects threshold is 76 meters. The highest emergency classification is an Alert. The Emergency Planning Zone is a nominal 100 meter area that conforms to DOE boundaries and physical/jurisdictional boundaries such as fence lines and streets

  16. Development of hydrogeological modelling approaches for assessment of consequences of hazardous accidents at nuclear power plants

    International Nuclear Information System (INIS)

    Rumynin, V.G.; Mironenko, V.A.; Konosavsky, P.K.; Pereverzeva, S.A.

    1994-07-01

    This paper introduces some modeling approaches for predicting the influence of hazardous accidents at nuclear reactors on groundwater quality. Possible pathways for radioactive releases from nuclear power plants were considered to conceptualize boundary conditions for solving the subsurface radionuclide transport problems. Some approaches to incorporating physical and chemical interactions into transport simulators have been developed. The hydrogeological forecasts were based on numerical and semi-analytical scale-dependent models. They have been applied to assess the possible impact of the nuclear power plants designed in Russia on groundwater reservoirs

  17. Principles of proportional recovery after stroke generalize to neglect and aphasia.

    Science.gov (United States)

    Marchi, N A; Ptak, R; Di Pietro, M; Schnider, A; Guggisberg, A G

    2017-08-01

    Motor recovery after stroke can be characterized into two different patterns. A majority of patients recover about 70% of their initial impairment, whereas some patients with severe initial deficits show little or no improvement. Here, we investigated whether recovery from visuospatial neglect and aphasia is also separated into two different groups and whether similar proportions of recovery can be expected for the two cognitive functions. We assessed 35 patients with neglect and 14 patients with aphasia at 3 weeks and 3 months after stroke using standardized tests. Recovery patterns were classified with hierarchical clustering, and the proportion of recovery was estimated from initial impairment using a linear regression analysis. Patients were reliably clustered into two different groups. For patients in the first cluster (n = 40), recovery followed a linear model where improvement was proportional to initial impairment and achieved 71% of maximal possible recovery for both cognitive deficits. Patients in the second cluster (n = 9) exhibited poor recovery. Recovery from neglect and aphasia after stroke thus shows the same dichotomy and proportionality as observed in motor recovery. This is suggestive of common underlying principles of plasticity, which apply to motor and cognitive functions. © 2017 EAN.
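    A minimal sketch of the fit-and-cluster logic on simulated scores; the score distributions, cluster proportions and noise below are assumptions, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# hypothetical impairment at 3 weeks (initial) and recovery by 3 months
rng = np.random.default_rng(5)
init = rng.uniform(10, 100, size=49)
fitters = rng.random(49) < 0.8                       # ~80% follow the rule
recovery = np.where(fitters, 0.7 * init, 0.05 * init) + rng.normal(0, 4, 49)

# hierarchical clustering of recovery patterns (on the recovery ratio)
ratio = (recovery / init).reshape(-1, 1)
labels = fcluster(linkage(ratio, method="ward"), t=2, criterion="maxclust")

# proportion of maximal possible recovery within the larger cluster
main = labels == np.bincount(labels).argmax()
slope, _ = np.polyfit(init[main], recovery[main], 1)
print(f"recovery ~= {slope:.2f} x initial impairment in the main cluster")
```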

  18. Working towards a clearer and more helpful hazard map: investigating the influence of hazard map design on hazard communication

    Science.gov (United States)

    Thompson, M. A.; Lindsay, J. M.; Gaillard, J.

    2015-12-01

    Globally, geological hazards are communicated using maps. In traditional hazard mapping practice, scientists analyse data about a hazard, and then display the results on a map for stakeholder and public use. However, this one-way, top-down approach to hazard communication is not necessarily effective or reliable. The messages which people take away will be dependent on the way in which they read, interpret, and understand the map, a facet of hazard communication which has been relatively unexplored. Decades of cartographic studies suggest that variables in the visual representation of data on maps, such as colour and symbology, can have a powerful effect on how people understand map content. In practice, however, there is little guidance or consistency in how hazard information is expressed and represented on maps. Accordingly, decisions are often made based on subjective preference, rather than research-backed principles. Here we present the results of a study in which we explore how hazard map design features can influence hazard map interpretation, and we propose a number of considerations for hazard map design. A series of hazard maps were generated, with each one showing the same probabilistic volcanic ashfall dataset, but using different verbal and visual variables (e.g., different colour schemes, data classifications, probabilistic formats). Following a short pilot study, these maps were used in an online survey of 110 stakeholders and scientists in New Zealand. Participants answered 30 open-ended and multiple choice questions about ashfall hazard based on the different maps. Results suggest that hazard map design can have a significant influence on the messages readers take away. For example, diverging colour schemes were associated with concepts of "risk" and decision-making more than sequential schemes, and participants made more precise estimates of hazard with isarithmic data classifications compared to binned or gradational shading. Based on such

  19. Two encyclopedia contributions

    DEFF Research Database (Denmark)

    Andersen, Per Kragh

    2003-01-01

    censoring; Cox regression model; counting process; event history analysis; intensity process; multi-state model; non-parametric inference; parametric models; survival analysis; Cox's proportional hazards model; hazard function, regression model; time-dependent covariate; time scale...

  20. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  1. Report 5: Guidance document. Implementation of biological infestation hazards in extended PSA

    International Nuclear Information System (INIS)

    Hasnaoui, C.; Georgescu, G.; Joel, P.; Sperbeck, S.; Kollasko, H.; Kumar, M.

    2016-01-01

    This report covers the assessment of biological hazards with PSA. It provides an overview of the available data and of available practices for modelling this type of hazard. A first search of the national and international literature on PSA for external and internal hazards shows that probabilistic analyses have very rarely been carried out to quantify the risk induced by biological hazards. Nevertheless, Section 3 provides some data from several countries. History has shown that this hazard can happen and can be highly safety significant, so screening out this event must be done with great care. The overall Level 1 PSA approach for internal events can be used for biological hazards, with care taken to account for the nature of the hazard, as it affects many systems at different times and for different durations. A detailed methodology is proposed in Section 4. Some open issues remain: the methodology must also consider combinations of biological infestation with other external hazards (wind, flooding, or rainfall) and multi-unit impacts. These aspects still present many challenges to PSA developers. The ASAMPSA-E report recommends that further emphasis be put on these two aspects of PSA modelling: multi-unit site impact and hazard combinations. (authors)

  2. Anti-synchronization control of BAM memristive neural networks with multiple proportional delays and stochastic perturbations

    Science.gov (United States)

    Wang, Weiping; Yuan, Manman; Luo, Xiong; Liu, Linlin; Zhang, Yao

    2018-01-01

    Proportional delay is a class of unbounded time-varying delay. A class of bidirectional associative memory (BAM) memristive neural networks with multiple proportional delays is considered in this paper. First, we propose a model of BAM memristive neural networks with multiple proportional delays and stochastic perturbations. Then, by choosing suitable nonlinear variable transformations, the BAM memristive neural networks with multiple proportional delays can be transformed into BAM memristive neural networks with constant delays. Based on the drive-response system concept, differential inclusion theory and Lyapunov stability theory, several anti-synchronization criteria are obtained. Finally, the effectiveness of the proposed criteria is demonstrated through numerical examples.

  3. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    Science.gov (United States)

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

    Flooding, a common environmental hazard worldwide, has increased in recent times as a result of climate change and urbanization, with the effects felt more in developing countries. As a result, the exposure of Tenaga Nasional Berhad (TNB) substations to flooding has risen rapidly, because existing substations are located in flood-prone areas. To understand the impact of floods on its substations, TNB has adopted non-structural mitigation by integrating the Flood Hazard Map with its substation planning. Hydrological analysis is the key component, providing the runoff that serves as input to the hydraulic modelling.

  4. Designsafe-Ci a Cyberinfrastructure for Natural Hazard Simulation and Data

    Science.gov (United States)

    Dawson, C.; Rathje, E.; Stanzione, D.; Padgett, J.; Pinelli, J. P.

    2017-12-01

    DesignSafe is the web-based research platform of the Natural Hazards Engineering Research Infrastructure (NHERI) network that provides the computational tools needed to manage and analyze critical data for natural hazards research, with wind and storm surge related hazards being a primary focus. One of the simulation tools under DesignSafe is the Advanced Circulation (ADCIRC) model, a coastal ocean model used in storm surge analysis. ADCIRC is an unstructured, finite element model with high resolution capabilities for studying storm surge impacts, and has long been used in storm surge hind-casting and forecasting. In this talk, we will demonstrate the use of ADCIRC within the DesignSafe platform and its use for forecasting Hurricane Harvey. We will also demonstrate how to analyze, visualize and archive critical storm surge related data within DesignSafe.

  5. Turbulence Hazard Metric Based on Peak Accelerations for Jetliner Passengers

    Science.gov (United States)

    Stewart, Eric C.

    2005-01-01

    Calculations are made of the approximate hazard due to peak normal accelerations of an airplane flying through a simulated vertical wind field associated with a convective frontal system. The calculations are based on a hazard metric developed from a systematic application of a generic math model to 1-cosine discrete gusts of various amplitudes and gust lengths. The math model simulates the three-degree-of-freedom longitudinal rigid body motion to vertical gusts and includes (1) fuselage flexibility, (2) the lag in the downwash from the wing to the tail, (3) gradual lift effects, (4) a simplified autopilot, and (5) motion of an unrestrained passenger in the rear cabin. Airplane and passenger response contours are calculated for a matrix of gust amplitudes and gust lengths. The airplane response contours are used to develop an approximate hazard metric of peak normal accelerations as a function of gust amplitude and gust length. The hazard metric is then applied to a two-dimensional simulated vertical wind field of a convective frontal system. The variations of the hazard metric with gust length and airplane heading are demonstrated.

  6. 2014 Update of the United States National Seismic Hazard Maps

    Science.gov (United States)

    Petersen, M.D.; Mueller, C.S.; Haller, K.M.; Moschetti, M.; Harmsen, S.C.; Field, E.H.; Rukstales, K.S.; Zeng, Y.; Perkins, D.M.; Powers, P.; Rezaeian, S.; Luco, N.; Olsen, A.; Williams, R.

    2012-01-01

    The U.S. National Seismic Hazard Maps are revised every six years, corresponding with the update cycle of the International Building Code. These maps cover the conterminous U.S. and will be updated in 2014 using the best-available science that is obtained from colleagues at regional and topical workshops, which are convened in 2012-2013. Maps for Alaska and Hawaii will be updated shortly following this update. Alternative seismic hazard models discussed at the workshops will be implemented in a logic tree framework and will be used to develop the seismic hazard maps and associated products. In this paper we describe the plan to update the hazard maps, the issues raised in workshops up to March 2012, and topics that will be discussed at future workshops. An advisory panel will guide the development of the hazard maps and ensure that the maps are acceptable to a broad segment of the science and engineering communities. These updated maps will then be considered by end-users for inclusion in building codes, risk models, and public policy documents.

  7. Seismic hazard analysis for Jayapura city, Papua

    International Nuclear Information System (INIS)

    Robiana, R.; Cipta, A.

    2015-01-01

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors obtained from geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds

  8. Seismic hazard analysis for Jayapura city, Papua

    Energy Technology Data Exchange (ETDEWEB)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A. [Geological Agency, Diponegoro Road No.57, Bandung, 40122 (Indonesia)

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors obtained from geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  9. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models.

    Science.gov (United States)

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; Gutiérrez-Gutiérrez, O. Q.; Larreynaga, J.; González, M.; Castro, M.; Gavidia, F.; Aguirre-Ayerbe, I.; González-Riancho, P.; Carreño, E.

    2013-11-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has an approximate length of 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and resulting in hundreds of victims. Hazard assessment is commonly based on numerical propagation models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand, the characterization of the threat over the entire coast of El Salvador, and on the other, the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near shore we computed an estimation of the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite differences-finite volumes numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. Our results show that at the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas most exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results

  10. Handbook of hazardous waste management

    International Nuclear Information System (INIS)

    Metry, A.A.

    1980-01-01

    The contents of this work are arranged so as to give the reader a detailed understanding of the elements of hazardous waste management. Generalized management concepts are covered in Chapters 1 through 5 which are entitled: Introduction, Regulations Affecting Hazardous Waste Management, Comprehensive Hazardous Waste Management, Control of Hazardous Waste Transportation, and Emergency Hazardous Waste Management. Chapters 6 through 11 deal with treatment concepts and are entitled: General Considerations for Hazardous Waste Management Facilities, Physical Treatment of Hazardous Wastes, Chemical Treatment of Hazardous Wastes, Biological Treatment of Hazardous Wastes, Incineration of Hazardous Wastes, and Hazardous Waste Management of Selected Industries. Chapters 12 through 15 are devoted to ultimate disposal concepts and are entitled: Land Disposal Facilities, Ocean Dumping of Hazardous Wastes, Disposal of Extremely Hazardous Wastes, and Generalized Criteria for Hazardous Waste Management Facilities

  11. Assessment of liquefaction-induced hazards using Bayesian networks based on standard penetration test data

    Science.gov (United States)

    Tang, Xiao-Wei; Bai, Xu; Hu, Ji-Lei; Qiu, Jiang-Nan

    2018-05-01

    Liquefaction-induced hazards such as sand boils, ground cracks, settlement, and lateral spreading are responsible for considerable damage to engineering structures during major earthquakes. Presently, there is no effective empirical approach that can assess different liquefaction-induced hazards in one model. This is because of the uncertainties and complexity of the factors related to seismic liquefaction and liquefaction-induced hazards. In this study, Bayesian networks (BNs) are used to integrate multiple factors related to seismic liquefaction, sand boils, ground cracks, settlement, and lateral spreading into a model based on standard penetration test data. The constructed BN model can assess four different liquefaction-induced hazards together. In a case study, the BN method outperforms an artificial neural network and Ishihara and Yoshimine's simplified method in terms of accuracy, Brier score, recall, precision, and area under the curve (AUC) of the receiver operating characteristic (ROC). This demonstrates that the BN method is a good alternative tool for the risk assessment of liquefaction-induced hazards. Furthermore, the performance of the BN model in estimating liquefaction-induced hazards in Japan's 2011 Tōhoku earthquake confirms its correctness and reliability compared with the liquefaction potential index approach. The proposed BN model can also predict whether the soil becomes liquefied after an earthquake and can deduce the chain reaction process of liquefaction-induced hazards and perform backward reasoning. The assessment results from the proposed model provide informative guidelines for decision-makers to detect the damage state of a field following liquefaction.

  12. Utilizing NASA Earth Observations to Model Volcanic Hazard Risk Levels in Areas Surrounding the Copahue Volcano in the Andes Mountains

    Science.gov (United States)

    Keith, A. M.; Weigel, A. M.; Rivas, J.

    2014-12-01

    Copahue is a stratovolcano located along the rim of the Caviahue Caldera near the Chile-Argentina border in the Andes Mountain Range. There are several small towns located in proximity to the volcano, the two largest being Banos Copahue and Caviahue. During its eruptive history, it has produced numerous lava flows, pyroclastic flows, ash deposits, and lahars. This isolated region has steep topography and little vegetation, rendering it poorly monitored. The need to model volcanic hazard risk has been reinforced by recent volcanic activity that intermittently released several ash plumes from December 2012 through May 2013. Exposure to volcanic ash is currently the main threat for the surrounding populations as the volcano becomes more active. The goal of this project was to study Copahue and determine areas that have the highest potential of being affected in the event of an eruption. Remote sensing techniques were used to examine and identify volcanic activity and areas vulnerable to experiencing volcanic hazards including volcanic ash, SO2 gas, lava flow, pyroclastic density currents and lahars. Landsat 7 Enhanced Thematic Mapper Plus (ETM+), Landsat 8 Operational Land Imager (OLI), EO-1 Advanced Land Imager (ALI), Terra Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Shuttle Radar Topography Mission (SRTM), ISS ISERV Pathfinder, and Aura Ozone Monitoring Instrument (OMI) products were used to analyze volcanic hazards. These datasets were used to create a historic lava flow map of the Copahue volcano by identifying historic lava flows, tephra, and lahars both visually and spectrally. Additionally, a volcanic risk and hazard map for the surrounding area was created by modeling the possible extent of ash fallout, lahars, lava flow, and pyroclastic density currents (PDC) for future eruptions. These model results were then used to identify areas that should be prioritized for disaster relief and evacuation orders.

  13. The Herfa-Neurode hazardous waste repository in bedded salt as an operating model for safe mixed waste disposal

    International Nuclear Information System (INIS)

    Rempe, N.T.

    1991-01-01

    For 18 years, the Herfa-Neurode underground repository has demonstrated the environmentally sound disposal of hazardous waste in a former potash mine. Its principal characteristics make it an excellent analogue to the Waste Isolation Pilot Plant (WIPP). The Environmental Protection Agency has ruled in its first conditional no-migration determination that it is reasonably certain that no hazardous constituents of the mixed waste, destined for the WIPP during its test phase, will migrate from the site for up to ten years. Knowledge of and reference to the Herfa-Neurode operating model may substantially improve the no-migration variance petition for the WIPP's disposal phase and thereby expedite its approval. 2 refs., 1 fig., 1 tab

  14. Urban Heat Wave Hazard Assessment

    Science.gov (United States)

    Quattrochi, D. A.; Jedlovec, G.; Crane, D. L.; Meyer, P. J.; LaFontaine, F.

    2016-12-01

    Heat waves are one of the largest causes of environmentally-related deaths globally and are likely to become more numerous as a result of climate change. The intensification of heat waves by the urban heat island effect and elevated humidity, combined with urban demographics, are key elements leading to these disasters. Better warning of the potential hazards may help lower the risks associated with heat waves. Moderate-resolution thermal data from NASA satellites are used to derive high spatial resolution estimates of apparent temperature (heat index) over urban regions. These data, combined with demographic data, are used to produce a daily heat hazard/risk map for selected cities. MODIS data are used to derive daily composite maximum and minimum land surface temperature (LST) fields to represent the amplitude of the diurnal temperature cycle and identify extreme heat days. Compositing routines are used to generate representative daily maximum and minimum LSTs for the urban environment. The limited effect of relative humidity on the apparent temperature (typically 10-15%) allows for the use of modeled moisture fields to convert LST to apparent temperature without loss of spatial variability. The daily max/min apparent temperature fields are used to identify abnormally extreme heat days relative to climatological values in order to produce a heat wave hazard map. Reference to climatological values normalizes the hazard for a particular region (e.g., the impact of an extreme heat day). A heat wave hazard map has been produced for several case study periods and then computed on a quasi-operational basis during the summer of 2016 for Atlanta, GA, Chicago, IL, St. Louis, MO, and Huntsville, AL. A hazard does not become a risk until someone or something is exposed to that hazard at a level that might do harm. Demographic information is used to assess the urban risk associated with the heat wave hazard. Collectively, the heat wave hazard product can warn people in urban
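
    The climatology-threshold step described above lends itself to a compact illustration. The sketch below is a minimal stand-in, with synthetic arrays in place of the MODIS-derived apparent-temperature fields; the grid size and the 95th-percentile cutoff are illustrative assumptions, not values from the study.

    ```python
    # Flag pixels whose apparent temperature today exceeds a high climatological
    # percentile for that location (synthetic data; illustrative threshold).
    import numpy as np

    rng = np.random.default_rng(0)
    climatology = rng.normal(30.0, 4.0, size=(900, 50, 50))  # past summer days x grid
    today = rng.normal(33.0, 4.0, size=(50, 50))             # today's apparent temps

    threshold = np.percentile(climatology, 95, axis=0)  # per-pixel 95th percentile
    hazard_flag = today > threshold                     # boolean heat-hazard map
    print(hazard_flag.mean())                           # fraction of pixels flagged
    ```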

  15. Sustainable regional development and natural hazard impacts

    Science.gov (United States)

    Petrova, Elena; Svetlosanov, Vladimir; Kudin, Valery

    2016-04-01

    During the last decades, the impacts of natural hazards on social and economic development in many countries have been increasing, due to the expansion of human activities into areas prone to natural risks as well as to an increase in the number and severity of natural hazard events caused by climate change and other natural phenomena. The escalation of severe disasters (such as the Tohoku earthquake and tsunami in Japan, 2011) triggered by natural hazards and related natural-technological and environmental events is increasingly threatening sustainable development at different levels, from regional to global. In our study, we develop a model of ecological, economic and social sustainable development for the European part of Russia and the Republic of Belarus. The model consists of six blocks including 1) population, 2) environment, 3) mineral resources, 4) geographic space, 5) investments, and 6) food production and import. These blocks were created based on an analysis of the main processes at the regional level; all the blocks are closely interrelated. Reaching the limit values of the block parameters corresponds to a sharp deterioration of the system; as a result, the system can lose its stability. Aggravation of natural and natural-technological risks impacts each block and should be taken into account in the model of regional development. Natural hazards can cause both strong influences and small but permanent perturbations; in both cases, a system can become unstable. A criterion for sustainable development is proposed. The Russian Foundation for Humanities and the Belorussian Republican Foundation for Fundamental Research supported the study (project 15-22-01008).

  16. Mix Proportion Design of Asphalt Concrete

    Science.gov (United States)

    Wu, Xianhu; Gao, Lingling; Du, Shoujun

    2017-12-01

    Based on the gradations of AC and SMA, this paper designs a new type of skid-resistant asphalt mixture that combines the advantages of both. It covers the material selection, the mix proportion design calculations for the mineral mixture, and the tests to determine the optimum asphalt content, leading to the proportioning design of the asphalt concrete mix. The paper thereby introduces a new mix proportion design technique.

  17. Flood Hazard Mapping by Using Geographic Information System and Hydraulic Model: Mert River, Samsun, Turkey

    Directory of Open Access Journals (Sweden)

    Vahdettin Demir

    2016-01-01

    Full Text Available In this study, flood hazard maps were prepared for the Mert River Basin, Samsun, Turkey, by using GIS and the Hydrologic Engineering Center's River Analysis System (HEC-RAS). In this river basin, human life losses and a significant amount of property damage were experienced in the 2012 flood. The preparation of the flood risk maps employed in the study includes the following steps: (1) digitization of topographical data and preparation of a digital elevation model using ArcGIS, (2) simulation of flood flows of different return periods using a hydraulic model (HEC-RAS), and (3) preparation of flood risk maps by integrating the results of (1) and (2).

  18. Using remotely sensed data and stochastic models to simulate realistic flood hazard footprints across the continental US

    Science.gov (United States)

    Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.

    2017-12-01

    Remotely sensed data has transformed the field of large scale hydraulic modelling. New digital elevation, hydrography and river width data has allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent has allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event `footprint'. Despite this, many national scale risk analyses still use `constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge to gauge correlations. We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in
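
    The footprint idea above (a single event whose return period differs from gauge to gauge) can be illustrated compactly. The sketch below uses a Gaussian copula as a simplified stand-in for the conditional multivariate extremes model the record describes; the gauge count and the distance-decaying correlation are invented for illustration only.

    ```python
    # Simulate correlated event "footprints" across gauges: within one event,
    # each gauge sees a different return period (Gaussian copula stand-in).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)
    n_gauges, n_events = 5, 10000

    idx = np.arange(n_gauges)
    corr = 0.8 ** np.abs(idx[:, None] - idx[None, :])  # correlation decays with distance

    z = rng.multivariate_normal(np.zeros(n_gauges), corr, size=n_events)
    u = norm.cdf(z)                          # uniform margins via the Gaussian copula
    return_period = 1.0 / (1.0 - u)          # per-gauge return period within each event

    # Within a single simulated event the return period varies across gauges:
    print(return_period[0].round(1))
    ```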

  19. A synchrotron X-ray diffraction study of non-proportional strain-path effects

    International Nuclear Information System (INIS)

    Collins, D.M.; Erinosho, T.; Dunne, F.P.E.; Todd, R.I.; Connolley, T.; Mostafavi, M.; Kupfer, H.; Wilkinson, A.J.

    2017-01-01

    Common alloys used in sheet form can display a significant ductility benefit when they are subjected to certain multiaxial strain paths. This effect has been studied here for a polycrystalline ferritic steel using a combination of Nakajima bulge testing, X-ray diffraction during biaxial testing of cruciform samples and crystal plasticity finite element (CPFE) modelling. Greatest gains in strain to failure were found when subjecting sheets to uniaxial loading followed by balanced biaxial deformation, resulting in a total deformation close to plane-strain. A combined strain of approximately double that of proportional loading was achieved. The evolution of macrostrain, microstrain and texture during non-proportional loading were evaluated by in-situ high energy synchrotron diffraction. The results have demonstrated that the inhomogeneous strain accumulation from non-proportional deformation is strongly dependent on texture and the applied strain-ratio of the first deformation pass. Experimental diffraction evidence is supported by results produced by a novel method of CPFE-derived diffraction simulation. Using constitutive laws selected on the basis of good agreement with measured lattice strain development, the CPFE model demonstrated the capability to replicate ductility gains measured experimentally.

  20. Afghanistan Multi-Risk Assessment to Natural Hazards

    Science.gov (United States)

    Diermanse, Ferdinand; Daniell, James; Pollino, Maurizio; Glover, James; Bouwer, Laurens; de Bel, Mark; Schaefer, Andreas; Puglisi, Claudio; Winsemius, Hessel; Burzel, Andreas; Ammann, Walter; Aliparast, Mojtaba; Jongman, Brenden; Ranghieri, Federica; Fallesen, Ditte

    2017-04-01

    separately for each peril. Several models were implemented to simulate the relevant processes involved. These models were fed by global and local climate data and geological data such as elevation, slope, land use, soil characteristics, etc. Exposure is a measure of the assets and population at risk. An extensive data collection and processing effort was carried out to derive nation-wide exposure data. Vulnerability is a measure of potential exposure losses if a hazardous event occurs. Vulnerability analyses were carried out separately for each peril, because of differences in impact characteristics. Damage functions were derived from asset characteristics and/or experiences from (international) literature. The main project output consists of tables and (GIS-) maps of hazard, exposure and risk. Tables present results at the nation-wide level (admin0), province level (admin1) and district level (admin2). Hazard maps are provided for various return periods, including 10, 20, 50, 100, 250, 500 and 1000 years. All maps are stored in a Web-based GIS platform. This platform contains four separate directories with [1] generic data (catchment boundaries, rivers, etc.), [2] hazard maps, [3] exposure maps and [4] risk maps for each of the considered perils.

  1. A generalized partially linear mean-covariance regression model for longitudinal proportional data, with applications to the analysis of quality of life data from cancer clinical trials.

    Science.gov (United States)

    Zheng, Xueying; Qin, Guoyou; Tu, Dongsheng

    2017-05-30

    Motivated by the analysis of quality of life data from a clinical trial on early breast cancer, we propose in this paper a generalized partially linear mean-covariance regression model for longitudinal proportional data, which are bounded in a closed interval. Cholesky decomposition of the covariance matrix for within-subject responses and generalized estimation equations are used to estimate unknown parameters and the nonlinear function in the model. Simulation studies are performed to evaluate the performance of the proposed estimation procedures. Our new model is also applied to analyze the data from the cancer clinical trial that motivated this research. In comparison with available models in the literature, the proposed model does not require specific parametric assumptions on the density function of the longitudinal responses and the probability function of the boundary values and can capture dynamic changes of time or other interested variables on both mean and covariance of the correlated proportional responses. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Robust estimation of the proportion of treatment effect explained by surrogate marker information.

    Science.gov (United States)

    Parast, Layla; McDermott, Mary M; Tian, Lu

    2016-05-10

    In randomized treatment studies where the primary outcome requires long follow-up of patients and/or expensive or invasive obtainment procedures, the availability of a surrogate marker that could be used to estimate the treatment effect and could potentially be observed earlier than the primary outcome would allow researchers to make conclusions regarding the treatment effect with less required follow-up time and resources. The Prentice criterion for a valid surrogate marker requires that a test for treatment effect on the surrogate marker also be a valid test for treatment effect on the primary outcome of interest. Based on this criterion, methods have been developed to define and estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on the surrogate marker. These methods aim to identify useful statistical surrogates that capture a large proportion of the treatment effect. However, current methods to estimate this proportion usually require restrictive model assumptions that may not hold in practice and thus may lead to biased estimates of this quantity. In this paper, we propose a nonparametric procedure to estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on a potential surrogate marker and extend this procedure to a setting with multiple surrogate markers. We compare our approach with previously proposed model-based approaches and propose a variance estimation procedure based on a perturbation-resampling method. Simulation studies demonstrate that the procedure performs well in finite samples and outperforms model-based procedures when the specified models are not correct. We illustrate our proposed procedure using a data set from a randomized study investigating a group-mediated cognitive behavioral intervention for peripheral artery disease participants. Copyright © 2015 John Wiley & Sons, Ltd.
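
    For orientation, the quantity being estimated can be sketched with the classical, model-based (Freedman-type) version that the record's nonparametric procedure is designed to improve upon: PTE = 1 - beta_adjusted / beta_unadjusted, where the adjusted model adds the surrogate marker. The sketch below uses simulated data and assumed variable names; it is not the authors' robust estimator.

    ```python
    # Classical model-based "proportion of treatment effect explained" on toy data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 500
    treat = rng.integers(0, 2, n)
    surrogate = 1.0 * treat + rng.normal(size=n)              # treatment moves surrogate
    outcome = 2.0 * surrogate + 0.5 * treat + rng.normal(size=n)

    unadj = sm.OLS(outcome, sm.add_constant(treat)).fit()
    adj = sm.OLS(outcome, sm.add_constant(np.column_stack([treat, surrogate]))).fit()

    pte = 1 - adj.params[1] / unadj.params[1]  # fraction "carried" by the surrogate
    print(pte)                                 # roughly 0.8 for this simulation
    ```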

  3. Effect of Strategy Teaching for the Solution of Ratio Problems on Students' Proportional Reasoning Skills

    Science.gov (United States)

    Sen, Ceylan; Güler, Gürsel

    2017-01-01

    The study was conducted to reveal the effects of the instruction of different problem-solving strategies on the proportional reasoning skills of students in solving proportional problems in the 6th grade math class. A quasi-experimental research design with pretest-posttest control group was employed in the study. For eight class hours, the…

  4. Advanced Materials Laboratory hazards assessment document

    Energy Technology Data Exchange (ETDEWEB)

    Barnett, B.; Banda, Z.

    1995-10-01

    The Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the AML. The entire inventory was screened according to the potential airborne impact on onsite and offsite individuals. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance at which a postulated facility event will produce consequences exceeding the Early Severe Health Effects threshold is 23 meters. The highest emergency classification is a General Emergency. The Emergency Planning Zone is a nominal area that conforms to DOE boundaries and physical/jurisdictional boundaries such as fence lines and streets.

  5. Integrated approach for coastal hazards and risks in Sri Lanka

    Science.gov (United States)

    Garcin, M.; Desprats, J. F.; Fontaine, M.; Pedreros, R.; Attanayake, N.; Fernando, S.; Siriwardana, C. H. E. R.; de Silva, U.; Poisson, B.

    2008-06-01

    The devastating impact of the tsunami of 26 December 2004 on the shores of the Indian Ocean recalled the importance of knowledge and the taking into account of coastal hazards. Sri Lanka was one of the countries most affected by this tsunami (e.g. 30 000 dead, 1 million people homeless and 70% of the fishing fleet destroyed). Following this tsunami, as part of the French post-tsunami aid, a project to establish a Geographical Information System (GIS) on coastal hazards and risks was funded. This project aims to define, at a pilot site, a methodology for multiple coastal hazards assessment that might be useful for the post-tsunami reconstruction and for development planning. This methodology could be applied to the whole coastline of Sri Lanka. The multi-hazard approach deals with very different coastal processes in terms of dynamics as well as in terms of return period. The first elements of this study are presented here. We used a set of tools integrating a GIS, numerical simulations and risk scenario modelling. While this action occurred in response to the crisis caused by the tsunami, it was decided to integrate other coastal hazards into the study. Although less dramatic than the tsunami these remain responsible for loss of life and damage. Furthermore, the establishment of such a system could not ignore the longer-term effects of climate change on coastal hazards in Sri Lanka. This GIS integrates the physical and demographic data available in Sri Lanka that is useful for assessing the coastal hazards and risks. In addition, these data have been used in numerical modelling of the waves generated during periods of monsoon as well as for the December 2004 tsunami. Risk scenarios have also been assessed for test areas and validated by field data acquired during the project. The results obtained from the models can be further integrated into the GIS and contribute to its enrichment and to help in better assessment and mitigation of these risks. The coastal-hazards

  6. Integrated approach for coastal hazards and risks in Sri Lanka

    Directory of Open Access Journals (Sweden)

    M. Garcin

    2008-06-01

    Full Text Available The devastating impact of the tsunami of 26 December 2004 on the shores of the Indian Ocean recalled the importance of knowledge and the taking into account of coastal hazards. Sri Lanka was one of the countries most affected by this tsunami (e.g. 30 000 dead, 1 million people homeless and 70% of the fishing fleet destroyed). Following this tsunami, as part of the French post-tsunami aid, a project to establish a Geographical Information System (GIS) on coastal hazards and risks was funded. This project aims to define, at a pilot site, a methodology for multiple coastal hazards assessment that might be useful for the post-tsunami reconstruction and for development planning. This methodology could be applied to the whole coastline of Sri Lanka.

    The multi-hazard approach deals with very different coastal processes in terms of dynamics as well as in terms of return period. The first elements of this study are presented here. We used a set of tools integrating a GIS, numerical simulations and risk scenario modelling. While this action occurred in response to the crisis caused by the tsunami, it was decided to integrate other coastal hazards into the study. Although less dramatic than the tsunami these remain responsible for loss of life and damage. Furthermore, the establishment of such a system could not ignore the longer-term effects of climate change on coastal hazards in Sri Lanka.

    This GIS integrates the physical and demographic data available in Sri Lanka that is useful for assessing the coastal hazards and risks. In addition, these data have been used in numerical modelling of the waves generated during periods of monsoon as well as for the December 2004 tsunami. Risk scenarios have also been assessed for test areas and validated by field data acquired during the project. The results obtained from the models can be further integrated into the GIS and contribute to its enrichment and to help in better assessment and mitigation

  7. Egyptian Environmental Activities and Regulations for Management of Hazardous Substances and Hazardous Wastes

    International Nuclear Information System (INIS)

    El Zarka, M.

    1999-01-01

    A substantial use of hazardous substances is essential to meet the social and economic goals of the community in Egypt. Agrochemicals are used extensively to increase crop yields, and outdated agrochemicals and their empty containers represent a serious environmental problem. Industrial development in different sectors in Egypt requires the handling of huge amounts of hazardous substances and hazardous wastes, and inappropriate handling of such substances creates several health and environmental problems. Egypt faces many challenges in controlling the safe handling of such substances and wastes. Several regulations govern the handling of hazardous substances in Egypt; the unified Environmental Law 4 of 1994 includes a full chapter on the Management of Hazardous Substances and Hazardous Wastes. National and international actions have been taken to manage hazardous substances and hazardous wastes in an environmentally sound manner

  8. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of a leak path factor (LPF), i.e., the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and on other pathways from the building, such as doorways (both open and closed). This study shows how the multiple LPFs within the building can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). The study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
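
    In its simplest reading, the combinatory evaluation described above amounts to multiplying per-stage leak path factors along a release pathway. A minimal sketch follows; the stage values are illustrative assumptions, not MELCOR results.

    ```python
    # Combine sequential per-stage leak path factors into a total LPF.
    def total_lpf(stage_lpfs):
        """Total LPF along a pathway is the product of per-stage factors."""
        total = 1.0
        for lpf in stage_lpfs:
            total *= lpf
        return total

    # The assumed standard 0.5 x 0.5 multiplication discussed in the record:
    print(total_lpf([0.5, 0.5]))        # 0.25

    # A hypothetical three-stage pathway (room -> corridor -> filtered exhaust):
    print(total_lpf([0.4, 0.6, 0.05]))  # 0.012
    ```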

  9. Why do card issuers charge proportional fees?

    OpenAIRE

    Oz Shy; Zhu Wang

    2008-01-01

    This paper explains why payment card companies charge consumers and merchants fees which are proportional to the transaction values instead of charging a fixed per-transaction fee. Our theory shows that, even in the absence of any cost considerations, card companies earn much higher profit when they charge proportional fees. It is also shown that competition among merchants reduces card companies' gains from using proportional fees relative to a fixed per-transaction fee. Merchants are found ...

  10. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  11. The arcsine is asinine: the analysis of proportions in ecology.

    Science.gov (United States)

    Warton, David I; Hui, Francis K C

    2011-01-01

    The arcsine square root transformation has long been standard procedure when analyzing proportional data in ecology, with applications in data sets containing binomial and non-binomial response variables. Here, we argue that the arcsine transform should not be used in either circumstance. For binomial data, logistic regression has greater interpretability and higher power than analyses of transformed data. However, it is important to check the data for additional unexplained variation, i.e., overdispersion, and to account for it via the inclusion of random effects in the model if found. For non-binomial data, the arcsine transform is undesirable on the grounds of interpretability, and because it can produce nonsensical predictions. The logit transformation is proposed as an alternative approach to address these issues. Examples are presented in both cases to illustrate these advantages, comparing various methods of analyzing proportions including untransformed, arcsine- and logit-transformed linear models and logistic regression (with or without random effects). Simulations demonstrate that logistic regression usually provides a gain in power over other methods.
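
    The recommendation above is easy to demonstrate side by side. The sketch below, on simulated binomial data, contrasts a linear model on arcsine-transformed proportions with the binomial GLM (logistic regression) the record advocates; statsmodels is an assumed dependency and all variable names are illustrative.

    ```python
    # Arcsine-transform analysis versus logistic regression on simulated counts.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n, trials = 200, 50
    x = rng.normal(size=n)                    # a single continuous predictor
    p = 1 / (1 + np.exp(-(-0.5 + 0.8 * x)))   # true success probability
    successes = rng.binomial(trials, p)

    X = sm.add_constant(x)

    # Classical approach: OLS on arcsine-square-root transformed proportions.
    arcsine = np.arcsin(np.sqrt(successes / trials))
    lm = sm.OLS(arcsine, X).fit()

    # Recommended approach: binomial GLM on the raw counts, whose slope is a
    # directly interpretable log odds ratio.
    glm = sm.GLM(np.column_stack([successes, trials - successes]), X,
                 family=sm.families.Binomial()).fit()

    print(lm.params)   # slope on the arcsine scale: hard to interpret
    print(glm.params)  # slope as a log odds ratio: directly interpretable
    ```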

  12. Estimation of direct effects for survival data by using the Aalen additive hazards model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Vansteelandt, Stijn; Gerster, Mette

    2011-01-01

    We extend the definition of the controlled direct effect of a point exposure on a survival outcome, other than through some given, time-fixed intermediate variable, to the additive hazard scale. We propose two-stage estimators for this effect when the exposure is dichotomous and randomly assigned...... Aalen's additive regression for the event time, given exposure, intermediate variable and confounders. The second stage involves applying Aalen's additive model, given the exposure alone, to a modified stochastic process (i.e. a modification of the observed counting process based on the first...

  13. Application of systems and control theory-based hazard analysis to radiation oncology.

    Science.gov (United States)

    Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G

    2016-03-01

    Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards, as well as causal scenarios that can lead to the identified unsafe control actions. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, together with a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations on system safety and generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve

  14. Urban-Hazard Risk Analysis: Mapping of Heat-Related Risks in the Elderly in Major Italian Cities

    Science.gov (United States)

    Morabito, Marco; Crisci, Alfonso; Gioli, Beniamino; Gualtieri, Giovanni; Toscano, Piero; Di Stefano, Valentina; Orlandini, Simone; Gensini, Gian Franco

    2015-01-01

    Background Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related elderly risks. Objectives Development of high-resolution, heat-related urban risk maps regarding the elderly population (≥65). Methods A long time series (2001–2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, was downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: 1) the Linear Regression Model (LRM); 2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source), and processed together using “Crichton’s Risk Triangle” hazard-risk methodology to obtain a Heat-related Elderly Risk Index (HERI). Results The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in coastal rather than inland cities. Conclusions This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street level details. This information could support public
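
    The "Crichton's Risk Triangle" combination of hazard and exposure can be sketched compactly. Below, synthetic rasters stand in for the LST and elderly-density layers; the min-max normalization and multiplication are a simplified reading of the HERI construction, not the study's exact classification.

    ```python
    # Pixel-level risk index from normalized hazard (LST) and exposure (elderly
    # density); synthetic data, simplified relative to the published HERI.
    import numpy as np

    rng = np.random.default_rng(2)
    lst = rng.normal(32.0, 3.0, size=(100, 100))              # summer LST (deg C)
    elderly_density = rng.gamma(2.0, 50.0, size=(100, 100))   # residents 65+ per pixel

    def normalize(a):
        return (a - a.min()) / (a.max() - a.min())

    heri = normalize(lst) * normalize(elderly_density)  # higher = riskier pixel
    print(np.quantile(heri, [0.5, 0.9, 0.99]))          # index levels for mapping
    ```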

  15. Determining the effect of worker exposure conditions on the risk of hearing loss in noisy industrial workroom using Cox proportional hazard model.

    Science.gov (United States)

    Aliabadi, Mohsen; Fereidan, Mohammad; Farhadian, Maryam; Tajik, Leila

    2015-01-01

    In noisy workrooms, exposure conditions such as noise level, exposure duration and the use of hearing protection devices are contributory factors to hearing loss. The aim of this study was to determine the effect of exposure conditions on the risk of hearing loss using the Cox model. Seventy workers employed in a press workshop were selected; their hearing thresholds were studied using an audiometric test, and their noise exposure histories were also analyzed. The results of the Cox model showed that job type, smoking and the use of protection devices significantly affected the risk of hearing loss. The relative risk of hearing loss in smokers was 1.1 times that of non-smokers. The relative risk of hearing loss in workers with intermittent use of protection devices was 3.3 times that of those who used these devices continuously. The Cox model could analyze the effect of exposure conditions on hearing loss and provides useful information for managers in order to improve hearing conservation programs.
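
    For readers unfamiliar with fitting such models, a minimal sketch follows using the lifelines package (an assumed dependency; the record does not name its software). The toy data and covariate coding are invented to mirror the study's variables, and the small penalizer is only there to stabilize the tiny sample.

    ```python
    # Fit a Cox proportional hazards model relating exposure conditions to
    # hearing loss (invented toy data; not the study's dataset).
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "years_exposed":    [5, 12, 8, 20, 15, 3, 25, 10],  # follow-up time
        "hearing_loss":     [0, 1, 0, 1, 1, 0, 1, 0],       # event indicator
        "smoker":           [0, 1, 1, 1, 0, 0, 1, 0],
        "intermittent_hpd": [1, 1, 0, 1, 0, 0, 1, 0],       # intermittent protector use
    })

    cph = CoxPHFitter(penalizer=0.1)  # light penalty for the tiny toy sample
    cph.fit(df, duration_col="years_exposed", event_col="hearing_loss")
    cph.print_summary()  # exp(coef) gives the hazard ratio for each covariate
    ```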

  16. Modelling Inland Flood Events for Hazard Maps in Taiwan

    Science.gov (United States)

    Ghosh, S.; Nzerem, K.; Sassi, M.; Hilberts, A.; Assteerawatt, A.; Tillmanns, S.; Mathur, P.; Mitas, C.; Rafique, F.

    2015-12-01

    Taiwan experiences significant inland flooding, driven by torrential rainfall from plum rain storms and typhoons during summer and fall. Data from the last 13 to 16 years show that about 3,000 buildings were damaged by such floods annually, with losses of US$0.41 billion (Water Resources Agency). This long, narrow island nation with mostly hilly/mountainous topography is located in the tropical-subtropical zone, with an annual average typhoon-hit frequency of 3-4 (Central Weather Bureau) and annual average precipitation of 2502 mm (WRA) - 2.5 times the world average. Spatial and temporal distributions of countrywide precipitation are uneven, with very high local extreme rainfall intensities. Annual average precipitation is 3000-5000 mm in the mountainous regions, 78% of which falls in May-October, and the 1-hour to 3-day maximum rainfalls are about 85 to 93% of the world records (WRA). Rivers in Taiwan are short, with small upstream areas and high watershed runoff coefficients. These rivers have the steepest slopes, the shortest response times with rapid flows, and the largest peak flows as well as specific flood peak discharges (WRA) in the world. RMS has recently developed a countrywide inland flood model for Taiwan, producing hazard return period maps at 1 arcsec grid resolution. These can be the basis for evaluating and managing flood risk, its economic impacts, and insured flood losses. The model is initiated with sub-daily historical meteorological forcings and calibrated to daily discharge observations at about 50 river gauges over the period 2003-2013. Simulations of hydrologic processes, via rainfall-runoff and routing models, are subsequently performed based on a 10000-year set of stochastic forcing. The rainfall-runoff model is a physically based, continuous, semi-distributed model for catchment hydrology. The 1-D wave propagation hydraulic model considers catchment runoff in routing and describes large-scale transport processes along the river. It also accounts for reservoir storage

  17. Seismic hazard studies in Egypt

    Directory of Open Access Journals (Sweden)

    Abuo El-Ela A. Mohamed

    2012-12-01

    Full Text Available The study of earthquake activity and seismic hazard assessment of Egypt is very important due to the great and rapid spread of large investments in national projects, especially the nuclear power plant that will be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquake effects throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba–Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the Northern Red Sea triple junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez District should be considered. The seismic hazard for Egypt is calculated utilizing a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, the uniform hazard spectra for rock sites are graphed for 25 different periods, together with the probabilistic hazard curves for the cities of Cairo and Alexandria. The peak ground acceleration (PGA) values were found to be highest close to the Gulf of Aqaba, about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were detected in the western part of the Western Desert.
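
    The return periods quoted above follow from standard Poisson bookkeeping: an exceedance probability P over an exposure time t implies a mean return period T = -t / ln(1 - P). A short worked example (generic PSHA arithmetic, not code from the study):

    ```python
    # Convert an exceedance probability over a design life into a return period.
    import math

    def return_period(p_exceedance: float, exposure_years: float) -> float:
        """Mean return period implied by probability p_exceedance in exposure_years."""
        return -exposure_years / math.log(1.0 - p_exceedance)

    print(return_period(0.10, 50))  # ~474.6 years: the usual "475-year" design level
    print(return_period(0.02, 50))  # ~2475 years
    ```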

  18. Social Markers of Mild Cognitive Impairment: Proportion of Word Counts in Free Conversational Speech.

    Science.gov (United States)

    Dodge, Hiroko H; Mattek, Nora; Gregor, Mattie; Bowman, Molly; Seelye, Adriana; Ybarra, Oscar; Asgari, Meysam; Kaye, Jeffrey A

    2015-01-01

    Detecting early signs of Alzheimer's disease (AD) and mild cognitive impairment (MCI) during the pre-symptomatic phase is becoming increasingly important for cost-effective clinical trials and also for deriving maximum benefit from currently available treatment strategies. However, distinguishing early signs of MCI from normal cognitive aging is difficult. Biomarkers have been extensively examined as early indicators of the pathological process for AD, but assessing these biomarkers is expensive and challenging to apply widely among pre-symptomatic community-dwelling older adults. Here we propose the assessment of social markers, which could provide an alternative or complementary and ecologically valid strategy for identifying the pre-symptomatic phase leading to MCI and AD. The data came from a larger randomized controlled clinical trial (RCT), where we examined whether daily conversational interactions using remote video telecommunications software could improve cognitive functions of older adult participants. We assessed the proportion of words generated by participants out of total words produced by both participants and staff interviewers, using transcribed conversations during the intervention trial, as an indicator of how two people (participants and interviewers) interact with each other in one-on-one conversations. We examined whether the proportion differed between those with intact cognition and MCI, using first, generalized estimating equations with the proportion as outcome, and second, logistic regression models with cognitive status as outcome in order to estimate the area under the ROC curve (ROC AUC). Compared to those with normal cognitive function, MCI participants generated a greater proportion of words out of the total number of words during the timed conversation sessions (p=0.01). This difference remained after controlling for participant age, gender, interviewer and time of assessment (p=0.03). The logistic regression models showed the ROC AUC of
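
    The first analysis described above, a GEE with the word proportion as outcome and repeated sessions per participant, can be sketched with statsmodels (an assumed dependency). The data below are simulated and the variable names are illustrative, not from the trial.

    ```python
    # GEE for a repeated proportional outcome with exchangeable within-subject
    # correlation (simulated data; quasi-binomial treatment of proportions).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n_subj, n_sessions = 40, 4
    df = pd.DataFrame({
        "subject": np.repeat(np.arange(n_subj), n_sessions),
        "mci": np.repeat(rng.integers(0, 2, n_subj), n_sessions),
    })
    base = 0.55 + 0.05 * df["mci"]  # MCI speakers produce a higher word share
    df["word_prop"] = np.clip(base + rng.normal(0, 0.08, len(df)), 0.01, 0.99)

    model = sm.GEE(df["word_prop"], sm.add_constant(df["mci"]),
                   groups=df["subject"], family=sm.families.Binomial(),
                   cov_struct=sm.cov_struct.Exchangeable())
    print(model.fit().summary())
    ```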

  19. The Volcanic Hazards Assessment Support System for the Online Hazard Assessment and Risk Mitigation of Quaternary Volcanoes in the World

    Directory of Open Access Journals (Sweden)

    Shinji Takarada

    2017-12-01

    Full Text Available Volcanic hazard assessment tools are essential for risk mitigation of volcanic activities. A number of offline volcanic hazard assessment tools have been provided, but in most cases they require relatively complex installation and usage procedures. This situation limits the use of volcanic hazard assessment tools among volcanologists and volcanic hazard communities. In addition, the eruption chronology and a detailed database of each volcano in the world are essential key information for volcanic hazard assessment, but most of them are isolated and not connected with each other. The Volcanic Hazard Assessment Support System aims to implement a user-friendly, WebGIS-based, open-access online system for potential hazard assessment and risk mitigation of Quaternary volcanoes in the world. Users can get up-to-date information, such as the eruption chronology and geophysical monitoring data of a specific volcano, through the system's direct links to major volcano databases. Currently, the system provides three simple, powerful and notable deterministic simulation codes of volcanic processes: Energy Cone, Titan2D and Tephra2. The system provides deterministic tools because probabilistic assessment tools are normally much more computationally demanding. Using the system, the area that would be affected by a volcanic eruption at any location near the volcano can be estimated through numerical simulations. The system is being implemented using the ASTER Global DEM and covers 2790 Quaternary volcanoes in the world. It can be used to evaluate volcanic hazards and to move from hazard toward risk potential by overlaying the estimated distribution of volcanic gravity flows or tephra falls on major roads, houses and evacuation areas using the GIS-enabled system. The system is developed for all users in the world who need volcanic hazard assessment tools.

  20. Which Mixed-Member Proportional Electoral Formula Fits You Best? Assessing the Proportionality Principle of Positive Vote Transfer Systems

    DEFF Research Database (Denmark)

    Bochsler, Daniel

    2014-01-01

    Mixed-member proportional systems (MMP) are a family of electoral systems which combine district-based elections with a proportional seat allocation. Positive vote transfer systems belong to this family. This article explains why they might be better than their siblings, and examines under which ...

  1. Reliability of lifeline networks under seismic hazard

    International Nuclear Information System (INIS)

    Selcuk, A. Sevtap; Yuecemen, M. Semih

    1999-01-01

    Lifelines, such as pipelines, transportation, communication and power transmission systems, are networks which extend spatially over large geographical regions. The quantification of the reliability (survival probability) of a lifeline under seismic threat requires attention, as the proper functioning of these systems during or after a destructive earthquake is vital. In this study, a lifeline is idealized as an equivalent network whose element capacities are random and spatially correlated, and a comprehensive probabilistic model for the assessment of the reliability of lifelines under earthquake loads is developed. The seismic hazard that the network is exposed to is described by a probability distribution derived from past earthquake occurrence data. The seismic hazard analysis is based on the 'classical' seismic hazard analysis model with some modifications. An efficient algorithm developed by Yoo and Deo (Yoo YB, Deo N. A comparison of algorithms for terminal pair reliability. IEEE Transactions on Reliability 1988; 37: 210-215) is utilized for the evaluation of the network reliability. This algorithm avoids the CPU time and memory capacity problems encountered with large networks. A comprehensive computer program, called LIFEPACK, is coded in Fortran to carry out the numerical computations. Two detailed case studies are presented to show the implementation of the proposed model.
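
    Exact terminal-pair reliability algorithms such as Yoo and Deo's are involved to implement, but the quantity itself is easy to illustrate by Monte Carlo. The sketch below (my illustration, not LIFEPACK, and it ignores the spatial correlation the model accounts for) estimates the probability that a source and a sink node stay connected when each edge survives independently with a given probability:

```python
import random
import networkx as nx

def terminal_pair_reliability(G, source, sink, p_survive, n_trials=20000):
    """Monte Carlo estimate of P(source and sink remain connected)."""
    connected = 0
    for _ in range(n_trials):
        H = nx.Graph()
        H.add_nodes_from(G.nodes)
        # Keep each edge independently with its survival probability.
        H.add_edges_from(e for e in G.edges if random.random() < p_survive[e])
        if nx.has_path(H, source, sink):
            connected += 1
    return connected / n_trials

# Toy 4-node lifeline network with per-edge survival probabilities.
G = nx.Graph([("A", "B"), ("B", "D"), ("A", "C"), ("C", "D"), ("B", "C")])
p = {e: 0.9 for e in G.edges}
print(terminal_pair_reliability(G, "A", "D", p))
```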

  2. Doubly stochastic models for volcanic hazard assessment at Campi Flegrei caldera

    CERN Document Server

    Bevilacqua, Andrea

    2016-01-01

    This study provides innovative mathematical models for assessing the eruption probability and associated volcanic hazards, and applies them to the Campi Flegrei caldera in Italy. Throughout the book, significant attention is devoted to quantifying the sources of uncertainty affecting the forecast estimates. The Campi Flegrei caldera is certainly one of the world's highest-risk volcanoes, with more than 70 eruptions over the last 15,000 years, predominantly explosive ones of varying magnitude, intensity and vent location. In the second half of the twentieth century the volcano apparently once again entered a phase of unrest that continues to the present. Hundreds of thousands of people live inside the caldera and over a million more in the nearby city of Naples, making a future eruption of Campi Flegrei an event with potentially catastrophic consequences at the national and European levels.

  3. Remote sensing and GIS-based landslide hazard analysis and cross-validation using multivariate logistic regression model on three test areas in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet

    2010-05-01

    This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis in the Penang, Cameron, and Selangor areas of Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map land cover and the vegetation index, respectively. Maps of topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten factors that influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, land cover, rainfall, and the normalized difference vegetation index (NDVI), were extracted from the spatial database, and the logistic regression coefficient of each factor was computed. The landslide hazard was then analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also from each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification of the model, the results of the analyses were compared with the field-verified landslide locations. Among the three cases applying logistic regression coefficients within the same study area, Selangor based on the Selangor coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest accuracy (86%). Similarly, among the six cases from the cross-application of logistic regression coefficients between areas, Selangor based on the Cameron coefficients showed the highest prediction accuracy (90%), whereas Penang based on the Selangor coefficients showed the lowest accuracy (79%). Qualitatively, the cross
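
    The cross-application scheme reduces to fitting a logistic regression in one area and scoring another area's pixels with the fitted coefficients. A schematic sketch, assuming hypothetical per-pixel CSV tables with numerically encoded factor columns (categorical factors such as lithology would need encoding first):

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Hypothetical per-pixel tables: ten factor columns plus a landslide indicator.
FACTORS = ["slope", "aspect", "curvature", "dist_drainage", "lithology",
           "dist_lineament", "soil", "landcover", "rainfall", "ndvi"]

def fit_area(df):
    return LogisticRegression(max_iter=1000).fit(df[FACTORS], df["landslide"])

areas = {name: pd.read_csv(f"{name}.csv") for name in ("penang", "cameron", "selangor")}
models = {name: fit_area(df) for name, df in areas.items()}

# Cross-apply: coefficients fitted in one area used to predict landslides in another.
for src, model in models.items():
    for dst, df in areas.items():
        acc = accuracy_score(df["landslide"], model.predict(df[FACTORS]))
        print(f"coefficients from {src} applied to {dst}: accuracy {acc:.2f}")
```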

  4. Multi-hazard risk analysis for management strategies

    Science.gov (United States)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g., in Switzerland or South Tyrol-Bolzano) and software packages (e.g., ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and modeling approaches, as well as inconsistencies that arise when combining all these different aspects. Based on this concept, a flexible software package will be established, built on ArcGIS as the central platform and complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be newly developed; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flow, rockfall, landslide, avalanche and flood are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this is only a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  5. Regression models for variables expressed as a continuous proportion

    Directory of Open Access Journals (Sweden)

    Aarón Salinas-Rodríguez

    2006-10-01

    Full Text Available OBJECTIVE: To describe some of the statistical alternatives available for the study of continuous proportions and to compare the existing models, highlighting their advantages and disadvantages through their application to a practical example from the public health field. MATERIAL AND METHODS: Based on the 2003 National Survey of Reproductive Health, the proportion of individual coverage in the family planning program, proposed in a previous study conducted at the Instituto Nacional de Salud Pública in Cuernavaca, Morelos, Mexico (2005), was modeled using normal, gamma, beta, and quasi-likelihood regression models. The variant of the Akaike information criterion (AIC) proposed by McQuarrie and Tsai was used to select the best model. Then, by simulation (a Monte Carlo/Markov chain approach), a beta-distributed variable was generated to evaluate the behavior of the four models as the sample size varied from 100 to 18,000 observations. RESULTS: The results show that the best statistical option for the analysis of continuous proportions is the beta regression model, according to its assumptions and its AIC value. The simulation showed that as the sample size increases, the gamma model and especially the quasi-likelihood model approximate the beta model to a significant degree. CONCLUSIONS: For modeling continuous proportions, the parametric beta regression approach is recommended, and the normal model should be avoided. With a large sample size, the quasi-likelihood approach represents a good alternative.
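
    A comparison along these lines can be reproduced in Python: statsmodels fits the normal and gamma models, and recent statsmodels releases ship a beta regression as BetaModel. A minimal sketch on simulated data (not the survey data used in the study):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.othermod.betareg import BetaModel  # recent statsmodels releases

rng = np.random.default_rng(0)
n = 1000
x = rng.uniform(size=n)
X = sm.add_constant(x)

# Simulate a beta-distributed proportion whose mean depends on x via a logit link.
mu = 1.0 / (1.0 + np.exp(-(-1.0 + 2.0 * x)))
phi = 20.0  # precision parameter
y = rng.beta(mu * phi, (1.0 - mu) * phi)

fits = {
    "normal": sm.OLS(y, X).fit(),
    "gamma": sm.GLM(y, X, family=sm.families.Gamma(link=sm.families.links.Log())).fit(),
    "beta": BetaModel(y, X).fit(),
}
for name, res in fits.items():
    print(f"{name:>6}: AIC = {res.aic:.1f}")  # lower AIC -> preferred model
```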

  6. Proportion congruency effects: Instructions may be enough

    Directory of Open Access Journals (Sweden)

    Olga eEntel

    2014-10-01

    Full Text Available Learning takes time: one needs to be exposed to contingency relations between stimulus dimensions in order to learn, whereas intentional control can be recruited through task demands. Therefore, showing that control can be recruited as a function of experimental instructions alone, that is, that processing adapts to the instructions before exposure to the task, can be taken as evidence for the existence of control recruitment in the absence of learning. This was done by manipulating the information given at the outset of the experiment. In the first experiment, we manipulated list-level congruency proportion. Half of the participants were informed that most of the stimuli would be congruent, whereas the other half were informed that most of the stimuli would be incongruent. This held true for the stimuli in the second part of each experiment; in the first part, however, the proportion of the two stimulus types was equal. A proportion congruent effect was found in both parts of the experiment, but it was larger in the second part. In our second experiment, we manipulated the proportion of the stimuli within participants by applying an item-specific design. This was done by presenting some color words most often in their congruent color and other color words most often in incongruent colors. Participants were informed about the exact word-color pairings in advance. As in Experiment 1, this held true only for the second experimental part. In contrast to our first experiment, informing participants in advance did not result in an item-specific proportion effect, which was observed only in the second part. Thus our results support the hypothesis that instructions may be enough to trigger list-level control, yet learning does contribute to the proportion congruent effect under such conditions. The item-level proportion effect is apparently caused by learning, or at least moderated by it.

  7. Probabilistic Volcanic Multi-Hazard Assessment at Somma-Vesuvius (Italy): coupling Bayesian Belief Networks with a physical model for lahar propagation

    Science.gov (United States)

    Tierz, Pablo; Woodhouse, Mark; Phillips, Jeremy; Sandri, Laura; Selva, Jacopo; Marzocchi, Warner; Odbert, Henry

    2017-04-01

    Volcanoes are extremely complex physico-chemical systems in which magma formed at depth breaks through the planet's surface, resulting in major hazards from local to global scales. Volcano physics is dominated by non-linearities and complicated spatio-temporal interrelationships, which make volcanic hazards stochastic (i.e., not deterministic) by nature. In this context, probabilistic assessments are required to quantify the large uncertainties related to volcanic hazards. Moreover, volcanoes are typically multi-hazard environments where different hazardous processes can occur either simultaneously or in succession. In particular, explosive volcanoes are able to accumulate, through tephra fallout and Pyroclastic Density Currents (PDCs), large amounts of pyroclastic material in the drainage basins surrounding the volcano. This addition of fresh particulate material alters the local/regional hydrogeological equilibrium and increases the frequency and magnitude of sediment-rich aqueous flows, commonly known as lahars. The initiation and volume of rain-triggered lahars may depend on: rainfall intensity and duration; antecedent rainfall; terrain slope; thickness, permeability and hydraulic diffusivity of the tephra deposit; etc. Quantifying these complex interrelationships (and their uncertainties) in a tractable manner requires a structured but flexible probabilistic approach. A Bayesian Belief Network (BBN) is a directed acyclic graph that allows the representation of the joint probability distribution for a set of uncertain variables in a compact and efficient way, by exploiting unconditional and conditional independences between these variables. Once constructed and parametrized, the BBN uses Bayesian inference to perform causal (e.g., forecast) and/or evidential reasoning (e.g., explanation) about query variables, given some evidence. In this work, we illustrate how BBNs can be used to model the influence of several variables on the generation of rain-triggered lahars
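
    The structure described, discrete parent variables feeding a lahar-generation node, maps directly onto a small discrete BBN. A toy sketch with pgmpy, using invented variables and probabilities rather than the paper's parametrization:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy structure: rainfall intensity and tephra thickness influence lahar generation.
model = BayesianNetwork([("Rainfall", "Lahar"), ("Tephra", "Lahar")])
model.add_cpds(
    TabularCPD("Rainfall", 2, [[0.7], [0.3]]),   # 0 = low, 1 = high
    TabularCPD("Tephra", 2, [[0.6], [0.4]]),     # 0 = thin, 1 = thick
    TabularCPD("Lahar", 2,
               # P(no lahar), P(lahar) for each (Rainfall, Tephra) combination
               [[0.99, 0.90, 0.80, 0.30],
                [0.01, 0.10, 0.20, 0.70]],
               evidence=["Rainfall", "Tephra"], evidence_card=[2, 2]),
)

inference = VariableElimination(model)
# Causal query: lahar probability given high rainfall on a thick tephra deposit.
print(inference.query(["Lahar"], evidence={"Rainfall": 1, "Tephra": 1}))
```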

  8. Maximum likelihood estimation of semiparametric mixture component models for competing risks data.

    Science.gov (United States)

    Choi, Sangbum; Huang, Xuelin

    2014-09-01

    In the analysis of competing risks data, the cumulative incidence function is a useful quantity to characterize the crude risk of failure from a specific event type. In this article, we consider an efficient semiparametric analysis of mixture component models on cumulative incidence functions. Under the proposed mixture model, latency survival regressions given the event type are performed through a class of semiparametric models that encompasses the proportional hazards model and the proportional odds model, allowing for time-dependent covariates. The marginal proportions of the occurrences of cause-specific events are assessed by a multinomial logistic model. Our mixture modeling approach is advantageous in that it makes a joint estimation of model parameters associated with all competing risks under consideration, satisfying the constraint that the cumulative probability of failing from any cause adds up to one given any covariates. We develop a novel maximum likelihood scheme based on semiparametric regression analysis that facilitates efficient and reliable estimation. Statistical inferences can be conveniently made from the inverse of the observed information matrix. We establish the consistency and asymptotic normality of the proposed estimators. We validate small sample properties with simulations and demonstrate the methodology with a data set from a study of follicular lymphoma. © 2014, The International Biometric Society.
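
    The paper's semiparametric mixture estimator is not available in standard libraries, but the quantity it targets, the cause-specific cumulative incidence function, can be estimated nonparametrically with lifelines as a baseline for comparison. A minimal sketch on simulated competing risks data:

```python
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(1)
n = 500
t1 = rng.exponential(10, n)        # latent time to event type 1
t2 = rng.exponential(15, n)        # latent time to event type 2
censor = rng.exponential(20, n)    # independent censoring time

time = np.minimum(np.minimum(t1, t2), censor)
event = np.where(censor <= np.minimum(t1, t2), 0,   # 0 = censored
                 np.where(t1 <= t2, 1, 2))          # 1, 2 = competing causes

# Aalen-Johansen estimate of the cumulative incidence of cause 1.
ajf = AalenJohansenFitter()
ajf.fit(time, event, event_of_interest=1)
print(ajf.cumulative_density_.tail())
```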

  9. Multiwire proportional chamber development

    Science.gov (United States)

    Doolittle, R. F.; Pollvogt, U.; Eskovitz, A. J.

    1973-01-01

    The development of large area multiwire proportional chambers, to be used as high resolution spatial detectors in cosmic ray experiments is described. A readout system was developed which uses a directly coupled, lumped element delay-line whose characteristics are independent of the MWPC design. A complete analysis of the delay-line and the readout electronic system shows that a spatial resolution of about 0.1 mm can be reached with the MWPC operating in the strictly proportional region. This was confirmed by measurements with a small MWPC and Fe-55 X-rays. A simplified analysis was carried out to estimate the theoretical limit of spatial resolution due to delta-rays, spread of the discharge along the anode wire, and inclined trajectories. To calculate the gas gain of MWPC's of different geometrical configurations a method was developed which is based on the knowledge of the first Townsend coefficient of the chamber gas.
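
    The gas gain calculation mentioned at the end rests on the standard relation G = exp(∫ α dx), with α the first Townsend coefficient along the avalanche path. A schematic sketch with made-up numbers (the real coefficient depends on the gas mixture and the local field):

```python
import numpy as np

def gas_gain(alpha, x_cm):
    """Gain G = exp(integral of alpha dx) for a sampled Townsend coefficient alpha(x)."""
    return np.exp(np.trapz(alpha, x_cm))

# Toy avalanche region: the last 200 um before the anode wire, with the Townsend
# coefficient rising toward the wire (purely illustrative values).
x = np.linspace(0.0, 0.02, 200)          # cm
alpha = np.linspace(0.0, 1000.0, 200)    # 1/cm
print(f"gas gain ~ {gas_gain(alpha, x):.3g}")  # integral = 10 -> G ~ e^10 ~ 2e4
```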

  10. Debris flows: behavior and hazard assessment

    Science.gov (United States)

    Iverson, Richard M.

    2014-01-01

    Debris flows are water-laden masses of soil and fragmented rock that rush down mountainsides, funnel into stream channels, entrain objects in their paths, and form lobate deposits when they spill onto valley floors. Because they have volumetric sediment concentrations that exceed 40 percent, maximum speeds that surpass 10 m/s, and sizes that can range up to ~10⁹ m³, debris flows can denude slopes, bury floodplains, and devastate people and property. Computational models can accurately represent the physics of debris-flow initiation, motion and deposition by simulating evolution of flow mass and momentum while accounting for interactions of debris' solid and fluid constituents. The use of physically based models for hazard forecasting can be limited by imprecise knowledge of initial and boundary conditions and material properties, however. Therefore, empirical methods continue to play an important role in debris-flow hazard assessment.

  11. "Hazardous" terminology

    International Nuclear Information System (INIS)

    Powers, J.

    1991-01-01

    A number of terms (e.g., "hazardous chemicals," "hazardous materials," "hazardous waste," and similar nomenclature) refer to substances that are subject to regulation under one or more federal environmental laws. State laws and regulations also provide additional, similar, or identical terminology that may be confused with the federally defined terms. Many of these terms appear synonymous, and it is easy to use them interchangeably. However, in a regulatory context, inappropriate use of narrowly defined terms can lead to confusion about the substances referred to, the statutory provisions that apply, and the regulatory requirements for compliance under the applicable federal statutes. This Information Brief provides regulatory definitions, a brief discussion of compliance requirements, and references for the precise terminology that should be used when referring to "hazardous" substances regulated under federal environmental laws. A companion CERCLA Information Brief (EH-231-004/0191) addresses "toxic" nomenclature.

  12. FDA-iRISK--a comparative risk assessment system for evaluating and ranking food-hazard pairs: case studies on microbial hazards.

    Science.gov (United States)

    Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret

    2013-03-01

    Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
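
    The core computation described, propagating a hazard level through consumption and a dose-response curve by Monte Carlo simulation to a mean risk per serving, can be sketched generically. All distributions and parameter values below are illustrative placeholders, not iRISK's built-in models:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo iterations, one per simulated serving

# Hazard concentration at consumption (lognormal, CFU/g) and serving size (g).
conc = rng.lognormal(mean=1.0, sigma=1.5, size=n)
serving = rng.normal(25.0, 5.0, size=n).clip(min=1.0)
dose = conc * serving

# Exponential dose-response model: P(illness | dose) = 1 - exp(-r * dose).
r = 1e-6
p_ill = 1.0 - np.exp(-r * dose)
print(f"mean risk of illness per serving: {p_ill.mean():.2e}")

# Impact of an intervention: a 1-log reduction in concentration before consumption.
p_ill_intervention = 1.0 - np.exp(-r * (conc / 10.0) * serving)
print(f"after 1-log reduction:            {p_ill_intervention.mean():.2e}")
```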

  13. Exposure to hazardous workplace noise and use of hearing protection devices among US workers--NHANES, 1999-2004.

    Science.gov (United States)

    Tak, Sangwoo; Davis, Rickie R; Calvert, Geoffrey M

    2009-05-01

    To estimate the prevalence of workplace noise exposure and use of hearing protection devices (HPDs) at noisy work, NIOSH analyzed 1999-2004 data from the National Health and Nutrition Examination Survey (NHANES). A total of 9,275 currently employed workers aged ≥16 years were included in the weighted analysis. Hazardous workplace noise exposure was defined as self-reported exposure to noise at their current job that was so loud that the respondent had to speak in a raised voice to be heard. Industry and occupation were determined based on the respondent's current place and type of work. Twenty-two million US workers (17%) reported exposure to hazardous workplace noise. The weighted prevalence of workplace noise exposure was highest for mining (76%, SE = 7.0) followed by lumber/wood product manufacturing (55%, SE = 2.5). High-risk occupations included repair and maintenance, motor vehicle operators, and construction trades. Overall, 34% of the estimated 22 million US workers reporting hazardous workplace exposure reported non-use of HPDs. The proportion of noise-exposed workers who reported non-use of HPDs was highest for healthcare and social services (73.7%, SE = 8.1), followed by educational services (55.5%). Hearing loss prevention and intervention programs should be targeted at those industries and occupations identified to have a high prevalence of workplace noise exposure and those industries with the highest proportion of noise-exposed workers who reported non-use of HPDs. Published 2009 Wiley-Liss, Inc.

  14. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  15. Lin Receives 2010 Natural Hazards Focus Group Award for Graduate Research

    Science.gov (United States)

    2010-11-01

    Ning Lin has been awarded the Natural Hazards Focus Group Award for Graduate Research, given annually to a recent Ph.D. recipient for outstanding contributions to natural hazards research. Lin's thesis is entitled “Multi-hazard risk analysis related to hurricanes.” She is scheduled to present an invited talk in the Extreme Natural Events: Modeling, Prediction, and Mitigation session (NH20) during the 2010 AGU Fall Meeting, held 13-17 December in San Francisco, Calif. Lin will be formally presented with the award at the Natural Hazards focus group reception on 14 December 2010.

  16. The costs of hazardous alcohol consumption in Germany.

    Science.gov (United States)

    Effertz, Tobias; Verheyen, Frank; Linder, Roland

    2017-07-01

    Hazardous alcohol consumption in Germany is a major threat to health. Using insurance claim data from the German Statutory Health Insurance and a classification strategy based on ICD-10 diagnosis codes, we analyzed a sample of 146,000 subjects, including more than 19,000 hazardous alcohol consumers. Employing different regression models with a control function approach, we calculate life years lost due to alcohol consumption, annual direct and indirect health costs, and the burden of pain and suffering measured by the Charlson Index and assessed pain diagnoses. Additionally, we simulate the net accumulated premium payments over expenses in the German Statutory Health Insurance and the Statutory Pension Fund for hazardous alcohol consumers from a lifecycle perspective. In total, €39.3 billion each year result from hazardous alcohol consumption, with an average loss of 7 years in life expectancy. Hazardous alcohol consumers clearly do not "pay their way" in the two main German social security systems and also display a higher intangible burden according to our definitions of pain and suffering.

  17. Risk Evaluation of Multiple Hazards during Sediment and Water Related Disasters in a Small Basin

    Science.gov (United States)

    Yamanoi, Kazuki; Fujita, Masaharu

    2016-04-01

    To reduce human damage due to sediment and water related disasters induced by heavy rainfall, warning and evacuation system is very important. In Japan, the Meteorological Agency issues the sediment disaster alert when the potential of sediment disaster increases. Following the alert, local government issues evacuation advisory considering the alert and premonitory phenomena. However, it is very difficult for local people to perceive the dangerousness around them because the alert and advisory do not contain any definite information. Therefore, they sometimes misjudge the evacuation action. One reason of this is not only crucial hazards but also relatively small-scale multiple hazards take place and rise evacuation difficulties during sediment and water related disaster. Examples of small-scale hazards include: rainfall-associated hazards such as poor visibility or road submergence; landslide-associated hazards such as slope failure or sediment inflow; and flood-associated hazards such as overtopping of river dike, inundation, or destruction of bridges. The purpose of this study was to estimate the risk of multiple hazards during disaster events by numerical simulation. We applied the integrated sediment runoff model on unit channels, unit slopes, and slope units to an actual sediment and water related disaster occurred in a small basin in Tamba city, Hyogo, Japan. The maximum rainfall per hour was 91 mm (17/09/2014 2:00˜3:00) and the maximum daily precipitation was 414mm. The integrated model contains semi-physical based landslide prediction (sediment production) model, rainfall runoff model employing the kinematic wave method, model of sediment supply to channels, and bedload and suspended sediment transport model. We evaluated the risk of rainfall-associated hazards in each slope unit into 4 levels (Level I ˜ IV) using the rainfall intensity Ir [mm/hour]. The risk of flood- associated hazards were also estimated using the ratio of calculated water level and

  18. Critical asset and portfolio risk analysis: an all-hazards framework.

    Science.gov (United States)

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
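
    The notional all-hazards formula the article starts from, risk as the product of threat likelihood, vulnerability, and consequence, aggregated over asset-hazard pairs, is easy to make concrete. A sketch with invented numbers, not the CAPRA parametrization:

```python
# Each (asset, hazard) pair: annual threat probability, P(success | occurrence),
# and consequence in monetary units. All values are illustrative only.
portfolio = {
    ("substation", "earthquake"): (0.02, 0.6, 50e6),
    ("substation", "attack"):     (0.001, 0.3, 50e6),
    ("pipeline",   "flood"):      (0.05, 0.4, 20e6),
}

def asset_risk(threat, vulnerability, consequence):
    """Expected annual loss for one asset-hazard pair."""
    return threat * vulnerability * consequence

risks = {pair: asset_risk(*params) for pair, params in portfolio.items()}
for pair, r in sorted(risks.items(), key=lambda kv: -kv[1]):
    print(f"{pair}: expected annual loss ${r:,.0f}")
print(f"portfolio total: ${sum(risks.values()):,.0f}")
```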

  19. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    Science.gov (United States)

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affect the pumped water quality after transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations into an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and the subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics, all of which can vary between stakeholders? In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept and mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to the approach chosen for risk aggregation across different hazards, contaminant types, and over time.
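
    The aggregation step, summing the mass-discharge impacts of stochastically occurring spill events at the well and converting threshold exceedances into a risk estimate, can be illustrated generically. The event rates, concentration distributions and quality threshold below are invented placeholders:

```python
import numpy as np

rng = np.random.default_rng(7)
n_years = 20_000  # Monte Carlo sample of hypothetical years

# Hypothetical hazards in the catchment: (annual spill rate, median and log-sd of the
# concentration contribution each spill adds at the well, in mg/L).
hazards = {
    "farm":        (0.5, 0.02, 1.0),
    "gas_station": (0.1, 0.05, 1.5),
    "road":        (1.0, 0.01, 0.8),
}
threshold = 0.1  # illustrative drinking-water limit, mg/L

total = np.zeros(n_years)
for rate, median, sigma in hazards.values():
    n_spills = rng.poisson(rate, n_years)  # stochastic occurrence of spill events
    # Aggregate each year's spills into a concentration contribution at the well.
    total += np.array([rng.lognormal(np.log(median), sigma, k).sum() for k in n_spills])

print(f"P(annual exceedance of {threshold} mg/L): {(total > threshold).mean():.3f}")
```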

  20. Constant Proportion Portfolio Insurance Strategies in Contagious Markets

    DEFF Research Database (Denmark)

    Buccioli, Alice; Kokholm, Thomas

    2018-01-01

    Constant Proportion Portfolio Insurance (CPPI) strategies are popular as they allow one to gear up the upside potential of a stock index while limiting its downside risk. From the issuer's perspective it is important to adequately assess the risks associated with the CPPI, both for correct "gap" fee charging and for risk management. The literature on CPPI modeling typically assumes diffusive or Lévy-driven dynamics for the risky asset underlying the strategy. In either case the self-contagious nature of asset prices is not taken into account. In order to account for contagion while preserving...
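
    The CPPI mechanics themselves are simple to state: at each rebalancing date the risky-asset exposure is a fixed multiple of the cushion (portfolio value minus floor), with the remainder in the safe asset, and a "gap" event occurs when a downward jump pushes the portfolio below the floor between rebalancings. A minimal discrete-time sketch with illustrative parameters:

```python
import numpy as np

def cppi_path(returns, floor0=0.8, multiplier=4.0, rf=0.0):
    """Simulate one CPPI path over per-period risky returns; returns (values, gap_event)."""
    v, floor, gap = 1.0, floor0, False
    values = [v]
    for r in returns:
        cushion = max(v - floor, 0.0)
        exposure = min(multiplier * cushion, v)  # cap at v: no leverage in this sketch
        v = exposure * (1.0 + r) + (v - exposure) * (1.0 + rf)
        floor *= 1.0 + rf
        gap = gap or v < floor                   # a jump pushed the portfolio below the floor
        values.append(v)
    return np.array(values), gap

rng = np.random.default_rng(3)
# Jump-diffusion-like daily returns: Gaussian noise plus rare large downward jumps.
returns = rng.normal(0.0003, 0.01, 252) - 0.30 * rng.binomial(1, 0.01, 252)

path, gap = cppi_path(returns)
print(f"final value: {path[-1]:.3f}, gap event: {gap}")
```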