WorldWideScience

Sample records for additive hazards model

  1. The Additive Hazard Mixing Models

    Institute of Scientific and Technical Information of China (English)

    Ping LI; Xiao-liang LING

    2012-01-01

This paper is concerned with the aging and dependence properties of additive hazard mixing models, including some stochastic comparisons. Further, some useful bounds on reliability functions in additive hazard mixing models are obtained.
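
    As a sketch of the structural fact behind such mixing models (a minimal illustration with hypothetical numbers, not taken from the paper): when a frailty θ enters the hazard additively, λ(t|θ) = λ(t) + θ, the mixture survival function factorizes into the baseline survival times the Laplace transform of the frailty distribution, which is what makes reliability bounds tractable.

```python
import math

# Hypothetical example: constant baseline hazard lam0; the frailty theta
# takes two values with equal probability; conditional hazard is lam0 + theta.
lam0 = 0.5
thetas = [0.2, 0.8]
probs = [0.5, 0.5]

def mixture_survival(t):
    # S(t) = E_theta[ exp(-(lam0 + theta) * t) ]
    return sum(p * math.exp(-(lam0 + th) * t) for p, th in zip(probs, thetas))

def factored_survival(t):
    # Additivity factorizes the mixture: S(t) = exp(-lam0*t) * E[exp(-theta*t)],
    # i.e. baseline survival times the Laplace transform of the frailty.
    laplace = sum(p * math.exp(-th * t) for p, th in zip(probs, thetas))
    return math.exp(-lam0 * t) * laplace

t = 2.0
assert abs(mixture_survival(t) - factored_survival(t)) < 1e-12
# Jensen's inequality gives a simple lower reliability bound:
# S(t) >= exp(-(lam0 + E[theta]) * t), here E[theta] = 0.5.
assert mixture_survival(t) >= math.exp(-(lam0 + 0.5) * t)
```

    The factorization is what reduces bounds on the mixture reliability function to bounds on the frailty's Laplace transform.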

  2. Further Results on Dynamic Additive Hazard Rate Model

    Directory of Open Access Journals (Sweden)

    Zhengcheng Zhang

    2014-01-01

In the past, the proportional and additive hazard rate models have been investigated in the literature. Nanda and Das (2011) introduced and studied the dynamic proportional (reversed) hazard rate model. In this paper we study the dynamic additive hazard rate model and investigate its aging properties for different aging classes. The closure of the model under some stochastic orders is also investigated. Some examples are given to illustrate different aging properties and stochastic comparisons of the model.

  3. Coordinate descent methods for the penalized semiparametric additive hazards model

    DEFF Research Database (Denmark)

    Gorst-Rasmussen, Anders; Scheike, Thomas

    2012-01-01

    For survival data with a large number of explanatory variables, lasso penalized Cox regression is a popular regularization strategy. However, a penalized Cox model may not always provide the best fit to data and can be difficult to estimate in high dimension because of its intrinsic nonlinearity. The semiparametric additive hazards model is a flexible alternative which is a natural survival analogue of the standard linear regression model. Building on this analogy, we develop a cyclic coordinate descent algorithm for fitting the lasso and elastic net penalized additive hazards model. The algorithm requires...

  6. High-dimensional additive hazard models and the Lasso

    CERN Document Server

    Gaïffas, Stéphane

    2011-01-01

    We consider a general high-dimensional additive hazard model in a non-asymptotic setting, including regression for censored data. In this context, we consider a Lasso estimator with a fully data-driven $\ell_1$ penalization, which is tuned for the estimation problem at hand. We prove sharp oracle inequalities for this estimator. Our analysis involves a new "data-driven" Bernstein's inequality, which is of independent interest, where the predictable variation is replaced by the optional variation.

  7. The additive hazards model with high-dimensional regressors

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas

    2009-01-01

    This paper considers estimation and prediction in the Aalen additive hazards model in the case where the covariate vector is high-dimensional, such as gene expression measurements. Some form of dimension reduction of the covariate space is needed to obtain useful statistical analyses. We study the... model. A standard PLS algorithm can also be constructed, but it turns out that the resulting predictor can only be related to the original covariates via time-dependent coefficients. The methods are applied to a breast cancer data set with gene expression recordings and to the well known primary biliary...

  8. An estimating equation for parametric shared frailty models with marginal additive hazards

    DEFF Research Database (Denmark)

    Pipper, Christian Bressen; Martinussen, Torben

    2004-01-01

    Multivariate failure time data arise when data consist of clusters in which the failure times may be dependent. A popular approach to such data is the marginal proportional hazards model with estimation under the working independence assumption. In some contexts, however, it may be more reasonable to use the marginal additive hazards model. We derive asymptotic properties of the Lin and Ying estimators for the marginal additive hazards model for multivariate failure time data. Furthermore, we suggest estimating equations for the regression parameters and association parameters in parametric shared...
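
    The Lin and Ying estimators mentioned above admit a closed form; as a sketch of the standard formulation (notation conventional, not taken from this record):

```latex
% Marginal additive hazards model (Lin & Ying):
\lambda_i(t) = \lambda_0(t) + \beta^\top Z_i(t)
% Estimating function, with \bar{Z}(t) the average covariate among subjects
% at risk, N_i the counting process, and Y_i the at-risk indicator:
U(\beta) = \sum_{i=1}^{n} \int_0^\tau \{Z_i(t) - \bar{Z}(t)\}
           \,\{\mathrm{d}N_i(t) - Y_i(t)\,\beta^\top Z_i(t)\,\mathrm{d}t\}
% Setting U(\beta) = 0 yields the closed-form estimator:
\hat{\beta} = \Big[\sum_{i=1}^{n}\int_0^\tau Y_i(t)\{Z_i(t)-\bar{Z}(t)\}^{\otimes 2}\,\mathrm{d}t\Big]^{-1}
              \sum_{i=1}^{n}\int_0^\tau \{Z_i(t)-\bar{Z}(t)\}\,\mathrm{d}N_i(t)
```

    The closed form, unlike the Cox partial likelihood, requires no iterative optimization, which is one reason the additive model is attractive under working independence.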

  9. Additive Hazard Regression Models: An Application to the Natural History of Human Papillomavirus

    Directory of Open Access Journals (Sweden)

    Xianhong Xie

    2013-01-01

    There are several statistical methods for time-to-event analysis, among which the Cox proportional hazards model is most commonly used. However, when the absolute change in risk, instead of the risk ratio, is of primary interest, or when the proportional hazards assumption for the Cox model is violated, an additive hazard regression model may be more appropriate. In this paper, we give an overview of this approach and then apply a semiparametric as well as a nonparametric additive model to a data set from a study of the natural history of human papillomavirus (HPV) in HIV-positive and HIV-negative women. The results from the semiparametric model indicated on average an additional 14 oncogenic HPV infections per 100 woman-years related to CD4 count < 200 relative to HIV-negative women, and those from the nonparametric additive model showed an additional 40 oncogenic HPV infections per 100 women over 5 years of follow-up, while the estimated hazard ratio in the Cox model was 3.82. Although the Cox model can provide a better understanding of the exposure-disease association, the additive model is often more useful for public health planning and intervention.
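
    The contrast this abstract draws between the absolute and relative scales can be sketched with constant hazards. All numbers below are hypothetical, chosen only to mirror the order of magnitude of the reported results:

```python
# Hypothetical constant hazards per woman-year, for illustration only.
h_unexposed = 0.10          # 10 infections per 100 woman-years
h_exposed   = 0.24          # 24 infections per 100 woman-years

hazard_difference = h_exposed - h_unexposed   # additive scale
hazard_ratio      = h_exposed / h_unexposed   # multiplicative (Cox) scale

# The additive model reports ~14 extra events per 100 woman-years;
# a Cox model would report a ratio of 2.4 for the same data.
assert abs(hazard_difference - 0.14) < 1e-6
assert abs(hazard_ratio - 2.4) < 1e-6

# The additive scale translates directly into planning quantities, e.g.
# expected extra events for a program covering 500 woman-years:
extra_events = hazard_difference * 500
assert abs(extra_events - 70.0) < 1e-4
```

    The last line is the sense in which the abstract calls the additive model "more useful for public health planning": a hazard ratio alone cannot give expected case counts without the baseline.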

  10. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    Science.gov (United States)

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights of measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods.
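
    The attenuation that bias correction and regression calibration address can be illustrated with the classical linear-model case (a simplified sketch under assumed variances, not the additive-hazards derivation of the paper):

```python
# Classical additive measurement error: W = X + U, with U ~ (0, su2)
# independent of X. Regressing Y on the error-prone W attenuates the slope
# by the reliability ratio rho = sx2 / (sx2 + su2); regression calibration
# divides the attenuation back out. Variances here are hypothetical.
sx2, su2 = 1.0, 0.5        # variance of true X and of the error U
beta_true = 2.0            # true slope of Y on X

rho = sx2 / (sx2 + su2)             # reliability ratio
beta_naive = rho * beta_true        # what error-prone data estimate
beta_calibrated = beta_naive / rho  # regression-calibration correction

assert abs(rho - 2 / 3) < 1e-9
assert abs(beta_naive - 4 / 3) < 1e-9       # attenuated toward zero
assert abs(beta_calibrated - beta_true) < 1e-9
```

    Under the additive hazards model the error effects are more subtle than pure attenuation, which is the point of the paper; the sketch only shows the baseline intuition that calibration-type corrections build on.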

  11. Coordinate Descent Methods for the Penalized Semiparametric Additive Hazards Model

    Directory of Open Access Journals (Sweden)

    Anders Gorst-Rasmussen

    2012-04-01

    For survival data with a large number of explanatory variables, lasso penalized Cox regression is a popular regularization strategy. However, a penalized Cox model may not always provide the best fit to data and can be difficult to estimate in high dimension because of its intrinsic nonlinearity. The semiparametric additive hazards model is a flexible alternative which is a natural survival analogue of the standard linear regression model. Building on this analogy, we develop a cyclic coordinate descent algorithm for fitting the lasso and elastic net penalized additive hazards model. The algorithm requires no nonlinear optimization steps and offers excellent performance and stability. An implementation is available in the R package ahaz. We demonstrate this implementation in a small timing study and in an application to real data.
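
    Because the additive hazards model is a survival analogue of linear regression, the penalized estimator can be computed from a quadratic objective. A minimal sketch of cyclic coordinate descent with soft-thresholding on such an objective, with the matrix D and vector d simply given here rather than built from survival data as in the paper:

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator, the lasso coordinate update."""
    return np.sign(z) * max(abs(z) - lam, 0.0)

def cd_lasso_quadratic(D, d, lam, n_iter=200):
    """Cyclic coordinate descent for min_b 0.5*b'Db - d'b + lam*||b||_1.

    The lasso-penalized additive hazards estimator has this least-squares-
    like form; D must be positive definite for convergence.
    """
    p = len(d)
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual for coordinate j (exclude b[j]'s own term)
            r_j = d[j] - D[j] @ b + D[j, j] * b[j]
            b[j] = soft_threshold(r_j, lam) / D[j, j]
    return b

# Sanity check: for D = I the solution is coordinate-wise soft-thresholding.
D = np.eye(3)
d = np.array([2.0, -0.5, 1.0])
b = cd_lasso_quadratic(D, d, lam=0.75)
assert np.allclose(b, [1.25, 0.0, 0.25])
```

    Each coordinate update is closed-form, which is why the abstract can claim the algorithm "requires no nonlinear optimization steps". The elastic net variant only adds a ridge term to the denominator.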

  13. Asymptotics on Semiparametric Analysis of Multivariate Failure Time Data Under the Additive Hazards Model

    Institute of Scientific and Technical Information of China (English)

    Huan-bin Liu; Liu-quan Sun; Li-xing Zhu

    2005-01-01

    Many survival studies record the times to two or more distinct failures on each subject. The failures may be events of different natures or may be repetitions of the same kind of event. In this article, we consider the regression analysis of such multivariate failure time data under the additive hazards model. Simple weighted estimating functions for the regression parameters are proposed, and the asymptotic distribution theory of the resulting estimators is derived. In addition, a class of generalized Wald and generalized score statistics for hypothesis testing and model selection is presented, and the asymptotic properties of these statistics are examined.

  14. On adjustment for auxiliary covariates in additive hazard models for the analysis of randomized experiments

    DEFF Research Database (Denmark)

    Vansteelandt, S.; Martinussen, Torben; Tchetgen, E. J Tchetgen

    2014-01-01

    We consider additive hazard models (Aalen, 1989) for the effect of a randomized treatment on a survival outcome, adjusting for auxiliary baseline covariates. We demonstrate that the Aalen least-squares estimator of the treatment effect parameter is asymptotically unbiased, even when the hazard's dependence on time or on the auxiliary covariates is misspecified, and even away from the null hypothesis of no treatment effect. We furthermore show that adjustment for auxiliary baseline covariates does not change the asymptotic variance of the estimator of the effect of a randomized treatment. We conclude that, in view of its robustness against model misspecification, Aalen least-squares estimation is attractive for evaluating treatment effects on a survival outcome in randomized experiments, and the primary reasons to consider baseline covariate adjustment in such settings could be interest in subgroup...

  15. Analysis of two-phase sampling data with semiparametric additive hazards models.

    Science.gov (United States)

    Sun, Yanqing; Qian, Xiyuan; Shou, Qiong; Gilbert, Peter B

    2017-07-01

    Under the case-cohort design introduced by Prentice (Biometrika 73:1-11, 1986), the covariate histories are ascertained only for the subjects who experience the event of interest (i.e., the cases) during the follow-up period and for a relatively small random sample from the original cohort (i.e., the subcohort). The case-cohort design has been widely used in clinical and epidemiological studies to assess the effects of covariates on failure times. Most statistical methods developed for the case-cohort design use the proportional hazards model, and few methods allow for time-varying regression coefficients. In addition, most methods disregard data from subjects outside of the subcohort, which can result in inefficient inference. Addressing these issues, this paper proposes an estimation procedure for the semiparametric additive hazards model with case-cohort/two-phase sampling data, allowing the covariates of interest to be missing for cases as well as for non-cases. A more flexible form of the additive model is considered that allows the effects of some covariates to be time varying while specifying the effects of others to be constant. An augmented inverse probability weighted estimation procedure is proposed. The proposed method allows utilizing the auxiliary information that correlates with the phase-two covariates to improve efficiency. The asymptotic properties of the proposed estimators are established. An extensive simulation study shows that the augmented inverse probability weighted estimation is more efficient than the widely adopted inverse probability weighted complete-case estimation method. The method is applied to analyze data from a preventive HIV vaccine efficacy trial.
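
    The inverse probability weighting at the heart of such two-phase estimation can be sketched with a toy sample (all values hypothetical): weighting phase-two observations by the inverse of their known inclusion probabilities recovers full-cohort quantities without measuring the expensive covariate on everyone.

```python
# Toy two-phase (case-cohort-like) sample: phase one enrolls 6 subjects;
# the expensive covariate x is measured at phase two only for a subsample,
# with known inclusion probabilities pi (cases kept with probability 1).
x  = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0]   # phase-two covariate (hypothetical)
pi = [1.0, 0.5, 1.0, 0.5, 1.0, 1.0]   # phase-two inclusion probabilities
sampled = [True, True, True, False, True, True]  # subject 4 not sampled

full_total = sum(x)  # would require measuring x on everyone
# Horvitz-Thompson: weight each sampled subject by 1 / pi
ipw_total = sum(xi / p for xi, p, s in zip(x, pi, sampled) if s)

assert full_total == 23.0
# The weighted total is unbiased in expectation; in this constructed draw
# it matches exactly because the sampled pi=0.5 subject (weight 2) stands
# in for itself and the unsampled subject with the same x.
assert ipw_total == 23.0
```

    The "augmented" estimator in the abstract improves on this plain weighting by adding a term built from auxiliary variables observed on everyone, which reduces variance without introducing bias.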

  16. Estimation of direct effects for survival data by using the Aalen additive hazards model

    DEFF Research Database (Denmark)

    Martinussen, T.; Vansteelandt, S.; Gerster, M.

    2011-01-01

    We extend the definition of the controlled direct effect of a point exposure on a survival outcome, other than through some given, time-fixed intermediate variable, to the additive hazard scale. We propose two-stage estimators for this effect when the exposure is dichotomous and randomly assigned...

  17. Statistical inference for the additive hazards model under outcome-dependent sampling.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo

    2015-09-01

    Cost-effective study designs and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters under the proposed design and derive the asymptotic properties of the proposed estimator. We also provide suggestions for using the proposed method by evaluating its relative efficiency against the simple random sampling design, and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and that the proposed estimator is more efficient than other estimators. We apply our method to a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the cancer risk associated with radon exposure.

  18. Education and risk of coronary heart disease: assessment of mediation by behavioral risk factors using the additive hazards model.

    Science.gov (United States)

    Nordahl, Helene; Rod, Naja Hulvej; Frederiksen, Birgitte Lidegaard; Andersen, Ingelise; Lange, Theis; Diderichsen, Finn; Prescott, Eva; Overvad, Kim; Osler, Merete

    2013-02-01

    Education-related gradients in coronary heart disease (CHD) and mediation by behavioral risk factors are plausible given previous research; however, this has not been comprehensively addressed in absolute measures. Questionnaire data on the health behavior of 69,513 participants (52% women) from seven Danish cohort studies were linked to registry data on education and incidence of CHD. Mediation by smoking, low physical activity, and body mass index (BMI) of the association between education and CHD was estimated by applying newly proposed methods for mediation based on the additive hazards model, and compared with results from the Cox proportional hazards model. Short (vs. long) education was associated with 277 (95% CI: 219, 336) additional cases of CHD per 100,000 person-years at risk among women, and 461 (95% CI: 368, 555) additional cases among men. Of these additional cases, 17 (95% CI: 12, 22) for women and 37 (95% CI: 28, 46) for men could be ascribed to the pathway through smoking. Further, 39 (95% CI: 30, 49) cases for women and 94 (95% CI: 79, 110) cases for men could be ascribed to the pathway through BMI. The effects of low physical activity were negligible. Using these contemporary mediation methods based on the additive hazards model, we estimated the absolute number of CHD cases prevented when modifying smoking and BMI. This study confirms previous claims based on the Cox proportional hazards model that behavioral risk factors partially mediate the effect of education on CHD, and the results seem not to be particularly model dependent.
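
    The absolute-scale decomposition reported above lends itself to a direct computation of the proportion mediated; the inputs below are the point estimates of the rate differences quoted in the abstract (per 100,000 person-years), and the computation itself is only an illustrative summary:

```python
# Rate differences (additional CHD cases per 100,000 person-years,
# short vs. long education) as reported in the abstract.
women = {"total": 277, "smoking": 17, "bmi": 39}
men   = {"total": 461, "smoking": 37, "bmi": 94}

def pct_mediated(d):
    # Share of the total educational gradient running through the
    # smoking and BMI pathways, on the additive scale.
    return 100 * (d["smoking"] + d["bmi"]) / d["total"]

# Roughly a fifth (women) to over a quarter (men) of the gradient is
# mediated by smoking and BMI combined.
assert round(pct_mediated(women), 1) == 20.2
assert round(pct_mediated(men), 1) == 28.4
```

    On the additive scale these pathway-specific case counts simply sum, which is exactly the interpretability advantage over ratio-scale mediation measures.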

  19. Predicting the Survival Time for Bladder Cancer Using an Additive Hazards Model in Microarray Data

    Directory of Open Access Journals (Sweden)

    Leili TAPAK

    2016-02-01

    Background: One substantial part of microarray studies is to predict patients' survival based on their gene expression profiles. Variable selection techniques are powerful tools to handle high dimensionality in the analysis of microarray data. However, these techniques have not been investigated in the competing risks setting. This study aimed to investigate the performance of four sparse variable selection methods in estimating the survival time. Methods: The data included 1381 gene expression measurements and clinical information from 301 patients with bladder cancer operated on in the years 1987 to 2000 in hospitals in Denmark, Sweden, Spain, France, and England. Four methods, the least absolute shrinkage and selection operator, smoothly clipped absolute deviation, the smooth integration of counting and absolute deviation, and the elastic net, were utilized for simultaneous variable selection and estimation under an additive hazards model. The criteria of area under the ROC curve, Brier score, and c-index were used to compare the methods. Results: The median follow-up time for all patients was 47 months. The elastic net approach was indicated to outperform the other methods. The elastic net had the lowest integrated Brier score (0.137±0.07) and the greatest median over-time AUC and c-index (0.803±0.06 and 0.779±0.13, respectively). Five out of the 19 genes selected by the elastic net were significant (P<0.05) under an additive hazards model. It was indicated that the expression of RTN4, SON, IGF1R and CDC20 decreases the survival time, while the expression of SMARCAD1 increases it. Conclusion: The elastic net had higher capability than the other methods for the prediction of survival time in patients with bladder cancer in the presence of competing risks based on the additive hazards model. Keywords: Survival analysis, Microarray data, Additive hazards model, Variable selection, Bladder cancer

  20. Credit Contagion Default Aalen Additive Hazard Model

    Institute of Scientific and Technical Information of China (English)

    田军; 周勇

    2012-01-01

    In this paper we consider credit risk default models based on the additive hazard model. Not only do we incorporate macroeconomic and firm-specific conditions, but by introducing industry-specific covariates we also characterize credit contagion between industries, thereby overcoming the underestimation of default correlation in earlier models. We provide maximum likelihood estimators and their asymptotic properties for the parametric additive hazard model. Two estimation methods are considered and compared, and the optimal weight estimator is found to be more efficient. The paper also considers a semiparametric additive hazard model, under which we provide estimators and their asymptotic properties based on martingale estimating equations. Simulation studies show good performance of both approaches.

  1. The Additive-multiplicative Hazards Model for Multiple Types of Recurrent Gap Times

    Institute of Scientific and Technical Information of China (English)

    Zhang Qi-xian; Liu Ji-cai; Guan Qiang

    2015-01-01

    Recurrent event gap times data frequently arise in biomedical studies, and often more than one type of event is of interest. To evaluate the effects of covariates on the marginal recurrent event hazard functions, there exist two types of hazards models: the multiplicative hazards model and the additive hazards model. In this paper, we propose a more flexible additive-multiplicative hazards model for multiple types of recurrent gap time data, wherein some covariates are assumed to be additive while others are multiplicative. An estimating equation approach is presented to estimate the regression parameters. We establish asymptotic properties of the proposed estimators.

  2. Identifying and modeling safety hazards

    Energy Technology Data Exchange (ETDEWEB)

    DANIELS,JESSE; BAHILL,TERRY; WERNER,PAUL W.

    2000-03-29

    The hazard model described in this paper is designed to accept data over the Internet from distributed databases. A hazard object template is used to ensure that all necessary descriptors are collected for each object. Three methods for combining the data are compared and contrasted. Three methods are used for handling the three types of interactions between the hazard objects.

  3. Computer Model Locates Environmental Hazards

    Science.gov (United States)

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.

  4. Models of volcanic eruption hazards

    Energy Technology Data Exchange (ETDEWEB)

    Wohletz, K.H.

    1992-01-01

    Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.

  5. Crossing Hazard Functions in Common Survival Models.

    Science.gov (United States)

    Zhang, Jiajia; Peng, Yingwei

    2009-10-15

    Crossing hazard functions have extensive applications in modeling survival data. However, existing studies in the literature mainly focus on comparing crossed hazard functions and estimating the time at which the hazard functions cross, and there is little theoretical work on conditions under which hazard functions from a model will have a crossing. In this paper, we investigate crossing status of hazard functions from the proportional hazards (PH) model, the accelerated hazard (AH) model, and the accelerated failure time (AFT) model. We provide and prove conditions under which the hazard functions from the AH and the AFT models have no crossings or a single crossing. A few examples are also provided to demonstrate how the conditions can be used to determine crossing status of hazard functions from the three models.
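
    The single-crossing behavior discussed for AFT-type models can be checked in the Weibull case, where the crossing time has a closed form (parameter values below are hypothetical; under the PH model, by contrast, hazards are proportional and never cross):

```python
import math

# Two Weibull hazards, h(t) = (k/lam) * (t/lam)**(k-1): one decreasing
# (k < 1), one increasing (k > 1). Such AFT-type hazards cross exactly once.
def weibull_hazard(t, k, lam=1.0):
    return (k / lam) * (t / lam) ** (k - 1)

k1, k2 = 0.5, 2.0
# With lam = 1, setting h1(t) = h2(t) gives t**(k2 - k1) = k1/k2,
# so the crossing time is:
t_cross = (k1 / k2) ** (1.0 / (k2 - k1))   # = 0.25**(2/3), about 0.397

assert abs(weibull_hazard(t_cross, k1) - weibull_hazard(t_cross, k2)) < 1e-10
# Before the crossing the decreasing hazard dominates; after it, the
# increasing one does:
assert weibull_hazard(0.1, k1) > weibull_hazard(0.1, k2)
assert weibull_hazard(1.0, k1) < weibull_hazard(1.0, k2)
```

    Conditions such as those in the paper characterize when an analogous unique crossing exists for general AH and AFT specifications, not just the Weibull family.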

  6. Modeling lahar behavior and hazards

    Science.gov (United States)

    Manville, Vernon; Major, Jon J.; Fagents, Sarah A.

    2013-01-01

    Lahars are highly mobile mixtures of water and sediment of volcanic origin that are capable of traveling tens to > 100 km at speeds exceeding tens of km/hr. Such flows are among the most serious ground-based hazards at many volcanoes because of their sudden onset, rapid advance rates, long runout distances, high energy, ability to transport large volumes of material, and tendency to flow along existing river channels where populations and infrastructure are commonly concentrated. They can grow in volume and peak discharge through erosion and incorporation of external sediment and/or water, inundate broad areas, and leave deposits many meters thick. Furthermore, lahars can recur for many years to decades after an initial volcanic eruption, as fresh pyroclastic material is eroded and redeposited during rainfall events, resulting in a spatially and temporally evolving hazard. Improving understanding of the behavior of these complex, gravitationally driven, multi-phase flows is key to mitigating the threat to communities at lahar-prone volcanoes. However, their complexity and evolving nature pose significant challenges to developing the models of flow behavior required for delineating their hazards and hazard zones.

  7. Applying the additive hazards model to predict the survival time of patients with diffuse large B-cell lymphoma and determine the effective genes, using microarray data

    Directory of Open Access Journals (Sweden)

    Arefa Jafarzadeh Kohneloo

    2015-09-01

    Background: Recent studies have shown that genes affecting the survival time of cancer patients play an important role as risk factors or preventive factors. The present study was designed to determine genes affecting the survival time of patients with diffuse large B-cell lymphoma and to predict the survival time using these selected genes. Materials & Methods: This cohort study was conducted on 40 patients with diffuse large B-cell lymphoma. For these patients, the expression of 2042 genes was measured. In order to predict the survival time, the semiparametric additive survival model was combined with two gene selection methods, elastic net and lasso. The two methods were evaluated by plotting the area under the ROC curve over time and calculating the integral of this curve. Results: Based on our findings, the elastic net method identified 10 genes, and the lasso-Cox method identified 7 genes. GENE3325X increased the survival time (P=0.006), whereas GENE3980X and GENE377X reduced the survival time (P=0.004). These three genes were selected as important genes by both methods. Conclusion: This study showed that the elastic net method outperformed the common lasso method in terms of predictive power. Moreover, applying the additive model instead of Cox regression with microarray data is a usable way to predict the survival time of patients.

  8. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We illustrate the flexibility of the hazard modeling distribution, which approaches different distributions under limiting conditions.

  9. Satellite image collection modeling for large area hazard emergency response

    Science.gov (United States)

    Liu, Shufan; Hodgson, Michael E.

    2016-08-01

    Timely collection of critical hazard information is the key to intelligent and effective hazard emergency response decisions. Satellite remote sensing imagery provides an effective way to collect critical information. Natural hazards, however, often have large impact areas, larger than a single satellite scene. Additionally, the hazard impact area may be discontinuous, particularly in flooding or tornado hazard events. In this paper, a spatial optimization model is proposed to solve the large area satellite image acquisition planning problem in the context of hazard emergency response. In the model, a large hazard impact area is represented as multiple polygons, and image collection priorities for different portions of the impact area are addressed. The optimization problem is solved with an exact algorithm. Application results demonstrate that the proposed method can address the satellite image acquisition planning problem. A spatial decision support system supporting the optimization model was developed. Several examples of image acquisition problems are used to demonstrate the complexity of the problem and derive optimized solutions.

  10. The Additive Hazards Model for Multiple Type Recurrent Gap Times

    Institute of Scientific and Technical Information of China (English)

    刘吉彩; 张日权; 刘焕彬

    2013-01-01

    In many biomedical and engineering studies, recurrent event data and gap times between successive events are common, and often more than one type of recurrent event is of interest. It is well known that the proportional hazards model may not be appropriate for fitting survival times in some settings. In this paper, we consider an additive hazards model for multiple types of recurrent gap time data to assess the effect of covariates. For inferences about regression coefficients and baseline cumulative hazard functions, an estimating equation approach is developed. Furthermore, we establish asymptotic properties of the proposed estimators.

  11. Additive hazards regression and partial likelihood estimation for ecological monitoring data across space.

    Science.gov (United States)

    Lin, Feng-Chang; Zhu, Jun

    2012-01-01

    We develop continuous-time models for the analysis of environmental or ecological monitoring data such that subjects are observed at multiple monitoring time points across space. Of particular interest are additive hazards regression models where the baseline hazard function can take on flexible forms. We consider time-varying covariates and take into account spatial dependence via autoregression in space and time. We develop statistical inference for the regression coefficients via partial likelihood. Asymptotic properties, including consistency and asymptotic normality, are established for parameter estimates under suitable regularity conditions. Feasible algorithms utilizing existing statistical software packages are developed for computation. We also consider a simpler additive hazards model with homogeneous baseline hazard and develop hypothesis testing for homogeneity. A simulation study demonstrates that the statistical inference using partial likelihood has sound finite-sample properties and offers a viable alternative to maximum likelihood estimation. For illustration, we analyze data from an ecological study that monitors bark beetle colonization of red pines in a plantation of Wisconsin.

  12. Mixed additive models

    Science.gov (United States)

    Carvalho, Francisco; Covas, Ricardo

    2016-06-01

    We consider mixed models y = Σ_{i=0}^w X_i β_i with V(y) = Σ_{i=1}^w θ_i M_i, where M_i = X_i X_i^T, i = 1, ..., w, and µ = X_0 β_0. For these we estimate the variance components θ_1, ..., θ_w, as well as estimable vectors, through the decomposition of the initial model into sub-models y(h), h ∈ Γ, with V(y(h)) = γ(h) I_{g(h)}, h ∈ Γ. Moreover, we consider L extensions of these models, i.e., y° = Ly + ε, where L = D(1_{n_1}, ..., 1_{n_w}) and ε, independent of y, has null mean vector and variance-covariance matrix θ_{w+1} I_n, with n = Σ_{i=1}^w n_i.

  13. Development of Additional Hazard Assessment Models

    Science.gov (United States)

    1977-03-01

    the case of continuous release of liquid, the pool reaches a maximum radius for constant liquid regression rate. Therefore, when a fire results on an...Propylene", Ind. J. Tech. 5, 81 (1967). 11. Kobe, K. A. and Long, E. G., "Thermochemistry for the Petrochemical Industry, Part III - Monoolefinic

  14. Parametric Regression Models Using Reversed Hazard Rates

    Directory of Open Access Journals (Sweden)

    Asokan Mulayath Variyath

    2014-01-01

    Full Text Available Proportional hazard regression models are widely used in survival analysis to understand and exploit the relationship between survival time and covariates. For left-censored survival times, reversed hazard rate functions are more appropriate. In this paper, we develop a parametric proportional reversed hazard rates model using an inverted Weibull distribution. The estimation and construction of confidence intervals for the parameters are discussed. We assess the performance of the proposed procedure using a large number of Monte Carlo simulations. We illustrate the proposed method using a real case example.
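
For the inverted Weibull distribution with F(t) = exp(−(λ/t)^α), the reversed hazard rate r(t) = f(t)/F(t) = d log F(t)/dt has the simple closed form α λ^α t^−(α+1). A quick numerical check with illustrative parameter values (not those of the paper):

```python
import math

def inv_weibull_cdf(t, alpha, lam):
    """CDF of the inverted Weibull: F(t) = exp(-(lam/t)**alpha), t > 0."""
    return math.exp(-((lam / t) ** alpha))

def reversed_hazard_closed(t, alpha, lam):
    """r(t) = f(t)/F(t) = alpha * lam**alpha * t**(-(alpha + 1))."""
    return alpha * lam ** alpha * t ** (-(alpha + 1))

def reversed_hazard_numeric(t, alpha, lam, h=1e-6):
    """r(t) = d/dt log F(t), approximated by a central difference."""
    return (math.log(inv_weibull_cdf(t + h, alpha, lam))
            - math.log(inv_weibull_cdf(t - h, alpha, lam))) / (2 * h)

alpha, lam = 2.0, 1.5
r_closed = reversed_hazard_closed(2.0, alpha, lam)
r_num = reversed_hazard_numeric(2.0, alpha, lam)
```

The two agree to numerical precision, confirming the closed form.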

  15. POTENTIAL HAZARDS DUE TO FOOD ADDITIVES IN ORAL HYGIENE PRODUCTS

    Directory of Open Access Journals (Sweden)

    Damla TUNCER-BUDANUR

    2016-04-01

    Full Text Available Food additives used to preserve flavor or to enhance the taste and appearance of foods are also present in oral hygiene products. The aim of this review is to provide information concerning food additives in oral hygiene products and their adverse effects. Many of the food additives in oral hygiene products are potential allergens and may lead to allergic reactions such as urticaria, contact dermatitis, rhinitis, and angioedema. Dental practitioners, as well as health care providers, must be aware of the possibility of allergic reactions due to food additives in oral hygiene products. Proper dosage levels, delivery vehicles, frequency, potential benefits, and adverse effects of oral health products should be explained completely to the patients. There is a need to raise awareness among dental professionals on this subject and to develop a data-gathering system for possible adverse reactions.

  16. Physical vulnerability modelling in natural hazard risk assessment

    Science.gov (United States)

    Douglas, J.

    2007-04-01

    An evaluation of the risk to an exposed element from a hazardous event requires a consideration of the element's vulnerability, which expresses its propensity to suffer damage. This concept allows the assessed level of hazard to be translated to an estimated level of risk and is often used to evaluate the risk from earthquakes and cyclones. However, for other natural perils, such as mass movements, coastal erosion and volcanoes, the incorporation of vulnerability within risk assessment is not well established and consequently quantitative risk estimations are not often made. This impedes the study of the relative contributions from different hazards to the overall risk at a site. Physical vulnerability is poorly modelled for many reasons: the cause of human casualties (from the event itself rather than by building damage); lack of observational data on the hazard, the elements at risk and the induced damage; the complexity of the structural damage mechanisms; the temporal and geographical scales; and the ability to modify the hazard level. Many of these causes are related to the nature of the peril, so for some hazards, such as coastal erosion, the benefits of considering an element's physical vulnerability may be limited. However, for hazards such as volcanoes and mass movements the modelling of vulnerability should be improved by, for example, following the efforts made in earthquake risk assessment: additional observational data on induced building damage and the hazardous event should be routinely collected and correlated, and numerical modelling of building behaviour during a damaging event should be attempted.

  17. Hazard Warning: model misuse ahead

    DEFF Research Database (Denmark)

    Dickey-Collas, M.; Payne, Mark; Trenkel, V.

    2014-01-01

    The use of modelling approaches in marine science, and in particular fisheries science, is explored. We highlight that the choice of model used for an analysis should account for the question being posed or the context of the management problem. We examine a model-classification scheme based......-users (i.e. managers or policy developers). The combination of attributes leads to models that are considered to have empirical, mechanistic, or analytical characteristics, but not a combination of them. In fisheries science, many examples can be found of models with these characteristics. However, we...

  18. A conflict model for the international hazardous waste disposal dispute

    Energy Technology Data Exchange (ETDEWEB)

    Hu Kaixian, E-mail: k2hu@engmail.uwaterloo.ca [Department of Systems Design Engineering, University of Waterloo, 200 University Avenue West, Waterloo, Ontario, N2L 3G1 (Canada); Hipel, Keith W., E-mail: kwhipel@uwaterloo.ca [Department of Systems Design Engineering, University of Waterloo, 200 University Avenue West, Waterloo, Ontario, N2L 3G1 (Canada); Fang, Liping, E-mail: lfang@ryerson.ca [Department of Mechanical and Industrial Engineering, Ryerson University, 350 Victoria Street, Toronto, Ontario, M5B 2K3 (Canada)

    2009-12-15

    A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.

  19. Virtual Research Environments for Natural Hazard Modelling

    Science.gov (United States)

    Napier, Hazel; Aldridge, Tim

    2017-04-01

    The Natural Hazards Partnership (NHP) is a group of 17 collaborating public sector organisations providing a mechanism for co-ordinated advice to government and agencies responsible for civil contingency and emergency response during natural hazard events. The NHP has set up a Hazard Impact Model (HIM) group tasked with modelling the impact of a range of UK hazards with the aim of delivery of consistent hazard and impact information. The HIM group consists of 7 partners initially concentrating on modelling the socio-economic impact of 3 key hazards - surface water flooding, land instability and high winds. HIM group partners share scientific expertise and data within their specific areas of interest including hydrological modelling, meteorology, engineering geology, GIS, data delivery, and modelling of socio-economic impacts. Activity within the NHP relies on effective collaboration between partners distributed across the UK. The NHP are acting as a use case study for a new Virtual Research Environment (VRE) being developed by the EVER-EST project (European Virtual Environment for Research - Earth Science Themes: a solution). The VRE is allowing the NHP to explore novel ways of cooperation including improved capabilities for e-collaboration, e-research, automation of processes and e-learning. Collaboration tools are complemented by the adoption of Research Objects, semantically rich aggregations of resources enabling the creation of uniquely identified digital artefacts resulting in reusable science and research. Application of the Research Object concept to HIM development facilitates collaboration by encapsulating scientific knowledge in a shareable format that can be easily shared and used by partners working on the same model but within their areas of expertise. This paper describes the application of the VRE to the NHP use case study. It outlines the challenges associated with distributed partnership working and how they are being addressed in the VRE. A case

  20. Two models for evaluating landslide hazards

    Science.gov (United States)

    Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.

    2006-01-01

    Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independent assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards. © 2006.
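
The empirical likelihood ratio model scores each map cell by the ratio P(predictor bin | landslide) / P(predictor bin | stable), and the conditional-independence assumption lets per-predictor ratios be multiplied. A toy sketch with hypothetical binned grids (not the authors' Kansas data; all bin labels are made up):

```python
from collections import Counter

def likelihood_ratios(values, labels):
    """Per-bin likelihood ratio P(bin | landslide) / P(bin | stable)
    for one categorical predictor grid (e.g. binned slope angle)."""
    slide = Counter(v for v, y in zip(values, labels) if y == 1)
    stable = Counter(v for v, y in zip(values, labels) if y == 0)
    n1, n0 = sum(slide.values()), sum(stable.values())
    return {b: (slide[b] / n1) / (stable[b] / n0)
            for b in set(slide) | set(stable)
            if slide[b] and stable[b]}          # skip bins seen in only one class

# two hypothetical predictors: binned slope class and lithology code
slope = ["steep", "steep", "gentle", "gentle", "steep", "gentle", "gentle", "steep"]
litho = ["shale", "shale", "shale", "sand", "sand", "sand", "sand", "shale"]
label = [1, 1, 0, 0, 1, 0, 0, 0]               # 1 = landslide cell, 0 = stable

lr_slope = likelihood_ratios(slope, label)
lr_litho = likelihood_ratios(litho, label)
# under conditional independence, a cell's relative hazard is the product
cell_lr = lr_slope["steep"] * lr_litho["shale"]
```

A cell that is both steep and on shale ends up with a likelihood ratio well above 1, i.e. relatively hazardous.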

  1. Business models for additive manufacturing

    DEFF Research Database (Denmark)

    Hadar, Ronen; Bilberg, Arne; Bogers, Marcel

    2015-01-01

    Digital fabrication — including additive manufacturing (AM), rapid prototyping and 3D printing — has the potential to revolutionize the way in which products are produced and delivered to the customer. Therefore, it challenges companies to reinvent their business model — describing the logic...... of creating and capturing value. In this paper, we explore the implications that AM technologies have for manufacturing systems in the new business models that they enable. In particular, we consider how a consumer goods manufacturer can organize the operations of a more open business model when moving from...... a manufacturer-centric to a consumer-centric value logic. A major shift includes a move from centralized to decentralized supply chains, where consumer goods manufacturers can implement a “hybrid” approach with a focus on localization and accessibility or develop a fully personalized model where the consumer...

  2. Proportional hazards models with discrete frailty.

    Science.gov (United States)

    Caroni, Chrys; Crowder, Martin; Kimber, Alan

    2010-07-01

    We extend proportional hazards frailty models for lifetime data to allow a negative binomial, Poisson, Geometric or other discrete distribution of the frailty variable. This might represent, for example, the unknown number of flaws in an item under test. Zero frailty corresponds to a limited failure model containing a proportion of units that never fail (long-term survivors). Ways of modifying the model to avoid this are discussed. The models are illustrated on a previously published set of data on failures of printed circuit boards and on new data on breaking strengths of samples of cord.
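
With a discrete frailty Z multiplying the cumulative hazard, the marginal survivor function is the probability generating function of Z evaluated at exp(−Λ0(t)); for Poisson(μ) frailty this gives S(t) = exp(−μ(1 − e^(−Λ0(t)))), which tends to the long-term-survivor fraction P(Z = 0) = e^(−μ). A minimal sketch with a constant baseline hazard and illustrative values:

```python
import math

def marginal_survival_poisson(t, mu, lam0):
    """S(t) = E[exp(-Z * Lambda0(t))] with Z ~ Poisson(mu) frailty and
    constant baseline hazard lam0 (so Lambda0(t) = lam0 * t), evaluated
    via the Poisson probability generating function G(s) = exp(mu*(s-1))."""
    s = math.exp(-lam0 * t)
    return math.exp(mu * (s - 1.0))

mu, lam0 = 1.2, 0.5
cure_fraction = math.exp(-mu)                    # P(Z = 0): units that never fail
s_late = marginal_survival_poisson(100.0, mu, lam0)
```

As t grows, the marginal survival flattens at the cure fraction, illustrating the limited-failure behaviour mentioned in the abstract.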

  3. TsuPy: Computational robustness in Tsunami hazard modelling

    Science.gov (United States)

    Schäfer, Andreas M.; Wenzel, Friedemann

    2017-05-01

    Modelling wave propagation is the most essential part in assessing the risk and hazard of tsunami and storm surge events. Computational assessment of the variability of such events requires many simulations, which even today are generally run on supercomputers because of the large amount of computation involved. In this study, a simulation framework named TsuPy is introduced to quickly compute tsunami events on a personal computer. It uses the parallelized power of GPUs to accelerate computation. The system is tailored to the application of robust tsunami hazard and risk modelling. It links up to geophysical models to simulate event sources. The system is tested and validated using various benchmarks and real-world case studies. In addition, the robustness criterion is assessed based on a sensitivity study comparing the error impact of various model elements, e.g. topo-bathymetric resolution, knowledge of Manning friction parameters, and knowledge of the tsunami source itself. This sensitivity study is applied to inundation modelling of the 2011 Tohoku tsunami, showing that the major contributor to model uncertainty is in fact the representation of earthquake slip as part of the tsunami source profile. TsuPy provides a fast and reliable tool to quickly assess ocean hazards from tsunamis and thus builds the foundation for a globally uniform hazard and risk assessment for tsunamis.

  4. Experimental Concepts for Testing Seismic Hazard Models

    Science.gov (United States)

    Marzocchi, W.; Jordan, T. H.

    2015-12-01

    Seismic hazard analysis is the primary interface through which useful information about earthquake rupture and wave propagation is delivered to society. To account for the randomness (aleatory variability) and limited knowledge (epistemic uncertainty) of these natural processes, seismologists must formulate and test hazard models using the concepts of probability. In this presentation, we will address the scientific objections that have been raised over the years against probabilistic seismic hazard analysis (PSHA). Owing to the paucity of observations, we must rely on expert opinion to quantify the epistemic uncertainties of PSHA models (e.g., in the weighting of individual models from logic-tree ensembles of plausible models). The main theoretical issue is a frequentist critique: subjectivity is immeasurable; ergo, PSHA models cannot be objectively tested against data; ergo, they are fundamentally unscientific. We have argued (PNAS, 111, 11973-11978) that the Bayesian subjectivity required for casting epistemic uncertainties can be bridged with the frequentist objectivity needed for pure significance testing through "experimental concepts." An experimental concept specifies collections of data, observed and not yet observed, that are judged to be exchangeable (i.e., with a joint distribution independent of the data ordering) when conditioned on a set of explanatory variables. We illustrate, through concrete examples, experimental concepts useful in the testing of PSHA models for ontological errors in the presence of aleatory variability and epistemic uncertainty. In particular, we describe experimental concepts that lead to exchangeable binary sequences that are statistically independent but not identically distributed, showing how the Bayesian concept of exchangeability generalizes the frequentist concept of experimental repeatability. We also address the issue of testing PSHA models using spatially correlated data.

  5. Ground-level ozone following astrophysical ionizing radiation events: an additional biological hazard?

    CERN Document Server

    Thomas, Brian C

    2015-01-01

    Astrophysical ionizing radiation events such as supernovae, gamma-ray bursts, and solar proton events have been recognized as a potential threat to life on Earth, primarily through depletion of stratospheric ozone and subsequent increase in solar UV radiation at Earth's surface and in the upper levels of the ocean. Other work has also considered the potential impact of nitric acid rainout, concluding that no significant threat is likely. Not yet studied to date is the potential impact of ozone produced in the lower atmosphere following an ionizing radiation event. Ozone is a known irritant to organisms on land and in water and therefore may be a significant additional hazard. Using previously completed atmospheric chemistry modeling we have examined the amount of ozone produced in the lower atmosphere for the case of a gamma-ray burst and find that the values are too small to pose a significant additional threat to the biosphere. These results may be extended to other ionizing radiation events, including supe...

  6. Integrated Modeling for Flood Hazard Mapping Using Watershed Modeling System

    Directory of Open Access Journals (Sweden)

    Seyedeh S. Sadrolashrafi

    2008-01-01

    Full Text Available In this study, a new framework that integrates the Geographic Information System (GIS) with the Watershed Modeling System (WMS) for flood modeling is developed. It also interconnects the terrain models and the GIS software with standard commercial hydrological and hydraulic models, including HEC-1, HEC-RAS, etc. The Dez River Basin (about 16,213 km2) in Khuzestan Province, Iran, is the study domain because it experiences frequent severe flash flooding. As a case study, a major flood in the autumn of 2001 is chosen to examine the modeling framework. The model consists of a rainfall-runoff model (HEC-1) that converts excess precipitation to overland flow and channel runoff, and a hydraulic model (HEC-RAS) that simulates steady-state flow through the river channel network based on the HEC-1 peak hydrographs. In addition, it delineates maps of potential flood zonation for the Dez River Basin. These are achieved with state-of-the-art GIS using the WMS software. Watershed parameters are calibrated manually to produce a good simulation of discharge at three sub-basins. With the calibrated discharge, WMS is capable of producing a flood hazard map. The modeling framework presented in this study demonstrates the accuracy and usefulness of the WMS software for flash flood control. The results of this research will benefit future modeling efforts by providing validated hydrological software to forecast flooding on a regional scale. Although the model was designed for the Dez River Basin, it may be used as a prototype for model applications in other areas.
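
HEC-1-style rainfall-runoff models offer the SCS curve-number method as one loss-rate option for converting precipitation to direct runoff. A sketch of that standard formula (depths in inches; CN is the curve number), independent of the Dez Basin application:

```python
def scs_runoff(p_in, cn):
    """SCS curve-number direct runoff (inches) for storm depth p_in,
    one loss-rate option offered by HEC-1-style rainfall-runoff models.
    Q = (P - 0.2*S)**2 / (P + 0.8*S) for P > 0.2*S, else 0,
    with potential maximum retention S = 1000/CN - 10."""
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s                     # initial abstraction
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)
```

For CN = 100 (fully impervious) all precipitation becomes runoff; lower curve numbers retain part of the storm depth.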

  7. SCIPUFF - a generalized hazard dispersion model

    Energy Technology Data Exchange (ETDEWEB)

    Sykes, R.I.; Henn, D.S.; Parker, S.F.; Gabruk, R.S. [Titan Research and Technology, Princeton, NJ (United States)

    1996-12-31

    One of the more popular techniques for efficiently representing the dispersion process is the Gaussian puff model, which uses a collection of Lagrangian puffs with Gaussian concentration profiles. SCIPUFF (Second-order Closure Integrated Puff) is an advanced Gaussian puff model that uses second-order turbulence closure techniques to relate the dispersion rates to measurable turbulent velocity statistics, providing a wide range of applicability. In addition, the closure model provides a prediction of the statistical variance in the concentration field, which can be used to estimate the uncertainty in the dispersion prediction resulting from the inherent uncertainty in the wind field. SCIPUFF has been greatly extended from a power plant plume model to describe more general source characteristics, material properties, and longer-range dispersion. In addition, a Graphical User Interface has been developed to provide interactive problem definition and output display. This presentation describes the major features of the model and presents several example calculations.
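
The building block of any Gaussian puff model is the concentration field of a single puff; SCIPUFF's closure-based spread calculations are far more elaborate, so the following is only a textbook sketch with illustrative spreads:

```python
import math

def puff_concentration(x, y, z, q, cx, cy, cz, sx, sy, sz):
    """Concentration at (x, y, z) from a single Gaussian puff of mass q
    centred at (cx, cy, cz) with spreads (sx, sy, sz):
    C = q / ((2*pi)**1.5 * sx*sy*sz) * exp(-0.5 * sum(d_i**2 / s_i**2))."""
    norm = q / ((2.0 * math.pi) ** 1.5 * sx * sy * sz)
    arg = ((x - cx) ** 2 / sx ** 2 +
           (y - cy) ** 2 / sy ** 2 +
           (z - cz) ** 2 / sz ** 2)
    return norm * math.exp(-0.5 * arg)

# a 1 kg puff advected 100 m downwind, with assumed spreads in metres
q, sx, sy, sz = 1.0, 20.0, 20.0, 10.0
c_center = puff_concentration(100.0, 0.0, 0.0, q, 100.0, 0.0, 0.0, sx, sy, sz)
```

The peak concentration sits at the puff centre and the field is symmetric about it; a full model sums many such puffs along the trajectory.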

  8. Uncertainty and Probability in Natural Hazard Assessment and Their Role in the Testability of Hazard Models

    Science.gov (United States)

    Marzocchi, Warner; Jordan, Thomas

    2014-05-01

    Probabilistic assessment has become a widely accepted procedure to estimate natural hazards quantitatively. In essence, probabilities are meant to quantify the ubiquitous and deep uncertainties that characterize the evolution of natural systems. However, notwithstanding the very wide use of the terms 'uncertainty' and 'probability' in natural hazards, the way in which they are linked, how they are estimated and their scientific meaning are far from being clear, as testified by the last Intergovernmental Panel on Climate Change (IPCC) report and by its subsequent review. The lack of a formal framework to interpret uncertainty and probability coherently has paved the way for some of the strongest critics of hazard analysis; in fact, it has been argued that most natural hazard analyses are intrinsically 'unscientific'. For example, among the concerns is the use of expert opinion to characterize the so-called epistemic uncertainties; many have argued that such personal degrees of belief cannot be measured and, by implication, cannot be tested. The purpose of this talk is to confront and clarify the conceptual issues associated with the role of uncertainty and probability in natural hazard analysis and the conditions that make a hazard model testable and thus 'scientific'. Specifically, we show that testability of hazard models requires a suitable taxonomy of uncertainty embedded in a proper logical framework. This taxonomy of uncertainty is composed of aleatory variability, epistemic uncertainty, and ontological error. We discuss their differences, the link with probability, and their estimation using data, models, and subjective expert opinion. We show that these different uncertainties, and the testability of hazard models, can be unequivocally defined only for a well-defined experimental concept, that is, a concept external to the model under test. All these discussions are illustrated through simple examples related to probabilistic seismic hazard analysis.

  9. Lahar Hazard Modeling at Tungurahua Volcano, Ecuador

    Science.gov (United States)

    Sorensen, O. E.; Rose, W. I.; Jaya, D.

    2003-04-01

    lahar-hazard-zones using a digital elevation model (DEM), was used to construct a hazard map for the volcano. The 10 meter resolution DEM was constructed for Tungurahua Volcano using scanned topographic lines obtained from the GIS Department at the Escuela Politécnica Nacional, Quito, Ecuador. The steep topographic gradients and rapid downcutting of most rivers draining the edifice prevents the deposition of lahars on the lower flanks of Tungurahua. Modeling confirms the high degree of flow channelization in the deep Tungurahua canyons. Inundation zones observed and shown by LAHARZ at Baños yield identification of safe zones within the city which would provide safety from even the largest magnitude lahar expected.
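
LAHARZ delineates nested hazard zones from the semi-empirical scaling relations of Iverson et al. (1998), which tie the inundated cross-sectional area A and planimetric area B to lahar volume V. A minimal sketch of those relations (illustrative volume, not a Tungurahua-specific calibration):

```python
def laharz_areas(volume_m3):
    """Semi-empirical LAHARZ inundation scaling (Iverson et al., 1998):
    cross-sectional area  A = 0.05 * V**(2/3)
    planimetric area      B = 200  * V**(2/3)
    (areas in m^2 for V in m^3)."""
    v23 = volume_m3 ** (2.0 / 3.0)
    return 0.05 * v23, 200.0 * v23

a, b = laharz_areas(1.0e6)   # a hypothetical 10^6 m^3 lahar
```

LAHARZ fills valley cross-sections along the drainage until both area constraints are met, producing the nested hazard zones for a range of design volumes.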

  10. Recent Experiences in Aftershock Hazard Modelling in New Zealand

    Science.gov (United States)

    Gerstenberger, M.; Rhoades, D. A.; McVerry, G.; Christophersen, A.; Bannister, S. C.; Fry, B.; Potter, S.

    2014-12-01

    The occurrence of several sequences of earthquakes in New Zealand in the last few years has meant that GNS Science has gained significant recent experience in aftershock hazard and forecasting. First was the Canterbury sequence of events which began in 2010 and included the destructive Christchurch earthquake of February, 2011. This sequence is occurring in what was a moderate-to-low hazard region of the National Seismic Hazard Model (NSHM): the model on which the building design standards are based. With the expectation that the sequence would produce a 50-year hazard estimate in exceedance of the existing building standard, we developed a time-dependent model that combined short-term (STEP & ETAS) and longer-term (EEPAS) clustering with time-independent models. This forecast was combined with the NSHM to produce a forecast of the hazard for the next 50 years. This has been used to revise building design standards for the region and has contributed to planning of the rebuilding of Christchurch in multiple aspects. An important contribution to this model comes from the inclusion of EEPAS, which allows for clustering on the scale of decades. EEPAS is based on three empirical regressions that relate the magnitudes, times of occurrence, and locations of major earthquakes to regional precursory scale increases in the magnitude and rate of occurrence of minor earthquakes. A second important contribution comes from the long-term rate to which seismicity is expected to return in 50 years. With little seismicity in the region in historical times, a controlling factor in the rate is whether or not it is based on a declustered catalog. This epistemic uncertainty in the model was allowed for by using forecasts from both declustered and non-declustered catalogs. With two additional moderate sequences in the capital region of New Zealand in the last year, we have continued to refine our forecasting techniques, including the use of potential scenarios based on the aftershock
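
Short-term clustering models such as STEP and ETAS build on the Omori–Utsu aftershock decay law n(t) = K/(c + t)^p, whose time integral gives expected aftershock counts. A minimal sketch with illustrative parameters (not the New Zealand model's values):

```python
import math

def omori_rate(t, k, c, p):
    """Omori-Utsu aftershock rate (events per day) t days after a mainshock."""
    return k / (c + t) ** p

def expected_count(t1, t2, k, c, p):
    """Closed-form integral of the Omori-Utsu rate over [t1, t2]."""
    if abs(p - 1.0) < 1e-12:
        return k * math.log((c + t2) / (c + t1))
    return k / (1.0 - p) * ((c + t2) ** (1.0 - p) - (c + t1) ** (1.0 - p))

k, c, p = 100.0, 0.1, 1.1        # hypothetical sequence parameters
n_week1 = expected_count(0.0, 7.0, k, c, p)
```

The closed form matches a fine-grained numerical integration of the rate, and for p > 1 the total count converges as the window grows.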

  11. A quantitative model for volcanic hazard assessment

    OpenAIRE

    W. Marzocchi; Sandri, L.; Furlan, C

    2006-01-01

    Volcanic hazard assessment is a basic ingredient for risk-based decision-making in land-use planning and emergency management. Volcanic hazard is defined as the probability of any particular area being affected by a destructive volcanic event within a given period of time (Fournier d’Albe 1979). The probabilistic nature of such an important issue derives from the fact that volcanic activity is a complex process, characterized by several and usually unknown degrees o...

  12. Parametric hazard rate models for long-term sickness absence

    NARCIS (Netherlands)

    Koopmans, Petra C.; Roelen, Corne A. M.; Groothoff, Johan W.

    2009-01-01

    In research on the time to onset of sickness absence and the duration of sickness absence episodes, Cox proportional hazard models are in common use. However, parametric models are to be preferred when time in itself is considered as independent variable. This study compares parametric hazard rate m

  14. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Full Text Available Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.

  15. Incident duration modeling using flexible parametric hazard-based models.

    Science.gov (United States)

    Li, Ruimin; Shang, Pan

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
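
In an accelerated failure time model, log duration is linear in the covariates, log T = μ + βx + σW; with Gumbel errors the durations are Weibull. The sketch below uses synthetic data (not the Beijing incident set) to simulate such durations and recover the covariate effect on the log scale:

```python
import random
import math

def ols_slope(x, y):
    """Least-squares slope of y on x; consistent for beta in the AFT
    model log T = mu + beta*x + error, since the error is independent of x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

random.seed(42)
mu, beta, shape, n = 3.0, 0.8, 1.5, 5000
x = [random.random() for _ in range(n)]
# accelerated failure time: T = exp(mu + beta*x) * T0, T0 ~ Weibull(shape)
t = [math.exp(mu + beta * xi) * random.weibullvariate(1.0, shape) for xi in x]
beta_hat = ols_slope(x, [math.log(ti) for ti in t])
```

A fitted AFT package would also estimate the shape and intercept by maximum likelihood; the point here is only the multiplicative (time-scaling) covariate effect.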

  16. High-Dimensional Additive Hazards Regression for Oral Squamous Cell Carcinoma Using Microarray Data: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Omid Hamidi

    2014-01-01

    Full Text Available Microarray technology results in high-dimensional, low-sample-size data sets. Therefore, fitting sparse models is essential, because only a small number of influential genes can reliably be identified. A number of variable selection approaches have been proposed for high-dimensional time-to-event data based on Cox proportional hazards, where censoring is present. The present study applied three sparse variable selection techniques (lasso, smoothly clipped absolute deviation, and smooth integration of counting and absolute deviation) to gene expression survival time data using the additive risk model, which is adopted when the absolute effects of multiple predictors on the hazard function are of interest. The performance of the techniques was evaluated by time-dependent ROC curves and bootstrap .632+ prediction error curves. The genes selected by all methods were highly significant (P<0.001). The lasso showed the maximum median area under the ROC curve over time (0.95), and smoothly clipped absolute deviation showed the lowest prediction error (0.105). It was observed that the genes selected by all methods improved the prediction of the purely clinical model, indicating the valuable information contained in the microarray features. It was therefore concluded that the applied approaches can satisfactorily predict survival based on selected gene expression measurements.
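
Lasso-type selection of the kind used here rests on soft-thresholding inside a cyclic coordinate descent loop. A minimal least-squares version is sketched below as a stand-in for the penalized additive-risk estimating equations (synthetic data, illustrative penalty):

```python
import random

def soft_threshold(z, g):
    """Soft-thresholding operator, the basic lasso building block."""
    if z > g:
        return z - g
    if z < -g:
        return z + g
    return 0.0

def lasso_cd(cols, y, lam, n_iter=100):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1,
    with the design matrix X given as a list of columns."""
    n, p = len(y), len(cols)
    b = [0.0] * p
    r = list(y)                                  # residual y - X b
    for _ in range(n_iter):
        for j in range(p):
            xj = cols[j]
            for i in range(n):                   # put coordinate j back
                r[i] += xj[i] * b[j]
            rho = sum(xj[i] * r[i] for i in range(n)) / n
            denom = sum(v * v for v in xj) / n
            b[j] = soft_threshold(rho, lam) / denom
            for i in range(n):                   # remove updated contribution
                r[i] -= xj[i] * b[j]
    return b

random.seed(1)
n, beta_true = 300, [2.0, 0.0, -1.5, 0.0, 0.0]   # sparse truth
cols = [[random.gauss(0.0, 1.0) for _ in range(n)] for _ in range(5)]
y = [sum(c[i] * bt for c, bt in zip(cols, beta_true))
     + random.gauss(0.0, 0.1) for i in range(n)]
b = lasso_cd(cols, y, lam=0.15)
```

The two signal coefficients are recovered near their true values while the penalty shrinks the noise coefficients toward zero; the penalized additive hazards fit replaces the least-squares residual step with the model's estimating equations.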

  17. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    Science.gov (United States)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena span a range of complexities, from simplified physics-based conceptual models to highly coupled thermo-fluid-dynamic approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well-characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented with the aim to provide a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  18. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    Directory of Open Access Journals (Sweden)

    Mark Stirling

    2017-06-01

Full Text Available We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration, and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g., short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  19. VHub - Cyberinfrastructure for volcano eruption and hazards modeling and simulation

    Science.gov (United States)

    Valentine, G. A.; Jones, M. D.; Bursik, M. I.; Calder, E. S.; Gallo, S. M.; Connor, C.; Carn, S. A.; Rose, W. I.; Moore-Russo, D. A.; Renschler, C. S.; Pitman, B.; Sheridan, M. F.

    2009-12-01

    Volcanic risk is increasing as populations grow in active volcanic regions, and as national economies become increasingly intertwined. In addition to their significance to risk, volcanic eruption processes form a class of multiphase fluid dynamics with rich physics on many length and time scales. Risk significance, physics complexity, and the coupling of models to complex dynamic spatial datasets all demand the development of advanced computational techniques and interdisciplinary approaches to understand and forecast eruption dynamics. Innovative cyberinfrastructure is needed to enable global collaboration and novel scientific creativity, while simultaneously enabling computational thinking in real-world risk mitigation decisions - an environment where quality control, documentation, and traceability are key factors. Supported by NSF, we are developing a virtual organization, referred to as VHub, to address this need. Overarching goals of the VHub project are: Dissemination. Make advanced modeling and simulation capabilities and key data sets readily available to researchers, students, and practitioners around the world. Collaboration. Provide a mechanism for participants not only to be users but also co-developers of modeling capabilities, and contributors of experimental and observational data sets for use in modeling and simulation, in a collaborative environment that reaches far beyond local work groups. Comparison. Facilitate comparison between different models in order to provide the practitioners with guidance for choosing the "right" model, depending upon the intended use, and provide a platform for multi-model analysis of specific problems and incorporation into probabilistic assessments. Application. Greatly accelerate access and application of a wide range of modeling tools and related data sets to agencies around the world that are charged with hazard planning, mitigation, and response. Education. 
Provide resources that will promote the training of the

  20. A high-resolution global flood hazard model

    Science.gov (United States)

    Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul B.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-09-01

Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ~90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ~1 km, mean absolute error in flooded fraction falls to ~5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.
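The validation the abstract describes reduces to overlaying binary wet/dry grids. A minimal sketch of the two implied scores (hit rate, i.e. the captured fraction of benchmark flooded area, and false-alarm ratio), assuming simple 0/1 arrays; `flood_skill` is an illustrative helper, not code from the published model:

```python
import numpy as np

def flood_skill(benchmark, model):
    """Hit rate and false-alarm ratio for binary flood maps (1 = wet)."""
    benchmark = np.asarray(benchmark, dtype=bool)
    model = np.asarray(model, dtype=bool)
    hits = np.sum(benchmark & model)           # wet in both maps
    misses = np.sum(benchmark & ~model)        # wet in benchmark, dry in model
    false_alarms = np.sum(~benchmark & model)  # dry in benchmark, wet in model
    hit_rate = hits / (hits + misses)
    false_alarm_ratio = false_alarms / (hits + false_alarms)
    return hit_rate, false_alarm_ratio

# made-up 2x3 grids for illustration
bench = np.array([[1, 1, 0], [1, 0, 0]])
model = np.array([[1, 0, 0], [1, 0, 1]])
hr, far = flood_skill(bench, model)
```

A hit rate of two thirds with a low false-alarm ratio corresponds to the "captures two thirds to three quarters of the at-risk area without excessive false positives" statement above.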

  1. NBC Hazard Prediction Model Capability Analysis

    Science.gov (United States)

    1999-09-01

Puff (SCIPUFF) Model Verification and Evaluation Study, Air Resources Laboratory, NOAA, May 1998. Based on the NOAA review, the VLSTRACK developers... TO SUBSTANTIAL DIFFERENCES IN PREDICTIONS. HPAC uses a transport and dispersion (T&D) model called SCIPUFF and an associated mean wind field model... SCIPUFF is a model for atmospheric dispersion that uses the Gaussian puff method - an arbitrary time-dependent concentration field is represented
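The Gaussian puff method mentioned above represents an arbitrary concentration field as a superposition of Gaussian puffs. A minimal single-puff sketch, assuming uniform spread parameters and no advection (this is the generic puff formula, not SCIPUFF's second-order closure scheme; all names and values are illustrative):

```python
import numpy as np

def gaussian_puff(x, y, z, center, sigma, mass):
    """Concentration from a single Gaussian puff; a full field is the
    sum of many such puffs released over time."""
    cx, cy, cz = center
    sx, sy, sz = sigma
    norm = mass / ((2 * np.pi) ** 1.5 * sx * sy * sz)
    return norm * np.exp(-0.5 * (((x - cx) / sx) ** 2
                                 + ((y - cy) / sy) ** 2
                                 + ((z - cz) / sz) ** 2))

# concentration is highest at the puff centre and decays outward
c_center = gaussian_puff(0.0, 0.0, 2.0, (0.0, 0.0, 2.0), (20.0, 20.0, 5.0), 1.0)
c_offset = gaussian_puff(30.0, 0.0, 2.0, (0.0, 0.0, 2.0), (20.0, 20.0, 5.0), 1.0)
```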

  2. Hazard identification by extended multilevel flow modelling with function roles

    DEFF Research Database (Denmark)

    Wu, Jing; Zhang, Laibin; Jørgensen, Sten Bay

    2014-01-01

HAZOP studies are widely accepted in chemical and petroleum industries as the method for conducting process hazard analysis related to design, maintenance and operation of the systems. In this paper, a HAZOP reasoning method based on function-oriented modelling, multilevel flow modelling (MFM) i...

  3. Regional landslide hazard assessment based on Distance Evaluation Model

    Institute of Scientific and Technical Information of China (English)

    Jiacun LI; Yan QIN; Jing LI

    2008-01-01

There are many factors influencing landslide occurrence. The key for landslide control is to confirm the regional landslide hazard factors. The Cameron Highlands of Malaysia was selected as the study area. Using a bivariate statistical analysis method with GIS software, the authors analyzed the relationships among landslides and environmental factors such as lithology, geomorphology, elevation, road and land use. A Distance Evaluation Model was developed with Landslide Density (LD), and the assessment of landslide hazard of the Cameron Highlands was performed. The result shows that the model has higher prediction precision.
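The bivariate statistical step described above amounts to computing a Landslide Density (LD) for each class of each environmental factor. A small sketch with a hypothetical class map and landslide inventory (not the paper's data):

```python
import numpy as np

def landslide_density(factor_class, landslide):
    """Landslide density (LD) per class of one environmental factor.

    factor_class: integer class ID per grid cell (e.g. lithology class)
    landslide:    1 where a mapped landslide occurs, else 0
    """
    factor_class = np.asarray(factor_class).ravel()
    landslide = np.asarray(landslide).ravel()
    densities = {}
    for cls in np.unique(factor_class):
        in_class = factor_class == cls
        densities[int(cls)] = landslide[in_class].sum() / in_class.sum()
    return densities

# hypothetical 1-D example: class 2 hosts proportionally more landslides
ld = landslide_density([1, 1, 2, 2, 2, 1], [0, 0, 1, 1, 0, 0])
```

Classes with higher LD contribute more strongly to the hazard score in this kind of bivariate weighting.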

  4. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    Science.gov (United States)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.
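The least cost distance modeling contrasted with MATSim above can be sketched as a shortest-path sweep from all safe cells over a gridded cost surface. This illustrative Dijkstra variant is not MATSim code (MATSim itself is a Java framework) and gives the static view the abstract describes, without agent interaction:

```python
import heapq

def evacuation_times(grid_cost, safe_cells):
    """Least-cost travel time from every cell to the nearest safe cell.

    grid_cost: 2-D list of per-cell traversal costs (None = impassable)
    safe_cells: iterable of (row, col) tuples marking safety
    """
    rows, cols = len(grid_cost), len(grid_cost[0])
    INF = float("inf")
    dist = [[INF] * cols for _ in range(rows)]
    heap = []
    for r, c in safe_cells:                      # multi-source Dijkstra
        dist[r][c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r][c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid_cost[nr][nc] is not None:
                nd = d + grid_cost[nr][nc]
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return dist

# 3x3 grid with an impassable centre cell; safety at the top-left corner
times = evacuation_times([[1, 1, 1], [1, None, 1], [1, 1, 1]], [(0, 0)])
```

Congestion effects, which motivate the move to agent-based simulation, are exactly what this static picture cannot capture.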

  5. Hazard Response Modeling Uncertainty (A Quantitative Method)

    Science.gov (United States)

    1988-10-01

...C_p is the concentration predicted by some component or model. The variance of C_o/C_p is calculated and defined as var(Model I), where Model I could be

  6. Research collaboration, hazard modeling and dissemination in volcanology with Vhub

    Science.gov (United States)

    Palma Lizana, J. L.; Valentine, G. A.

    2011-12-01

Vhub (online at vhub.org) is a cyberinfrastructure for collaboration in volcanology research, education, and outreach. One of the core objectives of this project is to accelerate the transfer of research tools to organizations and stakeholders charged with volcano hazard and risk mitigation (such as observatories). Vhub offers a clearinghouse for computational models of volcanic processes and data analysis, documentation of those models, and capabilities for online collaborative groups focused on issues such as code development, configuration management, benchmarking, and validation. A subset of simulations is already available for online execution, eliminating the need to download and compile locally. In addition, Vhub is a platform for sharing presentations and other educational material in a variety of media formats, which are useful in teaching university-level volcanology. Vhub also has wikis, blogs and group functions around specific topics to encourage collaboration and discussion. In this presentation we provide examples of the Vhub capabilities, including: (1) tephra dispersion and block-and-ash flow models; (2) shared educational materials; (3) online collaborative environment for different types of research, including field-based studies and plume dispersal modeling; (4) workshops. Future goals include implementation of middleware to allow access to data and databases that are stored and maintained at various institutions around the world. All of these capabilities can be exercised with a user-defined level of privacy, ranging from completely private (only shared and visible to specified people) to completely public. The volcanological community is encouraged to use the resources of Vhub and also to contribute models, datasets, and other items that authors would like to disseminate. 
The project is funded by the US National Science Foundation and includes a core development team at University at Buffalo, Michigan Technological University, and University

  7. Toward Building a New Seismic Hazard Model for Mainland China

    Science.gov (United States)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

At present, the only publicly available seismic hazard model for mainland China was generated by the Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data using the methodology recommended by the Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPE) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonic regions. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and business and land-use planning.
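The tapered Gutenberg-Richter (TGR) relationship used above can be written, in the moment domain, as a power law with an exponential corner-magnitude taper. A sketch with illustrative parameter values, not the study's calibrated a-, b-, or corner magnitudes:

```python
import numpy as np

def tgr_rate(m, m_min, rate_min, beta, m_corner):
    """Annual rate of events above magnitude m under the tapered G-R law.

    beta is the moment-domain slope (about 2/3 of the b-value); the
    exponential taper rolls rates off near the corner magnitude m_corner.
    """
    moment = lambda mw: 10.0 ** (1.5 * mw + 9.05)   # Mw -> seismic moment (N*m)
    m0, mt, mc = moment(m), moment(m_min), moment(m_corner)
    return rate_min * (mt / m0) ** beta * np.exp((mt - m0) / mc)

# rates follow the power law at small magnitudes and taper near the corner
r_min = tgr_rate(4.0, 4.0, 10.0, beta=2/3, m_corner=8.0)   # equals rate_min
r5 = tgr_rate(5.0, 4.0, 10.0, beta=2/3, m_corner=8.0)
r8 = tgr_rate(8.0, 4.0, 10.0, beta=2/3, m_corner=8.0)
```

Synthetic catalogs like those in the abstract can be generated by sampling magnitudes from this distribution.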

  8. Identifying confounders using additive noise models

    CERN Document Server

    Janzing, Dominik; Mooij, Joris; Schoelkopf, Bernhard

    2012-01-01

    We propose a method for inferring the existence of a latent common cause ('confounder') of two observed random variables. The method assumes that the two effects of the confounder are (possibly nonlinear) functions of the confounder plus independent, additive noise. We discuss under which conditions the model is identifiable (up to an arbitrary reparameterization of the confounder) from the joint distribution of the effects. We state and prove a theoretical result that provides evidence for the conjecture that the model is generically identifiable under suitable technical conditions. In addition, we propose a practical method to estimate the confounder from a finite i.i.d. sample of the effects and illustrate that the method works well on both simulated and real-world data.
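The model class assumed above (each observed effect a function of the confounder plus independent additive noise) is easy to simulate. A hypothetical example, with arbitrary choices of f and g, where dependence between X and Y arises solely through the latent Z:

```python
import numpy as np

# a latent confounder Z drives both observed variables through functions
# of Z plus independent additive noise (the paper's model class)
rng = np.random.default_rng(0)
n = 5000
z = rng.uniform(-2, 2, n)                 # unobserved confounder
x = z + 0.2 * rng.normal(size=n)          # effect 1 = f(Z) + noise
y = z ** 3 + 0.2 * rng.normal(size=n)     # effect 2 = g(Z) + noise

# X and Y are strongly dependent although neither causes the other
corr_xy = float(np.corrcoef(x, y)[0, 1])
```

The identification question in the abstract asks when Z (up to reparameterization) can be recovered from the joint distribution of samples like (x, y) alone.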

  9. Application of remote sensed precipitation for landslide hazard assessment models

    Science.gov (United States)

    Kirschbaum, D. B.; Peters-Lidard, C. D.; Adler, R. F.; Kumar, S.; Harrison, K.

    2010-12-01

    The increasing availability of remotely sensed land surface and precipitation information provides new opportunities to improve upon existing landslide hazard assessment methods. This research considers how satellite precipitation information can be applied in two types of landslide hazard assessment frameworks: a global, landslide forecasting framework and a deterministic slope-stability model. Examination of both landslide hazard frameworks points to the need for higher resolution spatial and temporal precipitation inputs to better identify small-scale precipitation forcings that contribute to significant landslide triggering. This research considers how satellite precipitation information may be downscaled to account for local orographic impacts and better resolve peak intensities. Precipitation downscaling is employed in both models to better approximate local rainfall distribution, antecedent conditions, and intensities. Future missions, such as the Global Precipitation Measurement (GPM) mission will provide more frequent and extensive estimates of precipitation at the global scale and have the potential to significantly advance landslide hazard assessment tools. The first landslide forecasting tool, running in near real-time at http://trmm.gsfc.nasa.gov, considers potential landslide activity at the global scale and relies on Tropical Rainfall Measuring Mission (TRMM) precipitation data and surface products to provide a near real-time picture of where landslides may be triggered. Results of the algorithm evaluation indicate that considering higher resolution susceptibility information is a key factor in better resolving potentially hazardous areas. However, success in resolving when landslide activity is probable is closely linked to appropriate characterization of the empirical rainfall intensity-duration thresholds. We test a variety of rainfall thresholds to evaluate algorithmic performance accuracy and determine the optimal set of conditions that
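An empirical rainfall intensity-duration threshold of the kind tested above is commonly written I = a * D**(-b). A sketch with placeholder coefficients (a and b here are illustrative, not the algorithm's calibrated values):

```python
def exceeds_threshold(intensity_mm_hr, duration_hr, a=10.0, b=0.6):
    """True if a storm of the given mean intensity and duration lies above
    the empirical intensity-duration curve I = a * D**(-b)."""
    return intensity_mm_hr > a * duration_hr ** (-b)

long_storm = exceeds_threshold(3.0, 24.0)   # long, moderately intense storm
short_storm = exceeds_threshold(3.0, 1.0)   # short storm below the curve
```

Sweeping a and b over candidate values and scoring hits against a landslide inventory is one way to choose the "optimal set of conditions" the abstract refers to.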

  10. The Framework of a Coastal Hazards Model - A Tool for Predicting the Impact of Severe Storms

    Science.gov (United States)

    Barnard, Patrick L.; O'Reilly, Bill; van Ormondt, Maarten; Elias, Edwin; Ruggiero, Peter; Erikson, Li H.; Hapke, Cheryl; Collins, Brian D.; Guza, Robert T.; Adams, Peter N.; Thomas, Julie

    2009-01-01

    The U.S. Geological Survey (USGS) Multi-Hazards Demonstration Project in Southern California (Jones and others, 2007) is a five-year project (FY2007-FY2011) integrating multiple USGS research activities with the needs of external partners, such as emergency managers and land-use planners, to produce products and information that can be used to create more disaster-resilient communities. The hazards being evaluated include earthquakes, landslides, floods, tsunamis, wildfires, and coastal hazards. For the Coastal Hazards Task of the Multi-Hazards Demonstration Project in Southern California, the USGS is leading the development of a modeling system for forecasting the impact of winter storms threatening the entire Southern California shoreline from Pt. Conception to the Mexican border. The modeling system, run in real-time or with prescribed scenarios, will incorporate atmospheric information (that is, wind and pressure fields) with a suite of state-of-the-art physical process models (that is, tide, surge, and wave) to enable detailed prediction of currents, wave height, wave runup, and total water levels. Additional research-grade predictions of coastal flooding, inundation, erosion, and cliff failure will also be performed. Initial model testing, performance evaluation, and product development will be focused on a severe winter-storm scenario developed in collaboration with the Winter Storm Working Group of the USGS Multi-Hazards Demonstration Project in Southern California. Additional offline model runs and products will include coastal-hazard hindcasts of selected historical winter storms, as well as additional severe winter-storm simulations based on statistical analyses of historical wave and water-level data. The coastal-hazards model design will also be appropriate for simulating the impact of storms under various sea level rise and climate-change scenarios. The operational capabilities of this modeling system are designed to provide emergency planners with

  11. Current Methods of Natural Hazards Communication used within Catastrophe Modelling

    Science.gov (United States)

    Dawber, C.; Latchman, S.

    2012-04-01

In the field of catastrophe modelling, natural hazards need to be explained every day to (re)insurance professionals so that they may understand estimates of the loss potential of their portfolio. The effective communication of natural hazards to city professionals requires different strategies to be taken depending on the audience, their prior knowledge and respective backgrounds. It is best to have at least three sets of tools in your arsenal for a specific topic: 1) an illustration/animation, 2) a mathematical formula and 3) a real-world case study example. This multi-faceted approach will be effective for those that learn best by pictorial means, mathematical means or anecdotal means. To show this we will use a set of real examples employed in the insurance industry of how different aspects of natural hazards and the uncertainty around them are explained to city professionals. For example, explaining the different modules within a catastrophe model such as the hazard, vulnerability and loss modules. We highlight how recent technology such as 3-D plots, video recording and Google Earth maps, when used properly, can help explain concepts quickly and easily. Finally we also examine the pitfalls of using overly-complicated visualisations and in general how counter-intuitive deductions may be made.

  12. Seismic hazard assessment over time: Modelling earthquakes in Taiwan

    Science.gov (United States)

    Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting

    2017-04-01

To assess the seismic hazard with temporal change in Taiwan, we develop a new approach, combining both the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters by the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault-rupture to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed time since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased rupture probabilities of several neighbouring seismogenic sources in Southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantage of incorporating long- and short-term models, to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published models. It thus offers decision-makers and public officials an adequate basis for rapid evaluations of and response to future emergency scenarios such as victim relocation and sheltering.
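The long-term component above rests on the BPT renewal model: given a fault's mean recurrence interval, aperiodicity, and time elapsed since its last rupture, the conditional rupture probability in a forecast window follows from the BPT density. A numerical sketch with made-up fault parameters (not TEM values):

```python
import numpy as np

def bpt_conditional_prob(mu, alpha, elapsed, window, n=20000):
    """P(rupture within `window` years | quiet for `elapsed` years) under
    the Brownian Passage Time model with mean recurrence mu (years) and
    aperiodicity alpha. The CDF is integrated numerically from the density."""
    t = np.linspace(1e-6, elapsed + window, n)
    pdf = (np.sqrt(mu / (2 * np.pi * alpha ** 2 * t ** 3))
           * np.exp(-((t - mu) ** 2) / (2 * mu * alpha ** 2 * t)))
    # trapezoid-rule cumulative integral of the density
    cdf = np.concatenate([[0.0],
                          np.cumsum((pdf[1:] + pdf[:-1]) / 2 * np.diff(t))])
    F = lambda x: np.interp(x, t, cdf)
    return float((F(elapsed + window) - F(elapsed)) / (1 - F(elapsed)))

# hazard grows as elapsed time approaches the mean recurrence interval
p_early = bpt_conditional_prob(mu=200, alpha=0.5, elapsed=50, window=30)
p_late = bpt_conditional_prob(mu=200, alpha=0.5, elapsed=180, window=30)
```

The Coulomb stress step in the abstract then perturbs probabilities like these after nearby large earthquakes.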

  13. Analysis of time to event outcomes in randomized controlled trials by generalized additive models.

    Directory of Open Access Journals (Sweden)

    Christos Argyropoulos

Full Text Available Randomized Controlled Trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazards model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking. By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAM) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks, or even differences in restricted mean survival time, between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO (a long-duration study conducted under evolving standards of care on a heterogeneous patient population). PGAMs can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. 
Furthermore, PGAMs supported unadjusted (overall treatment effect) but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial results under proportional and
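The core PGAM device, splitting each subject's follow-up at a set of time nodes so that a Poisson model reproduces the survival likelihood, can be sketched with fixed nodes standing in for the Gauss-Lobatto quadrature points:

```python
def split_followup(time, event, nodes):
    """Split each subject's follow-up at the given time nodes, yielding
    (start, stop, person-time, event-indicator) rows -- the data layout
    under which a Poisson model with a log person-time offset reproduces
    the survival likelihood. Fixed nodes are used here for illustration
    instead of Gauss-Lobatto quadrature."""
    rows = []
    for t, d in zip(time, event):
        edges = [a for a in nodes if a < t] + [t]
        start = 0.0
        for j, stop in enumerate(edges):
            last = j == len(edges) - 1
            # the event (if any) belongs only to the final interval
            rows.append((start, stop, stop - start, d if last else 0))
            start = stop
    return rows

# subject 1: event at t=2.5; subject 2: censored at t=7
rows = split_followup(time=[2.5, 7.0], event=[1, 0], nodes=[5.0, 10.0])
```

Fitting a Poisson GAM of the event indicator on the interval midpoints, with log person-time as offset, then yields the flexible log-hazard estimate described above.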

  14. A generalized additive regression model for survival times

    DEFF Research Database (Denmark)

    Scheike, Thomas H.

    2001-01-01

    Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models......Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models...

  16. Rockfall hazard analysis using LiDAR and spatial modeling

    Science.gov (United States)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.

  17. Random weighting method for Cox’s proportional hazards model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Variance of parameter estimate in Cox’s proportional hazards model is based on asymptotic variance. When sample size is small, variance can be estimated by bootstrap method. However, if censoring rate in a survival data set is high, bootstrap method may fail to work properly. This is because bootstrap samples may be even more heavily censored due to repeated sampling of the censored observations. This paper proposes a random weighting method for variance estimation and confidence interval estimation for proportional hazards model. This method, unlike the bootstrap method, does not lead to more severe censoring than the original sample does. Its large sample properties are studied and the consistency and asymptotic normality are proved under mild conditions. Simulation studies show that the random weighting method is not as sensitive to heavy censoring as bootstrap method is and can produce good variance estimates or confidence intervals.
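The random-weighting idea is easiest to see on a simple parametric stand-in: for an exponential hazard, the weighted MLE has a closed form, and each replicate re-weights subjects with i.i.d. Exp(1) weights rather than resampling them, so censored cases are never duplicated. A sketch, not the paper's Cox-model implementation:

```python
import numpy as np

def random_weighting_se(time, event, n_rep=2000, seed=0):
    """Random-weighting standard error of the exponential hazard MLE.

    Each replicate re-weights every subject with an i.i.d. Exp(1) weight;
    the weighted MLE is sum(w*d) / sum(w*t). Unlike the bootstrap, no
    censored observation is ever drawn twice."""
    rng = np.random.default_rng(seed)
    time = np.asarray(time, float)
    event = np.asarray(event, float)
    estimates = []
    for _ in range(n_rep):
        w = rng.exponential(1.0, size=time.size)
        estimates.append((w * event).sum() / (w * time).sum())
    return float(np.std(estimates, ddof=1))

# simulated survival data with independent censoring
rng = np.random.default_rng(1)
t_true = rng.exponential(1.0, 200)          # true hazard = 1
c = rng.exponential(2.0, 200)               # censoring times
time, event = np.minimum(t_true, c), (t_true <= c).astype(int)
se = random_weighting_se(time, event)
```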

  18. Random weighting method for Cox's proportional hazards model

    Institute of Scientific and Technical Information of China (English)

    CUI WenQuan; LI Kai; YANG YaNing; WU YueHua

    2008-01-01

Variance of parameter estimate in Cox's proportional hazards model is based on asymptotic variance. When sample size is small, variance can be estimated by bootstrap method. However, if censoring rate in a survival data set is high, bootstrap method may fail to work properly. This is because bootstrap samples may be even more heavily censored due to repeated sampling of the censored observations. This paper proposes a random weighting method for variance estimation and confidence interval estimation for proportional hazards model. This method, unlike the bootstrap method, does not lead to more severe censoring than the original sample does. Its large sample properties are studied and the consistency and asymptotic normality are proved under mild conditions. Simulation studies show that the random weighting method is not as sensitive to heavy censoring as bootstrap method is and can produce good variance estimates or confidence intervals.

  19. Defaultable Game Options in a Hazard Process Model

    Directory of Open Access Journals (Sweden)

    Tomasz R. Bielecki

    2009-01-01

Full Text Available The valuation and hedging of defaultable game options is studied in a hazard process model of credit risk. A convenient pricing formula with respect to a reference filtration is derived. A connection of arbitrage prices with a suitable notion of hedging is obtained. The main result shows that the arbitrage prices are the minimal superhedging prices with sigma-martingale cost under a risk-neutral measure.

  20. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    Science.gov (United States)

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.
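The smoothed-seismicity component mentioned above turns a grid of earthquake counts into a rate map by Gaussian kernel smoothing. A fixed-bandwidth sketch (the NSHM's adaptive version varies the bandwidth with local event density; grid and bandwidth here are illustrative):

```python
import numpy as np

def smoothed_seismicity(counts, sigma_cells):
    """Fixed-bandwidth Gaussian smoothing of gridded earthquake counts."""
    size = int(3 * sigma_cells)                     # kernel half-width
    ax = np.arange(-size, size + 1)
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma_cells ** 2))
    kernel /= kernel.sum()                          # conserve total rate
    padded = np.pad(counts, size, mode="constant")
    out = np.zeros_like(counts, dtype=float)
    rows, cols = counts.shape
    for i in range(rows):
        for j in range(cols):
            out[i, j] = np.sum(padded[i:i + 2 * size + 1,
                                      j:j + 2 * size + 1] * kernel)
    return out

# one historical event cluster spread into a smooth rate surface
grid = np.zeros((9, 9))
grid[4, 4] = 10.0
rate = smoothed_seismicity(grid, sigma_cells=1.0)
```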

  1. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long-schedule-lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating the very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for spaceflight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  2. Integrating Community Volcanic Hazard Mapping, Geographic Information Systems, and Modeling to Reduce Volcanic Hazard Vulnerability

    Science.gov (United States)

    Bajo Sanchez, Jorge V.

    This dissertation is composed of an introductory chapter and three papers about vulnerability and volcanic hazard maps, with emphasis on lahars. The introductory chapter reviews definitions of the term vulnerability by the social and natural hazard communities and provides a new definition of hazard vulnerability that includes social and natural hazard factors. The first paper explains how the Community Volcanic Hazard Map (CVHM) is used for vulnerability analysis and details a new methodology to obtain valuable information about ethnophysiographic differences, hazards, and landscape knowledge of communities in the area of interest: the Canton Buenos Aires, situated on the northern flank of the Santa Ana (Ilamatepec) Volcano, El Salvador. The second paper is about creating a lahar hazard map in data-poor environments by generating a landslide inventory and obtaining potential volumes of dry material that lahars can carry. The third paper introduces an innovative lahar hazard map integrating information generated by the previous two papers. It shows the differences between hazard maps created by the communities and by experts, both visually and quantitatively. This new, integrated hazard map was presented to the community with positive feedback and acceptance. The dissertation concludes with a summary chapter on the results and recommendations.

  3. CREATION OF THE MODEL ADDITIONAL PROTOCOL

    Energy Technology Data Exchange (ETDEWEB)

    Houck, F.; Rosenthal, M.; Wulf, N.

    2010-05-25

    In 1991, the international nuclear nonproliferation community was dismayed to discover that the implementation of safeguards by the International Atomic Energy Agency (IAEA) under its NPT INFCIRC/153 safeguards agreement with Iraq had failed to detect Iraq's nuclear weapon program. It was now clear that ensuring that states were fulfilling their obligations under the NPT would require not just detecting diversion but also the ability to detect undeclared materials and activities. To achieve this, the IAEA initiated what would turn out to be a five-year effort to reappraise the NPT safeguards system. The effort engaged the IAEA and its Member States and led to agreement in 1997 on a new safeguards agreement, the Model Protocol Additional to the Agreement(s) between States and the International Atomic Energy Agency for the Application of Safeguards. The Model Protocol makes explicit that one IAEA goal is to provide assurance of the absence of undeclared nuclear material and activities. The Model Protocol requires an expanded declaration that identifies a State's nuclear potential, empowers the IAEA to raise questions about the correctness and completeness of the State's declaration, and, if needed, allows IAEA access to locations. The information required and the locations available for access are much broader than those provided for under INFCIRC/153. The negotiation was completed in quite a short time because it started with a relatively complete draft of an agreement prepared by the IAEA Secretariat. This paper describes how the Model Protocol was constructed and reviews key decisions that were made both during the five-year period and in the actual negotiation.

  4. Mathematical-statistical models of generated hazardous hospital solid waste.

    Science.gov (United States)

    Awad, A R; Obeidat, M; Al-Shareef, M

    2004-01-01

    This research work was carried out under the assumption that wastes generated from hospitals in Irbid, Jordan were hazardous. The hazardous and non-hazardous wastes generated from the different divisions in the three hospitals under consideration were not separated during the collection process. Three hospitals, Princess Basma hospital (public), Princess Bade'ah hospital (teaching), and Ibn Al-Nafis hospital (private) in Irbid were selected for this study. The research work took into account the amounts of solid waste accumulated from each division and also determined the total amount generated from each hospital. The generation rates (kilograms per patient per day; kilograms per bed per day) were determined for the three hospitals and compared with those of similar hospitals in Europe. The evaluation suggested that the current management of these wastes in the three studied hospitals needs revision, as these hospitals do not follow the methods of waste disposal, practiced in developed countries, that would reduce risk to human health and the environment. Statistical analysis was carried out to develop models for the prediction of the quantity of waste generated at each hospital (public, teaching, private). In these models, the number of patients, the number of beds, and the type of hospital were revealed to be significant factors in the quantity of waste generated. Multiple regression was also used to estimate the quantities of wastes generated from similar divisions in the three hospitals (surgery, internal diseases, and maternity).
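
The regression approach described in this record can be sketched in miniature. The code below fits a one-predictor ordinary least squares model of daily waste mass on patient count; the numbers are hypothetical, not the study's data, and the real models also include bed count and hospital type.

```python
# Illustrative sketch only: ordinary least squares of daily waste mass on
# patient count, fit with the closed-form slope/intercept formulas.
def ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
            / sum((xi - mx) ** 2 for xi in x)
    return my - slope * mx, slope  # intercept, slope

patients = [50, 80, 120, 200]   # patients per day (hypothetical)
waste_kg = [60, 95, 140, 230]   # waste generated, kg/day (hypothetical)
intercept, rate = ols(patients, waste_kg)
print(round(rate, 2))  # approximate generation rate, kg per patient per day
```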

  5. Development of hazard-compatible building fragility and vulnerability models

    Science.gov (United States)

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
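
As a concrete illustration of the fragility-curve form conditioned on spectral acceleration, the sketch below uses the standard lognormal parameterization with invented parameters, not the derived HAZUS-based values.

```python
import math

# Lognormal fragility curve: probability of reaching or exceeding a damage
# state given spectral acceleration `sa`, with median capacity `theta` (g)
# and lognormal standard deviation `beta`. Parameters are illustrative.
def fragility(sa, theta, beta):
    return 0.5 * (1.0 + math.erf(math.log(sa / theta) / (beta * math.sqrt(2.0))))

# At the median capacity the exceedance probability is 0.5 by construction;
# stronger shaking gives a higher probability.
print(fragility(0.4, 0.4, 0.6))  # 0.5
print(round(fragility(0.8, 0.4, 0.6), 3))
```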

  6. 76 FR 5370 - Potential Addition of Vapor Intrusion Component to the Hazard Ranking System

    Science.gov (United States)

    2011-01-31

    ... vapor intrusion contamination to be evaluated for placement on the NPL. EPA is accepting public feedback.... Listening Session: Oral and written comments on the topics in the SUPPLEMENTARY INFORMATION section of this... Notice to allow interested parties to present feedback on the potential HRS addition. EPA welcomes...

  7. Integrated Modeling for Flood Hazard Mapping Using Watershed Modeling System

    National Research Council Canada - National Science Library

    Seyedeh S. Sadrolashrafi; Thamer A. Mohamed; Ahmad R.B. Mahmud; Majid K. Kholghi; Amir Samadi

    2008-01-01

    ...) with the Watershed Modeling System (WMS) for flood modeling is developed. It also interconnects the terrain models and the GIS software, with commercial standard hydrological and hydraulic models, including HEC-1, HEC-RAS, etc...

  8. Mark-specific hazard ratio model with missing multivariate marks.

    Science.gov (United States)

    Juraska, Michal; Gilbert, Peter B

    2016-10-01

    An objective of randomized placebo-controlled preventive HIV vaccine efficacy (VE) trials is to assess the relationship between vaccine effects to prevent HIV acquisition and continuous genetic distances of the exposing HIVs to multiple HIV strains represented in the vaccine. The set of genetic distances, only observed in failures, is collectively termed the 'mark.' The objective has motivated a recent study of a multivariate mark-specific hazard ratio model in the competing risks failure time analysis framework. Marks of interest, however, are commonly subject to substantial missingness, largely due to rapid post-acquisition viral evolution. In this article, we investigate the mark-specific hazard ratio model with missing multivariate marks and develop two inferential procedures based on (i) inverse probability weighting (IPW) of the complete cases, and (ii) augmentation of the IPW estimating functions by leveraging auxiliary data predictive of the mark. Asymptotic properties and finite-sample performance of the inferential procedures are presented. This research also provides general inferential methods for semiparametric density ratio/biased sampling models with missing data. We apply the developed procedures to data from the HVTN 502 'Step' HIV VE trial.
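
The inverse probability weighting idea behind the first procedure can be shown in a toy setting: weight each complete case by the inverse of its observation probability so the complete cases represent the full sample. The observation probability below is assumed known and constant, far simpler than the estimated, covariate-dependent weights the paper uses.

```python
import random

random.seed(7)

# Toy IPW sketch: "marks" are missing completely at random with known
# probability p_obs. Weighting complete cases by 1/p_obs recovers the
# full-sample mean. (Under MCAR the complete-case mean is also unbiased;
# IPW matters when missingness depends on covariates, as in the paper.)
marks = [random.gauss(0.3, 0.1) for _ in range(20000)]
p_obs = 0.6
observed = [m for m in marks if random.random() < p_obs]

naive_mean = sum(observed) / len(observed)                # complete-case mean
ipw_mean = sum(m / p_obs for m in observed) / len(marks)  # IPW estimate

print(round(naive_mean, 2), round(ipw_mean, 2))  # both near 0.30 here
```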

  9. Uncertainties in modeling hazardous gas releases for emergency response

    Directory of Open Access Journals (Sweden)

    Kathrin Baumann-Stanzer

    2011-02-01

    Full Text Available In case of an accidental release of toxic gases the emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammoniac and butane releases are compared in this study. The variations of the model results are measures for uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stability and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary up to a factor of 4 due to different input requirements as well as due to different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind-speeds are correlated to in-situ observations at two urban sites in Vienna with a factor of 0.89. The standard deviations of the normal error distribution are 0.8 ms-1 in wind speed, on the scale of 50 degrees in wind direction, up to 4°C in air temperature and up to 10 % in relative humidity. The observed air temperature and humidity are well reproduced by INCA with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. Besides of real-time data, the INCA-short range forecast for the following hours may support the action planning of the first responders.

  10. Uncertainty in natural hazards, modeling and decision support: An introduction to this volume [Chapter 1

    Science.gov (United States)

    Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde

    2017-01-01

    Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...

  11. On penalized likelihood estimation for a non-proportional hazards regression model.

    Science.gov (United States)

    Devarajan, Karthik; Ebrahimi, Nader

    2013-07-01

    In this paper, a semi-parametric generalization of the Cox model that permits crossing hazard curves is described. A theoretical framework for estimation in this model is developed based on penalized likelihood methods. It is shown that the optimal solution to the baseline hazard, baseline cumulative hazard and their ratio are hyperbolic splines with knots at the distinct failure times.
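
To make "crossing hazard curves" concrete, here is a small numerical check with illustrative Weibull parameters (not the paper's estimator): two Weibull hazards with different shape parameters cross, so their ratio is not constant and the proportional hazards assumption fails.

```python
# Weibull hazard: h(t) = (k/lam) * (t/lam)**(k-1). With different shape
# parameters, two groups' hazard curves cross -- the non-proportional
# setting the paper's model accommodates.
def hazard(t, lam, k):
    return (k / lam) * (t / lam) ** (k - 1)

early, late = 0.2, 2.0
g1 = [hazard(t, 1.0, 0.8) for t in (early, late)]  # decreasing hazard
g2 = [hazard(t, 1.0, 1.6) for t in (early, late)]  # increasing hazard

# Group 1 is riskier early, group 2 later: the hazard ratio is not constant.
print(g1[0] > g2[0], g1[1] < g2[1])  # True True
```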

  12. Jackknifed random weighting for Cox proportional hazards model

    Institute of Scientific and Technical Information of China (English)

    LI Xiao; WU YaoHua; TU DongSheng

    2012-01-01

    The Cox proportional hazards model is the most widely used statistical model in the analysis of survival time data. Recently, a random weighting method was proposed to approximate the distribution of the maximum partial likelihood estimate of the regression coefficient in the Cox model. In simulation studies this method was shown to be less sensitive to heavy censoring than the bootstrap, but it may not be second-order accurate, as was shown for the bootstrap approximation. In this paper, we propose an alternative random weighting method based on one-step linear jackknife pseudo-values and prove the second-order accuracy of the proposed method. Monte Carlo simulations are also performed to evaluate the proposed method for fixed sample sizes.
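
The jackknife pseudo-value construction at the heart of the proposal can be illustrated with the simplest possible statistic. For the sample mean, the pseudo-values reduce to the observations themselves; the paper applies the same construction to one-step approximations of the Cox partial-likelihood estimator. The sketch below is only the generic construction, not the paper's method.

```python
# Generic jackknife pseudo-values: n * full-sample estimate minus
# (n - 1) * leave-one-out estimate, one per observation. Shown for the
# sample mean, where the pseudo-values equal the observations.
def jackknife_pseudo(values, stat=lambda v: sum(v) / len(v)):
    n = len(values)
    full = stat(values)
    return [n * full - (n - 1) * stat(values[:i] + values[i + 1:])
            for i in range(n)]

print([round(p, 6) for p in jackknife_pseudo([1.0, 2.0, 3.0, 4.0])])
# [1.0, 2.0, 3.0, 4.0]
```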

  13. Regularization for Cox's Proportional Hazards Model With NP-Dimensionality

    CERN Document Server

    Bradic, Jelena; Jiang, Jiancheng

    2010-01-01

    High-throughput genetic sequencing arrays with thousands of measurements per sample, together with large amounts of related censored clinical data, have increased the demand for better model selection in high dimensions. In this paper we establish strong oracle properties of non-concave penalized methods for non-polynomial (NP) dimensional data with censoring in the framework of Cox's proportional hazards model. A class of folded-concave penalties is employed, and both LASSO and SCAD are discussed specifically. We address the question of the dimensionality and correlation restrictions under which an oracle estimator can be constructed. It is demonstrated that non-concave penalties lead to a significant relaxation of the "irrepresentable condition" needed for LASSO model selection consistency. A large deviation result for martingales, of interest in its own right, is developed for characterizing the strong oracle property. Moreover, the non-concave regularized estimator is shown to achieve asymptotically ...
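
A building block of LASSO-type penalized estimation mentioned above is soft-thresholding, which each coordinate update applies to shrink small coefficients exactly to zero. This is a generic sketch; SCAD replaces it with a folded-concave rule that shrinks large coefficients less.

```python
# Soft-thresholding operator used in LASSO coordinate updates:
# inputs within [-lam, lam] are set exactly to zero (sparsity),
# larger ones are shrunk toward zero by lam.
def soft_threshold(z, lam):
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

print(soft_threshold(2.5, 1.0), soft_threshold(-0.4, 1.0))  # 1.5 0.0
```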

  14. Conveying Lava Flow Hazards Through Interactive Computer Models

    Science.gov (United States)

    Thomas, D.; Edwards, H. K.; Harnish, E. P.

    2007-12-01

    As part of an Information Sciences senior class project, a software package implementing an interactive version of the FLOWGO model was developed for the Island of Hawaii. The software is intended for use in an ongoing public outreach and hazards awareness program that educates the public about lava flow hazards on the island. The design parameters for the model allow an unsophisticated user to initiate a lava flow anywhere on the island and let it flow downslope to the shoreline while displaying a timer that shows the rate of advance of the flow. The user is also able to modify a range of input parameters, including eruption rate, the temperature of the lava at the vent, and the crystal fraction present in the lava at the source. The flow trajectories are computed using a 30 m digital elevation model of the island, and the rate of advance of the flow is estimated using the average slope angle and the computed viscosity of the lava as it cools in either a channel (high heat loss) or a lava tube (low heat loss). Even though the FLOWGO model is not intended to, and cannot, accurately predict the rate of advance of a tube-fed or channel-fed flow, the relative rates of flow advance over steep or flat-lying terrain convey critically important hazard information to the public: communities located on the steeply sloping western flank of Mauna Loa may have no more than a few hours to evacuate in the face of a threatened flow from Mauna Loa's southwest rift, whereas communities on the more gently sloping eastern flanks of Mauna Loa and Kilauea may have weeks to months to prepare for evacuation. Further, the model can also show the effects of loss of critical infrastructure, with consequent impacts on access into and out of communities, electrical supply, and communications, as a result of lava flow emplacement.
    The interactive model has been well received in an outreach setting and typically generates greater involvement by the participants than has been the case with static maps.

  15. New Elements To Consider When Modeling the Hazards Associated with Botulinum Neurotoxin in Food.

    Science.gov (United States)

    Ihekwaba, Adaoha E C; Mura, Ivan; Malakar, Pradeep K; Walshaw, John; Peck, Michael W; Barker, G C

    2015-09-08

    Botulinum neurotoxins (BoNTs) produced by the anaerobic bacterium Clostridium botulinum are the most potent biological substances known to mankind. BoNTs are the agents responsible for botulism, a rare condition affecting the neuromuscular junction and causing a spectrum of diseases ranging from mild cranial nerve palsies to acute respiratory failure and death. BoNTs are a potential biowarfare threat and a public health hazard, since outbreaks of foodborne botulism are caused by the ingestion of preformed BoNTs in food. Currently, mathematical models relating to the hazards associated with C. botulinum, which are largely empirical, make major contributions to botulinum risk assessment. Evaluated using statistical techniques, these models simulate the response of the bacterium to environmental conditions. Though empirical models have been successfully incorporated into risk assessments to support food safety decision making, this process involves significant uncertainties, so that the resulting decision making is frequently conservative and inflexible. Progress involves encoding cellular processes into the models at a molecular level, especially the details of the genetic and molecular machinery. This addition strengthens the connection between biological mechanisms and botulism risk assessment and hazard management strategies. This review brings together elements currently described in the literature that will be useful in building quantitative models of C. botulinum neurotoxin production. It then outlines how the established form of modeling could be extended to include these new elements. Ultimately, this can offer further contributions to risk assessments to support food safety decision making.

  16. Hazard based models for freeway traffic incident duration.

    Science.gov (United States)

    Tavassoli Hojati, Ahmad; Ferreira, Luis; Washington, Simon; Charles, Phil

    2013-03-01

    Assessing and prioritising cost-effective strategies to mitigate the impacts of traffic incidents and accidents on non-recurrent congestion on major roads represents a significant challenge for road network managers. This research examines the influence of numerous factors associated with incidents of various types on their duration. It presents a comprehensive traffic incident data mining and analysis effort, developing an incident duration model based on twelve months of incident data obtained from the Australian freeway network. Parametric accelerated failure time (AFT) survival models of incident duration were developed, including log-logistic, lognormal, and Weibull models, considering both fixed and random parameters, as well as a Weibull model with gamma heterogeneity. The Weibull AFT models with random parameters were appropriate for modelling the duration of incidents arising from crashes and hazards, while a Weibull model with gamma heterogeneity was most suitable for modelling the duration of stationary-vehicle incidents. Significant variables affecting incident duration include characteristics of the incidents (severity, type, towing requirements, etc.) and the location, time of day, and traffic characteristics of the incident. Moreover, the findings reveal no significant effects of infrastructure and weather on incident duration. A significant and unique contribution of this paper is showing that the durations of each type of incident are distinctly different and respond to different factors. The results of this study are useful for traffic incident management agencies implementing strategies to reduce incident duration, leading to reduced congestion, secondary incidents, and the associated human and economic losses.
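
The Weibull building block behind these AFT models is easy to state. A minimal sketch with hypothetical parameters (not the fitted model): the survival and hazard functions, and the median duration they imply.

```python
import math

# Weibull survival S(t) = exp(-(t/lam)**k) and hazard h(t) = (k/lam)*(t/lam)**(k-1).
# A shape k > 1 means the clearance rate rises as an incident drags on.
# lam and k here are invented for illustration.
def weibull_survival(t, lam, k):
    return math.exp(-(t / lam) ** k)

def weibull_hazard(t, lam, k):
    return (k / lam) * (t / lam) ** (k - 1)

lam, k = 30.0, 1.5                      # hypothetical: scale 30 min, shape 1.5
median = lam * math.log(2) ** (1 / k)   # solves S(t) = 0.5
print(round(median, 1), round(weibull_survival(median, lam, k), 3))
```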

  17. Advancements in the global modelling of coastal flood hazard

    Science.gov (United States)

    Muis, Sanne; Verlaan, Martin; Nicholls, Robert J.; Brown, Sally; Hinkel, Jochen; Lincke, Daniel; Vafeidis, Athanasios T.; Scussolini, Paolo; Winsemius, Hessel C.; Ward, Philip J.

    2017-04-01

    Storm surges and high tides can cause catastrophic floods. Due to climate change and socio-economic development, the potential impacts of coastal floods are increasing globally. Global modelling of coastal flood hazard provides an important perspective from which to quantify and effectively manage this challenge. In this contribution we show two recent advancements in the global modelling of coastal flood hazard: 1) a new, improved global dataset of extreme sea levels, and 2) an improved vertical datum for extreme sea levels. Both developments have important implications for exposure estimates and inundation modelling. For over a decade, the only global dataset of extreme sea levels was the DINAS-COAST Extreme Sea Levels (DCESL) dataset, which uses a static approximation to estimate total water levels for different return periods. Recent advances have enabled the development of a new, dynamically derived dataset: the Global Tide and Surge Reanalysis (GTSR) dataset. Here we present a comparison of the DCESL and GTSR extreme sea levels and the resulting global flood exposure for present-day conditions. While DCESL generally overestimates extremes, GTSR underestimates extremes, particularly in the tropics. This results in differences in estimates of flood exposure: when using the 1-in-100-year GTSR extremes, the exposed global population is 28% lower than when using the 1-in-100-year DCESL extremes. Previous studies at continental to global scales have not accounted for the fact that GTSR and DCESL are referenced to mean sea level, whereas global elevation datasets, such as SRTM, are referenced to the EGM96 geoid. We propose a methodology to correct for the difference in vertical datum and demonstrate that this correction also has a large effect on exposure: for GTSR, it results in a 60% increase in global exposure.
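
The "1 in 100-year" levels discussed in this record come from extreme-value fits to sea-level maxima. A minimal sketch of that idea, using a Gumbel distribution with invented parameters (the actual datasets rest on far more elaborate statistical and hydrodynamic modelling):

```python
import math

# Gumbel return level: the level exceeded on average once every T years,
# given location mu and scale beta fitted to annual maxima.
# mu and beta below are invented for illustration.
def gumbel_return_level(mu, beta, T):
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

mu, beta = 1.2, 0.25  # metres (hypothetical fit)
rl100 = gumbel_return_level(mu, beta, 100)
print(round(rl100, 2))  # 1-in-100-year extreme sea level, metres
```

Note that return levels are relative to the fit's reference level, which is exactly why the vertical-datum correction described above matters when combining them with elevation data.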

  18. Model averaging for semiparametric additive partial linear models

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    To improve the prediction accuracy of semiparametric additive partial linear models (APLM) and the coverage probability of confidence intervals for the parameters of interest, we explore a focused information criterion for model selection among APLMs after estimating the nonparametric functions by polynomial spline smoothing, and we introduce a general model average estimator. The major advantage of the proposed procedures is that iterative backfitting is avoided, which results in gains in computational simplicity. The resulting estimators are shown to be asymptotically normal. A simulation study and a real data analysis are presented as illustrations.

  19. Preliminary deformation model for National Seismic Hazard map of Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Meilano, Irwan; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z. [Geodesy Research Division, Faculty of Earth Science and Technology, Institute of Technology Bandung (Indonesia); Susilo,; Efendi, Joni [Agency for Geospatial Information (BIG) (Indonesia)

    2015-04-24

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed from block rotation and strain accumulation functions in an elastic half-space. Deformation due to rigid-body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth, possibly in combination with creep in a shallower part. This research shows that rigid-body motion dominates the deformation pattern, with magnitudes of more than 15 mm/year, except in narrow areas near subduction zones and active faults where significant deformation reaches 25 mm/year.
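
The rigid-body part of such a model follows from an Euler rotation: the surface velocity at a point is v = ω × r. A sketch with an invented pole and rotation rate, not the model's actual block parameters:

```python
import math

R_EARTH = 6371e3  # mean Earth radius, metres

def velocity_mm_per_yr(lat, lon, pole_lat, pole_lon, rate_deg_per_myr):
    """Horizontal speed (mm/yr) at (lat, lon) for a rigid rotation about
    an Euler pole; all angles in degrees. Pole and rate are invented."""
    def unit(la, lo):
        la, lo = math.radians(la), math.radians(lo)
        return (math.cos(la) * math.cos(lo),
                math.cos(la) * math.sin(lo),
                math.sin(la))
    p, w = unit(lat, lon), unit(pole_lat, pole_lon)
    cross = (w[1] * p[2] - w[2] * p[1],   # w x p (unit vectors)
             w[2] * p[0] - w[0] * p[2],
             w[0] * p[1] - w[1] * p[0])
    omega = math.radians(rate_deg_per_myr) / 1e6  # rad/yr
    return omega * R_EARTH * math.hypot(*cross) * 1e3  # mm/yr

# A point on the equator rotating about the geographic north pole moves at
# the full rate (tens of mm/yr for typical plate rotation rates).
print(round(velocity_mm_per_yr(0.0, 110.0, 90.0, 0.0, 0.3), 1))
```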

  20. Optimization of maintenance policy using the proportional hazard model

    Energy Technology Data Exchange (ETDEWEB)

    Samrout, M. [Information Sciences and Technologies Institute, University of Technology of Troyes, 10000 Troyes (France)], E-mail: mohamad.el_samrout@utt.fr; Chatelet, E. [Information Sciences and Technologies Institute, University of Technology of Troyes, 10000 Troyes (France)], E-mail: chatelt@utt.fr; Kouta, R. [M3M Laboratory, University of Technology of Belfort Montbeliard (France); Chebbo, N. [Industrial Systems Laboratory, IUT, Lebanese University (Lebanon)

    2009-01-15

    The evolution of system reliability depends on its structure as well as on the evolution of its components' reliability. The latter is a function of component age during a system's operating life. Component aging is strongly affected by maintenance activities performed on the system. In this work, we consider two categories of maintenance activities: corrective maintenance (CM) and preventive maintenance (PM). Maintenance actions are characterized by their ability to reduce this age. PM consists of actions applied to components while they are operating, whereas CM actions occur when the component breaks down. In this paper, we expound a new method to integrate the effect of CM when planning the PM policy. The proportional hazard function was used as a modeling tool for that purpose. Interesting results were obtained when comparing policies that take the CM effect into consideration with those that do not.

  1. A DNA based model for addition computation

    Institute of Scientific and Technical Information of China (English)

    GAO Lin; YANG Xiao; LIU Wenbin; XU Jin

    2004-01-01

    Much effort has been made to solve computing problems using DNA, an organic simulation method that in some cases is preferable to the current electronic computer. However, no one has yet proposed an effective and applicable molecular algorithm for the addition problem, due to the difficulty of handling the carry, which is easily solved by the hardware of an electronic computer. In this article, we solve this problem by employing two kinds of DNA strands: one, called the result-and-operation strand, contains some carry information of its own and denotes the ultimate result, while the other, named the carrier, is used only for carrying. The significance of this algorithm lies in its original encoding, its fairly easy steps, and its feasibility under current molecular biological technology.
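
For contrast, the carry propagation that the molecular encoding must work around is trivial in a conventional setting, as this ripple-carry sketch over bit strings shows (a conceptual analogue, not the DNA algorithm itself):

```python
# Ripple-carry addition over bit strings: the carry from each bit position
# feeds the next -- easy in silicon, hard to encode with DNA strands.
def ripple_add(a, b):
    n = max(len(a), len(b))
    a, b = a.zfill(n), b.zfill(n)
    carry, out = 0, []
    for x, y in zip(reversed(a), reversed(b)):  # least significant bit first
        s = int(x) + int(y) + carry
        out.append(str(s % 2))
        carry = s // 2
    if carry:
        out.append('1')
    return ''.join(reversed(out))

print(ripple_add('1011', '0110'))  # 11 + 6 = 17 -> '10001'
```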

  2. Beyond Flood Hazard Maps: Detailed Flood Characterization with Remote Sensing, GIS and 2d Modelling

    Science.gov (United States)

    Santillan, J. R.; Marqueso, J. T.; Makinano-Santillan, M.; Serviano, J. L.

    2016-09-01

    Flooding is considered to be one of the most destructive of the many natural disasters, such that understanding floods and assessing the risks associated with them are becoming more important nowadays. In the Philippines, Remote Sensing (RS) and Geographic Information Systems (GIS) are the two main technologies used in the nationwide modelling and mapping of flood hazards. Although the currently available high-resolution flood hazard maps have become very valuable, their use for flood preparedness and mitigation can be maximized by enhancing the layers of information these maps portray. In this paper, we present an approach based on RS, GIS and two-dimensional (2D) flood modelling to generate new flood layers (in addition to the usual flood depth and hazard layers) that are also very useful in flood disaster management, such as flood arrival times, flood velocities, flood durations, flood recession times, and the percentage of a given flood event period during which a particular location is inundated. The availability of these new layers of flood information is crucial for better decision making before, during, and after the occurrence of a flood disaster. The generation of these new flood characteristic layers is illustrated using the Cabadbaran River Basin in Mindanao, Philippines as the case study area. It is envisioned that these detailed maps can be considered as additional inputs in flood disaster risk reduction and management in the Philippines.

  3. BEYOND FLOOD HAZARD MAPS: DETAILED FLOOD CHARACTERIZATION WITH REMOTE SENSING, GIS AND 2D MODELLING

    Directory of Open Access Journals (Sweden)

    J. R. Santillan

    2016-09-01

    Full Text Available Flooding is considered to be one of the most destructive of the many natural disasters, such that understanding floods and assessing the risks associated with them are becoming more important nowadays. In the Philippines, Remote Sensing (RS) and Geographic Information Systems (GIS) are the two main technologies used in the nationwide modelling and mapping of flood hazards. Although the currently available high resolution flood hazard maps have become very valuable, their use for flood preparedness and mitigation can be maximized by enhancing the layers of information these maps portray. In this paper, we present an approach based on RS, GIS and two-dimensional (2D) flood modelling to generate new flood layers (in addition to the usual flood depth and hazard layers) that are also very useful in flood disaster management, such as flood arrival times, flood velocities, flood durations, flood recession times, and the percentage of a given flood event period during which a particular location is inundated. The availability of these new layers of flood information is crucial for better decision making before, during, and after the occurrence of a flood disaster. The generation of these new flood characteristic layers is illustrated using the Cabadbaran River Basin in Mindanao, Philippines as the case study area. It is envisioned that these detailed maps can be considered as additional inputs in flood disaster risk reduction and management in the Philippines.

  4. Hydraulic modeling for lahar hazards at Cascades volcanoes

    Science.gov (United States)

    Costa, J.E.

    1997-01-01

    The National Weather Service flood routing model DAMBRK is able to closely replicate field-documented stages of historic and prehistoric lahars from Mt. Rainier, Washington, and Mt. Hood, Oregon. Modeled time-of-travel of flow waves is generally consistent with documented lahar travel times from other volcanoes around the world. The model adequately replicates a range of lahars and debris flows, including the 230 million m3 Electron lahar from Mt. Rainier, as well as a 10 m3 debris flow generated in a large outdoor experimental flume. The model is used to simulate a hypothetical lahar with a volume of 50 million m3 down the East Fork Hood River from Mt. Hood, Oregon. Although a flow such as this is thought to be possible in the Hood River valley, no field evidence exists on which to base a hazards assessment. DAMBRK seems likely to be usable in many volcanic settings to estimate discharge, velocity, and inundation areas of lahars when input hydrographs and energy-loss coefficients can be reasonably estimated.

  5. Opinion: the use of natural hazard modeling for decision making under uncertainty

    Institute of Scientific and Technical Information of China (English)

    David E Calkin; Mike Mentis

    2015-01-01

    Decision making to mitigate the effects of natural hazards is a complex undertaking fraught with uncertainty. Models to describe risks associated with natural hazards have proliferated in recent years. Concurrently, there is a growing body of work focused on developing best practices for natural hazard modeling and on creating structured evaluation criteria for complex environmental models. However, to our knowledge there has been less focus on the conditions under which decision makers can confidently rely on results from these models. In this review we propose a preliminary set of conditions necessary for the appropriate application of modeled results to natural hazard decision making and provide relevant examples from US wildfire management programs.

  6. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  7. Regional Integrated Meteorological Forecasting and Warning Model for Geological Hazards Based on Logistic Regression

    Institute of Scientific and Technical Information of China (English)

    XU Jing; YANG Chi; ZHANG Guoping

    2007-01-01

    An information model is adopted to integrate various geoscientific factors and estimate the susceptibility to geological hazards. Further combining dynamic rainfall observations, logistic regression is used to model the probabilities of geological hazard occurrence, from which hierarchical warnings for rainfall-induced geological hazards are produced. The forecasting and warning model takes numerical precipitation forecasts on grid points as its dynamic input, forecasts the probabilities of geological hazard occurrence on the same grid, and translates the results into likelihoods in the form of a 5-level hierarchy. Validation of the model against observational data for 2004 shows that 80% of the geological hazards of that year were identified as "likely enough to release warning messages". The model can satisfy the requirements of an operational warning system and is thus an effective way to improve meteorological warnings for geological hazards.
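    The pipeline described in this record (a static susceptibility index combined with gridded rainfall input via logistic regression, then translated into a 5-level warning) can be sketched as follows. The coefficients and thresholds below are hypothetical placeholders, not the paper's calibrated values:

```python
import math

# Hypothetical logistic-regression coefficients (NOT the paper's values):
# intercept, weight on the susceptibility index, weight on 24-h rainfall (mm).
COEFFS = {"intercept": -4.0, "susceptibility": 1.2, "rain_24h": 0.03}

def hazard_probability(susceptibility, rain_24h_mm):
    """Logistic model: P = 1 / (1 + exp(-(b0 + b1*s + b2*r)))."""
    z = (COEFFS["intercept"]
         + COEFFS["susceptibility"] * susceptibility
         + COEFFS["rain_24h"] * rain_24h_mm)
    return 1.0 / (1.0 + math.exp(-z))

def warning_level(prob):
    """Map a probability to a 5-level hierarchy (thresholds are assumptions)."""
    thresholds = [0.2, 0.4, 0.6, 0.8]  # level 1 (lowest) .. level 5 (highest)
    return 1 + sum(prob > t for t in thresholds)

# One grid point: moderately susceptible terrain under heavy forecast rain.
p = hazard_probability(susceptibility=2.5, rain_24h_mm=120)
print(warning_level(p))
```

Run per grid cell over the forecast field, this yields the hierarchical warning map the abstract describes.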

  8. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameter that controls earthquake-activity rates, relative to what was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight with equal weighting for the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  9. Seismic hazard assessment of Sub-Saharan Africa using geodetic strain rate models

    Science.gov (United States)

    Poggi, Valerio; Pagani, Marco; Weatherill, Graeme; Garcia, Julio; Durrheim, Raymond J.; Mavonga Tuluka, Georges

    2016-04-01

    The East African Rift System (EARS) is the major active tectonic feature of the Sub-Saharan Africa (SSA) region. Although the seismicity level of such a divergent plate boundary can be described as moderate, several earthquakes have been reported in historical times causing a non-negligible level of damage, albeit mostly due to the high vulnerability of the local buildings and structures. Formulation and enforcement of national seismic codes is therefore an essential future risk mitigation strategy. Nonetheless, a reliable risk assessment cannot be done without the calibration of an updated seismic hazard model for the region. Unfortunately, the major issue in assessing seismic hazard in Sub-Saharan Africa is the lack of basic information needed to construct source and ground motion models. The historical earthquake record is largely incomplete, while the instrumental catalogue is complete down to sufficient magnitude only for a relatively short time span. In addition, mapping of seismogenically active faults is still an ongoing program. Recent studies have identified major seismogenic lineaments, but there is a substantial lack of kinematic information for intermediate-to-small scale tectonic features, information that is essential for the proper calibration of earthquake recurrence models. To compensate for this lack of information, we experiment with the use of a strain rate model recently developed by Stamps et al. (2015) in the framework of an earthquake hazard and risk project along the EARS supported by USAID and jointly carried out by GEM and AfricaArray. We use the inferred geodetic strain rates to derive estimates of total scalar moment release, subsequently used to constrain earthquake recurrence relationships for both area (distributed seismicity) and fault source models. The rates obtained indirectly from strain rates and those more classically derived from the available seismic catalogues are then compared and combined into a unique mixed earthquake recurrence model.

  10. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of the leak path factor (LPF), i.e., the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. This LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and upon other pathways from the building, such as doorways, both open and closed. This study shows how the multiple LPFs within the building interior can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled and the dose received by a downwind receptor estimated, with the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). This study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
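    The combinatory evaluation described here can be illustrated with a minimal sketch: the total leak path factor along a release pathway is the product of the per-segment leak fractions, which can then be compared against the assumed flat 0.5 × 0.5. All numbers below are hypothetical, not MELCOR results:

```python
from functools import reduce
from operator import mul

def total_lpf(segment_lpfs):
    """Total LPF = product of the leak path factors of each sequential barrier."""
    return reduce(mul, segment_lpfs, 1.0)

# The assumed standard multiplication versus a pathway-by-pathway evaluation
# (hypothetical fractions for, e.g., room -> corridor -> doorway).
assumed = total_lpf([0.5, 0.5])
modeled = total_lpf([0.3, 0.7, 0.9])

# Respirable source term = material-at-risk fraction * total LPF (illustrative).
source_term = 1.0e-3 * modeled
print(assumed, modeled)
```

The point of the sketch is that the modeled product need not equal 0.25; it depends on how many barriers the pathway crosses and how leaky each one is.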

  11. A median voter model of health insurance with ex post moral hazard.

    Science.gov (United States)

    Jacob, Johanna; Lundin, Douglas

    2005-03-01

    One of the main features of health insurance is moral hazard, as defined by Pauly (1968, The economics of moral hazard: comment, American Economic Review 58, 531-537): people face incentives for excess utilization of medical care since they do not pay the full marginal cost of provision. To mitigate the moral hazard problem, a coinsurance rate can be included in the insurance contract. But health insurance is often publicly provided. Having a uniform coinsurance rate determined in a political process is quite different from having different rates varying in accordance with one's preferences, as is possible with private insurance. We construct a political economy model in order to characterize the political equilibrium and answer questions like: "Under what conditions is there a conflict in society over what coinsurance rate should be set?" and "Which groups of individuals will vote for a higher and lower than equilibrium coinsurance rate, respectively?". We also extend our basic model to allow people to supplement the coverage provided by the government with private insurance. We then answer two questions: "Who will buy the additional coverage?" and "How do the coinsurance rates people now face compare with the rates chosen under purely private provision?".
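    The political-equilibrium logic can be illustrated with a minimal sketch of the median voter result: under single-peaked preferences, majority voting over a uniform coinsurance rate selects the median of the individually preferred rates, and voters whose preferred rate exceeds the median would vote for a higher rate. The preferred rates below are hypothetical:

```python
import statistics

# Hypothetical individually preferred coinsurance rates, one per voter.
preferred_rates = [0.05, 0.10, 0.15, 0.20, 0.40]

# Median voter result: majority voting selects the median preferred rate.
equilibrium_rate = statistics.median(preferred_rates)

# Groups relative to the equilibrium: these voters would prefer a higher rate.
vote_higher = [r for r in preferred_rates if r > equilibrium_rate]
print(equilibrium_rate, vote_higher)
```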

  12. Assessment and indirect adjustment for confounding by smoking in cohort studies using relative hazards models.

    Science.gov (United States)

    Richardson, David B; Laurier, Dominique; Schubauer-Berigan, Mary K; Tchetgen Tchetgen, Eric; Cole, Stephen R

    2014-11-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950-2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950-2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer--a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented.
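    The indirect-adjustment step can be sketched as simple arithmetic: divide the observed hazard ratio by a bias factor attributed to the unmeasured confounder, where the bias factor would in practice be inferred via the negative control outcome. The numbers below are illustrative, chosen only to reproduce an 18%-style decrease, and are not the study's estimates:

```python
def indirectly_adjusted_hr(observed_hr, bias_factor):
    """Adjusted HR = observed HR / bias factor attributable to confounding."""
    return observed_hr / bias_factor

# Hypothetical observed radiation-lung cancer hazard ratio.
hr_obs = 1.50
# Hypothetical bias factor implying roughly an 18% decrease after adjustment.
bias = 1.50 / 1.23

print(round(indirectly_adjusted_hr(hr_obs, bias), 2))
```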

  13. Multiple Landslide-Hazard Scenarios Modeled for the Oakland-Berkeley Area, Northern California

    Science.gov (United States)

    Pike, Richard J.; Graymer, Russell W.

    2008-01-01

    With the exception of Los Angeles, perhaps no urban area in the United States is more at risk from landsliding, triggered by either precipitation or earthquake, than the San Francisco Bay region of northern California. By January each year, seasonal winter storms usually bring moisture levels of San Francisco Bay region hillsides to the point of saturation, after which additional heavy rainfall may induce landslides of various types and levels of severity. In addition, movement at any time along one of several active faults in the area may generate an earthquake large enough to trigger landslides. The danger to life and property rises each year as local populations continue to expand and more hillsides are graded for development of residential housing and its supporting infrastructure. The chapters in the text consist of: *Introduction by Russell W. Graymer *Chapter 1 Rainfall Thresholds for Landslide Activity, San Francisco Bay Region, Northern California by Raymond C. Wilson *Chapter 2 Susceptibility to Deep-Seated Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike and Steven Sobieszczyk *Chapter 3 Susceptibility to Shallow Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Kevin M. Schmidt and Steven Sobieszczyk *Chapter 4 Landslide Hazard Modeled for the Cities of Oakland, Piedmont, and Berkeley, Northern California, from a M=7.1 Scenario Earthquake on the Hayward Fault Zone by Scott B. Miles and David K. Keefer *Chapter 5 Synthesis of Landslide-Hazard Scenarios Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike The plates consist of: *Plate 1 Susceptibility to Deep-Seated Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike, Russell W. Graymer, Sebastian Roberts, Naomi B. Kalman, and Steven Sobieszczyk *Plate 2 Susceptibility to Shallow Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Kevin M. 
Schmidt and Steven Sobieszczyk.

  14. On Model Specification and Selection of the Cox Proportional Hazards Model*

    OpenAIRE

    Lin, Chen-Yen; Halabi, Susan

    2013-01-01

    Prognosis plays a pivotal role in patient management and trial design. A useful prognostic model should correctly identify important risk factors and estimate their effects. In this article, we discuss several challenges in selecting prognostic factors and estimating their effects using the Cox proportional hazards model. Although it has a flexible semiparametric form, the Cox model is not entirely exempt from model misspecification. To minimize possible misspecification, instead of imposing tradi...

  15. Hidden Markov models for estimating animal mortality from anthropogenic hazards

    Science.gov (United States)

    Carcasses searches are a common method for studying the risk of anthropogenic hazards to wildlife, including non-target poisoning and collisions with anthropogenic structures. Typically, numbers of carcasses found must be corrected for scavenging rates and imperfect detection. ...

  16. Modelling Inland Flood Events for Hazard Maps in Taiwan

    Science.gov (United States)

    Ghosh, S.; Nzerem, K.; Sassi, M.; Hilberts, A.; Assteerawatt, A.; Tillmanns, S.; Mathur, P.; Mitas, C.; Rafique, F.

    2015-12-01

    Taiwan experiences significant inland flooding, driven by torrential rainfall from plum rain storms and typhoons during summer and fall. Over the last 13 to 16 years, around 3,000 buildings were damaged by such floods annually, with losses of US$0.41 billion (Water Resources Agency, WRA). This long, narrow island nation with mostly hilly/mountainous topography is located in the tropical-subtropical zone, with an annual average typhoon-hit frequency of 3-4 (Central Weather Bureau) and annual average precipitation of 2,502 mm (WRA), 2.5 times the world average. Spatial and temporal distributions of countrywide precipitation are uneven, with very high local extreme rainfall intensities. Annual average precipitation is 3,000-5,000 mm in the mountainous regions, 78% of which falls in May-October, and the 1-hour to 3-day maximum rainfalls are about 85 to 93% of the world records (WRA). Rivers in Taiwan are short, with small upstream areas and high watershed runoff coefficients. These rivers have the steepest slopes, the shortest response times with rapid flows, and the largest peak flows as well as specific flood peak discharges (WRA) in the world. RMS has recently developed a countrywide inland flood model for Taiwan, producing hazard return-period maps at 1-arcsec grid resolution. These can be the basis for evaluating and managing flood risk, its economic impacts, and insured flood losses. The model is initiated with sub-daily historical meteorological forcings and calibrated to daily discharge observations at about 50 river gauges over the period 2003-2013. Simulations of hydrologic processes, via rainfall-runoff and routing models, are subsequently performed based on a 10,000-year set of stochastic forcing. The rainfall-runoff model is a physically based, continuous, semi-distributed model for catchment hydrology. The 1-D wave propagation hydraulic model considers catchment runoff in routing and describes large-scale transport processes along the river. It also accounts for reservoir storage

  17. Global Hydrological Hazard Evaluation System (Global BTOP) Using Distributed Hydrological Model

    Science.gov (United States)

    Gusyev, M.; Magome, J.; Hasegawa, A.; Takeuchi, K.

    2015-12-01

    A global hydrological hazard evaluation system based on the BTOP models (Global BTOP) is introduced; it quantifies flood and drought hazards with simulated river discharges globally for historical analysis, near real-time monitoring, and climate change impact studies. The BTOP model utilizes a modified topographic index concept and simulates rainfall-runoff processes including snowmelt, overland flow, soil moisture in the root and unsaturated zones, sub-surface flow, and river flow routing. The current global BTOP is constructed from global data on a 10-min grid and is available for river basin analysis at local, regional, and global scales. To reduce the impact of the coarse resolution, topographical features of global BTOP were obtained using a river network upscaling algorithm that preserves the fine-resolution characteristics of the 3-arcsec HydroSHEDS and 30-arcsec Hydro1K datasets. In addition, GLCC-IGBP land cover (USGS) and the DSMW (FAO) were used for the root zone depth and soil properties, respectively. The long-term seasonal potential evapotranspiration within the BTOP model was estimated by the Shuttleworth-Wallace model using the CRU TS3.1 climate forcing data and GIMMS-NDVI (UMD/GLCF). The global BTOP was run with globally available precipitation such as the APHRODITE dataset and showed good statistical performance compared to global and local river discharge data in the major river basins. From the simulated daily river discharges at each grid cell, flood peak discharges for selected return periods were obtained using the Gumbel distribution with L-moments, and hydrological drought hazard was quantified using the standardized runoff index (SRI). For dynamic (near real-time) applications, the global BTOP model is run with GSMaP-NRT global precipitation, and simulated daily river discharges are utilized in a prototype near real-time discharge simulation system (GFAS-Streamflow), which is used to issue flood peak discharge alerts globally. The global BTOP system and GFAS
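    The flood-frequency step (fitting a Gumbel distribution by L-moments to an annual-maximum discharge series and reading off return-period peaks) can be sketched as follows; the discharge values are made up for illustration:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_lmoments(sample):
    """Estimate Gumbel location (xi) and scale (alpha) from sample L-moments.

    Uses the unbiased probability-weighted-moment estimator b1; for the
    Gumbel distribution, lambda2 = alpha * ln(2) and
    lambda1 = xi + EULER_GAMMA * alpha.
    """
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * v for i, v in enumerate(x)) / n
    lam2 = 2 * b1 - b0
    alpha = lam2 / math.log(2)
    xi = b0 - EULER_GAMMA * alpha
    return xi, alpha

def return_period_discharge(xi, alpha, T):
    """Gumbel quantile for return period T years: x = xi - alpha*ln(-ln(1-1/T))."""
    return xi - alpha * math.log(-math.log(1 - 1 / T))

# Illustrative annual-maximum discharges (m3/s) at one grid cell.
annual_maxima = [820, 1150, 940, 1310, 1020, 1480, 880, 1230, 990, 1100]
xi, alpha = gumbel_lmoments(annual_maxima)
q100 = return_period_discharge(xi, alpha, 100)
```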

  18. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    Science.gov (United States)

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first-generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and

  19. Conceptual geoinformation model of natural hazards risk assessment

    Science.gov (United States)

    Kulygin, Valerii

    2016-04-01

    Natural hazards are the major threat to safe interactions between nature and society. The assessment of natural hazard impacts and their consequences is important in spatial planning and resource management. Today there is a challenge to advance our understanding of how socio-economic and climate changes will affect the frequency and magnitude of hydro-meteorological hazards and associated risks. However, the impacts from different types of natural hazards on various marine and coastal economic activities are not of the same type. In this study, a conceptual geomodel of risk assessment is presented to highlight the differentiation by type of economic activity in extreme-event risk assessment. The marine and coastal ecosystems are considered as the objects of management, on the one hand, and as the place of natural hazards' origin, on the other. One of the key elements in describing such systems is the spatial characterization of their components. Assessment of ecosystem state is based on ecosystem indicators (indexes), which are used to identify changes over time. The scenario approach is utilized to account for spatio-temporal dynamics and uncertainty factors. Two types of scenarios are considered: scenarios of ecosystem-service use by economic activities, and scenarios of extreme events and related hazards. The reported study was funded by RFBR, according to research project No. 16-35-60043 mol_a_dk.

  20. Use of agent-based modelling in emergency management under a range of flood hazards

    Directory of Open Access Journals (Sweden)

    Tagg Andrew

    2016-01-01

    Full Text Available The Life Safety Model (LSM) was developed some 15 years ago, originally for dam break assessments and for informing reservoir evacuation and emergency plans. Alongside other technological developments, the model has evolved into a very useful agent-based tool, with many applications for a range of hazards and receptor behaviour. HR Wallingford became involved in its use in 2006, and is now responsible for its technical development and commercialisation. Over the past 10 years the model has been applied to a range of flood hazards, including coastal surge, river flood, dam failure and tsunami, and has been verified against historical events. Commercial software licences are being used in Canada, Italy, Malaysia and Australia. A core group of LSM users and analysts has been specifying and delivering a programme of model enhancements. These include improvements to traffic behaviour at intersections, new algorithms for sheltering in high-rise buildings, and the addition of monitoring points to allow detailed analysis of vehicle and pedestrian movement. Following user feedback, the ability of LSM to handle large model ‘worlds’ and hydrodynamic meshes has been improved. Recent developments include new documentation, performance enhancements, better logging of run-time events and bug fixes. This paper describes some of the recent developments and summarises some of the case study applications, including dam failure analysis in Japan and mass evacuation simulation in England.

  1. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    Science.gov (United States)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is a risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira (Greece), and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site, numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes

  2. Debris flow hazard modelling on medium scale: Valtellina di Tirano, Italy

    OpenAIRE

    J. Blahut; P. Horton; S. Sterlacchini; Jaboyedoff, M.

    2010-01-01

    Debris flow hazard modelling at medium (regional) scale has been subject of various studies in recent years. In this study, hazard zonation was carried out, incorporating information about debris flow initiation probability (spatial and temporal), and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of th...

  3. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    Science.gov (United States)

    Grasso, S.; Maugeri, M.

    rigorous complex methods of analysis or qualitative procedures. A semi-quantitative procedure based on the definition of a geotechnical hazard index has been applied for the zonation of the seismic geotechnical hazard of the city of Catania. In particular, this procedure has been applied to define the influence of the geotechnical properties of soil in a central area of the city of Catania, where some historical buildings of great importance are sited. An investigation was also performed based on the inspection of more than one hundred historical ecclesiastical buildings of great importance located in the city. Then, in order to identify the amplification effects due to site conditions, a geotechnical survey form was prepared to allow a semi-quantitative evaluation of the seismic geotechnical hazard for all these historical buildings. In addition, to evaluate the foundation soil time-history response, a 1-D dynamic soil model was employed for all these buildings, considering the nonlinearity of soil behaviour. Using a GIS, a map of the seismic geotechnical hazard, a map of the liquefaction hazard, and a preliminary map of the seismic hazard for the city of Catania have been obtained. From the analysis of the obtained results it may be noticed that high hazard zones are mainly clayey sites

  4. Modelling the costs of natural hazards in games

    Science.gov (United States)

    Bostenaru-Dan, M.

    2012-04-01

    City are looked for today, including a development at the University of Torino called SimTorino, which simulates the development of the city over the next 20 years. The connection to another game genre besides video games, board games, will be investigated, since there are games on the construction and reconstruction of a cathedral, its tower, and a bridge in an urban environment of the Middle Ages, based on the two novels of Ken Follett, "Pillars of the Earth" and "World Without End", and also more recent games, such as "Urban Sprawl" or the Romanian game "Habitat", dealing with the man-made hazard of demolition. A review of these games will be provided based on first-hand playing experience. In games like "World Without End" or "Pillars of the Earth", just like in the recently popular games of Zynga on social networks, construction management is done by "building" an item out of stylised materials, such as "stone", "sand" or more specific ones such as "nail". Such an approach could also be used for retrofitting buildings against earthquakes, as a kind of "upgrade" rather than just the extension currently found in games, and this is what our research is about. "World Without End" includes a natural disaster not so analysed today but judged by the author as the worst in the history of mankind: the Black Death. The Black Death has effects and costs as well, modelled not only through action cards but also on the built environment, by buildings remaining empty. On the other hand, games such as "Habitat" rely on role playing, which has recently been recognised as a way to bring game theory into decision making through the so-called contribution of drama, a way to solve conflicts through balancing instead of weighting, and thus related to the Analytic Hierarchy Process. The presentation also aims to give hints on how to design a game for the problem of earthquake retrofit, translating the aims of the actors in such a process into role playing.
Games are also employed in teaching of urban

  5. Expert elicitation for a national-level volcano hazard model

    Science.gov (United States)

    Bebbington, Mark; Stirling, Mark; Cronin, Shane; Wang, Ting; Jolly, Gill

    2016-04-01

    The quantification of volcanic hazard at the national level is a vital prerequisite to placing volcanic risk on a platform that permits meaningful comparison with other hazards such as earthquakes. New Zealand has up to a dozen dangerous volcanoes, with the usual mixed degrees of knowledge concerning their temporal and spatial eruptive history. Information on the 'size' of the eruptions, be it in terms of VEI, volume or duration, is sketchy at best. These limitations and the need for a uniform approach lend themselves to a subjective hazard analysis via expert elicitation. Approximately 20 New Zealand volcanologists provided estimates for the size of the next eruption from each volcano and, conditional on this, its location, timing and duration. Opinions were likewise elicited from a control group of statisticians, seismologists and (geo)chemists, all of whom had at least heard the term 'volcano'. The opinions were combined via the Cooke classical method. We will report on the preliminary results from the exercise.
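
    The Cooke classical method scores experts against calibration (seed) questions and then combines their judgements with performance-based weights. As a rough illustration of the combination step only, a performance-weighted linear opinion pool can be sketched as follows; the weights below are assumed for illustration, not derived from seed questions as Cooke's method requires:

    ```python
    def pool_experts(estimates, weights):
        """Performance-weighted linear opinion pool.

        estimates: per-expert probabilities for the same event
        weights:   non-negative performance weights (e.g. calibration
                   times information scores); normalised internally
        """
        total = sum(weights)
        if total <= 0:
            raise ValueError("weights must sum to a positive value")
        return sum(w * e for w, e in zip(weights, estimates)) / total

    # Three hypothetical experts estimate the probability of an
    # eruption within 50 years; the first is weighted twice as highly.
    p = pool_experts([0.10, 0.30, 0.20], weights=[2.0, 1.0, 1.0])
    ```

    In the full method the weights themselves are computed from each expert's calibration performance, and experts below a cutoff receive zero weight.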

  6. The influence of mapped hazards on risk beliefs: a proximity-based modeling approach.

    Science.gov (United States)

    Severtson, Dolores J; Burt, James E

    2012-02-01

    Interview findings suggest perceived proximity to mapped hazards influences risk beliefs when people view environmental hazard maps. For dot maps, four attributes of mapped hazards influenced beliefs: hazard value, proximity, prevalence, and dot patterns. In order to quantify the collective influence of these attributes for viewers' perceived or actual map locations, we present a model to estimate proximity-based hazard or risk (PBH) and share study results that indicate how modeled PBH and map attributes influenced risk beliefs. The randomized survey study among 447 university students assessed risk beliefs for 24 dot maps that systematically varied by the four attributes. Maps depicted water test results for a fictitious hazardous substance in private residential wells and included a designated "you live here" location. Of the nine variables that assessed risk beliefs, the numerical susceptibility variable was most consistently and strongly related to map attributes and PBH. Hazard value, location in or out of a clustered dot pattern, and distance had the largest effects on susceptibility. Sometimes, hazard value interacted with other attributes, for example, distance had stronger effects on susceptibility for larger than smaller hazard values. For all combined maps, PBH explained about the same amount of variance in susceptibility as did attributes. Modeled PBH may have utility for studying the influence of proximity to mapped hazards on risk beliefs, protective behavior, and other dependent variables. Further work is needed to examine these influences for more realistic maps and representative study samples.

  7. Development of a Crash Risk Probability Model for Freeways Based on Hazard Prediction Index

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2014-12-01

    Full Text Available This study presents a method for the identification of hazardous situations on freeways. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne, Australia was selected as a test bed. Two categories of data, i.e. traffic data and accident records, are used for the analysis and modelling. In developing the crash risk probability model, a Hazard Prediction Index is formulated from the differences between traffic parameters and their threshold values. Seven different prediction indices are examined, and the best one is selected as the crash risk probability model based on prediction error minimisation.
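
    The abstract describes an index built from the differences between traffic parameters and threshold values, but does not give its exact form. A hypothetical sketch of such an index, summing the relative exceedance of each parameter over its threshold (parameter names and thresholds invented for illustration):

    ```python
    def hazard_prediction_index(params, thresholds):
        """Toy HPI: sum of relative exceedances of each traffic
        parameter over its threshold (0 while below threshold).
        Illustrative only; not the paper's actual formulation."""
        hpi = 0.0
        for name, value in params.items():
            limit = thresholds[name]
            hpi += max(0.0, (value - limit) / limit)
        return hpi

    obs = {"occupancy": 0.30, "speed_var": 60.0}   # hypothetical readings
    lim = {"occupancy": 0.25, "speed_var": 50.0}   # hypothetical thresholds
    risk = hazard_prediction_index(obs, lim)
    ```

    A fitted model would then map index values to crash probability, with the threshold values tuned to minimise prediction error as in the study.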

  8. Delayed geochemical hazard: Concept, digital model and case study

    Institute of Scientific and Technical Information of China (English)

    CHEN Ming; FENG Liu; Jacques Yvon

    2005-01-01

    Delayed Geochemical Hazard (DGH for short) denotes the whole process of a serious ecological and environmental hazard caused by the sudden reactivation and sharp release of a pollutant accumulated over the long term, transforming from stable to active species in a soil or sediment system due to changes in physical-chemical conditions (such as temperature, pH, Eh, moisture, or the concentration of organic matter) or a decrease in environmental capacity. The characteristics of DGH are discussed. The process of a typical DGH can be expressed as a nonlinear polynomial. The points where the first- and second-order derivatives of the polynomial reach zero, minimum and maximum are keys for risk assessment and hazard prediction. The process and mechanism of the hazard are principally due to the transformation of the pollutant among different species. The concepts of "total releasable content of pollutant" (TRCP) and "total concentration of active species" (TCAS) are defined to describe the mechanism of DGH. The possibility of temporal and spatial propagation is discussed. A case study shows that there exists a transformation mechanism of "gradual release" and "chain reaction" among the exchangeable species and those bound to carbonates, iron and manganese oxides, and organic matter, thus causing the delayed geochemical hazard.
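
    The key points mentioned in the abstract are the zeros and extrema of the first and second derivatives of the release polynomial. A minimal sketch, assuming a hypothetical cubic release curve (the paper's actual fitted polynomial is not reproduced here):

    ```python
    def polyval(coeffs, t):
        """Evaluate a0 + a1*t + a2*t^2 + ... via Horner's rule."""
        acc = 0.0
        for c in reversed(coeffs):
            acc = acc * t + c
        return acc

    def derivative(coeffs):
        """Coefficients of the first derivative."""
        return [i * c for i, c in enumerate(coeffs)][1:]

    # Hypothetical cumulative release curve R(t) = t^3 - 6t^2 + 9t
    R = [0.0, 9.0, -6.0, 1.0]
    dR = derivative(R)       # 9 - 12t + 3t^2
    d2R = derivative(dR)     # -12 + 6t

    # turning points: zeros of dR via the quadratic formula
    a, b, c = dR[2], dR[1], dR[0]
    disc = (b * b - 4 * a * c) ** 0.5
    turning = sorted(((-b - disc) / (2 * a), (-b + disc) / (2 * a)))
    # inflection point: zero of the linear second derivative
    inflection = -d2R[0] / d2R[1]
    ```

    For risk assessment the inflection point marks where the release rate itself peaks, i.e. the phase of fastest pollutant reactivation.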

  9. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced definitive guidance on the models that must be used in these efforts. To identify which models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purposes of this project were to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  11. A Quasi-Poisson Approach on Modeling Accident Hazard Index for Urban Road Segments

    Directory of Open Access Journals (Sweden)

    Lu Ma

    2014-01-01

    Full Text Available In light of recently emphasized studies on the risk evaluation of crashes, accident counts on specific transportation facilities are adopted to reflect the chance of crash occurrence. The current study introduces a more comprehensive measure that supplements accident risk with information on accident harmfulness, named the Accident Hazard Index (AHI) in the following context. Before the statistical analysis, datasets from various sources are integrated on a GIS platform, and the corresponding procedures are presented as an illustrative example for similar analyses. Then, a quasi-Poisson regression model is suggested for the analysis, and the results show that the model is appropriate for dealing with overdispersed count data; several key explanatory variables were found to have a significant impact on the estimation of the AHI. In addition, the effect of the weight assigned to different severity levels of accidents is examined, and the selection of the weight is also discussed.
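
    Quasi-Poisson regression handles overdispersed counts by estimating a dispersion parameter and inflating the Poisson standard errors by its square root. A minimal sketch of the Pearson dispersion estimate, with counts and fitted means invented for illustration:

    ```python
    def pearson_dispersion(y, mu, n_params):
        """Pearson estimate of the quasi-Poisson dispersion phi:
        phi = sum((y - mu)^2 / mu) / (n - p).
        phi > 1 indicates overdispersion; quasi-Poisson standard
        errors are the Poisson ones multiplied by sqrt(phi)."""
        n = len(y)
        chi2 = sum((yi - mi) ** 2 / mi for yi, mi in zip(y, mu))
        return chi2 / (n - n_params)

    # Hypothetical segment accident counts and fitted Poisson means
    counts = [0, 2, 4, 6]
    fitted = [3.0, 3.0, 3.0, 3.0]
    phi = pearson_dispersion(counts, fitted, n_params=1)
    ```

    A phi well above 1, as here, is the situation in which a plain Poisson model would understate the uncertainty of the AHI estimates.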

  12. Comparison of empirical and data driven hydrometeorological hazard models on coastal cities of São Paulo, Brazil

    Science.gov (United States)

    Koga-Vicente, A.; Friedel, M. J.

    2010-12-01

    Every year thousands of people are affected by flood and landslide hazards caused by rainstorms. The problem is more serious in tropical developing countries because of high susceptibility, a result of the large amount of energy available to form storms, and high vulnerability due to poor economic and social conditions. Predictive models of hazards are important tools to manage this kind of risk. In this study, a comparison of two different modeling approaches was made for predicting hydrometeorological hazards in 12 cities on the coast of São Paulo, Brazil, from 1994 to 2003. In the first approach, an empirical multiple linear regression (MLR) model was developed and used; the second approach used a type of unsupervised nonlinear artificial neural network called a self-organized map (SOM). By using twenty-three independent variables of susceptibility (precipitation, soil type, slope, elevation, and regional atmospheric system scale) and vulnerability (total population and its distribution, income and educational characteristics, poverty intensity, and human development index), binary hazard responses were obtained. Model performance assessed by cross-validation indicated that the respective MLR and SOM model accuracies were about 67% and 80%. Prediction accuracy can be improved by the addition of information, but the SOM approach is preferred because of sparse data and highly nonlinear relations among the independent variables.

  13. Opinion: the use of natural hazard modeling for decision making under uncertainty

    Directory of Open Access Journals (Sweden)

    David E Calkin

    2015-04-01

    Full Text Available Decision making to mitigate the effects of natural hazards is a complex undertaking fraught with uncertainty. Models to describe risks associated with natural hazards have proliferated in recent years. Concurrently, there is a growing body of work focused on developing best practices for natural hazard modeling and to create structured evaluation criteria for complex environmental models. However, to our knowledge there has been less focus on the conditions where decision makers can confidently rely on results from these models. In this review we propose a preliminary set of conditions necessary for the appropriate application of modeled results to natural hazard decision making and provide relevant examples within US wildfire management programs.

  14. Debris flow hazard modelling on medium scale: Valtellina di Tirano, Italy

    Directory of Open Access Journals (Sweden)

    J. Blahut

    2010-11-01

    Full Text Available Debris flow hazard modelling at medium (regional scale has been subject of various studies in recent years. In this study, hazard zonation was carried out, incorporating information about debris flow initiation probability (spatial and temporal, and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy. The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lacking data limited the use of process-based models for the runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information, and the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs for an empirical GIS-based model (Flow-R, developed at the University of Lausanne (Switzerland. An estimation of the debris flow magnitude was neglected as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model, with a 10 m resolution, was used together with landuse, geology and debris flow hazard initiation maps as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates the information about debris flow spreading direction probabilities, showing areas more likely to be affected by future debris flows. 
Limitations of the modelling arise

  15. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.

  16. A Study of the Slope of Cox Proportional Hazard and Weibull Models

    African Journals Online (AJOL)

    Adejumo & Ahmadu

    the values of the unknown parameters. These include the ... semiparametric Cox proportional hazard model when the parametric ... simulated and the real-life data approach. ... greatest risk of progression of TB infection to active disease. People.

  17. Investigation of the Effect of Traffic Parameters on Road Hazard Using Classification Tree Model

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2012-09-01

    Full Text Available This paper presents a method for the identification of hazardous situations on freeways. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne, Australia was selected as a test bed. Three categories of data, i.e. traffic, weather and accident records, were used for the analysis and modelling. A classification tree based model was developed in this study as the crash risk probability model. In formulating the models, it was found that weather conditions did not have a significant impact on accident occurrence, so the classification tree was built using only two traffic indices: traffic flow and vehicle speed. The formulated classification tree is able to identify possible hazard and non-hazard situations on the freeway. The outcome of the study will aid hazard mitigation strategies.
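
    A fitted classification tree over two indices reduces to nested threshold rules. A toy sketch of such a rule set; the split points and the branch logic are invented for illustration, not those estimated from the Eastern Freeway data:

    ```python
    def classify_situation(flow, speed, flow_split=1800.0, speed_split=60.0):
        """Toy two-level tree over the two indices retained in the
        paper (traffic flow in veh/h, speed in km/h). Split values
        and branch directions are hypothetical."""
        if flow > flow_split:            # dense traffic branch
            return "hazard" if speed > speed_split else "non-hazard"
        return "non-hazard"              # light traffic branch

    label = classify_situation(flow=2000.0, speed=80.0)
    ```

    The appeal of the tree form is exactly this readability: each leaf is an if-then rule that traffic operators can act on directly.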

  18. Deriving global flood hazard maps of fluvial floods through a physical model cascade

    OpenAIRE

    Pappenberger, F.; Dutra, E.; Wetterhall, F.; Cloke, H.

    2012-01-01

    Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large scale flood preparedness. Such global hazard maps can be generated using large scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteoro...

  19. Fitting Additive Binomial Regression Models with the R Package blm

    Directory of Open Access Journals (Sweden)

    Stephanie Kovalchik

    2013-09-01

    Full Text Available The R package blm provides functions for fitting a family of additive regression models to binary data. The included models are the binomial linear model, in which all covariates have additive effects, and the linear-expit (lexpit) model, which allows some covariates to have additive effects and other covariates to have logistic effects. Additive binomial regression is a model of event probability, and the coefficients of linear terms estimate covariate-adjusted risk differences. Thus, in contrast to logistic regression, additive binomial regression puts the focus on absolute risk and risk differences. In this paper, we give an overview of the methodology we have developed to fit the binomial linear and lexpit models to binary outcomes from cohort and population-based case-control studies. We illustrate the blm package's methods for additive model estimation, diagnostics, and inference with risk association analyses of a bladder cancer nested case-control study in the NIH-AARP Diet and Health Study.

  20. Accelerated proportional degradation hazards-odds model in accelerated degradation test

    Institute of Scientific and Technical Information of China (English)

    Tingting Huang; Zhizhong Li

    2015-01-01

    An accelerated proportional degradation hazards-odds model is proposed. It is a non-parametric model and thus has path-free and distribution-free properties, avoiding the errors caused by faulty assumptions about degradation paths or the distribution of degradation measurements. It is established based on a link function which combines the degradation cumulative hazard rate function and the degradation odds function through a transformation parameter, and this makes the accelerated proportional degradation hazards model and the accelerated proportional degradation odds model special cases of it. Hypothesis tests are discussed, and the proposed model is applicable when some model assumptions are satisfied. This model is utilized to estimate the reliability of miniature bulbs under low stress levels based on the degradation data obtained under high stress levels, to validate the effectiveness of this model.
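
    The abstract does not reproduce the link function itself. A commonly used analogue in transformation models is the Box-Cox-type family G_rho(x) = log(1 + rho*x)/rho, which recovers a hazards-type model in the limit rho -> 0 and an odds-type model at rho = 1; the sketch below shows that family, not necessarily the paper's exact definition:

    ```python
    import math

    def transform_link(x, rho):
        """Box-Cox-type transformation family bridging hazards
        (rho -> 0) and odds (rho = 1) scales:
        G_rho(x) = log(1 + rho*x) / rho, with G_0(x) = x.
        Illustrative analogue of a hazards-odds link."""
        if rho == 0.0:
            return x                       # hazards-scale limit
        return math.log1p(rho * x) / rho   # rho = 1: odds scale

    x = 0.5
    ph = transform_link(x, 0.0)        # hazards case
    po = transform_link(x, 1.0)        # odds case, log(1.5)
    near_ph = transform_link(x, 1e-8)  # approaches the rho = 0 limit
    ```

    Treating rho as a free parameter, as the paper does with its transformation parameter, lets the data choose a model between the two special cases.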

  1. An Additive-Multiplicative Restricted Mean Residual Life Model

    DEFF Research Database (Denmark)

    Mansourvar, Zahra; Martinussen, Torben; Scheike, Thomas H.

    2016-01-01

    mean residual life model to study the association between the restricted mean residual life function and potential regression covariates in the presence of right censoring. This model extends the proportional mean residual life model using an additive model as its covariate dependent baseline....... For the suggested model, some covariate effects are allowed to be time-varying. To estimate the model parameters, martingale estimating equations are developed, and the large sample properties of the resulting estimators are established. In addition, to assess the adequacy of the model, we investigate a goodness...... of fit test that is asymptotically justified. The proposed methodology is evaluated via simulation studies and further applied to a kidney cancer data set collected from a clinical trial....

  2. Comprehensive European dietary exposure model (CEDEM) for food additives.

    Science.gov (United States)

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
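
    At its core, a deterministic additive-exposure screen of this kind sums per-food intake times additive concentration and normalises by body weight. A minimal sketch with invented food categories and numbers (not actual EFSA consumption data):

    ```python
    def additive_exposure(consumption_g_day, conc_mg_kg, body_weight_kg):
        """Deterministic dietary exposure to a food additive,
        in mg per kg body weight per day:
        sum over foods of (intake kg/day * concentration mg/kg) / bw.
        Food names and values are illustrative only."""
        total_mg = sum(
            (grams / 1000.0) * conc_mg_kg[food]   # g/day -> kg/day
            for food, grams in consumption_g_day.items()
        )
        return total_mg / body_weight_kg

    diet = {"beverages": 1000.0, "desserts": 200.0}    # g/day
    levels = {"beverages": 120.0, "desserts": 300.0}   # mg/kg food
    exposure = additive_exposure(diet, levels, body_weight_kg=60.0)
    ```

    In a CEDEM-style model the consumption figures would come from the EFSA summary statistics and the concentrations from maximum permitted or reported use levels.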

  3. Spatio-Temporal Risk Assessment Process Modeling for Urban Hazard Events in Sensor Web Environment

    Directory of Open Access Journals (Sweden)

    Wei Wang

    2016-11-01

    Full Text Available Immediate risk assessment and analysis are crucial in managing urban hazard events (UHEs. However, it is a challenge to develop an immediate risk assessment process (RAP that can integrate distributed sensors and data to determine the uncertain model parameters of facilities, environments, and populations. To solve this problem, this paper proposes a RAP modeling method within a unified spatio-temporal framework and forms a 10-tuple process information description structure based on a Meta-Object Facility (MOF. A RAP is designed as an abstract RAP chain that collects urban information resources and performs immediate risk assessments. In addition, we propose a prototype system known as Risk Assessment Process Management (RAPM to achieve the functions of RAP modeling, management, execution and visualization. An urban gas leakage event is simulated as an example in which individual risk and social risk are used to illustrate the applicability of the RAP modeling method based on the 10-tuple metadata framework. The experimental results show that the proposed RAP immediately assesses risk by the aggregation of urban sensors, data, and model resources. Moreover, an extension mechanism is introduced in the spatio-temporal RAP modeling method to assess risk and to provide decision-making support for different UHEs.

  4. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    Science.gov (United States)

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station includes various hazard factors, hazard assessment is needed and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed to be the weight of index. In contrast to the index weight of other methods, cloud weight is shown by cloud descriptors; hence, the randomness and fuzziness of cloud weight will make it effective to reflect the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD); the calculation algorithm of CCD is also worked out. By utilizing the CCD, the hazard assessment results are shown by some normal clouds, and the normal clouds are reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP based SPA, respectively. The comparison of assessment results illustrates that the CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA will make the assessment results more reasonable and scientific. PMID:28076440

  6. Efficient pan-European river flood hazard modelling through a combination of statistical and physical models

    Science.gov (United States)

    Paprotny, Dominik; Morales-Nápoles, Oswaldo; Jonkman, Sebastiaan N.

    2017-07-01

    Flood hazard is currently being researched on continental and global scales, using models of increasing complexity. In this paper we investigate a different, simplified approach, which combines statistical and physical models in place of conventional rainfall-run-off models to carry out flood mapping for Europe. A Bayesian-network-based model built in a previous study is employed to generate return-period flow rates in European rivers with a catchment area larger than 100 km2. The simulations are performed using a one-dimensional steady-state hydraulic model and the results are post-processed using Geographical Information System (GIS) software in order to derive flood zones. This approach is validated by comparison with Joint Research Centre's (JRC) pan-European map and five local flood studies from different countries. Overall, the two approaches show a similar performance in recreating flood zones of local maps. The simplified approach achieved a similar level of accuracy, while substantially reducing the computational time. The paper also presents the aggregated results on the flood hazard in Europe, including future projections. We find relatively small changes in flood hazard, i.e. an increase of flood zones area by 2-4 % by the end of the century compared to the historical scenario. However, when current flood protection standards are taken into account, the flood-prone area increases substantially in the future (28-38 % for a 100-year return period). This is because in many parts of Europe river discharge with the same return period is projected to increase in the future, thus making the protection standards insufficient.
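
    The study derives return-period flows from a Bayesian network rather than a conventional frequency fit, but the notion of a return level is the standard one. For background, a Gumbel return-level sketch with invented parameters:

    ```python
    import math

    def gumbel_return_level(loc, scale, T):
        """Discharge with return period T years under a Gumbel fit:
        x_T = loc - scale * ln(-ln(1 - 1/T)).
        Standard flood-frequency formula; parameters here are
        illustrative, not fitted to any European river."""
        if T <= 1:
            raise ValueError("return period must exceed 1 year")
        return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

    q2 = gumbel_return_level(loc=500.0, scale=150.0, T=2.0)      # m3/s
    q100 = gumbel_return_level(loc=500.0, scale=150.0, T=100.0)  # m3/s
    ```

    The projected increase in same-return-period discharge mentioned above is what makes fixed protection standards insufficient over time.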

  7. A Remote Sensing Based Approach For Modeling and Assessing Glacier Hazards

    Science.gov (United States)

    Huggel, C.; Kääb, A.; Salzmann, N.; Haeberli, W.; Paul, F.

    Glacier-related hazards such as ice avalanches and glacier lake outbursts can pose a significant threat to population and installations in high mountain regions. They are well documented in the Swiss Alps, and the high data density is used to build up systematic knowledge of glacier hazard locations and potentials. Experience from long research activities thereby forms an important basis for ongoing hazard monitoring and assessment. However, in the context of environmental changes in general, and the highly dynamic physical environment of glaciers in particular, historical experience may increasingly lose its significance with respect to the impact zones of hazardous processes. On the other hand, in large and remote high mountains such as the Himalayas, exact information on the location and potential of glacier hazards is often missing. Therefore, it is crucial to develop hazard monitoring and assessment concepts including area-wide applications. Remote sensing techniques offer a powerful tool to narrow current information gaps. The present contribution proposes an approach structured in (1) detection, (2) evaluation and (3) modeling of glacier hazards. Remote sensing data is used as the main input to (1). Algorithms taking advantage of multispectral, high-resolution data are applied for detecting glaciers and glacier lakes. Digital terrain modeling, and classification and fusion of panchromatic and multispectral satellite imagery, is performed in (2) to evaluate the hazard potential of possible hazard sources detected in (1). The locations found in (1) and (2) are used as input to (3). The models developed in (3) simulate the processes of lake outbursts and ice avalanches based on hydrological flow modeling and empirical values for average trajectory slopes. A probability-related function allows the model to indicate areas with lower and higher risk to be affected by catastrophic events. Application of the models for recent ice avalanches and lake outbursts show
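
    The empirical average trajectory slope mentioned above (often called the Fahrboeschung) gives a first-order runout estimate: the horizontal travel distance L follows from the vertical drop H and an average slope angle alpha via tan(alpha) = H / L. A minimal sketch; the numbers are illustrative, though overall slopes of roughly 17 degrees are reported in the ice-avalanche literature:

    ```python
    import math

    def runout_length(drop_height_m, alpha_deg):
        """Average-slope runout estimate: horizontal distance L
        such that tan(alpha) = H / L. Values illustrative only."""
        return drop_height_m / math.tan(math.radians(alpha_deg))

    # A 1000 m drop at a 45 degree average slope travels 1000 m;
    # at a shallower 17 degree slope it travels much farther.
    steep = runout_length(drop_height_m=1000.0, alpha_deg=45.0)
    shallow = runout_length(drop_height_m=1000.0, alpha_deg=17.0)
    ```

    Cells below the line defined by the average slope are the candidates for the model's higher-risk zones.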

  8. Additive Intensity Regression Models in Corporate Default Analysis

    DEFF Research Database (Denmark)

    Lando, David; Medhat, Mamdouh; Nielsen, Mads Stenbo

    2013-01-01

    We consider additive intensity (Aalen) models as an alternative to the multiplicative intensity (Cox) models for analyzing the default risk of a sample of rated, nonfinancial U.S. firms. The setting allows for estimating and testing the significance of time-varying effects. We use a variety of mo...
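
    In the Aalen model the intensity is lambda_i(t) = b0(t) + b1(t) * x_i, and the cumulative coefficients are built up by a least-squares step at each event time: dB(t) = (X'X)^(-1) X' dN(t), computed over the subjects at risk. A minimal sketch of a single step with one covariate, using invented data rather than the paper's firm sample:

    ```python
    def lsq_increment(X, dN):
        """One Aalen least-squares step dB = (X'X)^(-1) X' dN for a
        two-column design (intercept, covariate). Returns (dB0, dB1).
        Minimal sketch; real implementations handle many covariates
        and check that X'X is invertible at each event time."""
        s00 = s01 = s11 = b0 = b1 = 0.0
        for (one, x), dn in zip(X, dN):
            s00 += one * one
            s01 += one * x
            s11 += x * x
            b0 += one * dn
            b1 += x * dn
        det = s00 * s11 - s01 * s01
        return ((s11 * b0 - s01 * b1) / det,
                (s00 * b1 - s01 * b0) / det)

    # Four firms at risk; the defaulting firm has covariate x = 1.
    X = [(1.0, 0.0), (1.0, 0.0), (1.0, 1.0), (1.0, 1.0)]
    dN = [0.0, 0.0, 0.0, 1.0]
    dB0, dB1 = lsq_increment(X, dN)
    ```

    Summing these increments over event times yields the cumulative coefficient curves B(t), whose slopes are the time-varying effects the paper estimates and tests.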

  9. Snakes as hazards: modelling risk by chasing chimpanzees.

    Science.gov (United States)

    McGrew, William C

    2015-04-01

    Snakes are presumed to be hazards to primates, including humans, by the snake detection hypothesis (Isbell in J Hum Evol 51:1-35, 2006; Isbell, The fruit, the tree, and the serpent. Why we see so well, 2009). Quantitative, systematic data to test this idea are lacking for the behavioural ecology of living great apes and human foragers. An alternative proxy is snakes encountered by primatologists seeking, tracking, and observing wild chimpanzees. We present 4 years of such data from Mt. Assirik, Senegal. We encountered 14 species of snakes a total of 142 times. Almost two-thirds of encounters were with venomous snakes. Encounters occurred most often in forest and least often in grassland, and more often in the dry season. The hypothesis seems to be supported, if frequency of encounter reflects selective risk of morbidity or mortality.

  10. Traffic Incident Clearance Time and Arrival Time Prediction Based on Hazard Models

    Directory of Open Access Journals (Sweden)

    Yang beibei Ji

    2014-01-01

    Full Text Available Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data were obtained from the Queensland Department of Transport and Main Roads’ STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best-fitting distributions are derived for both clearance and arrival time for three types of incident: crash, stationary vehicle, and hazard. The results show that the Gamma, Log-logistic, and Weibull distributions are the best fits for crash, stationary vehicle, and hazard incidents, respectively. The significant impact factors are given for crash clearance time and arrival time. The quantitative influences for crash and hazard incidents are presented for both clearance and arrival. The model accuracy is analyzed at the end.
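
    Selecting a best-fitting duration distribution by maximized log-likelihood, as done in this record, can be sketched as follows. This is a generic illustration (not the SIMS analysis); the coarse shape grid and the closed-form profile of the scale are simplifications.

```python
import math

def weibull_loglik(data, shape, scale):
    # Log-likelihood of an i.i.d. Weibull(shape, scale) sample.
    return sum(math.log(shape / scale) + (shape - 1) * math.log(x / scale)
               - (x / scale) ** shape for x in data)

def fit_weibull(data):
    # Grid search over the shape k; for fixed k the MLE of the scale
    # has the closed form (mean of x**k) ** (1/k).
    best = None
    for i in range(5, 41):
        k = i / 10.0
        scale = (sum(x ** k for x in data) / len(data)) ** (1.0 / k)
        ll = weibull_loglik(data, k, scale)
        if best is None or ll > best[0]:
            best = (ll, k, scale)
    return best  # (log-likelihood, shape, scale)

# Deterministic pseudo-sample: quantiles of a Weibull(shape=2, scale=2)
sample = [2.0 * (-math.log(1 - (i + 0.5) / 200)) ** 0.5 for i in range(200)]
ll, shape, scale = fit_weibull(sample)
```

    The same maximized log-likelihood can be compared across candidate families (Gamma, log-logistic, Weibull) to pick the best fit per incident type.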

  11. Flexible parametric modelling of cause-specific hazards to estimate cumulative incidence functions

    Science.gov (United States)

    2013-01-01

    Background Competing risks are a common occurrence in survival analysis. They arise when a patient is at risk of more than one mutually exclusive event, such as death from different causes, and the occurrence of one of these may prevent any other event from ever happening. Methods There are two main approaches to modelling competing risks: the first is to model the cause-specific hazards and transform these to the cumulative incidence function; the second is to model directly on a transformation of the cumulative incidence function. We focus on the first approach in this paper. This paper advocates the use of the flexible parametric survival model in this competing risk framework. Results An illustrative example on the survival of breast cancer patients has shown that the flexible parametric proportional hazards model has almost perfect agreement with the Cox proportional hazards model. However, the large epidemiological data set used here shows clear evidence of non-proportional hazards. The flexible parametric model is able to adequately account for these through the incorporation of time-dependent effects. Conclusion A key advantage of using this approach is that smooth estimates of both the cause-specific hazard rates and the cumulative incidence functions can be obtained. It is also relatively easy to incorporate time-dependent effects which are commonly seen in epidemiological studies. PMID:23384310
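
    The first approach described above, transforming cause-specific hazards into a cumulative incidence function, can be sketched numerically. This is a generic discrete-time (Euler) approximation, not the flexible parametric model of the paper.

```python
import math

def cumulative_incidence(h1, h2, dt):
    # CIF_1(t) = integral of h1(u) * S(u) du, where S(u) is overall
    # survival from all causes; computed on a uniform time grid.
    S = 1.0
    cif = []
    total = 0.0
    for a, b in zip(h1, h2):
        total += a * S * dt
        cif.append(total)
        S *= math.exp(-(a + b) * dt)
    return cif

# Constant hazards check: analytically
# CIF_1(t) = l1/(l1+l2) * (1 - exp(-(l1+l2) * t)).
n, dt = 10000, 0.001
cif = cumulative_incidence([0.5] * n, [0.5] * n, dt)
```

    With two equal constant hazards, each cause accounts for half of the failures, so CIF_1 approaches 0.5 as t grows.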

  12. Investigating applicability of Haeri-Samiee landslide hazard zonation model in Moalemkalayeh watershed, Iran

    Science.gov (United States)

    Beheshtirad, M.; Noormandipour, N.

    2009-04-01

    Identification of regions with potential for landslide occurrence is one of the basic measures in natural resources management that decreases the damage caused by these phenomena. For this purpose, different landslide hazard zonation models have been proposed based on environmental conditions and goals. In this research, the applicability of the Haeri-Samiee landslide hazard zonation model was investigated in the Moalemkalayeh watershed. Existing landslides were identified and an inventory map was prepared as ground evidence. A topographical map (1:50,000) was divided into a 514-cell network as the working unit. A landslide hazard zonation map was produced based on the H.S. model. We investigated the level of similarity between the potential hazard classes and figures of the model and the ground evidence (landslide inventory map) in the SPSS and Minitab environments. Our results showed a significant correlation at the 0.01 level between potential hazard classes and figures and the number of landslides, the area of landslides, and the product of the number and area of landslides in the H.S. model. Therefore, the H.S. model is a suitable model for the Moalemkalayeh watershed.

  13. A model standardized risk assessment protocol for use with hazardous waste sites.

    OpenAIRE

    Marsh, G M; Day, R.

    1991-01-01

    This paper presents a model standardized risk assessment protocol (SRAP) for use with hazardous waste sites. The proposed SRAP focuses on the degree and patterns of evidence that exist for a significant risk to human populations from exposure to a hazardous waste site. The SRAP was designed with at least four specific goals in mind: to organize the available scientific data on a specific site and to highlight important gaps in this knowledge; to facilitate rational, cost-effective decision ma...

  14. Implications of different digital elevation models and preprocessing techniques to delineate debris flow inundation hazard zones in El Salvador

    Science.gov (United States)

    Anderson, E. R.; Griffin, R.; Irwin, D.

    2013-12-01

    ... Optimized pit-filling techniques use both cut and fill operations to minimize modifications of the original DEM. Satellite image interpretation and field surveying provide the baseline against which to test the accuracy of each model simulation. By outlining areas that could potentially be inundated by debris flows, these efforts can be used to more accurately identify the places and assets immediately exposed to landslide hazards. We contextualize the results of the previous and ongoing efforts in terms of how they may be incorporated into decision support systems. We also discuss if and how these analyses would have provided additional knowledge in the past, and identify specific recommendations as to how they could contribute to a more robust decision support system in the future.

  15. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    Science.gov (United States)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is exclusively based on active faults and geodynamic finite element input models, whereas seismic catalogues were used only in a posterior comparison. We applied the developed model in the External Dinarides, a slow-deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters together with estimations of its slip rate. By default in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates and final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters and constructing corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid considering 648 branches of the logic tree; the mean value at the 10% probability of exceedance in 50 years hazard level was computed, and the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which of the input parameters influence the final hazard results, and to what degree. The results of this comparison show that the deformation model and ...
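
    Summarizing a hazard logic tree by its weighted mean and outer percentiles, as described above, reduces to a weighted-quantile computation. The sketch below is a generic helper with made-up branch values, not the study's PSHA code.

```python
def weighted_percentile(values, weights, q):
    # Smallest value whose cumulative (normalized) weight reaches q.
    order = sorted(zip(values, weights))
    total = sum(w for _, w in order)
    acc = 0.0
    for v, w in order:
        acc += w
        if acc / total >= q:
            return v
    return order[-1][0]

def logic_tree_summary(branch_pga, weights):
    # Weighted mean hazard plus the 5th and 95th percentile branches.
    mean = sum(v * w for v, w in zip(branch_pga, weights)) / sum(weights)
    return (weighted_percentile(branch_pga, weights, 0.05),
            mean,
            weighted_percentile(branch_pga, weights, 0.95))
```

    In a real PSHA the branch weights come from the logic-tree design; equal weights are used below only for the check.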

  16. Global Volcano Model: progress towards an international co-ordinated network for volcanic hazard and risk

    Science.gov (United States)

    Loughlin, Susan

    2013-04-01

    GVM is a growing international collaboration that aims to create a sustainable, accessible information platform on volcanic hazard and risk. GVM is a network that aims to co-ordinate and integrate the efforts of the international volcanology community. Major international initiatives and partners such as the Smithsonian Institution - Global Volcanism Program, State University of New York at Buffalo - VHub, Earth Observatory of Singapore - WOVOdat and many others underpin GVM. Activities currently include: design and development of databases of volcano data, volcanic hazards, vulnerability and exposure with internationally agreed metadata standards; establishment of methodologies for analysis of the data (e.g. hazard and exposure indices) to inform risk assessment; development of complementary hazards models and create relevant hazards and risk assessment tools. GVM acts through establishing task forces to deliver explicit deliverables in finite periods of time. GVM has a task force to deliver a global assessment of volcanic risk for UN ISDR, a task force for indices, and a task force for volcano deformation from satellite observations. GVM is organising a Volcano Best Practices workshop in 2013. A recent product of GVM is a global database on large magnitude explosive eruptions. There is ongoing work to develop databases on debris avalanches, lava dome hazards and ash hazard. GVM aims to develop the capability to anticipate future volcanism and its consequences.

  17. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    Science.gov (United States)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently on-going initiatives like the creation of a suite of tools for the creation of PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.

  18. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    Science.gov (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    ... location data. These results show a high concordance between the landslide inventory and the estimated high-susceptibility zone, with an agreement of 95.1% for the ANN model and 89.4% for the LR model. In addition, we make a comparative analysis of both techniques using the Receiver Operating Characteristic (ROC) curve, a graphical plot of sensitivity vs. (1 - specificity) for a binary classifier as a function of its discrimination threshold, and calculate the Area Under the ROC curve (AUROC) for each model. Finally, the previous models are used for developing a new probabilistic landslide hazard map for future events. It is obtained by combining the expected triggering factor (calculated earthquake ground motion) for a return period of 475 years with the susceptibility map.
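
    The AUROC used to compare the LR and ANN models reduces to a rank statistic: the probability that a random positive case outscores a random negative one. A minimal generic sketch (not the authors' implementation):

```python
def auroc(labels, scores):
    # Probability that a randomly chosen positive (label 1) receives a
    # higher score than a randomly chosen negative (label 0); ties count 0.5.
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    An AUROC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation of landslide and non-landslide cells.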

  19. Using Set Model for Learning Addition of Integers

    Directory of Open Access Journals (Sweden)

    Umi Puji Lestari

    2015-07-01

    Full Text Available This study aims to investigate how a set model can help fourth-grade students' understanding of addition of integers. The study was carried out with 23 students and a teacher of class IVC at SD Iba Palembang in January 2015. This study is design research that also promotes PMRI as the underlying design context and activity. Results showed that the use of set models, packaged in activities of recording financial transactions with two-color chips and a card game, can help students to understand the concept of the zero pair, addition with same-colored chips, and the cancellation strategy.
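
    The zero-pair and cancellation reasoning in the set model can be mirrored in a few lines of code. This is a hypothetical helper for illustration, not material from the study.

```python
def add_with_chips(a, b):
    # Represent each integer as chips: +1 chips for positive amounts,
    # -1 chips for negative amounts.
    pos = max(a, 0) + max(b, 0)
    neg = max(-a, 0) + max(-b, 0)
    # Cancellation: each zero pair (one +1 chip, one -1 chip) is removed.
    pairs = min(pos, neg)
    return (pos - pairs) - (neg - pairs)
```

    After cancellation, only chips of one color remain, and their count (with sign) is the sum.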

  20. Process chain modeling and selection in an additive manufacturing context

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Stolfi, Alessandro; Mischkot, Michael

    2016-01-01

    This paper introduces a new two-dimensional approach to modeling manufacturing process chains. This approach is used to consider the role of additive manufacturing technologies in process chains for a part with micro scale features and no internal geometry. It is shown that additive manufacturing can compete with traditional process chains for small production runs. Combining both types of technology added cost but no benefit in this case. The new process chain model can be used to explain the results and support process selection, but process chain prototyping is still important for rapidly ...

  1. Discriminative accuracy of genomic profiling comparing multiplicative and additive risk models.

    Science.gov (United States)

    Moonesinghe, Ramal; Khoury, Muin J; Liu, Tiebin; Janssens, A Cecile J W

    2011-02-01

    Genetic prediction of common diseases is based on testing multiple genetic variants with weak effect sizes. Standard logistic regression and Cox proportional hazards models that assess the combined effect of multiple variants on disease risk assume multiplicative joint effects of the variants, but this assumption may not be correct. The risk model chosen may affect the predictive accuracy of genomic profiling. We investigated the discriminative accuracy of genomic profiling by comparing additive and multiplicative risk models. We examined genomic profiles of 40 variants with genotype frequencies varying from 0.1 to 0.4 and relative risks varying from 1.1 to 1.5 in separate scenarios assuming a disease risk of 10%. The discriminative accuracy was evaluated by the area under the receiver operating characteristic curve. Predicted risks were more extreme at the lower and higher ends for the multiplicative risk model compared with the additive model. The discriminative accuracy was consistently higher for multiplicative risk models than for additive risk models. The differences in discriminative accuracy were negligible when the effect sizes were small, but larger when risk genotypes were common or when they had stronger effects. Unraveling the exact mode of biological interaction is important when effect sizes of genetic variants are at least moderate, to prevent the incorrect estimation of risks.
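
    The two joint-effect assumptions compared in this record can be illustrated directly. A toy sketch with made-up relative risks, not the study's simulation:

```python
def combined_rr(carried, rr, mode):
    # carried: 0/1 indicators for each risk genotype; rr: per-variant
    # relative risks. Combine them under either joint-effect assumption.
    if mode == "multiplicative":
        out = 1.0
        for c, r in zip(carried, rr):
            if c:
                out *= r
        return out
    if mode == "additive":
        # Excess relative risks (r - 1) add on the risk scale.
        return 1.0 + sum(r - 1.0 for c, r in zip(carried, rr) if c)
    raise ValueError(mode)
```

    For a carrier of five variants with relative risk 1.5 each, the multiplicative model gives 1.5**5 ≈ 7.6 while the additive model gives 3.5, which is why predicted risks are more extreme in the tails under the multiplicative model.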

  2. Electroacoustics modeling of piezoelectric welders for ultrasonic additive manufacturing processes

    Science.gov (United States)

    Hehr, Adam; Dapino, Marcelo J.

    2016-04-01

    Ultrasonic additive manufacturing (UAM) is a recent 3D metal printing technology which utilizes ultrasonic vibrations from high power piezoelectric transducers to additively weld similar and dissimilar metal foils. CNC machining is used intermittently with welding to create internal channels, to embed temperature-sensitive components, sensors, and materials, and to net-shape parts. The structural dynamics of the welder and work piece influence the performance of the welder and part quality. To understand the impact of structural dynamics on UAM, a linear time-invariant model is used to relate the system inputs of shear force and electric current to the system outputs of welder velocity and voltage. Frequency response measurements are combined with in-situ operating measurements of the welder to identify model parameters and to verify model assumptions. The proposed LTI model can enhance process consistency and performance and guide the development of improved quality monitoring and control strategies.
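
    A linear time-invariant input-output relation of the kind described can be sketched with a first-order scalar example. This is a generic illustration of LTI simulation, not the welder model itself.

```python
def simulate_lti(a, b, inputs, dt):
    # Forward-Euler simulation of the scalar LTI system x' = a*x + b*u(t),
    # returning the state trajectory for a sampled input sequence.
    x = 0.0
    trajectory = []
    for u in inputs:
        x += dt * (a * x + b * u)
        trajectory.append(x)
    return trajectory

# Step response: with a = -1, b = 1 the state settles at the input level.
step = simulate_lti(-1.0, 1.0, [1.0] * 1000, 0.01)
```

    Identifying a and b from measured frequency responses is the system-identification step the record refers to; here they are simply assumed.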

  3. AN INSTRUCTURAL SYSTEM MODEL OF COASTAL MANAGEMENT TO THE WATER RELATED HAZARDS IN CHINA

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Coastal lowlands have large areas of hazard impact and relatively low capacity for preventing water-related hazards, as indicated by widespread flood hazards and high percentages of land with high flood vulnerability. Increasing population pressure and the shift of resource exploitation from land to sea will force more and more coastal lowlands to be developed in the future, further enhancing the danger of water-related hazards. In this paper, the coastal lowlands in northern Jiangsu Province, China, were selected as a case study. The Interpretation Structural Model (ISM) was employed to analyze the direct and indirect impacts among the elements within the system and, thereby, to identify the causal elements, middle linkages, their expressions, and relations.

  4. Three multimedia models used at hazardous and radioactive waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P.D.; Pardi, R.; Fthenakis, V.M.; Holtzman, S.; Sun, L.C. [Brookhaven National Lab., Upton, NY (United States); Rambaugh, J.O.; Potter, S. [Geraghty and Miller, Inc., Plainview, NY (United States)

    1996-02-01

    Multimedia models are used commonly in the initial phases of the remediation process where technical interest is focused on determining the relative importance of various exposure pathways. This report provides an approach for evaluating and critically reviewing the capabilities of multimedia models. This study focused on three specific models: MEPAS Version 3.0, MMSOILS Version 2.2, and PRESTO-EPA-CPG Version 2.0. These models evaluate the transport and fate of contaminants from source to receptor through more than a single pathway. The presence of radioactive and mixed wastes at a site poses special problems. Hence, in this report, restrictions associated with the selection and application of multimedia models for sites contaminated with radioactive and mixed wastes are highlighted. This report begins with a brief introduction to the concept of multimedia modeling, followed by an overview of the three models. The remaining chapters present more technical discussions of the issues associated with each compartment and their direct application to the specific models. In these analyses, the following components are discussed: source term; air transport; ground water transport; overland flow, runoff, and surface water transport; food chain modeling; exposure assessment; dosimetry/risk assessment; uncertainty; default parameters. The report concludes with a description of evolving updates to the models; these descriptions were provided by the model developers.

  5. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    Science.gov (United States)

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    ... the outburst of landslide-dammed lakes) remains a challenge: • The knowledge about the onset of the process is often limited (bathymetry of the lakes, subsurface water, properties of the dam (content of ice), type of dam breach, understanding of process chains and interactions). • The size of glacial lakes may change rapidly but continuously, and many lakes break out within a short time after their development. Continuous monitoring is therefore required to keep updated on the existing hazards. • Also the outburst of small glacial lakes may lead to significant debris floods or even debris flows if there is plenty of erodible material available. • The available modeling software packages are of limited suitability for lake outburst floods: e.g. software developed by the hydrological community is specialized to simulate (debris) floods with input hydrographs on moderately steep flow channels and with lower sediment loads. In contrast, programs for rapid mass movements are better suited to steeper slopes and a sudden onset of the movement. The typical characteristics of GLOFs lie in between and vary for different channel sections. In summary, the major bottlenecks remain in deriving realistic or worst-case scenarios and predicting their magnitude and area of impact. This mainly concerns uncertainties in the dam break process, involved volumes, erosion rates, changing rheologies, and the limited capabilities of available software packages to simulate process interactions and transformations such as the development of a hyperconcentrated flow into a debris flow. In addition, many areas prone to lake outburst floods are located in developing countries with a limited scope of the threatened population for decision-making and limited resources for mitigation.

  6. Modelling tropical cyclone hazards under climate change scenario using geospatial techniques

    Science.gov (United States)

    Hoque, M. A.; Phinn, S.; Roelfsema, C.; Childs, I.

    2016-11-01

    Tropical cyclones are a common and devastating natural disaster in many coastal areas of the world. As the intensity and frequency of cyclones will increase under the most likely future climate change scenarios, appropriate approaches at local scales (1-5 km) are essential for producing sufficiently detailed hazard models. These models are used to develop mitigation plans and strategies for reducing the impacts of cyclones. This study developed and tested a hazard modelling approach for cyclone impacts in Sarankhola upazila, a 151 km2 local government area in coastal Bangladesh. The study integrated remote sensing, spatial analysis and field data to model cyclone-generated hazards under a climate change scenario at local scales. A model integrating historical cyclone data and a Digital Elevation Model (DEM) was used to generate the cyclone hazard maps for different cyclone return periods. Frequency analysis was carried out using historical cyclone data (1960-2015) to calculate the storm surge heights for 5-, 10-, 20-, 50- and 100-year cyclone return periods. A local sea level rise scenario of 0.34 m for the year 2050 was simulated with the 20- and 50-year return periods. Our results showed that cyclone-affected areas increased with the return period. Around 63% of the study area was located in the moderate to very high hazard zones for the 50-year return period, while it was 70% for the 100-year return period. The climate change scenario increased the cyclone impact area by 6-10% in every return period. Our findings indicate this approach has potential for modelling cyclone hazards and for developing mitigation plans and strategies to reduce the future impacts of cyclones.
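
    Return-period levels from a series of annual maxima, as in the frequency analysis above, are often obtained from a Gumbel (extreme-value) fit. The sketch below uses a method-of-moments fit and made-up surge heights; it is a generic illustration, not the study's analysis.

```python
import math
import statistics

def gumbel_return_level(annual_maxima, T):
    # Method-of-moments Gumbel fit, then the level exceeded on average
    # once every T years: x_T = mu - beta * ln(-ln(1 - 1/T)).
    mean = statistics.mean(annual_maxima)
    sd = statistics.stdev(annual_maxima)
    beta = sd * math.sqrt(6) / math.pi
    mu = mean - 0.5772 * beta  # 0.5772 ~ Euler-Mascheroni constant
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual maximum surge heights (m), for illustration only
surge = [2.1, 2.8, 1.9, 3.4, 2.5, 4.0, 2.2, 3.1, 2.7, 3.6]
```

    By construction the return level grows with the return period, matching the study's finding that hazard zones expand from the 50-year to the 100-year scenario.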

  7. DEM resolution effects on shallow landslide hazard and soil redistribution modelling

    NARCIS (Netherlands)

    Claessens, L.F.G.; Heuvelink, G.B.M.; Schoorl, J.M.; Veldkamp, A.

    2005-01-01

    In this paper we analyse the effects of digital elevation model (DEM) resolution on the results of a model that simulates spatially explicit relative shallow landslide hazard and soil redistribution patterns and quantities. We analyse distributions of slope, specific catchment area and relative haza

  8. Measures to assess the prognostic ability of the stratified Cox proportional hazards model

    DEFF Research Database (Denmark)

    (Tybjaerg-Hansen, A.) The Fibrinogen Studies Collaboration.The Copenhagen City Heart Study; Tybjærg-Hansen, Anne

    2009-01-01

    Many measures have been proposed to summarize the prognostic ability of the Cox proportional hazards (CPH) survival model, although none is universally accepted for general use. By contrast, little work has been done to summarize the prognostic ability of the stratified CPH model; such measures w...

  9. Validation and evaluation of predicitive models in hazard assessment and risk management

    NARCIS (Netherlands)

    Beguería, S.

    2007-01-01

    The paper deals with the validation and evaluation of mathematical models in natural hazard analysis, with a special focus on establishing their predictive power. Although most of the tools and statistics available are common to general classification models, some peculiarities arise in the case of h

  10. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate the significant adverse effects of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; (2) directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited; (3) identifying the key processes and chemical and/or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and (4) predicting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model, the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application.

  11. Modelling long term survival with non-proportional hazards

    NARCIS (Netherlands)

    Perperoglou, Aristidis

    2006-01-01

    In this work I consider models for survival data when the assumption of proportionality does not hold. The thesis consists of an Introduction, five papers, a Discussion and an Appendix. The Introduction presents technical information about the Cox model and introduces the ideas behind the extensions

  12. Modeling and Testing Landslide Hazard Using Decision Tree

    Directory of Open Access Journals (Sweden)

    Mutasem Sh. Alkhasawneh

    2014-01-01

    Full Text Available This paper proposes a decision tree model for specifying the importance of 21 factors causing the landslides in a wide area of Penang Island, Malaysia. These factors are vegetation cover, distance from the fault line, slope angle, cross curvature, slope aspect, distance from road, geology, diagonal length, longitude curvature, rugosity, plan curvature, elevation, rain perception, soil texture, surface area, distance from drainage, roughness, land cover, general curvature, tangent curvature, and profile curvature. Decision tree models are used for prediction, classification, and factor importance and are usually represented by an easy-to-interpret tree-like structure. Four models were created using the Chi-square Automatic Interaction Detector (CHAID), Exhaustive CHAID, Classification and Regression Tree (CRT), and Quick-Unbiased-Efficient Statistical Tree (QUEST) methods. Twenty-one factors were extracted using digital elevation models (DEMs) and then used as input variables for the models. A data set of 137570 samples was selected for each variable in the analysis, where 68786 samples represent landslides and 68786 samples represent no landslides. 10-fold cross-validation was employed for testing the models. The highest accuracy was achieved using the Exhaustive CHAID model (82.0%) compared to the CHAID (81.9%), CRT (75.6%), and QUEST (74.0%) models. Across the four models, five factors were identified as the most important: slope angle, distance from drainage, surface area, slope aspect, and cross curvature.
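
    A core step shared by tree-growing algorithms is scoring candidate splits on a factor. The sketch below uses a Gini-impurity split search (as in CRT-style trees); it is a generic illustration, not the CHAID or QUEST criteria used in the paper, which differ.

```python
def gini(labels):
    # Gini impurity of a label list: 1 - sum of squared class proportions.
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    # Choose the threshold on a single factor that minimizes the
    # weighted Gini impurity of the two resulting child nodes.
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (t, score)
    return best[0]
```

    Factor importance then follows from how much impurity each factor's splits remove across the tree.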

  13. Modeling contractor and company employee behavior in high hazard operation

    NARCIS (Netherlands)

    Lin, P.H.; Hanea, D.; Ale, B.J.M.

    2013-01-01

    The recent blow-out and subsequent environmental disaster in the Gulf of Mexico have highlighted a number of serious problems in scientific thinking about safety. Risk models have generally concentrated on technical failures, which are easier to model and for which there are more concrete data. Howe

  16. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  17. A REVIEW OF MULTI-HAZARD RISK ASSESSMENT (MHRA) USING 4D DYNAMIC MODELS

    Directory of Open Access Journals (Sweden)

    T. Bibi

    2015-10-01

    Full Text Available This paper reviews 4D dynamic models for multi-hazard natural risk assessment. It is important to review the characteristics of the different dynamic models and to choose the most suitable model for a certain application. The characteristics of the different 4D dynamic models are based on several main aspects (e.g. space, time, event or phenomenon). The most suitable 4D dynamic model depends on the type of application it is used for. There is no single 4D dynamic model suitable for all types of application. Therefore, it is very important to define the requirements of the 4D dynamic model. The main focus of this paper is spatio-temporal modelling for multiple hazards.

  18. Validation of transport models using additive flux minimization technique

    Energy Technology Data Exchange (ETDEWEB)

    Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States); Groebner, R. J. [General Atomics, San Diego, California 92121 (United States); Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States); Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)

    2013-10-15

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.
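The core of the technique — varying an additional effective diffusivity until the computed profile matches the experimental one — can be illustrated with a toy steady-state slab model. All numbers are hypothetical, and scipy's bounded scalar minimizer stands in for the DAKOTA optimization used in the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.linspace(0.0, 1.0, 51)
S, D_model, D_true = 1.0, 0.5, 0.8   # source strength, model and "true" diffusivities

def profile(D):
    # Steady slab solution of -D n'' = S with n(0) = n(1) = 0
    return S * x * (1 - x) / (2 * D)

n_exp = profile(D_true)              # stand-in "experimental" profile

def mismatch(D_add):
    # Objective: squared mismatch between predicted and experimental profiles
    return float(np.sum((profile(D_model + D_add) - n_exp) ** 2))

res = minimize_scalar(mismatch, bounds=(0.0, 2.0), method="bounded")
print(f"recovered additional diffusivity: {res.x:.3f}")
```

The optimizer recovers the diffusivity deficit (here 0.3), which is the quantity of interest in the validation study: how much transport the tested model fails to account for.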

  19. Just Another Gibbs Additive Modeler: Interfacing JAGS and mgcv

    Directory of Open Access Journals (Sweden)

    Simon N. Wood

    2016-12-01

    Full Text Available The BUGS language offers a very flexible way of specifying complex statistical models for the purposes of Gibbs sampling, while its JAGS variant offers very convenient R integration via the rjags package. However, including smoothers in JAGS models can involve some quite tedious coding, especially for multivariate or adaptive smoothers. Further, if an additive smooth structure is required then some care is needed in order to centre smooths appropriately and to find appropriate starting values. The R package mgcv implements a wide range of smoothers, all in a manner appropriate for inclusion in JAGS code, and automates centring and other smooth setup tasks. The purpose of this note is to describe an interface between mgcv and JAGS, based around an R function, jagam, which takes a generalized additive model (GAM) as specified in mgcv and automatically generates the JAGS model code and data required for inference about the model via Gibbs sampling. Although the auto-generated JAGS code can be run as is, the expectation is that the user will wish to modify it in order to add complex stochastic model components readily specified in JAGS. A simple interface is also provided for visualisation and further inference about the estimated smooth components using standard mgcv functionality. The methods described here will be unnecessarily inefficient if all that is required is fully Bayesian inference about a standard GAM, rather than the full flexibility of JAGS; in that case the BayesX package would be more efficient.
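The centring of smooths mentioned above amounts to absorbing a sum-to-zero constraint into each smooth's basis, so that the smooth is identifiable alongside the model intercept. A minimal sketch of that reparameterization, on a toy polynomial basis (mgcv absorbs such constraints in a similar null-space fashion, though its implementation details differ):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 200))
X = np.column_stack([x ** k for k in range(1, 5)])   # toy basis for one smooth

# Sum-to-zero (centring) constraint: find Z such that colmeans(X @ Z) = 0,
# i.e. the reparameterized smooth averages to zero over the observed data
c = X.mean(axis=0, keepdims=True)                    # 1 x p constraint row
q, _ = np.linalg.qr(c.T, mode="complete")            # full orthogonal basis of R^p
Z = q[:, 1:]                                         # null space of the constraint
Xc = X @ Z                                           # centred basis, one fewer column

print("max |column mean| after centring:", float(np.abs(Xc.mean(axis=0)).max()))
```

Every column of the new basis has zero mean over the data, so the smooth no longer competes with the intercept — the identifiability issue jagam handles automatically before exporting the model to JAGS.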

  20. Modeling the influence of limestone addition on cement hydration

    Directory of Open Access Journals (Sweden)

    Ashraf Ragab Mohamed

    2015-03-01

    Full Text Available This paper addresses the influence of using Portland limestone cement “PLC” on cement hydration by characterizing its microstructure development. The European Standard EN 197-1:2011 and the Egyptian specification ESS 4756-1/2009 permit cement to contain up to 20% ground limestone. Computational tools assist in better understanding the influence of limestone additions on cement hydration and microstructure development, facilitating the acceptance of these more economical and ecological materials. The μic model has been developed to enable the modeling of the microstructural evolution of cementitious materials. In this research, the μic model is used to simulate both the influence of limestone as a fine filler, providing additional surfaces for the nucleation and growth of hydration products, and its relatively slow reaction with hydrating cement to form the monocarboaluminate (AFmc) phase, similar to the monosulfoaluminate (AFm) phase formed in ordinary Portland cement. The model results reveal that limestone cement has an accelerated cement hydration rate; previous experimental results and the computer model “cemhyd3d” are used to validate this model.

  1. Deterministic slope failure hazard assessment in a model catchment and its replication in neighbourhood terrain

    Directory of Open Access Journals (Sweden)

    Kiran Prasad Acharya

    2016-01-01

    Full Text Available In this work, we prepare and replicate a deterministic slope failure hazard model in small-scale catchments of the tertiary sedimentary terrain of Niihama city in western Japan. It is generally difficult to replicate a deterministic model from one catchment to another due to the lack of exactly similar geo-mechanical and hydrological parameters. To overcome this problem, discriminant function modelling was done with the deterministic slope failure hazard model and the DEM-based causal factors of slope failure, which yielded an empirical parametric relationship or discriminant function equation. This parametric relationship was used to predict the slope failure hazard index in a total of 40 target catchments in the study area. From ROC plots, prediction rates of 0.719–0.814 and 0.704–0.805 were obtained with inventories of September and October slope failures, respectively; that is, September slope failures were predicted better than October slope failures by approximately 1%. The results show that prediction of the slope failure hazard index is possible even at a small catchment scale in similar geophysical settings. Moreover, replication of the deterministic model through discriminant function modelling was found to be successful in predicting typhoon-rainfall-induced slope failures with moderate to good accuracy, without any use of geo-mechanical and hydrological parameters.
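The replication workflow — fit a discriminant function on one catchment's causal factors, score independent target catchments, and evaluate with ROC — can be sketched on synthetic data. Fisher's linear discriminant and a pairwise AUC computation stand in for the study's discriminant function modelling; all values are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic DEM-derived causal factors (e.g. slope, curvature) in a model catchment
n = 400
X0 = rng.normal(0.0, 1.0, (n, 2))        # stable cells
X1 = rng.normal(1.0, 1.0, (n, 2))        # failed cells

# Fisher linear discriminant: w = Sw^{-1} (mu1 - mu0)
Sw = np.cov(X0.T) + np.cov(X1.T)
w = np.linalg.solve(Sw, X1.mean(0) - X0.mean(0))

# "Replicate" on an independent target catchment and score with ROC AUC
T0 = rng.normal(0.0, 1.0, (200, 2))      # stable cells, target catchment
T1 = rng.normal(1.0, 1.0, (200, 2))      # failed cells, target catchment
scores0, scores1 = T0 @ w, T1 @ w
auc = (scores1[:, None] > scores0[None, :]).mean()   # P(failed scores higher)
print(f"prediction rate (AUC): {auc:.3f}")
```

The AUC here plays the role of the "prediction rate" reported from the ROC plots in the abstract.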

  2. Predicting the Probability of Lightning Occurrence with Generalized Additive Models

    Science.gov (United States)

    Fabsic, Peter; Mayr, Georg; Simon, Thorsten; Zeileis, Achim

    2017-04-01

    This study investigates the predictability of lightning in complex terrain. The main objective is to estimate the probability of lightning occurrence in the Alpine region during summertime afternoons (12–18 UTC) at a spatial resolution of 64 × 64 km². Lightning observations are obtained from the ALDIS lightning detection network. The probability of lightning occurrence is estimated using generalized additive models (GAM). GAMs provide a flexible modelling framework to estimate the relationship between covariates and the observations. The covariates, besides spatial and temporal effects, include numerous meteorological fields from the ECMWF ensemble system. The optimal model is chosen based on a forward selection procedure with out-of-sample mean squared error as a performance criterion. Our investigation shows that convective precipitation and mid-layer stability are the most influential meteorological predictors. Both exhibit intuitive, non-linear trends: higher values of convective precipitation indicate higher probability of lightning, and large values of the mid-layer stability measure imply low lightning potential. The performance of the model was evaluated against a climatology model containing both spatial and temporal effects. Taking the climatology model as a reference forecast, our model attains a Brier Skill Score of approximately 46%. The model's performance can be further enhanced by incorporating the information about lightning activity from the previous time step, which yields a Brier Skill Score of 48%. These scores show that the method is able to extract valuable information from the ensemble to produce reliable spatial forecasts of the lightning potential in the Alps.
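The Brier Skill Score used above compares the model's Brier score against that of a climatology reference. A minimal sketch on synthetic data (the covariate, probabilities, and sample size are all invented; a perfectly calibrated forecast stands in for the fitted GAM):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.uniform(-1.0, 1.0, n)                 # stand-in convective-precipitation covariate
p_true = 1.0 / (1.0 + np.exp(-3.0 * x))       # "true" lightning probability
obs = (rng.uniform(size=n) < p_true).astype(float)

bs_model = np.mean((p_true - obs) ** 2)       # Brier score of the (here perfect) forecast
bs_clim = np.mean((obs.mean() - obs) ** 2)    # Brier score of the climatology reference
bss = 1.0 - bs_model / bs_clim                # Brier Skill Score
print(f"Brier Skill Score: {bss:.2f}")
```

A positive score means the forecast beats climatology; 46% in the paper means roughly half the climatology's Brier score is removed.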

  3. Efficient pan-European flood hazard modelling through a combination of statistical and physical models

    Science.gov (United States)

    Paprotny, Dominik; Morales Nápoles, Oswaldo

    2016-04-01

    Low-resolution hydrological models are often applied to calculate extreme river discharges and delineate flood zones on continental and global scales. Still, the computational expense is very large and often limits the extent and depth of such studies. Here, we present a quick yet similarly accurate procedure for flood hazard assessment in Europe. Firstly, a statistical model based on Bayesian Networks is used. It describes the joint distribution of annual maxima of daily discharges of European rivers with variables describing the geographical characteristics of their catchments. It was quantified with 75,000 station-years of river discharge, as well as climate, terrain and land use data. The model's predictions of average annual maxima or discharges with certain return periods are of similar performance to physical rainfall-runoff models applied at continental scale. A database of discharge scenarios - return periods under present and future climate - was prepared for the majority of European rivers. Secondly, those scenarios were used as boundary conditions for the one-dimensional (1D) hydrodynamic model SOBEK. Utilizing 1D instead of 2D modelling conserved computational time, yet gave satisfactory results. The resulting pan-European flood map was contrasted with some local high-resolution studies. Indeed, the comparison shows that, overall, the methods presented here gave similar or better alignment with local studies than the previously released pan-European flood map.
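Return-period discharges of the kind stored in the scenario database can be estimated from annual maxima. A minimal sketch using a Gumbel fit, a common extreme-value choice for annual maxima (the station data here are synthetic, and the paper's Bayesian Network model is far more elaborate):

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(4)
# Synthetic annual maxima of daily discharge (m^3/s) for one hypothetical station
annual_max = gumbel_r.rvs(loc=800.0, scale=200.0, size=75, random_state=rng)

# Fit the Gumbel distribution and read off return levels from its quantiles
loc, scale = gumbel_r.fit(annual_max)
q10 = gumbel_r.ppf(1 - 1 / 10, loc, scale)     # 10-year return level
q100 = gumbel_r.ppf(1 - 1 / 100, loc, scale)   # 100-year return level
print(f"10-yr: {q10:.0f} m^3/s, 100-yr: {q100:.0f} m^3/s")
```

The T-year return level is the discharge exceeded with probability 1/T in any given year; such levels are exactly the boundary conditions fed to the hydrodynamic model in the second step.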

  4. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

    Full Text Available Abstract Genomic selection refers to the use of genomewide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to successfully cope with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled for each marker or pair of flanking markers (i.e. the predictors) separately. The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the next-to-last generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. A determination of a predictor-specific degree of smoothing increased the accuracy.
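The "estimated simultaneously using additive model theory" step is typically done by backfitting: each predictor's function is re-smoothed against the partial residuals of the others until convergence. A sketch on synthetic data, with a Gaussian kernel smoother standing in for the study's binomial kernel:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = np.sin(np.pi * x1) + x2 ** 2 + 0.1 * rng.standard_normal(n)

def smooth(x, r, h=0.15):
    # Nadaraya-Watson smoother (Gaussian kernel as a stand-in for the binomial kernel)
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (w @ r) / w.sum(axis=1)

# Backfitting: alternately smooth partial residuals for each predictor
f1, f2, mu = np.zeros(n), np.zeros(n), y.mean()
for _ in range(20):
    f1 = smooth(x1, y - mu - f2); f1 -= f1.mean()   # centre each component
    f2 = smooth(x2, y - mu - f1); f2 -= f2.mean()

resid = y - mu - f1 - f2
print("R^2:", round(1 - resid.var() / y.var(), 3))
```

In the genomic setting each marker (or flanking-marker pair) would contribute one such component, and the bandwidth h plays the role of the degree of smoothing chosen by bootstrapping.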

  5. A NOVEL SOFT COMPUTING MODEL ON LANDSLIDE HAZARD ZONE MAPPING

    Directory of Open Access Journals (Sweden)

    Iqbal Quraishi

    2012-11-01

    Full Text Available The effect of landslides is very prominent in India as well as worldwide. In India, the North-East region and all areas along the Himalayan range are prone to landslides; state-wise, Uttarakhand, Himachal Pradesh and the northern part of West Bengal are identified as landslide risk zones. In West Bengal, the Darjeeling area is our focus zone. There are several types of landslides, depending upon various conditions; the most significant triggering factor is earthquakes. Both field and GIS data are versatile and large in volume, and creating a proper data warehouse involves both remote and field studies. Our proposed soft computing model merges field and remote sensing data to create an optimized landslide-susceptibility map of the zone, and also provides a broad risk assessment. It takes census and economic survey data as input to calculate and predict the probable number of damaged houses, roads and other amenities, including the effect on GDP. The model is highly customizable and tends to provide situation-specific results. A fuzzy logic based approach has been considered to partially implement the model in terms of different parameter data sets, to show the effectiveness of the proposed model.
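The fuzzy-logic component can be illustrated with a minimal sketch: triangular membership functions on two hypothetical causal factors, combined with a min rule for fuzzy AND. All thresholds and values are invented for illustration only.

```python
import numpy as np

def trimf(x, a, b, c):
    # Triangular fuzzy membership function rising on [a, b], falling on [b, c]
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Toy causal factors for three map cells: slope (degrees) and rainfall (mm/day)
slope = np.array([5.0, 25.0, 40.0])
rain = np.array([10.0, 60.0, 120.0])

steep = trimf(slope, 15, 45, 75)       # membership in "steep slope"
wet = trimf(rain, 30, 100, 170)        # membership in "heavy rainfall"

# Rule: IF slope is steep AND rainfall is heavy THEN susceptibility is high
susceptibility = np.minimum(steep, wet)
print(susceptibility.round(2))
```

The gentle, dry cell scores zero while the steep, wet cell scores highest — the kind of graded output that a crisp threshold map cannot express.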

  6. Predictive models in hazard assessment of Great Lakes contaminants for fish

    Science.gov (United States)

    Passino, Dora R. May

    1986-01-01

    A hazard assessment scheme was developed and applied to predict potential harm to aquatic biota of nearly 500 organic compounds detected by gas chromatography/mass spectrometry (GC/MS) in Great Lakes fish. The frequency of occurrence and estimated concentrations of compounds found in lake trout (Salvelinus namaycush) and walleyes (Stizostedion vitreum vitreum) were compared with available manufacturing and discharge information. Bioconcentration potential of the compounds was estimated from available data or from calculations of quantitative structure-activity relationships (QSAR). Investigators at the National Fisheries Research Center-Great Lakes also measured the acute toxicity (48-h EC50s) of 35 representative compounds to Daphnia pulex and compared the results with acute toxicity values generated by QSAR. The QSAR-derived toxicities for several chemicals underestimated the actual acute toxicity by one or more orders of magnitude. A multiple regression of log EC50 on log water solubility and molecular volume proved to be a useful predictive model. Additional models providing insight into toxicity incorporate solvatochromic parameters that measure dipolarity/polarizability, hydrogen bond acceptor basicity, and hydrogen bond donor acidity of the solute (toxicant).
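The predictive model mentioned — a multiple regression of log EC50 on log water solubility and molecular volume — can be sketched on synthetic data (the coefficients, ranges and noise level are invented, not the study's fitted values):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 35                                     # mirrors the 35 compounds tested
log_sol = rng.uniform(-3.0, 2.0, n)        # log water solubility (hypothetical units)
mol_vol = rng.uniform(50.0, 300.0, n)      # molecular volume (hypothetical units)
# Synthetic response: higher solubility -> less bioaccumulation -> higher EC50
log_ec50 = 0.8 * log_sol - 0.004 * mol_vol + 0.2 * rng.standard_normal(n)

# Ordinary least squares for log EC50 ~ 1 + log_sol + mol_vol
X = np.column_stack([np.ones(n), log_sol, mol_vol])
beta, *_ = np.linalg.lstsq(X, log_ec50, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((log_ec50 - pred) ** 2) / np.sum((log_ec50 - log_ec50.mean()) ** 2)
print(f"R^2 = {r2:.2f}, coefficients = {beta.round(3)}")
```

With only two physico-chemical descriptors the regression recovers most of the (synthetic) toxicity variation, which is the appeal of such QSAR screening models.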

  7. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    Science.gov (United States)

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

    The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will be different depending on both the location of the site and the type of landslide considered; indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and the quality of data are generally very heterogeneous at a regional scale, it is necessary to take into account the uncertainty in the analysis. In this perspective, a new hazard modeling method is developed and integrated in a program named ALICE. This program integrates mechanical stability analysis through GIS software, taking data uncertainty into account. The method proposes a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, an expertise-based approach is still necessary to finalize the maps; indeed, it is the only way to take into account some influential factors in slope stability, such as the heterogeneity of the geological formations or the effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios into the ALICE program, and especially their precipitation component, with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data and so forth, the program classifies hazard zones depending on geotechnics and on different hydrological contexts varying in time.
This communication, realized within the framework of the Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development.

  8. Medical Modeling of Particle Size Effects for CB Inhalation Hazards

    Science.gov (United States)

    2015-09-01

    …of body orientation (posture) on deposition, and availability of 3-D stochastic lung geometry for more realistic assessment of variation of dose in… hygiene community, with their role of monitoring and protecting workers in the workplace, has had a key role in developing standard models of… M.G., Miller, F.J. and Raabe, O.G. (1995). Particle Inhalability Curves for Humans and Small Laboratory Animals. Annals of Occupational Hygiene 39.

  9. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    Science.gov (United States)

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered, because we had not determined how to properly treat these earthquakes in the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinion on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after
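The hazard curves behind such maps convert annual exceedance rates into probabilities over an exposure time via the standard Poisson assumption of probabilistic seismic hazard analysis. A minimal example of that arithmetic (generic PSHA bookkeeping, not code from the USGS model):

```python
import math

def exceedance_prob(annual_rate, years):
    # Poisson model: P(at least one exceedance in t years) = 1 - exp(-rate * t)
    return 1.0 - math.exp(-annual_rate * years)

# A ground-motion level exceeded at 1/475 per year (the common "475-year" level)
rate = 1.0 / 475.0
print(f"50-yr exceedance probability: {exceedance_prob(rate, 50):.3f}")  # ~0.100
```

This is why the 475-year return period corresponds to the familiar "10% in 50 years" hazard level quoted on seismic hazard maps.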

  10. Modeling geomagnetic induction hazards using a 3-D electrical conductivity model of Australia

    Science.gov (United States)

    Wang, Liejun; Lewis, Andrew M.; Ogawa, Yasuo; Jones, William V.; Costelloe, Marina T.

    2016-12-01

    The surface electric field induced by external geomagnetic source fields is modeled for a continental-scale 3-D electrical conductivity model of Australia at periods of a few minutes to a few hours. The amplitude and orientation of the induced electric field at periods of 360 s and 1800 s are presented and compared to those derived from a simplified ocean-continent (OC) electrical conductivity model. It is found that the induced electric field in the Australian region is distorted by the heterogeneous continental electrical conductivity structures and surrounding oceans. On the northern coastlines, the induced electric field is decreased relative to the simple OC model due to a reduced conductivity contrast between the seas and the enhanced conductivity structures inland. In central Australia, the induced electric field is less distorted with respect to the OC model as the location is remote from the oceans, but inland crustal high-conductivity anomalies are the major source of distortion of the induced electric field. In the west of the continent, the lower conductivity of the Western Australia Craton increases the conductivity contrast between the deeper oceans and land and significantly enhances the induced electric field. Generally, the induced electric field in southern Australia, south of latitude -20°, is higher compared to northern Australia. This paper provides a regional indicator of geomagnetic induction hazards across Australia.
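The enhancement of the induced electric field over resistive structures follows already from the plane-wave (magnetotelluric) uniform half-space relation |E| = |B|·sqrt(ω/(μ₀σ)). A sketch with hypothetical values (a uniform half-space only — the paper's continental-scale 3-D conductivity model captures far more structure):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def surface_e_field(b_tesla, period_s, sigma):
    # Half-space plane-wave estimate of the surface electric field amplitude (V/m)
    omega = 2.0 * math.pi / period_s
    return b_tesla * math.sqrt(omega / (MU0 * sigma))

b = 100e-9  # a 100 nT geomagnetic disturbance
for sigma in (0.001, 0.1):  # resistive craton vs conductive crust (S/m)
    e = surface_e_field(b, 360.0, sigma)
    print(f"sigma = {sigma:5.3f} S/m -> |E| = {1000 * e:.3f} V/km")
```

A hundredfold drop in conductivity raises the surface field tenfold, consistent with the enhanced hazard the paper finds over the resistive Western Australia Craton.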

  11. Semiparametric Additive Transformation Model under Current Status Data

    CERN Document Server

    Cheng, Guang

    2011-01-01

    We consider the efficient estimation of the semiparametric additive transformation model with current status data. A wide range of survival models and econometric models can be incorporated into this general transformation framework. We apply the B-spline approach to simultaneously estimate the linear regression vector, the nondecreasing transformation function, and a set of nonparametric regression functions. We show that the parametric estimate is semiparametrically efficient in the presence of multiple nonparametric nuisance functions. An explicit consistent B-spline estimate of the asymptotic variance is also provided. All nonparametric estimates are smooth, and are shown to be uniformly consistent with faster-than-cubic rates of convergence. Interestingly, we observe a convergence-rate interference phenomenon, i.e., the convergence rates of the B-spline estimators are all slowed down to equal the slowest one. Constrained optimization is not required in our implementation. Numerical results are used to illustrate...

  12. Thermal modelling of extrusion based additive manufacturing of composite materials

    DEFF Research Database (Denmark)

    Jensen, Mathias Laustsen; Sonne, Mads Rostgaard; Hattel, Jesper Henri

    2017-01-01

    One of the hottest topics in manufacturing in recent years is additive manufacturing (AM). AM is a young branch of manufacturing techniques, which by nature is disruptive due to its completely different manufacturing approach, wherein material is added instead of removed. By adding material… process knowledge, and validating the generated toolpaths before the real manufacturing process takes place, hence removing time-consuming and expensive trial-and-error processes for new products. This study applies a 2D restricted finite volume model aimed at describing thermoplastic acrylonitrile-butadiene-styrene (ABS) and thermosetting polyurethane (PU) material extrusion processes. During the experimental evaluation of the produced models it is found that some critical material properties need to be further investigated to increase the precision of the model. It is, however, also found that even with only…

  13. An example of debris-flows hazard modeling using GIS

    Directory of Open Access Journals (Sweden)

    L. Melelli

    2004-01-01

    Full Text Available We present a GIS-based model for predicting debris-flow occurrence. The availability of two different digital datasets and the use of a Digital Elevation Model (at a given scale) have greatly enhanced our ability to quantify and to analyse the topography in relation to debris-flows. In particular, analysing the relationship between debris-flows and the various causative factors provides new understanding of the mechanisms. We studied the contact zone between the calcareous basement and the fluvial-lacustrine infill in the adjacent northern area of the Terni basin (Umbria, Italy), and identified eleven basins and corresponding alluvial fans. We suggest that accumulations of colluvium in topographic hollows, whatever the sources might be, should be considered potential debris-flow source areas. In order to develop a susceptibility map for the entire area, an index was calculated as the number of initiation locations in each causative-factor unit divided by the areal extent of that unit within the study area. This index identifies those units that produce the most debris-flows in each Representative Elementary Area (REA). Finally, the results are presented with the advantages and the disadvantages of the approach, and the need for further research.
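The susceptibility index described — initiation count per causative-factor unit divided by that unit's areal extent — is simple to compute. A sketch with invented counts and areas (the unit names only echo the abstract; none of these numbers are from the study):

```python
import numpy as np

# Hypothetical debris-flow initiation counts and areal extents per terrain unit
units = ["calcareous", "fluvial-lacustrine", "colluvium"]
initiations = np.array([3, 7, 18])
area_km2 = np.array([12.0, 9.0, 4.5])

# Index: initiation locations per unit, normalized by the unit's area
index = initiations / area_km2
for u, i in zip(units, index):
    print(f"{u:>20}: {i:.2f} events/km^2")
```

Normalizing by area prevents large but inert units from dominating: here the small colluvium class ranks highest, matching the abstract's point that colluvium-filled hollows are the key source areas.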

  14. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    Science.gov (United States)

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors for measuring signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations for modeling the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards, along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazards warning systems.

  15. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    Science.gov (United States)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials, and to develop experimental methods to provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate; examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions with numerical modeling as an effective prediction tool. The processing targets are multiple and at different spatial scales, and the associated physical phenomena are multiphysics and multiscale in nature. In this project, the research work has been developed to model AAM processes with a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with a mushy zone, incompressible free-surface flow, solute redistribution, and surface tension. Because of the excessive computing time needed, a parallel computing approach was also tested.

  16. Combining computational models for landslide hazard assessment of Guantánamo province, Cuba

    NARCIS (Netherlands)

    Castellanos Abella, E.A.

    2009-01-01

    As part of the Cuban system for landslide disaster management, a methodology was developed for regional scale landslide hazard assessment, which is a combination of different models. The method was applied in Guantánamo province at 1:100 000 scale. The analysis started with an extensive aerial photo

  17. On estimation and tests of time-varying effects in the proportional hazards model

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder; Martinussen, Torben

    Grambsch and Therneau (1994) suggest using Schoenfeld's residuals to investigate whether some of the regression coefficients in Cox's (1972) proportional hazards model are time-dependent. Their method is a one-step procedure based on Cox's initial estimate. We suggest an algorithm which in the first...

  18. Combining computational models for landslide hazard assessment of Guantánamo province, Cuba

    NARCIS (Netherlands)

    Castellanos Abella, E.A.

    2008-01-01

    As part of the Cuban system for landslide disaster management, a methodology was developed for regional scale landslide hazard assessment, which is a combination of different models. The method was applied in Guantánamo province at 1:100 000 scale. The analysis started with an extensive aerial

  19. Teen Court Referral, Sentencing, and Subsequent Recidivism: Two Proportional Hazards Models and a Little Speculation

    Science.gov (United States)

    Rasmussen, Andrew

    2004-01-01

    This study extends literature on recidivism after teen court to add system-level variables to demographic and sentence content as relevant covariates. Interviews with referral agents and survival analysis with proportional hazards regression supplement quantitative models that include demographic, sentencing, and case-processing variables in a…

  20. Testing exclusion restrictions and additive separability in sample selection models

    DEFF Research Database (Denmark)

    Huber, Martin; Mellace, Giovanni

    2014-01-01

    Standard sample selection models with non-randomly censored outcomes assume (i) an exclusion restriction (i.e., a variable affecting selection, but not the outcome) and (ii) additive separability of the errors in the selection process. This paper proposes tests for the joint satisfaction of these assumptions by applying the approach of Huber and Mellace (Testing instrument validity for LATE identification based on inequality moment constraints, 2011) (for testing instrument validity under treatment endogeneity) to the sample selection framework. We show that the exclusion restriction and additive separability imply two testable inequality constraints that come from both point identifying and bounding the outcome distribution of the subpopulation that is always selected/observed. We apply the tests to two variables for which the exclusion restriction is frequently invoked in female wage regressions: non…

  1. Utilizing NASA Earth Observations to Model Volcanic Hazard Risk Levels in Areas Surrounding the Copahue Volcano in the Andes Mountains

    Science.gov (United States)

    Keith, A. M.; Weigel, A. M.; Rivas, J.

    2014-12-01

    Copahue is a stratovolcano located along the rim of the Caviahue Caldera near the Chile-Argentina border in the Andes Mountain Range. There are several small towns located in proximity of the volcano with the two largest being Banos Copahue and Caviahue. During its eruptive history, it has produced numerous lava flows, pyroclastic flows, ash deposits, and lahars. This isolated region has steep topography and little vegetation, rendering it poorly monitored. The need to model volcanic hazard risk has been reinforced by recent volcanic activity that intermittently released several ash plumes from December 2012 through May 2013. Exposure to volcanic ash is currently the main threat for the surrounding populations as the volcano becomes more active. The goal of this project was to study Copahue and determine areas that have the highest potential of being affected in the event of an eruption. Remote sensing techniques were used to examine and identify volcanic activity and areas vulnerable to experiencing volcanic hazards including volcanic ash, SO2 gas, lava flow, pyroclastic density currents and lahars. Landsat 7 Enhanced Thematic Mapper Plus (ETM+), Landsat 8 Operational Land Imager (OLI), EO-1 Advanced Land Imager (ALI), Terra Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Shuttle Radar Topography Mission (SRTM), ISS ISERV Pathfinder, and Aura Ozone Monitoring Instrument (OMI) products were used to analyze volcanic hazards. These datasets were used to create a historic lava flow map of the Copahue volcano by identifying historic lava flows, tephra, and lahars both visually and spectrally. Additionally, a volcanic risk and hazard map for the surrounding area was created by modeling the possible extent of ash fallout, lahars, lava flow, and pyroclastic density currents (PDC) for future eruptions. These model results were then used to identify areas that should be prioritized for disaster relief and evacuation orders.

  2. Two-stage local M-estimation of additive models

    Institute of Scientific and Technical Information of China (English)

    JIANG JianCheng; LI JianTao

    2008-01-01

    This paper studies local M-estimation of the nonparametric components of additive models. A two-stage local M-estimation procedure is proposed for estimating the additive components and their derivatives. Under very mild conditions, the proposed estimators of each additive component and its derivative are jointly asymptotically normal and share the same asymptotic distributions as they would have if the other components were known. The established asymptotic results also hold for two particular local M-estimations: the local least squares and least absolute deviation estimations. However, for general two-stage local M-estimation with continuous and nonlinear ψ-functions, its implementation is time-consuming. To reduce the computational burden, one-step approximations to the two-stage local M-estimators are developed. The one-step estimators are shown to achieve the same efficiency as the fully iterative two-stage local M-estimators, which makes the two-stage local M-estimation more feasible in practice. The proposed estimators inherit the advantages and at the same time overcome the disadvantages of the local least-squares based smoothers. In addition, the practical implementation of the proposed estimation is considered in detail. Simulations demonstrate the merits of the two-stage local M-estimation, and a real example illustrates the performance of the methodology.

  4. Global river flood hazard maps: hydraulic modelling methods and appropriate uses

    Science.gov (United States)

    Townend, Samuel; Smith, Helen; Molloy, James

    2014-05-01

    Flood hazard is not well understood or documented in many parts of the world. Consequently, the (re-)insurance sector now needs to better understand where the potential for considerable river flooding aligns with significant exposure. For example, international manufacturing companies are often attracted to countries with emerging economies, meaning that events such as the 2011 Thailand floods have resulted in many multinational businesses with assets in these regions incurring large, unexpected losses. This contribution addresses and critically evaluates the hydraulic methods employed to develop a consistent global scale set of river flood hazard maps, used to fill the knowledge gap outlined above. The basis of the modelling approach is an innovative, bespoke 1D/2D hydraulic model (RFlow) which has been used to model a global river network of over 5.3 million kilometres. Estimated flood peaks at each of these model nodes are determined using an empirically based rainfall-runoff approach linking design rainfall to design river flood magnitudes. The hydraulic model is used to determine extents and depths of floodplain inundation following river bank overflow. From this, deterministic flood hazard maps are calculated for several design return periods between 20-years and 1,500-years. Firstly, we will discuss the rationale behind the appropriate hydraulic modelling methods and inputs chosen to produce a consistent global scaled river flood hazard map. This will highlight how a model designed to work with global datasets can be more favourable for hydraulic modelling at the global scale and why using innovative techniques customised for broad scale use are preferable to modifying existing hydraulic models. Similarly, the advantages and disadvantages of both 1D and 2D modelling will be explored and balanced against the time, computer and human resources available, particularly when using a Digital Surface Model at 30m resolution. Finally, we will suggest some

  5. Building a risk-targeted regional seismic hazard model for South-East Asia

    Science.gov (United States)

    Woessner, J.; Nyst, M.; Seyhan, E.

    2015-12-01

    The last decade has tragically shown the social and economic vulnerability of countries in South-East Asia to earthquake hazard and risk. While many disaster mitigation programs and initiatives to improve societal earthquake resilience are under way with a focus on saving lives and livelihoods, the risk management sector is challenged to develop appropriate models to cope with the economic consequences and the impact on the insurance business. We present the source model and ground motion model components suitable for a South-East Asia earthquake risk model covering Indonesia, Malaysia, the Philippines and the Indochina countries. The source model builds upon refined modelling approaches to characterize (1) seismic activity on crustal faults from geologic and geodetic data, (2) seismicity along the interface of subduction zones and within the slabs, and (3) earthquakes not occurring on mapped fault structures. We elaborate on building a self-consistent rate model for the hazardous crustal fault systems (e.g. the Sumatra fault zone, the Philippine fault zone) as well as the subduction zones, and showcase some characteristics and sensitivities due to existing uncertainties in the rate and hazard space using a well-selected suite of ground motion prediction equations. Finally, we analyze the source model by quantifying the contribution by source type (e.g., subduction zone, crustal fault) to typical risk metrics (e.g., return period losses, average annual loss) and reviewing their relative impact on various lines of business.

  6. Sensitivity analysis of geometric errors in additive manufacturing medical models.

    Science.gov (United States)

    Pinto, Jose Miguel; Arrieta, Cristobal; Andia, Marcelo E; Uribe, Sergio; Ramos-Grez, Jorge; Vargas, Alex; Irarrazaval, Pablo; Tejos, Cristian

    2015-03-01

    Additive manufacturing (AM) models are used in medical applications for surgical planning, prosthesis design and teaching. For these applications, the accuracy of the AM models is essential. Unfortunately, this accuracy is compromised due to errors introduced by each of the building steps: image acquisition, segmentation, triangulation, printing and infiltration. However, the contribution of each step to the final error remains unclear. We performed a sensitivity analysis comparing errors obtained from a reference with those obtained modifying parameters of each building step. Our analysis considered global indexes to evaluate the overall error, and local indexes to show how this error is distributed along the surface of the AM models. Our results show that the standard building process tends to overestimate the AM models, i.e. models are larger than the original structures. They also show that the triangulation resolution and the segmentation threshold are critical factors, and that the errors are concentrated at regions with high curvatures. Errors could be reduced choosing better triangulation and printing resolutions, but there is an important need for modifying some of the standard building processes, particularly the segmentation algorithms. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  7. Seismic hazard assessment for Myanmar: Earthquake model database, ground-motion scenarios, and probabilistic assessments

    Science.gov (United States)

    Chan, C. H.; Wang, Y.; Thant, M.; Maung Maung, P.; Sieh, K.

    2015-12-01

    We have constructed an earthquake and fault database, conducted a series of ground-shaking scenarios, and proposed seismic hazard maps for all of Myanmar as well as hazard curves for selected cities. Our earthquake database integrates the ISC, ISC-GEM and global ANSS Comprehensive Catalogues, and includes harmonized magnitude scales without duplicate events. Our active fault database includes active fault data from previous studies. Using the parameters from these updated databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and the elapsed time since the last events), we have determined the earthquake recurrence models of seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests by matching modelled ground motions to the felt intensities of earthquakes. Through the case of the 1975 Bagan earthquake, we determined that the ground motion prediction equations (GMPEs) of Atkinson and Boore (2003) fit the behaviour of subduction events best. Likewise, the 2011 Tarlay and 2012 Thabeikkyin events suggested that the GMPEs of Akkar and Cagnan (2010) fit crustal earthquakes best. We thus incorporated the best-fitting GMPEs and site conditions based on Vs30 (the average shear velocity down to 30 m depth), from analysis of topographic slope and microtremor array measurements, to assess seismic hazard. The hazard is highest in regions close to the Sagaing Fault and along the western coast of Myanmar, as seismic sources there produce earthquakes at short intervals and/or their last events occurred long ago. The hazard curves for the cities of Bago, Mandalay, Sagaing, Taungoo and Yangon show higher hazards for sites close to an active fault or with a low Vs30, e.g., the downtown of Sagaing and Shwemawdaw Pagoda in Bago.
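
    The recurrence side of such a model can be illustrated with a minimal sketch: under a Gutenberg-Richter relationship, log10 N(M) = a − b·M gives the annual rate of events at or above magnitude M, and a Poisson assumption converts that rate into an exceedance probability over an exposure time. The a and b values below are illustrative placeholders, not parameters of the Myanmar database.

    ```python
    import math

    def gr_annual_rate(m, a, b):
        """Annual rate of events with magnitude >= m under Gutenberg-Richter:
        log10 N(m) = a - b*m."""
        return 10.0 ** (a - b * m)

    def poisson_exceedance(rate, years):
        """Probability of at least one event in `years` for a Poisson process."""
        return 1.0 - math.exp(-rate * years)

    # illustrative values (NOT from the Myanmar model): a = 5.0, b = 1.0
    rate_m5 = gr_annual_rate(5.0, a=5.0, b=1.0)   # 1 event/yr with M >= 5
    rate_m6 = gr_annual_rate(6.0, a=5.0, b=1.0)   # 0.1 events/yr with M >= 6
    p_50yr = poisson_exceedance(rate_m6, 50)      # chance of an M >= 6 in 50 yr
    ```

    Hazard curves in a full PSHA additionally integrate over GMPE-predicted shaking, but the rate-to-probability conversion above is the common backbone.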

  8. Estimation and variable selection for generalized additive partial linear models

    KAUST Repository

    Wang, Li

    2011-08-01

    We study generalized additive partial linear models, proposing the use of polynomial spline smoothing for estimation of nonparametric functions, and deriving quasi-likelihood based estimators for the linear parameters. We establish asymptotic normality for the estimators of the parametric components. The procedure avoids solving large systems of equations as in kernel-based procedures and thus results in gains in computational simplicity. We further develop a class of variable selection procedures for the linear parameters by employing a nonconcave penalized quasi-likelihood, which is shown to have an asymptotic oracle property. Monte Carlo simulations and an empirical example are presented for illustration. © Institute of Mathematical Statistics, 2011.

  9. Additive Manufacturing of Medical Models--Applications in Rhinology.

    Science.gov (United States)

    Raos, Pero; Klapan, Ivica; Galeta, Tomislav

    2015-09-01

    In this paper we introduce guidelines and suggestions for the use of 3D image-processing software in head pathology diagnostics, together with procedures for obtaining a physical medical model by additive manufacturing/rapid prototyping techniques, bearing in mind improved surgical performance, maximum safety, and faster postoperative recovery of patients. This approach has been verified in two case reports. In the treatment we used intelligent classifier schemes for abnormal patterns within a computer-based system for 3D virtual and endoscopic assistance in rhinology, with appropriate visualization of anatomy and pathology within the nose, paranasal sinuses, and skull base area.

  10. Malliavin's calculus in insider models: Additional utility and free lunches

    OpenAIRE

    2002-01-01

    We consider simple models of financial markets with regular traders and insiders possessing some extra information hidden in a random variable which is accessible to the regular trader only at the end of the trading interval. The problems we focus on are the calculation of the additional utility of the insider and a study of his free lunch possibilities. The information drift, i.e. the drift to eliminate in order to preserve the martingale property in the insider's filtration, turns out to be...

  11. Using the RBFN model and GIS technique to assess wind erosion hazards of Inner Mongolia, China

    Science.gov (United States)

    Shi, Huading; Liu, Jiyuan; Zhuang, Dafang; Hu, Yunfeng

    2006-08-01

    Soil wind erosion is the primary process and the main driving force for land desertification and sand-dust storms in the arid and semi-arid areas of Northern China, and it has attracted considerable research attention. This paper selects the Inner Mongolia autonomous region as the research area, quantifies the various indicators affecting soil wind erosion, uses GIS technology to extract the spatial data, and constructs an RBFN (Radial Basis Function Network) model for the assessment of wind erosion hazard. After training on sample data for the different levels of wind erosion hazard, we obtain the parameters of the model and then assess the wind erosion hazard. The results show that wind erosion hazard is very severe in the southern parts of Inner Mongolia, varies from moderate to severe in the counties of the middle regions, and is slight in the east. Comparison of this result with other research shows that it is in conformity with actual conditions, demonstrating the reasonability and applicability of the RBFN model.
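
    The idea of an RBFN classifier can be sketched independently of the paper's data: inputs pass through Gaussian basis functions centred on prototypes, and a linear output layer is fitted by least squares. The features, centres, and labels below are toy values, not the Inner Mongolia indicators.

    ```python
    import numpy as np

    def rbf_design(X, centers, gamma):
        # Gaussian radial basis activations: exp(-gamma * ||x - c||^2)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    # toy training data: two hazard levels separated in a 2-D indicator space
    X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
    y = np.array([0.0, 0.0, 1.0, 1.0])            # 0 = slight, 1 = severe

    centers = X[[0, 2]]                           # one prototype per class (assumed)
    Phi = rbf_design(X, centers, gamma=2.0)       # hidden-layer activations
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # output-layer weights

    # score two unseen locations; > 0.5 reads as "severe"
    pred = rbf_design(np.array([[0.05, 0.05], [0.95, 1.0]]), centers, gamma=2.0) @ w
    ```

    Real RBFN training typically also selects centres (e.g., by clustering) and the width gamma; both are fixed here for brevity.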

  12. Cost models of additive manufacturing: A literature review

    Directory of Open Access Journals (Sweden)

    G. Costabile

    2016-11-01

    Over the past decades, increasing attention has been paid to the quality of the technological and mechanical properties achieved by Additive Manufacturing (AM); these properties have reached a level that bears comparison with the results achieved by traditional technologies. AM is therefore mature enough for industries to adopt it within a more general production framework, such as that of the mechanical manufacturing industry. Since the technological and mechanical properties of materials produced with AM are also beneficial, the primary objective of this paper is to focus on managerial facets, such as cost control in a production environment where these new technologies are present. This paper analyses the existing literature on cost models developed specifically for AM from an operations management point of view and discusses the strengths and weaknesses of all the models.

  13. Multiscale Modeling of Powder Bed-Based Additive Manufacturing

    Science.gov (United States)

    Markl, Matthias; Körner, Carolin

    2016-07-01

    Powder bed fusion processes are additive manufacturing technologies that are expected to induce the third industrial revolution. Components are built up layer by layer in a powder bed by selectively melting confined areas, according to sliced 3D model data. This technique allows for manufacturing of highly complex geometries hardly machinable with conventional technologies. However, the underlying physical phenomena are sparsely understood and difficult to observe during processing. Therefore, an intensive and expensive trial-and-error principle is applied to produce components with the desired dimensional accuracy, material characteristics, and mechanical properties. This review presents numerical modeling approaches on multiple length scales and timescales to describe different aspects of powder bed fusion processes. In combination with tailored experiments, the numerical results enlarge the process understanding of the underlying physical mechanisms and support the development of suitable process strategies and component topologies.

  14. Kinetics approach to modeling of polymer additive degradation in lubricants

    Institute of Scientific and Technical Information of China (English)

    Ilya I. KUDISH; Ruben G. AIRAPETYAN; Michael J. COVITCH

    2001-01-01

    A kinetics problem for a degrading polymer additive dissolved in a base stock is studied. The polymer degradation may be caused by the combination of such lubricant flow parameters as pressure, elongational strain rate, and temperature, as well as lubricant viscosity and the polymer characteristics (dissociation energy, bead radius, bond length, etc.). A fundamental approach to the problem of modeling mechanically induced polymer degradation is proposed. The polymer degradation is modeled on the basis of a kinetic equation for the density of the statistical distribution of polymer molecules as a function of their molecular weight. The integrodifferential kinetic equation for polymer degradation is solved numerically. The effects of pressure, elongational strain rate, temperature, and lubricant viscosity on the process of lubricant degradation are considered. The increase of pressure promotes fast degradation while the increase of temperature delays degradation. A comparison of a numerically calculated molecular weight distribution with an experimental one obtained in bench tests showed that they are in excellent agreement with each other.
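
    The kinetic equation itself is integrodifferential, but the scission mechanism can be caricatured with a heavily simplified, assumed discretization: chains are binned by length in powers of two, each scission of a class-i chain yields two class-(i−1) fragments, and the distribution evolves by explicit Euler steps. The rates and initial distribution below are invented; conservation of total monomer mass is the property worth checking.

    ```python
    import numpy as np

    def degrade(n, k, dt, steps):
        """Explicit-Euler evolution of a binary-scission toy model: n[i] counts
        chains of length 2**i; a scission of a class-i chain yields two chains
        of class i-1, so total mass sum(n[i] * 2**i) is conserved."""
        n = n.astype(float).copy()
        k = np.asarray(k, dtype=float).copy()
        k[0] = 0.0                       # shortest chains cannot break further
        for _ in range(steps):
            broken = k * n * dt          # chains breaking during this step
            n = n - broken
            n[:-1] += 2.0 * broken[1:]   # two half-length fragments each
        return n

    n0 = np.array([0.0, 0.0, 0.0, 100.0])     # all chains in the longest class
    k = np.array([0.0, 0.05, 0.1, 0.5])       # longer chains break faster (assumed)
    n1 = degrade(n0, k, dt=0.01, steps=1000)  # integrate to t = 10
    mass = (n1 * 2 ** np.arange(4)).sum()     # should remain 100 * 8 = 800
    ```

    The paper's model additionally couples the rates to pressure, strain rate, and temperature; here the rates are constants purely to show the bookkeeping.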

  15. Model Checking Vector Addition Systems with one zero-test

    CERN Document Server

    Bonet, Rémi; Leroux, Jérôme; Zeitoun, Marc

    2012-01-01

    We design a variation of the Karp-Miller algorithm to compute, in a forward manner, a finite representation of the cover (i.e., the downward closure of the reachability set) of a vector addition system with one zero-test. This algorithm yields decision procedures for several problems for these systems, open until now, such as place-boundedness or LTL model-checking. The proof techniques to handle the zero-test are based on two new notions of cover: the refined and the filtered cover. The refined cover is a hybrid between the reachability set and the classical cover. It inherits properties of the reachability set: equality of two refined covers is undecidable, even for usual Vector Addition Systems (with no zero-test), but the refined cover of a Vector Addition System is a recursive set. The second notion of cover, called the filtered cover, is the central tool of our algorithms. It inherits properties of the classical cover, and in particular, one can effectively compute a finite representation of this set, e...
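
    The classical Karp-Miller construction on which these cover notions build can be sketched for a plain vector addition system (no zero-test): along each branch, coordinates that strictly grow after revisiting a dominated ancestor are accelerated to ω, represented here by floating-point infinity. This is a minimal illustration of the classical algorithm, not the paper's refined or filtered cover; the example system is invented.

    ```python
    from math import inf

    def karp_miller(init, transitions):
        """Coverability set of a plain Vector Addition System: each transition
        is a tuple of integer deltas; unbounded coordinates become `inf`."""
        root = tuple(init)
        parent = {root: None}            # node -> parent on its branch
        work = [root]
        while work:
            m = work.pop()
            for t in transitions:
                succ = tuple(x + d for x, d in zip(m, t))
                if any(x < 0 for x in succ):
                    continue             # transition not enabled at m
                # acceleration: a strict ancestor dominated by succ pumps the
                # strictly larger coordinates up to omega
                anc = m
                while anc is not None:
                    if anc != succ and all(a <= s for a, s in zip(anc, succ)):
                        succ = tuple(inf if a < s else s
                                     for a, s in zip(anc, succ))
                    anc = parent[anc]
                if succ not in parent:
                    parent[succ] = m
                    work.append(succ)
        return set(parent)

    # toy VAS: one transition that keeps producing tokens in place 2
    cover = karp_miller((1, 0), [(0, 1)])
    ```

    Handling the zero-test is exactly what this simple construction cannot do, which is why the paper introduces the refined and filtered covers.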

  16. WATEQ3 geochemical model: thermodynamic data for several additional solids

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, K.M.; Jenne, E.A.

    1982-09-01

    Geochemical models such as WATEQ3 can be used to model the concentrations of water-soluble pollutants that may result from the disposal of nuclear waste and retorted oil shale. However, for a model to competently deal with these water-soluble pollutants, an adequate thermodynamic data base must be provided that includes elements identified as important in modeling these pollutants. To this end, several minerals and related solid phases were identified that were absent from the thermodynamic data base of WATEQ3. In this study, the thermodynamic data for the identified solids were compiled and selected from several published tabulations of thermodynamic data. For these solids, an accepted Gibbs free energy of formation, ΔG°f,298, was selected for each solid phase based on the recentness of the tabulated data and on considerations of internal consistency with respect to both the published tabulations and the existing data in WATEQ3. For those solids not included in these published tabulations, Gibbs free energies of formation were calculated from published solubility data (e.g., lepidocrocite), or were estimated (e.g., nontronite) using a free-energy summation method described by Mattigod and Sposito (1978). The accepted or estimated free energies were then combined with internally consistent, ancillary thermodynamic data to calculate equilibrium constants for the hydrolysis reactions of these minerals and related solid phases. Including these values in the WATEQ3 data base increased the competency of this geochemical model in applications associated with the disposal of nuclear waste and retorted oil shale. Additional minerals and related solid phases that need to be added to the solubility submodel will be identified as modeling applications continue in these two programs.
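
    The final step mentioned here, converting reaction free energies into equilibrium constants, follows from the standard relation ΔG°r = −RT ln K. A minimal sketch with an assumed reaction free energy (the value is illustrative, not from the WATEQ3 data base):

    ```python
    import math

    R = 8.314462618      # gas constant, J/(mol K)
    T = 298.15           # reference temperature, K

    def equilibrium_constant(delta_g_kj_per_mol):
        """K = exp(-dG_r / (R T)) for a reaction free energy given in kJ/mol."""
        return math.exp(-delta_g_kj_per_mol * 1000.0 / (R * T))

    # illustrative dissolution reaction with dG_r = +30 kJ/mol (assumed value)
    K = equilibrium_constant(30.0)
    log_k = math.log10(K)    # positive dG_r gives log K < 0, i.e. low solubility
    ```

    In practice ΔG°r is assembled by summing formation free energies of products minus reactants, which is where the tabulated ΔG°f,298 values enter.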

  17. Metal Big Area Additive Manufacturing: Process Modeling and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Simunovic, Srdjan [ORNL; Nycz, Andrzej [ORNL; Noakes, Mark W [ORNL; Chin, Charlie [Dassault Systemes; Oancea, Victor [Dassault Systemes

    2017-01-01

    Metal Big Area Additive Manufacturing (mBAAM) is a new additive manufacturing (AM) technology for printing large-scale 3D objects. mBAAM is based on the gas metal arc welding process and uses a continuous feed of welding wire to manufacture an object. An electric arc forms between the wire and the substrate, which melts the wire and deposits a bead of molten metal along the predetermined path. In general, the welding process parameters and local conditions determine the shape of the deposited bead. The sequence of the bead deposition and the corresponding thermal history of the manufactured object determine the long range effects, such as thermal-induced distortions and residual stresses. Therefore, the resulting performance or final properties of the manufactured object depend on its geometry and the deposition path, in addition to the basic welding process parameters. Physical testing is critical for gaining the necessary knowledge for quality prints, but traversing the process parameter space in order to develop an optimized build strategy for each new design is impractical by pure experimental means. Computational modeling and optimization may accelerate development of a build process strategy and save time and resources. Because computational modeling provides these opportunities, we have developed a physics-based Finite Element Method (FEM) simulation framework and numerical models to support the mBAAM process's development and design. In this paper, we performed a sequentially coupled heat transfer and stress analysis for predicting the final deformation of a small rectangular structure printed using the mild steel welding wire. Using the new simulation technologies, material was progressively added into the FEM simulation as the arc weld traversed the build path. In the sequentially coupled heat transfer and stress analysis, the heat transfer was performed to calculate the temperature evolution, which was used in a stress analysis to
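
    The notion of progressively activating material in the thermal pass can be caricatured in one dimension: each bead is activated at melt temperature, then all active nodes relax by explicit finite-difference steps toward a substrate held at ambient. This is a toy sketch, not the FEM framework described above; all values are assumed.

    ```python
    def thermal_pass(n_cells=5, steps_per_cell=200, r=0.2,
                     t_melt=1500.0, t_sub=20.0):
        """1-D toy deposition: node 0 is a substrate held at t_sub; each new
        bead is activated at t_melt, then the active nodes cool by explicit
        FTCS steps (stable for r <= 0.5); the free end is insulated."""
        T = [t_sub]
        for _ in range(n_cells):
            T.append(t_melt)                  # activate a new, fully molten bead
            for _ in range(steps_per_cell):
                new = T[:]
                for i in range(1, len(T)):
                    # ghost node mirrors the last interior node (insulated end)
                    right = T[i + 1] if i + 1 < len(T) else T[i - 1]
                    new[i] = T[i] + r * (T[i - 1] - 2.0 * T[i] + right)
                T = new                       # node 0 stays clamped at t_sub
        return T

    profile = thermal_pass()   # temperatures after the last bead has relaxed
    ```

    A real sequentially coupled analysis would feed each such temperature history into a mechanical solve for thermal strain; here only the thermal half is sketched.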

  18. Advances in National Capabilities for Consequence Assessment Modeling of Airborne Hazards

    Energy Technology Data Exchange (ETDEWEB)

    Nasstrom, J; Sugiyama, G; Foster, K; Larsen, S; Kosovic, B; Eme, B; Walker, H; Goldstein, P; Lundquist, J; Pobanz, B; Fulton, J

    2007-11-26

    This paper describes ongoing advancement of airborne hazard modeling capabilities in support of multiple agencies through the National Atmospheric Release Advisory Center (NARAC) and the Interagency Atmospheric Modeling and Atmospheric Assessment Center (IMAAC). A suite of software tools developed by Lawrence Livermore National Laboratory (LLNL) and collaborating organizations includes simple stand-alone, local-scale plume modeling tools for end users' computers, Web- and Internet-based software to access advanced 3-D flow and atmospheric dispersion modeling tools and expert analysis from the national center at LLNL, and state-of-the-science high-resolution urban models and event reconstruction capabilities.

  19. Patterns of Risk Using an Integrated Spatial Multi-Hazard Model (PRISM Model)

    Science.gov (United States)

    Multi-hazard risk assessment has long centered on small scale needs, whereby a single community or group of communities’ exposures are assessed to determine potential mitigation strategies. While this approach has advanced the understanding of hazard interactions, it is li...

  20. Deriving global flood hazard maps of fluvial floods through a physical model cascade

    Science.gov (United States)

    Pappenberger, Florian; Dutra, Emanuel; Wetterhall, Fredrik; Cloke, Hannah L.

    2013-04-01

    Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large scale flood preparedness. Such global hazard maps can be generated using large scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteorological forcing data, and resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979-2010) simulation period. A Gumbel distribution is fitted to the annual maxima flows to derive a number of flood return periods. The return periods are calculated initially for a 25 × 25 km grid, which is then reprojected onto a 1 × 1 km grid to derive maps of higher resolution and estimate flooded fractional area for the individual 25 × 25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably to a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
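
    The return-period step can be sketched as follows: fit a Gumbel distribution to the annual maxima (the fitting method is not stated in the abstract; method of moments is used here for simplicity) and invert its CDF to obtain the T-year return level, x_T = μ − β ln(−ln(1 − 1/T)). The synthetic data below stand in for the simulated annual maximum flows.

    ```python
    import math
    import numpy as np

    rng = np.random.default_rng(42)
    # synthetic annual maximum flows (m^3/s), standing in for model output
    annual_maxima = rng.gumbel(loc=100.0, scale=20.0, size=10_000)

    # method-of-moments Gumbel fit: std = pi*beta/sqrt(6), mean = mu + gamma*beta
    beta = annual_maxima.std() * math.sqrt(6.0) / math.pi
    mu = annual_maxima.mean() - 0.5772156649 * beta   # Euler-Mascheroni constant

    def return_level(T):
        """Flow exceeded on average once every T years (inverse Gumbel CDF)."""
        return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

    q20, q500 = return_level(20), return_level(500)   # study's range is 2-500 yr
    ```

    Maximum-likelihood fitting would refine the parameter estimates, but the inversion to return levels is identical.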

  1. Deriving global flood hazard maps of fluvial floods through a physical model cascade

    Directory of Open Access Journals (Sweden)

    F. Pappenberger

    2012-11-01

    Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large scale flood preparedness. Such global hazard maps can be generated using large scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteorological forcing data, and resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maxima flows to derive a number of flood return periods. The return periods are calculated initially for a 25 × 25 km grid, which is then reprojected onto a 1 × 1 km grid to derive maps of higher resolution and estimate flooded fractional area for the individual 25 × 25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably to a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.

  2. A Mathematical Model for the Industrial Hazardous Waste Location-Routing Problem

    Directory of Open Access Journals (Sweden)

    Omid Boyer

    2013-01-01

    Technological progress has caused industrial hazardous waste to increase worldwide. Management of hazardous waste is a significant issue due to the risk imposed on the environment and human life. This risk can be a result of the location of undesirable facilities and also of routing hazardous waste. In this paper a bi-objective mixed integer programming model for the location-routing of industrial hazardous waste is developed. The first objective is total cost minimization, including transportation cost, operation cost, initial investment cost, and cost savings from selling recycled waste. The second objective is minimization of transportation risk. The risk of population exposure within a bandwidth along the route is used to measure transportation risk. This model can help decision makers to locate treatment, recycling, and disposal centers simultaneously and also to route waste between these facilities considering risk and cost criteria. The results of the solved problem reveal a conflict between the two objectives. Hence, it is possible to decrease the cost value by marginally increasing the transportation risk value and vice versa. A weighted sum method is utilized to combine the two objective functions into one objective function. To solve the problem, GAMS software with the CPLEX solver is used. The model is applied in Markazi province in Iran.
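
    The weighted-sum step can be illustrated without a solver: normalize each objective, form w_cost·cost + w_risk·risk, and pick the candidate minimizing the scalarized score. The candidate plans and weights below are invented for illustration; the paper itself solves the full mixed integer program in GAMS with CPLEX.

    ```python
    # candidate routing plans with (total cost, transportation risk) pairs;
    # values are illustrative, not from the Markazi province case study
    plans = {
        "route_A": (120.0, 0.80),
        "route_B": (150.0, 0.40),
        "route_C": (200.0, 0.10),
    }

    def best_plan(w_cost, w_risk):
        """Minimize the weighted sum of min-max normalized objectives."""
        costs = [c for c, _ in plans.values()]
        risks = [r for _, r in plans.values()]

        def score(cr):
            c, r = cr
            cn = (c - min(costs)) / (max(costs) - min(costs))
            rn = (r - min(risks)) / (max(risks) - min(risks))
            return w_cost * cn + w_risk * rn

        return min(plans, key=lambda k: score(plans[k]))
    ```

    Sweeping the weights traces out the trade-off the abstract describes: extreme weights recover the single-objective optima, while balanced weights pick a compromise plan.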

  3. Geoinformational prognostic model of mudflows hazard and mudflows risk for the territory of Ukrainian Carpathians

    Science.gov (United States)

    Chepurna, Tetiana B.; Kuzmenko, Eduard D.; Chepurnyj, Igor V.

    2017-06-01

    The article addresses the geological issue of space-time regional prognostication of mudflow hazard. A methodology for the space-time prediction of mudflow hazard based on a GIS predictive model has been developed. Using GIS technologies, a relevant and representative complex of significant spatial and temporal factors, suited to regional prediction of mudflow hazard, was selected. Geological, geomorphological, technological, climatic, and landscape factors were selected as spatial mudflow factors. The spatial analysis is based on detecting a regular connection between the characteristics of the spatial factors and the spatial distribution of the mudflow sites. A function giving a standard complex spatial index (SCSI) of the probability of mudflow site distribution has been calculated. The temporal, long-term prediction of mudflow activity was based on the hypothesis of the regular reiteration of natural processes. Heliophysical, seismic, meteorological, and hydrogeological factors were selected as temporal mudflow factors, and a function giving a complex index of long-standing mudflow activity (CIMA) has been calculated. A prognostic geoinformational model of mudflow hazard up to the year 2020, the year of the next predicted peak of mudflow activity, has been created. Mudflow risks have been computed, and a cartogram of mudflow risk assessment within the limits of administrative-territorial units has been built for the year 2020.

  4. Critical load analysis in hazard assessment of metals using a Unit World Model.

    Science.gov (United States)

    Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L

    2011-09-01

    A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), so that the chemicals have a single ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate-transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of the cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (Kd) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations.
Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic

  5. Integrating fault and seismological data into a probabilistic seismic hazard model for Italy.

    Science.gov (United States)

    Valentini, Alessandro; Visini, Francesco; Pace, Bruno

    2017-04-01

    We present the results of a new probabilistic seismic hazard analysis (PSHA) for Italy based on active fault and seismological data. Combining seismic hazard from active faults with distributed seismic sources (where there are no data on active faults) is the backbone of this work. Far from identifying a single best procedure, currently adopted approaches combine active faults and background sources by applying a threshold magnitude, generally between 5.5 and 7, above which seismicity is modelled by faults and below which it is modelled by distributed or area sources. In our PSHA we (i) apply a new method for the treatment of geological data on major active faults and (ii) propose a new approach to combining these data with historical seismicity to evaluate the PSHA for Italy. Assuming that deformation is concentrated along faults, we combine the earthquake occurrences derived from the geometry and slip rates of the active faults with the earthquakes from spatially smoothed earthquake sources. In the vicinity of an active fault, the smoothed seismic activity is gradually reduced by a fault-size driven factor. Even if the range and gross spatial distribution of expected accelerations obtained in our work are comparable to those obtained through methods applying seismic catalogues and classical zonation models, the main difference is in the detailed spatial pattern of our PSHA model, which is characterized by spots of more hazardous area in correspondence with mapped active faults, while previous models give expected accelerations almost uniformly distributed over large regions. Finally, we investigate the impact on the hazard of the earthquake rates derived from two magnitude-frequency distribution (MFD) models for faults, and the contribution of faults versus distributed seismic activity.
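    A minimal sketch of the two ingredients described above: Gaussian smoothing of past epicenters and a reduction of the smoothed rate near a mapped fault. The kernel width, taper shape, and coordinates are our own illustrative assumptions, not the parameters used in the study:

```python
import math

def smoothed_rate(point, quakes, sigma=10.0):
    """2-D Gaussian kernel smoothing of past epicenters (coordinates in km)."""
    x, y = point
    return sum(
        math.exp(-((x - qx) ** 2 + (y - qy) ** 2) / (2 * sigma ** 2))
        / (2 * math.pi * sigma ** 2)
        for qx, qy in quakes
    )

def fault_taper(dist_km, fault_half_length_km):
    # Simple linear stand-in for the fault-size driven factor: the background
    # rate is fully suppressed on the fault trace and untouched beyond
    # roughly one fault half-length.
    return min(1.0, dist_km / fault_half_length_km)

quakes = [(0, 0), (5, 2), (20, 15)]  # invented epicenters
rate = smoothed_rate((2, 1), quakes) * fault_taper(dist_km=3.0, fault_half_length_km=12.0)
print(f"tapered smoothed rate: {rate:.5f}")
```

Near the fault the distributed-source rate is handed over to the fault source, which carries the large-magnitude occurrences derived from geometry and slip rate.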

  6. Geo-additive modelling of malaria in Burundi

    Directory of Open Access Journals (Sweden)

    Gebhardt Albrecht

    2011-08-01

    Full Text Available Abstract Background Malaria is a major public health issue in Burundi in terms of both morbidity and mortality, with around 2.5 million clinical cases and more than 15,000 deaths each year. It is still the single main cause of mortality in pregnant women and children below five years of age. Because of the severe health and economic burden of malaria, there is still a growing need for methods that help to understand the influencing factors. Several studies have addressed the subject, yielding different results as to which factors are most responsible for the increase in malaria transmission. This paper considers the modelling of the dependence of malaria cases on spatial determinants and climatic covariates including rainfall, temperature and humidity in Burundi. Methods The analysis carried out in this work exploits real monthly data collected in Burundi over 12 years (1996-2007). Semi-parametric regression models are used. The spatial analysis is based on a geo-additive model using provinces as the geographic units of study. The spatial effect is split into structured (correlated) and unstructured (uncorrelated) components. Inference is fully Bayesian and uses Markov chain Monte Carlo techniques. The effects of the continuous covariates are modelled by cubic P-splines with 20 equidistant knots and a second-order random walk penalty. For the spatially correlated effect, a Markov random field prior is chosen. The spatially uncorrelated effects are assumed to be i.i.d. Gaussian. The effects of climatic covariates and the effects of other spatial determinants are estimated simultaneously in a unified regression framework. Results The results obtained from the proposed model suggest that although malaria incidence in a given month is strongly positively associated with the minimum temperature of the previous months, regional patterns of malaria that are related to factors other than climatic variables have been identified

  7. Review and perspectives: Understanding natural-hazards-generated ionospheric perturbations using GPS measurements and coupled modeling

    Science.gov (United States)

    Komjathy, Attila; Yang, Yu-Ming; Meng, Xing; Verkhoglyadova, Olga; Mannucci, Anthony J.; Langley, Richard B.

    2016-07-01

    Natural hazards including earthquakes, volcanic eruptions, and tsunamis have been significant threats to humans throughout recorded history. Global navigation satellite system (GNSS) receivers, including those of the Global Positioning System (GPS), have become primary sensors for measuring signatures associated with natural hazards. These signatures typically include GPS-derived seismic deformation measurements, coseismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure, model, and monitor postseismic ionospheric disturbances caused by, e.g., earthquakes, volcanic eruptions, and tsunamis. In this paper, we review research progress at the Jet Propulsion Laboratory and elsewhere using examples of ground-based and spaceborne observation of natural hazards that generated TEC perturbations. We present results for state-of-the-art imaging using ground-based and spaceborne ionospheric measurements and coupled atmosphere-ionosphere modeling of ionospheric TEC perturbations. We also report advancements and chart future directions in modeling and inversion techniques to estimate tsunami wave heights and ground surface displacements using TEC measurements and error estimates. Our initial retrievals strongly suggest that both ground-based and spaceborne GPS remote sensing techniques could play a critical role in the detection and imaging of upper atmosphere signatures of natural hazards, including earthquakes and tsunamis. We found that combining ground-based and spaceborne measurements may be crucial for estimating critical geophysical parameters such as tsunami wave heights and ground surface displacements using TEC observations. The GNSS-based remote sensing of natural-hazard-induced ionospheric disturbances could be applied in operational tsunami and earthquake early warning systems.

  8. Techniques, advances, problems and issues in numerical modelling of landslide hazard

    CERN Document Server

    Van Asch, Theo; Van Beek, Ludovicus; Amitrano, David

    2007-01-01

    Slope movements (e.g. landslides) are dynamic systems that are complex in time and space and closely linked to both inherited and current preparatory and triggering controls. It is not yet possible to assess in all cases conditions for failure, reactivation and rapid surges and successfully simulate their transient and multi-dimensional behaviour and development, although considerable progress has been made in isolating many of the key variables and elementary mechanisms and to include them in physically-based models for landslide hazard assessments. Therefore, the objective of this paper is to review the state-of-the-art in the understanding of landslide processes and to identify some pressing challenges for the development of our modelling capabilities in the forthcoming years for hazard assessment. This paper focuses on the special nature of slope movements and the difficulties related to simulating their complex time-dependent behaviour in mathematical, physically-based models. It analyses successively th...

  9. [Critical of the additive model of the randomized controlled trial].

    Science.gov (United States)

    Boussageon, Rémy; Gueyffier, François; Bejan-Angoulvant, Theodora; Felden-Dominiak, Géraldine

    2008-01-01

    Randomized, double-blind, placebo-controlled clinical trials are currently the best way to demonstrate the clinical effectiveness of drugs. Their methodology relies on the method of difference (John Stuart Mill), through which the observed difference between two groups (drug vs placebo) can be attributed to the pharmacological effect of the drug being tested. However, this additive model can be questioned in the event of statistical interactions between the pharmacological and the placebo effects. Evidence in different domains has shown that the placebo effect can influence the effect of the active principle. This article evaluates the methodological, clinical and epistemological consequences of this phenomenon. Topics treated include extrapolating results, accounting for heterogeneous results, demonstrating the existence of several factors in the placebo effect, the necessity of taking these factors into account for given symptoms or pathologies, and the problem of the "specific" effect.

  10. Checking Fine and Gray subdistribution hazards model with cumulative sums of residuals

    DEFF Research Database (Denmark)

    Li, Jianing; Scheike, Thomas; Zhang, Mei Jie

    2015-01-01

    Recently, Fine and Gray (J Am Stat Assoc 94:496–509, 1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, failure of model adequacy could lead to severe bias in parameter estimation, and only a limited contribution has been made to check the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray’s model. The proposed goodness-of-fit test procedures are based on the cumulative sums

  11. Measurement, geospatial, and mechanistic models of public health hazard vulnerability and jurisdictional risk.

    Science.gov (United States)

    Testa, Marcia A; Pettigrew, Mary L; Savoia, Elena

    2014-01-01

    County and state health departments are increasingly conducting hazard vulnerability and jurisdictional risk (HVJR) assessments for public health emergency preparedness and mitigation planning and evaluation to improve the public health disaster response; however, integration and adoption of these assessments into practice are still relatively rare. While the quantitative methods associated with complex analytic and measurement methods, causal inference, and decision theory are common in public health research, they have not been widely used in public health preparedness and mitigation planning. To address this gap, the Harvard School of Public Health PERLC's goal was to develop measurement, geospatial, and mechanistic models to aid public health practitioners in understanding the complexity of HVJR assessment and to determine the feasibility of using these methods for dynamic and predictive HVJR analyses. We used systematic reviews, causal inference theory, structural equation modeling (SEM), and multivariate statistical methods to develop the conceptual and mechanistic HVJR models. Geospatial mapping was used to inform the hypothetical mechanistic model by visually examining the variability and patterns associated with county-level demographic, social, economic, hazards, and resource data. A simulation algorithm was developed for testing the feasibility of using SEM estimation. The conceptual model identified the predictive latent variables used in public health HVJR tools (hazard, vulnerability, and resilience), the outcomes (human, physical, and economic losses), and the corresponding measurement subcomponents. This model was translated into a hypothetical mechanistic model to explore and evaluate causal and measurement pathways. To test the feasibility of SEM estimation, the mechanistic model path diagram was translated into linear equations and solved simultaneously using simulated data representing 192 counties. Measurement, geospatial, and mechanistic

  12. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    Science.gov (United States)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks which are generally omitted from hazards assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates make them difficult to forecast even on time-scales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations and changes in rate can be detected by applying change point analysis in ETAS transformed time with methods already developed for Poisson processes.
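    The ETAS conditional intensity underlying the simulations described above can be sketched in its temporal form: a background rate plus Omori-law aftershock triggering. The parameter values and the two events below are illustrative, not fitted values for any of the induced-seismicity zones named in the abstract:

```python
import math

def etas_rate(t, events, mu=0.5, K=0.2, c=0.01, p=1.2, alpha=1.0, m0=3.0):
    """Temporal ETAS intensity lambda(t): background mu plus Omori-law
    triggering from past events; productivity grows exponentially with
    magnitude above the completeness magnitude m0."""
    rate = mu  # background (tectonic or induced) rate, events/day
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

events = [(10.0, 4.5), (10.2, 3.2)]      # (time in days, magnitude)
print(etas_rate(9.0, events))             # before the sequence: background only
print(etas_rate(10.3, events))            # just after: strongly elevated
```

Simulating catalogs from this intensity (thinning or cluster simulation) is what allows rate changes to be detected against the expected clustering.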

  13. Versatility of cooperative transcriptional activation: a thermodynamical modeling analysis for greater-than-additive and less-than-additive effects.

    Directory of Open Access Journals (Sweden)

    Till D Frank

    Full Text Available We derive a statistical model of transcriptional activation using equilibrium thermodynamics of chemical reactions. We examine to what extent this statistical model predicts synergy effects of cooperative activation of gene expression. We determine parameter domains in which greater-than-additive and less-than-additive effects are predicted for cooperative regulation by two activators. We show that the statistical approach can be used to identify different causes of synergistic greater-than-additive effects: nonlinearities of the thermostatistical transcriptional machinery and three-body interactions between RNA polymerase and two activators. In particular, our model-based analysis suggests that at low transcription factor concentrations cooperative activation cannot yield synergistic greater-than-additive effects, i.e., DNA transcription can only exhibit less-than-additive effects. Accordingly, transcriptional activity turns from synergistic greater-than-additive responses at relatively high transcription factor concentrations into less-than-additive responses at relatively low concentrations. In addition, two types of re-entrant phenomena are predicted. First, our analysis predicts that under particular circumstances transcriptional activity will feature a sequence of less-than-additive, greater-than-additive, and eventually less-than-additive effects when for fixed activator concentrations the regulatory impact of activators on the binding of RNA polymerase to the promoter increases from weak, to moderate, to strong. Second, for appropriate promoter conditions when activator concentrations are increased then the aforementioned re-entrant sequence of less-than-additive, greater-than-additive, and less-than-additive effects is predicted as well. Finally, our model-based analysis suggests that even for weak activators that individually induce only negligible increases in promoter activity, promoter activity can exhibit greater-than-additive
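    The kind of thermodynamic bookkeeping described above can be sketched with a Shea-Ackers-style partition function for RNA polymerase and two activators. The statistical weights below are arbitrary illustrative values (not from the paper), and the synergy test simply compares the joint increase in activity with the sum of the individual increases:

```python
def promoter_activity(p, a1, a2, w1, w2, w12=1.0):
    """Equilibrium probability that RNAP occupies the promoter.
    p, a1, a2: binding weights of RNAP and the two activators;
    w1, w2: activator-RNAP interaction factors; w12: three-body factor."""
    z_off = 1 + a1 + a2 + a1 * a2                                  # RNAP absent
    z_on = p * (1 + a1 * w1 + a2 * w2 + a1 * a2 * w1 * w2 * w12)   # RNAP bound
    return z_on / (z_off + z_on)

base = promoter_activity(0.1, 0, 0, 5, 5)
only1 = promoter_activity(0.1, 2, 0, 5, 5)
only2 = promoter_activity(0.1, 0, 2, 5, 5)
both = promoter_activity(0.1, 2, 2, 5, 5)

# Synergy in the additivity sense: joint increase vs sum of single increases.
delta_joint = both - base
delta_sum = (only1 - base) + (only2 - base)
print("greater-than-additive" if delta_joint > delta_sum else "less-than-additive")
```

Scanning such a model over activator concentrations and interaction strengths is what produces the parameter domains and re-entrant sequences discussed in the abstract.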

  14. Dynamic modelling of a forward osmosis-nanofiltration integrated process for treating hazardous wastewater.

    Science.gov (United States)

    Pal, Parimal; Das, Pallabi; Chakrabortty, Sankha; Thakura, Ritwik

    2016-11-01

    Dynamic modelling and simulation of a nanofiltration-forward osmosis integrated complete system was done along with economic evaluation to pave the way for scale up of such a system for treating hazardous pharmaceutical wastes. The system operated in a closed loop not only protects surface water from the onslaught of hazardous industrial wastewater but also saves on cost of fresh water by turning wastewater recyclable at affordable price. The success of dynamic modelling in capturing the relevant transport phenomena is well reflected in high overall correlation coefficient value (R (2) > 0.98), low relative error (forward osmosis loop at a reasonably high flux of 56-58 l per square meter per hour.

  15. Rockfall Hazard Analysis From Discrete Fracture Network Modelling with Finite Persistence Discontinuities

    Science.gov (United States)

    Lambert, Cédric; Thoeni, Klaus; Giacomini, Anna; Casagrande, Davide; Sloan, Scott

    2012-09-01

    Developing an accurate representation of the rock mass fabric is a key element in rock fall hazard analysis. The orientation, persistence and density of fractures control the volume and shape of unstable blocks or compartments. In this study, the discrete fracture modelling technique and digital photogrammetry were used to accurately depict the fabric. A volume distribution of unstable blocks was derived combining polyhedral modelling and kinematic analyses. For each block size, probabilities of failure and probabilities of propagation were calculated. A complete energy distribution was obtained by considering, for each block size, its occurrence in the rock mass, its probability of falling, its probability to reach a given location, and the resulting distribution of energies at each location. This distribution was then used with an energy-frequency diagram to assess the hazard.
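    The composition of probabilities described above (block occurrence in the rock mass, probability of failure, probability of propagation to a location) can be sketched as an expected annual frequency of energy exceedance; the block sizes, probabilities, and energies below are invented for illustration:

```python
# For each block class: (count in rock mass, P(failure per year),
# P(reaching the location), kinetic energy at the location in kJ).
blocks = [
    (200, 0.010, 0.6, 50.0),
    (50,  0.005, 0.4, 400.0),
    (5,   0.002, 0.3, 3000.0),
]

def exceedance_frequency(threshold_kj):
    """Expected annual number of blocks reaching the location with
    kinetic energy above the threshold."""
    return sum(n * pf * pr for n, pf, pr, e in blocks if e > threshold_kj)

for thr in (0.0, 100.0, 1000.0):
    print(f"E > {thr:6.0f} kJ: {exceedance_frequency(thr):.3f} events/yr")
```

Plotting this frequency against energy gives the energy-frequency diagram used for the hazard assessment.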

  16. Model and Method for Multiobjective Time-Dependent Hazardous Material Transportation

    Directory of Open Access Journals (Sweden)

    Zhen Zhou

    2014-01-01

    Full Text Available In most hazardous material transportation problems, risk factors are assumed to be constant, which ignores the fact that they can vary with time throughout the day. In this paper, we deal with a novel time-dependent hazardous material transportation problem via lane reservation, in which the dynamic nature of transportation risk in the real-life traffic environment is taken into account. We first develop a multiobjective mixed integer programming (MIP) model with two conflicting objectives: minimizing the impact on normal traffic resulting from lane reservation, and minimizing the total transportation risk. We then present a cut-and-solve based ε-constraint method to solve this model. Computational results indicate that our method outperforms the ε-constraint method based on the optimization software package CPLEX.

  17. A simulation study of finite-sample properties of marginal structural Cox proportional hazards models.

    Science.gov (United States)

    Westreich, Daniel; Cole, Stephen R; Schisterman, Enrique F; Platt, Robert W

    2012-08-30

    Motivated by a previously published study of HIV treatment, we simulated data subject to time-varying confounding affected by prior treatment to examine some finite-sample properties of marginal structural Cox proportional hazards models. We compared (a) unadjusted, (b) regression-adjusted, (c) unstabilized, and (d) stabilized marginal structural (inverse probability-of-treatment [IPT] weighted) model estimators of effect in terms of bias, standard error, root mean squared error (MSE), and 95% confidence limit coverage over a range of research scenarios, including relatively small sample sizes and 10 study assessments. In the base-case scenario resembling the motivating example, where the true hazard ratio was 0.5, both IPT-weighted analyses were unbiased, whereas crude and adjusted analyses showed substantial bias towards and across the null. Stabilized IPT-weighted analyses remained unbiased across a range of scenarios, including relatively small sample size; however, the standard error was generally smaller in crude and adjusted models. In many cases, unstabilized weighted analysis showed a substantial increase in standard error compared with other approaches. Root MSE was smallest in the IPT-weighted analyses for the base-case scenario. In situations where time-varying confounding affected by prior treatment was absent, IPT-weighted analyses were less precise and therefore had greater root MSE compared with adjusted analyses. The 95% confidence limit coverage was close to nominal for all stabilized IPT-weighted but poor in crude, adjusted, and unstabilized IPT-weighted analysis. Under realistic scenarios, marginal structural Cox proportional hazards models performed according to expectations based on large-sample theory and provided accurate estimates of the hazard ratio.
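    The stabilized IPT weight sw = P(A=a)/P(A=a|L) can be sketched at a single time point with one binary confounder; this point-treatment toy (with invented treatment probabilities) omits the time-varying structure of the simulation study above:

```python
import random
from collections import Counter

# Simulate a binary confounder L and a treatment A whose probability depends
# on L (probabilities are illustrative).
random.seed(1)
n = 10_000
data = []
for _ in range(n):
    L = random.random() < 0.4
    A = random.random() < (0.7 if L else 0.3)
    data.append((L, A))

p_a = Counter(a for _, a in data)        # marginal treatment counts
strata = Counter(data)                   # counts of (L, A) cells
n_l = Counter(l for l, _ in data)        # counts per confounder stratum

weights = []
for L, A in data:
    marginal = p_a[A] / n                # numerator: P(A=a)
    conditional = strata[(L, A)] / n_l[L]  # denominator: P(A=a | L)
    weights.append(marginal / conditional)

print(f"mean stabilized weight: {sum(weights) / n:.4f}")
```

With empirical estimates in both the numerator and the denominator, the weights average exactly 1, which is the usual diagnostic for correctly constructed stabilized weights.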

  18. The Impact of the Subduction Modeling Beneath Calabria on Seismic Hazard

    Science.gov (United States)

    Morasca, P.; Johnson, W. J.; Del Giudice, T.; Poggi, P.; Traverso, C.; Parker, E. J.

    2014-12-01

    The aim of this work is to better understand the influence of subduction beneath Calabria on seismic hazard, as very little is known about the present-day kinematics and the seismogenic potential of the slab interface in the Calabrian Arc region. This evaluation is significant because, depending on stress conditions, subduction zones can vary from fully coupled to almost entirely decoupled, with important consequences for seismic hazard assessment. Although the debate is still open about the current kinematics of the plates and microplates in the region and the degree of coupling of the Ionian lithosphere beneath Calabria, GPS data suggest that this subduction is locked along its interface sector. The lack of instrumentally recorded thrust earthquakes also suggests this zone is locked. The current seismotectonic model developed for the Italian national territory is simplified in this area and does not reflect the possibility of a locked subduction beneath Calabria that could produce infrequent but very large earthquakes on the subduction interface. We have therefore conducted an independent seismic source analysis to take the influence of subduction into account as part of a regional seismic hazard analysis. Our final model includes two separate provinces for the subduction beneath Calabria: inslab and interface. Geometrically, the interface province is modeled at depths between 20 and 50 km with a dip of 20°, while the inslab province dips 70° between 50 and 100 km. Following recent interpretations, we take into account that the interface subduction is possibly locked and that, in such a case, large events could occur as characteristic earthquakes. The results of the PSHA show that the subduction beneath Calabria influences the total hazard for this region, especially for long return periods. Regional seismotectonic models for this region should account for subduction.

  19. Interpretation of laser/multi-sensor data for short range terrain modeling and hazard detection

    Science.gov (United States)

    Messing, B. S.

    1980-01-01

    A terrain modeling algorithm is described that reconstructs the sensed ground images formed by the triangulation scheme and classifies as unsafe any terrain feature that would pose a hazard to a roving vehicle. This modeler greatly reduces the quantization errors inherent in a laser/sensing system through the use of a thinning algorithm. Dual filters are employed to separate terrain steps from the general landscape, simplifying the analysis of terrain features. A crosspath analysis is utilized to detect and avoid obstacles that would adversely affect the roll of the vehicle. Computer simulations of the rover on various terrains examine the performance of the modeler.

  20. Assessing Glacial Lake Outburst Flood Hazard in the Nepal Himalayas using Satellite Imagery and Hydraulic Models

    Science.gov (United States)

    Rounce, D.; McKinney, D. C.

    2015-12-01

    The last half century has witnessed considerable glacier melt that has led to the formation of large glacial lakes. These glacial lakes typically form behind terminal moraines comprising loose boulders, debris, and soil, which are susceptible to failure and can cause a glacial lake outburst flood (GLOF). These lakes also act as a heat sink that accelerates glacier melt, in many cases accompanied by rapid areal expansion. As these glacial lakes continue to grow, their hazard also increases due to the increase in potential flood volume and the lakes' proximity to triggering events such as avalanches and landslides. Despite the large threat these lakes may pose to downstream communities, there are few detailed studies that combine satellite imagery with hydraulic models to present a holistic understanding of the GLOF hazard. The aim of this work is to assess the GLOF hazard of glacial lakes in Nepal using a holistic approach based on a combination of satellite imagery and hydraulic models. Imja Lake will be the primary focus of the modeling efforts, but the methods will be developed in a manner that is transferable to other potentially dangerous glacial lakes in Nepal.

  1. [Clinical research XXII. From clinical judgment to Cox proportional hazards model].

    Science.gov (United States)

    Pérez-Rodríguez, Marcela; Rivas-Ruiz, Rodolfo; Palacios-Cruz, Lino; Talavera, Juan O

    2014-01-01

    Survival analyses are commonly used to determine the time to an event (for example, death). However, they can also be used for other clinical outcomes, on the condition that these are dichotomous, for example healing time. These analyses consider the relationship with only one variable. The Cox proportional hazards model, by contrast, is a multivariate extension of survival analysis in which other covariates that potentially confound the effect of the main maneuver studied, such as age, gender or disease stage, are taken into account. This analysis can include both quantitative and qualitative variables in the model. The measure of association used is called the hazard ratio (HR) or relative risk ratio, which is not the same as the relative risk or odds ratio (OR). The difference is that the HR refers to the possibility that one of the groups develops the event earlier than the other group. The Cox proportional hazards multivariate model is the most widely used in medicine when the phenomenon is studied in two dimensions: time and event.
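    For a single binary covariate, the hazard ratio can be estimated by maximizing the Cox partial likelihood directly. The sketch below simulates uncensored exponential survival times with a true HR of 2 and fits the one coefficient by Newton-Raphson; the sample size and rates are illustrative:

```python
import math
import random

# Group x=1 has twice the hazard of group x=0, so the true HR is 2.
random.seed(7)
n = 2000
data = sorted((random.expovariate(2.0 if i % 2 else 1.0), i % 2) for i in range(n))

def partial_loglik(b):
    """Cox partial log-likelihood, gradient and Hessian for one binary covariate."""
    s0 = s1 = s2 = 0.0            # risk-set sums of exp(bx), x*exp(bx), x^2*exp(bx)
    loglik = grad = hess = 0.0
    for t, x in reversed(data):   # latest event first, so sums cover the risk set
        e = math.exp(b * x)
        s0 += e; s1 += x * e; s2 += x * x * e
        loglik += b * x - math.log(s0)
        grad += x - s1 / s0
        hess -= s2 / s0 - (s1 / s0) ** 2
    return loglik, grad, hess

b = 0.0
for _ in range(20):               # Newton-Raphson on the log partial likelihood
    _, g, h = partial_loglik(b)
    b -= g / h
print(f"estimated HR: {math.exp(b):.2f} (true value 2.0)")
```

In practice the multivariate fit (several covariates, censoring, tied times) is done with a survival library rather than by hand; the point here is only what "HR" estimates.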

  2. A nonparametric dynamic additive regression model for longitudinal data

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas H.

    2000-01-01

    dynamic linear models, estimating equations, least squares, longitudinal data, nonparametric methods, partly conditional mean models, time-varying-coefficient models

  3. Benchmarking Computational Fluid Dynamics Models for Application to Lava Flow Simulations and Hazard Assessment

    Science.gov (United States)

    Dietterich, H. R.; Lev, E.; Chen, J.; Cashman, K. V.; Honor, C.

    2015-12-01

    Recent eruptions in Hawai'i, Iceland, and Cape Verde highlight the need for improved lava flow models for forecasting and hazard assessment. Existing models used for lava flow simulation range in assumptions, complexity, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess the capabilities of existing models and test the development of new codes, we conduct a benchmarking study of computational fluid dynamics models for lava flows, including VolcFlow, OpenFOAM, Flow3D, and COMSOL. Using new benchmark scenarios defined in Cordonnier et al. (2015) as a guide, we model Newtonian, Herschel-Bulkley and cooling flows over inclined planes, obstacles, and digital elevation models with a wide range of source conditions. Results are compared to analytical theory, analogue and molten basalt experiments, and measurements from natural lava flows. Our study highlights the strengths and weaknesses of each code, including accuracy and computational costs, and provides insights regarding code selection. We apply the best-fit codes to simulate the lava flows in Harrat Rahat, a predominantly mafic volcanic field in Saudi Arabia. Input parameters are assembled from rheology and volume measurements of past flows using geochemistry, crystallinity, and present-day lidar and photogrammetric digital elevation models. With these data, we use our verified models to reconstruct historic and prehistoric events, in order to assess the hazards posed by lava flows for Harrat Rahat.

  4. Fuzzy multi-objective chance-constrained programming model for hazardous materials transportation

    Science.gov (United States)

    Du, Jiaoman; Yu, Lean; Li, Xiang

    2016-04-01

    Hazardous materials transportation is an important and pressing issue of public safety. Based on the shortest path model, this paper presents a fuzzy multi-objective programming model that minimizes the transportation risk to life, travel time and fuel consumption. First, we present the risk model, travel time model and fuel consumption model. Furthermore, we formulate a chance-constrained programming model within the framework of credibility theory, in which the lengths of arcs in the transportation network are assumed to be fuzzy variables. A hybrid intelligent algorithm integrating fuzzy simulation and a genetic algorithm is designed for finding a satisfactory solution. Finally, some numerical examples are given to demonstrate the efficiency of the proposed model and algorithm.
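As a simplified, deterministic stand-in for the routing backbone of such a model (not the paper's fuzzy chance-constrained formulation or its hybrid genetic algorithm), a shortest path can be computed with Dijkstra's algorithm over a weighted-sum cost of risk, travel time and fuel. The graph, node names and trade-off weights below are hypothetical:

```python
import heapq

def safest_route(graph, start, goal, weights=(0.5, 0.3, 0.2)):
    """Dijkstra's algorithm on a single scalar cost obtained as a weighted
    sum of three per-arc objectives (risk, travel time, fuel).
    Assumes `goal` is reachable from `start`."""
    w_risk, w_time, w_fuel = weights
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    seen = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == goal:
            break
        for v, (risk, time, fuel) in graph.get(u, {}).items():
            nd = d + w_risk * risk + w_time * time + w_fuel * fuel
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[goal]

# Arcs carry (risk, time, fuel); the risky direct arc loses to the detour.
graph = {
    "depot": {"city": (9.0, 1.0, 1.0), "bypass": (1.0, 2.0, 1.5)},
    "city": {"plant": (1.0, 1.0, 1.0)},
    "bypass": {"plant": (1.0, 2.0, 1.5)},
}
path, cost = safest_route(graph, "depot", "plant")
print(path)  # the lower-risk detour via "bypass"
```

Replacing the crisp arc attributes with fuzzy variables and adding credibility constraints, as the paper does, changes the optimization machinery but not this underlying network structure.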

  5. Large scale debris-flow hazard assessment: a geotechnical approach and GIS modelling

    Directory of Open Access Journals (Sweden)

    G. Delmonaco

    2003-01-01

    A deterministic distributed model has been developed for large-scale debris-flow hazard analysis in the basin of the River Vezza (Tuscany Region, Italy). This area (51.6 km2) was affected by over 250 landslides, classified as debris/earth flows, mainly involving the metamorphic geological formations outcropping in the area and triggered by the pluviometric event of 19 June 1996. In recent decades landslide hazard and risk analysis has been favoured by the development of GIS techniques permitting the generalisation, synthesis and modelling of stability conditions at large-scale investigation (>1:10 000). In this work, the main results derived from the application of a geotechnical model coupled with a hydrological model for debris-flow hazard assessment are reported. The analysis was developed in the following steps: landslide inventory map derived from aerial photo interpretation and direct field survey, generation of a database and digital maps, elaboration of a DTM and derived themes (i.e. slope angle map), definition of a superficial soil thickness map, geotechnical soil characterisation through back-analysis on test slopes and laboratory tests, inference of the influence of precipitation, for distinct return periods, on ponding time and pore pressure generation, implementation of a slope stability model (infinite slope model), and generalisation of the safety factor for estimated rainfall events with different return periods. This approach has allowed the identification of potential source areas of debris-flow triggering for precipitation events with estimated return periods of 10, 50, 75 and 100 years. The model shows a dramatic decrease in safety conditions for the simulation related to a 75-year return period rainfall event, corresponding to an estimated cumulated daily intensity of 280-330 mm. This value can be considered the hydrological triggering
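The infinite slope model mentioned above has a standard closed-form factor of safety, FS = [c' + (gamma - m*gamma_w) z cos^2(beta) tan(phi')] / (gamma z sin(beta) cos(beta)), where m is the saturated fraction of the soil column. A minimal sketch under the usual assumptions (planar slip surface parallel to the ground); the function name, defaults and parameter values are illustrative, not from the Vezza basin study:

```python
import math

def infinite_slope_fs(c, phi_deg, beta_deg, z, gamma=19.0, gamma_w=9.81, m=0.0):
    """Factor of safety for the infinite slope model.

    c        -- effective cohesion (kPa)
    phi_deg  -- effective friction angle (degrees)
    beta_deg -- slope angle (degrees)
    z        -- vertical soil thickness (m)
    gamma    -- soil unit weight (kN/m^3)
    gamma_w  -- unit weight of water (kN/m^3)
    m        -- saturated fraction of the soil column (0 dry .. 1 saturated)
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Rising pore pressure (m -> 1) lowers the factor of safety:
print(round(infinite_slope_fs(c=5.0, phi_deg=30.0, beta_deg=35.0, z=1.5, m=0.0), 2))
print(round(infinite_slope_fs(c=5.0, phi_deg=30.0, beta_deg=35.0, z=1.5, m=1.0), 2))
```

Sweeping m from the pore pressures generated by rainfall of each return period, cell by cell over the DTM, reproduces the spirit of the generalised safety-factor maps described in the abstract.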

  6. Perspectives on open access high resolution digital elevation models to produce global flood hazard layers

    Science.gov (United States)

    Sampson, Christopher; Smith, Andrew; Bates, Paul; Neal, Jeffrey; Trigg, Mark

    2015-12-01

    Global flood hazard models have recently become a reality thanks to the release of open access global digital elevation models, the development of simplified and highly efficient flow algorithms, and the steady increase in computational power. In this commentary we argue that although the availability of open access global terrain data has been critical in enabling the development of such models, the relatively poor resolution and precision of these data now significantly limit our ability to estimate flood inundation and risk for the majority of the planet's surface. The difficulty of deriving an accurate 'bare-earth' terrain model due to the interaction of vegetation and urban structures with the satellite-based remote sensors means that global terrain data are often poorest in the areas where people, property (and thus vulnerability) are most concentrated. Furthermore, the current generation of open access global terrain models is over a decade old, and many large floodplains, particularly those in developing countries, have undergone significant change in this time. There is therefore a pressing need for a new generation of high resolution and high vertical precision open access global digital elevation models to allow significantly improved global flood hazard models to be developed.

  7. Progress in NTHMP Hazard Assessment

    Science.gov (United States)

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  8. Assessing rainfall triggered landslide hazards through physically based models under uncertainty

    Science.gov (United States)

    Balin, D.; Metzger, R.; Fallot, J. M.; Reynard, E.

    2009-04-01

    Hazard and risk assessment require, besides good data, good simulation capabilities to allow prediction of events and their consequences. The present study introduces a landslide hazard assessment strategy based on the coupling of physically based hydrological models with slope stability models that can cope with uncertainty in input data and model parameters. The hydrological model used is the Water balance Simulation Model, WASIM-ETH (Schulla et al., 1997), a fully distributed hydrological model that has previously been used successfully in alpine regions to simulate runoff, snowmelt, glacier melt and soil erosion, and the impact of climate change on these. The study region is the Vallon de Nant catchment (10 km2) in the Swiss Alps. A sensitivity analysis is conducted in order to choose the discretization threshold, derived from a laser DEM, for which the hydrological model yields the best compromise between performance and computation time. The hydrological model is further coupled with slope stability methods (that use the topographic index and the soil moisture derived from the hydrological model) to simulate the spatial distribution of the initiation areas of different geomorphic processes such as debris flows and rainfall-triggered landslides. To calibrate the WASIM-ETH model, a Markov chain Monte Carlo Bayesian approach is favoured (Balin, 2004; Schaefli et al., 2006). The model is used in single- and multi-objective frames to simulate discharge and soil moisture with uncertainty at representative locations. This information is further used to assess the potential initiation areas of rainfall-triggered landslides and to study the impact of uncertain input data, model parameters and simulated responses (discharge and soil moisture) on the modelling of geomorphological processes.

  9. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    Science.gov (United States)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2017-01-01

    Ground motions are affected by directivity effects in near-fault regions, which result in low-frequency pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to a significant increase in the risk of earthquake-induced damage to engineering structures. Ordinary probabilistic seismic hazard analysis (PSHA) does not take such effects into account; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop seismic hazard mapping of Tehran City according to a near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of simulated ground motions. The simulated database was used because there are no recorded near-fault data in the region from which to derive purely empirically based pulse prediction models. The results show that directivity effects can significantly affect the estimate of regional seismic hazard.

  10. Detailed Flood Modeling and Hazard Assessment from Storm Tides, Rainfall and Sea Level Rise

    Science.gov (United States)

    Orton, P. M.; Hall, T. M.; Georgas, N.; Conticello, F.; Cioffi, F.; Lall, U.; Vinogradov, S. V.; Blumberg, A. F.

    2014-12-01

    A flood hazard assessment has been conducted for the Hudson River from New York City to Troy at the head of tide, using a three-dimensional hydrodynamic model and merging hydrologic inputs and storm tides from tropical and extra-tropical cyclones, as well as spring freshet floods. Our recent work showed that neglecting freshwater flows leads to underestimation of peak water levels at up-river sites and neglecting stratification (typical with two-dimensional modeling) leads to underestimation all along the Hudson. The hazard assessment framework utilizes a representative climatology of over 1000 synthetic tropical cyclones (TCs) derived from a statistical-stochastic TC model, and historical extra-tropical cyclones and freshets from 1950-present. Hydrodynamic modeling is applied with seasonal variations in mean sea level and ocean and estuary stratification. The model is the Stevens ECOM model and is separately used for operational ocean forecasts on the NYHOPS domain (http://stevens.edu/NYHOPS). For the synthetic TCs, an Artificial Neural Network/ Bayesian multivariate approach is used for rainfall-driven freshwater inputs to the Hudson, translating the TC attributes (e.g. track, SST, wind speed) directly into tributary stream flows (see separate presentation by Cioffi for details). Rainfall intensity has been rising in recent decades in this region, and here we will also examine the sensitivity of Hudson flooding to future climate warming-driven increases in storm precipitation. The hazard assessment is being repeated for several values of sea level, as projected for future decades by the New York City Panel on Climate Change. Recent studies have given widely varying estimates of the present-day 100-year flood at New York City, from 2.0 m to 3.5 m, and special emphasis will be placed on quantifying our study's uncertainty.

  11. Percolation model with an additional source of disorder

    Science.gov (United States)

    Kundu, Sumanta; Manna, S. S.

    2016-06-01

    The ranges of transmission of the mobiles in a mobile ad hoc network are not uniform in reality. They are affected by temperature fluctuations in the air, obstruction by solid objects, and even humidity differences in the environment. How the varying transmission range of the individual active elements affects the global connectivity of the network is an important practical question. Here a model of percolation phenomena, with an additional source of disorder, is introduced for a theoretical understanding of this problem. As in ordinary percolation, sites of a square lattice are occupied randomly with probability p. Each occupied site is then assigned a circular disk of random radius R. A bond is defined to be occupied if and only if the radii R1 and R2 of the disks centered at its ends satisfy a certain predefined condition. In a very general formulation, one divides the R1-R2 plane into two regions by an arbitrary closed curve; a point within one region represents an occupied bond, otherwise the bond is vacant. The study of three different rules under this general formulation indicates that the percolation threshold always varies continuously. This threshold has two limiting values: one is pc(sq), the percolation threshold for ordinary site percolation on the square lattice, and the other is unity. The approach of the percolation threshold to its limiting values is characterized by two exponents. In a special case, all lattice sites are occupied by disks of random radii R ∈ [0, R0] and a percolation transition is observed with R0 as the control variable, similar to the site occupation probability.
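The bond rule described above is straightforward to simulate. The sketch below is our own illustration, not the authors' code: it occupies sites of an L x L lattice with probability p, draws disk radii uniformly from [0, r0], applies one possible rule (disks "touch" when r1 + r2 >= 1 lattice spacing), and checks left-to-right spanning with a union-find structure:

```python
import random

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

def spans(L, p, r0, rule, seed=0):
    """Occupy sites of an L x L lattice with probability p, assign each
    occupied site a disk radius uniform in [0, r0], and connect neighbouring
    occupied sites whose radii satisfy rule(r1, r2). Returns True if a
    cluster spans from the left edge to the right edge."""
    rng = random.Random(seed)
    radius = [rng.uniform(0, r0) if rng.random() < p else None
              for _ in range(L * L)]
    uf = UnionFind(L * L)
    for y in range(L):
        for x in range(L):
            i = y * L + x
            if radius[i] is None:
                continue
            for dx, dy in ((1, 0), (0, 1)):   # right and down neighbours
                nx, ny = x + dx, y + dy
                if nx < L and ny < L:
                    j = ny * L + nx
                    if radius[j] is not None and rule(radius[i], radius[j]):
                        uf.union(i, j)
    left = {uf.find(y * L) for y in range(L) if radius[y * L] is not None}
    right = {uf.find(y * L + L - 1) for y in range(L) if radius[y * L + L - 1] is not None}
    return bool(left & right)

# One possible rule: the two disks overlap when r1 + r2 >= 1 (lattice spacing).
touch = lambda r1, r2: r1 + r2 >= 1.0
print(spans(L=50, p=0.95, r0=2.0, rule=touch))  # dense sites, large disks
print(spans(L=50, p=0.3, r0=0.1, rule=touch))   # tiny disks: no bond can satisfy the rule
```

Sweeping p (or r0) and averaging the spanning probability over many seeds gives a numerical estimate of the shifted percolation threshold for a chosen rule.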

  12. Flood Hazard Mapping by Using Geographic Information System and Hydraulic Model: Mert River, Samsun, Turkey

    Directory of Open Access Journals (Sweden)

    Vahdettin Demir

    2016-01-01

    In this study, flood hazard maps were prepared for the Mert River Basin, Samsun, Turkey, by using GIS and the Hydrologic Engineering Center's River Analysis System (HEC-RAS). In this river basin, loss of human life and a significant amount of property damage were experienced in the 2012 flood. The preparation of the flood risk maps employed in the study includes the following steps: (1) digitization of topographical data and preparation of a digital elevation model using ArcGIS, (2) simulation of flood flows of different return periods using a hydraulic model (HEC-RAS), and (3) preparation of flood risk maps by integrating the results of (1) and (2).

  13. Hyperbolic value addition and general models of animal choice.

    Science.gov (United States)

    Mazur, J E

    2001-01-01

    Three mathematical models of choice--the contextual-choice model (R. Grace, 1994), delay-reduction theory (N. Squires & E. Fantino, 1971), and a new model called the hyperbolic value-added model--were compared in their ability to predict the results from a wide variety of experiments with animal subjects. When supplied with 2 or 3 free parameters, all 3 models made fairly accurate predictions for a large set of experiments that used concurrent-chain procedures. One advantage of the hyperbolic value-added model is that it is derived from a simpler model that makes accurate predictions for many experiments using discrete-trial adjusting-delay procedures. Some results favor the hyperbolic value-added model and delay-reduction theory over the contextual-choice model, but more data are needed from choice situations for which the models make distinctly different predictions.
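The simpler model from which the hyperbolic value-added model is derived is Mazur's hyperbolic discounting equation, V = A/(1 + KD), for a reward of amount A delayed by D. A small sketch (our own illustration with arbitrary amounts and delays, not the full concurrent-chains model) shows the preference reversal this functional form predicts:

```python
def hyperbolic_value(amount, delay, k=1.0):
    """Mazur's hyperbolic discounting: V = A / (1 + K * D)."""
    return amount / (1.0 + k * delay)

# Two options: a small reward after 2 s vs a larger reward after 10 s.
# Adding a common front-end delay t to both shifts preference ("reversal").
small_soon = lambda t: hyperbolic_value(1.0, 2.0 + t)
large_late = lambda t: hyperbolic_value(3.0, 10.0 + t)

print(small_soon(0) > large_late(0))    # True: near-immediate, the sooner reward wins
print(small_soon(20) > large_late(20))  # False: far in advance, the larger reward wins
```

Exponential discounting cannot produce this reversal, which is one reason hyperbolic forms fit animal choice data so well.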

  14. Mathematical Decision Models Applied for Qualifying and Planning Areas Considering Natural Hazards and Human Dealing

    Science.gov (United States)

    Anton, Jose M.; Grau, Juan B.; Tarquis, Ana M.; Sanchez, Elena; Andina, Diego

    2014-05-01

    The authors have used Mathematical Decision Models (MDM) to improve knowledge and planning of some large natural or administrative areas in which natural soils, climate, and agricultural and forest uses are the main factors, human resources and outcomes are important, and natural hazards are relevant. In one line of work they contributed to the qualification of lands of the Community of Madrid (CM), an administrative area in the centre of Spain containing a band of mountains in the north, part of the Iberian plateau and river terraces in the centre, and the Madrid metropolis, starting from an official UPM study for the CM that qualified lands using an FAO model requiring minimums across a whole set of Soil Science criteria. From these criteria the authors first set a complementary additive qualification, and later tried an intermediate qualification from both using fuzzy logic. The authors were also involved, together with colleagues from Argentina and elsewhere who are in contact with local planners, in the consideration of regions and the election of management entities for them. At these general levels they adopted multi-criteria MDM, using a weighted PROMETHEE and an ELECTRE-I with the same elicited weights for the criteria and data, alongside AHP using Expert Choice based on pairwise comparisons among similar criteria structured in two levels. The alternatives depend on the case study, and these areas with monsoon climates have natural hazards that are decisive for their election and qualification, with an initial matrix used for ELECTRE and PROMETHEE. For the natural area of Arroyos Menores, south of the town of Rio Cuarto, with the subarea of La Colacha in the north, the loess lands are rich but now suffer from water erosion forming regressive ditches that are spoiling them, and soil-use alternatives must consider Soil Conservation and Hydraulic Management actions. The soils may be used in diverse, mutually incompatible ways, such as autochthonous forest, high-value forest, traditional
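The contrast between an FAO-style qualification by minimum requirements and a complementary additive qualification can be sketched in a few lines. The criteria names, scores and weights below are hypothetical, and the fuzzy "intermediate" rule is reduced here to a simple convex combination of the two scores:

```python
def fao_minimum(scores):
    """FAO-style limiting-factor rule: the land class is set by the worst criterion."""
    return min(scores.values())

def additive(scores, weights):
    """Complementary additive qualification: weighted average of the criteria."""
    total = sum(weights.values())
    return sum(weights[c] * s for c, s in scores.items()) / total

def intermediate(scores, weights, alpha=0.5):
    """A simple compromise between the two rules (alpha in [0, 1]),
    standing in for the fuzzy-logic combination described in the abstract."""
    return alpha * fao_minimum(scores) + (1 - alpha) * additive(scores, weights)

# Hypothetical criteria scored on [0, 1] with hypothetical weights:
scores = {"soil_depth": 0.9, "slope": 0.8, "drainage": 0.3}
weights = {"soil_depth": 2.0, "slope": 1.0, "drainage": 1.0}
print(fao_minimum(scores))                 # one poor criterion dominates
print(round(additive(scores, weights), 3)) # the good criteria compensate
```

The example makes the methodological point concrete: under the minimum rule a single poor criterion (drainage) caps the qualification, while the additive rule lets strong criteria compensate, and the intermediate rule lands between the two.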

  15. Measurement Error in Proportional Hazards Models for Survival Data with Long-term Survivors

    Institute of Scientific and Technical Information of China (English)

    Xiao-bing ZHAO; Xian ZHOU

    2012-01-01

    This work studies a proportional hazards model for survival data with "long-term survivors", in which covariates are subject to linear measurement error. It is well known that the naïve estimators from both partial and full likelihood methods are inconsistent under this measurement error model. For measurement error models, methods of unbiased estimating function and corrected likelihood have been proposed in the literature. In this paper, we apply the corrected partial and full likelihood approaches to estimate the model and obtain statistical inference from survival data with long-term survivors. The asymptotic properties of the estimators are established. Simulation results illustrate that the proposed approaches provide useful tools for the models considered.

  16. An approach for modeling thermal destruction of hazardous wastes in circulating fluidized bed incinerator.

    Science.gov (United States)

    Patil, M P; Sonolikar, R L

    2008-10-01

    This paper presents a detailed computational fluid dynamics (CFD) based approach for modeling thermal destruction of hazardous wastes in a circulating fluidized bed (CFB) incinerator. The model is based on an Euler-Lagrangian approach in which the gas phase (continuous phase) is treated in an Eulerian reference frame, whereas the waste particulate (dispersed phase) is treated in a Lagrangian reference frame. The reaction chemistry has been modeled through a mixture fraction/PDF approach. The conservation equations for mass, momentum, energy, mixture fraction and other closure equations have been solved using the general purpose CFD code FLUENT 4.5. A finite volume method on a structured grid has been used for solution of the governing equations. The model provides detailed information on the hydrodynamics (gas velocity, particulate trajectories), gas composition (CO, CO2, O2) and temperature inside the riser. The model also allows different operating scenarios to be examined in an efficient manner.

  17. MODELING OF THE BUILDING LOCAL PROTECTION (SHELTER-IN-PLACE) INCLUDING SORPTION OF THE HAZARDOUS CONTAMINANT ON INDOOR SURFACES

    Directory of Open Access Journals (Sweden)

    N. N. Belyayev

    2014-05-01

    Purpose. Chemically hazardous objects, where toxic substances are used, manufactured and stored, as well as main transport lines along which hazardous materials are carried, are potential sources of accidental atmospheric pollution. The purpose is the development of a CFD model for evaluating the efficiency of local protection of a building against the ingress of hazardous substances by using an air curtain, accounting for sorption/desorption of the hazardous substance on indoor surfaces. Methodology. To solve the problem of the hydrodynamic interaction of the air curtain with the wind flow, and to consider the influence of the building on this process, a model of ideal fluid is used. To calculate the transfer of the hazardous substance in the atmosphere, an equation of convection-diffusion transport of impurities is applied. To calculate the process of indoor air pollution under infiltration of contaminated air, the Karisson & Huber model is used; this model takes into account the sorption of the hazardous substance on various indoor surfaces. For the numerical integration of the model equations, difference methods are used. Findings. In this paper we construct an efficient CFD model for evaluating the effectiveness of protecting buildings against the ingress of hazardous substances through the use of an air curtain. On the basis of this model, a computational experiment was carried out to assess the effectiveness of this protection method while varying the location of the air curtain relative to the building. Originality. A new model was developed to compute the effectiveness of an air curtain supply in reducing the toxic chemical concentration inside the building. Practical value. The developed model can be used for the design of local protection of buildings against the ingress of hazardous substances.

  18. Developing a simplified geographical information system approach to dilute lahar modelling for rapid hazard assessment

    Science.gov (United States)

    Darnell, A. R.; Phillips, J. C.; Barclay, J.; Herd, R. A.; Lovett, A. A.; Cole, P. D.

    2013-04-01

    In this study, we present a geographical information system (GIS)-based approach to enable the estimation of lahar features important to rapid hazard assessment (including flow routes, velocities and travel times). Our method represents a simplified first stage in extending the utility of widely used existing GIS-based inundation models, such as LAHARZ, to provide estimates of flow speeds. LAHARZ is used to determine the spatial distribution of a lahar of constant volume, and for a given cell in a GIS grid, a single-direction flow routing technique incorporating the effect of surface roughness directs the flow according to steepest descent. The speed of flow passing through a cell is determined from coupling the flow depth, change in elevation and roughness using Manning's formula, and in areas where there is little elevation difference, flow is routed towards the locally maximum increase in velocity. Application of this methodology to lahars on Montserrat, West Indies, yielded support for this GIS-based approach as a hazard assessment tool through tests on small-volume (5,000-125,000 m3) dilute lahars (consistent with application of Manning's law). Dominant flow paths were mapped, and for the first time in this study area, velocities (magnitudes and spatial distribution) and average travel times were estimated for a range of lahar volumes. Flow depth approximations were also made using (modified) LAHARZ, and these refined the input to Manning's formula. Flow depths were verified within an order of magnitude by field observations, and velocity predictions were broadly consistent with proxy measurements and published data. Forecasts from this coupled method can operate on short to mid-term timescales for hazard management. The methodology has potential to provide a rapid preliminary hazard assessment in similar systems where data acquisition may be difficult.
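The per-cell velocity step described above rests on Manning's formula, v = (1/n) R^(2/3) S^(1/2). A minimal sketch of that relation, with a wide-shallow-flow approximation (hydraulic radius R taken as the flow depth) and illustrative parameter values rather than ones from the Montserrat study:

```python
def manning_velocity(depth, slope, n):
    """Mean flow velocity from Manning's formula, v = (1/n) * R^(2/3) * S^(1/2).
    For a wide, shallow flow the hydraulic radius R is approximated by the
    flow depth (m); slope is the dimensionless energy gradient and n is the
    Manning roughness coefficient (s/m^(1/3))."""
    return (1.0 / n) * depth ** (2.0 / 3.0) * slope ** 0.5

# Illustrative values only (not calibrated to any lahar):
v = manning_velocity(depth=0.5, slope=0.1, n=0.05)
travel_time = 1000.0 / v  # seconds to traverse a 1 km reach at constant speed
print(round(v, 2), round(travel_time, 1))
```

Summing such per-cell travel times along the routed flow path is the essence of the travel-time estimates the method produces.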

  19. Earthquake catalogs for the 2017 Central and Eastern U.S. short-term seismic hazard model

    Science.gov (United States)

    Mueller, Charles S.

    2017-01-01

    The U.S. Geological Survey (USGS) makes long-term seismic hazard forecasts that are used in building codes. The hazard models usually consider only natural seismicity; non-tectonic (man-made) earthquakes are excluded because they are transitory or too small. In the past decade, however, thousands of earthquakes related to underground fluid injection have occurred in the central and eastern U.S. (CEUS), and some have caused damage. In response, the USGS is now also making short-term forecasts that account for the hazard from these induced earthquakes. Seismicity statistics are analyzed to develop recurrence models, accounting for catalog completeness. In the USGS hazard modeling methodology, earthquakes are counted on a map grid, recurrence models are applied to estimate the rates of future earthquakes in each grid cell, and these rates are combined with maximum-magnitude models and ground-motion models to compute the hazard. The USGS published a forecast for the years 2016 and 2017. Here, we document the development of the seismicity catalogs for the 2017 CEUS short-term hazard model. A uniform earthquake catalog is assembled by combining and winnowing pre-existing source catalogs. The initial, final, and supporting earthquake catalogs are made available here.
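The gridded recurrence step described here can be sketched with a Gutenberg-Richter extrapolation: count events above a completeness magnitude in each cell, convert the counts to annual rates, and scale to a target magnitude. The catalog, completeness magnitude, b-value and function name below are illustrative, not those of the 2017 CEUS model:

```python
from collections import Counter

def cell_rates(catalog, t_years, mc=3.0, b=1.0, m_target=5.0):
    """Count events with magnitude >= mc in each grid cell, convert the
    counts to annual rates, and extrapolate to a target magnitude with a
    Gutenberg-Richter relation N(M >= m) = N(M >= mc) * 10^(-b * (m - mc)).
    `catalog` is a list of (cell, magnitude) pairs."""
    counts = Counter(cell for cell, mag in catalog if mag >= mc)
    scale = 10.0 ** (-b * (m_target - mc))
    return {cell: (n / t_years) * scale for cell, n in counts.items()}

# A toy 10-year catalog: two cells, one event below the completeness cut.
catalog = [("A", 3.2), ("A", 4.1), ("A", 2.5), ("B", 3.0), ("B", 3.7)]
rates = cell_rates(catalog, t_years=10.0)
print(rates)
```

The real methodology adds declustering, magnitude-dependent completeness, smoothing of the gridded counts, and separate treatment of induced versus natural seismicity, but the count-then-extrapolate structure is the same.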

  20. Predictive Modeling of Chemical Hazard by Integrating Numerical Descriptors of Chemical Structures and Short-term Toxicity Assay Data

    Science.gov (United States)

    Rusyn, Ivan; Sedykh, Alexander; Guyton, Kathryn Z.; Tropsha, Alexander

    2012-01-01

    Quantitative structure-activity relationship (QSAR) models are widely used for in silico prediction of in vivo toxicity of drug candidates or environmental chemicals, adding value to candidate selection in drug development or in a search for less hazardous and more sustainable alternatives for chemicals in commerce. The development of traditional QSAR models is enabled by numerical descriptors representing the inherent chemical properties that can be easily defined for any number of molecules; however, traditional QSAR models often have limited predictive power due to the lack of data and complexity of in vivo endpoints. Although it has been indeed difficult to obtain experimentally derived toxicity data on a large number of chemicals in the past, the results of quantitative in vitro screening of thousands of environmental chemicals in hundreds of experimental systems are now available and continue to accumulate. In addition, publicly accessible toxicogenomics data collected on hundreds of chemicals provide another dimension of molecular information that is potentially useful for predictive toxicity modeling. These new characteristics of molecular bioactivity arising from short-term biological assays, i.e., in vitro screening and/or in vivo toxicogenomics data can now be exploited in combination with chemical structural information to generate hybrid QSAR-like quantitative models to predict human toxicity and carcinogenicity. Using several case studies, we illustrate the benefits of a hybrid modeling approach, namely improvements in the accuracy of models, enhanced interpretation of the most predictive features, and expanded applicability domain for wider chemical space coverage. PMID:22387746

  1. Web-based Services for Earth Observing and Model Data in National Applications and Hazards

    Science.gov (United States)

    Kafatos, M.; Boybeyi, Z.; Cervone, G.; di, L.; Sun, D.; Yang, C.; Yang, R.

    2005-12-01

    The ever-growing large volumes of Earth system science data, collected by Earth observing platforms, in situ stations and as model output data, are increasingly being used by discipline scientists and by wider classes of users. In particular, applications of Earth system science data to environmental and hazard areas, as well as other national applications, require tailored or specialized data, as well as web-based tools and infrastructure. The latter are driven by applications and usage drivers which include ease of access, visualization of complex data, ease of producing value-added data, GIS and open source analysis usage, metadata, etc. Here we present different aspects of such web-based services and access, and discuss several applications in the hazards and environmental areas, including earthquake signatures and observations and model runs of hurricanes. Examples and lessons learned from the Mid-Atlantic Geospatial Information Consortium will be presented. We discuss a NASA-funded, open source on-line data analysis system that is being applied to climate studies for the ESIP Federation. Since enhanced, this project and the next-generation Metadata Integrated Data Analysis System allow users not only to identify data but also to generate new data products on-the-fly. The functionalities extend from limited predefined functions to sophisticated functions described by general-purpose GrADS (Grid Analysis and Display System) commands. The Federation system also allows third-party data products to be combined with local data. Software components are available for converting the output from MIDAS (OPeNDAP) into OGC compatible software. The ongoing Grid efforts at CEOSR and LAITS in the School of Computational Sciences (SCS) include enhancing the functions of Globus to provide support for a geospatial system so the system can share computing power to handle problems with different peak access times and improve the stability and flexibility of a rapid

  2. TRENT2D WG: a smart web infrastructure for debris-flow modelling and hazard assessment

    Science.gov (United States)

    Zorzi, Nadia; Rosatti, Giorgio; Zugliani, Daniel; Rizzi, Alessandro; Piffer, Stefano

    2016-04-01

    Mountain regions are naturally exposed to geomorphic flows, which involve large amounts of sediment and induce significant morphological modifications. The physical complexity of this class of phenomena represents a challenging issue for modelling, leading to elaborate theoretical frameworks and sophisticated numerical techniques. In general, geomorphic-flow models have proved to be valid tools in hazard assessment and management. However, model complexity seems to represent one of the main obstacles to the diffusion of advanced modelling tools among practitioners and stakeholders, although the EU Flood Directive (2007/60/EC) requires risk management and assessment to be based on "best practices and best available technologies". Furthermore, several cutting-edge models are not particularly user-friendly, and multiple stand-alone software packages are needed to pre- and post-process modelling data. For all these reasons, users often resort to quicker and rougher approaches, possibly leading to unreliable results. Therefore, some effort seems necessary to overcome these drawbacks, with the purpose of supporting and encouraging a widespread diffusion of the most reliable, although sophisticated, modelling tools. With this aim, this work presents TRENT2D WG, a new smart modelling solution for the state-of-the-art model TRENT2D (Armanini et al., 2009; Rosatti and Begnudelli, 2013), which simulates debris flows and hyperconcentrated flows adopting a two-phase description over a mobile bed. TRENT2D WG is a web infrastructure joining the advantages offered by the software-delivery model SaaS (Software as a Service) and by WebGIS technology, and hosting a complete and user-friendly working environment for modelling. In order to develop TRENT2D WG, the model TRENT2D was converted into a service and exposed on a cloud server, transferring computational burdens from the user hardware to a high-performing server and reducing computational time. Then, the system was equipped with an

  3. Additive manufacturing for consumer-centric business models

    DEFF Research Database (Denmark)

    Bogers, Marcel; Hadar, Ronen; Bilberg, Arne

    2016-01-01

    and capturing value. In this paper, we explore the implications that AM technologies have for manufacturing systems in the new business models that they enable. In particular, we consider how a consumer goods manufacturer can organize the operations of a more open business model when moving from a manufacturer...

  4. Rapid SAR and GPS Measurements and Models for Hazard Science and Situational Awareness

    Science.gov (United States)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Moore, A. W.; Rosen, P. A.; Simons, M.; Webb, F.; Linick, J.; Fielding, E. J.; Lundgren, P.; Sacco, G. F.; Polet, J.; Manipon, G.

    2016-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating higher level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR), Differential Global Positioning System (DGPS), SAR-based change detection, and image pixel tracking have recently become critical additions to our toolset for understanding and mapping the damage caused by earthquakes, volcanic eruptions, landslides, and floods. Analyses of these data sets are still largely handcrafted following each event and are not generated rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by California Institute of Technology (Caltech) and by NASA through the Jet Propulsion Laboratory (JPL), has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition, the ARIA project is developing the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the imminent increase in raw data from geodetic imaging missions planned for launch by NASA, as well as international space agencies. We will present the progress we have made on automating the analysis of SAR data for hazard monitoring and response using data from Sentinel 1a/b as well as continuous GPS stations. Since the beginning of our project, our team has imaged events and generated response products for events around the world. These response products have enabled many conversations with those in the disaster response community.

  5. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 storm-hazard projections

    Science.gov (United States)

    Barnard, Patrick; Erikson, Li; O'Neill, Andrea; Foxgrover, Amy; Herdman, Liv

    2017-01-01

    The Coastal Storm Modeling System (CoSMoS) makes detailed predictions (meter-scale) over large geographic scales (100s of kilometers) of storm-induced coastal flooding and erosion for both current and future sea-level rise (SLR) scenarios, as well as long-term shoreline change and cliff retreat. Resulting projections for future climate scenarios (sea-level rise and storms) provide emergency responders and coastal planners with critical storm-hazards information that can be used to increase public safety, mitigate physical damage, and more effectively manage and allocate resources within complex coastal settings. Several versions of CoSMoS have been implemented for areas of the California coast, including Southern California, Central California, and San Francisco Bay, and further versions will be incorporated as additional regions and improvements are developed.

  6. Tsunami Hazard Preventing Based Land Use Planning Model Using GIS Techniques in Muang Krabi, Thailand

    Directory of Open Access Journals (Sweden)

    Abdul Salam Soomro

    2012-10-01

    Full Text Available The devastating tsunami of 26 December 2004 struck Krabi, one of the most scenic ecotourism provinces of southern Thailand, along with neighbouring regions such as Phang Nga and Phuket, destroying human lives, coastal communities and economic activities. This research study aimed to generate a tsunami-hazard-preventing land use planning model using GIS (Geographical Information Systems), based on a hazard suitability analysis approach. Several triggering factors, e.g. elevation, proximity to the shoreline, population density, mangrove, forest, stream and road, were used as land use zoning criteria. These criteria were weighted using the Saaty scale of importance, one of the mathematical techniques. The model was classified according to a land suitability classification. Various GIS techniques, namely subsetting, spatial analysis, map difference and data conversion, were used. The model was generated with five categories (high, moderate, low, very low and not suitable), each illustrated with an appropriate definition to help decision makers redevelop the region.
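The criterion weighting with the Saaty scale described above is, in essence, the analytic hierarchy process (AHP). A minimal sketch of deriving weights as the principal eigenvector of a pairwise comparison matrix via power iteration; the three-criterion matrix below is hypothetical, since the record does not give the actual judgments:

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Approximate AHP criterion weights as the principal eigenvector
    of a Saaty pairwise-comparison matrix, via power iteration."""
    a = np.asarray(pairwise, dtype=float)
    w = np.ones(a.shape[0]) / a.shape[0]
    for _ in range(iters):
        w = a @ w
        w /= w.sum()          # renormalise so weights sum to 1
    return w

# Hypothetical 3-criterion comparison (elevation vs. shore proximity vs.
# population density): elevation judged 3x as important as shore proximity
# and 5x as important as population density.
m = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(m)
```

The reciprocal structure of the matrix (a_ji = 1/a_ij) is what makes the principal eigenvector a consistent set of relative weights.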

  7. Hazard-consistent ground motions generated with a stochastic fault-rupture model

    Energy Technology Data Exchange (ETDEWEB)

    Nishida, Akemi, E-mail: nishida.akemi@jaea.go.jp [Center for Computational Science and e-Systems, Japan Atomic Energy Agency, 178-4-4, Wakashiba, Kashiwa, Chiba 277-0871 (Japan); Igarashi, Sayaka, E-mail: igrsyk00@pub.taisei.co.jp [Technology Center, Taisei Corporation, 344-1 Nase-cho, Totsuka-ku, Yokohama 245-0051 (Japan); Sakamoto, Shigehiro, E-mail: shigehiro.sakamoto@sakura.taisei.co.jp [Technology Center, Taisei Corporation, 344-1 Nase-cho, Totsuka-ku, Yokohama 245-0051 (Japan); Uchiyama, Yasuo, E-mail: yasuo.uchiyama@sakura.taisei.co.jp [Technology Center, Taisei Corporation, 344-1 Nase-cho, Totsuka-ku, Yokohama 245-0051 (Japan); Yamamoto, Yu, E-mail: ymmyu-00@pub.taisei.co.jp [Technology Center, Taisei Corporation, 344-1 Nase-cho, Totsuka-ku, Yokohama 245-0051 (Japan); Muramatsu, Ken, E-mail: kmuramat@tcu.ac.jp [Department of Nuclear Safety Engineering, Tokyo City University, 1-28-1 Tamazutsumi, Setagaya-ku, Tokyo 158-8557 (Japan); Takada, Tsuyoshi, E-mail: takada@load.arch.t.u-tokyo.ac.jp [Department of Architecture, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-12-15

    Conventional seismic probabilistic risk assessments (PRAs) of nuclear power plants consist of probabilistic seismic hazard and fragility curves. Even when earthquake ground-motion time histories are required, they are generated to fit specified response spectra, such as uniform hazard spectra at a specified exceedance probability. These ground motions, however, are not directly linked with seismic-source characteristics. In this context, the authors propose a method based on Monte Carlo simulations to generate a set of input ground-motion time histories to develop an advanced PRA scheme that can explain exceedance probability and the sequence of safety-functional loss in a nuclear power plant. These generated ground motions are consistent with seismic hazard at a reference site, and their seismic-source characteristics can be identified in detail. Ground-motion generation is conducted for a reference site, Oarai in Japan, the location of a hypothetical nuclear power plant. A total of 200 ground motions are generated, ranging from 700 to 1100 cm/s² peak acceleration, which corresponds to a 10⁻⁴ to 10⁻⁵ annual exceedance frequency. In the ground-motion generation, seismic sources are selected according to their hazard contribution at the site, and Monte Carlo simulations with stochastic parameters for the seismic-source characteristics are then conducted until ground motions with the target peak acceleration are obtained. These ground motions are selected so that they are consistent with the hazard. Approximately 110,000 simulations were required to generate 200 ground motions with these peak accelerations. Deviations of peak ground motion acceleration generated for the 1000–1100 cm/s² range from 1.5 to 3.0, where the deviation is evaluated with peak ground motion accelerations generated from the same seismic source. Deviations of 1.0 to 3.0 for stress drops, one of the stochastic parameters of seismic-source characteristics, are required to
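The simulate-until-target-peak-acceleration loop amounts to rejection sampling. A toy sketch of that control flow; the attenuation function, its constants, and the lognormal stress-drop distribution below are illustrative stand-ins, not the study's actual seismic-source model:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_pga(stress_drop, distance_km=20.0):
    """Toy attenuation relation: PGA (cm/s^2) grows with stress drop (MPa)
    and decays with distance. Functional form and constants are illustrative."""
    return 400.0 * (stress_drop / 10.0) ** 0.8 / (1.0 + distance_km / 30.0)

target = (700.0, 1100.0)          # target peak-acceleration band, cm/s^2
accepted, tries = [], 0
while len(accepted) < 50:         # keep simulating until enough motions hit the band
    tries += 1
    sd = rng.lognormal(mean=np.log(30.0), sigma=0.6)   # stochastic stress drop (MPa)
    pga = toy_pga(sd)
    if target[0] <= pga <= target[1]:
        accepted.append(pga)
```

As in the study, the ratio of tries to acceptances measures how selective the target band is for the assumed source variability.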

  8. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    Science.gov (United States)

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data were considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, and implant design and simulation, showing the potential of FDM technology in the medical field. It will also improve communication between medical staff and patients. The current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models.
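The STL file at the centre of this workflow is a simple triangle-mesh format. A stdlib-only sketch of writing the ASCII variant; the single facet is an illustrative triangle, not anatomical data, and the writer function is hypothetical:

```python
# A minimal sketch of the ASCII STL format that DICOM-to-STL converters emit.

def write_ascii_stl(path, triangles, name="model"):
    """Write triangles [(v1, v2, v3), ...] (each vertex an (x, y, z) tuple)
    as ASCII STL. Normals are left at (0, 0, 0); most slicers recompute them."""
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for v1, v2, v3 in triangles:
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for v in (v1, v2, v3):
                f.write(f"      vertex {v[0]} {v[1]} {v[2]}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

# One illustrative facet; a real skull mesh would hold millions of them.
write_ascii_stl("triangle.stl", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```

In practice the binary STL variant is preferred for large meshes, but the ASCII form shows the structure a printer's slicer consumes.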

  9. Modeling continuous self-report measures of perceived emotion using generalized additive mixed models.

    Science.gov (United States)

    McKeown, Gary J; Sneddon, Ian

    2014-03-01

    Emotion research has long been dominated by the "standard method" of displaying posed or acted static images of facial expressions of emotion. While this method has been useful, it is unable to investigate the dynamic nature of emotion expression. Although continuous self-report traces have enabled the measurement of dynamic expressions of emotion, a consensus has not been reached on the correct statistical techniques that permit inferences to be made with such measures. We propose generalized additive models and generalized additive mixed models as techniques that can account for the dynamic nature of such continuous measures. These models allow us to hold constant shared components of responses that are due to perceived emotion across time, while enabling inference concerning linear differences between groups. The generalized additive mixed model approach is preferred, as it can account for autocorrelation in time series data and allows emotion decoding participants to be modeled as random effects. To increase confidence in linear differences, we assess the methods that address interactions between categorical variables and dynamic changes over time. In addition, we provide comments on the use of generalized additive models to assess the effect size of shared perceived emotion and discuss sample sizes. Finally, we address additional uses, the inference of feature detection, continuous variable interactions, and measurement of ambiguity.
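The smooth terms of a generalized additive model are, at heart, penalized regression splines. A rough numpy-only stand-in (a cubic truncated-power basis with a ridge penalty on the knot coefficients, applied to a synthetic rating trace); real GAM software such as mgcv chooses the penalty and basis automatically, so this is a sketch of the idea only:

```python
import numpy as np

def penalized_spline(x, y, knots, lam=1.0):
    """Fit y ~ f(x) with a cubic truncated-power basis and a ridge penalty
    on the knot coefficients -- a crude stand-in for a GAM smooth term."""
    X = np.column_stack([x**d for d in range(4)] +
                        [np.clip(x - k, 0, None)**3 for k in knots])
    P = np.zeros(X.shape[1]); P[4:] = lam        # penalise only the knot terms
    beta = np.linalg.solve(X.T @ X + np.diag(P), X.T @ y)
    return X @ beta

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)                       # time axis of a rating trace
trace = np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size)   # noisy self-report
smooth = penalized_spline(t, trace, knots=np.linspace(0.1, 0.9, 8), lam=10.0)
```

Increasing `lam` shrinks the smooth towards the unpenalized cubic polynomial, which is the bias-variance trade-off a GAM tunes by criteria such as GCV or REML.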

  10. Additive gamma frailty models with applications to competing risks in related individuals

    DEFF Research Database (Denmark)

    Eriksson, Frank; Scheike, Thomas

    2015-01-01

    Epidemiological studies of related individuals are often complicated by the fact that follow-up on the event type of interest is incomplete due to the occurrence of other events. We suggest a class of frailty models with cause-specific hazards for correlated competing events in related individuals...

  11. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    Science.gov (United States)

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.
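A goodness-of-fit assessment of this kind can be as simple as a root-mean-square misfit between simulated and observed quantities at common times. The flow-front positions below are illustrative values, not the benchmark data, and the model names are placeholders:

```python
import numpy as np

# Hypothetical flow-front positions (m) at five common times: one "observed"
# analogue experiment vs. two simulated runs (all values illustrative).
observed = np.array([0.0, 0.8, 1.5, 2.1, 2.6])
simulated = {
    "model_A": np.array([0.0, 0.7, 1.4, 2.2, 2.8]),
    "model_B": np.array([0.0, 1.1, 2.0, 2.9, 3.5]),
}

def rmse(pred, obs):
    """Root-mean-square misfit used to rank codes against a benchmark."""
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

scores = {name: rmse(run, observed) for name, run in simulated.items()}
best = min(scores, key=scores.get)
```

Pairing each misfit score with the run's wall-clock time gives the accuracy-versus-cost comparison the benchmarking study describes.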

  12. Primary circuit iodine model addition to IMPAIR-3

    Energy Technology Data Exchange (ETDEWEB)

    Osetek, D.J.; Louie, D.L.Y. [Los Alamos Technical Associates, Inc., Albuquerque, NM (United States); Guntay, S.; Cripps, R. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1996-12-01

    As part of a continuing effort to provide the U.S. Department of Energy (DOE) Advanced Reactor Severe Accident Program (ARSAP) with complete iodine analysis capability, a task was undertaken to expand the modeling of IMPAIR-3, an iodine chemistry code. The expanded code will enable the DOE to include detailed iodine behavior in the assessment of severe accident source terms used in the licensing of U.S. Advanced Light Water Reactors (ALWRs). IMPAIR-3 was developed at the Paul Scherrer Institute (PSI), Switzerland, and has been used by ARSAP for the past two years to analyze containment iodine chemistry for ALWR source term analyses. IMPAIR-3 is primarily a containment code, but the iodine chemistry inside the primary circuit (the Reactor Coolant System, or RCS) may influence the iodine species released into the containment; therefore, an RCS iodine chemistry model must be implemented in IMPAIR-3 to ensure thorough source term analysis. The ARSAP source term team and the PSI IMPAIR-3 developers are working together to accomplish this task. This cooperation is divided into two phases. Phase I, taking place in 1996, involves developing a stand-alone RCS iodine chemistry program called IMPRCS (IMPAIR-Reactor Coolant System). This program models a number of the chemical and physical processes of iodine that are thought to be important at the high-temperature, high-pressure conditions of the RCS. In Phase II, which is tentatively scheduled for 1997, IMPRCS will be implemented as a subroutine in IMPAIR-3. To ensure an efficient calculation, an interface/tracking system will be developed to control the use of the RCS model from the containment model. The two models will be interfaced in such a way that once the iodine is released from the RCS, it will no longer be tracked by the RCS model but will instead be tracked by the containment model. All RCS thermal-hydraulic parameters will be provided by other codes. (author) figs., tabs., refs.

  13. Non-additive model for specific heat of electrons

    Science.gov (United States)

    Anselmo, D. H. A. L.; Vasconcelos, M. S.; Silva, R.; Mello, V. D.

    2016-10-01

    By using the non-additive Tsallis entropy we demonstrate numerically that one-dimensional quasicrystals, whose energy spectra are multifractal Cantor sets, are characterized by an entropic parameter, and we calculate the electronic specific heat considering a non-additive entropy Sq. In our method we consider an energy spectrum calculated using the one-dimensional tight-binding Schrödinger equation, whose bands (or levels) are scaled onto the [0, 1] interval. The Tsallis formalism is applied to the energy spectra of Fibonacci and double-period one-dimensional quasiperiodic lattices. We analytically obtain an expression for the specific heat that we consider more appropriate for calculating this quantity in these quasiperiodic structures.
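The non-additive entropy in question has a compact closed form, S_q = (1 - Σ p_i^q)/(q - 1), which recovers the Shannon entropy as q → 1. A minimal sketch (k_B = 1, uniform toy occupation probabilities; the spectra of the actual quasiperiodic lattices are not reproduced here):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Non-additive Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1),
    with k_B = 1; reduces to the Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)))     # Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = np.full(4, 0.25)                    # uniform toy level occupation
s_near_1 = tsallis_entropy(p, 1.0001)   # should approach ln(4)
s_half = tsallis_entropy(p, 0.5)        # sub-additive regime, q < 1
```

For the uniform distribution S_q = (1 - n^(1-q))/(q - 1), so the q-dependence can be checked analytically.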

  14. Richly parameterized linear models additive, time series, and spatial models using random effects

    CERN Document Server

    Hodges, James S

    2013-01-01

    A First Step toward a Unified Theory of Richly Parameterized Linear Models. Using mixed linear models to analyze data often leads to results that are mysterious, inconvenient, or wrong. Further compounding the problem, statisticians lack a cohesive resource to acquire a systematic, theory-based understanding of models with random effects. Richly Parameterized Linear Models: Additive, Time Series, and Spatial Models Using Random Effects takes a first step in developing a full theory of richly parameterized models, which would allow statisticians to better understand their analysis results. The aut

  15. Models of magma-aquifer interactions and their implications for hazard assessment

    Science.gov (United States)

    Strehlow, Karen; Gottsmann, Jo; Tumi Gudmundsson, Magnús

    2014-05-01

    Kverkfjöll and in October on White Island, New Zealand. The latter is only one example of these natural attractions that are visited by thousands of tourists every year. Additionally, these systems are increasingly used for energy generation. Phreatic explosions pose a serious risk to people and infrastructure nearby, and they are hard to predict. To improve risk assessment in hydrothermal areas, we assessed historical records and literature with regard to the frequency and mechanisms of hydrothermal explosions. Complemented by numerical models, this study addresses the question: what determines the change from safe to dangerous behaviour of the system, i.e. the change from silent degassing to explosions? Our project aims to widen our knowledge base on the complex coupling of magmatic and hydrological systems, which provides further insight into the subsurface processes at volcanic systems and will aid future risk assessment and eruption forecasting.

  16. Additional Research Needs to Support the GENII Biosphere Models

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen

    2013-11-30

    In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved both to accommodate the locally significant pathways identified and also to incorporate newer models. The areas are general data needs for the existing models and improved formulations for the pathway models. It is recommended that priorities be set by NRC staff to guide selection of the most useful improvements in a cost-effective manner. Suggestions are made based on relatively easy and inexpensive changes, and longer-term more costly studies. In the short term, there are several improved model formulations that could be applied to the GENII suite of codes to make them more generally useful:
    • implementation of the separation of the translocation and weathering processes;
    • implementation of an improved model for carbon-14 from non-atmospheric sources;
    • implementation of radon exposure pathway models;
    • development of a KML processor for the output report generator module, so that data calculated on a grid can be superimposed upon digital maps for easier presentation and display;
    • implementation of marine mammal models (manatees, seals, walrus, whales, etc.).
    Data needs in the longer term require extensive (and potentially expensive) research. Before picking any one radionuclide or food type, NRC staff should perform an in-house review of current and anticipated environmental analyses to select “dominant” radionuclides of interest to allow setting of cost-effective priorities for radionuclide- and pathway-specific research. These include:
    • soil-to-plant uptake studies for oranges and other citrus fruits, and
    • development of models for evaluation of radionuclide concentrations in highly-processed foods such as oils and sugars.
    Finally, renewed

  17. A "mental models" approach to the communication of subsurface hydrology and hazards

    Science.gov (United States)

    Gibson, Hazel; Stewart, Iain S.; Pahl, Sabine; Stokes, Alison

    2016-05-01

    Communicating information about geological and hydrological hazards relies on appropriately worded communications targeted at the needs of the audience. But what are these needs, and how does the geoscientist discern them? This paper adopts a psychological "mental models" approach to assess the public perception of the geological subsurface, presenting the results of attitudinal studies and surveys in three communities in the south-west of England. The findings reveal important preconceptions and misconceptions regarding the impact of hydrological systems and hazards on the geological subsurface, notably in terms of the persistent conceptualisation of underground rivers and the inferred relations between flooding and human activity. The study demonstrates how such mental models can provide geoscientists with empirical, detailed and generalised data on perceptions surrounding an issue, as well as reveal unexpected outliers in perception that they may not have considered relevant, but which nevertheless may locally influence communication. Using this approach, geoscientists can develop information messages that more directly engage local concerns and create open engagement pathways based on dialogue, which in turn allow both geoscience "experts" and local "non-experts" to come together and understand each other more effectively.

  18. Partitioning into hazard subregions for regional peaks-over-threshold modeling of heavy precipitation

    Science.gov (United States)

    Carreau, J.; Naveau, P.; Neppel, L.

    2017-05-01

    The French Mediterranean is subject to intense precipitation events occurring mostly in autumn. These can potentially cause flash floods, the main natural danger in the area. The distribution of these events follows specific spatial patterns, i.e., some sites are more likely to be affected than others. The peaks-over-threshold approach consists in modeling extremes, such as heavy precipitation, by the generalized Pareto (GP) distribution. The shape parameter of the GP controls the probability of extreme events and can be related to the hazard level of a given site. When interpolating across a region, the shape parameter should reproduce the observed spatial patterns of the probability of heavy precipitation. However, the shape parameter estimators have high uncertainty which might hide the underlying spatial variability. As a compromise, we choose to let the shape parameter vary in a moderate fashion. More precisely, we assume that the region of interest can be partitioned into subregions with constant hazard level. We formalize the model as a conditional mixture of GP distributions. We develop a two-step inference strategy based on probability weighted moments and put forward a cross-validation procedure to select the number of subregions. A synthetic data study reveals that the inference strategy is consistent and not very sensitive to the selected number of subregions. An application on daily precipitation data from the French Mediterranean shows that the conditional mixture of GPs outperforms two interpolation approaches (with constant or smoothly varying shape parameter).
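The probability-weighted-moment step of the inference strategy can be sketched for a single sample of threshold excesses, using Hosking and Wallis's closed-form GP estimators. The exponential test data correspond to a true shape of exactly zero, which gives a simple check:

```python
import numpy as np

def gpd_pwm(excesses):
    """Probability-weighted-moment estimates of the generalized Pareto
    shape (xi, Coles' sign convention) and scale, after Hosking & Wallis (1987)."""
    x = np.sort(np.asarray(excesses, dtype=float))
    n = x.size
    b0 = x.mean()
    b1 = np.mean(x * (n - np.arange(1, n + 1)) / (n - 1))  # plotting-position PWM
    k = b0 / (b0 - 2 * b1) - 2          # Hosking's k; xi = -k
    scale = 2 * b0 * b1 / (b0 - 2 * b1)
    return -k, scale

rng = np.random.default_rng(42)
# Exponential excesses are the GP distribution with shape 0 (here scale 5).
xi, sigma = gpd_pwm(rng.exponential(scale=5.0, size=5000))
```

In the regional model of the record, one such fit per hazard subregion would share a constant shape within the subregion.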

  19. Generalized Additive Models, Cubic Splines and Penalized Likelihood.

    Science.gov (United States)

    1987-05-22

    in case-control studies). All models in the table include a dummy variable to account for the matching. The first 3 lines of the table indicate that OA...Ausoc. Breslow, N. and Day, N. (1980). Statistical methods in cancer research, Volume 1: The analysis of case-control studies. International agency

  20. Multiple Imputation of Predictor Variables Using Generalized Additive Models

    NARCIS (Netherlands)

    de Jong, Roel; van Buuren, Stef; Spiess, Martin

    2016-01-01

    The sensitivity of multiple imputation methods to deviations from their distributional assumptions is investigated using simulations, where the parameters of scientific interest are the coefficients of a linear regression model, and values in predictor variables are missing at random. The performanc

  1. Estimating the Tradeoff Between Risk Protection and Moral Hazard with a Nonlinear Budget Set Model of Health Insurance*

    OpenAIRE

    Amanda E. Kowalski

    2015-01-01

    Insurance induces a well-known tradeoff between the welfare gains from risk protection and the welfare losses from moral hazard. Empirical work traditionally estimates each side of the tradeoff separately, potentially yielding mutually inconsistent results. I develop a nonlinear budget set model of health insurance that allows for the calculation of both sides of the tradeoff simultaneously, allowing for a relationship between moral hazard and risk protection. An important feature of this mod...

  2. Application of a Data Mining Model and Its Cross-Application for Landslide Hazard Analysis: a Case Study in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highlands, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. The landslide hazard indices were then calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for the three areas using the artificial neural network model trained not only on the data for each area itself but also using the parameter weights calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard maps and the existing data on landslide areas.
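The back-propagation weighting scheme can be illustrated with a deliberately tiny network. The factor values and landslide labels below are synthetic, and the architecture (one hidden layer of six units) is an assumption for illustration, not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the conditioning-factor rasters: 200 cells with four
# normalised factors (slope, curvature, drainage distance, rainfall).
X = rng.random((200, 4))
# Hypothetical ground truth: steep, wet cells are labelled as landslides.
y = (0.7 * X[:, 0] + 0.3 * X[:, 3] > 0.6).astype(float)[:, None]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(0.0, 0.5, (4, 6))     # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (6, 1))     # hidden -> output weights

losses = []
for _ in range(2000):                 # plain batch back-propagation
    h = sigmoid(X @ W1)
    p = sigmoid(h @ W2)
    err = p - y                       # d(cross-entropy)/d(output logit)
    g2 = h.T @ err / len(X)
    g1 = X.T @ ((err @ W2.T) * h * (1.0 - h)) / len(X)
    W2 -= 0.5 * g2
    W1 -= 0.5 * g1
    losses.append(float(np.mean((p - y) ** 2)))

hazard_index = p.ravel()              # per-cell susceptibility in [0, 1]
```

Cross-application, as in the record, would mean applying weights trained on one area's `X` to another area's factor rasters.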

  3. Quantification of Inter-Tsunami Model Variability for Hazard Assessment Studies

    Science.gov (United States)

    Catalan, P. A.; Alcantar, A.; Cortés, P. I.

    2014-12-01

    There is a wide range of numerical models capable of simulating tsunamis, most of which have been properly validated and verified against standard benchmark cases and particular field or laboratory case studies. Consequently, these models are regularly used as essential tools by scientists and consulting companies in estimating the tsunami hazard to coastal communities, with model results treated in a deterministic way. Most of these models are derived from the same set of equations, typically the Nonlinear Shallow Water Equations, to which ad hoc terms are added to include physical effects such as friction, the Coriolis force, and others. However, these models are seldom used in unison to address the variability in their results. Therefore, in this contribution, we perform a large number of simulations using a set of numerical models and quantify the variability in the results. In order to reduce the influence of input data on the results, a single tsunami scenario is used over a common bathymetry. Next, we perform model comparisons to assess sensitivity to changes in grid resolution and Manning roughness coefficients. Results are presented either as intra-model comparisons (sensitivity to changes using the same model) or inter-model comparisons (sensitivity to changing models). For the case tested, most models reproduced the arrival and periodicity of the tsunami waves fairly consistently. However, variations in amplitude, characterized by the standard deviation between model runs, could be as large as the mean signal. This level of variability is considered too large for deterministic assessment, reinforcing the idea that uncertainty needs to be included in such studies.
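Characterizing inter-model spread as the standard deviation between runs, relative to the mean signal, takes only a few lines. The inundation depths below are hypothetical values, not the study's data:

```python
import numpy as np

# Hypothetical maximum inundation depths (m) at five coastal points,
# one row per tsunami code run on the same scenario and bathymetry.
runs = np.array([
    [1.9, 3.2, 0.7, 2.5, 1.1],   # model 1
    [2.4, 2.8, 1.3, 2.2, 0.9],   # model 2
    [1.5, 3.9, 0.5, 3.1, 1.6],   # model 3
])

mean = runs.mean(axis=0)
spread = runs.std(axis=0, ddof=1)   # inter-model standard deviation
ratio = spread / mean               # variability relative to the mean signal
```

A `ratio` approaching 1 at some point would reproduce the paper's finding that the spread can be as large as the mean signal there.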

  4. Socio-economic vulnerability to natural hazards - proposal for an indicator-based model

    Science.gov (United States)

    Eidsvig, U.; McLean, A.; Vangelsten, B. V.; Kalsnes, B.; Ciurean, R. L.; Argyroudis, S.; Winter, M.; Corominas, J.; Mavrouli, O. C.; Fotopoulou, S.; Pitilakis, K.; Baills, A.; Malet, J. P.

    2012-04-01

    Vulnerability assessment, with respect to natural hazards, is a complex process that must consider multiple dimensions of vulnerability, including both physical and social factors. Physical vulnerability refers to conditions of physical assets, and may be modeled by the intensity and magnitude of the hazard, the degree of physical protection provided by the natural and built environment, and the physical robustness of the exposed elements. Social vulnerability refers to the underlying factors leading to the inability of people, organizations, and societies to withstand impacts from the natural hazards. Social vulnerability models can be used in combination with physical vulnerability models to estimate both direct losses, i.e. losses that occur during and immediately after the impact, as well as indirect losses, i.e. long-term effects of the event. Direct impact of a landslide typically includes casualties and damages to buildings and infrastructure while indirect losses may e.g. include business closures or limitations in public services. The direct losses are often assessed using physical vulnerability indicators (e.g. construction material, height of buildings), while indirect losses are mainly assessed using social indicators (e.g. economical resources, demographic conditions). Within the EC-FP7 SafeLand research project, an indicator-based method was proposed to assess relative socio-economic vulnerability to landslides. The indicators represent the underlying factors which influence a community's ability to prepare for, deal with, and recover from the damage associated with landslides. The proposed model includes indicators representing demographic, economic and social characteristics as well as indicators representing the degree of preparedness and recovery capacity. Although the model focuses primarily on the indirect losses, it could easily be extended to include more physical indicators which account for the direct losses. Each indicator is individually

  5. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more strongly than uniform rock does, owing to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles, which were used in the PSHA to model the near-surface site response through site-specific stochastic attenuation relationships.
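
The site-response modelling above rests on layered shear-wave velocity profiles. As a minimal illustration (not the report's actual procedure), the standard time-averaged shear-wave velocity over the top 30 m of a layered profile — the Vs30 site parameter — is total depth divided by vertical travel time; the layer values below are hypothetical:

```python
def time_averaged_vs(thicknesses_m, velocities_mps, depth_m=30.0):
    """Time-averaged shear-wave velocity over the top `depth_m` metres:
    depth divided by total vertical travel time. Layers are listed from
    the surface downward."""
    remaining = depth_m
    travel_time = 0.0
    for h, v in zip(thicknesses_m, velocities_mps):
        used = min(h, remaining)
        travel_time += used / v
        remaining -= used
        if remaining <= 0:
            break
    if remaining > 0:  # profile shallower than depth_m: extend last layer
        travel_time += remaining / velocities_mps[-1]
    return depth_m / travel_time

# Alternating "soft" sediment and stiff basalt layers (illustrative values only)
profile_h = [5.0, 10.0, 5.0, 20.0]          # layer thicknesses, m
profile_v = [300.0, 1500.0, 400.0, 1800.0]  # shear-wave velocities, m/s
print(round(time_averaged_vs(profile_h, profile_v), 1))
```

Note how the slow sediment layers dominate the travel time, pulling the average well below the basalt velocities — the same mechanism that motivates site-specific profiles in the PSHA.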

  6. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more strongly than uniform rock does, owing to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles, which were used in the PSHA to model the near-surface site response through site-specific stochastic attenuation relationships.

  7. The unconvincing product - Consumer versus expert hazard identification: A mental models study of novel foods

    DEFF Research Database (Denmark)

    Hagemann, Kit; Scholderer, Joachim

    Novel foods have been the object of intense public debate in recent years. Despite efforts to communicate the outcomes of risk assessments to consumers, public confidence in the management of the potential risks associated with them has been low. Various reasons behind this have been identified, chiefly a disagreement ... ... and experts' understanding of the benefits and risks associated with three novel foods (a potato, rice and functional food ingredients) using a relatively new methodology for the study of risk perception called mental models. Mental models focus on the way people conceptualise hazardous processes and allow ... ... offered by lifelong habits. Consumers found it utterly unconvincing that, all of a sudden, they should regard their everyday foods as toxic, and therefore it might not be possible to effectively communicate the health benefits of some novel foods to consumers. Several misconceptions became apparent ...

  8. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models.

    Science.gov (United States)

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; Gutiérrez-Gutiérrez, O. Q.; Larreynaga, J.; González, M.; Castro, M.; Gavidia, F.; Aguirre-Ayerbe, I.; González-Riancho, P.; Carreño, E.

    2013-11-01

    El Salvador is the smallest and most densely populated country in Central America; its coast is approximately 320 km long, spanning 29 municipalities with more than 700 000 inhabitants. Fifteen tsunamis were recorded in El Salvador between 1859 and 2012, three of them causing damage and hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand, the characterization of the threat over the entire coast of El Salvador, and on the other, the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near shore we estimated the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite-difference/finite-volume numerical model, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. Our results show that run-up values higher than 5 m are common on the western Salvadorian coast, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas most exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results
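
The abstract estimates run-up from nearshore elevations via empirical relations, without stating which ones. A standard linear-theory ingredient of such estimates is Green's law, which scales wave amplitude with the quarter power of the depth ratio; the sketch below is purely an illustration, not the study's actual relation:

```python
def greens_law_amplitude(a_offshore_m, h_offshore_m, h_nearshore_m):
    """Green's law (linear shallow-water energy-flux conservation):
    A2 = A1 * (h1 / h2) ** 0.25, so amplitude grows as depth decreases."""
    return a_offshore_m * (h_offshore_m / h_nearshore_m) ** 0.25

# A 1 m tsunami amplitude in 1000 m depth, shoaled to 10 m depth:
print(round(greens_law_amplitude(1.0, 1000.0, 10.0), 2))  # (100) ** 0.25 ≈ 3.16
```

Run-up formulas then typically map this nearshore amplitude to an on-land elevation using empirical coefficients fitted per coastline.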

  9. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    Directory of Open Access Journals (Sweden)

    J. A. Álvarez-Gómez

    2013-05-01

    Full Text Available El Salvador is the smallest and most densely populated country in Central America; its coast is approximately 320 km long, spanning 29 municipalities with more than 700 000 inhabitants. There have been 15 recorded tsunamis in El Salvador between 1859 and 2012, three of them causing damage and hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand, the characterization of the threat over the entire coast of El Salvador, and on the other, the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near shore we estimated the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite-difference/finite-volume numerical model, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. On the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas most exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results obtained

  10. Issues in testing the new national seismic hazard model for Italy

    Science.gov (United States)

    Stein, S.; Peresan, A.; Kossobokov, V. G.; Brooks, E. M.; Spencer, B. D.

    2016-12-01

    It is important to bear in mind that we know little about how well earthquake hazard maps describe the shaking that will actually occur in the future, and that we have no agreed way of assessing how well a map performed in the past and, thus, whether one map performs better than another. Moreover, we should not forget that different maps can be useful for different end users, who may have different cost-and-benefit strategies. Thus, regardless of the specific tests we choose to use, the adopted testing approach should have several key features. We should assess map performance using all the available instrumental, paleoseismological, and historical intensity data. Instrumental data alone span a period much too short to capture the largest earthquakes - and thus the strongest shaking - expected from most faults. We should investigate what causes systematic misfit, if any, between the longest record we have - historical intensity data available for the Italian territory from 217 B.C. to 2002 A.D. - and a given hazard map. We should compare how seismic hazard maps have developed over time: how do the most recent maps for Italy compare to earlier ones? It is important to understand local divergences, which show how the models have evolved toward the most recent one. The temporal succession of maps matters: we have to learn from previous errors. We should use the many different tests that have been proposed. All are worth trying, because different metrics of performance show different aspects of how a hazard map performs and can be used. We should compare other maps to the ones we are testing. Maps can be made using a wide variety of assumptions, which will lead to different predicted shaking; it is possible that maps derived by other approaches may perform better. Although current Italian codes are based on probabilistic maps, it is important from both a scientific and a societal perspective to look at all options, including deterministic scenario-based ones. Comparing what works

  11. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    Science.gov (United States)

    Kirschbaum, Dalia B.; Adler, Robert; Hong, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship, and the results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited non-landslide event data for more comprehensive evaluation. Additional factors that may improve
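
The regional triggering test above relies on a rainfall intensity-duration relationship of the classic power-law form I = a * D**(-b). The study's fitted regional coefficients are not given in the abstract; the sketch below uses Caine's (1980) global values purely as a stand-in:

```python
def exceeds_id_threshold(intensity_mmhr, duration_hr, a=14.82, b=0.39):
    """Rainfall intensity-duration landslide-triggering threshold,
    I = a * D**(-b) with I in mm/h and D in hours. Default coefficients
    are Caine's (1980) global threshold, used here only for illustration;
    a regional study fits its own (a, b)."""
    return intensity_mmhr >= a * duration_hr ** (-b)

# 20 mm/h sustained for 1 h exceeds the global threshold of ~14.8 mm/h:
print(exceeds_id_threshold(20.0, 1.0))
```

Longer durations lower the intensity needed to trigger failures, which is why antecedent rainfall windows of several durations (as in the study) carry independent information.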

  13. Development of models to inform a national Daily Landslide Hazard Assessment for Great Britain

    Science.gov (United States)

    Dijkstra, Tom A.; Reeves, Helen J.; Dashwood, Claire; Pennington, Catherine; Freeborough, Katy; Mackay, Jonathan D.; Uhlemann, Sebastian S.; Chambers, Jonathan E.; Wilkinson, Paul B.

    2015-04-01

    were combined with records of observed landslide events to establish which antecedent effective precipitation (AEP) signatures of different duration could be used as a pragmatic proxy for the occurrence of landslides. It was established that 1, 7, and 90 days AEP provided the most significant correlations and these were used to calculate the probability of at least one landslide occurring. The method was then extended over the period 2006 to 2014 and the results evaluated against observed occurrences. It is recognised that AEP is a relatively poor proxy for simulating effective stress conditions along potential slip surfaces. However, the temporal pattern of landslide probability compares well to the observed occurrences and provides a potential benefit to assist with the DLHA. Further work is continuing to fine-tune the model for landslide type, better spatial resolution of effective precipitation input and cross-reference to models that capture changes in water balance and conditions along slip surfaces. The latter is facilitated by intensive research at several field laboratories, such as the Hollin Hill site in Yorkshire, England. At this site, a decade of activity has generated a broad range of research and a wealth of data. This paper reports on one example of recent work; the characterisation of near surface hydrology using infiltration experiments where hydrological pathways are captured, among others, by electrical resistivity tomography. This research, which has further developed our understanding of soil moisture movement in a heterogeneous landslide complex, has highlighted the importance of establishing detailed ground models to enable determination of landslide potential at high resolution. In turn, the knowledge gained through this research is used to enhance the expertise for the daily landslide hazard assessments at a national scale.

  14. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    Science.gov (United States)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability, resulting in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
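
Once minimum, average and maximum risk curves are available, the annual risk figure is obtained by integrating loss against annual exceedance probability. A minimal sketch of that final step, with hypothetical numbers:

```python
def average_annual_loss(exceedance_prob, loss):
    """Average annual loss as the area under a loss-exceedance (risk) curve,
    integrated with the trapezoidal rule over annual exceedance probability."""
    pts = sorted(zip(exceedance_prob, loss))  # ascending probability
    area = 0.0
    for (p0, l0), (p1, l1) in zip(pts, pts[1:]):
        area += 0.5 * (l0 + l1) * (p1 - p0)
    return area

# Hypothetical "average" risk curve: frequent small losses, rare large ones
p = [0.5, 0.1, 0.01, 0.001]        # annual exceedance probability
loss_avg = [0.1, 1.0, 10.0, 30.0]  # loss, million EUR (illustrative)
print(round(average_annual_loss(p, loss_avg), 3))
```

Running the same integration on the minimum and maximum curves brackets the annual risk, mirroring how the study propagates its uncertainty ranges.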

  15. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Science.gov (United States)

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne disease for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbiological hazards (the pathogens previously found in Nham), sodium nitrite and metal were identified as the chemical and physical hazards in this product, respectively. Four steps in the Nham process were identified as critical control points: the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm, high enough to control Clostridium botulinum but not so high as to pose a chemical hazard to the consumer. The physical hazard from metal clips can be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham is reduced in the fermentation process, for which the critical limit of pH was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.
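
The critical limits identified above (nitrite 100-200 ppm at weighing, pH below 4.6 after fermentation, no metal fragments at stuffing) translate directly into a monitoring check; a toy sketch:

```python
def check_nham_ccps(nitrite_ppm, final_ph, metal_fragment_detected):
    """Check a batch against the critical limits of the HACCP generic model:
    nitrite 100-200 ppm, final pH < 4.6, and no metal fragments.
    Returns a list of violated limits (empty list = all CCPs within limits)."""
    violations = []
    if not 100 <= nitrite_ppm <= 200:
        violations.append("nitrite outside 100-200 ppm")
    if final_ph >= 4.6:
        violations.append("pH not below 4.6")
    if metal_fragment_detected:
        violations.append("metal fragment present")
    return violations

print(check_nham_ccps(150, 4.4, False))  # compliant batch -> []
```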

  16. Trimming a hazard logic tree with a new model-order-reduction technique

    Science.gov (United States)

    Porter, Keith; Field, Ned; Milner, Kevin R

    2017-01-01

    The size of the logic tree within the Uniform California Earthquake Rupture Forecast Version 3, Time-Dependent (UCERF3-TD) model can challenge risk analyses of large portfolios. An insurer or catastrophe risk modeler concerned with losses to a California portfolio might have to evaluate a portfolio 57,600 times to estimate risk in light of the hazard possibility space. Which branches of the logic tree matter most, and which can one ignore? We employed two model-order-reduction techniques to simplify the model. We sought a subset of parameters that must vary, and the specific fixed values for the remaining parameters, to produce approximately the same loss distribution as the original model. The techniques are (1) a tornado-diagram approach we employed previously for UCERF2, and (2) an apparently novel probabilistic sensitivity approach that seems better suited to functions of nominal random variables. The new approach produces a reduced-order model with only 60 of the original 57,600 leaves. One can use the results to reduce computational effort in loss analyses by orders of magnitude.
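
The tornado-diagram technique mentioned above ranks logic-tree branch choices by the swing they induce in the output when each is varied alone with the rest held at baseline. A generic sketch (the toy loss model and parameter ranges are hypothetical, not UCERF3's):

```python
def tornado_swings(model, baseline, ranges):
    """Rank parameters by output swing when each is varied alone over its
    candidate values, all others fixed at baseline (tornado diagram)."""
    swings = {}
    for name, values in ranges.items():
        outputs = [model(dict(baseline, **{name: v})) for v in values]
        swings[name] = max(outputs) - min(outputs)
    return sorted(swings.items(), key=lambda kv: kv[1], reverse=True)

# Toy portfolio-loss model over three hypothetical branch parameters
loss = lambda p: p["a"] * 10 + p["b"] * 2 + p["c"]
base = {"a": 1.0, "b": 1.0, "c": 1.0}
rng = {"a": [0.5, 1.5], "b": [0.0, 2.0], "c": [0.0, 5.0]}
print(tornado_swings(loss, base, rng))  # 'a' has the largest swing
```

Parameters with negligible swing are the candidates to fix at a single value, which is how a reduced-order model with far fewer leaves is obtained.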

  17. Weather modeling for hazard and consequence assessment operations during the 2006 Winter Olympic Games

    Science.gov (United States)

    Hayes, P.; Trigg, J. L.; Stauffer, D.; Hunter, G.; McQueen, J.

    2006-05-01

    Consequence assessment (CA) operations are those processes that attempt to mitigate the negative impacts of incidents involving hazardous materials such as chemical, biological, radiological, nuclear, and high-explosive (CBRNE) agents, facilities, weapons, or transportation. Incident types range from the accidental spillage of chemicals at, or en route to or from, a manufacturing plant, to the deliberate use of radiological or chemical material as a weapon in a crowded city. The impacts of these incidents are highly variable, from little or no impact to catastrophic loss of life and property. Local and regional scale atmospheric conditions strongly influence atmospheric transport and dispersion processes in the boundary layer, and the extent and scope of the spread of dangerous materials in the lower levels of the atmosphere. Therefore, CA personnel charged with managing the consequences of CBRNE incidents must have detailed knowledge of current and future weather conditions to accurately model potential effects. A meteorology team was established at the U.S. Defense Threat Reduction Agency (DTRA) to provide weather support to CA personnel operating DTRA's CA tools, such as the Hazard Prediction and Assessment Capability (HPAC) tool. The meteorology team performs three main functions: 1) regular provision of meteorological data for use by personnel using HPAC, 2) determination of the best-performing medium-range model forecast for the 12-48 hour timeframe, and 3) provision of real-time help-desk support to users regarding acquisition and use of weather in HPAC CA applications. The normal meteorology team operations were expanded during a recent modeling project which took place during the 2006 Winter Olympic Games. The meteorology team took advantage of special weather observation datasets available in the domain of the Winter Olympic venues and undertook a project to improve weather modeling at high resolution. The varied and complex terrain provided a special challenge to the

  18. Three-dimensional displays for natural hazards analysis, using classified Landsat Thematic Mapper digital data and large-scale digital elevation models

    Science.gov (United States)

    Butler, David R.; Walsh, Stephen J.; Brown, Daniel G.

    1991-01-01

    Methods are described for using Landsat Thematic Mapper digital data and digital elevation models for the display of natural hazard sites in a mountainous region of northwestern Montana, USA. Hazard zones can be easily identified on the three-dimensional images. Proximity of facilities such as highways and building locations to hazard sites can also be easily displayed. A temporal sequence of Landsat TM (or similar) satellite data sets could also be used to display landscape changes associated with dynamic natural hazard processes.

  19. River Loire levees hazard studies – CARDigues’ model principles and utilization examples on Blois levees

    Directory of Open Access Journals (Sweden)

    Durand Eduard

    2016-01-01

    Full Text Available Along the river Loire, in order to have a homogeneous method for specific risk assessment studies, a new model named CARDigues (for Levee Breach Hazard Calculation) was developed in a partnership between DREAL Centre-Val de Loire (owner of the levees), Cerema and Irstea. This model makes it possible to estimate the probability of failure of every levee section by integrating and crossing different "stability" parameters such as topography and included structures, geology and material geotechnical characteristics, hydraulic loads… as well as observations from visual inspections or instrumentation results considered as disorders (seepage, burrowing animals, vegetation, pipes, etc.). For each levee section, the CARDigues tool computes the probability of initiation and rupture for five breaching scenarios: overflowing, internal erosion, slope instability, external erosion and uplift. It has recently been updated and has been applied to several levee systems by different contractors. The article presents the CARDigues model principles and its recent developments (version V28.00), with examples on the river Loire, and shows how it is currently used for a relevant and global levee system diagnosis and assessment. Levee reinforcement and improvement management are also prospective applications of the CARDigues model.
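
CARDigues evaluates five breaching scenarios per levee section. One simple way to aggregate per-mechanism probabilities into a section-level figure — assuming independent mechanisms, which is our assumption for illustration and not necessarily the model's — is the complement of the product of the survival probabilities:

```python
def section_breach_probability(scenario_probs):
    """Combine per-mechanism breach probabilities (e.g. overflowing,
    internal erosion, slope instability, external erosion, uplift) into
    one section-level probability, assuming independent mechanisms:
    P = 1 - prod(1 - p_i)."""
    survival = 1.0
    for p in scenario_probs:
        survival *= (1.0 - p)
    return 1.0 - survival

# Hypothetical annual probabilities for the five scenarios of one section
print(round(section_breach_probability([0.01, 0.02, 0.005, 0.001, 0.002]), 4))
```

For small probabilities the result is close to the simple sum, but the product form never exceeds 1, which matters when individual scenario probabilities are large.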

  20. Age-period-cohort models using smoothing splines: a generalized additive model approach.

    Science.gov (United States)

    Jiang, Bei; Carriere, Keumhee C

    2014-02-20

    Age-period-cohort (APC) models are used to analyze temporal trends in disease or mortality rates, dealing with linear dependency among associated effects of age, period, and cohort. However, the nature of sparseness in such data has severely limited the use of APC models. To deal with these practical limitations and issues, we advocate cubic smoothing splines. We show that the methods of estimable functions proposed in the framework of generalized linear models can still be considered to solve the non-identifiability problem when the model fitting is within the framework of generalized additive models with cubic smoothing splines. Through simulation studies, we evaluate the performance of the cubic smoothing splines in terms of the mean squared errors of estimable functions. Our results support the use of cubic smoothing splines for APC modeling with sparse but unaggregated data from a Lexis diagram.
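
The linear dependency that drives the APC identification problem (cohort = period − age) can be seen directly in a toy design matrix, whose rank falls one short of its column count:

```python
import numpy as np

# Age-period-cohort non-identifiability in miniature: with cohort = period - age,
# the three linear effects are exactly collinear, so the naive design matrix
# with intercept, age, period and cohort columns loses one rank.
ages = np.arange(5)
periods = np.arange(5)
A, P = np.meshgrid(ages, periods, indexing="ij")
C = P - A  # cohort index on the Lexis diagram
X = np.column_stack([np.ones(A.size), A.ravel(), P.ravel(), C.ravel()])
print(np.linalg.matrix_rank(X))  # 3, not 4: A, P, C are linearly dependent
```

This is why only estimable functions (contrasts unaffected by the dependency) are reported, whether the smooth effects are fitted by GLM-style parametrizations or by the cubic smoothing splines advocated in the paper.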

  1. Risk assessment framework of fate and transport models applied to hazardous waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, S.T.

    1993-06-01

    Risk assessment is an increasingly important part of the decision-making process in the cleanup of hazardous waste sites. Despite guidelines from regulatory agencies and considerable research efforts to reduce uncertainties in risk assessments, there are still many issues unanswered. This paper presents new research results pertaining to fate and transport models, which will be useful in estimating exposure concentrations and will help reduce uncertainties in risk assessment. These developments include approaches for estimating (1) the degree of emissions and concentration levels of volatile pollutants during the use of contaminated water, (2) absorption through the skin of organic chemicals in the soil matrix, and (3) steady-state, near-field contaminant concentrations in the aquifer within a waste boundary.

  2. Modelling the impacts of coastal hazards on land-use development

    Science.gov (United States)

    Ramirez, J.; Vafeidis, A. T.

    2009-04-01

    Approximately 10% of the world's population live in close proximity to the coast and are potentially susceptible to tropical or extra-tropical storm-surge events. These events will be exacerbated by projected sea-level rise (SLR) in the 21st century. Accelerated SLR is one of the more certain impacts of global warming and can have major effects on humans and ecosystems. Particularly vulnerable are densely populated coastal urban centres containing globally important commercial resources, with assets in the billions of USD. Moreover, the rates of growth of coastal populations, which are reported to be growing faster than the global mean, are leading to increased human exposure to coastal hazards. Consequently, the potential impacts of coastal hazards can be significant in the future and will depend on various factors, but actual impacts can be considerably reduced by appropriate human decisions on coastal land-use management. At the regional scale, it is therefore necessary to identify which coastal areas are vulnerable to these events and to explore potential long-term responses reflected in land usage. Land-use change modelling is a technique which has been used extensively in recent years for studying the processes and mechanisms that govern the evolution of land use, and which can potentially provide valuable information related to the future coastal development of regions that are vulnerable to physical forcings. Although studies have utilized land-use classification maps to determine the impact of sea-level rise, few use land-use projections to make these assessments, and none have considered the adaptive behaviour of coastal dwellers exposed to hazards. In this study a land-use change model based on artificial neural networks (ANN) was employed for predicting coastal urban and agricultural development. The model uses as inputs a series of spatial layers, which include information on population distribution, transportation networks, existing urban centres, and

  3. Developing Sustainable Modeling Software and Necessary Data Repository for Volcanic Hazard Analysis -- Some Lessons Learnt

    Science.gov (United States)

    Patra, A. K.; Connor, C.; Webley, P.; Jones, M.; Charbonnier, S. J.; Connor, L.; Gallo, S.; Bursik, M. I.; Valentine, G.; Hughes, C. G.; Aghakhani, H.; Renschler, C. S.; Kosar, T.

    2014-12-01

    We report here on an effort to improve the sustainability, robustness and usability of the core modeling and simulation tools housed in the collaboratory VHub.org and used in the study of complex volcanic behavior. In particular, we focus on tools that support large-scale mass flows (TITAN2D), ash deposition/transport and dispersal (Tephra2 and PUFF), and lava flows (Lava2). These tools have become very popular in the community, especially due to the availability of an online usage modality. The redevelopment of the tools to take advantage of new hardware and software advances was a primary thrust of the effort. However, as we started work we reoriented the effort to also take advantage of significant new opportunities for supporting the complex workflows and use of distributed data resources that will enable effective and efficient hazard analysis.

  4. Novel Harmonic Regularization Approach for Variable Selection in Cox’s Proportional Hazards Model

    Directory of Open Access Journals (Sweden)

    Ge-Jin Chu

    2014-01-01

    Full Text Available Variable selection is an important issue in regression, and a number of variable selection methods involving nonconvex penalty functions have been proposed. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) penalties, for variable selection in Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, including the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso-type methods.

  5. Doubly stochastic models for volcanic hazard assessment at Campi Flegrei caldera

    CERN Document Server

    Bevilacqua, Andrea

    2016-01-01

    This study provides innovative mathematical models for assessing the eruption probability and associated volcanic hazards, and applies them to the Campi Flegrei caldera in Italy. Throughout the book, significant attention is devoted to quantifying the sources of uncertainty affecting the forecast estimates. The Campi Flegrei caldera is certainly one of the world’s highest-risk volcanoes, with more than 70 eruptions over the last 15,000 years, predominantly explosive ones of varying magnitude, intensity and vent location. In the second half of the twentieth century the volcano apparently once again entered a phase of unrest that continues to the present. Hundreds of thousands of people live inside the caldera and over a million more in the nearby city of Naples, making a future eruption of Campi Flegrei an event with potentially catastrophic consequences at the national and European levels.

  6. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    Science.gov (United States)

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two

  7. Transferability of regional permafrost disturbance susceptibility modelling using generalized linear and generalized additive models

    Science.gov (United States)

    Rudy, Ashley C. A.; Lamoureux, Scott F.; Treitz, Paul; van Ewijk, Karin Y.

    2016-07-01

    To effectively assess and mitigate risk of permafrost disturbance, disturbance-prone areas can be predicted through the application of susceptibility models. In this study we developed regional susceptibility models for permafrost disturbances using a field disturbance inventory to test the transferability of the model to a broader region in the Canadian High Arctic. Resulting maps of susceptibility were then used to explore the effect of terrain variables on the occurrence of disturbances within this region. To account for a large range of landscape characteristics, the model was calibrated using two locations: Sabine Peninsula, Melville Island, NU, and Fosheim Peninsula, Ellesmere Island, NU. Spatial patterns of disturbance were predicted with a generalized linear model (GLM) and generalized additive model (GAM), each calibrated using disturbed and randomized undisturbed locations from both sites and GIS-derived terrain predictor variables including slope, potential incoming solar radiation, wetness index, topographic position index, elevation, and distance to water. Each model was validated for the Sabine and Fosheim Peninsulas using independent data sets while the transferability of the model to an independent site was assessed at Cape Bounty, Melville Island, NU. The regional GLM and GAM validated well for both calibration sites (Sabine and Fosheim) with the area under the receiver operating curve (AUROC) > 0.79. Both models were applied directly to Cape Bounty without calibration and validated equally with AUROCs of 0.76; however, each model predicted disturbed and undisturbed samples differently. Additionally, the sensitivity of the transferred model was assessed using data sets with different sample sizes. Results indicated that models based on larger sample sizes transferred more consistently and captured the variability within the terrain attributes in the respective study areas.
Terrain attributes associated with the initiation of disturbances were
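
    The AUROC statistic used above to validate the GLM and GAM can be computed directly from model scores via its rank (Mann-Whitney) interpretation. A minimal sketch in Python, with made-up susceptibility scores rather than the study's data:

```python
def auroc(scores_pos, scores_neg):
    """Area under the ROC curve via its rank (Mann-Whitney) form:
    the probability that a randomly chosen disturbed (positive) site
    scores higher than a randomly chosen undisturbed (negative) one."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5  # ties count half
    return wins / (len(scores_pos) * len(scores_neg))

# Made-up susceptibility scores at disturbed vs. undisturbed sites:
disturbed = [0.9, 0.8, 0.55]
undisturbed = [0.6, 0.3, 0.2, 0.1]
print(round(auroc(disturbed, undisturbed), 3))  # → 0.917
```

    An AUROC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which is why the reported values above 0.76 indicate useful transferability.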

  8. Hazardous Waste

    Science.gov (United States)

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  9. Combining an additive and tree-based regression model simultaneously: STIMA

    NARCIS (Netherlands)

    Dusseldorp, E.; Conversano, C.; Os, B.J. van

    2010-01-01

    Additive models and tree-based regression models are two main classes of statistical models used to predict the scores on a continuous response variable. It is known that additive models become very complex in the presence of higher order interaction effects, whereas some tree-based models, such as

  10. A fast, calibrated model for pyroclastic density currents kinematics and hazard

    Science.gov (United States)

    Esposti Ongaro, Tomaso; Orsucci, Simone; Cornolti, Fulvio

    2016-11-01

    Multiphase flow models represent valuable tools for the study of the complex, non-equilibrium dynamics of pyroclastic density currents. Particle sedimentation, flow stratification and rheological changes, depending on the flow regime, interaction with topographic obstacles, turbulent air entrainment, buoyancy reversal, and other complex features of pyroclastic currents can be simulated in two and three dimensions, by exploiting efficient numerical solvers and the improved computational capability of modern supercomputers. However, numerical simulations of polydisperse gas-particle mixtures are quite computationally expensive, so that their use in hazard assessment studies (where there is the need to evaluate the probability of hazardous scenarios over hundreds of possible cases) is still challenging. To this aim, a simplified integral (box) model can be used, under the appropriate hypotheses, to describe the kinematics of pyroclastic density currents over a flat topography, their scaling properties and their depositional features. In this work, multiphase flow simulations are used to evaluate integral model approximations, to calibrate its free parameters and to assess the influence of the input data on the results. Two-dimensional numerical simulations describe the generation and decoupling of a dense, basal layer (formed by progressive particle sedimentation) from the dilute transport system. In the Boussinesq regime (i.e., for solid mass fractions below about 0.1), the current Froude number (i.e., the ratio between the current inertia and buoyancy) does not strongly depend on initial conditions and is consistent with that measured in laboratory experiments (i.e., between 1.05 and 1.2). For higher density ratios (solid mass fraction in the range 0.1-0.9) but still in a relatively dilute regime (particle volume fraction lower than 0.01), numerical simulations demonstrate that the box model is still applicable, but the Froude number depends on the reduced
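
    As a sketch of the box-model kinematics described above, consider a fixed-volume current (volume per unit width V, reduced gravity g') whose front advances at u = Fr * sqrt(g' h) with depth h = V / l. The closed-form front position follows by direct integration; all parameter values below are illustrative, not calibrated against the simulations:

```python
import math

def box_model_front(t, l0=10.0, V=100.0, g_red=1.0, Fr=1.1):
    """Front position l(t) of a fixed-volume (per unit width) current
    whose front advances at u = Fr * sqrt(g_red * h), with h = V / l.
    Integrating dl/dt = Fr * sqrt(g_red * V / l) gives this closed form."""
    return (l0**1.5 + 1.5 * Fr * math.sqrt(g_red * V) * t) ** (2.0 / 3.0)

# Cross-check the closed form against explicit Euler integration:
dt, l = 1e-4, 10.0
for _ in range(int(50 / dt)):
    l += dt * 1.1 * math.sqrt(1.0 * 100.0 / l)
print(abs(l - box_model_front(50.0)) < 0.01)  # → True
```

    The t^(2/3) spreading law is the classic box-model scaling for constant-volume currents, which is what makes the calibrated Froude number the single key free parameter.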

  11. A comparative analysis of hazard models for predicting debris flows in Madison County, VA

    Science.gov (United States)

    Morrissey, Meghan M.; Wieczorek, Gerald F.; Morgan, Benjamin A.

    2001-01-01

    During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a sixteen-hour period, initiating floods and over 600 debris flows in a small area (130 km2) of Madison County, Virginia. Field studies showed that the majority (70%) of these debris flows initiated with a thickness of 0.5 to 3.0 m in colluvium on slopes from 17° to 41° (Wieczorek et al., 2000). This paper evaluated and compared the approaches of SINMAP, LISA, and Iverson's (2000) transient response model for slope stability analysis by applying each model to the landslide data from Madison County. Of these three stability models, only Iverson's transient response model evaluated stability conditions as a function of time and depth. Iverson's model would be the preferred method of the three models to evaluate landslide hazards on a regional scale in areas prone to rain-induced landslides as it considers both the transient and spatial response of pore pressure in its calculation of slope stability. The stability calculation used in SINMAP and LISA is similar and utilizes probability distribution functions for certain parameters. Unlike SINMAP, which only considers soil cohesion, internal friction angle and rainfall-rate distributions, LISA allows the use of distributed data for all parameters, so it is the preferred model to evaluate slope stability over SINMAP. Results from all three models suggested similar soil and hydrologic properties for triggering the landslides that occurred during the 1995 storm in Madison County, Virginia. The colluvium probably had cohesion of less than 2 kPa. The root-soil system is above the failure plane and consequently root strength and tree surcharge had negligible effect on slope stability. The result that the final location of the water table was near the ground surface is supported by the water budget analysis of the rainstorm conducted by Smith et al. (1996).
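
    The infinite-slope stability calculation shared by these models reduces to a factor of safety comparing resisting and driving stresses on the failure plane. A hedged sketch, with parameter values chosen for illustration (loosely echoing the low-cohesion colluvium above, not the study's calibrated inputs):

```python
import math

def factor_of_safety(c, z, h_w, slope_deg, phi_deg,
                     gamma=18000.0, gamma_w=9810.0):
    """Infinite-slope factor of safety (FS < 1 implies failure).
    c: cohesion [Pa]; z: failure-plane depth [m]; h_w: water-table
    height above the failure plane [m]; gamma, gamma_w: unit weights
    of soil and water [N/m^3]. All values here are illustrative."""
    theta, phi = math.radians(slope_deg), math.radians(phi_deg)
    resisting = c + (gamma * z - gamma_w * h_w) * math.cos(theta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(theta) * math.cos(theta)
    return resisting / driving

# Low-cohesion colluvium on a 30 degree slope: marginally stable when
# dry, unstable once the water table reaches the ground surface:
print(round(factor_of_safety(2000.0, 1.5, 0.0, 30.0, 35.0), 2))  # → 1.38
print(round(factor_of_safety(2000.0, 1.5, 1.5, 30.0, 35.0), 2))  # → 0.72
```

    The drop of FS below 1 under full saturation mirrors the finding above that a water table near the ground surface was required to trigger the 1995 failures.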

  12. Distribution modeling of hazardous airborne emissions from industrial campuses in Iraq via GIS techniques

    Science.gov (United States)

    Salwan Al-Hasnawi, S.; Salam Bash AlMaliki, J.; Falih Nazal, Zainab

    2017-08-01

    The presence of considerable amounts of hazardous elements in air may have prolonged lethal effects on the residential and/or commercial areas and activities around emission sources, hence it is important to monitor and anticipate these concentrations and to design effective spatial forecasting models for that purpose. Geographic information systems (GIS) were utilized to monitor, analyze and model the presence and concentrations of airborne Pb, Cr, and Zn elements in the atmosphere around certain industrial campuses in the northern part of Iraq. Diffusion patterns were determined for these elements via GIS extensions for geostatistical and spatial analysis that implement kriging and inverse distance weighted (IDW) methods to interpolate a raster surface. The main determining factors, such as wind speed, ambient temperature and topographic distribution, were considered in order to design a prediction model that serves as an early alert for possible future accidents. Results of an eight-month observation program showed that the concentrations of the three elements significantly exceeded the Iraqi and WHO limits at most of the observed locations, especially in summer. The predicted models were also validated against the field measurements and showed a close match, especially for the geostatistical analysis map, which had around 4% error for the three tested elements.
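
    Of the two interpolators mentioned, IDW is simple enough to sketch in a few lines: each sample is weighted by the inverse of its distance to the prediction point raised to a power (p = 2 is a common default). The station coordinates and concentration values below are invented, not from the Iraqi monitoring program:

```python
def idw(x, y, samples, power=2.0):
    """Inverse distance weighted estimate at (x, y) from samples given
    as (xi, yi, value) triples; returns a sample's value exactly when
    (x, y) coincides with its location."""
    num = den = 0.0
    for xi, yi, v in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # exact hit on a monitoring station
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical Pb concentrations at three monitoring stations:
stations = [(0.0, 0.0, 1.2), (10.0, 0.0, 0.4), (0.0, 10.0, 0.8)]
print(round(idw(5.0, 5.0, stations), 3))  # → 0.8
```

    Unlike kriging, IDW ignores the spatial covariance structure of the data, which is consistent with the geostatistical (kriging) map validating better in the study.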

  13. Converting HAZUS capacity curves to seismic hazard-compatible building fragility functions: effect of hysteretic models

    Science.gov (United States)

    Ryu, Hyeuk; Luco, Nicolas; Baker, Jack W.; Karaca, Erdem

    2008-01-01

    A methodology was recently proposed for the development of hazard-compatible building fragility models using parameters of capacity curves and damage state thresholds from HAZUS (Karaca and Luco, 2008). In that methodology, HAZUS curvilinear capacity curves were used to define nonlinear dynamic SDOF models that were subjected to nonlinear time history analysis instead of the capacity spectrum method. In this study, we construct a multilinear capacity curve with negative stiffness after an ultimate (capping) point for nonlinear time history analysis, as an alternative to the curvilinear model provided in HAZUS. As an illustration, we propose parameter values of the multilinear capacity curve for a moderate-code low-rise steel moment resisting frame building (labeled S1L in HAZUS). To determine the final parameter values, we perform nonlinear time history analyses of SDOF systems with various parameter values and investigate their effects on the resulting fragility functions through sensitivity analysis. The findings improve capacity curves and thereby fragility and/or vulnerability models for generic types of structures.
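
    The proposed multilinear capacity curve can be sketched as a piecewise-linear force-displacement backbone with a capping point followed by negative stiffness. The parameter values below are placeholders for illustration, not the S1L values proposed in the study:

```python
def multilinear_backbone(d, d_y=0.05, f_y=100.0, d_c=0.25, f_c=130.0,
                         k_neg=-200.0, f_res=20.0):
    """Backbone force at displacement d: elastic up to the yield point
    (d_y, f_y), hardening to the capping point (d_c, f_c), then a
    descending branch of slope k_neg down to a residual plateau f_res.
    All parameter values are illustrative placeholders."""
    if d <= d_y:
        return f_y * d / d_y                                # elastic
    if d <= d_c:
        return f_y + (f_c - f_y) * (d - d_y) / (d_c - d_y)  # hardening
    return max(f_c + k_neg * (d - d_c), f_res)              # post-capping

# Strength degrades past the capping displacement:
print(round(multilinear_backbone(0.3), 2))  # → 120.0
```

    The in-cycle strength loss on the descending branch is exactly what the curvilinear HAZUS capacity curves cannot represent, and it drives the collapse-range differences in the resulting fragility functions.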

  14. Simulated hazards of loosing infection-free status in a Dutch BHV1 model.

    Science.gov (United States)

    Vonk Noordegraaf, A; Labrovic, A; Frankena, K; Pfeiffer, D U; Nielen, M

    2004-01-30

    A compulsory eradication programme for bovine herpesvirus 1 (BHV1) was implemented in the Netherlands in 1998. At the start of the programme, about 25% of the dairy herds were certified BHV1-free. Simulation models have played an important role in the decision-making process associated with BHV1 eradication. Our objective in this study was to improve understanding of model behaviour (as part of internal validation) regarding loss by herds of the BHV1-free certificate. Using a Cox proportional hazards model, the association between farm characteristics and the risk of certificate loss during simulation was quantified. The overall fraction of initially certified herds experiencing certificate loss during simulation was 3.0% in 6.5 years. Factors that increased the risk of earlier certificate loss in the final multivariable Cox model were a higher 'yearly number of cattle purchased', 'farm density within a 1 km radius' and 'cattle density within a 1 km radius'. The qualitative behaviour of the risk factors we identified agreed with observations in field studies.
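
    The Cox model quantification above rests on maximizing a partial likelihood in which each certificate loss is compared against the herds still at risk at that time. A minimal sketch with invented data (not the simulated BHV1 herds), showing that a positive coefficient fits data in which high-covariate subjects fail earlier:

```python
import math

def cox_partial_loglik(beta, times, events, x):
    """Breslow partial log-likelihood for one covariate: each observed
    failure contributes beta*x_i minus the log-sum of exp(beta*x_j)
    over all subjects still at risk at that failure time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ll = 0.0
    for k, i in enumerate(order):
        if events[i]:
            risk = sum(math.exp(beta * x[j]) for j in order[k:])
            ll += beta * x[i] - math.log(risk)
    return ll

# Invented data: subjects with the risk factor (x = 1) fail earlier,
# so a positive coefficient fits better than beta = 0:
times, events, x = [1.0, 2.0, 3.0, 4.0], [1, 1, 1, 1], [1, 1, 0, 0]
print(cox_partial_loglik(1.0, times, events, x) >
      cox_partial_loglik(0.0, times, events, x))  # → True
```

    A fitted coefficient b is reported as a hazard ratio exp(b) per unit of the covariate, e.g. per additional animal purchased per year.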

  15. Additive interaction in survival analysis

    DEFF Research Database (Denmark)

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise

    2012-01-01

    It is a widely held belief in public health and clinical decision-making that interventions or preventive strategies should be aimed at patients or population subgroups where most cases could potentially be prevented. To identify such subgroups, deviation from additivity of absolute effects...... implementation guide of the additive hazards model is provided in the appendix....
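
    The additive hazards model referenced here places covariate effects on the absolute (rate-difference) scale, h(t|x) = h0(t) + b1*x1 + b2*x2, in contrast to the ratio scale of the proportional hazards model. A minimal sketch with a constant baseline hazard and invented rates:

```python
import math

def hazard(x1, x2, h0=0.01, b1=0.005, b2=0.003):
    """Additive hazards model: covariates shift the baseline rate by
    an absolute amount, h(t|x) = h0 + b1*x1 + b2*x2 (rates invented)."""
    return h0 + b1 * x1 + b2 * x2

def survival(t, x1, x2):
    """With a constant baseline hazard, S(t|x) = exp(-h(x) * t)."""
    return math.exp(-hazard(x1, x2) * t)

# The rate difference for exposure x1 is b1 at every level of x2,
# i.e. effects are additive on the absolute scale (no interaction):
print(round(hazard(1, 0) - hazard(0, 0), 6))  # → 0.005
print(round(hazard(1, 1) - hazard(0, 1), 6))  # → 0.005
```

    Any departure of the joint-exposure rate from h0 + b1 + b2 is interaction on the additive scale, which is the quantity of public-health interest discussed above.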

  16. An evaluation of three representative multimedia models used to support cleanup decision-making at hazardous, mixed, and radioactive waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P.D.; Pardi, R.; Fthenakis, V.M.; Holtzman, S. [Brookhaven National Lab., Upton, NY (United States)] [and others]

    1996-04-01

    The decision process involved in cleaning up sites contaminated with hazardous, mixed, and radioactive materials is often supported by results obtained from computer models. These results provide limits within which a decision-maker can judge the importance of individual transport and fate processes, and the likely outcome of alternative cleanup strategies. The transport of hazardous materials may occur predominantly through one particular pathway but, more often, actual or potential transport must be evaluated across several pathways and media. Multimedia models are designed to simulate the transport of contaminants from a source to a receptor through more than one environmental pathway. Three such multimedia models are reviewed here: MEPAS, MMSOILS, and PRESTO-EPA-CPG. The reviews are based on documentation provided with the software, on published reviews, on personal interviews with the model developers, and on model summaries extracted from computer databases and expert systems. The three models are reviewed within the context of specific media components: air, surface water, ground water, and food chain. Additional sections evaluate the way that these three models calculate human exposure and dose and how they report uncertainty. Special emphasis is placed on how each model handles radionuclide transport within specific media. For the purpose of simulating the transport, fate and effects of radioactive contaminants through more than one pathway, both MEPAS and PRESTO-EPA-CPG are adequate for screening studies; MMSOILS only handles nonradioactive substances and must be modified before it can be used in these same applications. Of the three models, MEPAS is the most versatile, especially if the user needs to model the transport, fate, and effects of hazardous and radioactive contaminants. 44 refs., 2 tabs.

  17. AschFlow - A dynamic landslide run-out model for medium scale hazard analysis.

    Science.gov (United States)

    Luna, Byron Quan; Blahut, Jan; van Asch, Theo; van Westen, Cees; Kappes, Melanie

    2015-04-01

    Landslides and debris flow hazard assessments require a scale-dependent analysis in order to mitigate damage and other negative consequences at the respective scales of occurrence. Medium or large scale landslide run-out modelling for many possible landslide initiation areas has been a cumbersome task in the past. This arises from the difficulty of precisely defining the location and volume of the released mass and from the inability of run-out models to compute the displacement for a large number of individual initiation areas (computationally exhaustive). Most of the existing physically based run-out models have complications in handling such situations, and therefore empirical methods have been used as a practical means to predict landslide mobility at a medium scale (1:10,000 to 1:50,000). In this context, a simple medium scale numerical model for rapid mass movements in urban and mountainous areas was developed. The deterministic nature of the approach makes it possible to calculate the velocity, height and increase in mass by erosion, resulting in the estimation of various forms of impacts exerted by debris flows at the medium scale. The established and implemented model ("AschFlow") is a 2-D one-phase continuum model that simulates the entrainment, spreading and deposition processes of a landslide or debris flow at a medium scale. The flow is thus treated as a single-phase material whose behavior is controlled by rheology (e.g. Voellmy or Bingham). The developed regional model "AschFlow" was applied and evaluated in well documented areas with known past debris flow events.

  18. Modeling the cardiovascular system using a nonlinear additive autoregressive model with exogenous input

    Science.gov (United States)

    Riedl, M.; Suhrbier, A.; Malberg, H.; Penzel, T.; Bretthauer, G.; Kurths, J.; Wessel, N.

    2008-07-01

    The parameters of heart rate variability and blood pressure variability have proved to be useful analytical tools in cardiovascular physics and medicine. Model-based analysis of these variabilities additionally leads to new prognostic information about mechanisms behind regulations in the cardiovascular system. In this paper, we analyze the complex interaction between heart rate, systolic blood pressure, and respiration by nonparametric fitted nonlinear additive autoregressive models with external inputs. Therefore, we consider measurements of healthy persons and patients suffering from obstructive sleep apnea syndrome (OSAS), with and without hypertension. It is shown that the proposed nonlinear models are capable of describing short-term fluctuations in heart rate as well as systolic blood pressure significantly better than similar linear ones, which confirms the assumption of nonlinear controlled heart rate and blood pressure. Furthermore, the comparison of the nonlinear and linear approaches reveals that the heart rate and blood pressure variability in healthy subjects is caused by a higher level of noise as well as nonlinearity than in patients suffering from OSAS. The residue analysis points at a further source of heart rate and blood pressure variability in healthy subjects, in addition to heart rate, systolic blood pressure, and respiration. Comparison of the nonlinear models within and among the different groups of subjects suggests the ability to discriminate the cohorts that could lead to a stratification of hypertension risk in OSAS patients.

  19. Hazard Mapping of Structurally Controlled Landslide in Southern Leyte, Philippines Using High Resolution Digital Elevation Model

    Science.gov (United States)

    Luzon, Paul Kenneth; Rochelle Montalbo, Kristina; Mahar Francisco Lagmay, Alfredo

    2014-05-01

    The 2006 Guinsaugon landslide in St. Bernard, Southern Leyte is the largest known mass movement of soil in the Philippines. It consisted of a 15 million m3 rockslide-debris avalanche from an approximately 700 m high escarpment produced by continuous movement of the Philippine fault at approximately 2.5 cm/year. The landslide was preceded by continuous heavy rainfall totaling 571.2 mm from February 8 to 12, 2006. The catastrophic landslide killed more than 1,000 people and displaced 19,000 residents over its 6,400 km path. To investigate the present-day morphology of the scar and potential failure that may occur, an analysis of a high-resolution digital elevation model (10 m resolution Synthetic Aperture Radar images in 2013) was conducted, leading to the generation of a structurally controlled landslide hazard map of the area. Discontinuity sets that could contribute to any failure mechanism were identified using Coltop 3D software which uses a unique lower Schmidt-Lambert color scheme for any given dip and dip direction. Thus, finding main morpho-structural orientations became easier. Matterocking, a software designed for structural analysis, was used to generate possible planes that could slide due to the identified discontinuity sets. Conefall was then utilized to compute the extent to which the rock mass will run out. The results showed potential instabilities in the scarp area of the 2006 Guinsaguon landslide and in adjacent slopes because of the presence of steep discontinuities that range from 45-60°. Apart from the 2006 Guinsaugon potential landslides, conefall simulation generated farther rock mass extent in adjacent slopes. In conclusion, there is a high probability of landslides in the municipality of St. Bernard Leyte, where the 2006 Guinsaugon Landslide occurred. Concerned agencies may use maps produced from this study for disaster preparedness and to facilitate long-term recovery planning for hazardous areas.

  20. Cellular parameters for track structure modelling of radiation hazard in space

    Science.gov (United States)

    Hollmark, M.; Lind, B.; Gudowska, I.; Waligorski, M.

    Based on irradiation with 45 MeV/u N and B ions and with Co-60 gamma rays, track structure cellular parameters have been fitted for V79-379A Chinese hamster lung fibroblasts and for human melanoma cells (AA wtp53). These sets of parameters will be used to develop a calculation of radiation hazard in deep space, based on the system for evaluating, summing and reporting occupational exposures proposed in 1967 by a subcommittee of the NCRP, but never issued as an NCRP report. The key concepts of this system were: i) expression of the risk from all radiation exposures relative to that from a whole-body exposure to Co-60 radiation; ii) relating the risk from any exposure to that of the standard (Co-60) radiation through an "effectiveness factor" (ef), a product of sub-factors representing radiation quality, body region irradiated, and depth of penetration of radiation; the product of absorbed dose by ef being termed the "exposure record unit" (eru); iii) development of ef values and a cumulative eru record for external and internal emitters. Application of this concept should provide a better description of the Gy-equivalent presently in use by NASA for evaluating risk in deep space than the equivalent dose following ICRP-60 recommendations. Dose and charged particle fluence levels encountered in space, particularly after Solar Particle Events, require that deterministic rather than stochastic effects be considered. Also, synergistic effects due to simultaneous multiple charged particle transfers may have to be considered. Thus, models applicable in radiotherapy, where the Gy-equivalent is also applied, in conjunction with transport calculations performed using, e.g. the ADAM and EVA phantoms, along the concepts of the 1967 NCRP system, may be more appropriate for evaluating the radiation hazard from external fields with a large flux and a major high-LET component.

  1. Local models for rainstorm-induced hazard analysis on Mediterranean river-torrential geomorphological systems

    Directory of Open Access Journals (Sweden)

    N. Diodato

    2004-01-01

    Full Text Available Damaging hydrogeomorphological events are defined as one or more simultaneous phenomena (e.g. accelerated erosion, landslides, flash floods and river floods) occurring in a spatially and temporally random way and triggered by rainfall of different intensity and extent. The storm rainfall values are highly dependent on weather conditions and relief. However, the impact of rainstorms in Mediterranean mountain environments depends mainly on climatic fluctuations in the short and long term, especially in rainfall quantity. An algorithm for the characterisation of this impact, called the Rainfall Hazard Index (RHI), is developed with a less expensive methodology. In RHI modelling, we assume that the river-torrential system has adapted to the natural hydrological regime, and a sudden fluctuation in this regime, especially one exceeding thresholds for an acceptable range of flexibility, may have disastrous consequences for the mountain environment. RHI integrates two rainfall variables based upon current and historical storm depth data, both of a fixed duration, and a dimensionless parameter representative of the degree of ecosystem flexibility. The approach was applied to a test site in the Benevento river-torrential landscape, Campania (Southern Italy). A database including data from 27 events which occurred during a 77-year period (1926-2002) was compared with the Benevento-station RHI(24h) for a qualitative validation. Trends in RHIx for annual maximum storms of duration 1, 3 and 24h were also examined. Little change is observed at the 3- and 24-h storm durations, but a significant increase results in the hazard of short and intense storms (RHIx(1h)), in agreement with a reduction in the return period for extreme rainfall events.

  2. Flood Hazard Mapping using Hydraulic Model and GIS: A Case Study in Mandalay City, Myanmar

    Directory of Open Access Journals (Sweden)

    Kyu Kyu Sein

    2016-01-01

    Full Text Available This paper presents the use of flood frequency analysis integrated with a 1D hydraulic model (HEC-RAS) and a Geographic Information System (GIS) to prepare flood hazard maps of different return periods in the Ayeyarwady River at Mandalay City in Myanmar. Gumbel's distribution was used to calculate the flood peaks of different return periods, namely, 10 years, 20 years, 50 years, and 100 years. The flood peaks from the frequency analysis were input into the HEC-RAS model to find the corresponding flood levels and extents in the study area. The model results were integrated with ArcGIS to generate flood plain maps. Flood depths and extents have been identified through the flood plain maps. Analysis of the 100-year return period flood plain map indicated that 157.88 km2 (17.54% of the study area) is likely to be inundated. The predicted flood depths vary from just above 0 m to 24 m in the flood plains and on the river. Depths between 3 and 5 m were identified in the urban areas of Chanayetharzan, Patheingyi, and Amarapura Townships. The largest inundated area, 85 km2, was in the Amarapura Township.
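
    Gumbel's distribution, as used above, gives the flood peak for a return period T once its location and scale parameters are estimated from the annual-maximum series. A method-of-moments sketch with invented discharges (not the Ayeyarwady record):

```python
import math

def gumbel_quantile(sample, T):
    """Flood peak for return period T (years): fit a Gumbel distribution
    to annual maxima by the method of moments, then invert the CDF at
    the non-exceedance probability 1 - 1/T."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((v - mean) ** 2 for v in sample) / (n - 1)
    alpha = math.sqrt(6.0 * var) / math.pi      # scale parameter
    u = mean - 0.5772 * alpha                   # location (Euler-Mascheroni)
    return u - alpha * math.log(-math.log(1.0 - 1.0 / T))

# Invented annual-maximum discharges (m^3/s), not the Ayeyarwady record:
peaks = [21000, 25000, 19500, 30000, 27500, 23000, 26000, 32000]
for T in (10, 20, 50, 100):
    print(T, round(gumbel_quantile(peaks, T)))
```

    The estimated peaks grow with T, and each quantile is then fed to the hydraulic model as the design discharge for the corresponding hazard map.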

  3. Mediation Analysis with Survival Outcomes: Accelerated Failure Time Versus Proportional Hazards Models

    Directory of Open Access Journals (Sweden)

    Lois A Gelfand

    2016-03-01

    Full Text Available Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
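
    The Weibull case compared above is special in that the AFT and PH parameterizations describe the same model, with coefficients related by beta_PH = -k * beta_AFT for shape k. A small sketch of this relationship (the shape and coefficient values are illustrative, not from the simulation study):

```python
import math

def weibull_median(x, mu=2.0, beta_aft=0.5, k=1.5):
    """Median survival time under a Weibull AFT parameterization:
    S(t|x) = exp(-(t / exp(mu + beta_aft*x))**k), so the median is
    exp(mu + beta_aft*x) * log(2)**(1/k). Values are illustrative."""
    return math.exp(mu + beta_aft * x) * math.log(2.0) ** (1.0 / k)

def hazard_ratio(x, beta_aft=0.5, k=1.5):
    """The same Weibull model read as PH has beta_ph = -k * beta_aft,
    giving hazard ratio exp(-k * beta_aft * x)."""
    return math.exp(-k * beta_aft * x)

# One unit of x stretches survival time by exp(beta_aft) under AFT...
print(round(weibull_median(1) / weibull_median(0), 4))  # → 1.6487
# ...and divides the hazard under the equivalent PH reading:
print(round(hazard_ratio(1), 4))  # → 0.4724
```

    The opposite signs are the usual source of confusion when comparing LIFEREG (AFT) and PHREG (PH) output: a positive AFT coefficient lengthens survival, while a positive PH coefficient raises the hazard.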

  4. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Harper, Bryan [Oregon State University (United States); Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan [Pacific Northwest National Laboratory (United States); Tang, Kaizhi [Intelligent Automation, Inc. (United States); Heredia-Langner, Alejandro [Pacific Northwest National Laboratory (United States); Lins, Roberto [CPqAM, Oswaldo Cruz Foundation, FIOCRUZ-PE (Brazil); Harper, Stacey, E-mail: stacey.harper@oregonstate.edu [Oregon State University (United States)

    2015-06-15

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships.

  5. A forecasting and forewarning model for methane hazard in working face of coal mine based on LS-SVM

    Institute of Scientific and Technical Information of China (English)

    CAO Shu-gang; LIU Yan-bao; WANG Yan-ping

    2008-01-01

    To improve the precision and reliability in predicting methane hazard in working face of coal mine, we have proposed a forecasting and forewarning model for methane hazard based on the least square support vector (LS-SVM) multi-classifier and regression machine. For the forecasting model, the methane concentration can be considered as a nonlinear time series and the time series analysis method is adopted to predict the change in methane concentration using LS-SVM regression. For the forewarning model, which is based on the forecasting results, by the multi-classification method of LS-SVM, the methane hazard was identified to four grades: normal, attention, warning and danger. According to the forewarning results, corresponding measures are taken. The model was used to forecast and forewarn the K9 working face. The results obtained by LS-SVM regression show that the forecast- ing have a high precision and forewarning results based on a LS-SVM multi-classifier are credible. Therefore, it is an effective model building method for continuous prediction of methane concentration and hazard forewarning in working face.
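The four-grade forewarning logic can be illustrated with a plain threshold sketch (the paper uses an LS-SVM multi-classifier on forecast concentrations; the threshold values below are purely hypothetical placeholders):

```python
def methane_grade(concentration, limits=(0.5, 0.8, 1.0)):
    """Map a predicted methane concentration (vol %) to a forewarning grade.

    The paper derives the four grades from an LS-SVM multi-classifier;
    here simple cut-points stand in for that classifier, and the limit
    values are illustrative, not regulatory thresholds.
    """
    normal, attention, warning = limits
    if concentration < normal:
        return "normal"
    if concentration < attention:
        return "attention"
    if concentration < warning:
        return "warning"
    return "danger"

forecast = [0.32, 0.61, 0.85, 1.20]   # e.g. outputs of the regression step
grades = [methane_grade(c) for c in forecast]
# grades == ["normal", "attention", "warning", "danger"]
```
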

  6. Comparison of 2D numerical models for river flood hazard assessment: simulation of the Secchia River flood in January, 2014

    Science.gov (United States)

    Shustikova, Iuliia; Domeneghetti, Alessio; Neal, Jeffrey; Bates, Paul; Castellarin, Attilio

    2017-04-01

Hydrodynamic modeling of inundation events still carries a large array of uncertainties. This effect is especially evident in models run for geographically large areas. Recent studies suggest using fully two-dimensional (2D) models with high resolution in order to avoid the uncertainties and limitations that come from an incorrect interpretation of flood dynamics and an unrealistic reproduction of the terrain topography. This, however, affects computational efficiency, increasing the running time and hardware demands. Concerning this point, our study evaluates and compares numerical models of different complexity by testing them on a flood event that occurred in the basin of the Secchia River, Northern Italy, on 19 January 2014. The event was characterized by a levee breach and the consequent flooding of over 75 km2 of the plain behind the dike within 48 hours, causing population displacement, one death and economic losses in excess of 400 million Euro. We test the well-established TELEMAC 2D and LISFLOOD-FP codes together with the recently released HEC-RAS 5.0.3 (2D model); all models are implemented using different grid sizes (2-200 m) derived from a 1 m resolution digital elevation model. TELEMAC is a fully 2D hydrodynamic model based on the finite-element or finite-volume approach, whereas HEC-RAS 5.0.3 and LISFLOOD-FP are both coupled 1D-2D models. All models are calibrated against observed inundation extent and maximum water depths, which are retrieved from remotely sensed data and field survey reports. Our study quantitatively compares the three modeling strategies, highlighting differences in terms of ease of implementation, accuracy of representation of hydraulic processes within floodplains, and computational efficiency. Additionally, we look into the different grid resolutions in terms of result accuracy and computation time. Our study is a preliminary assessment that focuses on smaller areas in order to identify potential modeling schemes

  7. Reliability estimation and remaining useful lifetime prediction for bearing based on proportional hazard model

    Institute of Scientific and Technical Information of China (English)

    王鹭; 张利; 王学芝

    2015-01-01

As the central component of rotating machines, the bearing's performance reliability assessment and remaining useful lifetime prediction are of crucial importance in condition-based maintenance to reduce maintenance cost and improve reliability. A prognostic algorithm to assess the reliability and forecast the remaining useful lifetime (RUL) of bearings was proposed, consisting of three phases. Online vibration and temperature signals of bearings in the normal state were measured during the manufacturing process, and the most useful time-dependent features of the vibration signals were extracted based on correlation analysis (feature selection step). Time series analysis based on a neural network, used as an identification model, was applied to predict the features of the bearing vibration signals at any horizon (feature prediction step). Furthermore, a degradation factor was defined from these features. A proportional hazards model was generated to estimate the survival function and forecast the RUL of the bearing (RUL prediction step). The results show the plausibility and effectiveness of the proposed approach, which can facilitate bearing reliability estimation and RUL prediction.
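The RUL prediction step can be sketched in closed form under an assumed Weibull baseline hazard for the proportional hazards model (all parameter values below are illustrative, not from the paper):

```python
import math

def rul_weibull_ph(t0, z, beta, eta, k, reliability=0.5):
    """Remaining useful life under a PH model with Weibull baseline hazard.

    Survival: S(t | z) = exp(-(t / eta)**k * exp(beta * z)), where z is a
    degradation factor built from vibration features (values assumed here).
    Returns the extra time until the conditional reliability
    S(t) / S(t0) falls to the given threshold.
    """
    h0 = (t0 / eta) ** k  # baseline cumulative hazard accrued so far
    # Solve H0(t*) = H0(t0) - ln(reliability) / exp(beta * z) for t*
    target = h0 - math.log(reliability) / math.exp(beta * z)
    return eta * target ** (1.0 / k) - t0

# A less degraded bearing (smaller z) should have a longer predicted RUL.
rul_low = rul_weibull_ph(t0=100.0, z=0.2, beta=1.5, eta=400.0, k=2.0)
rul_high = rul_weibull_ph(t0=100.0, z=1.0, beta=1.5, eta=400.0, k=2.0)
```
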

  8. Survival prediction based on compound covariate under Cox proportional hazard models.

    Directory of Open Access Journals (Sweden)

    Takeshi Emura

    Full Text Available Survival prediction from a large number of covariates is a current focus of statistical and medical research. In this paper, we study a methodology known as the compound covariate prediction performed under univariate Cox proportional hazard models. We demonstrate via simulations and real data analysis that the compound covariate method generally competes well with ridge regression and Lasso methods, both already well-studied methods for predicting survival outcomes with a large number of covariates. Furthermore, we develop a refinement of the compound covariate method by incorporating likelihood information from multivariate Cox models. The new proposal is an adaptive method that borrows information contained in both the univariate and multivariate Cox regression estimators. We show that the new proposal has a theoretical justification from a statistical large sample theory and is naturally interpreted as a shrinkage-type estimator, a popular class of estimators in statistical literature. Two datasets, the primary biliary cirrhosis of the liver data and the non-small-cell lung cancer data, are used for illustration. The proposed method is implemented in R package "compound.Cox" available in CRAN at http://cran.r-project.org/.
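The compound covariate score itself is simple to compute once univariate Cox coefficients are available; a minimal sketch (the coefficients and covariate values below are invented for illustration; in practice they come from fitting, e.g. with the `compound.Cox` package):

```python
def compound_covariate_scores(X, univariate_betas):
    """Compound covariate risk score: a weighted sum of covariates using
    coefficients taken from separate univariate Cox fits.

    score_i = sum_j beta_j * x_ij ; higher scores indicate higher risk.
    """
    return [sum(b * x for b, x in zip(univariate_betas, row)) for row in X]

betas = [0.8, -0.3, 0.5]   # one coefficient per covariate (assumed values)
X = [[1.2, 0.4, -0.1], [-0.5, 1.0, 0.2], [0.0, 0.0, 0.0]]
scores = compound_covariate_scores(X, betas)
median = sorted(scores)[len(scores) // 2]
groups = ["high" if s > median else "low" for s in scores]  # median split
```
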

  9. A general additive-multiplicative rates model for recurrent event data

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

In this article, we propose a general additive-multiplicative rates model for recurrent event data. The proposed model includes the additive rates and multiplicative rates models as special cases. For inference on the model parameters, estimating equation approaches are developed, and asymptotic properties of the proposed estimators are established through modern empirical process theory. In addition, an illustration with multiple-infection data from a clinical study on chronic granulomatous disease is provided.
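A minimal sketch of such a rate function, assuming an identity additive link and an exponential multiplicative link (link choices and all numeric values are illustrative, not the paper's estimators):

```python
import math

def am_rate(t, x, z, beta, gamma, baseline):
    """Additive-multiplicative rate: lambda(t) = beta'x + lambda0(t) * exp(gamma'z).

    With beta = 0 it reduces to the multiplicative (proportional) rates model;
    with gamma = 0 it reduces to the additive rates model.
    """
    additive = sum(b * xi for b, xi in zip(beta, x))
    multiplicative = math.exp(sum(g * zi for g, zi in zip(gamma, z)))
    return additive + baseline(t) * multiplicative

baseline = lambda t: 0.1 + 0.02 * t  # assumed baseline rate function
rate = am_rate(2.0, x=[1.0], z=[0.5], beta=[0.3], gamma=[0.4], baseline=baseline)
```
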

  10. Validation analysis of probabilistic models of dietary exposure to food additives.

    Science.gov (United States)

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group, and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent of brands or the per cent of eating occasions within a food group that contained the additive. Since each of the three model components allowed two possible modes of input, the validity of eight (2^3) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of the full conceptual models. While the distributions of intake estimates from the models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
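A toy Monte Carlo version of the model structure described above (all numbers are invented, not taken from the reference database):

```python
import random

def simulate_additive_intake(food_groups, n_sims=10000, seed=1):
    """Monte Carlo estimate of daily additive intake (mg/day).

    Each food group contributes intake_g_per_day * concentration_mg_per_g
    only when a Bernoulli draw says the consumed brand contains the additive.
    """
    rng = random.Random(seed)
    sims = []
    for _ in range(n_sims):
        total = 0.0
        for intake, p_present, conc in food_groups:
            if rng.random() < p_present:
                total += intake * conc
        sims.append(total)
    return sims

groups = [
    (150.0, 0.30, 0.002),  # (g/day eaten, P(additive present), mg/g use level)
    (40.0, 0.75, 0.010),
]
sims = simulate_additive_intake(groups)
mean_intake = sum(sims) / len(sims)
conservative = sum(i * c for i, _, c in groups)  # additive at MPL in all foods
# mean_intake falls below the conservative point estimate by construction
```
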

  11. Modeling and hazard mapping of complex cascading mass movement processes: the case of glacier lake 513, Carhuaz, Peru

    Science.gov (United States)

    Schneider, Demian; Huggel, Christian; García, Javier; Ludeña, Sebastian; Cochachin, Alejo

    2013-04-01

The Cordilleras in Peru are especially vulnerable to, and affected by, impacts from climate change. Local communities and cities often exist directly within the reach of major hazard potentials such as lake outburst floods (aluviones), mud-/debris flows (huaycos) or large rock-/ice avalanches. These hazards have repeatedly and strongly affected the region over recent decades and throughout the last century, and thousands of people have been killed. One of the most recent events in the Cordillera Blanca occurred on 11 April 2010, when a rock/ice avalanche from the top of Hualcán mountain, NE of the town of Carhuaz, impacted glacier lake 513 (Laguna 513), caused displacement waves and triggered an outburst flood wave. The flow repeatedly transformed from debris flow to hyperconcentrated flow and eventually caused significant damage in Carhuaz. This event motivated early warning and prevention efforts to reduce risks related to ice/rock avalanches and glacier lake outburst floods (GLOFs). One of the basic components of an early warning system is the assessment, understanding and communication of the relevant hazards and risks. Here we report on the methodology and results of generating GLOF-related hazard maps for Carhuaz based on numerical modeling and field work. This exercise required an advanced concept and the implementation of different mass movement models. Specifically, numerical models were applied for simulating avalanche flow, avalanche impact into the lake, displacement wave generation and lake overtopping, and eventually flow propagation of the outburst flood with changing rheology between debris flow and hyperconcentrated flow. We adopted a hazard mapping procedure slightly adjusted from guidelines developed in Switzerland and in the Andes region. A methodology has thereby been developed to translate results from numerical mass movement modeling into hazard maps. The resulting hazard map was verified and adjusted during field work. This study shows

  12. Preparing a seismic hazard model for Switzerland: the view from PEGASOS Expert Group 3 (EG1c)

    Energy Technology Data Exchange (ETDEWEB)

    Musson, R. M. W. [British Geological Survey, West Mains Road, Edinburgh, EH9 3LA (United Kingdom); Sellami, S. [Swiss Seismological Service, ETH-Hoenggerberg, Zuerich (Switzerland); Bruestle, W. [Regierungspraesidium Freiburg, Abt. 9: Landesamt fuer Geologie, Rohstoffe und Bergbau, Ref. 98: Landeserdbebendienst, Freiburg im Breisgau (Germany)

    2009-05-15

The seismic hazard model used in the PEGASOS project for assessing earthquake hazard at four NPP sites was a composite of four sub-models, each produced by a team of three experts. In this paper, one of these models is described in detail by the authors. A criticism sometimes levelled at probabilistic seismic hazard studies is that the process by which seismic source zones are arrived at is obscure, subjective and inconsistent. Here, we attempt to recount the stages by which the model evolved, and the decisions made along the way. In particular, a macro-to-micro approach was used, in which three main stages can be described. The first was the characterisation of the overall kinematic model, the 'big picture' of regional seismogenesis. Secondly, this was refined to a more detailed seismotectonic model. Lastly, this was used as the basis of individual sources, for which parameters can be assessed. Some basic questions had also to be answered about aspects of the approach to modelling to be used: for instance, is spatial smoothing an appropriate tool to apply? Should individual fault sources be modelled in an intra-plate environment? Also, the extent to which alternative modelling decisions should be expressed in a logic tree structure has to be considered. (author)
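Alternative logic-tree branches are typically combined as a weighted mean over the branch hazard curves; a minimal sketch with invented values (not the PEGASOS model):

```python
def combine_hazard_curves(curves, weights):
    """Weighted mean of alternative hazard curves from a logic tree.

    Each curve maps ground-motion level -> annual exceedance frequency;
    the branch weights express degrees of belief and must sum to one.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "branch weights must sum to 1"
    n = len(curves[0])
    return [sum(w * c[i] for w, c in zip(weights, curves)) for i in range(n)]

# Two branches (e.g. with/without spatial smoothing) at three acceleration levels
branch_a = [1e-2, 1e-3, 1e-4]
branch_b = [2e-2, 5e-3, 5e-4]
mean_curve = combine_hazard_curves([branch_a, branch_b], [0.6, 0.4])
```
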

  13. A Course in Hazardous Chemical Spills: Use of the CAMEO Air Dispersion Model to Predict Evacuation Distances.

    Science.gov (United States)

    Kumar, Ashok; And Others

    1989-01-01

    Provides an overview of the Computer-Aided Management of Emergency Operations (CAMEO) model and its use in the classroom as a training tool in the "Hazardous Chemical Spills" course. Presents six problems illustrating classroom use of CAMEO. Lists 16 references. (YP)

  14. A Monte Carlo study of time-aggregation in continuous-time and discrete-time parametric hazard models.

    NARCIS (Netherlands)

    Hofstede, ter F.; Wedel, M.

    1998-01-01

    This study investigates the effects of time aggregation in discrete and continuous-time hazard models. A Monte Carlo study is conducted in which data are generated according to various continuous and discrete-time processes, and aggregated into daily, weekly and monthly intervals. These data are
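The aggregation step studied here can be sketched in pure Python (the exponential rate and interval widths are illustrative):

```python
import math
import random

def aggregate_event_times(times, interval):
    """Aggregate continuous event times into discrete intervals (e.g. weeks).

    Returns the 1-based interval index in which each event falls, which is
    all a discrete-time hazard model gets to see.
    """
    return [math.floor(t / interval) + 1 for t in times]

rng = random.Random(42)
continuous = [rng.expovariate(0.1) for _ in range(1000)]  # durations in days
weekly = aggregate_event_times(continuous, interval=7.0)
monthly = aggregate_event_times(continuous, interval=30.0)
# Coarser aggregation discards more of the timing information.
```
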

  15. SCEC Community Modeling Environment (SCEC/CME) - Seismic Hazard Analysis Applications and Infrastructure

    Science.gov (United States)

    Maechling, P. J.; Jordan, T. H.; Kesselman, C.; Moore, R.; Minster, B.; SCEC ITR Collaboration

    2003-12-01

The Southern California Earthquake Center (SCEC) has formed a Geoscience/IT partnership to develop an advanced information infrastructure for system-level earthquake science in Southern California. This SCEC/ITR partnership comprises SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. This collaboration recently completed the second year of a five-year National Science Foundation (NSF) funded ITR project called the SCEC Community Modeling Environment (SCEC/CME). The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed by project collaborators include a Probabilistic Seismic Hazard Analysis system called OpenSHA [Field et al., this meeting]. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. A Rupture Dynamic Model (RDM) has also been developed that couples a rupture dynamics simulation into an anelastic wave model. The collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of SHA programs. To support computationally expensive simulations, we have constructed a grid-based system utilizing Globus software [Kesselman et al., this meeting]. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC, NPACI and TeraGrid High Performance Computing Centers. 
We have

  16. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Variables

  17. Tsunami hazard assessment along the French Mediterranean coast : detailed modeling of tsunami impacts for the ALDES project

    Science.gov (United States)

    Quentel, E.; Loevenbruck, A.; Hébert, H.

    2012-04-01

The catastrophic 2004 tsunami drew the international community's attention to tsunami risk in all basins where tsunamis have occurred but no warning system exists. Consequently, under the coordination of UNESCO, France decided to create a regional centre, called CENALT, for the north-east Atlantic and the western Mediterranean. This warning system, which should be operational by 2012, is being set up by the CEA in collaboration with the SHOM and the CNRS. The French authorities are in charge of the top-down alert system, including local alert dissemination. In order to prepare the appropriate means and measures, they initiated the ALDES (Alerte Descendante) project, to which the CEA also contributes. It aims at examining the tsunami risk related to earthquakes and landslides along the French Mediterranean coast. In addition to the evaluation at regional scale, it includes detailed studies of 3 selected sites; the local alert system will be designed for one of them: the French Riviera. In this project, our main task at CEA consists of assessing tsunami hazard related to seismic sources using numerical modeling. Past tsunamis have affected the western Mediterranean coast but are too few and too poorly documented to provide a suitable database. Thus, a synthesis of earthquakes representative of the tsunamigenic seismic activity and prone to induce the largest impact on the French coast is performed based on historical data, seismotectonics and first-order models. The North Africa Margin, the Ligurian Sea and the South Tyrrhenian Sea are considered the main tsunamigenic zones. In order to forecast the most important plausible effects, the magnitudes are estimated by enhancing to some extent the largest known values. Our hazard estimation is based on the simulation of the induced tsunami scenarios performed with the CEA code. The 3 sites have been chosen according to the regional hazard studies, coastal typology elements and the appropriate DTMs (Digital Terrain Models). The

  18. Slope instability induced by volcano-tectonics as an additional source of hazard in active volcanic areas: the case of Ischia island (Italy)

    Science.gov (United States)

    Della Seta, Marta; Marotta, Enrica; Orsi, Giovanni; de Vita, Sandro; Sansivero, Fabio; Fredi, Paola

    2012-01-01

Ischia is an active volcanic island in the Gulf of Naples whose history has been dominated by a caldera-forming eruption (ca. 55 ka) and resurgence phenomena that have affected the caldera floor and generated a net uplift of about 900 m since 33 ka. The results of new geomorphological, stratigraphical and textural investigations of the products of gravitational movements triggered by volcano-tectonic events have been combined with information arising from a reinterpretation of historical chronicles on natural phenomena such as earthquakes, ground deformation, gravitational movements and volcanic eruptions. The combined interpretation of all these data shows that gravitational movements, coeval with volcanic activity and with uplift events related to the long-lasting resurgence, have affected the highly fractured marginal portions of the most uplifted Mt. Epomeo blocks. Such movements, mostly occurring since 3 ka, include debris avalanches; large debris flows (lahars); smaller mass movements (rock falls, slumps, debris and rock slides, and small debris flows); and deep-seated gravitational slope deformation. The occurrence of submarine deposits linked with the subaerial deposits of the most voluminous mass movements clearly shows that the debris avalanches reached the sea. The obtained results corroborate the hypothesis that the behaviour of the Ischia volcano results from an intimate interplay among magmatism, resurgence dynamics, fault generation, seismicity, slope oversteepening and instability, and eruptions. They also highlight that volcano-tectonically triggered mass movements are potentially hazardous phenomena that have to be taken into account in any attempt to assess volcanic and related hazards at Ischia. Furthermore, the largest mass movements could also flow into the sea, generating tsunami waves that could impact the island's coast as well as the neighbouring, densely inhabited coast of the Neapolitan area.

  19. Seismic hazard studies in Egypt

    Science.gov (United States)

    Mohamed, Abuo El-Ela A.; El-Hadidy, M.; Deif, A.; Abou Elenean, K.

    2012-12-01

The study of earthquake activity and seismic hazard assessment in Egypt is very important due to the great and rapid spread of large investments in national projects, especially the nuclear power plant that will be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquake effects throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba-Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the northern Red Sea triple junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez District should be considered. The seismic hazard for Egypt is calculated utilizing a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, the uniform hazard spectra for rock sites at 25 different periods, and the probabilistic hazard curves for the cities of Cairo and Alexandria, are graphed. The peak ground acceleration (PGA) values were found to be highest close to the Gulf of Aqaba, at about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were detected in the western part of the Western Desert.
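The 475-year return period quoted above corresponds, under the usual Poisson occurrence assumption, to roughly a 10% chance of exceedance in 50 years; a one-line sketch of the conversion:

```python
import math

def exceedance_probability(return_period_years, exposure_years):
    """Probability of at least one exceedance during an exposure window,
    assuming Poisson occurrences with rate 1 / return_period."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

p50 = exceedance_probability(475.0, 50.0)  # ~0.10 for the 475-year hazard level
```
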

  20. Examining school-based bullying interventions using multilevel discrete time hazard modeling.

    Science.gov (United States)

    Ayers, Stephanie L; Wagaman, M Alex; Geiger, Jennifer Mullins; Bermudez-Parsai, Monica; Hedberg, E C

    2012-10-01

Although schools have been trying to address bullying by utilizing different approaches that stop or reduce the incidence of bullying, little remains known about which specific intervention strategies are most successful in reducing bullying in the school setting. Using the social-ecological framework, this paper examines school-based disciplinary interventions often used to deliver consequences to deter the reoccurrence of bullying and aggressive behaviors among school-aged children. Data for this study are drawn from the School-Wide Information System (SWIS), with the final analytic sample consisting of 1,221 students in grades K-12 who received an office disciplinary referral for bullying during the first semester. Using Kaplan-Meier failure functions and multilevel discrete time hazard models, determinants of the probability of a student receiving a second referral over time were examined. Of the seven interventions tested, only Parent-Teacher Conference (AOR = 0.65) was associated with deterring the reoccurrence of bullying and aggressive behaviors. By using a social-ecological framework, schools can develop strategies that deter the reoccurrence of bullying by identifying key factors that enhance a sense of connection between the students' mesosystems as well as utilizing disciplinary strategies that take into consideration students' microsystem roles.
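Discrete-time hazard models of this kind are usually fit on a person-period expansion of the referral data (e.g. via logistic regression); a sketch of that expansion with invented records:

```python
def person_period(records):
    """Expand (id, intervals_survived, event) records into the person-period
    rows a discrete-time hazard model is fit on. The layout is standard;
    the sample records below are invented for illustration.
    """
    rows = []
    for pid, n_intervals, event in records:
        for t in range(1, n_intervals + 1):
            # outcome is 1 only in the final interval, and only if the
            # second referral (the "failure") actually occurred
            y = 1 if (t == n_intervals and event) else 0
            rows.append((pid, t, y))
    return rows

# student 1: second referral in interval 3; student 2: censored after 2 intervals
rows = person_period([(1, 3, True), (2, 2, False)])
# rows == [(1, 1, 0), (1, 2, 0), (1, 3, 1), (2, 1, 0), (2, 2, 0)]
```
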

  1. Examining School-Based Bullying Interventions Using Multilevel Discrete Time Hazard Modeling

    Science.gov (United States)

    Wagaman, M. Alex; Geiger, Jennifer Mullins; Bermudez-Parsai, Monica; Hedberg, E. C.

    2014-01-01

Although schools have been trying to address bullying by utilizing different approaches that stop or reduce the incidence of bullying, little remains known about which specific intervention strategies are most successful in reducing bullying in the school setting. Using the social-ecological framework, this paper examines school-based disciplinary interventions often used to deliver consequences to deter the reoccurrence of bullying and aggressive behaviors among school-aged children. Data for this study are drawn from the School-Wide Information System (SWIS) with the final analytic sample consisting of 1,221 students in grades K-12 who received an office disciplinary referral for bullying during the first semester. Using Kaplan-Meier failure functions and multilevel discrete time hazard models, determinants of the probability of a student receiving a second referral over time were examined. Of the seven interventions tested, only Parent-Teacher Conference (AOR = 0.65) was associated with deterring the reoccurrence of bullying and aggressive behaviors. By using a social-ecological framework, schools can develop strategies that deter the reoccurrence of bullying by identifying key factors that enhance a sense of connection between the students' mesosystems as well as utilizing disciplinary strategies that take into consideration students' microsystem roles. PMID:22878779

  2. On the predictive information criteria for model determination in seismic hazard analysis

    Science.gov (United States)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

estimate, but it is hardly applicable to data which are not independent given the parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed a Bayesian analysis of Italian data sets using four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate their performance as evaluated by the Bayes factor, predictive information criteria and retrospective predictive analysis.

  3. A class of additive-accelerated means regression models for recurrent event data

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In this article, we propose a class of additive-accelerated means regression models for analyzing recurrent event data. The class includes the proportional means model, the additive rates model, the accelerated failure time model, the accelerated rates model and the additive-accelerated rate model as special cases. The new model offers great flexibility in formulating the effects of covariates on the mean functions of counting processes while leaving the stochastic structure completely unspecified. For the inference on the model parameters, estimating equation approaches are derived and asymptotic properties of the proposed estimators are established. In addition, a technique is provided for model checking. The finite-sample behavior of the proposed methods is examined through Monte Carlo simulation studies, and an application to a bladder cancer study is illustrated.

  4. Challenges in seismic hazard assessment: Analyses of ground motion modelling and seismotectonic sources

    OpenAIRE

    Sørensen, Mathilde Bøttger

    2006-01-01

Seismic hazard assessment has an important societal impact in describing the levels of ground motion to be expected in a given region in the future. Challenges in seismic hazard assessment are closely associated with the fact that different regions, due to their differences in seismotectonic setting (and hence in earthquake occurrence) as well as socioeconomic conditions, require different and innovative approaches. One of the most important aspects in this regard is the seismici...

  5. Modeling lifetime data with multiple causes using cause specific reversed hazard rates

    Directory of Open Access Journals (Sweden)

    Paduthol Godan Sankaran

    2014-09-01

Full Text Available In this paper we introduce and study cause-specific reversed hazard rates in the context of left-censored lifetime data with multiple causes. A nonparametric inference procedure for left-censored lifetime data with multiple causes using cause-specific reversed hazard rates is discussed. Asymptotic properties of the estimators are studied. Simulation studies are conducted to assess the efficiency of the estimators. Further, the proposed method is applied to mice mortality data (Hoel, 1972) and Australian twin data (Duffy et al., 1990).
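For a fully specified distribution the reversed hazard rate reduces to r(t) = f(t)/F(t); a sketch for the exponential case (illustrative only; the paper's procedure is nonparametric):

```python
import math

def reversed_hazard_exponential(t, lam):
    """Reversed hazard rate r(t) = f(t) / F(t) for an exponential(lam)
    lifetime, shown here only to illustrate the quantity the paper
    estimates nonparametrically from left-censored data."""
    if t <= 0:
        raise ValueError("t must be positive")
    f = lam * math.exp(-lam * t)       # density
    F = 1.0 - math.exp(-lam * t)       # distribution function
    return f / F

rates = [reversed_hazard_exponential(t, lam=0.5) for t in (0.5, 1.0, 2.0, 4.0)]
# The reversed hazard rate is decreasing in t for the exponential model.
```
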

  6. Simulating floods : On the application of a 2D-hydraulic model for flood hazard and risk assessment

    OpenAIRE

    Alkema, D.

    2007-01-01

    Over the last decades, river floods in Europe seem to occur more frequently and are causing more and more economic and emotional damage. Understanding the processes causing flooding and the development of simulation models to evaluate countermeasures to control that damage are important issues. This study deals with the application of a 2D hydraulic flood propagation model for flood hazard and risk assessment. It focuses on two components: 1) how well does it predict the spatial-dynamic chara...

  7. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  8. Combining observations and model simulations to reduce the hazard of Etna volcanic ash plumes

    Science.gov (United States)

    Scollo, Simona; Boselli, Antonella; Coltelli, Mauro; Leto, Giuseppe; Pisani, Gianluca; Prestifilippo, Michele; Spinelli, Nicola; Wang, Xuan; Zanmar Sanchez, Ricardo

    2014-05-01

    Etna is one of the most active volcanoes in the world, with recent activity characterized by powerful lava fountains that produce eruption columns several kilometres high and disperse volcanic ash into the atmosphere. It is well known that, to improve the volcanic ash dispersal forecast of an ongoing explosive eruption, the input parameters used by volcanic ash dispersal models should be measured during the eruption. In this work, in order to better quantify the volcanic ash dispersal, we use data from the video-surveillance system of Istituto Nazionale di Geofisica e Vulcanologia, Osservatorio Etneo, and from the lidar system, together with a volcanic ash dispersal model. In detail, the visible camera installed in Catania, 27 km from the vent, is able to track the evolution of column height with time. The lidar, installed at the "M.G. Fracastoro" astrophysical observatory (14.97° E, 37.69° N) of the Istituto Nazionale di Astrofisica in Catania, 7 km from the Etna summit craters, uses a frequency-doubled Nd:YAG laser source operating at a 532-nm wavelength with a repetition rate of 1 kHz. Backscattering and depolarization values measured by the lidar system can give, with a certain degree of uncertainty, an estimate of volcanic ash concentration in the atmosphere. The 12 August 2011 activity is considered a perfect test case because the volcanic plume was retrieved by both camera and lidar. We evaluated the mass eruption rate from the column height and used best-fit procedures comparing simulated volcanic ash concentrations with those extracted from the lidar data. During this event, powerful lava fountains were clearly visible from about 08:30 GMT and a sustained eruption column was produced from about 08:55 GMT. Ash emission ceased completely around 11:30 GMT. The proposed approach is an attempt to produce more robust ash dispersal forecasts, reducing the hazard to air traffic during Etna volcanic crises.

  9. Generating survival times to simulate Cox proportional hazards models with time-varying covariates.

    Science.gov (United States)

    Austin, Peter C

    2012-12-20

    Simulations and Monte Carlo methods serve an important role in modern statistical research. They allow for an examination of the performance of statistical procedures in settings in which analytic and mathematical derivations may not be feasible. A key element in any statistical simulation is the existence of an appropriate data-generating process: one must be able to simulate data from a specified statistical model. We describe data-generating processes for the Cox proportional hazards model with time-varying covariates when event times follow an exponential, Weibull, or Gompertz distribution. We consider three types of time-varying covariates: first, a dichotomous time-varying covariate that can change at most once from untreated to treated (e.g., organ transplant); second, a continuous time-varying covariate such as cumulative exposure at a constant dose to radiation or to a pharmaceutical agent used for a chronic condition; third, a dichotomous time-varying covariate with a subject being able to move repeatedly between treatment states (e.g., current compliance or use of a medication). In each setting, we derive closed-form expressions that allow one to simulate survival times so that survival times are related to a vector of fixed or time-invariant covariates and to a single time-varying covariate. We illustrate the utility of our closed-form expressions for simulating event times by using Monte Carlo simulations to estimate the statistical power to detect as statistically significant the effect of different types of binary time-varying covariates. This is compared with the statistical power to detect as statistically significant a binary time-invariant covariate.
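The first covariate type described above (a one-time, permanent switch from untreated to treated under an exponential baseline hazard) admits a simple closed-form generator: invert the piecewise-linear cumulative hazard. A minimal sketch, with illustrative parameter names:

```python
import math
import random

def sim_event_time(lam, beta, t0, rng):
    """One event time for a Cox-exponential model with a binary covariate
    switching from 0 to 1 at time t0 (hazard lam before t0, lam*exp(beta)
    after), obtained by inverting the piecewise-linear cumulative hazard."""
    target = -math.log(1.0 - rng.random())   # H(T) = -log(U), U ~ Uniform(0,1]
    if target <= lam * t0:                   # event before the covariate switch
        return target / lam
    return t0 + (target - lam * t0) / (lam * math.exp(beta))

# Demo: hazard doubles (beta = log 2) when "treatment" starts at t0 = 1.
rng = random.Random(42)
times = [sim_event_time(0.5, math.log(2.0), 1.0, rng) for _ in range(50_000)]
```

Before t0 the simulated times follow the untreated exponential law exactly, so the construction can be checked against 1 − exp(−λt).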

  10. Using a ballistic-caprock model for developing a volcanic projectiles hazard map at Santorini caldera

    Science.gov (United States)

    Konstantinou, Konstantinos

    2015-04-01

    Volcanic Ballistic Projectiles (VBPs) are rock/magma fragments of variable size that are ejected from active vents during explosive eruptions. VBPs follow almost parabolic trajectories that are influenced by gravity and drag forces before they reach their impact point on the Earth's surface. Owing to their high temperature and kinetic energies, VBPs can potentially cause human casualties, severe damage to buildings as well as trigger fires. Since the Minoan eruption the Santorini caldera has produced several smaller (VEI = 2-3) vulcanian eruptions, the last of which occurred in 1950, while in 2011 it also experienced significant deformation/seismicity even though no eruption eventually occurred. In this work, an eruptive model appropriate for vulcanian eruptions is used to estimate initial conditions (ejection height, velocity) for VBPs assuming a broad range of gas concentration/overpressure in the vent. These initial conditions are then inserted into a ballistic model for the purpose of calculating the maximum range of VBPs for different VBP sizes (0.35-3 m), varying drag coefficient as a function of VBP speed and varying air density as a function of altitude. In agreement with previous studies a zone of reduced drag is also included in the ballistic calculations that is determined based on the size of vents that were active in the Kameni islands during previous eruptions (< 1 km). Results show that the horizontal range of VBPs varies between 0.9-3 km and greatly depends on gas concentration, the extent of the reduced drag zone and the size of VBP. Hazard maps are then constructed by taking into account the maximum horizontal range values as well as potential locations of eruptive vents along a NE-SW direction around the Kameni islands (the so-called "Kameni line").
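A minimal version of the ballistic step, assuming a spherical block, constant air density and a fixed drag coefficient (the study itself varies drag with speed and air density with altitude, and adds a reduced-drag zone near the vent), might look like:

```python
import math

def ballistic_range(v0, angle_deg, radius, cd=0.8, rho_air=1.2,
                    rho_rock=2500.0, g=9.81, dt=0.01):
    """Horizontal range (m) of a spherical block launched from the vent,
    integrating gravity plus quadratic drag with a simple Euler scheme."""
    k = 3.0 * rho_air * cd / (8.0 * rho_rock * radius)  # drag accel per |v|^2
    x, y = 0.0, 0.0
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    while y >= 0.0:                    # integrate until the block lands
        speed = math.hypot(vx, vy)
        ax = -k * speed * vx
        ay = -g - k * speed * vy
        x += vx * dt
        y += vy * dt
        vx += ax * dt
        vy += ay * dt
    return x
```

Because the drag acceleration scales as 1/radius, larger blocks travel farther for the same ejection speed, which is why the maximum range in the abstract depends strongly on VBP size.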

  11. A spatiotemporal optimization model for the evacuation of the population exposed to flood hazard

    Science.gov (United States)

    Alaeddine, H.; Serrhini, K.; Maizia, M.

    2015-03-01

    Managing the crisis caused by natural disasters, and especially by floods, requires the development of effective evacuation systems. An effective evacuation system must take into account certain constraints, including those related to the traffic network, accessibility, human resources and material equipment (vehicles, collecting points, etc.). The main objective of this work is to provide assistance to technical services and rescue forces in terms of accessibility by offering itineraries for the rescue and evacuation of people and property. We consider in this paper the evacuation of an urban area of medium size exposed to flood hazard. In case of inundation, most people will be evacuated using their own vehicles. Two evacuation types are addressed in this paper: (1) a preventive evacuation based on a flood forecasting system and (2) an evacuation during the disaster based on flooding scenarios. The two study sites on which the developed evacuation model is applied are the Tours valley (Fr, 37), which is protected by a set of dikes (preventive evacuation), and the Gien valley (Fr, 45), which benefits from a low rate of flooding (evacuation before and during the disaster). Our goal is to construct, for each of these two sites, a chronological evacuation plan, i.e., computing for each individual the departure date and the path to reach the assembly point (also called a shelter) according to a priority list established for this purpose. The evacuation plan must avoid congestion on the road network. Here we present a spatiotemporal optimization model (STOM) dedicated to the evacuation of populations exposed to natural disasters, and more specifically to flood risk.
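The routing component of such an evacuation plan, reduced to its simplest form (shortest travel time from a starting point to the nearest assembly point, ignoring congestion and departure scheduling), can be sketched with Dijkstra's algorithm. The graph and node names below are illustrative, not data from the Tours or Gien sites:

```python
import heapq

def nearest_shelter(graph, source, shelters):
    """Dijkstra over a road network given as {node: [(neighbour, time), ...]}.
    Returns (travel_time, shelter) for the closest assembly point, or
    (inf, None) if no shelter is reachable. Congestion is ignored here."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                     # stale heap entry
        if node in shelters:
            return d, node               # first shelter popped is the nearest
        for nb, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nb, float("inf")):
                dist[nb] = nd
                heapq.heappush(heap, (nd, nb))
    return float("inf"), None

# Toy road network: travel times in minutes; S1 and S2 are assembly points.
roads = {
    "A": [("B", 2), ("C", 5)],
    "B": [("C", 1), ("S1", 7)],
    "C": [("S2", 2)],
}
best = nearest_shelter(roads, "A", {"S1", "S2"})  # -> (5, 'S2')
```

The full model layers departure-time scheduling and congestion constraints on top of this routing step.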

  12. Prognostic variables in late follow-up of aortic valve replacement using the proportional hazard model: a study on patients using the Medtronic-Hall cardiac prosthesis.

    Science.gov (United States)

    Abdelnoor, M; Hauge, S N; Hall, K V

    1986-01-01

    An alternative approach to the study of the follow-up of patients with heart prostheses is the use of reliability theory (the hazard function) and the proportional hazards model (Cox's model). In a population of 480 patients who underwent AVR in the period from June 1977 to January 1983, with a mean follow-up time of 2.8 years, 16 preoperative variables were considered. From this pool of variables, six entered the regression model in a time-independent mode: age at operation, sex, preoperative NYHA classification, presence of AI, presence of endocarditis and presence of atrial fibrillation on ECG. None of these entered the model in the time-related mode. Another multifactorial approach, using stepwise regression analysis to examine primary predictive factors that independently correlate with survival while simultaneously accounting for the other variables, showed that the variables with additive prognostic value were age at operation, presence of AI and presence of endocarditis. Based on this model, a forecast five-year survival rate ranging from 88 to 14 per cent was found at the end of the fifth year. For the most favourable and the worst combinations of these prognostic variables, a patient-specific forecast five-year survival rate was drawn up. Our results were compared, using univariate and multivariate methods, with the results found in the literature, and the implications of this comparison were discussed.

  13. Flow-R, a model for susceptibility mapping of debris flows and other gravitational hazards at a regional scale

    Directory of Open Access Journals (Sweden)

    P. Horton

    2013-04-01

    Full Text Available The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org) and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM, avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model suitable for various applications and levels of dataset availability. Amongst the possible datasets, the DEM is the only one that is really needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time

  14. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    Science.gov (United States)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

  15. Numerical modeling of debris avalanches at Nevado de Toluca (Mexico): implications for hazard evaluation and mapping

    Science.gov (United States)

    Grieco, F.; Capra, L.; Groppelli, G.; Norini, G.

    2007-05-01

    The present study concerns the numerical modeling of debris avalanches on Nevado de Toluca Volcano (Mexico) using TITAN2D simulation software, and its application to create hazard maps. Nevado de Toluca is an andesitic to dacitic stratovolcano of Late Pliocene-Holocene age, located in central México near the cities of Toluca and México City; its past activity has endangered an area that today has more than 25 million inhabitants. The present work is based upon data collected during extensive field work aimed at producing the geological map of Nevado de Toluca at 1:25,000 scale. The activity of the volcano developed from 2.6 Ma until 10.5 ka with both effusive and explosive events; Nevado de Toluca has had long phases of inactivity characterized by erosion and emplacement of debris flow and debris avalanche deposits on its flanks. The largest epiclastic events in the history of the volcano are wide debris flows and debris avalanches that occurred between 1 Ma and 50 ka, during a prolonged hiatus in eruptive activity. Other minor events happened mainly during the most recent volcanic activity (less than 50 ka), characterized by magmatic and tectonic-induced instability of the summit dome complex. According to the most recent tectonic analysis, the active transtensive kinematics of the E-W Tenango Fault System had a strong influence on the preferential directions of the last three documented lateral collapses, which generated the Arroyo Grande and Zaguàn debris avalanche deposits towards the E and the Nopal debris avalanche deposit towards the W. Analysis of the data collected during the field work made it possible to create a detailed GIS database of the spatial and temporal distribution of debris avalanche deposits on the volcano. Flow models, performed with the TITAN2D software developed by GMFG at Buffalo, were based entirely upon the information stored in the geological database. The modeling software is built upon equations

  16. Stiffness Model of a 3-DOF Parallel Manipulator with Two Additional Legs

    Directory of Open Access Journals (Sweden)

    Guang Yu

    2014-10-01

    Full Text Available This paper investigates the stiffness modelling of a 3-DOF parallel manipulator with two additional legs. The stiffness model in six directions of the 3-DOF parallel manipulator with two additional legs is derived by performing condensation of DOFs for the joint connection and treatment of the fixed-end connections. Moreover, this modelling method is used to derive the stiffness model of the manipulator with zero/one additional legs. Two performance indices are given to compare the stiffness of the parallel manipulators with two additional legs with those of the manipulators with zero/one additional legs. The method not only can be used to derive the stiffness model of a redundant parallel manipulator, but also to model the stiffness of non-redundant parallel manipulators.

  17. Decision support model for assessing aquifer pollution hazard and prioritizing groundwater resources management in the wet Pampa plain, Argentina.

    Science.gov (United States)

    Lima, M Lourdes; Romanelli, Asunción; Massone, Héctor E

    2013-06-01

    This paper gives an account of the implementation of a decision support system for assessing aquifer pollution hazard and prioritizing subwatersheds for groundwater resources management in the southeastern Pampa plain of Argentina. The use of this system is demonstrated with an example from Dulce Stream Basin (1,000 km(2) encompassing 27 subwatersheds), which has high level of agricultural activities and extensive available data regarding aquifer geology. In the logic model, aquifer pollution hazard is assessed as a function of two primary topics: groundwater and soil conditions. This logic model shows the state of each evaluated landscape with respect to aquifer pollution hazard based mainly on the parameters of the DRASTIC and GOD models. The decision model allows prioritizing subwatersheds for groundwater resources management according to three main criteria including farming activities, agrochemical application, and irrigation use. Stakeholder participation, through interviews, in combination with expert judgment was used to select and weight each criterion. The resulting subwatershed priority map, by combining the logic and decision models, allowed identifying five subwatersheds in the upper and middle basin as the main aquifer protection areas. The results reasonably fit the natural conditions of the basin, identifying those subwatersheds with shallow water depth, loam-loam silt texture soil media and pasture land cover in the middle basin, and others with intensive agricultural activity, coinciding with the natural recharge area to the aquifer system. Major difficulties and some recommendations of applying this methodology in real-world situations are discussed.
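Index methods such as DRASTIC and GOD, on which the logic model is based, combine parameter ratings through fixed weights. A minimal sketch follows; the weights shown are the standard published DRASTIC weights, but the ratings are invented for illustration and are not values from the Dulce Stream Basin study:

```python
def weighted_index(ratings, weights):
    """Weighted linear index: sum of parameter rating * parameter weight."""
    assert ratings.keys() == weights.keys(), "same parameter set required"
    return sum(ratings[p] * weights[p] for p in ratings)

# Standard DRASTIC weights: Depth to water, net Recharge, Aquifer media,
# Soil media, Topography, Impact of vadose zone, hydraulic Conductivity.
DRASTIC_WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

# Hypothetical ratings (1 = low vulnerability, 10 = high) for one subwatershed.
ratings = {"D": 9, "R": 6, "A": 5, "S": 7, "T": 10, "I": 6, "C": 4}
score = weighted_index(ratings, DRASTIC_WEIGHTS)
```

Scores computed per subwatershed can then be ranked against the management criteria (farming activity, agrochemical application, irrigation use) to produce the priority map.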

  18. First look at changes in flood hazard in the Inter-Sectoral Impact Model Intercomparison Project ensemble

    Science.gov (United States)

    Dankers, Rutger; Arnell, Nigel W.; Clark, Douglas B.; Falloon, Pete D.; Fekete, Balázs M.; Gosling, Simon N.; Heinke, Jens; Kim, Hyungjun; Masaki, Yoshimitsu; Satoh, Yusuke; Stacke, Tobias; Wada, Yoshihide; Wisser, Dominik

    2014-03-01

    Climate change due to anthropogenic greenhouse gas emissions is expected to increase the frequency and intensity of precipitation events, which is likely to affect the probability of flooding into the future. In this paper we use river flow simulations from nine global hydrology and land surface models to explore uncertainties in the potential impacts of climate change on flood hazard at global scale. As an indicator of flood hazard we looked at changes in the 30-y return level of 5-d average peak flows under representative concentration pathway RCP8.5 at the end of this century. Not everywhere does climate change result in an increase in flood hazard: decreases in the magnitude and frequency of the 30-y return level of river flow occur at roughly one-third (20-45%) of the global land grid points, particularly in areas where the hydrograph is dominated by the snowmelt flood peak in spring. In most model experiments, however, an increase in flooding frequency was found in more than half of the grid points. The current 30-y flood peak is projected to occur in more than 1 in 5 y across 5-30% of land grid points. The large-scale patterns of change are remarkably consistent among impact models and even the driving climate models, but at local scale and in individual river basins there can be disagreement even on the sign of change, indicating large modeling uncertainty which needs to be taken into account in local adaptation studies.
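The 30-y return level used here as a hazard indicator is, in essence, a quantile of the distribution of block maxima. A toy version of the calculation, using a method-of-moments Gumbel fit (the impact-model studies use more careful extreme-value methods), might be:

```python
import math
import random

def gumbel_return_level(block_maxima, T):
    """T-year return level from a series of annual/block maxima via a
    method-of-moments Gumbel fit: z_T = mu - sigma*log(-log(1 - 1/T))."""
    n = len(block_maxima)
    mean = sum(block_maxima) / n
    var = sum((x - mean) ** 2 for x in block_maxima) / (n - 1)
    sigma = math.sqrt(6.0 * var) / math.pi        # Gumbel scale
    mu = mean - 0.5772156649 * sigma              # location (Euler-Mascheroni)
    return mu - sigma * math.log(-math.log(1.0 - 1.0 / T))

# Demo: synthetic maxima from a Gumbel(location=100, scale=20) distribution.
rng = random.Random(1)
maxima = [100.0 - 20.0 * math.log(-math.log(rng.uniform(1e-9, 1.0 - 1e-9)))
          for _ in range(20_000)]
z30 = gumbel_return_level(maxima, 30)   # true 30-y level is ~167.7
```

Changes in hazard like those discussed above correspond to shifts in this return level, or equivalently to changes in how often the historical 30-y level is exceeded.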

  19. A drought hazard assessment index based on the VIC-PDSI model and its application on the Loess Plateau, China

    Science.gov (United States)

    Zhang, Baoqing; Wu, Pute; Zhao, Xining; Wang, Yubao; Gao, Xiaodong; Cao, Xinchun

    2013-10-01

    Drought is a complex natural hazard that is poorly understood and difficult to assess. This paper describes a VIC-PDSI model approach to understanding drought, in which the Variable Infiltration Capacity (VIC) model was combined with the Palmer Drought Severity Index (PDSI). Simulated results obtained using the VIC model were used to replace the output of the more conventional two-layer bucket-type model for hydrological accounting, and a two-class-based procedure for calibrating the characteristic climate coefficient (K_j) was introduced to allow a more reliable computation of the PDSI. The VIC-PDSI model was used in conjunction with GIS technology to create a new drought assessment index (DAI) that provides a comprehensive overview of drought duration, intensity, frequency, and spatial extent. This new index was applied to drought hazard assessment across six subregions of the whole Loess Plateau. The results show that the DAI over the whole Loess Plateau ranged between 11 and 26 (a greater DAI value indicates a more severe drought hazard). The drought hazards in the upper reaches of the Yellow River were more severe than those in the middle reaches. The drought-prone regions over the study area were mainly concentrated in the Inner Mongolian small rivers and the Zuli and Qingshui River basins, while the drought hazards in the drainage area between Hekouzhen-Longmen and the Weihe River basin were relatively mild during 1971-2010. The most serious drought vulnerabilities were associated with the area around Lanzhou, Zhongning, and Yinchuan, where the development of water-saving irrigation is the most direct and effective way to defend against and reduce losses from drought. For the relatively humid regions, it will be necessary to establish rainwater harvesting systems, which could help to relieve the risk of water shortage and guarantee regional food security. Because the DAI considers the multiple characteristics of drought duration, intensity, frequency

  20. Digital elevation models in the marine domain: investigating the offshore tsunami hazard from submarine landslides

    Science.gov (United States)

    Tappin, David R.

    2015-04-01

    the resolution necessary to identify the hazard from landslides, particularly along convergent margins where this hazard is the greatest. Multibeam mapping of the deep seabed requires low-frequency sound sources that, because of their correspondingly low resolution, cannot produce the detail required to identify the finest-scale features. In addition, outside of most countries there are no repeat surveys that allow seabed changes to be identified; perhaps only Japan has such data. In the near future, as research budgets shrink and ship time becomes ever more expensive, new strategies will have to be used to make the best use of the vessels available. Remote AUV technology is almost certainly the answer, and should be increasingly utilised to map the seabed while the mother ship is better used to carry out other duties, such as sampling or seismic data acquisition. In the deep ocean this will have the advantage of acquiring higher-resolution data from high-frequency multibeams. This talk presents a number of projects that show the evolution of the use of MBES in mapping submarine landslides since the PNG tsunami. Data from PNG are presented, together with data from Japan, Hawaii and the NE Atlantic. New multibeam acquisition methodologies are also discussed.

  1. Modeling Lahar Hazard Zones for Eruption-Generated Lahars from Lassen Peak, California

    Science.gov (United States)

    Robinson, J. E.; Clynne, M. A.

    2010-12-01

    Lassen Peak, a high-elevation, seasonally snow-covered peak located within Lassen Volcanic National Park, has lahar deposits in several drainages that head on or near the lava dome. This suggests that these drainages are susceptible to future lahars. The majority of the recognized lahar deposits are related to the May 19 and 22, 1915 eruptions of Lassen Peak. These small-volume eruptions generated lahars and floods when an avalanche of snow and hot rock, and a pyroclastic flow, moved across the snow-covered upper flanks of the lava dome. Lahars flowed to the north down Lost Creek and Hat Creek. In Lost Creek, the lahars flowed up to 16 km downstream and deposited approximately 8.3 x 10^6 m^3 of sediment. This study uses geologic mapping of the 1915 lahar deposits as a guide for LAHARZ modeling to assist in the assessment of present-day susceptibility for lahars in drainages heading on Lassen Peak. The LAHARZ model requires a height-over-length (H/L) energy cone controlling the initiation point of a lahar. We chose an H/L cone with a slope of 0.3 that intersects the Earth's surface at the break in slope at the base of the volcanic dome. Typically, the snow pack reaches its annual maximum by May. Average and maximum May snow-water content, a depth of water equal to 2.1 m and 3.5 m respectively, were calculated from a local snow gauge. A potential volume for individual 1915 lahars was calculated using the deposit volume, the snow-water contents, and the areas stripped of snow by the avalanche and pyroclastic flow. The calculated individual lahars in Lost Creek ranged in size from 9 x 10^6 m^3 to 18.4 x 10^6 m^3. These volumes modeled in LAHARZ matched the 1915 lahars remarkably well, with the modeled flows ending within 4 km of the mapped deposits. We delineated six drainage basins that head on or near Lassen Peak with the highest potential for lahar hazards: Lost Creek, Hat Creek, Manzanita Creek, Mill Creek, Warner Creek, and Bailey Creek. We calculated the area of each
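The volume bookkeeping described above (snow-water freed over the stripped area, bulked with entrained sediment) and the LAHARZ-style area scaling can be sketched as follows. The stripped area and sediment fraction are assumed illustrative values, not figures from the study:

```python
def lahar_volume(stripped_area_m2, swe_depth_m, sediment_fraction=0.5):
    """Rough lahar volume: water released from the snowpack over the area
    stripped by the avalanche/pyroclastic flow, bulked up by an assumed
    volume fraction of entrained sediment. Illustrative only."""
    water = stripped_area_m2 * swe_depth_m
    return water / (1.0 - sediment_fraction)

def laharz_planimetric_area(volume_m3, c=200.0):
    """LAHARZ-style statistical scaling B = c * V**(2/3) for planimetric
    inundated area; c ~ 200 in the published calibration."""
    return c * volume_m3 ** (2.0 / 3.0)

# Average May snow-water content (2.1 m) over a hypothetical 3 km^2 stripped
# area gives a volume inside the 9-18.4 x 10^6 m^3 range quoted for Lost Creek.
v = lahar_volume(3.0e6, 2.1)
```

Because inundated area grows as V^(2/3), doubling the lahar volume increases the mapped hazard footprint by only about 59%.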

  2. Subduction zone and crustal dynamics of western Washington; a tectonic model for earthquake hazards evaluation

    Science.gov (United States)

    Stanley, Dal; Villaseñor, Antonio; Benz, Harley

    1999-01-01

    The Cascadia subduction zone is extremely complex in the western Washington region, involving local deformation of the subducting Juan de Fuca plate and complicated block structures in the crust. It has been postulated that the Cascadia subduction zone could be the source for a large thrust earthquake, possibly as large as M9.0. Large intraplate earthquakes from within the subducting Juan de Fuca plate beneath the Puget Sound region have accounted for most of the energy release in this century and future such large earthquakes are expected. Added to these possible hazards is clear evidence for strong crustal deformation events in the Puget Sound region near faults such as the Seattle fault, which passes through the southern Seattle metropolitan area. In order to understand the nature of these individual earthquake sources and their possible interrelationship, we have conducted an extensive seismotectonic study of the region. We have employed P-wave velocity models developed using local earthquake tomography as a key tool in this research. Other information utilized includes geological, paleoseismic, gravity, magnetic, magnetotelluric, deformation, seismicity, focal mechanism and geodetic data. Neotectonic concepts were tested and augmented through use of anelastic (creep) deformation models based on thin-plate, finite-element techniques developed by Peter Bird, UCLA. These programs model anelastic strain rate, stress, and velocity fields for given rheological parameters, variable crust and lithosphere thicknesses, heat flow, and elevation. Known faults in western Washington and the main Cascadia subduction thrust were incorporated in the modeling process. Significant results from the velocity models include delineation of a previously studied arch in the subducting Juan de Fuca plate. The axis of the arch is oriented in the direction of current subduction and asymmetrically deformed due to the effects of a northern buttress mapped in the velocity models. This

  3. System Dynamics Model to develop resilience management strategies for lifelines exposed to natural hazards

    Science.gov (United States)

    Pagano, Alessandro; Pluchinotta, Irene; Giordano, Raffaele; Vurro, Michele

    2016-04-01

    Resilience has recently become a key concept, and a crucial paradigm in the analysis of the impacts of natural disasters, mainly concerning Lifeline Systems (LS). Indeed, the traditional risk management approaches require a precise knowledge of all potential hazards and a full understanding of the interconnections among different infrastructures, based on past events and trends analysis. Nevertheless, due to the inner complexity of LS, their interconnectedness and the dynamic context in which they operate (i.e. technology, economy and society), it is difficult to gain a complete comprehension of the processes influencing vulnerabilities and threats. Therefore, resilience thinking addresses the complexities of large integrated systems and the uncertainty of future threats, emphasizing the absorbing, adapting and responsive behavior of the system. Resilience thinking approaches are focused on the capability of the system to deal with the unforeseeable. The increasing awareness of the role played by LS, has led governmental agencies and institutions to develop resilience management strategies. Risk prone areas, such as cities, are highly dependent on infrastructures providing essential services that support societal functions, safety, economic prosperity and quality of life. Among the LS, drinking water supply is critical for supporting citizens during emergency and recovery, since a disruption could have a range of serious societal impacts. A very well-known method to assess LS resilience is the TOSE approach. The most interesting feature of this approach is the integration of four dimensions: Technical, Organizational, Social and Economic. Such issues are all concurrent to the resilience level of an infrastructural system, and should be therefore quantitatively assessed. Several researches underlined that the lack of integration among the different dimensions, composing the resilience concept, may contribute to a mismanagement of LS in case of natural disasters

  4. Assessment of erosion hazard after recurrence fires with the RUSLE 3D MODEL

    Science.gov (United States)

    Vecín-Arias, Daniel; Palencia, Covadonga; Fernández Raga, María

    2016-04-01

    The objective of this work is to calculate whether there is more soil erosion after the recurrence of several forest fires in an area. To that end, an area of 22,130 ha was studied because it has a high frequency of fires. This area is located in the northwest of the Iberian Peninsula. The assessment of erosion hazard was calculated at several points in time using Geographic Information Systems (GIS). The area has been divided into several plots according to the number of times they have been burnt in the past 15 years. Because a detailed study of such a large area is complex and information is not available annually, it was necessary to select the most interesting moments. In August 2012 the most aggressive and extensive fire in the area occurred. The study therefore focused on the erosion hazard for 2011 and 2014, the dates before and after the 2012 fire for which orthophotos are available. The RUSLE3D (Revised Universal Soil Loss Equation) model was used to calculate maps of erosion losses. This model improves on the traditional USLE (Wischmeier and Smith, 1965) because it accounts for the influence of concavity/convexity (Renard et al., 1997) and improves the estimation of the slope factor LS (Renard et al., 1991). It is also one of the most commonly used models in the literature (Mitasova et al., 1996; Terranova et al., 2009). The tools used are free and accessible: the GIS "gvSIG" (http://www.gvsig.com/es), with metadata taken from the Spatial Data Infrastructure of Spain webpage (IDEE, 2016). However, the RUSLE model has many critics, some of whom suggest that it serves only to carry out comparisons between areas, not to calculate absolute soil-loss data. These authors argue that in field measurements the actual recovered eroded soil can amount to about one-third of the values obtained with the model (Šúri et al., 2002). The study of the area shows that the error detected by the critics could come from
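The underlying computation is the (R)USLE product of factors; RUSLE3D differs mainly in deriving the LS factor per grid cell from the upslope contributing area rather than from 1-D slope length. A sketch using the published LS formulation of Mitasova et al. (1996); the factor values in the test are illustrative, not data from this study:

```python
import math

def ls_factor_3d(upslope_area_m2_per_m, slope_deg, m=0.4, n=1.3):
    """RUSLE3D-style LS factor (after Mitasova et al., 1996):
    LS = (m+1) * (U/22.1)**m * (sin(beta)/0.09)**n, where U is the
    upslope contributing area per unit contour width and beta the slope."""
    beta = math.radians(slope_deg)
    return (m + 1.0) * (upslope_area_m2_per_m / 22.1) ** m \
        * (math.sin(beta) / 0.09) ** n

def rusle_soil_loss(R, K, LS, C, P=1.0):
    """Annual soil loss A = R*K*LS*C*P (rainfall erosivity, soil
    erodibility, slope length-steepness, cover management, practice)."""
    return R * K * LS * C * P
```

Post-fire hazard enters mainly through the cover factor C, which rises sharply once vegetation is burnt off while R, K and LS stay essentially fixed.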

  5. A Proportional Hazards Regression Model for the Subdistribution with Covariates-adjusted Censoring Weight for Competing Risks Data

    DEFF Research Database (Denmark)

    He, Peng; Eriksson, Frank; Scheike, Thomas H.

    2016-01-01

    With competing risks data, one often needs to assess the treatment and covariate effects on the cumulative incidence function. Fine and Gray proposed a proportional hazards regression model for the subdistribution of a competing risk under the assumption that the censoring distribution and the covariates are independent. Covariate-dependent censoring sometimes occurs in medical studies. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with proper adjustments for covariate-dependent censoring. We consider a covariate-adjusted weight function by fitting the Cox model for the censoring distribution and using the predictive probability for each individual. Our simulation study shows that the covariate-adjusted weight estimator is basically unbiased when the censoring time depends on the covariates, and the covariate-adjusted weight...
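    The covariate-adjusted weight generalizes the inverse-probability-of-censoring weighting (IPCW) used in the original Fine-Gray estimator. As a sketch of the unadjusted ingredient only, the lines below compute a Kaplan-Meier estimate of the censoring survival G(t) on toy data and turn it into a weight; the paper's adjustment would replace this with predictions from a Cox model fitted to the censoring times.

```python
def km_censoring_survival(times, censored):
    """Kaplan-Meier estimate G(t) of the censoring survival function.
    censored[i] is True when observation i is censored (an 'event' for G)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk, surv, G = len(times), 1.0, {}
    for i in order:
        if censored[i]:
            surv *= (n_at_risk - 1) / n_at_risk
        G[times[i]] = surv
        n_at_risk -= 1
    return G

times = [2.0, 3.0, 5.0, 7.0]           # toy follow-up times
censored = [False, True, False, True]  # True = censored observation
G = km_censoring_survival(times, censored)
# IPCW weight for the event at t=5 uses G at the last censoring time before it:
weight_at_5 = 1.0 / G[3.0]
print(G, weight_at_5)
```

    With four observations and one censoring before t=5, G drops to 2/3 there, so the event at t=5 receives weight 1.5.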

  6. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 wave-hazard projections: 1-year storm in Los Angeles County

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal Storm...

  7. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 wave-hazard projections: 1-year storm in San Diego County

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal Storm...

  8. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 wave-hazard projections: 20-year storm in San Diego County

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal Storm...

  9. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 wave-hazard projections: 100-year storm in San Diego County

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal Storm...

  10. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 wave-hazard projections: average conditions in Los Angeles County

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal Storm...

  11. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 wave-hazard projections: 20-year storm in Los Angeles County

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal Storm...

  13. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 wave-hazard projections: average conditions in San Diego County

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal Storm...

  15. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 2 wave-hazard projections: 100-year storm in Los Angeles County

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal Storm...


  2. Probabilistic Seismic Hazard Assessment for Taiwan

    Directory of Open Access Journals (Sweden)

    Yu-Ju Wang

    2016-06-01

    Full Text Available The Taiwan Earthquake Model (TEM) was established to assess the seismic hazard and risk for Taiwan by considering the social and economic impacts of various components from geology, seismology, and engineering. This paper presents the first version of the TEM probabilistic seismic hazard analysis for Taiwan, named TEM PSHA2015. The model adopts the source parameters of 38 seismogenic structures identified by TEM geologists. In addition to specific fault source-based categorization, seismic activities are categorized as shallow, subduction intraplate, and subduction interplate events. To evaluate the potential ground shaking resulting from each seismic source, the corresponding ground-motion prediction equations for crustal and subduction earthquakes are adopted. The highest hazard probability is evaluated to be in southwestern Taiwan and the Longitudinal Valley of eastern Taiwan. Among the special municipalities in the highly populated western Taiwan region, Taichung, Tainan, and New Taipei City are evaluated to have the highest hazard. Tainan has the highest seismic hazard for peak ground acceleration in the model based on TEM fault parameters. In terms of pseudo-spectral acceleration, Tainan has higher hazard over short spectral periods, whereas Taichung has higher hazard over long spectral periods. The analysis indicates the importance of earthquake-resistant designs for low-rise buildings in Tainan and high-rise buildings in Taichung.
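    The core PSHA calculation sums, over sources, the product of each source's activity rate and the probability that its ground motion exceeds a given level, usually with a lognormal ground-motion model. The sketch below is a generic two-source toy calculation; all rates, medians, and sigmas are invented and are not TEM PSHA2015 values.

```python
import math

def p_exceed_lognormal(x, median, sigma_ln):
    """P(PGA > x) for a lognormal ground-motion distribution."""
    z = (math.log(x) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# (annual rate, median PGA in g at the site, ln-space sigma) per source;
# all numbers are invented for illustration.
sources = [
    (0.02, 0.30, 0.6),  # nearby fault: rare but strong shaking
    (0.10, 0.10, 0.7),  # background seismicity: frequent but weak
]
x = 0.2  # PGA level of interest, in g
annual_rate = sum(r * p_exceed_lognormal(x, m, s) for r, m, s in sources)
p_50yr = 1.0 - math.exp(-annual_rate * 50.0)  # Poisson occurrence over 50 years
print(annual_rate, p_50yr)
```

    Repeating this over a grid of PGA levels yields the hazard curve from which design values (e.g., 10% in 50 years) are read off.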

  3. Parametric studies with an atmospheric diffusion model that assesses toxic fuel hazards due to the ground clouds generated by rocket launches

    Science.gov (United States)

    Stewart, R. B.; Grose, W. L.

    1975-01-01

    Parametric studies were made with a multilayer atmospheric diffusion model to place quantitative limits on the uncertainty of predicting ground-level toxic rocket-fuel concentrations. Exhaust distributions in the ground cloud, cloud stabilized geometry, atmospheric coefficients, the effects of exhaust plume afterburning of carbon monoxide (CO), assumed surface mixing-layer division in the model, and model sensitivity to different meteorological regimes were studied. Large-scale differences in ground-level predictions are quantitatively described. Cloud alongwind growth for several meteorological conditions is shown to be in error because of incorrect application of previous diffusion theory. In addition, rocket-plume calculations indicate that almost all of the rocket-motor carbon monoxide is afterburned to carbon dioxide (CO2), thus reducing toxic hazards due to CO. The afterburning is also shown to have a significant effect on cloud stabilization height and on ground-level concentrations of exhaust products.
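    The multilayer model generalizes the textbook steady-state Gaussian plume relation, which already shows why stabilization height matters so much for ground-level concentration. The sketch below is the standard elevated-release formula with made-up source and dispersion parameters, not the NASA model itself.

```python
import math

def ground_centerline_conc(Q, u, sigma_y, sigma_z, H):
    """C(x, 0, 0) in g/m^3 for a steady Gaussian plume: source strength Q (g/s),
    wind speed u (m/s), dispersion sigmas (m) at the downwind distance of
    interest, and effective release height H (m)."""
    return (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-H**2 / (2.0 * sigma_z**2))

# Illustrative values only: a 1 kg/s release in a 5 m/s wind.
c_low = ground_centerline_conc(Q=1000.0, u=5.0, sigma_y=200.0, sigma_z=80.0, H=150.0)
c_high = ground_centerline_conc(Q=1000.0, u=5.0, sigma_y=200.0, sigma_z=80.0, H=300.0)
print(c_low, c_high)  # doubling the stabilization height sharply cuts exposure
```

    The exponential dependence on H squared is why afterburning-driven changes in cloud stabilization height have such a large effect on ground-level concentrations.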

  4. An introduction to modeling longitudinal data with generalized additive models: applications to single-case designs.

    Science.gov (United States)

    Sullivan, Kristynn J; Shadish, William R; Steiner, Peter M

    2015-03-01

    Single-case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time in both the presence and absence of treatment. This article introduces a statistical technique for analyzing SCD data that has not been much used in psychological and educational research: generalized additive models (GAMs). In parametric regression, the researcher must choose a functional form to impose on the data, for example, that trend over time is linear. GAMs reverse this process by letting the data inform the choice of functional form. In this article we review the problem that trend poses in SCDs, discuss how current SCD analytic methods approach trend, describe GAMs as a possible solution, suggest a GAM model testing procedure for examining the presence of trend in SCDs, present a small simulation to show the statistical properties of GAMs, and illustrate the procedure on 3 examples of different lengths. Results suggest that GAMs may be very useful both as a form of sensitivity analysis for checking the plausibility of assumptions about trend and as a primary data analysis strategy for testing treatment effects. We conclude with a discussion of some problems with GAMs and some future directions for research on the application of GAMs to SCDs.
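    The GAM idea of letting the data choose the trend can be shown with the classical backfitting algorithm for an additive model. The sketch below uses a crude running-mean smoother in place of the penalized splines a real GAM package would use, on synthetic data; it is a conceptual illustration, not the authors' procedure.

```python
import math, random

def running_mean_smooth(x, r, k=7):
    """Smooth residuals r against x with a k-point moving average in x-order."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    fit = [0.0] * len(x)
    for pos, i in enumerate(order):
        lo, hi = max(0, pos - k // 2), min(len(x), pos + k // 2 + 1)
        window = [r[order[j]] for j in range(lo, hi)]
        fit[i] = sum(window) / len(window)
    return fit

def backfit(y, x1, x2, iters=20):
    """Fit y ~ a + f1(x1) + f2(x2) by backfitting, centering each smooth."""
    n = len(y)
    a = sum(y) / n
    f1, f2 = [0.0] * n, [0.0] * n
    for _ in range(iters):
        f1 = running_mean_smooth(x1, [y[i] - a - f2[i] for i in range(n)])
        m1 = sum(f1) / n
        f1 = [v - m1 for v in f1]
        f2 = running_mean_smooth(x2, [y[i] - a - f1[i] for i in range(n)])
        m2 = sum(f2) / n
        f2 = [v - m2 for v in f2]
    return a, f1, f2

random.seed(0)
x1 = [i / 50.0 for i in range(100)]         # time-like covariate
x2 = [random.random() for _ in range(100)]  # second covariate
y = [math.sin(2 * math.pi * u) + 2 * v + random.gauss(0, 0.1)
     for u, v in zip(x1, x2)]               # nonlinear trend + linear effect + noise
a, f1, f2 = backfit(y, x1, x2)
ybar = sum(y) / len(y)
sse = sum((y[i] - a - f1[i] - f2[i]) ** 2 for i in range(len(y)))
sst = sum((v - ybar) ** 2 for v in y)
print(sse < sst)
```

    No functional form was imposed: the sinusoidal trend in x1 is recovered by the smoother itself, which is the property that makes GAMs attractive for checking trend assumptions in SCDs.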

  5. Estimation of additive and dominance variance for reproductive traits from different models in Duroc purebred

    OpenAIRE

    Talerngsak Angkuraseranee

    2010-01-01

    The additive and dominance genetic variances of 5,801 Duroc reproductive and growth records were estimated using BLUPF90 PC-PACK. Estimates were obtained for number born alive (NBA), birth weight (BW), number weaned (NW), and weaning weight (WW). Data were analyzed using two mixed model equations. The first model included fixed effects and random effects identifying inbreeding depression, additive gene effects and permanent environmental effects. The second model was similar to the first model, but...

  6. A framework for modeling clustering in natural hazard catastrophe risk management and the implications for re/insurance loss perspectives

    Directory of Open Access Journals (Sweden)

    S. Khare

    2014-08-01

    Full Text Available In this paper, we present a novel framework for modelling clustering in natural hazard risk models. The framework is founded on physical principles, where large-scale oscillations in the physical system are the source of non-Poissonian (clustered) frequency behaviour. We focus on a particular mathematical implementation of the "Super-Cluster" methodology that we introduce. This mathematical framework has a number of advantages, including tunability to the problem at hand as well as the ability to model cross-event correlation. Using European windstorm data as an example, we show that the historical data give strong evidence of clustering. We then develop Poisson and clustered simulation models for the data, clearly demonstrating the superiority of the clustered model, which we have implemented using the Poisson-mixtures approach. We then discuss the implications of including clustering in models of prices on catXL contracts, one of the most commonly used mechanisms for transferring risk between primary insurers and reinsurers. This paper provides a number of new insights into the impact clustering has on modelled catXL contract prices. The simple model presented in this paper provides an insightful starting point for practitioners of natural hazard risk modelling.
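    The Poisson-mixtures idea can be shown in miniature: draw each year's event rate from a gamma distribution (a stand-in for the latent large-scale oscillation) and then draw counts Poisson given that rate; marginally the counts are negative-binomial and over-dispersed, unlike a plain Poisson. All parameter values below are illustrative, not fitted to windstorm data.

```python
import math, random

def poisson(lam):
    """Knuth's Poisson sampler (the stdlib has none); fine for modest rates."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def mean_var(xs):
    m = sum(xs) / len(xs)
    return m, sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

random.seed(1)
n = 5000
# Clustered: the annual rate itself is random (gamma-distributed latent state),
# so the marginal counts are negative-binomial.
clustered = [poisson(random.gammavariate(2.0, 2.0)) for _ in range(n)]
plain = [poisson(4.0) for _ in range(n)]  # same mean rate, no clustering

mc, vc = mean_var(clustered)
mp, vp = mean_var(plain)
print(vc / mc, vp / mp)  # dispersion index: well above 1 vs. close to 1
```

    The dispersion index (variance over mean) is the simple diagnostic that separates the two models on historical catalogues.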

  7. Flood Hazard Zonation by Combining Mod-Clark and HEC-RAS Models in Bustan Dam Basin, Golestan Province

    Directory of Open Access Journals (Sweden)

    Z. Parisay

    2014-12-01

    Full Text Available Flood is one of the most devastating phenomena, incurring casualties and property damage every year. Flood zonation is an efficient technique for flood management. The main goal of this research is flood hazard and risk zonation along a 21 km reach of the Gorganrud river in the Bustan dam watershed under two conditions: the present landuse condition and a planning scenario. To this end, a combination of a hydrologic model (distributed HEC-HMS with the Mod-Clark transform option) and a hydraulic model (HEC-RAS) was used. The inputs required to run the Mod-Clark module of HEC-HMS are gridded files of the river basin, curve number and rainfall in the SHG coordinate system and DSS format. In this research the input files were prepared using the Watershed Modeling System (WMS) at a cell size of 200 m. Since the Mod-Clark method requires rainfall data in radar (NEXRAD) format, the distributed rainfall map series with 15-minute time intervals prepared within the PCRaster GIS system were converted to DSS format using the asc2dss package. The curve number map was likewise converted to DSS format using HEC-GeoHMS. These DSS files were then substituted for the rainfall and curve number maps within WMS. After calibration and validation, the model was run for return periods of 2, 5, 10, 25, 50, 100 and 200 years under both the current landuse condition and the planning scenario. The simulated peak discharges, the geometric parameters of the river and cross sections (at 316 locations) prepared with the HEC-GeoRAS software, and roughness coefficients were used by the HEC-RAS software to simulate the hydraulic behavior of the river, and flood inundation maps were produced using GIS. The evaluation showed that, with a percent error in peak flow of less than 3.2%, the model performs well in peak flow simulation, but is not successful in volume estimation.
    The results of the flood zones revealed that from the total area in floodplain with

  8. Assessing End-Of-Supply Risk of Spare Parts Using the Proportional Hazard Model

    NARCIS (Netherlands)

    X. Li (Xishu); R. Dekker (Rommert); C. Heij (Christiaan); M. Hekimoğlu (Mustafa)

    2016-01-01

    textabstractOperators of long field-life systems like airplanes are faced with hazards in the supply of spare parts. If the original manufacturers or suppliers of parts end their supply, this may have large impacts on operating costs of firms needing these parts. Existing end-of-supply evaluation me

  9. Spatial Modelling of Urban Physical Vulnerability to Explosion Hazards Using GIS and Fuzzy MCDA

    Directory of Open Access Journals (Sweden)

    Yasser Ebrahimian Ghajari

    2017-07-01

    Full Text Available Most of the world’s population is concentrated in cities, making urban planning a significant issue for consideration by decision makers. Urban vulnerability is a major issue which arises in urban management, and is simply defined as how vulnerable various structures in a city are to different hazards. Reducing urban vulnerability and enhancing resilience are considered to be essential steps towards achieving urban sustainability. To date, a vast body of literature has focused on investigating urban systems’ vulnerabilities with regard to natural hazards. However, less attention has been paid to vulnerabilities resulting from man-made hazards. This study proposes to investigate the physical vulnerability of buildings in District 6 of Tehran, Iran, with respect to intentional explosion hazards. A total of 14 vulnerability criteria are identified according to the opinions of various experts, and standard maps for each of these criteria have been generated in a GIS environment. Ultimately, an ordered weighted averaging (OWA) technique was applied to generate vulnerability maps for different risk conditions. The results of the present study indicate that only about 25 percent of buildings in the study area have a low level of vulnerability under moderate risk conditions. Sensitivity analysis further illustrates the robustness of the results obtained. Finally, the paper concludes by arguing that local authorities must focus more on risk-reduction techniques in order to reduce physical vulnerability and achieve urban sustainability.
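    The OWA aggregation step is compact enough to show directly: unlike a weighted sum, the weights apply to rank positions of the sorted criterion scores, which is how OWA encodes "risk conditions" from pessimistic to optimistic. The cell scores below are toy values, not the study's criteria.

```python
def owa(scores, weights):
    """Ordered weighted average: weights apply to rank positions, not criteria."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ranked = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(weights, ranked))

cell = [0.9, 0.4, 0.6]  # standardized vulnerability scores for one building/cell

print(owa(cell, [1.0, 0.0, 0.0]))  # worst criterion dominates (pessimistic)
print(owa(cell, [0.0, 0.0, 1.0]))  # all criteria must be bad (optimistic)
print(owa(cell, [1/3, 1/3, 1/3]))  # plain average (neutral)
```

    Sweeping the weight vector between these extremes produces the family of vulnerability maps for different risk conditions.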

  10. Modelling risk in high hazard operations: integrating technical, organisational and cultural factors

    NARCIS (Netherlands)

    Ale, B.J.M.; Hanea, D.M.; Sillem, S.; Lin, P.H.; Van Gulijk, C.; Hudson, P.T.W.

    2012-01-01

    Recent disasters in high hazard industries such as Oil and Gas Exploration (The Deepwater Horizon) and Petrochemical production (Texas City) have been found to have causes that range from direct technical failures through organizational shortcomings right up to weak regulation and inappropriate comp

  13. Conveying Flood Hazard Risk Through Spatial Modeling: A Case Study for Hurricane Sandy-Affected Communities in Northern New Jersey

    Science.gov (United States)

    Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko

    2016-10-01

    The accurate forecast from Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exists at the community level to translate these forecasts into flood hazard levels on the ground at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be effectively used to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers: elevation and slope derived from LiDAR data and distances from streams and catch basins derived from aerial photography and field reconnaissance were used to create a spatial model that explained 55 % of the extent and depth of the flood during Hurricane Sandy. When a ponding layer was added to the previous model to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flood. The study concludes that fairly accurate maps can be created with readily available information and that it is possible to infer a great deal about risk of inundation at the property level, from flood hazard maps. The study goes on to conclude that local communities are encouraged to prepare for disasters, but in reality because of the existing Federal emergency management framework there is very little incentive to do so.
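    A Boolean overlay of this kind reduces to per-cell threshold tests combined with logical AND. The sketch below uses tiny made-up rasters and invented thresholds, not the study's LiDAR-derived layers.

```python
# Toy Boolean overlay: flag a cell as flood-prone when it is low, flat, and
# near drainage. Grids and thresholds are illustrative only.
elev = [[1.0, 0.6], [2.5, 0.4]]     # m above datum
slope = [[0.5, 0.3], [4.0, 0.2]]    # degrees
d_strm = [[30.0, 80.0], [200.0, 20.0]]  # m to nearest stream/catch basin

def flood_prone(i, j):
    return elev[i][j] < 1.5 and slope[i][j] < 2.0 and d_strm[i][j] < 100.0

hazard = [[flood_prone(i, j) for j in range(2)] for i in range(2)]
print(hazard)
```

    Adding the study's ponding layer would amount to one more Boolean input capturing depressions that fill and spill to neighbors.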

  15. [Proportional hazards model of birth intervals among marriage cohorts since the 1960s].

    Science.gov (United States)

    Otani, K

    1987-01-01

    With a view to investigating the possibility of an attitudinal change towards the timing of 1st and 2nd births, proportional hazards model analysis of the 1st and 2nd birth intervals and univariate life table analysis were both carried out. Results showed that love matches and conjugal families immediately after marriage are accompanied by a longer 1st birth interval than others, even after controlling for other independent variables. Marriage cohort analysis also shows a net effect on the relative risk of having a 1st birth. Marriage cohorts since the mid-1960s demonstrate a shorter 1st birth interval than the 1961-63 cohort. With regard to the 2nd birth interval, longer 1st birth intervals, arranged marriages, conjugal families immediately following marriage, and higher ages at 1st marriage of women tended to provoke a longer 2nd birth interval. There is no interaction between the 1st birth interval and marriage cohort. Once other independent variables were controlled, with the exception of the marriage cohorts of the early 1970s, the authors found no effect of marriage cohort on the relative risk of having a 2nd birth. This suggests that an attitudinal change towards the timing of births in this period was mainly restricted to that of a 1st birth. Fluctuations in the 2nd birth interval during the 1970-72 marriage cohort were scrutinized in detail. As a result, the authors found that conjugal families after marriage, wives with low educational status, women with husbands in white collar professions, women with white collar fathers, and wives with high age at 1st marriage who married during 1970-72 and had a 1st birth interval during 1972-74 suffered most from the pronounced rise in the 2nd birth interval. This might be due to the relatively high sensitivity to a change in socioeconomic status; the oil crisis occurring around the time of marriage and 1st birth induced a delay in the 2nd birth. The unanimous decrease in the 2nd birth interval among the 1973

  16. Product versus additive threshold models for analysis of reproduction outcomes in animal genetics.

    Science.gov (United States)

    David, I; Bodin, L; Gianola, D; Legarra, A; Manfredi, E; Robert-Granié, C

    2009-08-01

    The phenotypic observation of some reproduction traits (e.g., insemination success, interval from lambing to insemination) is the result of environmental and genetic factors acting on 2 individuals: the male and female involved in a mating couple. In animal genetics, the main approach (called additive model) proposed for studying such traits assumes that the phenotype is linked to a purely additive combination, either on the observed scale for continuous traits or on some underlying scale for discrete traits, of environmental and genetic effects affecting the 2 individuals. Statistical models proposed for studying human fecundability generally consider reproduction outcomes as the product of hypothetical unobservable variables. Taking inspiration from these works, we propose a model (product threshold model) for studying a binary reproduction trait that supposes that the observed phenotype is the product of 2 unobserved phenotypes, 1 for each individual. We developed a Gibbs sampling algorithm for fitting a Bayesian product threshold model including additive genetic effects and showed by simulation that it is feasible and that it provides good estimates of the parameters. We showed that fitting an additive threshold model to data that are simulated under a product threshold model provides biased estimates, especially for individuals with high breeding values. A main advantage of the product threshold model is that, in contrast to the additive model, it provides distinct estimates of fixed effects affecting each of the 2 unobserved phenotypes.
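    The product threshold idea can be simulated in a few lines: each animal has a latent liability, a probit threshold turns it into an unobserved binary phenotype, and only the product of the two phenotypes is observed. The liability values below are hypothetical, chosen for illustration.

```python
import math, random

def phi(x):
    """Standard normal CDF, via erfc."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def mating_outcome(eta_male, eta_female):
    """Observed success = product of two latent threshold phenotypes."""
    male_ok = random.random() < phi(eta_male)
    female_ok = random.random() < phi(eta_female)
    return int(male_ok and female_ok)

random.seed(2)
n = 20000
success = sum(mating_outcome(0.5, 0.0) for _ in range(n)) / n
expected = phi(0.5) * phi(0.0)  # product model's success probability
print(round(expected, 3), round(success, 3))
```

    An additive threshold model would instead put phi(eta_male + eta_female) on a single underlying scale, which is exactly the mismatch the paper shows can bias estimates for individuals with high breeding values.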

  17. Influence of Climate Change on Flood Hazard using Climate Informed Bayesian Hierarchical Model in Johnson Creek River

    Science.gov (United States)

    Zarekarizi, M.; Moradkhani, H.

    2015-12-01

    Extreme events have been shown to be affected by climate change, influencing hydrologic simulations for which stationarity is usually a main assumption. Studies have argued that this assumption leads to large bias in model estimates and consequently to higher flood hazard. Motivated by the importance of non-stationarity, we determined how the exceedance probabilities have changed over time in Johnson Creek River, Oregon. This could help estimate the probability of failure of a structure that was primarily designed to resist less likely floods according to common practice. We therefore built a climate-informed Bayesian hierarchical model in which non-stationarity is part of the modeling framework. Principal component analysis shows that the North Atlantic Oscillation (NAO), Western Pacific Index (WPI) and Eastern Asia (EA) indices most strongly affect streamflow in this river. We modeled flood extremes using the peaks-over-threshold (POT) method rather than the conventional annual maximum flood (AMF), mainly because the model can then be based on more information. We used available threshold selection methods to select a suitable threshold for the study area. To account for non-stationarity, model parameters vary through time with the climate indices. We developed several model scenarios and chose the one that best explained the variation in the data based on performance measures. We also estimated return periods under the non-stationarity condition. Results show that assuming stationarity could understate the flood hazard by up to a factor of four, increasing the probability of an in-stream structure being overtopped.
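    In a POT analysis, exceedances above the threshold follow a generalized Pareto distribution (GPD), and non-stationarity can enter by letting a GPD parameter depend on a climate index. The sketch below links the GPD scale to a NAO-like index through a log link; threshold, shape, rate and coefficients are all invented for illustration.

```python
import math

def gpd_exceed_prob(q, u, scale, shape):
    """P(X > q | X > u) for a generalized Pareto distribution over threshold u."""
    z = (q - u) / scale
    if shape == 0.0:
        return math.exp(-z)
    return max(0.0, 1.0 + shape * z) ** (-1.0 / shape)

def scale_with_index(nao, beta0=20.0, beta1=0.3):
    # log-link keeps the GPD scale positive as it varies with the index
    return math.exp(math.log(beta0) + beta1 * nao)

u, shape, rate_u = 100.0, 0.1, 2.0  # threshold (m3/s), GPD shape, exceedances/yr
rates = {}
for nao in (-1.0, 0.0, 1.0):
    p = gpd_exceed_prob(150.0, u, scale_with_index(nao), shape)
    rates[nao] = rate_u * p  # annual rate of floods above 150 m3/s
print(rates)
```

    Under these toy coefficients the annual exceedance rate of the same flood level shifts with the index, which is exactly why a fixed-return-period design level can be misleading under non-stationarity.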

  18. A semiparametric Bayesian proportional hazards model for interval censored data with frailty effects

    Directory of Open Access Journals (Sweden)

    Hölzel Dieter

    2009-02-01

    Full Text Available Abstract Background Multivariate analysis of interval censored event data based on classical likelihood methods is notoriously cumbersome, and likelihood inference for models which additionally include random effects is not available at all. Existing algorithms pose practical problems such as matrix inversion, slow convergence, and no assessment of statistical uncertainty. Methods MCMC procedures combined with imputation are used to implement hierarchical models for interval censored data within a Bayesian framework. Results Two examples from clinical practice demonstrate the handling of clustered interval censored event times as well as multilayer random effects for inter-institutional quality assessment. The software developed is called survBayes and is freely available at CRAN. Conclusion The proposed software supports the solution of complex analyses in many fields of clinical epidemiology as well as health services research.

  19. Modeling retrospective attribution of responsibility to hazard-managing institutions: an example involving a food contamination incident.

    Science.gov (United States)

    Johnson, Branden B; Hallman, William K; Cuite, Cara L

    2015-03-01

    Perceptions of institutions that manage hazards are important because they can affect how the public responds to hazard events. Antecedents of trust judgments have received far more attention than antecedents of attributions of responsibility for hazard events. We build upon a model of retrospective attribution of responsibility to individuals to examine these relationships regarding five classes of institutions that bear responsibility for food safety: producers (e.g., farmers), processors (e.g., packaging firms), watchdogs (e.g., government agencies), sellers (e.g., supermarkets), and preparers (e.g., restaurants). A nationally representative sample of 1,200 American adults completed an Internet-based survey in which a hypothetical scenario involving contamination of diverse foods with Salmonella served as the stimulus event. Perceived competence and good intentions of the institution moderately decreased attributions of responsibility. A stronger factor was whether an institution was deemed (potentially) aware of the contamination and free to act to prevent or mitigate it. Responsibility was rated higher the more aware and free the institution. This initial model for attributions of responsibility to impersonal institutions (as opposed to individual responsibility) merits further development. © 2014 Society for Risk Analysis.

  20. Concentration addition, independent action and generalized concentration addition models for mixture effect prediction of sex hormone synthesis in vitro

    DEFF Research Database (Denmark)

    Hadrup, Niels; Taxvig, Camilla; Pedersen, Mikael;

    2013-01-01

    , antagonism was observed for effects of Mixture 2 on this hormone. The mixtures contained chemicals exerting only limited maximal effects. This hampered prediction by the CA and IA models, whereas the GCA model could be used to predict a full dose response curve. Regarding effects on progesterone...
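    The two classical prediction rules named in the title can be sketched directly; the mixture fractions, EC50 values and effect levels below are hypothetical:

```python
def ca_ec(fractions, ecs):
    """Concentration addition: effect concentration of a mixture whose
    components have effect concentrations `ecs` and are present in
    concentration fractions `fractions` (summing to 1)."""
    return 1.0 / sum(p / ec for p, ec in zip(fractions, ecs))

def ia_effect(effects):
    """Independent action: combined effect fraction from the individual
    effect fractions of the components at their mixture concentrations."""
    prod = 1.0
    for e in effects:
        prod *= 1.0 - e
    return 1.0 - prod

# Two hypothetical chemicals mixed 50:50, with EC50s of 10 and 40 uM:
print(ca_ec([0.5, 0.5], [10.0, 40.0]))  # CA-predicted mixture EC50
print(ia_effect([0.2, 0.3]))            # IA-predicted combined effect
```

GCA generalises CA to chemicals with differing maximal effects, which is why it could still predict a full dose-response curve here.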

  1. A Fault-based Crustal Deformation Model for UCERF3 and Its Implication to Seismic Hazard Analysis

    Science.gov (United States)

    Zeng, Y.; Shen, Z.

    2012-12-01

    shear zone and northern Walker Lane. This implies a significant increase in seismic hazard in the eastern California and northern Walker Lane region, but decreased seismic hazard in the southern San Andreas area, relative to the current model used in the USGS 2008 seismic hazard map evaluation. Overall, the geodetic model suggests an increase in total regional moment rate of 24% compared with the UCERF2 model and the 150-yr California earthquake catalog. However, not all of the increase is seismic, so the seismic/aseismic slip-rate ratios are critical for future seismic hazard assessment.

  2. The application of a calibrated 3D ballistic trajectory model to ballistic hazard assessments at Upper Te Maari, Tongariro

    Science.gov (United States)

    Fitzgerald, R. H.; Tsunematsu, K.; Kennedy, B. M.; Breard, E. C. P.; Lube, G.; Wilson, T. M.; Jolly, A. D.; Pawson, J.; Rosenberg, M. D.; Cronin, S. J.

    2014-10-01

    On 6 August, 2012, Upper Te Maari Crater, Tongariro volcano, New Zealand, erupted for the first time in over one hundred years. Multiple vents were activated during the hydrothermal eruption, ejecting blocks up to 2.3 km and impacting ~ 2.6 km of the Tongariro Alpine Crossing (TAC) hiking track. Ballistic impact craters were mapped to calibrate a 3D ballistic trajectory model for the eruption. This was further used to inform future ballistic hazard. Orthophoto mapping revealed 3587 impact craters with a mean diameter of 2.4 m. However, field mapping of accessible regions indicated an average of at least four times more observable impact craters and a smaller mean crater diameter of 1.2 m. By combining the orthophoto and ground-truthed impact frequency and size distribution data, we estimate that approximately 13,200 ballistic projectiles were generated during the eruption. The 3D ballistic trajectory model and a series of inverse models were used to constrain the eruption directions, angles and velocities. When combined with eruption observations and geophysical observations, the model indicates that the blocks were ejected in five variously directed eruption pulses, in total lasting 19 s. The model successfully reproduced the mapped impact distribution using a mean initial particle velocity of 200 m/s with an accompanying average gas flow velocity over a 400 m radius of 150 m/s. We apply the calibrated model to assess ballistic hazard from the August eruption along the TAC. By taking the field mapped spatial density of impacts and an assumption that an average ballistic impact will cause serious injury or death (casualty) over an 8 m2 area, we estimate that the probability of casualty ranges from 1% to 16% along the affected track (assuming an eruption during the time of exposure). Future ballistic hazard and probabilities of casualty along the TAC are also assessed through application of the calibrated model. We model a magnitude larger eruption and illustrate
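    The casualty-probability step described above can be sketched under a common simplifying assumption that impacts follow a spatial Poisson process; the impact densities below are hypothetical, chosen only to bracket the reported 1%-16% range:

```python
import math

def casualty_probability(impacts_per_m2, lethal_area_m2=8.0):
    """Probability that at least one ballistic impact lands within the
    8 m^2 lethal area around a person, assuming impacts follow a spatial
    Poisson process (a simplification; the paper's exact formula is not
    given in the abstract)."""
    return 1.0 - math.exp(-impacts_per_m2 * lethal_area_m2)

# Hypothetical impact densities (per m^2) chosen to bracket the 1-16% range:
for density in (0.00126, 0.0218):
    print(f"{casualty_probability(density):.1%}")
```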

  3. GIS-Based Spatial Analysis and Modeling for Landslide Hazard Assessment: A Case Study in Upper Minjiang River Basin

    Institute of Scientific and Technical Information of China (English)

    FENG Wenlan; ZHOU Qigang; ZHANG Baolei; ZHOU Wancun; LI Ainong; ZHANG Haizhen; XIAN Wei

    2006-01-01

    By analyzing the topographic features of landslides that have occurred since the 1980s and the main land-cover types (including change information) in landslide-prone areas, the spatial distribution of landslide hazard in the upper Minjiang River Basin was modeled using GIS-based spatial analysis. The results showed that landslide occurrence in this region is closely related to topographic features: most areas with high hazard probability were deeply incised gorges. Most of the investigated landslides clustered in areas with elevations below 3,000 m, owing to fragile topographic conditions and intensive human disturbance. Land-cover type, including its change information, was likely an important environmental factor in triggering landslides. Destruction of vegetation, driven by population growth and its demands, increased the probability of landslides on steep slopes.

  4. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
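    For the generalized Pareto POT model named above, the hazard function has a simple closed form, h(x) = f(x)/S(x) = 1/(σ + ξx). A minimal numerical check (parameter values illustrative):

```python
import math

def gpd_hazard(x, shape, scale):
    """Hazard function of the generalized Pareto distribution,
    h(x) = f(x)/S(x) = 1/(scale + shape*x): decreasing in x for
    shape > 0 (heavy tails), constant for shape = 0 (exponential)."""
    return 1.0 / (scale + shape * x)

def gpd_sf(x, shape, scale):
    """GPD survival function, used to check h = -d(log S)/dx numerically."""
    if shape == 0.0:
        return math.exp(-x / scale)
    return (1.0 + shape * x / scale) ** (-1.0 / shape)

x, xi, sigma, eps = 2.0, 0.3, 1.5, 1e-6
numeric = -(math.log(gpd_sf(x + eps, xi, sigma))
            - math.log(gpd_sf(x, xi, sigma))) / eps
print(gpd_hazard(x, xi, sigma), numeric)
```

The decreasing hazard for ξ > 0 is exactly the behaviour that makes reliability and average return period sensitive to nonstationarity in the tail.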

  5. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    Science.gov (United States)

    Ross, G.

    2015-12-01

    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process where each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, which captures the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However, recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions, and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, a modern class of Bayesian prior distributions that allows for efficient inference over an infinite-dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as being either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
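    The standard parametric specification that the abstract argues against can be written down compactly; the parameter values and catalog below are illustrative:

```python
def etas_intensity(t, history, mu=0.2, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Conditional intensity of the standard parametric ETAS model:
    a background rate mu plus a modified-Omori contribution from each
    past event, with exponential magnitude productivity. All parameter
    values here are illustrative."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += K * 10.0 ** (alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate

catalog = [(0.0, 5.0), (1.0, 4.2), (1.5, 3.8)]  # (time in days, magnitude)
print(etas_intensity(2.0, catalog))   # elevated shortly after the events
print(etas_intensity(30.0, catalog))  # decays back toward the background mu
```

The nonparametric version replaces the fixed Omori-law kernel with a function learned under a Dirichlet-process prior.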

  6. Effects of additional food in a delayed predator-prey model.

    Science.gov (United States)

    Sahoo, Banshidhar; Poria, Swarup

    2015-03-01

    We examine the effects of supplying additional food to the predator in a gestation-delay-induced predator-prey system with habitat complexity. Additional food works in favor of predator growth in our model, and its presence reduces the predatory attack rate on prey. By supplying additional food, the predator population can be controlled. Taking the time delay as the bifurcation parameter, the stability of the coexisting equilibrium point is analyzed. Hopf bifurcation analysis is done with respect to time delay in the presence of additional food. The direction of Hopf bifurcations and the stability of bifurcated periodic solutions are determined by applying the normal form theory and the center manifold theorem. The qualitative dynamical behavior of the model is simulated using experimental parameter values. It is observed that fluctuations of the population size can be controlled either by supplying additional food suitably or by increasing the degree of habitat complexity. It is pointed out that Hopf bifurcation occurs in the system when the delay crosses some critical value. This critical value of delay strongly depends on the quality and quantity of the supplied additional food. Therefore, the variation of the predator population significantly affects the dynamics of the model. Model results are compared with experimental results, and the biological implications of the analytical findings are discussed in the conclusion section.
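    A toy version of such a gestation-delay model can be integrated with Euler's method and a history buffer; the functional forms and all parameters below are illustrative assumptions, not the paper's exact equations:

```python
def simulate(tau=1.0, A=0.2, dt=0.01, steps=3000):
    """Euler integration of a toy predator-prey system with gestation
    delay tau and additional food level A. Functional forms and all
    parameters are illustrative stand-ins for the paper's model:
    additional food lowers the attack rate and supports predator growth."""
    lag = int(tau / dt)
    r, k, a, e, d = 1.0, 10.0, 0.6, 0.3, 0.4
    attack = a / (1.0 + A)  # additional food reduces predation pressure
    x = [5.0] * (lag + 1)   # prey, with constant initial history
    y = [2.0] * (lag + 1)   # predator history
    for n in range(lag, lag + steps):
        dx = r * x[n] * (1 - x[n] / k) - attack * x[n] * y[n]
        # predator growth uses delayed (gestation) prey intake plus food A
        dy = e * (attack * x[n - lag] + A) * y[n - lag] - d * y[n]
        x.append(x[n] + dt * dx)
        y.append(y[n] + dt * dy)
    return x, y

x, y = simulate()
print(f"prey {x[-1]:.3f}, predator {y[-1]:.3f} after 30 time units")
```

Varying `tau` past its critical value in such a sketch is the numerical counterpart of the Hopf bifurcation analysis.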

  7. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Antoun, T; Harris, D; Lay, T; Myers, S C; Pasyanos, M E; Richards, P; Rodgers, A J; Walter, W R; Zucca, J J

    2008-02-11

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes a path by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas.

  8. Hazard Identification of the Offshore Three-phase Separation Process Based on Multilevel Flow Modeling and HAZOP

    DEFF Research Database (Denmark)

    Wu, Jing; Zhang, Laibin; Lind, Morten;

    2013-01-01

    HAZOP studies are widely accepted in chemical and petroleum industries as the method for conducting process hazard analysis related to design, maintenance and operation of the systems. Different tools have been developed to automate HAZOP studies. In this paper, a HAZOP reasoning method based...... on function-oriented modeling, Multilevel Flow Modeling (MFM), is extended with function roles. A graphical MFM editor, which is combined with the reasoning capabilities of the MFM Workbench developed by DTU is applied to automate HAZOP studies. The method is proposed to support the “brain-storming” sessions...

  9. Degree of multicollinearity and variables involved in linear dependence in additive-dominant models

    Directory of Open Access Journals (Sweden)

    Juliana Petrini

    2012-12-01

    Full Text Available The objective of this work was to assess the degree of multicollinearity and to identify the variables involved in linear dependence relations in additive-dominant models. Data of birth weight (n=141,567), yearling weight (n=58,124), and scrotal circumference (n=20,371) of Montana Tropical composite cattle were used. Diagnosis of multicollinearity was based on the variance inflation factor (VIF) and on the evaluation of the condition indexes and eigenvalues from the correlation matrix among explanatory variables. The first model studied (RM) included the fixed effect of dam age class at calving and the covariates associated to the direct and maternal additive and non-additive effects. The second model (R) included all the effects of the RM model except the maternal additive effects. Multicollinearity was detected in both models for all traits considered, with VIF values of 1.03 - 70.20 for RM and 1.03 - 60.70 for R. Collinearity increased with the increase of variables in the model and the decrease in the number of observations, and it was classified as weak, with condition index values between 10.00 and 26.77. In general, the variables associated with additive and non-additive effects were involved in multicollinearity, partially due to the natural connection between these covariables as fractions of the biological types in breed composition.
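    The two diagnostics used in the study, VIF and condition indexes from the correlation matrix, can be sketched on synthetic near-collinear data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
z = rng.normal(size=n)
# x1 and x2 are deliberately near-collinear, mimicking the natural
# connection between additive and non-additive covariables; x3 is not.
x1 = z + 0.05 * rng.normal(size=n)
x2 = z + 0.05 * rng.normal(size=n)
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

R = np.corrcoef(X, rowvar=False)   # correlation matrix of the regressors
vif = np.diag(np.linalg.inv(R))    # VIF_j = [R^-1]_jj
eig = np.linalg.eigvalsh(R)
condition_index = np.sqrt(eig.max() / eig)

print("VIF:", np.round(vif, 2))
print("condition indexes:", np.round(condition_index, 2))
```

Large VIFs for the two constructed variables and a condition index above 10 reproduce the "weak to moderate collinearity" pattern the paper describes.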

  10. Modeling Flood Hazard Zones at the Sub-District Level with the Rational Model Integrated with GIS and Remote Sensing Approaches

    Directory of Open Access Journals (Sweden)

    Daniel Asare-Kyei

    2015-07-01

    Full Text Available Robust risk assessment requires accurate flood intensity area mapping to allow for the identification of populations and elements at risk. However, available flood maps in West Africa lack spatial variability while global datasets have resolutions too coarse to be relevant for local scale risk assessment. Consequently, local disaster managers are forced to use traditional methods such as watermarks on buildings and media reports to identify flood hazard areas. In this study, remote sensing and Geographic Information System (GIS) techniques were combined with hydrological and statistical models to delineate the spatial limits of flood hazard zones in selected communities in Ghana, Burkina Faso and Benin. The approach involves estimating peak runoff concentrations at different elevations and then applying statistical methods to develop a Flood Hazard Index (FHI). Results show that about half of the study areas fall into high intensity flood zones. Empirical validation using statistical confusion matrix and the principles of Participatory GIS show that flood hazard areas could be mapped at an accuracy ranging from 77% to 81%. This was supported with local expert knowledge which accurately classified 79% of communities deemed to be highly susceptible to flood hazard. The results will assist disaster managers to reduce the risk to flood disasters at the community level where risk outcomes are first materialized.
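    A minimal sketch of the peak-runoff step, assuming the standard SI form of the rational method and a simple min-max FHI normalisation (the paper's actual statistical construction is richer); all input values are hypothetical:

```python
def rational_peak_runoff(c, intensity_mm_per_h, area_km2):
    """Rational-method peak discharge Q = 0.278 * C * i * A (m^3/s),
    with runoff coefficient C, rainfall intensity i in mm/h and catchment
    area A in km^2; the factor 0.278 reconciles the mixed units."""
    return 0.278 * c * intensity_mm_per_h * area_km2

def flood_hazard_index(q, q_min, q_max):
    """Min-max normalisation of peak runoff into a 0-1 index, a simple
    stand-in for the paper's statistical FHI construction."""
    return (q - q_min) / (q_max - q_min)

q = rational_peak_runoff(c=0.65, intensity_mm_per_h=80.0, area_km2=12.5)
print(q, flood_hazard_index(q, q_min=50.0, q_max=250.0))
```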

  11. Spatial Modelling of Urban Physical Vulnerability to Explosion Hazards Using GIS and Fuzzy MCDA

    OpenAIRE

    Yasser Ebrahimian Ghajari; Ali Asghar Alesheikh; Mahdi Modiri; Reza Hosnavi; Morteza Abbasi

    2017-01-01

    Most of the world’s population is concentrated in accumulated spaces in the form of cities, making the concept of urban planning a significant issue for consideration by decision makers. Urban vulnerability is a major issue which arises in urban management, and is simply defined as how vulnerable various structures in a city are to different hazards. Reducing urban vulnerability and enhancing resilience are considered to be essential steps towards achieving urban sustainability. To date, a va...

  12. Artificial geochemical barriers for additional recovery of non-ferrous metals and reduction of ecological hazard from the mining industry waste.

    Science.gov (United States)

    Chanturiya, Valentine; Masloboev, Vladimir; Makarov, Dmitriy; Mazukhina, Svetlana; Nesterov, Dmitriy; Men'shikov, Yuriy

    2011-01-01

    Laboratory tests and physical-chemical modeling have determined that mixtures of activated silica and carbonatite, and of serpophite and carbonatite, show considerable promise for developing artificial geochemical barriers. The obtained average contents of nickel and copper deposited on geochemical barriers in the formed mining-induced ores are acceptable for subsequent cost-efficient processing using either pyro- or hydrometallurgical methods. Some tests of geochemical barriers have been carried out, involving the use of polluted water in the impact zone of the "Kol'skaya GMK" JSC. The possibility of purifying water of heavy metals down to the MAC level for fishery water bodies has been demonstrated.

  13. ADDITIVE-MULTIPLICATIVE MODEL FOR RISK ESTIMATION IN THE PRODUCTION OF ROCKET AND SPACE TECHNICS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2014-10-01

    Full Text Available For the first time we have developed a general additive-multiplicative model of risk estimation (to estimate the probabilities of risk events). In the two-level system, risk estimates are combined additively at the lower level and multiplicatively at the top. The additive-multiplicative model was used for risk estimation for (1) implementation of innovative projects at universities (with external partners), (2) the production of new innovative products, and (3) projects for the creation of rocket and space equipment.
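    One plausible reading of the two-level combination rule can be sketched as follows; the grouping and risk scores are hypothetical, and the exact aggregation formula is an assumption since the abstract does not give it:

```python
def risk_probability(groups):
    """Two-level additive-multiplicative risk estimate: elementary risk
    scores are summed within each group (lower level, additive), and the
    group risks are then combined as independent factors (top level,
    multiplicative). This aggregation rule is one plausible reading of
    the abstract, which does not state the exact formula."""
    p_no_event = 1.0
    for scores in groups:
        p_group = min(sum(scores), 1.0)  # additive aggregation, capped at 1
        p_no_event *= 1.0 - p_group      # multiplicative aggregation
    return 1.0 - p_no_event              # overall probability of a risk event

# Hypothetical project with three risk groups (e.g. technical, financial,
# organisational), each holding elementary risk estimates:
print(risk_probability([[0.05, 0.10], [0.02, 0.08], [0.20]]))
```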

  14. Learning from Nature - Mapping of Complex Hydrological and Geomorphological Process Systems for More Realistic Modelling of Hazard-related Maps

    Science.gov (United States)

    Chifflard, Peter; Tilch, Nils

    2010-05-01

    Introduction Hydrological or geomorphological processes in nature are often very diverse and complex. This is partly due to the regional characteristics which vary over time and space, as well as changeable process-initiating and -controlling factors. Despite being aware of this complexity, such aspects are usually neglected in the modelling of hazard-related maps for several reasons. But particularly when it comes to creating more realistic maps, this would be an essential component to consider. The first important step towards solving this problem would be to collect data relating to regional conditions which vary over time and geographical location, along with indicators of complex processes. Data should be acquired promptly during and after events, and subsequently digitally combined and analysed. Study area In June 2009, considerable damage occurred in the residential area of Klingfurth (Lower Austria) as a result of great pre-event wetness and repeated heavy rainfall, leading to flooding, debris flow deposit and gravitational mass movement. One of the causes is the fact that the meso-scale watershed (16 km²) of the Klingfurth stream is characterised by adverse geological and hydrological conditions. Additionally, the river system network with its discharge concentration within the residential zone contributes considerably to flooding, particularly during excessive rainfall across the entire region, as the flood peaks from different parts of the catchment area are superposed. First results of mapping Hydro(geo)logical surveys across the entire catchment area have shown that over 600 gravitational mass movements of various types and stages have occurred. 516 of those have acted as a bed load source, while 325 mass movements had not reached the final stage yet and could thus supply bed load in the future. It should be noted that large mass movements in the initial or intermediate stage were predominantly found in clayey-silty areas and weathered material

  15. Gaussian mixture models for measuring local change down-track in LWIR imagery for explosive hazard detection

    Science.gov (United States)

    Spain, Christopher J.; Anderson, Derek T.; Keller, James M.; Popescu, Mihail; Stone, Kevin E.

    2011-06-01

    Burying objects below the ground can potentially alter their thermal properties. Moreover, there is often soil disturbance associated with recently buried objects. An intensity video frame image generated by an infrared camera in the medium and long wavelengths often locally varies in the presence of buried explosive hazards. Our approach to automatically detecting these anomalies is to estimate a background model of the image sequence. Pixel values that do not conform to the background model may represent local changes in thermal or soil signature caused by buried objects. Herein, we present a Gaussian mixture model-based technique to estimate the statistical model of background pixel values. The background model is used to detect anomalous pixel values on the road while a vehicle is moving. Foreground pixel confidence values are projected into the UTM coordinate system and a UTM confidence map is built. Different operating levels are explored and the connected component algorithm is then used to extract islands that are subjected to size, shape and orientation filters. We are currently using this approach as a feature in a larger multi-algorithm fusion system. However, in this article we also present results for using this algorithm as a stand-alone detector algorithm in order to further explore its value in detecting buried explosive hazards.
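    A single-Gaussian simplification of the per-pixel background model can illustrate the idea (the paper uses a full mixture per pixel); the synthetic frames and anomaly below are illustrative:

```python
import numpy as np

class PixelGaussianBackground:
    """Running per-pixel Gaussian background model, a single-component
    simplification of the paper's Gaussian mixture approach. Pixels more
    than k standard deviations from the background mean are flagged as
    foreground (candidate local thermal/soil-signature change)."""

    def __init__(self, first_frame, alpha=0.05, k=3.0):
        self.mean = first_frame.astype(float)
        self.var = np.full(first_frame.shape, 25.0)  # initial variance guess
        self.alpha, self.k = alpha, k

    def update(self, frame):
        d = frame.astype(float) - self.mean
        foreground = d * d > (self.k ** 2) * self.var
        self.mean += self.alpha * d                  # exponentially weighted
        self.var = (1 - self.alpha) * self.var + self.alpha * d * d
        return foreground

rng = np.random.default_rng(0)
model = PixelGaussianBackground(rng.normal(100.0, 2.0, size=(20, 20)))
for _ in range(30):                                  # burn in on background
    model.update(rng.normal(100.0, 2.0, size=(20, 20)))
frame = rng.normal(100.0, 2.0, size=(20, 20))
frame[5:8, 5:8] += 40.0                              # synthetic warm anomaly
mask = model.update(frame)
print("anomaly pixels flagged:", int(mask[5:8, 5:8].sum()), "of 9")
```

In the paper's pipeline these foreground confidences would then be projected into UTM coordinates and passed through connected-component and shape filters.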

  16. Landslide Hazard Assessment and Mapping in the Guil Catchment (Queyras, Southern French Alps): From Landslide Inventory to Susceptibility Modelling

    Science.gov (United States)

    Roulleau, Louise; Bétard, François; Carlier, Benoît; Lissak, Candide; Fort, Monique

    2016-04-01

    Landslides are common natural hazards in the Southern French Alps, where they may affect human lives and cause severe damages to infrastructures. As a part of the SAMCO research project dedicated to risk evaluation in mountain areas, this study focuses on the Guil river catchment (317 km2), Queyras, to assess landslide hazard poorly studied until now. In that area, landslides are mainly occasional, low amplitude phenomena, with limited direct impacts when compared to other hazards such as floods or snow avalanches. However, when interacting with floods during extreme rainfall events, landslides may have indirect consequences of greater importance because of strong hillslope-channel connectivity along the Guil River and its tributaries (i.e. positive feedbacks). This specific morphodynamic functioning reinforces the need to have a better understanding of landslide hazards and their spatial distribution at the catchment scale to prevent local population from disasters with multi-hazard origin. The aim of this study is to produce a landslide susceptibility mapping at 1:50 000 scale as a first step towards global estimation of landslide hazard and risk. The three main methodologies used for assessing landslide susceptibility are qualitative (i.e. expert opinion), deterministic (i.e. physics-based models) and statistical methods (i.e. probabilistic models). Due to the rapid development of geographical information systems (GIS) during the last two decades, statistical methods are today widely used because they offer a greater objectivity and reproducibility at large scales. Among them, multivariate analyses are considered as the most robust techniques, especially the logistic regression method commonly used in landslide susceptibility mapping. However, this method like others is strongly dependent on the accuracy of the input data to avoid significant errors in the final results. 
In particular, a complete and accurate landslide inventory is required before the modelling
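    The logistic-regression technique highlighted above can be sketched with plain gradient descent on a synthetic inventory; the predictor names and coefficients are invented for illustration:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-descent logistic regression, the multivariate
    statistical technique the study favours for susceptibility mapping.
    X holds standardised predictor columns; y is 1 where a landslide
    was inventoried and 0 elsewhere. All data below are synthetic."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)       # average log-loss gradient
    return w

# Synthetic stand-in for an inventory: steeper and wetter cells fail more.
rng = np.random.default_rng(1)
slope = rng.uniform(0, 45, 1000)   # slope angle (degrees)
wetness = rng.uniform(0, 1, 1000)  # wetness index (rescaled)
true_logit = 0.12 * slope + 2.0 * wetness - 4.0
y = (rng.uniform(size=1000) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

X = np.column_stack([slope, wetness])
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardise for stable descent
w = fit_logistic(X, y)
print("intercept and standardised coefficients:", w)
```

The fitted probabilities over a predictor grid would then be classed into susceptibility levels for the 1:50,000 map.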

  17. A hazard-based duration model for analyzing crossing behavior of cyclists and electric bike riders at signalized intersections.

    Science.gov (United States)

    Yang, Xiaobao; Huan, Mei; Abdel-Aty, Mohamed; Peng, Yichuan; Gao, Ziyou

    2015-01-01

    This paper presents a hazard-based duration approach to investigate riders' waiting times, violation hazards, associated risk factors, and their differences between cyclists and electric bike riders at signalized intersections. A total of 2322 two-wheeled riders approaching the intersections during red light periods were observed in Beijing, China. The data were classified into censored and uncensored data to distinguish between safe crossing and red-light running behavior. The results indicated that the red-light crossing behavior of most riders was dependent on waiting time. They were inclined to terminate waiting and run against the traffic light as waiting duration increased. Over half of the observed riders could not endure waiting 49 s or longer, while 25% could endure 97 s or longer. Rider type, gender, waiting position, conformity tendency and crossing traffic volume were identified to have significant effects on riders' waiting times and violation hazards. Electric bike riders were found to be more sensitive than cyclists to external risk factors such as other riders' crossing behavior and crossing traffic volume. Moreover, unobserved heterogeneity was examined in the proposed models. The findings of this paper explain when and why cyclists and electric bike riders run against the red light at intersections. The results are useful for traffic design and management agencies seeking to implement strategies that enhance the safety of riders.
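    A drastically simplified version of the censored-duration idea, assuming a constant (exponential) violation hazard rather than the paper's richer duration models, has a closed-form estimate; the observations below are hypothetical:

```python
def exponential_hazard_mle(times, event):
    """Closed-form MLE of a constant violation hazard from censored
    waiting times: riders who crossed on red are events (uncensored);
    riders still waiting when the light turned green are right-censored.
    hazard = (number of events) / (total observed waiting time). A
    constant hazard is a simplification of the paper's duration models."""
    return sum(event) / sum(times)

# Hypothetical observations: (waiting time in s, 1 = ran the red light)
times = [10, 30, 49, 60, 97, 97, 120, 15]
event = [1,  1,  1,  0,  0,  1,  0,   1]
h = exponential_hazard_mle(times, event)
print(f"{h:.4f} violations per second of waiting")
```

The paper's finding that the hazard rises with waiting time is precisely what a constant-hazard model cannot capture, motivating its duration-dependent specifications.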

  18. Between and beyond additivity and non-additivity; the statistical modelling of genotype by environment interaction in plant breeding.

    NARCIS (Netherlands)

    Eeuwijk, van F.A.

    1996-01-01

    In plant breeding it is a common observation to see genotypes react differently to environmental changes. This phenomenon is called genotype by environment interaction. Many statistical approaches for analysing genotype by environment interaction rely heavily on the analysis of variance model. Genot

  19. Between and beyond additivity and non-additivity : the statistical modelling of genotype by environment interaction in plant breeding

    NARCIS (Netherlands)

    Eeuwijk, van F.A.

    1996-01-01

    In plant breeding it is a common observation to see genotypes react differently to environmental changes. This phenomenon is called genotype by environment interaction. Many statistical approaches for analysing genotype by environment interaction rely heavily on the analysis of variance model.

  20. An original traffic additional emission model and numerical simulation on a signalized road

    Science.gov (United States)

    Zhu, Wen-Xing; Zhang, Jing-Yu

    2017-02-01

    Based on the VSP (Vehicle Specific Power) model, real traffic emissions were theoretically classified into two parts: basic emission and additional emission. An original additional emission model was presented to calculate a vehicle's emissions due to signal control effects. A car-following model was developed and used to describe traffic behavior, including cruising, accelerating, decelerating and idling at a signalized intersection. Simulations were conducted under two situations: a single intersection and two adjacent intersections with their respective control policies. Results are in good agreement with the theoretical analysis. It is also shown that the additional emission model may be used to design signal control policies in modern traffic systems to mitigate serious environmental problems.
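    The VSP quantity underlying the emission split can be computed with a commonly cited light-duty parameterisation (the paper's exact coefficients are not given in the abstract):

```python
def vsp(v, a, grade=0.0):
    """Vehicle Specific Power in kW/tonne using a commonly cited light-duty
    parameterisation (after Jimenez-Palacios); v in m/s, a in m/s^2.
    In the spirit of the paper, VSP while cruising gives the basic
    emission part, and the extra VSP accumulated while decelerating,
    idling and re-accelerating at a red light gives the additional part."""
    return v * (1.1 * a + 9.81 * grade + 0.132) + 0.000302 * v ** 3

for state, (v, a) in {"idling": (0.0, 0.0), "cruising": (15.0, 0.0),
                      "accelerating": (8.0, 1.5)}.items():
    print(f"{state}: VSP = {vsp(v, a):.2f} kW/tonne")
```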

  1. Estimating the phenology of elk brucellosis transmission with hierarchical models of cause-specific and baseline hazards

    Science.gov (United States)

    Cross, Paul C.; Maichak, Eric J.; Rogerson, Jared D.; Irvine, Kathryn M.; Jones, Jennifer D; Heisey, Dennis M.; Edwards, William H.; Scurlock, Brandon M.

    2015-01-01

    Understanding the seasonal timing of disease transmission can lead to more effective control strategies, but the seasonality of transmission is often unknown for pathogens transmitted directly. We inserted vaginal implant transmitters (VITs) in 575 elk (Cervus elaphus canadensis) from 2006 to 2014 to assess when reproductive failures (i.e., abortions or stillbirths) occur, which is the primary transmission route of Brucella abortus, the causative agent of brucellosis in the Greater Yellowstone Ecosystem. Using a survival analysis framework, we developed a Bayesian hierarchical model that simultaneously estimated the total baseline hazard of a reproductive event as well as its 2 mutually exclusive parts (abortions or live births). Approximately 16% (95% CI = 0.10, 0.23) of the pregnant seropositive elk had reproductive failures, whereas 2% (95% CI = 0.01, 0.04) of the seronegative elk had probable abortions. Reproductive failures could have occurred as early as 13 February and as late as 10 July, peaking from March through May. Model results suggest that less than 5% of likely abortions occurred after 6 June each year and abortions were approximately 5 times more likely in March, April, or May compared to February or June. In western Wyoming, supplemental feeding of elk begins in December and ends during the peak of elk abortions and brucellosis transmission (i.e., Mar and Apr). Years with more snow may enhance elk-to-elk transmission on supplemental feeding areas because elk are artificially aggregated for the majority of the transmission season. Elk-to-cattle transmission will depend on the transmission period relative to the end of the supplemental feeding season, elk seroprevalence, population size, and the amount of commingling. Our statistical approach allowed us to estimate the probability density function of different event types over time, which may be applicable to other cause-specific survival analyses. It is often challenging to assess the
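    A simplified frequentist analogue of the cause-specific estimation, discrete-time hazards with mutually exclusive outcomes, can be sketched as follows; the VIT records are invented:

```python
from collections import Counter

def cause_specific_hazards(records, n_periods):
    """Discrete-time, nonparametric cause-specific hazard estimates:
    for each period, hazard_j = (events of cause j) / (animals at risk).
    records are (period, cause) pairs with cause in {'abortion', 'birth',
    'censored'} -- a simplified frequentist analogue of the paper's
    Bayesian hierarchical model."""
    at_risk = len(records)
    hazards = []
    for t in range(n_periods):
        exits = [cause for period, cause in records if period == t]
        counts = Counter(exits)
        hazards.append({cause: (counts[cause] / at_risk if at_risk else 0.0)
                        for cause in ("abortion", "birth")})
        at_risk -= len(exits)  # censored animals also leave the risk set
    return hazards

# Hypothetical VIT records: (month index from February, outcome)
records = [(1, "abortion"), (1, "birth"), (2, "abortion"), (2, "birth"),
           (2, "birth"), (3, "birth"), (3, "censored"), (0, "abortion")]
for t, h in enumerate(cause_specific_hazards(records, 4)):
    print(t, h)
```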

  2. Seismic hazard of the Kivu rift (western branch, East African Rift system): new neotectonic map and seismotectonic zonation model

    Science.gov (United States)

    Delvaux, Damien; Mulumba, Jean-Luc; Sebagenzi Mwene Ntabwoba, Stanislas; Fiama Bondo, Silvanos; Kervyn, François; Havenith, Hans-Balder

    2017-04-01

    The first detailed probabilistic seismic hazard assessment has been performed for the Kivu and northern Tanganyika rift region in Central Africa. This region, which forms the central part of the Western Rift Branch, is one of the most seismically active parts of the East African rift system. It was already integrated in large scale seismic hazard assessments, but here we defined a finer zonation model with 7 different zones representing the lateral variation of the geological and geophysical setting across the region. In order to build the new zonation model, we compiled homogeneous cross-border geological, neotectonic and seismotectonic maps over the central part of East D.R. Congo, SW Uganda, Rwanda, Burundi and NW Tanzania and defined a new neotectonic scheme. The seismic risk assessment is based on a new earthquake catalogue, compiled on the basis of various local and global earthquake catalogues. The use of macroseismic epicenters determined from felt earthquakes allowed us to extend the time range back to the beginning of the 20th century, spanning 126 years, with 1068 events. The magnitudes have been homogenized to Mw and aftershocks removed. From this initial catalogue, a catalogue of 359 events from 1956 to 2015 and with M > 4.4 has been extracted for the seismic hazard assessment. The seismotectonic zonation includes 7 seismic source areas that have been defined on the basis of the regional geological structure, neotectonic fault systems, basin architecture and distribution of thermal springs and earthquake epicenters. The Gutenberg-Richter seismic hazard parameters were determined using both the least square linear fit and the maximum likelihood method (Kijko & Smit aue program). Seismic hazard maps have been computed with the Crisis 2012 software using 3 different attenuation laws. We obtained higher PGA values (475-year return period) for the Kivu rift region than the previous estimates (Delvaux et al., 2016).
They vary laterally in function of the tectonic
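    The maximum-likelihood (Aki-Utsu) estimate of the Gutenberg-Richter b-value used in assessments like this one can be sketched in a few lines. The magnitudes below are hypothetical, and the Kijko & Smit program itself is not reproduced; this only illustrates the standard estimator for a catalogue truncated at a completeness magnitude:

```python
import math

def gutenberg_richter_mle(mags, m_min):
    """Aki-Utsu maximum-likelihood estimate of the G-R b-value.

    mags  : magnitudes with M >= m_min (declustered catalogue)
    m_min : completeness magnitude of the catalogue
    """
    mean_m = sum(mags) / len(mags)
    b = math.log10(math.e) / (mean_m - m_min)
    # a-value from the total count: log10 N(M >= m_min) = a - b * m_min
    a = math.log10(len(mags)) + b * m_min
    return a, b

# Hypothetical magnitudes above a completeness threshold of Mw 4.4
mags = [4.5, 4.6, 4.8, 5.0, 4.5, 4.7, 5.3, 4.9, 4.6, 5.1]
a, b = gutenberg_richter_mle(mags, 4.4)
```

The least-squares alternative mentioned in the record would instead fit log10 N(M) against M on binned counts; the MLE above is usually preferred for small catalogues.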

  3. Incorporating shape constraints in generalized additive modelling of the height-diameter relationship for Norway spruce

    Directory of Open Access Journals (Sweden)

    Natalya Pya

    2016-02-01

    Full Text Available Background: Measurements of tree heights and diameters are essential in forest assessment and modelling. Tree heights are used for estimating timber volume, site index and other important variables related to forest growth and yield, succession and carbon budget models. However, the diameter at breast height (dbh) can be obtained more accurately, and at lower cost, than total tree height. Hence, generalized height-diameter (h-d) models that predict tree height from dbh, age and other covariates are needed. For a more flexible but biologically plausible estimation of covariate effects we use shape-constrained generalized additive models as an extension of existing h-d model approaches. We use causal site parameters such as an index of aridity to enhance the generality and causality of the models and to enable predictions under projected changing climatic conditions. Methods: We develop unconstrained generalized additive models (GAM) and shape-constrained generalized additive models (SCAM) for investigating the possible effects of tree-specific parameters, such as tree age and relative diameter at breast height, and site-specific parameters, such as index of aridity and sum of daily mean temperature during the vegetation period, on the h-d relationship of forests in Lower Saxony, Germany. Results: Some of the derived effects, e.g. the effects of age, index of aridity and sum of daily mean temperature, have significantly non-linear patterns. The need for using SCAM results from the fact that some of the model effects show partially implausible patterns, especially at the boundaries of the data ranges. The derived model predicts monotonically increasing tree height with increasing age and temperature sum and with decreasing aridity and social rank of a tree within a stand. The definition of constraints leads to only a marginal or minor decline in model statistics such as AIC. An observed structured spatial trend in tree height is modelled via a 2-dimensional surface
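    The shape constraints that distinguish SCAMs from ordinary GAMs enforce, for example, monotonicity of a fitted effect. SCAM itself uses shape-constrained penalized splines; as a minimal stand-in for the monotonicity idea, the pool-adjacent-violators algorithm below computes the least-squares monotone (non-decreasing) fit to hypothetical height observations:

```python
def pava(y):
    """Pool-adjacent-violators: least-squares non-decreasing fit to y."""
    blocks = []  # list of [block mean, block size]
    for v in y:
        blocks.append([v, 1])
        # merge adjacent blocks while the monotone constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    fit = []
    for m, w in blocks:
        fit.extend([m] * w)
    return fit

# Noisy height observations ordered by age (hypothetical values)
heights = [10.0, 12.0, 11.0, 15.0, 14.0, 18.0]
fitted = pava(heights)
```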

  4. Estimate of influenza cases using generalized linear, additive and mixed models.

    Science.gov (United States)

    Oviedo, Manuel; Domínguez, Ángela; Pilar Muñoz, M

    2015-01-01

    We investigated the relationship between reported cases of influenza in Catalonia (Spain) and several covariates: population, age, date of influenza report, and health region during 2010-2014, using data obtained from the SISAP program (Institut Catala de la Salut - Generalitat of Catalonia). Reported cases were first related to the covariates through a descriptive analysis. Generalized Linear Models, Generalized Additive Models and Generalized Additive Mixed Models were then used to estimate the evolution of influenza transmission. Additive models can estimate non-linear effects of the covariates via smooth functions, and mixed models can capture data dependence and variability in factor variables using correlation structures and random effects, respectively. The incidence rate of influenza was calculated as the incidence per 100 000 people. The mean rate was 13.75 (range 0-27.5) in the winter months (December, January, February) and 3.38 (range 0-12.57) in the remaining months. Statistical analysis showed that Generalized Additive Mixed Models were better adapted to the temporal evolution of influenza (serial correlation 0.59) than classical linear models.

  5. A spatial hazard model for cluster detection on continuous indicators of disease: application to somatic cell score.

    Science.gov (United States)

    Gay, Emilie; Senoussi, Rachid; Barnouin, Jacques

    2007-01-01

    Methods for spatial cluster detection dealing with diseases quantified by continuous variables are few, whereas several diseases are better approached by continuous indicators. For example, subclinical mastitis of the dairy cow is evaluated using a continuous marker of udder inflammation, the somatic cell score (SCS). Consequently, this study proposed to analyze spatialized risk and cluster components of herd SCS through a new method based on a spatial hazard model. The dataset included annual SCS for 34 142 French dairy herds for the year 2000, and important SCS risk factors: mean parity, percentage of winter and spring calvings, and herd size. The model allowed the simultaneous estimation of the effects of known risk factors and of potential spatial clusters on SCS, and the mapping of the estimated clusters and their range. Mean parity and winter and spring calvings were significantly associated with subclinical mastitis risk. The model with the presence of 3 clusters was highly significant, and the 3 clusters were attractive, i.e. closeness to cluster center increased the occurrence of high SCS. The three localizations were the following: close to the city of Troyes in the northeast of France; around the city of Limoges in the center-west; and in the southwest close to the city of Tarbes. The semi-parametric method based on spatial hazard modeling applies to continuous variables, and takes account of both risk factors and potential heterogeneity of the background population. This tool allows a quantitative detection but assumes a spatially specified form for clusters.

  6. On the development of a seismic source zonation model for seismic hazard assessment in western Saudi Arabia

    Science.gov (United States)

    Zahran, Hani M.; Sokolov, Vladimir; Roobol, M. John; Stewart, Ian C. F.; El-Hadidy Youssef, Salah; El-Hadidy, Mahmoud

    2016-07-01

    A new seismic source model has been developed for the western part of the Arabian Peninsula, which has experienced considerable earthquake activity in the historical past and in recent times. The data used for the model include an up-to-date seismic catalog, results of recent studies of Cenozoic faulting in the area, aeromagnetic anomaly and gravity maps, geological maps, and miscellaneous information on volcanic activity. The model includes 18 zones ranging along the Red Sea and the Arabian Peninsula from the Gulf of Aqaba and the Dead Sea in the north to the Gulf of Aden in the south. The seismic source model developed in this study may be considered as one of the basic branches in a logic tree approach for seismic hazard assessment in Saudi Arabia and adjacent territories.

  7. Structured Additive Regression Models: An R Interface to BayesX

    Directory of Open Access Journals (Sweden)

    Nikolaus Umlauf

    2015-02-01

    Full Text Available Structured additive regression (STAR) models provide a flexible framework for modeling possible nonlinear effects of covariates: They contain the well-established frameworks of generalized linear models and generalized additive models as special cases but also allow a wider class of effects, e.g., for geographical or spatio-temporal data, allowing for the specification of complex and realistic models. BayesX is a standalone software package for fitting a general class of STAR models. Based on a comprehensive open-source regression toolbox written in C++, BayesX uses Bayesian inference for estimating STAR models based on Markov chain Monte Carlo simulation techniques, a mixed model representation of STAR models, or stepwise regression techniques combining penalized least squares estimation with model selection. BayesX not only covers models for responses from univariate exponential families, but also models from less-standard regression situations such as models for multi-categorical responses with either ordered or unordered categories, continuous-time survival data, or continuous-time multi-state models. This paper presents a new fully interactive R interface to BayesX: the R package R2BayesX. With the new package, STAR models can be conveniently specified using R's formula language (with some extended terms), fitted using the BayesX binary, represented in R with objects of suitable classes, and finally printed/summarized/plotted. This makes BayesX much more accessible to users familiar with R and adds extensive graphics capabilities for visualizing fitted STAR models. Furthermore, R2BayesX complements the already impressive capabilities for semiparametric regression in R with a comprehensive toolbox comprising in particular more complex response types and alternative inferential procedures such as simulation-based Bayesian inference.

  8. Probabilistic Volcanic Multi-Hazard Assessment at Somma-Vesuvius (Italy): coupling Bayesian Belief Networks with a physical model for lahar propagation

    Science.gov (United States)

    Tierz, Pablo; Woodhouse, Mark; Phillips, Jeremy; Sandri, Laura; Selva, Jacopo; Marzocchi, Warner; Odbert, Henry

    2017-04-01

    Volcanoes are extremely complex physico-chemical systems where magma formed at depth erupts at the planet's surface, resulting in major hazards from local to global scales. Volcano physics is dominated by non-linearities and complicated spatio-temporal interrelationships, which make volcanic hazards stochastic (i.e. not deterministic) by nature. In this context, probabilistic assessments are required to quantify the large uncertainties related to volcanic hazards. Moreover, volcanoes are typically multi-hazard environments where different hazardous processes can occur either simultaneously or in succession. In particular, explosive volcanoes are able to accumulate, through tephra fallout and Pyroclastic Density Currents (PDCs), large amounts of pyroclastic material in the drainage basins surrounding the volcano. This addition of fresh particulate material alters the local/regional hydrogeological equilibrium and increases the frequency and magnitude of sediment-rich aqueous flows, commonly known as lahars. The initiation and volume of rain-triggered lahars may depend on: rainfall intensity and duration; antecedent rainfall; terrain slope; thickness, permeability and hydraulic diffusivity of the tephra deposit; etc. Quantifying these complex interrelationships (and their uncertainties) in a tractable manner requires a structured but flexible probabilistic approach. A Bayesian Belief Network (BBN) is a directed acyclic graph that allows the representation of the joint probability distribution for a set of uncertain variables in a compact and efficient way, by exploiting unconditional and conditional independences between these variables. Once constructed and parametrized, the BBN uses Bayesian inference to perform causal (e.g. forecast) and/or evidential reasoning (e.g. explanation) about query variables, given some evidence.
In this work, we illustrate how BBNs can be used to model the influence of several variables on the generation of rain-triggered lahars
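    The causal and evidential reasoning the record describes can be illustrated with a toy two-parent BBN evaluated by enumeration. All node names and probabilities below are hypothetical, not values from the study:

```python
# Minimal BBN: Rain -> Lahar <- Deposit, with a conditional probability table
p_rain = 0.3                      # P(heavy rainfall), hypothetical
p_deposit = 0.6                   # P(thick fresh tephra deposit), hypothetical
cpt = {(True, True): 0.8, (True, False): 0.3,
       (False, True): 0.1, (False, False): 0.01}   # P(lahar | rain, deposit)

def p_lahar():
    """Causal reasoning: marginal P(lahar) by enumerating the parents."""
    total = 0.0
    for rain in (True, False):
        for dep in (True, False):
            pr = p_rain if rain else 1 - p_rain
            pd = p_deposit if dep else 1 - p_deposit
            total += pr * pd * cpt[(rain, dep)]
    return total

def p_rain_given_lahar():
    """Evidential reasoning: P(rain | lahar) via Bayes' rule."""
    joint = sum((p_deposit if dep else 1 - p_deposit) * cpt[(True, dep)]
                for dep in (True, False)) * p_rain
    return joint / p_lahar()
```

Observing a lahar raises the probability of heavy rainfall above its prior, which is exactly the "explanation" direction of inference mentioned above.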

  9. Seismic hazard studies in Egypt

    Directory of Open Access Journals (Sweden)

    Abuo El-Ela A. Mohamed

    2012-12-01

    Full Text Available The study of earthquake activity and seismic hazard assessment of Egypt is very important due to the large and rapidly spreading investments in national projects, especially the nuclear power plant to be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba–Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the northern Red Sea triple junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez District should be considered. The seismic hazard for Egypt is calculated utilizing a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, the uniform hazard spectra for rock sites for 25 different periods, and the probabilistic hazard curves for the cities of Cairo and Alexandria, are graphed. The peak ground acceleration (PGA) values were found to be highest close to the Gulf of Aqaba, at about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were detected in the western part of the Western Desert.
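    The 475-year return period quoted in records like this one follows from the standard Poisson exceedance model (10% probability of exceedance in 50 years). A short sketch of that arithmetic:

```python
import math

def exceedance_prob(annual_rate, years):
    """Poisson probability of at least one exceedance in a time window."""
    return 1.0 - math.exp(-annual_rate * years)

def return_period(prob, years):
    """Return period corresponding to probability `prob` in `years` years."""
    return -years / math.log(1.0 - prob)

# The conventional design level: 10% chance of exceedance in 50 years
rp = return_period(0.10, 50)            # close to 475 years
p50 = exceedance_prob(1.0 / 475.0, 50)  # close to 10%
```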

  10. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    Energy Technology Data Exchange (ETDEWEB)

    Plesko, Catherine S [Los Alamos National Laboratory; Clement, R Ryan [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  11. Modeling hydrologic and geomorphic hazards across post-fire landscapes using a self-organizing map approach

    Science.gov (United States)

    Friedel, Michael J.

    2011-01-01

    Few studies attempt to model the range of possible post-fire hydrologic and geomorphic hazards because of the sparseness of data and the coupled, nonlinear, spatial, and temporal relationships among landscape variables. In this study, a type of unsupervised artificial neural network, called a self-organizing map (SOM), is trained using data from 540 burned basins in the western United States. The sparsely populated data set includes variables from independent numerical landscape categories (climate, land surface form, geologic texture, and post-fire condition), independent landscape classes (bedrock geology and state), and dependent initiation processes (runoff, landslide, and runoff and landslide combination) and responses (debris flows, floods, and no events). Pattern analysis of the SOM-based component planes is used to identify and interpret relations among the variables. Application of the Davies-Bouldin criterion following k-means clustering of the SOM neurons identified eight conceptual regional models for focusing future research and empirical model development. A split-sample validation on 60 independent basins (not included in the training) indicates that simultaneous predictions of initiation process and response types are at least 78% accurate. As climate shifts from wet to dry conditions, forecasts across the burned landscape reveal a decreasing trend in the total number of debris flow, flood, and runoff events with considerable variability among individual basins. These findings suggest the SOM may be useful in forecasting real-time post-fire hazards, and long-term post-recovery processes and effects of climate change scenarios.
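    A minimal 1-D self-organizing map, trained on the same principle (find the best-matching unit, then pull it and its neighbors toward the sample), can be written in plain Python. The data below are hypothetical two-feature basin vectors, not the study's 540-basin dataset:

```python
import math
import random

def train_som(data, n_nodes=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a tiny 1-D self-organizing map; returns the codebook vectors."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)               # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5   # shrinking neighborhood
        for x in data:
            # best-matching unit = nearest codebook vector
            bmu = min(range(n_nodes),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], x)))
            for i in range(n_nodes):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                nodes[i] = [w + lr * h * (xv - w) for w, xv in zip(nodes[i], x)]
    return nodes

# Two hypothetical clusters of normalized basin feature vectors
data = [[0.1, 0.2], [0.15, 0.1], [0.9, 0.8], [0.95, 0.9]]
codebook = train_som(data)
```

After training, samples from the two clusters should map to different codebook nodes, which is the basis for the k-means clustering of SOM neurons described in the record.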

  12. A Kinematic Fault Network Model of Crustal Deformation for California and Its Application to the Seismic Hazard Analysis

    Science.gov (United States)

    Zeng, Y.; Shen, Z.; Harmsen, S.; Petersen, M. D.

    2010-12-01

    We invert GPS observations to determine the slip rates on major faults in California based on a kinematic fault model of crustal deformation with geological slip rate constraints. Assuming an elastic half-space, we interpret secular surface deformation using a kinematic fault network model with each fault segment slipping beneath a locking depth. This model simulates both block-like deformation and elastic strain accumulation within each bounding block. Each fault segment is linked to its adjacent elements with slip continuity imposed at fault nodes or intersections. The GPS observations across California and its neighbors are obtained from the SCEC WGCEP project of California Crustal Motion Map version 1.0 and SCEC Crustal Motion Map 4.0. Our fault models are based on the SCEC UCERF 2.0 fault database, a previous southern California block model by Shen and Jackson, and the San Francisco Bay area block model by d'Alessio et al. Our inversion shows a slip rate ranging from 20 to 26 mm/yr for the northern San Andreas from the Santa Cruz Mountains to the Peninsula segment. Slip rates vary from 8 to 14 mm/yr along the Hayward to the Maacama segment, and from 17 to 6 mm/yr along the central Calaveras to West Napa. For the central California creeping section, we find a depth-dependent slip rate with an average of 23 mm/yr across the upper 5 km and 30 mm/yr underneath. Slip rates range from 30 mm/yr along the Parkfield and central California creeping section of the San Andreas to an average of 6 mm/yr on the San Bernardino Mountain segment. On the southern San Andreas, slip rates vary from 21 to 30 mm/yr from the Coachella Valley to the Imperial Valley, and from 7 to 16 mm/yr along the San Jacinto segments. The shortening rate across the greater Los Angeles region is consistent with the regional tectonics and crustal thickening in the area. We are now in the process of applying the results to seismic hazard evaluation. Overall the geodetic and geological derived

  13. A guide to generalized additive models in crop science using SAS and R

    Directory of Open Access Journals (Sweden)

    Josefine Liew

    2015-06-01

    Full Text Available Linear models and generalized linear models are well known and are used extensively in crop science. Generalized additive models (GAMs) are less well known. GAMs extend generalized linear models through the inclusion of smoothing functions of explanatory variables, e.g., spline functions, allowing the curves to bend to better describe the observed data. This article provides an introduction to GAMs in the context of crop science experiments. This is exemplified using a dataset consisting of four populations of perennial sow-thistle (Sonchus arvensis L.), originating from two regions, for which the emergence of shoots over time was compared.
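    The additive structure y ≈ α + f1(x1) + f2(x2) behind GAMs is classically fitted by backfitting: each smooth is re-estimated in turn on the partial residuals. The sketch below uses a crude running-mean smoother on hypothetical data; real GAM software (e.g. the SAS and R tools the article covers) uses penalized splines with proper smoothing parameter selection:

```python
def running_mean_smoother(x, r, window=1.0):
    """Crude kernel smoother: average residuals r over a window in x."""
    out = []
    for xi in x:
        vals = [ri for xj, ri in zip(x, r) if abs(xj - xi) <= window]
        out.append(sum(vals) / len(vals))
    return out

def backfit_additive(y, x1, x2, iters=20):
    """Fit y ~ alpha + f1(x1) + f2(x2) by backfitting (Gaussian case)."""
    n = len(y)
    alpha = sum(y) / n
    f1 = [0.0] * n
    f2 = [0.0] * n
    for _ in range(iters):
        r1 = [y[i] - alpha - f2[i] for i in range(n)]
        f1 = running_mean_smoother(x1, r1)
        m1 = sum(f1) / n
        f1 = [v - m1 for v in f1]        # centre for identifiability
        r2 = [y[i] - alpha - f1[i] for i in range(n)]
        f2 = running_mean_smoother(x2, r2)
        m2 = sum(f2) / n
        f2 = [v - m2 for v in f2]
    return alpha, f1, f2

# Hypothetical data: y depends linearly on x1; x2 carries no effect
x1 = list(range(10))
x2 = [5.0] * 10
y = [2.0 + v for v in x1]
alpha, f1, f2 = backfit_additive(y, x1, x2)
```

With these data the fitted f2 collapses to zero and f1 recovers the increasing trend, up to boundary bias of the simple smoother.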

  14. Modal analysis of additive manufactured carbon fiber reinforced polymer composite framework: Experiment and modeling

    Science.gov (United States)

    Dryginin, N. V.; Krasnoveikin, V. A.; Filippov, A. V.; Tarasov, S. Yu.; Rubtsov, V. E.

    2016-11-01

    Additive manufacturing by 3D printing is the most advanced and promising approach for making multicomponent composites. Polymer-based carbon fiber reinforced composites demonstrate high mechanical properties combined with the low weight of the component. This paper presents the results of 3D modeling and experimental modal analysis on a polymer composite framework obtained using additive manufacturing. For three oscillation modes, agreement was shown between the results of modeling and of experimental modal analysis performed with laser Doppler vibrometry.

  15. Hidden Markov Model for quantitative prediction of snowfall and analysis of hazardous snowfall events over Indian Himalaya

    Science.gov (United States)

    Joshi, J. C.; Tankeshwar, K.; Srivastava, Sunita

    2017-04-01

    A Hidden Markov Model (HMM) has been developed for the prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992-2012). There are six observations and six states in the model. The most probable observation and state sequences have been computed using the Forward and Viterbi algorithms, respectively. The Baum-Welch algorithm has been used for optimizing the model parameters. The model has been validated for two winters (2012-2013 and 2013-2014) by computing the root mean square error (RMSE) and accuracy measures such as percent correct (PC), critical success index (CSI) and Heidke skill score (HSS). The RMSE of the model has also been calculated using the leave-one-out cross-validation method. Snowfall predicted by the model during hazardous snowfall events in different parts of the Himalaya matches well with observations. The HSS of the model for all stations implies that the optimized model has better forecasting skill than a random forecast for both days. The RMSE of the optimized model was also found to be smaller than that of the persistence forecast and the standard deviation for both days.
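    The Viterbi step mentioned in the record, recovering the most probable hidden-state sequence, is compact to implement. The two-state weather model below is a hypothetical toy, not the six-state snowfall model of the study:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable hidden-state path (Viterbi) for a discrete HMM."""
    # V[t][s] = (best prob of any path ending in state s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for o in obs[1:]:
        prev_col = V[-1]
        col = {}
        for s in states:
            p, back = max(((prev_col[r][0] * trans_p[r][s] * emit_p[s][o], r)
                           for r in states), key=lambda t: t[0])
            col[s] = (p, back)
        V.append(col)
    # backtrack from the best final state
    last = max(V[-1], key=lambda s: V[-1][s][0])
    path = [last]
    for col in reversed(V[1:]):
        path.append(col[path[-1]][1])
    return list(reversed(path))

# Toy two-state model (all probabilities hypothetical)
states = ("Snow", "Dry")
start_p = {"Snow": 0.4, "Dry": 0.6}
trans_p = {"Snow": {"Snow": 0.7, "Dry": 0.3}, "Dry": {"Snow": 0.2, "Dry": 0.8}}
emit_p = {"Snow": {"cold": 0.8, "mild": 0.2}, "Dry": {"cold": 0.3, "mild": 0.7}}
path = viterbi(["cold", "cold", "mild"], states, start_p, trans_p, emit_p)
```

For long sequences, the products should be replaced with sums of log-probabilities to avoid underflow.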

  16. Hidden Markov Model for quantitative prediction of snowfall and analysis of hazardous snowfall events over Indian Himalaya

    Indian Academy of Sciences (India)

    J C Joshi; K Tankeshwar; Sunita Srivastava

    2017-04-01

    A Hidden Markov Model (HMM) has been developed for the prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992–2012). There are six observations and six states in the model. The most probable observation and state sequences have been computed using the Forward and Viterbi algorithms, respectively. The Baum–Welch algorithm has been used for optimizing the model parameters. The model has been validated for two winters (2012–2013 and 2013–2014) by computing the root mean square error (RMSE) and accuracy measures such as percent correct (PC), critical success index (CSI) and Heidke skill score (HSS). The RMSE of the model has also been calculated using the leave-one-out cross-validation method. Snowfall predicted by the model during hazardous snowfall events in different parts of the Himalaya matches well with observations. The HSS of the model for all stations implies that the optimized model has better forecasting skill than a random forecast for both days. The RMSE of the optimized model was also found to be smaller than that of the persistence forecast and the standard deviation for both days.

  17. Climate change impact assessment on Veneto and Friuli Plain groundwater. Part I: an integrated modeling approach for hazard scenario construction.

    Science.gov (United States)

    Baruffi, F; Cisotto, A; Cimolino, A; Ferri, M; Monego, M; Norbiato, D; Cappelletto, M; Bisaglia, M; Pretner, A; Galli, A; Scarinci, A; Marsala, V; Panelli, C; Gualdi, S; Bucchignani, E; Torresan, S; Pasini, S; Critto, A; Marcomini, A

    2012-12-01

    Climate change impacts on water resources, particularly groundwater, are a highly debated topic worldwide, attracting international attention and interest from both researchers and policy makers due to their relevant link with European water policy directives (e.g. 2000/60/EC and 2007/118/EC) and related environmental objectives. The understanding of long-term impacts of climate variability and change is therefore a key challenge in order to devise effective protection measures and to implement sustainable management of water resources. This paper presents the modeling approach adopted within the Life+ project TRUST (Tool for Regional-scale assessment of groUndwater Storage improvement in adaptation to climaTe change) in order to provide climate change hazard scenarios for the shallow groundwater of the high Veneto and Friuli Plain, Northern Italy. Given the aim of evaluating potential impacts on water quantity and quality (e.g. groundwater level variation, decrease of water availability for irrigation, variations in nitrate infiltration processes), the modeling approach integrated an ensemble of climate, hydrologic and hydrogeologic models running from the global to the regional scale. Global and regional climate models and downscaling techniques were used to perform climate simulations for the reference period 1961-1990 and the projection period 2010-2100. The simulation of the recent climate was performed using observed radiative forcings, whereas the projections were made prescribing the radiative forcings according to the IPCC A1B emission scenario. The climate simulations and downscaling then provided the precipitation, temperature and evapo-transpiration fields used for the impact analysis. Based on the downscaled climate projections, 3 reference scenarios for the period 2071-2100 (i.e. the driest, the wettest and the mild year) were selected and used to run a regional geomorphoclimatic and hydrogeological model.
The final output of the model ensemble produced

  18. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    Energy Technology Data Exchange (ETDEWEB)

    Doligez, B.; Eschard, R. [Institut Francais du Petrole, Rueil Malmaison (France); Geffroy, F. [Centre de Geostatistique, Fontainebleau (France)] [and others

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model informed with petrophysical properties. Scaling-up techniques then make it possible to obtain a reservoir model compatible with fluid flow simulators. Geostatistical modelling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data and whose variability matches that computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with excellent areal coverage, but with poor vertical resolution. New advances in modelling techniques now make it possible to integrate this type of additional external information in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps coming from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.

  19. Estimation of additive and dominance variance for reproductive traits from different models in Duroc purebred

    Directory of Open Access Journals (Sweden)

    Talerngsak Angkuraseranee

    2010-05-01

    Full Text Available The additive and dominance genetic variances of 5,801 Duroc reproductive and growth records were estimated using BLUPF90 PC-PACK. Estimates were obtained for number born alive (NBA), birth weight (BW), number weaned (NW), and weaning weight (WW). Data were analyzed using two mixed model equations. The first model included fixed effects and random effects identifying inbreeding depression, additive gene effects and permanent environment effects. The second model was similar to the first model, but included the dominance genotypic effect. Heritability estimates of NBA, BW, NW and WW from the two models were 0.1558/0.1716, 0.1616/0.1737, 0.0372/0.0874 and 0.1584/0.1516, respectively. Proportions of dominance effect to total phenotypic variance from the dominance model were 0.1024, 0.1625, 0.0470, and 0.1536 for NBA, BW, NW and WW, respectively. Dominance effects were found to have a sizable influence on the litter size traits analyzed. Therefore, genetic evaluation with the dominance model (Model 2) is found to be more appropriate than the animal model (Model 1).
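    The heritability and dominance proportions reported in records like this one are ratios of variance components to the total phenotypic variance. A sketch with hypothetical component values (not the study's estimates):

```python
def variance_ratios(var_a, var_d, var_pe, var_e):
    """Narrow-sense heritability and dominance proportion from variance
    components: additive, dominance, permanent environment, residual.
    All input values here are hypothetical."""
    var_p = var_a + var_d + var_pe + var_e   # total phenotypic variance
    return var_a / var_p, var_d / var_p

h2, d2 = variance_ratios(var_a=0.17, var_d=0.16, var_pe=0.12, var_e=0.55)
```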

  20. New results in RR Lyrae modeling: convective cycles, additional modes and more

    CERN Document Server

    Molnár, L; Szabó, R; Plachy, E

    2012-01-01

    Recent theoretical and observational findings have breathed new life into the field of RR Lyrae stars. The ever more precise and complete measurements of the space asteroseismology missions have revealed new details, such as period doubling and the presence of additional modes in these stars. Theoretical work has also flourished: period doubling has been explained, and an additional mode has been detected in hydrodynamic models as well. Although the most intriguing mystery, the Blazhko effect, remains unsolved, new findings indicate that the convective cycle model can effectively be ruled out for short- and medium-period modulations. On the other hand, the plausibility of the radial resonance model is increasing, as more and more resonances are detected both in models and in stars.

  1. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models

    CERN Document Server

    Fan, Jianqing; Song, Rui

    2011-01-01

    A variable screening procedure via correlation learning was proposed by Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend correlation learning to marginal nonparametric learning. Our nonparametric independence screening is called NIS, a specific member of the sure independence screening family. Several closely related variable screening procedures are proposed. Under nonparametric additive models, it is shown that under some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, iterative nonparametric independence screening (INIS) is also proposed to enhance the finite sample performance for fitting sparse additive models. The simulation results and a real data a...
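    The original sure independence screening of Fan and Lv ranks predictors by marginal correlation with the response; NIS replaces the correlation with a nonparametric marginal fit. The sketch below shows only the simpler correlation-based ranking, on hypothetical data:

```python
import math

def pearson(x, y):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def sis_screen(X, y, keep):
    """Rank predictors by |marginal correlation| and keep the top `keep`."""
    scores = [(abs(pearson(col, y)), j) for j, col in enumerate(X)]
    scores.sort(reverse=True)
    return [j for _, j in scores[:keep]]

# Hypothetical data: y depends on predictor 0; predictors 1-2 are noise
X = [[1, 2, 3, 4, 5],
     [2, 1, 2, 1, 2],
     [5, 3, 4, 3, 5]]
y = [2.1, 4.0, 6.2, 7.9, 10.1]
top = sis_screen(X, y, 1)
```

NIS would replace `pearson` with the fit quality of a marginal nonparametric (e.g. spline) regression of y on each predictor, which is what lets it catch nonlinear marginal signals.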

  2. CalTOX, a multimedia total exposure model for hazardous-waste sites; Part 1, Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    McKone, T.E.

    1993-06-01

    CalTOX has been developed as a spreadsheet model to assist in health-risk assessments that address contaminated soils and the contamination of adjacent air, surface water, sediments, and ground water. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify and reduce uncertainty in multimedia, multiple-pathway exposure models. This report provides an overview of the CalTOX model components, lists the objectives of the model, describes the philosophy under which the model was developed, identifies the chemical classes for which the model can be used, and describes critical sensitivities and uncertainties. The multimedia transport and transformation model is a dynamic model that can be used to assess time-varying concentrations of contaminants introduced initially to soil layers or for contaminants released continuously to air or water. This model assists the user in examining how chemical and landscape properties impact both the ultimate route and quantity of human contact. Multimedia, multiple-pathway exposure models are used in the CalTOX model to estimate average daily potential doses within a human population in the vicinity of a hazardous-substance release site. The exposure models encompass twenty-three exposure pathways. The exposure assessment process consists of relating contaminant concentrations in the multimedia model compartments to contaminant concentrations in the media with which a human population has contact (personal air, tap water, foods, household dusts, soils, etc.). The average daily dose is the product of the exposure concentrations in these contact media and an intake or uptake factor that relates the concentrations to the distributions of potential dose within the population.
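    The closing sentence describes the generic potential-dose calculation. A simplified single-pathway sketch (tap-water ingestion) is given below; the equation is the standard exposure-assessment form, not CalTOX's actual pathway models, and all input values are hypothetical:

```python
def average_daily_dose(conc, intake_rate, exp_freq, exp_dur, body_wt, avg_time):
    """Generic potential dose in mg/kg-day:
    concentration x intake rate x exposure frequency x duration,
    normalized by body weight and averaging time."""
    return (conc * intake_rate * exp_freq * exp_dur) / (body_wt * avg_time)

# Hypothetical tap-water ingestion: 0.05 mg/L, 2 L/day, 350 d/yr for 30 yr,
# 70 kg body weight, averaged over 30 yr (non-carcinogen convention)
add = average_daily_dose(0.05, 2.0, 350, 30, 70.0, 30 * 365)
```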

  3. Efficiency modeling of solidification/stabilization of multi-metal contaminated industrial soil using cement and additives

    Energy Technology Data Exchange (ETDEWEB)

    Voglar, Grega E. [RDA - Regional Development Agency Celje, Kidriceva ulica 25, 3000 Celje (Slovenia); Lestan, Domen, E-mail: domen.lestan@bf.uni-lj.si [Agronomy Department, Centre for Soil and Environmental Science, Biotechnical Faculty, University of Ljubljana, Jamnikarjeva 101, 1000 Ljubljana (Slovenia)

    2011-08-30

    Highlights: {yields} We assess the feasibility of using soil S/S for industrial land reclamation. {yields} Retarders, accelerators, plasticizers were used in S/S cementitious formulation. {yields} We proposed novel S/S efficiency model for multi-metal contaminated soils. - Abstract: In a laboratory study, formulations of 15% (w/w) of ordinary Portland cement (OPC), calcium aluminate cement (CAC) and pozzolanic cement (PC) and additives: plasticizers cementol delta ekstra (PCDE) and cementol antikorodin (PCA), polypropylene fibers (PPF), polyoxyethylene-sorbitan monooleate (Tween 80) and aqueous acrylic polymer dispersion (Akrimal) were used for solidification/stabilization (S/S) of soils from an industrial brownfield contaminated with up to 157, 32,175, 44,074, 7614, 253 and 7085 mg kg{sup -1} of Cd, Pb, Zn, Cu, Ni and As, respectively. Soils formed solid monoliths with all cementitious formulations tested, with a maximum mechanical strength of 12 N mm{sup -2} achieved after S/S with CAC + PCA. To assess the S/S efficiency of the used formulations for multi-element contaminated soils, we propose an empirical model in which data on equilibrium leaching of toxic elements into deionized water and TCLP (toxicity characteristic leaching procedure) solution and the mass transfer of elements from soil monoliths were weighed against the relative potential hazard of the particular toxic element. Based on the model calculation, the most efficient S/S formulation was CAC + Akrimal, which reduced soil leachability of Cd, Pb, Zn, Cu, Ni and As into deionized water below the limit of quantification and into TCLP solution by up to 55, 185, 8750, 214, 4.7 and 1.2-times, respectively; and the mass transfer of elements from soil monoliths by up to 740, 746, 104,000, 4.7, 343 and 181-times, respectively.

  4. The unconvincing product - Consumer versus expert hazard identification: A mental models study of novel foods

    DEFF Research Database (Denmark)

    Hagemann, Kit; Scholderer, Joachim

    …between technical experts and consumers, e.g. over the nature of the hazards on which risk assessments should focus and perceptions of insufficient openness about uncertainties in risk assessment. The consumer part of the EU project NOFORISK investigates the disagreement by comparing laypeople… in the consumer data, as consumers 1) implicitly assumed that the novel foods had substantially improved agronomic properties and 2) assumed that all novel foods were governed by the same body of legislation that applies to GM foods. The last misconception might influence consumer trust in risk management when… consumers realize that novel foods other than GM foods do not have to undergo environmental risk assessment. Another implication for risk management appeared as consumers did not demand any participatory elements in the risk analysis process. Consumers talked extensively about normative, governance…

  5. CAirTOX, An inter-media transfer model for assessing indirect exposures to hazardous air contaminants

    Energy Technology Data Exchange (ETDEWEB)

    McKone, T.E.

    1994-01-01

    Risk assessment is a quantitative evaluation of information on potential health hazards of environmental contaminants and the extent of human exposure to these contaminants. As applied to toxic chemical emissions to air, risk assessment involves four interrelated steps. These are (1) determination of source concentrations or emission characteristics, (2) exposure assessment, (3) toxicity assessment, and (4) risk characterization. These steps can be carried out with assistance from analytical models in order to estimate the potential risk associated with existing and future releases. CAirTOX has been developed as a spreadsheet model to assist in making these types of calculations. CAirTOX follows an approach that has been incorporated into the CalTOX model, which was developed for the California Department of Toxic Substances Control. With CAirTOX, we can address how contaminants released to an air basin can lead to contamination of soil, food, surface water, and sediments. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify uncertainty in multimedia, multiple-pathway exposure assessments. The capacity to explicitly address uncertainty has been incorporated into the model in two ways. First, the spreadsheet form of the model makes it compatible with Monte-Carlo add-on programs that are available for uncertainty analysis. Second, all model inputs are specified in terms of an arithmetic mean and coefficient of variation so that uncertainty analyses can be carried out.
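The second uncertainty mechanism, specifying each input as an arithmetic mean and coefficient of variation so it can be sampled in a Monte-Carlo analysis, can be sketched as follows, assuming a lognormal input distribution (a common choice for environmental parameters, used here purely as an illustration):

```python
import math
import random

def lognormal_from_mean_cv(mean, cv, rng):
    """Convert an arithmetic mean and coefficient of variation into the
    (mu, sigma) parameters of the underlying normal, then draw a sample."""
    sigma2 = math.log(1.0 + cv**2)
    mu = math.log(mean) - sigma2 / 2.0
    return rng.lognormvariate(mu, math.sqrt(sigma2))

# Monte-Carlo sampling of one hypothetical model input: mean 10.0, CV 0.5.
rng = random.Random(1)
samples = [lognormal_from_mean_cv(10.0, 0.5, rng) for _ in range(100_000)]
sample_mean = sum(samples) / len(samples)   # should recover roughly 10.0
```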

  6. Experimental model and analytic solution for real-time observation of vehicle's additional steer angle

    Science.gov (United States)

    Zhang, Xiaolong; Li, Liang; Pan, Deng; Cao, Chengmao; Song, Jian

    2014-03-01

    Current research on the real-time observation of the vehicle roll steer angle and compliance steer angle (collectively referred to as the additional steer angle in this paper) mainly employs the linear vehicle dynamic model, in which only the lateral acceleration of the vehicle body is considered. The observation accuracy of this method cannot meet the requirements of real-time vehicle stability control, especially under extreme driving conditions. This paper explores a solution based on an experimental method. Firstly, a multi-body dynamic model of a passenger car is built with the ADAMS/Car software, and its dynamic accuracy is verified against the same vehicle's road test data from steady-state circular tests. Based on this simulation platform, several factors influencing the additional steer angle under different driving conditions are quantitatively analyzed. Then the ε-SVR algorithm is employed to build the additional steer angle prediction model, whose input vectors mainly comprise the sensor information of a standard electronic stability control system (ESC). Typical slalom tests and FMVSS 126 tests are adopted to run simulations, train the model, and test its generalization performance. The test results show that the influence of lateral acceleration on the additional steer angle is maximal (magnitude up to 1°), followed by longitudinal acceleration-deceleration and road wave amplitude (magnitude up to 0.3°). Moreover, both the prediction accuracy and the real-time computational performance of the model meet the control requirements of ESC. This research expands the accurate observation methods for the additional steer angle under extreme driving conditions.

  7. Integrating multidisciplinary science, modelling and impact data into evolving, syn-event volcanic hazard mapping and communication: A case study from the 2012 Tongariro eruption crisis, New Zealand

    Science.gov (United States)

    Leonard, Graham S.; Stewart, Carol; Wilson, Thomas M.; Procter, Jonathan N.; Scott, Bradley J.; Keys, Harry J.; Jolly, Gill E.; Wardman, Johnny B.; Cronin, Shane J.; McBride, Sara K.

    2014-10-01

    New Zealand's Tongariro National Park volcanoes produce hazardous eruptions every few years to decades. On 6 August 2012 the Te Maari vent of Tongariro Volcano erupted, producing a series of explosions and a fine ash of minor volume which was dispersed rapidly to the east. This manuscript presents a summary of the eruption impacts and the way these supported science communication during the crisis, particularly in terms of hazard map development. The most significant proximal impact was damage from pyroclastic surges and ballistics to the popular and economically-important Tongariro Alpine Crossing track. The only hazard to affect the medial impact zone was a few millimetres of ashfall with minor impacts. Field testing indicated that the Te Maari ash had extremely low resistivity when wetted, implying a very high potential to cause disruption to nationally-important power transmission networks via the mechanism of insulator flashover. This was not observed, presumably due to insufficient ash accumulation on insulators. Virtually no impacts from distal ashfall were reported. Post-event analysis of PM10 data demonstrates the additional value of regional air quality monitoring networks in quantifying population exposure to airborne respirable ash. While the eruption was minor, it generated a high level of public interest and a demand for information on volcanic hazards and impacts from emergency managers, the public, critical infrastructure managers, health officials, and the agriculture sector. Meeting this demand fully taxed available resources. We present here aspects of the New Zealand experience which may have wider applicability in moving towards improved integration of hazard impact information, mapping, and communication. These include wide use of a wiki technical clearinghouse and email listservs, a focus on multi-agency consistent messages, and a recently developed environment of collaboration and alignment of both research funding and technical science advice

  8. Genetic evaluation of the longevity of the Holstein population in Japan using a Weibull proportional hazard model.

    Science.gov (United States)

    Sasaki, Osamu; Aihara, Mitsuo; Hagiya, Koichi; Nishiura, Akiko; Ishii, Kazuo; Satoh, Masahiro

    2012-02-01

    The objective of this study was to confirm the stability of genetic estimation of longevity in the Holstein population in Japan. Data on the first 10 lactation periods were obtained from the Livestock Improvement Association of Japan. Longevity was defined as the number of days from first calving until culling or censoring. DATA1 and DATA2 included the survival records for the periods 1991-2003 and 1991-2005, respectively. The proportional hazard model included the effects of the region-parity-lactation stage-milk yield class, age at first calving, the herd-year-season, and sire. The heritabilities of DATA1 and DATA2 on the original scale were 0.119 and 0.123, respectively. The estimated transmitting abilities (ETAs) of young sires in DATA1 may have been underestimated, but coefficient δ, which indicated the bias of the genetic trend between DATA1 and DATA2, was not significant. The regression coefficient of ETAs between DATA1 and DATA2 was very close to 1. The proportional hazard model thus provided stable estimates of the ETA for longevity of sires in Japan.
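As a generic illustration of the model class used here (not the authors' sire-evaluation code), a Weibull proportional hazards model scales a Weibull baseline hazard multiplicatively by exp(x'β). All parameter values below are invented:

```python
import math

def weibull_ph_survival(t, lam, rho, xbeta):
    """Survival S(t | x) = exp(-H0(t) * exp(x'beta)) under a Weibull
    proportional hazards model, where the baseline cumulative hazard is
    H0(t) = lam * t**rho, i.e. baseline hazard h0(t) = lam*rho*t**(rho-1)."""
    return math.exp(-lam * t**rho * math.exp(xbeta))

# Hypothetical animal with linear predictor 0 (baseline) vs 0.5 (higher risk).
baseline = weibull_ph_survival(t=2.0, lam=0.1, rho=1.5, xbeta=0.0)
at_risk = weibull_ph_survival(t=2.0, lam=0.1, rho=1.5, xbeta=0.5)
# A positive linear predictor multiplies the hazard, so survival drops.
```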

  9. Implementation of NGA-West2 ground motion models in the 2014 U.S. National Seismic Hazard Maps

    Science.gov (United States)

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Harmsen, Stephen C.; Frankel, Arthur D.

    2014-01-01

    The U.S. National Seismic Hazard Maps (NSHMs) have been an important component of seismic design regulations in the United States for the past several decades. These maps present earthquake ground shaking intensities at specified probabilities of being exceeded over a 50-year time period. The previous version of the NSHMs was developed in 2008; during 2012 and 2013, scientists at the U.S. Geological Survey have been updating the maps based on their assessment of the “best available science,” resulting in the 2014 NSHMs. The update includes modifications to the seismic source models and the ground motion models (GMMs) for sites across the conterminous United States. This paper focuses on updates in the Western United States (WUS) due to the use of new GMMs for shallow crustal earthquakes in active tectonic regions developed by the Next Generation Attenuation (NGA-West2) project. Individual GMMs, their weighted combination, and their impact on the hazard maps relative to 2008 are discussed. In general, the combined effects of lower medians and increased standard deviations in the new GMMs have caused only small changes, within 5–20%, in the probabilistic ground motions for most sites across the WUS compared to the 2008 NSHMs.
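The weighted combination of GMMs mentioned above is, in essence, a logic-tree average: the hazard calculation is run with each candidate model and the resulting annual exceedance rates are combined using the models' weights. The sketch below is a toy illustration of that step only; the model names, rates, and weights are invented, not NSHM values:

```python
# Hypothetical annual rates of exceeding some ground-motion level,
# one per candidate GMM, combined with logic-tree weights that sum to 1.
rates = {"gmm_a": 2.0e-3, "gmm_b": 1.6e-3, "gmm_c": 2.4e-3}
weights = {"gmm_a": 0.5, "gmm_b": 0.25, "gmm_c": 0.25}

assert abs(sum(weights.values()) - 1.0) < 1e-12   # weights must sum to one
combined_rate = sum(weights[m] * rates[m] for m in rates)
```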

  10. Changing dynamics: Time-varying autoregressive models using generalized additive modeling.

    Science.gov (United States)

    Bringmann, Laura F; Hamaker, Ellen L; Vigo, Daniel E; Aubert, André; Borsboom, Denny; Tuerlinckx, Francis

    2017-09-01

    In psychology, the use of intensive longitudinal data has steeply increased during the past decade. As a result, studying temporal dependencies in such data with autoregressive modeling is becoming common practice. However, standard autoregressive models are often suboptimal as they assume that parameters are time-invariant. This is problematic if changing dynamics (e.g., changes in the temporal dependency of a process) govern the time series. Often a change in the process, such as emotional well-being during therapy, is the very reason why it is interesting and important to study psychological dynamics. As a result, there is a need for an easily applicable method for studying such nonstationary processes that result from changing dynamics. In this article we present such a tool: the semiparametric TV-AR model. We show with a simulation study and an empirical application that the TV-AR model can approximate nonstationary processes well if there are at least 100 time points available and no unknown abrupt changes in the data. Notably, no prior knowledge of the processes that drive change in the dynamic structure is necessary. We conclude that the TV-AR model has significant potential for studying changing dynamics in psychology. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
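A minimal sketch of the idea, letting the AR(1) coefficient vary smoothly with time and estimating it with a basis expansion, is shown below. It substitutes a plain polynomial basis and ordinary least squares for the article's spline-based GAM machinery, and the data are simulated:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 400
phi_true = np.linspace(0.2, 0.7, T)        # smoothly increasing AR(1) coefficient
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true[t] * y[t - 1] + rng.standard_normal()

# Regress y[t] on y[t-1] interacted with a small polynomial basis in time,
# a crude stand-in for the spline basis a semiparametric TV-AR model uses.
s = np.linspace(0.0, 1.0, T)[1:]
B = np.column_stack([y[:-1], y[:-1] * s, y[:-1] * s**2])
coef, *_ = np.linalg.lstsq(B, y[1:], rcond=None)
phi_hat = coef[0] + coef[1] * s + coef[2] * s**2   # estimated phi(t)
```

The recovered `phi_hat` curve should rise over time, reflecting the changing dynamics that a time-invariant AR model would average away.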

  11. Vector generalized linear and additive models with an implementation in R

    CERN Document Server

    Yee, Thomas W

    2015-01-01

    This book presents a statistical framework that expands generalized linear models (GLMs) for regression modelling. The framework shared in this book allows analyses based on many semi-traditional applied statistics models to be performed as a coherent whole. This is possible through the approximately half-a-dozen major classes of statistical models included in the book and the software infrastructure component, which makes the models easily operable.    The book’s methodology and accompanying software (the extensive VGAM R package) are directed at these limitations, and this is the first time the methodology and software are covered comprehensively in one volume. Since their advent in 1972, GLMs have unified important distributions under a single umbrella with enormous implications. The demands of practical data analysis, however, require a flexibility that GLMs do not have. Data-driven GLMs, in the form of generalized additive models (GAMs), are also largely confined to the exponential family. This book ...

  12. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  13. Coastal hazards in a changing world: projecting and communicating future coastal flood risk at the local-scale using the Coastal Storm Modeling System (CoSMoS)

    Science.gov (United States)

    O'Neill, Andrea; Barnard, Patrick; Erikson, Li; Foxgrover, Amy; Limber, Patrick; Vitousek, Sean; Fitzgibbon, Michael; Wood, Nathan

    2017-04-01

    The risk of coastal flooding will increase for many low-lying coastal regions as predominant contributions to flooding, including sea level, storm surge, wave setup, and storm-related fluvial discharge, are altered with climate change. Community leaders and local governments therefore look to science to provide insight into how climate change may affect their areas. Many studies of future coastal flooding vulnerability consider sea level and tides, but ignore other important factors that elevate flood levels during storm events, such as waves, surge, and discharge. Here we present a modelling approach that considers a broad range of relevant processes contributing to elevated storm water levels for open coast and embayment settings along the U.S. West Coast. Additionally, we present online tools for communicating community-relevant projected vulnerabilities. The Coastal Storm Modeling System (CoSMoS) is a numerical modeling system developed to predict coastal flooding due to both sea-level rise (SLR) and plausible 21st century storms for active-margin settings like the U.S. West Coast. CoSMoS applies a predominantly deterministic framework of multi-scale models encompassing large geographic scales (100s to 1000s of kilometers) to small-scale features (10s to 1000s of meters), resulting in flood extents that can be projected at a local resolution (2 meters). In the latest iteration of CoSMoS applied to Southern California, U.S., efforts were made to incorporate water level fluctuations in response to regional storm impacts, locally wind-generated waves, coastal river discharge, and decadal-scale shoreline and cliff changes. Coastal hazard projections are available in a user-friendly web-based tool (www.prbo.org/ocof), where users can view variations in flood extent, maximum flood depth, current speeds, and wave heights in response to a range of potential SLR and storm combinations, providing direct support to adaptation and management decisions. 

  14. Representational Flexibility and Problem-Solving Ability in Fraction and Decimal Number Addition: A Structural Model

    Science.gov (United States)

    Deliyianni, Eleni; Gagatsis, Athanasios; Elia, Iliada; Panaoura, Areti

    2016-01-01

    The aim of this study was to propose and validate a structural model in fraction and decimal number addition, which is founded primarily on a synthesis of major theoretical approaches in the field of representations in Mathematics and also on previous research on the learning of fractions and decimals. The study was conducted among 1,701 primary…

  15. Modeling the use of sulfate additives for potassium chloride destruction in biomass combustion

    DEFF Research Database (Denmark)

    Wu, Hao; Grell, Morten Nedergaard; Jespersen, Jacob Boll

    2013-01-01

    was affected by the decomposition temperature. Based on the experimental data, a model was proposed to simulate the sulfation of KCl by different sulfate addition, and the simulation results were compared with pilot-scale experiments conducted in a bubbling fluidized bed reactor. The simulation results...

  16. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data.

    Science.gov (United States)

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2013-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, often there is prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic variance of the estimates can be reduced significantly while the asymptotic bias remains the same as that of the unguided estimator. We observe the performance of our method via a simulation study and demonstrate our method by applying it to a real data set on mergers and acquisitions.
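The two-step estimation described above (fit the parametric guide, smooth the remainder nonparametrically, add the trend back) can be sketched generically. This is not the authors' estimator; the Nadaraya-Watson kernel smoother, the linear guide, and the simulated data are all illustrative stand-ins:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
x = np.sort(rng.uniform(-2.0, 2.0, n))
y = 1.5 * x + np.sin(2.0 * x) + 0.3 * rng.standard_normal(n)

# Step 1: parametric guide -- here a linear fit, playing the role of the
# assumed prior shape of the regression function.
b = np.polyfit(x, y, 1)
guide = np.polyval(b, x)

# Step 2: smooth the remainder nonparametrically (Nadaraya-Watson kernel),
# then add the parametric trend back to form the final estimate.
def kernel_smooth(x, r, h):
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (w @ r) / w.sum(axis=1)

fit = guide + kernel_smooth(x, y - guide, h=0.2)
```

When the guide captures part of the trend, the combined fit should explain the data better than the guide alone while inheriting its overall shape.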

  18. Midrapidity inclusive densities in high energy pp collisions in additive quark model

    Science.gov (United States)

    Shabelski, Yu. M.; Shuvaev, A. G.

    2016-08-01

    High energy (CERN SPS and LHC) inelastic pp (p̄p) scattering is treated in the framework of the additive quark model together with Pomeron exchange theory. We extract the midrapidity inclusive density of the charged secondaries produced in a single quark-quark collision and investigate its energy dependence. Predictions for πp collisions are presented.

  19. Additional interfacial force in lattice Boltzmann models for incompressible multiphase flows

    CERN Document Server

    Li, Q; Gao, Y J

    2011-01-01

    The existing lattice Boltzmann models for incompressible multiphase flows are mostly constructed with two distribution functions, one is the order parameter distribution function, which is used to track the interface between different phases, and the other is the pressure distribution function for solving the velocity field. In this brief report, it is shown that in these models the recovered momentum equation is inconsistent with the target one: an additional interfacial force is included in the recovered momentum equation. The effects of the additional force are investigated by numerical simulations of droplet splashing on a thin liquid film and falling droplet under gravity. In the former test, it is found that the formation and evolution of secondary droplets are greatly affected, while in the latter the additional force is found to increase the falling velocity and limit the stretch of the droplet.

  20. Automated Standard Hazard Tool

    Science.gov (United States)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  1. Estimating treatment effect in a proportional hazards model in randomized clinical trials with all-or-nothing compliance.

    Science.gov (United States)

    Li, Shuli; Gray, Robert J

    2016-09-01

    We consider methods for estimating the treatment effect and/or the covariate by treatment interaction effect in a randomized clinical trial under noncompliance with time-to-event outcome. As in Cuzick et al. (2007), assuming that the patient population consists of three (possibly latent) subgroups based on treatment preference: the ambivalent group, the insisters, and the refusers, we estimate the effects among the ambivalent group. The parameters have causal interpretations under standard assumptions. The article contains two main contributions. First, we propose a weighted per-protocol (Wtd PP) estimator through incorporating time-varying weights in a proportional hazards model. In the second part of the article, under the model considered in Cuzick et al. (2007), we propose an EM algorithm to maximize a full likelihood (FL) as well as the pseudo likelihood (PL) considered in Cuzick et al. (2007). The E step of the algorithm involves computing the conditional expectation of a linear function of the latent membership, and the main advantage of the EM algorithm is that the risk parameters can be updated by fitting a weighted Cox model using standard software and the baseline hazard can be updated using closed-form solutions. Simulations show that the EM algorithm is computationally much more efficient than directly maximizing the observed likelihood. The main advantage of the Wtd PP approach is that it is more robust to model misspecifications among the insisters and refusers since the outcome model does not impose distributional assumptions among these two groups. © 2016, The International Biometric Society.

  2. Internal structure and volcanic hazard potential of Mt Tongariro, New Zealand, from 3D gravity and magnetic models

    Science.gov (United States)

    Miller, Craig A.; Williams-Jones, Glyn

    2016-06-01

    A new 3D geophysical model of the Mt Tongariro Volcanic Massif (TgVM), New Zealand, provides a high resolution view of the volcano's internal structure and hydrothermal system, from which we derive implications for volcanic hazards. Geologically constrained 3D inversions of potential field data provide a greater level of insight into the volcanic structure than is possible from unconstrained models. A complex region of gravity highs and lows (± 6 mGal) is set within a broader, ~ 20 mGal gravity low. A magnetic high (1300 nT) is associated with Mt Ngauruhoe, while a substantial, thick, demagnetised area occurs to the north, coincident with a gravity low and interpreted as representing the hydrothermal system. The hydrothermal system is constrained to the west by major faults, interpreted as an impermeable barrier to fluid migration, and extends to basement depth. These faults are considered low probability areas for future eruption sites, as there is little to indicate they have acted as magmatic pathways. Where the hydrothermal system coincides with steep topographic slopes, an increased likelihood of landslides is present, and the newly delineated hydrothermal system maps the area most likely to have phreatic eruptions. Such eruptions, while small on a global scale, are important hazards at the TgVM as it is a popular hiking area with hundreds of visitors per day in close proximity to eruption sites. The model shows that the volume of volcanic material erupted over the lifespan of the TgVM is five to six times greater than previous estimates, suggesting a higher rate of magma supply, in line with global rates of andesite production. We suggest that our model of physical property distribution can be used to provide constraints for other models of dynamic geophysical processes occurring at the TgVM.

  3. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    Science.gov (United States)

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  4. Formation and reduction of carcinogenic furan in various model systems containing food additives.

    Science.gov (United States)

    Kim, Jin-Sil; Her, Jae-Young; Lee, Kwang-Geun

    2015-12-15

    The aim of this study was to analyse and reduce furan in various model systems. Furan model systems consisting of monosaccharides (0.5M glucose and ribose), amino acids (0.5M alanine and serine) and/or 1.0M ascorbic acid were heated at 121°C for 25 min. The effects of food additives (each 0.1M) such as metal ions (iron sulphate, magnesium sulphate, zinc sulphate and calcium sulphate), antioxidants (BHT and BHA), and sodium sulphite on the formation of furan were measured. The level of furan formed in the model systems was 6.8-527.3 ng/ml. The level of furan in the model systems of glucose/serine and glucose/alanine increased 7-674% when food additives were added. In contrast, the level of furan decreased by 18-51% in the Maillard reaction model systems that included ribose and alanine/serine with food additives except zinc sulphate.

  5. NB-PLC channel modelling with cyclostationary noise addition & OFDM implementation for smart grid

    Science.gov (United States)

    Thomas, Togis; Gupta, K. K.

    2016-03-01

    Power line communication (PLC) technology can be a viable solution for future ubiquitous networks because it provides a cheaper alternative to the other wired technologies currently used for communication. In the smart grid, power line communication (PLC) is used to support low-rate communication on the low-voltage (LV) distribution network. In this paper, we propose a channel model of narrowband (NB) PLC in the frequency range 5 kHz to 500 kHz using ABCD parameters with cyclostationary noise addition. The behaviour of the channel was studied by adding an 11 kV/230 V transformer and by varying the load and its location. Bit error rate (BER) versus signal-to-noise ratio (SNR) was plotted for the proposed model by employing OFDM. Our simulation results based on the proposed channel model show an acceptable performance in terms of bit error rate versus signal-to-noise ratio, which enables the communication required for smart grid applications.
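The ABCD-parameter approach underlying such channel models represents each line section as a 2x2 transmission matrix and cascades sections by matrix multiplication. The sketch below illustrates only that mechanism; the section impedances, admittances, and load are invented numbers, not values from the paper:

```python
import numpy as np

def section_abcd(Z, Y):
    """ABCD matrix of a series impedance Z followed by a shunt admittance Y
    (one lumped section of a transmission line model)."""
    series = np.array([[1.0, Z], [0.0, 1.0]], dtype=complex)
    shunt = np.array([[1.0, 0.0], [Y, 1.0]], dtype=complex)
    return series @ shunt

# Cascade three identical hypothetical sections by matrix multiplication.
sections = [section_abcd(1.0 + 2.0j, 1e-4j) for _ in range(3)]
total = np.linalg.multi_dot(sections)
A, B, C, D = total.ravel()

# Voltage transfer into a hypothetical load ZL: H = ZL / (A*ZL + B).
ZL = 50.0
H = ZL / (A * ZL + B)
```

Inserting a transformer or moving a load, as the abstract describes, amounts to multiplying the corresponding ABCD matrix into the cascade at the appropriate position.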

  6. Bi-Objective Modelling for Hazardous Materials Road-Rail Multimodal Routing Problem with Railway Schedule-Based Space-Time Constraints.

    Science.gov (United States)

    Sun, Yan; Lang, Maoxiang; Wang, Danzhu

    2016-07-28

    The transportation of hazardous materials is always accompanied by considerable risk that will impact public and environmental security. As an efficient and reliable transportation organization, a multimodal service should participate in the transportation of hazardous materials. In this study, we focus on transporting hazardous materials through the multimodal service network and explore the hazardous materials multimodal routing problem from the operational level of network planning. To formulate this problem more practicably, minimizing the total generalized costs of transporting the hazardous materials and the social risk along the planned routes are set as the optimization objectives. Meanwhile, the following formulation characteristics will be comprehensively modelled: (1) specific customer demands; (2) multiple hazardous material flows; (3) capacitated schedule-based rail service and uncapacitated time-flexible road service; and (4) environmental risk constraint. A bi-objective mixed integer nonlinear programming model is first built to formulate the routing problem that combines the formulation characteristics above. Then linear reformulations are developed to linearize and improve the initial model so that it can be effectively solved by exact solution algorithms on standard mathematical programming software. By utilizing the normalized weighted sum method, we can generate the Pareto solutions to the bi-objective optimization problem for a specific case. Finally, a large-scale empirical case study from the Beijing-Tianjin-Hebei Region in China is presented to demonstrate the feasibility of the proposed methods in dealing with the practical problem. Various scenarios are also discussed in the case study.
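The normalized weighted sum step can be illustrated with a toy sketch: each objective is divided by its maximum over the candidates (so cost and risk are comparable), and sweeping the weight generates candidate Pareto solutions. The routes and their (cost, risk) pairs below are invented:

```python
# Candidate routes with hypothetical (generalized cost, social risk) values.
routes = {"r1": (100.0, 9.0), "r2": (140.0, 4.0), "r3": (120.0, 6.0)}
cost_max = max(c for c, _ in routes.values())
risk_max = max(r for _, r in routes.values())

def best_route(w_cost):
    """Minimize the normalized weighted sum w*cost/cost_max + (1-w)*risk/risk_max."""
    w_risk = 1.0 - w_cost
    score = {k: w_cost * c / cost_max + w_risk * r / risk_max
             for k, (c, r) in routes.items()}
    return min(score, key=score.get)

# Sweep the weight over [0, 1] to collect candidate Pareto-optimal routes.
pareto = {best_route(w / 10) for w in range(11)}
```

In the paper this scalarized problem is a mixed integer program solved exactly for each weight; the sweep-and-collect structure is the same.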

  7. Bi-Objective Modelling for Hazardous Materials Road–Rail Multimodal Routing Problem with Railway Schedule-Based Space–Time Constraints

    Science.gov (United States)

    Sun, Yan; Lang, Maoxiang; Wang, Danzhu

    2016-01-01

    The transportation of hazardous materials is always accompanied by considerable risk that will impact public and environment security. As an efficient and reliable transportation organization, a multimodal service should participate in the transportation of hazardous materials. In this study, we focus on transporting hazardous materials through the multimodal service network and explore the hazardous materials multimodal routing problem from the operational level of network planning. To formulate this problem more practicably, minimizing the total generalized costs of transporting the hazardous materials and the social risk along the planned routes are set as the optimization objectives. Meanwhile, the following formulation characteristics will be comprehensively modelled: (1) specific customer demands; (2) multiple hazardous material flows; (3) capacitated schedule-based rail service and uncapacitated time-flexible road service; and (4) environmental risk constraint. A bi-objective mixed integer nonlinear programming model is first built to formulate the routing problem that combines the formulation characteristics above. Then linear reformations are developed to linearize and improve the initial model so that it can be effectively solved by exact solution algorithms on standard mathematical programming software. By utilizing the normalized weighted sum method, we can generate the Pareto solutions to the bi-objective optimization problem for a specific case. Finally, a large-scale empirical case study from the Beijing–Tianjin–Hebei Region in China is presented to demonstrate the feasibility of the proposed methods in dealing with the practical problem. Various scenarios are also discussed in the case study. PMID:27483294

  8. Updated Colombian Seismic Hazard Map

    Science.gov (United States)

    Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.

    2013-05-01

    The Colombian seismic hazard map used by the National Building Code (NSR-98), in effect until 2009, was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics which, in addition to the seismic activity with destructive effects in Colombia, has motivated the interest and the need to develop a new seismic hazard assessment for this country. Taking advantage of new instrumental information sources such as new broadband stations of the National Seismological Network, new historical seismicity data, the availability of standardized global databases, and, in general, advances in models and techniques, a new Colombian seismic hazard map was developed. A PSHA model was applied because it incorporates the effects of all seismic sources that may affect a particular site, addressing the uncertainties caused by the parameters and assumptions defined in this kind of study. First, the seismic source geometries and a complete and homogeneous seismic catalog were defined, and the occurrence-rate parameters of each seismic source were calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México (UNAM, National Autonomous University of Mexico). A uniform grid with 0.1° spacing was used to calculate the peak ground acceleration (PGA) and response spectral values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated. With the results, it is
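
    Under the Poisson occurrence assumption implicit in PSHA, a return period maps to an exceedance probability over a design life via P = 1 - exp(-T/Tr). A minimal sketch (the 475-year return period listed above corresponds to roughly 10% probability of exceedance in a 50-year exposure):

```python
import math

def exceedance_probability(return_period_yr, exposure_yr):
    """P(at least one exceedance in exposure_yr) under a Poisson model
    with mean rate 1 / return_period_yr per year."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

p_475 = exceedance_probability(475, 50)   # ~0.10, the common code basis
```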

  9. Open Source Software for Mapping Human Impacts on Marine Ecosystems with an Additive Model

    Directory of Open Access Journals (Sweden)

    Andy Stock

    2016-06-01

    Full Text Available This paper describes an easy-to-use open source software tool implementing a commonly used additive model (Halpern et al., 'Science', 2008) for mapping human impacts on marine ecosystems. The tool has been used to map the potential for cumulative human impacts in Arctic marine waters and can support future human impact mapping projects by (1) making the model easier to use; (2) making updates of model results straightforward when better input data become available; (3) storing input data and information about processing steps in a defined format and thus facilitating data sharing and reproduction of modeling results; (4) supporting basic visualization of model inputs and outputs without the need for advanced technical skills. The tool, called EcoImpactMapper, was implemented in Java and is thus platform-independent. A tutorial, example data, the tool and the source code are available online.
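
    The additive model underlying such tools sums, per grid cell, stressor intensity times habitat presence times a vulnerability weight. A minimal sketch with random, hypothetical inputs (not EcoImpactMapper's actual data format):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_stressors, n_habitats = 100, 3, 2

D = rng.random((n_stressors, n_cells))          # rescaled stressor intensity per cell
E = rng.integers(0, 2, (n_habitats, n_cells))   # habitat presence/absence per cell
mu = rng.random((n_stressors, n_habitats))      # vulnerability weights (hypothetical)

# Cumulative impact per cell: sum over stressors i and habitats j of D_i * E_j * mu_ij
impact = np.einsum('ic,jc,ij->c', D, E, mu)
```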

  10. Estimation for an additive growth curve model with orthogonal design matrices

    CERN Document Server

    Hu, Jianhua; You, Jinhong; 10.3150/10-BEJ315

    2012-01-01

    An additive growth curve model with orthogonal design matrices is proposed in which observations may have different profile forms. The proposed model allows us to fit data and then estimate parameters in a more parsimonious way than the traditional growth curve model. Two-stage generalized least-squares estimators for the regression coefficients are derived where a quadratic estimator for the covariance of observations is taken as the first-stage estimator. Consistency, asymptotic normality and asymptotic independence of these estimators are investigated. Simulation studies and a numerical example are given to illustrate the efficiency and parsimony of the proposed model for model specifications in the sense of minimizing Akaike's information criterion (AIC).

  11. The additive nonparametric and semiparametric Aalen model as the rate function for a counting process

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder

    2002-01-01

    We use the additive risk model of Aalen (Aalen, 1980) as a model for the rate of a counting process. Rather than specifying the intensity, that is the instantaneous probability of an event conditional on the entire history of the relevant covariates and counting processes, we present a model...... for the rate function, i.e., the instantaneous probability of an event conditional on only a selected set of covariates. When the rate function for the counting process is of Aalen form we show that the usual Aalen estimator can be used and gives almost unbiased estimates. The usual martingale based variance...... estimator is incorrect and an alternative estimator should be used. We also consider the semi-parametric version of the Aalen model as a rate model (McKeague and Sasieni, 1994) and show that the standard errors that are computed based on an assumption of intensities are incorrect and give a different...
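
    Aalen's least-squares estimator of the cumulative regression functions can be sketched as follows; this is a toy implementation on hypothetical uncensored data, illustrating only the point estimates, not the variance corrections discussed in the record:

```python
import numpy as np

def aalen_cumulative(times, events, X):
    """Aalen additive model: at each event time, the cumulative regression
    functions B(t) are incremented by the least-squares solution
    pinv(X_risk) @ dN, where X_risk holds covariates of at-risk subjects."""
    order = np.argsort(times)
    times, events, X = times[order], events[order], X[order]
    n = X.shape[0]
    Xd = np.column_stack([np.ones(n), X])   # first column: baseline hazard
    B = np.zeros(Xd.shape[1])
    path = []
    for i in range(n):
        if events[i]:
            Xr = Xd[i:]                     # risk set: not yet failed/censored
            dN = np.zeros(n - i)
            dN[0] = 1.0                     # the single event at time t_i
            B = B + np.linalg.pinv(Xr) @ dN
            path.append((times[i], B.copy()))
    return path
```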

  12. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    Science.gov (United States)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
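
    The occurrence probability of at least one of two hazards differs by relationship type. A simplified illustration for the independent and mutually exclusive (mutex) cases only; the parallel and series relationships in the classification involve trigger chains that cannot be reduced to two marginal probabilities:

```python
def prob_any(p_a, p_b, relation):
    """P(at least one of two hazards occurs), simplified illustration."""
    if relation == "independent":    # can co-occur, no interaction
        return p_a + p_b - p_a * p_b
    if relation == "mutex":          # mutually exclusive triggers
        return p_a + p_b
    raise ValueError(f"unsupported relation: {relation}")
```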

  13. Climate change in a Point-Over-Threshold model: an example on ocean-wave-storm hazard in NE Spain

    Science.gov (United States)

    Tolosana-Delgado, R.; Ortego, M. I.; Egozcue, J. J.; Sánchez-Arcilla, A.

    2009-09-01

    Climatic change is a problem of general concern. When dealing with hazardous events such as wind-storms, heavy rainfall or ocean-wave storms this concern is even more serious. Climate change might imply an increase of human and material losses, and it is worth devoting efforts to detect it. Hazard assessment of such events is often carried out with a point-over-threshold (POT) model. Time-occurrence of events is assumed to be Poisson distributed, and the magnitude of each event is modelled as an arbitrary random variable, whose upper tail is described by a Generalized Pareto Distribution (GPD). Independence between this magnitude and occurrence in time is assumed, as well as independence from event to event. The GPD models excesses over a threshold. If X is the magnitude of an event and x0 a value of the support of X, the excess over the threshold x0 is Y = X - x0, conditional on X > x0. Therefore, the support of Y is (a segment of) the positive real line. The GPD model has a scale and a shape parameter. The scale parameter of the distribution is β > 0. The shape parameter, ξ, is real-valued, and it defines three different sub-families of distributions. GPD distributions with ξ < 0 have a finite upper endpoint (ysup = -β/ξ); for ξ > 0, distributions have infinite heavy tails (ysup = +∞); and for ξ = 0 we obtain the exponential distribution, which has an infinite support but a well-behaved tail. The GPD distribution function is FY(y|β,ξ) = 1 - (1 + ξy/β)^(-1/ξ), for 0 ≤ y < ysup. The choice of the scale of the magnitude is an issue as well. A handful of phenomena are better described by a relative scale (e.g. positive data where the null value is unattainable) and are thus suitably treated in a logarithmic scale. This has already been used successfully for daily rainfall data and ocean-wave height. How can the impact of climate change on hazardous events be assessed? In a climate change scenario, we can consider the model for the description of the variable as stable, while its parameters may be taken as a function of time. Thus, magnitudes are taken in a log-scale. Excesses over a
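
    The GPD tail model above can be fitted, for instance, by the method of moments, using E[Y] = β/(1-ξ) and Var[Y]/E[Y]² = 1/(1-2ξ). A sketch on synthetic exponential data, which is a GPD with ξ = 0 (maximum likelihood or Bayesian fitting, as in the record, would be used in practice):

```python
import numpy as np

def gpd_fit_mom(excesses):
    """Method-of-moments GPD estimates: from E[Y] = beta/(1-xi) and
    Var[Y]/E[Y]^2 = 1/(1-2*xi) it follows that xi = (1 - m^2/v)/2."""
    m, v = excesses.mean(), excesses.var()
    xi = 0.5 * (1.0 - m * m / v)
    beta = m * (1.0 - xi)
    return xi, beta

rng = np.random.default_rng(42)
y = rng.exponential(scale=2.0, size=5000)   # exponential = GPD with xi = 0
xi_hat, beta_hat = gpd_fit_mom(y)           # should be near (0, 2)
```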

  14. EFFECT OF NANOPOWDER ADDITION ON THE FLEXURAL STRENGTH OF ALUMINA CERAMIC - A WEIBULL MODEL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Daidong Guo

    2016-05-01

    Full Text Available Alumina ceramics were prepared either with micrometer-sized alumina powder (MAP or with the addition of nanometer-sized alumina powder (NAP. The density, crystalline phase, flexural strength and the fracture surface of the two ceramics were measured and compared. Emphasis has been put on the influence of nanopowder addition on the flexural strength of Al₂O₃ ceramic. The analysis based on the Weibull distribution model suggests the distribution of the flexural strength of the NAP ceramic is more concentrated than that of the MAP ceramic. Therefore, the NAP ceramics will be more stable and reliable in real applications.
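
    A Weibull modulus of the kind used in such strength analyses is commonly estimated by median-rank linear regression on the double-log transform of the empirical distribution. A sketch on synthetic strengths; the parameters m = 10 and a 300 MPa characteristic strength are hypothetical, not taken from the record:

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m by median-rank regression:
    ln(-ln(1 - F_i)) = m * ln(sigma_i) - m * ln(sigma_0)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.5) / n      # median-rank plotting positions
    y = np.log(-np.log(1.0 - F))
    x = np.log(s)
    m, _intercept = np.polyfit(x, y, 1)      # slope = Weibull modulus
    return m

rng = np.random.default_rng(0)
sample = 300.0 * rng.weibull(10.0, size=2000)   # hypothetical m = 10
m_hat = weibull_modulus(sample)
```

    A higher modulus means a narrower strength distribution, consistent with the record's finding that the NAP ceramic is more concentrated, hence more reliable.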

  15. Efficient semiparametric estimation in generalized partially linear additive models for longitudinal/clustered data

    KAUST Repository

    Cheng, Guang

    2014-02-01

    We consider efficient estimation of the Euclidean parameters in generalized partially linear additive models for longitudinal/clustered data when multiple covariates need to be modeled nonparametrically, and propose an estimation procedure based on a spline approximation of the nonparametric part of the model and the generalized estimating equations (GEE). Although the model in consideration is natural and useful in many practical applications, the literature on this model is very limited because of challenges in dealing with dependent data for nonparametric additive models. We show that the proposed estimators are consistent and asymptotically normal even if the covariance structure is misspecified. An explicit consistent estimate of the asymptotic variance is also provided. Moreover, we derive the semiparametric efficiency score and information bound under general moment conditions. By showing that our estimators achieve the semiparametric information bound, we effectively establish their efficiency in a stronger sense than what is typically considered for GEE. The derivation of our asymptotic results relies heavily on the empirical processes tools that we develop for the longitudinal/clustered data. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2014 ISI/BS.

  16. Generalized neurofuzzy network modeling algorithms using Bézier-Bernstein polynomial functions and additive decomposition.

    Science.gov (United States)

    Hong, X; Harris, C J

    2000-01-01

    This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems based upon basis functions that are Bézier-Bernstein polynomial functions. The approach is general in that it copes with n-dimensional inputs, utilising an additive decomposition construction to overcome the curse of dimensionality associated with high n. This new construction algorithm also introduces univariate Bézier-Bernstein polynomial functions for the completeness of the generalized procedure. Like B-spline expansion based neurofuzzy systems, Bézier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as nonnegativity of the basis functions, unity of support, and interpretability of basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and Delaunay input space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. This new modeling network is based on an additive decomposition approach together with two separate basis function formation approaches for both univariate and bivariate Bézier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data based modeling approach.
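
    The Bernstein basis functions underlying such networks satisfy nonnegativity and partition of unity, which is why they can be read as fuzzy membership functions. A minimal sketch:

```python
from math import comb

def bernstein_basis(n, x):
    """The n+1 Bernstein polynomials of degree n evaluated at x in [0, 1]:
    B_{i,n}(x) = C(n, i) * x^i * (1 - x)^(n - i)."""
    return [comb(n, i) * x**i * (1 - x)**(n - i) for i in range(n + 1)]
```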

  17. Estimation of Hazard Functions in the Log-Linear Age-Period-Cohort Model: Application to Lung Cancer Risk Associated with Geographical Area

    Directory of Open Access Journals (Sweden)

    Tengiz Mdzinarishvili

    2010-04-01

    Full Text Available An efficient computing procedure for estimating the age-specific hazard functions by the log-linear age-period-cohort (LLAPC) model is proposed. This procedure accounts for the influence of time period and birth cohort effects on the distribution of age-specific cancer incidence rates and estimates the hazard function for populations with different exposures to a given categorical risk factor. For these populations, the ratio of the corresponding age-specific hazard functions is proposed for use as a measure of relative hazard. This procedure was used for estimating the risks of lung cancer (LC) for populations living in different geographical areas. For this purpose, the LC incidence rates in white men and women in three geographical areas (namely: San Francisco-Oakland, Connecticut and Detroit), collected from the SEER 9 database during 1975-2004, were utilized. It was found that in white men the averaged relative hazard (an average of the relative hazards over all ages) of LC in Connecticut vs. San Francisco-Oakland is 1.31 ± 0.02, while in Detroit vs. San Francisco-Oakland this averaged relative hazard is 1.53 ± 0.02. In white women, analogous hazards in Connecticut vs. San Francisco-Oakland and Detroit vs. San Francisco-Oakland are 1.22 ± 0.02 and 1.32 ± 0.02, respectively. The proposed computing procedure can be used for assessing hazard functions for other categorical risk factors, such as gender, race, lifestyle, diet, obesity, etc.
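
    The averaged relative hazard used above is simply the mean over ages of the ratio of two age-specific hazard functions. A sketch with hypothetical rates on a common age grid (not the SEER data):

```python
import numpy as np

# Hypothetical age-specific hazard rates (per 100,000) on a common age grid.
h_area1 = np.array([5.0, 20.0, 80.0, 240.0, 410.0])
h_area2 = np.array([4.0, 15.0, 60.0, 185.0, 320.0])

# Averaged relative hazard: mean over ages of the ratio of hazard functions.
relative_hazard = (h_area1 / h_area2).mean()
```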

  18. The effect of tailor-made additives on crystal growth of methyl paraben: Experiments and modelling

    Science.gov (United States)

    Cai, Zhihui; Liu, Yong; Song, Yang; Guan, Guoqiang; Jiang, Yanbin

    2017-03-01

    In this study, methyl paraben (MP) was selected as the model component, and acetaminophen (APAP), p-methyl acetanilide (PMAA) and acetanilide (ACET), which share the similar molecular structure as MP, were selected as the three tailor-made additives to study the effect of tailor-made additives on the crystal growth of MP. HPLC results indicated that the MP crystals induced by the three additives contained MP only. Photographs of the single crystals prepared indicated that the morphology of the MP crystals was greatly changed by the additives, but PXRD and single crystal diffraction results illustrated that the MP crystals were the same polymorph only with different crystal habits, and no new crystal form was found compared with other references. To investigate the effect of the additives on the crystal growth, the interaction between additives and facets was discussed in detail using the DFT methods and MD simulations. The results showed that APAP, PMAA and ACET would be selectively adsorbed on the growth surfaces of the crystal facets, which induced the change in MP crystal habits.

  19. A flexible additive inflation scheme for treating model error in ensemble Kalman Filters

    Science.gov (United States)

    Sommer, Matthias; Janjic, Tijana

    2017-04-01

    Data assimilation algorithms require an accurate estimate of the uncertainty of the prior (background) field. However, the background error covariance derived from the ensemble of numerical model simulations does not adequately represent its uncertainty. This is partially due to the sampling error that arises from the use of a small number of ensemble members to represent the background error covariance. It is also partially a consequence of the fact that the model does not represent its own error. Several mechanisms have been introduced so far aiming at alleviating the detrimental effects of misrepresented ensemble covariances, allowing for the successful implementation of ensemble data assimilation techniques for atmospheric dynamics. One of the established approaches in ensemble data assimilation is additive inflation, which perturbs each ensemble member with a sample from a given distribution. This results in a fixed rank of the model error covariance matrix. Here, a more flexible approach is suggested in which the model error samples are treated as additional synthetic ensemble members that are used in the update step of data assimilation but are not forecast. In this way, the rank of the model error covariance matrix can be chosen independently of the ensemble. The effect of this altered additive inflation method on the performance of the filter is analyzed here in an idealised experiment. It is shown that the additional synthetic ensemble members can make it feasible to achieve convergence in an otherwise divergent setting of data assimilation. The use of this method also allows for a less stringent localization radius.
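
    Treating model-error samples as synthetic ensemble members raises the rank of the sample error covariance beyond the limit set by the forecast ensemble alone. A toy numerical illustration; the dimensions and noise amplitude are arbitrary, and no actual filter update is performed:

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_ens, n_synth = 8, 4, 6

ensemble = rng.normal(size=(n_ens, dim))            # forecast ensemble members
model_error = 0.3 * rng.normal(size=(n_synth, dim)) # model-error samples

def anomaly_rank(members):
    """Rank of the sample covariance built from member anomalies."""
    anomalies = members - members.mean(axis=0)
    return np.linalg.matrix_rank(np.cov(anomalies.T))

rank_plain = anomaly_rank(ensemble)                 # at most n_ens - 1
augmented = np.vstack([ensemble, ensemble.mean(axis=0) + model_error])
rank_aug = anomaly_rank(augmented)                  # higher: extra directions
```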

  20. Computer Models Used to Support Cleanup Decision Making at Hazardous and Radioactive Waste Sites

    Science.gov (United States)

    This report is a product of the Interagency Environmental Pathway Modeling Workgroup. It will help bring a uniform approach to solving environmental modeling problems common to site remediation and restoration efforts.

  1. Performance of Models for Flash Flood Warning and Hazard Assessment: The 2015 Kali Gandaki Landslide Dam Breach in Nepal

    Directory of Open Access Journals (Sweden)

    Jeremy D. Bricker

    2017-02-01

    Full Text Available The 2015 magnitude 7.8 Gorkha earthquake and its aftershocks weakened mountain slopes in Nepal. Co- and postseismic landsliding and the formation of landslide-dammed lakes along steeply dissected valleys were widespread, among them a landslide that dammed the Kali Gandaki River. Overtopping of the landslide dam resulted in a flash flood downstream, though casualties were prevented because of timely evacuation of low-lying areas. We hindcast the flood using the BREACH physically based dam-break model for upstream hydrograph generation, and compared the resulting maximum flow rate with those resulting from various empirical formulas and a simplified hydrograph based on published observations. Subsequent modeling of downstream flood propagation was compromised by a coarse-resolution digital elevation model with several artifacts. Thus, we used a digital-elevation-model preprocessing technique that combined carving and smoothing to derive topographic data. We then applied the 1-dimensional HEC-RAS model for downstream flood routing, and compared it to the 2-dimensional Delft-FLOW model. Simulations were validated using rectified frames of a video recorded by a resident during the flood in the village of Beni, allowing estimation of maximum flow depth and speed. Results show that hydrological smoothing is necessary when using coarse topographic data (such as SRTM or ASTER), as using raw topography underestimates flow depth and speed and overestimates flood wave arrival lag time. Results also show that the 2-dimensional model produces more accurate results than the 1-dimensional model but the 1-dimensional model generates a more conservative result and can be run in a much shorter time. Therefore, a 2-dimensional model is recommended for hazard assessment and planning, whereas a 1-dimensional model would facilitate real-time warning declaration.
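
    The carving-and-smoothing preprocessing mentioned above can be sketched on a 1-D elevation profile: a moving average removes noise, and carving enforces a non-increasing channel so that flood routing is not blocked by spurious DEM dams. A simplified illustration, not the authors' actual preprocessing:

```python
import numpy as np

def smooth_and_carve(profile, window=3):
    """Moving-average smoothing followed by 'carving': enforce a
    monotonically non-increasing elevation along the flow path."""
    pad = window // 2
    padded = np.pad(profile, pad, mode="edge")      # avoid edge droop
    kernel = np.ones(window) / window
    smoothed = np.convolve(padded, kernel, mode="valid")
    carved = smoothed.copy()
    for i in range(1, len(carved)):
        carved[i] = min(carved[i], carved[i - 1])   # remove spurious dams
    return carved

raw = np.array([120.0, 118.0, 121.0, 115.0, 116.0, 110.0, 111.0, 105.0])
channel = smooth_and_carve(raw)
```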

  2. Hydrological risks in anthropized watersheds: modeling of hazard, vulnerability and impacts on population from south-west of Madagascar

    Science.gov (United States)

    Mamy Rakotoarisoa, Mahefa; Fleurant, Cyril; Taibi, Nuscia; Razakamanana, Théodore

    2016-04-01

    Hydrological risks, especially floods, are recurrent in the Fiherenana watershed in southwest Madagascar. The city of Toliara, which is located at the outlet of the river basin, is subjected each year to hurricane hazards and floods. The stakes are of major importance in this part of the island. This study begins with an analysis of the hazard, collecting all existing hydro-climatic data for the catchment. It then seeks to determine trends, despite the significant lack of data, using simple statistical models (decomposition of time series). Two approaches are then used to assess the vulnerability of the city of Toliara and the surrounding villages. The first is a static approach based on field surveys and GIS. The second is a multi-agent-based simulation model. The first step is the mapping of a vulnerability index combining several static criteria. This is a microscale indicator (the scale used is the individual house). For each house, several vulnerability criteria are considered, such as the potential water depth, the flow rate, and the architectural typology of the building. For the second part, agent-based simulations are used to evaluate the degree of vulnerability of homes to flooding. Agents are individual entities to which behaviours can be assigned in order to simulate a given phenomenon. The aim is not to characterize the house as a physical building, through its architectural typology or strength, but to estimate the chances of the occupants of the house escaping a catastrophic flood. For this purpose, various settings and scenarios are compared; some scenarios take into account the effect of decisions made by the responsible entities (for example, informing and raising awareness among villagers). The simulation consists of two essential parts taking place simultaneously in time: simulation of the rise of water and the flow using

  3. Exploration of land-use scenarios for flood hazard modeling – the case of Santiago de Chile

    Directory of Open Access Journals (Sweden)

    A. Müller

    2011-04-01

    Full Text Available Urban expansion leads to modifications in land use and land cover and to the loss of vegetated areas. These developments are in some regions of the world accelerated by a changing regional climate. As a conseque