WorldWideScience

Sample records for additive hazards model

  1. A flexible additive multiplicative hazard model

    Martinussen, Torben; Scheike, Thomas H.

    2002-01-01

    Aalen's additive model; Counting process; Cox regression; Hazard model; Proportional excess hazard model; Time-varying effect

  2. Coordinate descent methods for the penalized semiparametric additive hazard model

    Gorst-Rasmussen, Anders; Scheike, Thomas

    2012-01-01

    ... The semiparametric additive hazards model is a flexible alternative which is a natural survival analogue of the standard linear regression model. Building on this analogy, we develop a cyclic coordinate descent algorithm for fitting the lasso and elastic net penalized additive hazards model. The...

  3. Additive Hazard Regression Models: An Application to the Natural History of Human Papillomavirus

    Xianhong Xie; Howard D. Strickler; Xiaonan Xue

    2013-01-01

    There are several statistical methods for time-to-event analysis, among which the Cox proportional hazards model is the most commonly used. However, when the absolute change in risk, instead of the risk ratio, is of primary interest or when the proportional hazards assumption of the Cox model is violated, an additive hazard regression model may be more appropriate. In this paper, we give an overview of this approach and then apply a semiparametric as well as a nonpara...

  4. Coordinate descent methods for the penalized semiparametric additive hazards model

    Gorst-Rasmussen, Anders; Scheike, Thomas

    For survival data with a large number of explanatory variables, lasso penalized Cox regression is a popular regularization strategy. However, a penalized Cox model may not always provide the best fit to data and can be difficult to estimate in high dimension because of its intrinsic nonlinearity. ... The algorithm requires no nonlinear optimization steps and offers excellent performance and stability. An implementation is available in the R package ahaz, and we demonstrate this package in a small timing study and in an application to real data. ...
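    To make the algorithmic idea concrete, here is a minimal Python sketch of cyclic coordinate descent with soft-thresholding for the lasso on an ordinary least-squares objective, the linear-regression analogue the abstract invokes. This is illustrative only, not the ahaz implementation, which applies the same cycling to the penalized additive-hazards loss:

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator used in lasso coordinate descent."""
    return np.sign(z) * max(abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=100, tol=1e-8):
    """Cyclic coordinate descent for the lasso on (1/2n)||y - Xb||^2 + lam*||b||_1.
    Illustrative sketch only, not the ahaz implementation."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)          # precomputed X_j'X_j
    resid = y - X @ beta                   # current residual
    for _ in range(n_iter):
        max_change = 0.0
        for j in range(p):
            resid += X[:, j] * beta[j]     # remove coordinate j's contribution
            zj = X[:, j] @ resid           # partial-residual correlation
            new_bj = soft_threshold(zj, n * lam) / col_sq[j]
            resid -= X[:, j] * new_bj      # add the updated contribution back
            max_change = max(max_change, abs(new_bj - beta[j]))
            beta[j] = new_bj
        if max_change < tol:
            break
    return beta
```

    Each coordinate update is available in closed form, which is why no nonlinear optimization steps are needed; the paper exploits the same structure for the additive hazards loss.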

  5. Additive Hazard Regression Models: An Application to the Natural History of Human Papillomavirus

    Xianhong Xie

    2013-01-01

    There are several statistical methods for time-to-event analysis, among which the Cox proportional hazards model is the most commonly used. However, when the absolute change in risk, instead of the risk ratio, is of primary interest or when the proportional hazards assumption of the Cox model is violated, an additive hazard regression model may be more appropriate. In this paper, we give an overview of this approach and then apply a semiparametric as well as a nonparametric additive model to a data set from a study of the natural history of human papillomavirus (HPV) in HIV-positive and HIV-negative women. The results from the semiparametric model indicated on average an additional 14 oncogenic HPV infections per 100 woman-years associated with CD4 count < 200 relative to HIV-negative women, and those from the nonparametric additive model showed an additional 40 oncogenic HPV infections per 100 women over 5 years of follow-up, while the estimated hazard ratio in the Cox model was 3.82. Although the Cox model can provide a better understanding of the exposure-disease association, the additive model is often more useful for public health planning and intervention.
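    For readers comparing the model families named above, the hazard functions can be written side by side (standard textbook forms; the Aalen model is the nonparametric additive variant with time-varying coefficients):

```latex
% Cox proportional hazards: covariates act multiplicatively on the baseline
\lambda(t \mid Z) = \lambda_0(t)\, \exp(\beta^{\top} Z)

% Semiparametric additive hazards (Lin-Ying type): constant additive effects
\lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z

% Nonparametric additive hazards (Aalen): time-varying additive effects
\lambda(t \mid Z) = \beta_0(t) + \beta(t)^{\top} Z
```

    Under the additive forms a coefficient is an absolute rate difference, which is why the CD4 effect above can be reported directly as 14 extra infections per 100 woman-years rather than as a ratio.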

  6. Asymptotics on Semiparametric Analysis of Multivariate Failure Time Data Under the Additive Hazards Model

    Huan-bin Liu; Liu-quan Sun; Li-xing Zhu

    2005-01-01

    Many survival studies record the times to two or more distinct failures on each subject. The failures may be events of different natures or may be repetitions of the same kind of event. In this article, we consider the regression analysis of such multivariate failure time data under the additive hazards model. Simple weighted estimating functions for the regression parameters are proposed, and the asymptotic distribution theory of the resulting estimators is derived. In addition, a class of generalized Wald and generalized score statistics for hypothesis testing and model selection is presented, and the asymptotic properties of these statistics are examined.
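    As a hedged sketch of what such a weighted estimating function typically looks like for the additive hazards model (in the spirit of Lin-Ying-type estimators; the paper's exact weights and notation may differ):

```latex
U(\beta) = \sum_{i=1}^{n} \sum_{k=1}^{K} \int_{0}^{\tau} w_k(t)
  \left\{ Z_{ik}(t) - \bar{Z}_k(t) \right\}
  \left\{ \mathrm{d}N_{ik}(t) - Y_{ik}(t)\, \beta^{\top} Z_{ik}(t)\, \mathrm{d}t \right\}
```

    where N_{ik} counts the k-th failure type for subject i, Y_{ik} is the at-risk indicator, Z-bar_k(t) is the weighted average covariate among subjects at risk, and w_k is a weight function. Setting U(beta) = 0 yields a closed-form estimator, and sandwich-type variance estimates accommodate within-subject dependence.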

  7. Estimation of direct effects for survival data by using the Aalen additive hazards model

    Martinussen, T.; Vansteelandt, S.; Gerster, M.

    2011-01-01

    We extend the definition of the controlled direct effect of a point exposure on a survival outcome, other than through some given, time-fixed intermediate variable, to the additive hazard scale. We propose two-stage estimators for this effect when the exposure is dichotomous and randomly assigned...

  8. Sparse Additive Models

    Ravikumar, Pradeep; Lafferty, John; Liu, Han; Wasserman, Larry

    2007-01-01

    We present a new class of methods for high-dimensional nonparametric regression and classification called sparse additive models (SpAM). Our methods combine ideas from sparse linear modeling and additive nonparametric regression. We derive an algorithm for fitting the models that is practical and effective even when the number of covariates is larger than the sample size. SpAM is closely related to the COSSO model of Lin and Zhang (2006), but decouples smoothing and sparsity, enabling the use...
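    The SpAM fitting loop (smooth each partial residual, then shrink whole component functions by a group soft-threshold, which is how smoothing and sparsity are decoupled) can be sketched briefly in Python. The box-kernel smoother and penalty value below are illustrative stand-ins, not the authors' choices:

```python
import numpy as np

def running_mean_smoother(x, r, bandwidth=0.1):
    """A crude box-kernel smoother standing in for the nonparametric
    smoother; SpAM allows any linear smoother here."""
    fitted = np.empty_like(r)
    for i, xi in enumerate(x):
        w = np.abs(x - xi) <= bandwidth
        fitted[i] = r[w].mean()
    return fitted

def spam_backfit(X, y, lam, n_iter=50):
    """Sketch of the SpAM backfitting loop: smooth each partial residual,
    then apply group soft-thresholding to zero out whole components."""
    n, p = X.shape
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - y.mean() - f.sum(axis=1) + f[:, j]   # partial residual
            s_j = running_mean_smoother(X[:, j], r_j)
            norm = np.sqrt(np.mean(s_j ** 2))              # empirical L2 norm
            shrink = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
            f[:, j] = shrink * s_j                         # group soft-threshold
            f[:, j] -= f[:, j].mean()                      # center each component
    return f
```

    Components whose smoothed fit has small empirical norm are shrunk exactly to zero, giving sparsity over whole functions rather than individual coefficients.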

  9. Additive Hazards Regression with Random Effects for Clustered Failure Times

    Deng PAN; Yan Yan LIU; Yuan Shan WU

    2015-01-01

    An additive hazards model with random effects is proposed for modelling correlated failure time data when the focus is on comparing failure times within clusters and on estimating the correlation between failure times from the same cluster, as well as the marginal regression parameters. A feature of our model is that, when marginalized over the random-effect variable, it retains the structure of the additive hazards model. We develop estimating equations for inferring the regression parameters. The proposed estimators are shown to be consistent and asymptotically normal under appropriate regularity conditions. Furthermore, an estimator of the baseline hazard function is proposed and its asymptotic properties are established. We propose a class of diagnostic methods to assess the overall fitting adequacy of the additive hazards model with random effects. We conduct simulation studies to evaluate the finite-sample behavior of the proposed estimators in various scenarios. An analysis of the Diabetic Retinopathy Study is provided as an illustration of the proposed method.
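    One way to see the "marginalization preserves additivity" feature (a hedged reading; the paper's exact random-effect specification may differ): if the cluster effect enters the hazard additively,

```latex
\lambda_{ij}(t \mid u_i) = \lambda_0(t) + u_i + \beta^{\top} Z_{ij}
\quad\Longrightarrow\quad
\lambda_{ij}(t) = \lambda_0^{*}(t) + \beta^{\top} Z_{ij}
```

    then the covariate-free frailty contribution is absorbed into a new baseline lambda_0^*(t), so the marginal hazard is again additive and the regression parameters keep their marginal interpretation.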

  10. Model Additional Protocol

    Since the end of the Cold War, a series of events has changed the circumstances and requirements of the safeguards system. The discovery of a clandestine nuclear weapons program in Iraq, the continuing difficulty in verifying the initial report of the Democratic People's Republic of Korea upon entry into force of its safeguards agreement, and the decision of the South African Government to give up its nuclear weapons program and join the Treaty on the Non-Proliferation of Nuclear Weapons have all played a role in an ambitious effort by IAEA Member States and the Secretariat to strengthen the safeguards system. A major milestone in this effort was reached in May 1997 when the IAEA Board of Governors approved a Model Protocol Additional to Safeguards Agreements. The Model Additional Protocol was negotiated over a period of less than a year by an open-ended committee of the Board involving some 70 Member States and two regional inspectorates. The IAEA is now in the process of negotiating additional protocols, State by State, and implementing them. These additional protocols will provide the IAEA with rights of access to information about all activities related to the use of nuclear material in States with comprehensive safeguards agreements, and greatly expanded physical access for IAEA inspectors to confirm or verify this information. In conjunction with this, the IAEA is working on the integration of these measures with those provided for in comprehensive safeguards agreements, with a view to maximizing, within available resources, the effectiveness and efficiency of safeguards implementation. Details concerning the Model Additional Protocol are given. (author)

  11. Computer Model Locates Environmental Hazards

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.

  12. Comparative Distributions of Hazard Modeling Analysis

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We illustrate the flexibility of the hazard modeling distribution in approximating different distributions.

  13. Satellite image collection modeling for large area hazard emergency response

    Liu, Shufan; Hodgson, Michael E.

    2016-08-01

    Timely collection of critical hazard information is the key to intelligent and effective hazard emergency response decisions. Satellite remote sensing imagery provides an effective way to collect critical information. Natural hazards, however, often have large impact areas - larger than a single satellite scene. Additionally, the hazard impact area may be discontinuous, particularly in flooding or tornado hazard events. In this paper, a spatial optimization model is proposed to solve the large-area satellite image acquisition planning problem in the context of hazard emergency response. In the model, a large hazard impact area is represented as multiple polygons, and image collection priorities for different portions of the impact area are addressed. The optimization problem is solved with an exact algorithm. Application results demonstrate that the proposed method can address the satellite image acquisition planning problem. A spatial decision support system supporting the optimization model was developed. Several examples of image acquisition problems are used to demonstrate the complexity of the problem and derive optimized solutions.
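    As a toy illustration of the planning problem (multiple disjoint impact polygons, priority-weighted coverage, a limited number of scenes), here is a greedy Python sketch. The paper solves the problem exactly; the data structure and all names here are hypothetical:

```python
def greedy_collection_plan(scenes, targets, max_scenes):
    """Greedy sketch of the acquisition-planning idea: repeatedly pick the
    candidate scene footprint covering the largest remaining weighted impact
    area. `scenes` maps scene id -> set of impact-cell ids; `targets` maps
    cell id -> priority weight. Illustrative only, not the exact algorithm."""
    remaining = dict(targets)
    plan = []
    for _ in range(max_scenes):
        best_id, best_gain = None, 0.0
        for sid, cells in scenes.items():
            gain = sum(remaining.get(c, 0.0) for c in cells)
            if gain > best_gain:
                best_id, best_gain = sid, gain
        if best_id is None:
            break                      # nothing left worth collecting
        plan.append(best_id)
        for c in scenes.pop(best_id):  # mark covered cells as done
            remaining.pop(c, None)
    return plan

# Example: two disjoint impact polygons rasterized to cells 1..6
scenes = {"s1": {1, 2, 3}, "s2": {3, 4}, "s3": {5, 6}}
targets = {1: 1.0, 2: 1.0, 3: 2.0, 4: 1.0, 5: 3.0, 6: 3.0}
print(greedy_collection_plan(scenes, targets, max_scenes=2))  # ['s3', 's1']
```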

  14. Modeling and Hazard Analysis Using STPA

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis ...

  15. Validation of a heteroscedastic hazards regression model.

    Wu, Hong-Dar Isaac; Hsieh, Fushing; Chen, Chen-Hsin

    2002-03-01

    A Cox-type regression model accommodating heteroscedasticity, with a power factor of the baseline cumulative hazard, is investigated for analyzing data with crossing-hazards behavior. Since the approach of partial likelihood cannot eliminate the baseline hazard, an overidentified estimating equation (OEE) approach is introduced in the estimation procedure. Its by-product, a model checking statistic, is presented to test for the overall adequacy of the heteroscedastic model. Further, under the heteroscedastic model setting, we propose two statistics to test the proportional hazards assumption. Implementation of this model is illustrated in a data analysis of a cancer clinical trial. PMID:11878222

  16. POTENTIAL HAZARDS DUE TO FOOD ADDITIVES IN ORAL HYGIENE PRODUCTS

    Damla Tuncer-Budanur; Murat Cengizhan Yaş; Elif Sepet

    2016-01-01

    Food additives used to preserve flavor or to enhance the taste and appearance of foods are also present in oral hygiene products. The aim of this review is to provide information concerning food additives in oral hygiene products and their adverse effects. A great many food additives in oral hygiene products are potential allergens, and they may lead to allergic reactions such as urticaria, contact dermatitis, rhinitis, and angioedema. Dental practitioners, as w...

  17. POTENTIAL HAZARDS DUE TO FOOD ADDITIVES IN ORAL HYGIENE PRODUCTS

    Damla TUNCER-BUDANUR

    2016-04-01

    Food additives used to preserve flavor or to enhance the taste and appearance of foods are also present in oral hygiene products. The aim of this review is to provide information concerning food additives in oral hygiene products and their adverse effects. A great many food additives in oral hygiene products are potential allergens, and they may lead to allergic reactions such as urticaria, contact dermatitis, rhinitis, and angioedema. Dental practitioners, as well as health care providers, must be aware of the possibility of allergic reactions due to food additives in oral hygiene products. Proper dosage levels, delivery vehicles, frequency, potential benefits, and adverse effects of oral health products should be explained completely to the patients. There is a necessity to raise awareness among dental professionals on this subject and to develop a data gathering system for possible adverse reactions.

  18. Accelerated Hazards Mixture Cure Model

    Zhang, Jiajia; Peng, Yingwei

    2009-01-01

    We propose a new cure model for survival data with a surviving or cure fraction. The new model is a mixture cure model where the covariate effects on the proportion of cure and the distribution of the failure time of uncured patients are separately modeled. Unlike the existing mixture cure models, the new model allows covariate effects on the failure time distribution of uncured patients to be negligible at time zero and to increase as time goes by. Such a model is particularly useful in some...

  19. Hazard Warning: model misuse ahead

    Dickey-Collas, M.; Payne, Mark; Trenkel, V.

    2014-01-01

    The use of modelling approaches in marine science, and in particular fisheries science, is explored. We highlight that the choice of model used for an analysis should account for the question being posed or the context of the management problem. We examine a model-classification scheme based on R...

  20. A conflict model for the international hazardous waste disposal dispute

    A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.

  1. Spatial extended hazard model with application to prostate cancer survival.

    Li, Li; Hanson, Timothy; Zhang, Jiajia

    2015-06-01

    This article develops a Bayesian semiparametric approach to the extended hazard model, with generalization to high-dimensional spatially grouped data. County-level spatial correlation is accommodated marginally through the normal transformation model of Li and Lin (2006, Journal of the American Statistical Association 101, 591-603), using a correlation structure implied by an intrinsic conditionally autoregressive prior. Efficient Markov chain Monte Carlo algorithms are developed, especially applicable to fitting very large, highly censored areal survival data sets. Per-variable tests for proportional hazards, accelerated failure time, and accelerated hazards are efficiently carried out with and without spatial correlation through Bayes factors. The resulting reduced, interpretable spatial models can fit significantly better than a standard additive Cox model with spatial frailties. PMID:25521422
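    For orientation, the extended hazard (EH) model is commonly written so that it nests the three structures being tested (a standard form, e.g. Chen-Jewell style; the paper's notation may differ):

```latex
\lambda(t \mid x) = \lambda_0\!\left(t\, e^{\beta^{\top} x}\right) e^{\gamma^{\top} x}
```

    Proportional hazards is recovered when beta = 0, accelerated failure time when beta = gamma, and accelerated hazards when gamma = 0, which is what makes per-variable Bayes factor tests across the three sub-models natural.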

  2. Business models for additive manufacturing

    Hadar, Ronen; Bilberg, Arne; Bogers, Marcel

    2015-01-01

    Digital fabrication — including additive manufacturing (AM), rapid prototyping and 3D printing — has the potential to revolutionize the way in which products are produced and delivered to the customer. Therefore, it challenges companies to reinvent their business model — describing the logic of creating and capturing value. In this paper, we explore the implications that AM technologies have for manufacturing systems in the new business models that they enable. In particular, we consider how a consumer goods manufacturer can organize the operations of a more open business model when moving from a manufacturer-centric to a consumer-centric value logic. A major shift includes a move from centralized to decentralized supply chains, where consumer goods manufacturers can implement a "hybrid" approach with a focus on localization and accessibility, or develop a fully personalized model where the consumer effectively takes over the productive activities of the manufacturer. We discuss some of the main implications for research and practice of consumer-centric business models and the changing decoupling point in consumer goods' manufacturing supply chains.

  3. On multiple agent models of moral hazard

    Andrea Attar; Eloisa Campioni; Gwenaël Piaser; Uday Rajan

    2006-01-01

    In multiple principal, multiple agent models of moral hazard, we provide conditions under which the outcomes of equilibria in direct mechanisms are preserved when principals can offer indirect communication schemes. We discuss the role of random allocations and recommendations and relate the result to the existing literature.

  4. Hazard identification based on plant functional modelling

    A major objective of the present work is to provide means for representing a process plant as a socio-technical system, so as to allow hazard identification at a high level. The method includes technical, human and organisational aspects and is intended to be used for plant level hazard identification so as to identify critical areas and the need for further analysis using existing methods. The first part of the method is the preparation of a plant functional model where a set of plant functions link together hardware, software, operations, work organisation and other safety related aspects of the plant. The basic principle of the functional modelling is that any aspect of the plant can be represented by an object (in the sense that this term is used in computer science) based upon an Intent (or goal); associated with each Intent are Methods, by which the Intent is realized, and Constraints, which limit the Intent. The Methods and Constraints can themselves be treated as objects and decomposed into lower-level Intents (hence the procedure is known as functional decomposition) so giving rise to a hierarchical, object-oriented structure. The plant level hazard identification is carried out on the plant functional model using the Concept Hazard Analysis method. In this, the user will be supported by checklists and keywords and the analysis is structured by pre-defined worksheets. The preparation of the plant functional model and the performance of the hazard identification can be carried out manually or with computer support. (au) (4 tabs., 10 ills., 7 refs.)
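    The Intent/Method/Constraint object structure described above can be sketched as a small recursive data type; a minimal illustration under stated assumptions (field names follow the abstract, everything else is hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Intent:
    """Node in the plant functional model: an Intent realized by Methods and
    limited by Constraints, each of which may decompose into further Intents
    (functional decomposition). Illustrative sketch, not the paper's tooling."""
    goal: str
    methods: list["Intent"] = field(default_factory=list)
    constraints: list["Intent"] = field(default_factory=list)

# Example decomposition: a cooling Intent, its Method, and a Constraint
cooling = Intent("Maintain core cooling")
cooling.methods.append(Intent("Circulate coolant via primary pumps"))
cooling.constraints.append(Intent("Keep flow above minimum safe rate"))
```

    Because Methods and Constraints are themselves Intents, the hierarchy can be walked uniformly during plant-level hazard identification.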

  5. Ground-Level Ozone Following Astrophysical Ionizing Radiation Events: An Additional Biological Hazard?

    Thomas, Brian C; Goracke, Byron D

    2016-01-01

    Astrophysical ionizing radiation events such as supernovae, gamma-ray bursts, and solar proton events have been recognized as a potential threat to life on Earth, primarily through depletion of stratospheric ozone and subsequent increase in solar UV radiation at Earth's surface and in the upper levels of the ocean. Other work has also considered the potential impact of nitric acid rainout, concluding that no significant threat is likely. Not yet studied to date is the potential impact of ozone produced in the lower atmosphere following an ionizing radiation event. Ozone is a known irritant to organisms on land and in water and therefore may be a significant additional hazard. Using previously completed atmospheric chemistry modeling, we examined the amount of ozone produced in the lower atmosphere for the case of a gamma-ray burst and found that the values are too small to pose a significant additional threat to the biosphere. These results may be extended to other ionizing radiation events, including supernovae and extreme solar proton events. PMID:26745353

  6. Ground-level ozone following astrophysical ionizing radiation events: an additional biological hazard?

    Thomas, Brian C

    2015-01-01

    Astrophysical ionizing radiation events such as supernovae, gamma-ray bursts, and solar proton events have been recognized as a potential threat to life on Earth, primarily through depletion of stratospheric ozone and subsequent increase in solar UV radiation at Earth's surface and in the upper levels of the ocean. Other work has also considered the potential impact of nitric acid rainout, concluding that no significant threat is likely. Not yet studied to-date is the potential impact of ozone produced in the lower atmosphere following an ionizing radiation event. Ozone is a known irritant to organisms on land and in water and therefore may be a significant additional hazard. Using previously completed atmospheric chemistry modeling we have examined the amount of ozone produced in the lower atmosphere for the case of a gamma-ray burst and find that the values are too small to pose a significant additional threat to the biosphere. These results may be extended to other ionizing radiation events, including supe...

  7. Application of a hazard-based visual predictive check to evaluate parametric hazard models.

    Huh, Yeamin; Hutmacher, Matthew M

    2016-02-01

    Parametric models used in time-to-event analyses are typically evaluated by survival-based visual predictive checks (VPC): Kaplan-Meier survival curves for the observed data are compared with those estimated using model-simulated data. Because the derivative of the log of the survival curve is related to the hazard--the typical quantity modeled in parametric analysis--isolation, interpretation, and correction of deficiencies in the hazard model by inspection of survival-based VPCs is indirect and thus more difficult. The purpose of this study is to assess the performance of nonparametric estimators of hazard functions to evaluate their viability as VPC diagnostics. Histogram-based and kernel-smoothing estimators were evaluated in terms of bias in estimating the hazard for Weibull and bathtub-shape hazard scenarios. After the evaluation of bias, these nonparametric estimators were assessed as a method for VPC evaluation of the hazard model. The results showed that the nonparametric hazard estimators performed reasonably at the sample sizes studied, with greater bias near the boundaries (time equal to 0 and last observation), as expected. Flexible bandwidth and boundary correction methods reduced these biases. All the nonparametric estimators indicated a misfit of the Weibull model when the true hazard was a bathtub shape. Overall, hazard-based VPC plots enabled more direct interpretation of the VPC results compared to survival-based VPC plots. PMID:26563504
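    The kind of kernel-smoothed hazard estimator being evaluated can be sketched in a few lines: smooth the Nelson-Aalen increments with a kernel (here Epanechnikov, with no boundary correction, which is exactly where the abstract notes the bias concentrates). A minimal sketch, not the authors' implementation:

```python
import numpy as np

def kernel_hazard(times, events, grid, bandwidth):
    """Kernel-smoothed hazard estimate (Ramlau-Hansen type): smooth the
    Nelson-Aalen increments d_i / Y(t_i) with an Epanechnikov kernel.
    Ties are ignored and no boundary correction is applied."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)              # Y(t_i): number still at risk
    incr = events / at_risk                 # Nelson-Aalen increments
    def epan(u):
        return 0.75 * (1 - u ** 2) * (np.abs(u) <= 1)
    h = np.zeros(len(grid))
    for k, t in enumerate(grid):
        u = (t - times) / bandwidth
        h[k] = np.sum(epan(u) * incr) / bandwidth
    return h

# Usage: the hazard of exponential data should be roughly constant (= 0.5)
rng = np.random.default_rng(0)
t = rng.exponential(scale=2.0, size=500)
haz = kernel_hazard(t, np.ones_like(t, dtype=int), np.linspace(0.5, 3, 6), 0.5)
```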

  8. Integrated Modeling for Flood Hazard Mapping Using Watershed Modeling System

    Seyedeh S. Sadrolashrafi

    2008-01-01

    In this study, a new framework is developed that integrates a Geographic Information System (GIS) with the Watershed Modeling System (WMS) for flood modeling. It also interconnects the terrain models and the GIS software with standard commercial hydrological and hydraulic models, including HEC-1, HEC-RAS, etc. The Dez River Basin (about 16213 km2) in Khuzestan province, Iran, is the study domain because of its frequent severe flash flooding. As a case study, a major flood in the autumn of 2001 is chosen to examine the modeling framework. The model consists of a rainfall-runoff model (HEC-1) that converts excess precipitation to overland flow and channel runoff, and a hydraulic model (HEC-RAS) that simulates steady-state flow through the river channel network based on the HEC-1 peak hydrographs. In addition, it delineates maps of potential flood zonation for the Dez River Basin. These are produced with state-of-the-art GIS capabilities using the WMS software. Watershed parameters are calibrated manually to obtain a good simulation of discharge at three sub-basins. With the calibrated discharge, WMS is capable of producing a flood hazard map. The modeling framework presented in this study demonstrates the accuracy and usefulness of the WMS software for flash flood control. The results of this research will benefit future modeling efforts by providing validated hydrological software to forecast flooding on a regional scale. The model was designed for the Dez River Basin, but it may be used as a prototype for model applications in other areas.

  9. Reactive Additive Stabilization Process (RASP) for hazardous and mixed waste vitrification

    Solidification of hazardous/mixed wastes into glass is being examined at the Savannah River Site (SRS) for (1) nickel plating line (F006) sludges and (2) incinerator wastes. Vitrification of these wastes using high surface area additives, the Reactive Additive Stabilization Process (RASP), has been determined to greatly enhance the dissolution and retention of hazardous, mixed, and heavy metal species in glass. RASP lowers melt temperatures (typically 1050--1150 degrees C), thereby minimizing volatility concerns during vitrification. RASP maximizes waste loading (typically 50--75 wt% on a dry oxide basis) by taking advantage of the glass forming potential of the waste. RASP vitrification thereby minimizes waste disposal volume (typically 86--97 vol. %), and maximizes cost savings. Solidification of the F006 plating line sludges containing depleted uranium has been achieved in both soda-lime-silica (SLS) and borosilicate glasses at 1150 degrees C up to waste loadings of 75 wt%. Solidification of incinerator blowdown and mixtures of incinerator blowdown and bottom kiln ash has been achieved in SLS glass at 1150 degrees C up to waste loadings of 50% using RASP. These waste loadings correspond to volume reductions of 86 and 94 volume %, respectively, with large associated savings in storage costs

  10. Reactive Additive Stabilization Process (RASP) for hazardous and mixed waste vitrification

    Jantzen, C.M.; Pickett, J.B.; Ramsey, W.G.

    1993-07-01

    Solidification of hazardous/mixed wastes into glass is being examined at the Savannah River Site (SRS) for (1) nickel plating line (F006) sludges and (2) incinerator wastes. Vitrification of these wastes using high surface area additives, the Reactive Additive Stabilization Process (RASP), has been determined to greatly enhance the dissolution and retention of hazardous, mixed, and heavy metal species in glass. RASP lowers melt temperatures (typically 1050--1150 degrees C), thereby minimizing volatility concerns during vitrification. RASP maximizes waste loading (typically 50--75 wt% on a dry oxide basis) by taking advantage of the glass forming potential of the waste. RASP vitrification thereby minimizes waste disposal volume (typically 86--97 vol. %), and maximizes cost savings. Solidification of the F006 plating line sludges containing depleted uranium has been achieved in both soda-lime-silica (SLS) and borosilicate glasses at 1150 degrees C up to waste loadings of 75 wt%. Solidification of incinerator blowdown and mixtures of incinerator blowdown and bottom kiln ash has been achieved in SLS glass at 1150 degrees C up to waste loadings of 50% using RASP. These waste loadings correspond to volume reductions of 86 and 94 volume %, respectively, with large associated savings in storage costs.

  11. Lahar Hazard Modeling at Tungurahua Volcano, Ecuador

    Sorensen, O. E.; Rose, W. I.; Jaya, D.

    2003-04-01

    LAHARZ, a program that delineates lahar-hazard zones using a digital elevation model (DEM), was used to construct a hazard map for the volcano. The 10 meter resolution DEM was constructed for Tungurahua Volcano using scanned topographic lines obtained from the GIS Department at the Escuela Politécnica Nacional, Quito, Ecuador. The steep topographic gradients and rapid downcutting of most rivers draining the edifice prevent the deposition of lahars on the lower flanks of Tungurahua. Modeling confirms the high degree of flow channelization in the deep Tungurahua canyons. Inundation zones observed and shown by LAHARZ at Baños allow identification of safe zones within the city that would provide safety from even the largest-magnitude lahar expected.

  12. Incident duration modeling using flexible parametric hazard-based models.

    Li, Ruimin; Shang, Pan

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time. PMID:25530753
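    As a concrete example of the parametric hazard-based building block, here is a minimal Python sketch of maximum-likelihood fitting for a right-censored Weibull duration model, the simplest family listed above. The data are simulated placeholders, not the Beijing incident dataset:

```python
import numpy as np
from scipy.optimize import minimize

def weibull_neg_loglik(params, t, d):
    """Negative log-likelihood for right-censored Weibull durations with
    hazard h(t) = (k/s) * (t/s)**(k-1); d = 1 if observed, 0 if censored."""
    log_k, log_s = params                      # optimize on the log scale
    k, s = np.exp(log_k), np.exp(log_s)
    log_h = np.log(k / s) + (k - 1) * np.log(t / s)   # log hazard at t
    cum_h = (t / s) ** k                              # cumulative hazard
    return -(np.sum(d * log_h) - np.sum(cum_h))

# Fit to incident-duration-style data (simulated here for illustration)
rng = np.random.default_rng(1)
t = rng.weibull(1.5, size=300) * 40.0          # durations, e.g. in minutes
d = np.ones_like(t)                            # all events observed
res = minimize(weibull_neg_loglik, x0=[0.0, np.log(30.0)], args=(t, d))
shape, scale = np.exp(res.x)
```

    The flexible parametric models in the study generalize exactly this likelihood by replacing the two-parameter Weibull hazard with a spline-based log cumulative hazard.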

  13. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Ruimin Li

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.

  14. Incorporation of all hazard categories into U.S. NRC PRA models

    Over the last two decades, the U.S. Nuclear Regulatory Commission (NRC) has maintained independent probabilistic risk assessment (PRA) models to calculate nuclear power plant (NPP) core damage frequency (CDF) from internal events at power. These models are known as Standardized Plant Analysis Risk (SPAR) models. There are 79 such models representing 104 domestic nuclear plants, with some SPAR models representing more than one unit on the site. These models allow the NRC risk analysts to perform independent quantitative risk estimates of operational events and degraded plant conditions. It is well recognized that using only the internal events contribution to overall plant risk estimates provides a useful, but limited, assessment of the complete plant risk profile. Inclusion of all hazard categories applicable to a plant in the plant PRA model would provide a more comprehensive assessment of plant risk. However, implementation of a more comprehensive treatment of additional hazard categories (e.g., fire, flooding, high winds, seismic) presents a number of challenges, including technical considerations. The U.S. NRC has been incorporating additional hazard categories into its set of nuclear power plant PRA models since 2004. Currently, 18 SPAR models include additional hazard categories such as internal flooding, internal fire, seismic, and wind events. In most cases, these external hazard models were derived from Generic Letter 88-20 Individual Plant Examination of External Events (IPEEE) reports. Recently, NRC started incorporating detailed Fire PRA (FPRA) information based on the current licensing effort that allows licensees to transition into a risk-informed fire protection framework, as well as additional external hazards developed by some licensees, into enhanced SPAR models. These updated external hazards SPAR models are referred to as SPAR All-Hazard (SPAR-AHZ) models (i.e., they incorporate additional risk contributors beyond internal events). This paper ...

  15. Nonparametric and semiparametric dynamic additive regression models

    Scheike, Thomas Harder; Martinussen, Torben

    Dynamic additive regression models provide a flexible class of models for analysis of longitudinal data. The approach suggested in this work is suited for measurements obtained at random time points and aims at estimating time-varying effects. Both fully nonparametric and semiparametric models can...

  16. A Model Based on Crowdsourcing for Detecting Natural Hazards

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides a new method for the detection, early warning, mitigation, and relief of natural hazards. Given the suddenness and unpredictable location of natural hazards, as well as the practical demands of hazard response work, this article proposes an evaluation model for remote sensing detection of natural hazards based on crowdsourcing. Firstly, using the crowdsourcing model and with the help of the Internet and the power of hundreds of millions of Internet users, this evaluation model provides visual interpretation of high-resolution remote sensing images of hazard areas and collects massive valuable disaster data; secondly, this evaluation model adopts a strategy of dynamic voting consistency to evaluate the disaster data provided by the crowdsourcing workers; thirdly, this evaluation model pre-estimates the disaster severity with a disaster pre-evaluation model based on regional buffers; lastly, the evaluation model triggers the corresponding expert system according to the forecast results. The idea of this model breaks the boundaries between geographic information professionals and the public, enables public participation and citizen science, and improves the accuracy and timeliness of hazard assessment results.
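    The "dynamic voting consistency" strategy is not specified in detail here, but a minimal consistency-based acceptance rule conveys the idea; the quorum and agreement thresholds below are illustrative assumptions, not values from the paper:

```python
from collections import Counter

def dynamic_vote(labels, quorum=3, agreement=0.8):
    """Sketch of a voting-consistency rule for crowdsourced hazard labels:
    accept a cell's label once enough volunteers agree; otherwise keep
    collecting votes. Thresholds are illustrative."""
    if len(labels) < quorum:
        return None                       # not enough votes yet
    top_label, top_count = Counter(labels).most_common(1)[0]
    if top_count / len(labels) >= agreement:
        return top_label                  # consensus reached
    return None                           # conflicting votes: ask more workers

print(dynamic_vote(["damaged", "damaged", "intact", "damaged"]))  # None (75% < 80%)
print(dynamic_vote(["damaged"] * 4 + ["intact"]))                 # 'damaged'
```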

  17. A Moral Hazard Model of Parental Care

    Baomin Dong; Tianpeng Zhou

    2013-01-01

    One perplexing observation is that although men and women have different comparative advantages, cooperation is often seen only during child-bearing and rearing periods. One interpretation is that the juvenile offspring serves as an indivisible public good that facilitates cooperation between opposite sexes of adults. We show that moral hazard in maternal parental care will either force the father to pay the mother a rent in order to induce optimal care (when the child is of intrinsic high qual...

  18. 2015 USGS Seismic Hazard Model for Induced Seismicity

    Petersen, M. D.; Mueller, C. S.; Moschetti, M. P.; Hoover, S. M.; Ellsworth, W. L.; Llenos, A. L.; Michael, A. J.

    2015-12-01

    Over the past several years, the seismicity rate has increased markedly in multiple areas of the central U.S. Studies have tied the majority of this increased activity to wastewater injection in deep wells and hydrocarbon production. These earthquakes are induced by human activities that change rapidly based on economic and policy decisions, making them difficult to forecast. Our 2014 USGS National Seismic Hazard Model and previous models are intended to provide the long-term hazard (2% probability of exceedance in 50 years) and are based on seismicity rates and patterns observed mostly from tectonic earthquakes. However, potentially induced earthquakes were identified in 14 regions that were not included in the earthquake catalog used for constructing the 2014 model. We recognized the importance of considering these induced earthquakes in a separate hazard analysis, and as a result in April 2015 we released preliminary models that explored the impact of this induced seismicity on the hazard. Several factors are important in determining the hazard from induced seismicity: period of the catalog that optimally forecasts the next year's activity, earthquake magnitude-rate distribution, earthquake location statistics, maximum magnitude, ground motion models, and industrial drivers such as injection rates. The industrial drivers are not currently available in a form that we can implement in a 1-year model. Hazard model inputs have been evaluated by a broad group of scientists and engineers to assess the range of acceptable models. Results indicate that next year's hazard is significantly higher by more than a factor of three in Oklahoma, Texas, and Colorado compared to the long-term 2014 hazard model. These results have raised concern about the impacts of induced earthquakes on the built environment and have led to many engineering and policy discussions about how to mitigate these effects for the more than 7 million people that live near areas of induced seismicity.

  19. Flood hazard maps from SAR data and global hydrodynamic models

    Giustarini, Laura; Chini, Marci; Hostache, Renaud; Matgen, Patrick; Pappenberger, Florian; Bally, Phillippe

    2015-04-01

    With flood consequences likely to amplify because of growing population and ongoing accumulation of assets in flood-prone areas, global flood hazard and risk maps are greatly needed for improving flood preparedness at large scale. At the same time, with the rapidly growing archives of SAR images of floods, there is a high potential of making use of these images for global and regional flood management. In this framework, an original method is presented to integrate global flood inundation modeling and microwave remote sensing. It takes advantage of the combination of the time and space continuity of a global inundation model with the high spatial resolution of satellite observations. The availability of model simulations over a long time period offers the opportunity to estimate flood non-exceedance probabilities in a robust way. The probabilities can later be attributed to historical satellite observations. SAR-derived flood extent maps with their associated non-exceedance probabilities are then combined to generate flood hazard maps with a spatial resolution equal to that of the satellite images, which is most of the time higher than that of a global inundation model. The method can be applied to any area of interest in the world, provided that a sufficient number of relevant remote sensing images are available. We applied the method on the Severn River (UK) and on the Zambezi River (Mozambique), where large archives of Envisat flood images can be exploited. The global ECMWF flood inundation model is considered for computing the statistics of extreme events. A comparison with flood hazard maps estimated with in situ measured discharge is carried out. An additional analysis has been performed on the Severn River, using high resolution SAR data from the COSMO-SkyMed SAR constellation, acquired for a single flood event (one flood map per day between 27/11/2012 and 4/12/2012). The results showed that it is vital to observe the peak of the flood. However, a single ...
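    A minimal sketch of the combination step as described (attribute a model-derived non-exceedance probability to each SAR flood map, then combine the maps per pixel); the paper's exact statistical treatment is more involved, and this helper is hypothetical:

```python
import numpy as np

def pixel_hazard(flood_masks, non_exceed_p):
    """For each pixel, keep the smallest non-exceedance probability among the
    SAR-observed events that flooded it: a low value means the pixel is
    inundated even by common floods. Illustrative reading of the method."""
    masks = np.asarray(flood_masks, dtype=bool)   # shape (n_events, ny, nx)
    p = np.asarray(non_exceed_p, dtype=float)     # shape (n_events,)
    out = np.full(masks.shape[1:], np.inf)
    for mask, pi in zip(masks, p):
        out = np.where(mask, np.minimum(out, pi), out)
    out[np.isinf(out)] = np.nan                   # pixel never observed flooded
    return out
```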

  20. Optimal design for additive partially nonlinear models

    Biedermann, Stefanie; Dette, Holger; Woods, David C.

    2010-01-01

    We develop optimal design theory for additive partially nonlinear regression models, and show that D-optimal designs can be found as the products of the corresponding D-optimal designs in one dimension. For partially nonlinear models, D-optimal designs depend on the unknown nonlinear model parameters, and misspecifications of these parameters can lead to poor designs. Hence we generalise our results to parameter robust optimality criteria, namely Bayesian and standardised maximin D-optimality...

  1. Modeling seismic hazard in the Lower Rhine Graben using a fault-based source model

    Vanneste, Kris; Vleminckx, Bart; Verbeeck, Koen; Camelbeeck, Thierry

    2013-04-01

    ... Global Earthquake Model (GEM). Compared to other commonly-used, non-commercial hazard engines, OpenQuake offers better support for fault sources with simple or complex geometries. We compute hazard maps for return periods of 475, 2375, and 10,000 yr, and compare the results with hazard maps based on area sources. In addition, we conduct sensitivity tests to determine the impact of various parameter choices, e.g. maximum magnitude, inclusion of a background zone to account for lower magnitudes, and GMPE distance metric.

  2. Optimal design for additive partially nonlinear models

    Biedermann, S.; Dette, H.; Woods, D.C.

    2011-01-01

    We develop optimal design theory for additive partially nonlinear regression models, showing that Bayesian and standardized maximin D-optimal designs can be found as the products of the corresponding optimal designs in one dimension. A sufficient condition under which analogous results hold for Ds-optimality is derived to accommodate situations in which only a subset of the model parameters is of interest. To facilitate prediction of the response at unobserved locations, we prove similar resu...

  3. Automated economic analysis model for hazardous waste minimization

    The US Army has established a policy of achieving a 50 percent reduction in hazardous waste generation by the end of 1992. To assist the Army in reaching this goal, the Environmental Division of the US Army Construction Engineering Research Laboratory (USACERL) designed the Economic Analysis Model for Hazardous Waste Minimization (EAHWM). The EAHWM was designed to allow the user to evaluate the life cycle costs for various techniques used in hazardous waste minimization and to compare them to the life cycle costs of current operating practices. The program was developed in C on an IBM-compatible PC and is consistent with other pertinent models for performing economic analyses. The potential hierarchical minimization categories used in EAHWM include source reduction, recovery and/or reuse, and treatment. Although treatment is no longer an acceptable minimization option, its use is widespread and has therefore been addressed in the model. The model allows for economic analysis for minimization of the Army's six most important hazardous waste streams. These include solvents, paint stripping wastes, metal plating wastes, industrial waste sludges, used oils, and batteries and battery electrolytes. The EAHWM also includes a general application which can be used to calculate and compare the life cycle costs for minimization alternatives of any waste stream, hazardous or non-hazardous. The EAHWM has been fully tested and implemented in more than 60 Army installations in the United States.

  4. RELIABILITY AND HAZARD RATE ESTIMATION OF A LIFE TESTING MODEL

    Vinod Kumar

    2010-01-01

    The present paper deals with the reliability and hazard rate estimation of a Weibull-type life testing model. Its use as a life testing model has also been illustrated. The proposed model has been found better than the exponential for several sets of lifetime data. Some characteristics of the model have also been investigated.

  5. A high-resolution global flood hazard model

    Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul B.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-09-01

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ˜90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ˜1 km, mean absolute error in flooded fraction falls to ˜5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.
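    The validation statements above (capturing two thirds to three quarters of the benchmark flood area without excessive false positives) correspond to standard binary-map skill scores, sketched below; the paper's exact metric definitions may differ:

```python
import numpy as np

def flood_map_skill(model, benchmark):
    """Contingency scores comparing a modeled flood mask against a benchmark
    mask (both boolean arrays of the same shape): the hit rate measures the
    captured at-risk area, and the critical success index also penalizes
    false positives. Illustrative sketch."""
    hits = np.sum(model & benchmark)
    misses = np.sum(~model & benchmark)
    false_alarms = np.sum(model & ~benchmark)
    hit_rate = hits / (hits + misses)
    csi = hits / (hits + misses + false_alarms)
    return hit_rate, csi
```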

  6. Generalized Additive Models for Nowcasting Cloud Shading

    Brabec, Marek; Paulescu, M.; Badescu, V.

    2014-01-01

    Vol. 101, March (2014), pp. 272-282. ISSN 0038-092X R&D Projects: GA MŠk LD12009 Other grants: European Cooperation in Science and Technology (XE) COST ES1002 Institutional support: RVO:67985807 Keywords: sunshine number * nowcasting * generalized additive model * Markov chain Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.469, year: 2014

  7. A generalized additive regression model for survival times

    Scheike, Thomas H.

    2001-01-01

    Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models

  8. Regional landslide hazard assessment based on Distance Evaluation Model

    Jiacun Li; Yan Qin; Jing Li

    2008-01-01

    There are many factors influencing landslide occurrence. The key for landslide control is to identify the regional landslide hazard factors. The Cameron Highlands of Malaysia was selected as the study area. Using a bivariate statistical analysis method with GIS software, the authors analyzed the relationships among landslides and environmental factors such as lithology, geomorphology, elevation, roads, and land use. A Distance Evaluation Model was developed based on Landslide Density (LD), and an assessment of the landslide hazard of the Cameron Highlands was performed. The result shows that the model has high prediction precision.

  9. Analysis of time to event outcomes in randomized controlled trials by generalized additive models.

    Christos Argyropoulos

    Randomized controlled trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazards model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking. By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAM) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks or even differences in restricted mean survival time between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO (a long-duration study conducted under evolving standards of care on a heterogeneous patient population). PGAM can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, PGAMs supported not only unadjusted (overall) treatment effect analyses but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial results under proportional and ...
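    The core trick (split follow-up time so that standard Poisson machinery with a log-exposure offset estimates the hazard) can be sketched with a piecewise-constant simplification. The paper instead splits at Gauss-Lobatto quadrature nodes and models the log hazard with penalized splines; the data below are placeholders:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def split_follow_up(time, event, cuts):
    """Split each subject's follow-up at the given time nodes so event counts
    can be modeled as Poisson with a log(exposure) offset. Simplified sketch
    of the time-splitting step."""
    rows = []
    for t, d in zip(time, event):
        start = 0.0
        for j, c in enumerate(np.append(cuts, np.inf)):
            stop = min(t, c)
            rows.append({"interval": j, "exposure": stop - start,
                         "events": int(d and t <= c)})
            if t <= c:
                break
            start = c
    return pd.DataFrame(rows)

# Piecewise-constant hazard: Poisson GLM with interval dummies and offset
time = np.array([2.0, 5.0, 7.5, 1.2, 9.0]); event = np.array([1, 0, 1, 1, 0])
df = split_follow_up(time, event, cuts=np.array([3.0, 6.0]))
X = pd.get_dummies(df["interval"], prefix="iv", dtype=float)
fit = sm.GLM(df["events"], X, family=sm.families.Poisson(),
             offset=np.log(df["exposure"])).fit()
```

    Replacing the interval dummies with a smooth spline basis over time yields the PGAM; survival curves then follow by exponentiating the negative cumulative hazard.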

  10. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.

  11. Generalized Additive Models in Business and Economics

    Sunil K Sapra

    2013-06-01

    The paper presents applications of a class of semi-parametric models called generalized additive models (GAMs) to several business and economic datasets. Applications include analysis of the wage-education relationship, brand choice, and the number of trips to a doctor's office. The dependent variable may be continuous, categorical, or count. These semi-parametric models are flexible and robust extensions of Logit, Poisson, Negative Binomial, and other generalized linear models. The GAMs are represented using penalized regression splines and are estimated by penalized regression methods. The degree of smoothness for the unknown functions in the linear predictor part of the GAM is estimated using cross validation. The GAMs allow us to build a regression surface as a sum of lower-dimensional nonparametric terms, circumventing the curse of dimensionality: the slow convergence of an estimator to the true value in high dimensions. For each application studied in the paper, several GAMs are compared and the best model is selected using AIC, UBRE score, deviances, and R-sq (adjusted). The econometric techniques utilized in the paper are widely applicable to the analysis of count, binary response, and duration types of data encountered in business and economics.
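    One smooth term of such a GAM can be built by hand to show what "penalized regression spline" means in practice: a spline basis plus a ridge penalty on the wiggly coefficients. A minimal sketch (software implementations use B-spline bases and choose the penalty by cross-validation, as the abstract notes):

```python
import numpy as np

def pspline_fit(x, y, n_knots=10, degree=3, lam=1.0):
    """Penalized regression spline for one smooth GAM term: truncated-power
    basis plus a ridge penalty on the spline coefficients. Illustrative only."""
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    # Design matrix: polynomial part + truncated power functions
    B = np.column_stack([x ** p for p in range(degree + 1)] +
                        [np.clip(x - k, 0, None) ** degree for k in knots])
    # Penalize only the truncated-power coefficients (the "wiggly" part)
    D = np.zeros(B.shape[1]); D[degree + 1:] = 1.0
    beta = np.linalg.solve(B.T @ B + lam * np.diag(D), B.T @ y)
    return B @ beta, beta

# Usage: recover a smooth nonlinear signal from noisy data
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=200)
fitted, _ = pspline_fit(x, y, lam=0.5)
```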

  12. Neural network modeling for regional hazard assessment of debris flow in Lake Qionghai Watershed, China

    Liu, Y.; Guo, H. C.; Zou, R.; Wang, L. J.

    2006-04-01

    This paper presents a neural network (NN) based model to assess the regional hazard degree of debris flows in the Lake Qionghai Watershed, China. The NN model was used as an alternative to the more conventional linear model MFCAM (multi-factor composite assessment model) in order to effectively handle the nonlinearity and uncertainty inherent in debris flow hazard analysis. The NN model was configured using a three-layer structure with eight input nodes and one output node, and the number of nodes in the hidden layer was determined through an iterative process of varying the number of nodes until an optimal performance was achieved. The eight variables used to represent the eight input nodes include density of debris flow gullies, degree of weathering of rocks, active fault density, area percentage of slope land greater than 25° of the total land (APL25), frequency of flooding hazards, average covariance of monthly precipitation over 10 years (ACMP10), average days with rainfall > 25 mm over 10 years (25D10Y), and percentage of cultivated land with slope greater than 25° of the total cultivated land (PCL25). The output node represents the hazard-degree ranks (HDR). The model was trained with 35 sets of data obtained from previous research reported in the literature, and an explicit uncertainty analysis was undertaken to address the uncertainty in model training and prediction. Before the NN model was extrapolated to the Lake Qionghai Watershed, a validation case, with data different from the above, was conducted. In addition, the performances of the NN model and the MFCAM were compared. The NN model predicted that the HDRs of the five sub-watersheds in the Lake Qionghai Watershed were IV, IV, III, III, and IV-V, indicating that the study area covers normal hazard and severe hazard areas. Based on the NN model results, debris flow management and economic development strategies are proposed for each sub-watershed in the study area.
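    A minimal Python stand-in for the 8-input, one-hidden-layer, one-output network described above. The training data here are random placeholders rather than the 35 literature-derived records, and scikit-learn replaces whatever NN software the authors used:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Eight input factors as in the paper (gully density, APL25, ACMP10, 25D10Y,
# PCL25, etc.); placeholder data only, not the actual training sets.
rng = np.random.default_rng(42)
X_train = rng.uniform(size=(35, 8))
y_train = rng.integers(1, 6, size=35).astype(float)   # HDR ranks I..V as 1..5

# Three-layer network: 8 inputs, one hidden layer, one output node. The
# hidden width here is a stand-in; the paper tuned it iteratively.
net = MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000, random_state=0)
net.fit(X_train, y_train)
hdr_pred = net.predict(rng.uniform(size=(5, 8)))      # one per sub-watershed
```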

  13. Probabilistic modelling of rainfall induced landslide hazard assessment

    Kawagoe, S.; Kazama, S.; P. R. Sarukkalige

    2010-01-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, such as hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential facto...

  14. Toward Building a New Seismic Hazard Model for Mainland China

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China was generated by the Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to the present, create fault models from active fault data using the methodology recommended by the Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain the corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to the locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest are distributed to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPEs) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonic regions. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and business and land-use planning.
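
    To make the synthetic-catalog step concrete, the sketch below draws moment magnitudes from a TGR distribution in Python. It relies on the fact that the TGR survival function in seismic moment factors into a Pareto term and an exponential taper, so a TGR draw is the minimum of a Pareto draw and a shifted exponential draw; the b-value, corner magnitude, and catalog size are illustrative, not the values of the Chinese model.

        import numpy as np

        def sample_tgr_magnitudes(n, b=1.0, m_min=5.0, m_corner=8.0, seed=0):
            rng = np.random.default_rng(seed)
            beta = 2.0 * b / 3.0                          # moment-space exponent
            moment = lambda mw: 10 ** (1.5 * mw + 9.05)   # Hanks-Kanamori, N*m
            m_t, m_c = moment(m_min), moment(m_corner)
            pareto = m_t * rng.uniform(size=n) ** (-1.0 / beta)   # power-law part
            taper = m_t + rng.exponential(m_c, size=n)            # exponential taper
            return (np.log10(np.minimum(pareto, taper)) - 9.05) / 1.5

        mags = sample_tgr_magnitudes(100_000)   # one synthetic catalog
        print(f"largest simulated magnitude: Mw {mags.max():.2f}")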

  15. Computational Process Modeling for Additive Manufacturing (OSU)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can compress the schedule and cost: many experiments can be run quickly in a model that would take years and high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  16. Session Clustering Using Mixtures of Proportional Hazards Models

    Mair, Patrick; Hudec, Marcus

    2008-01-01

    Emanating from classical Weibull mixture models, we propose a framework for clustering survival data with various proportionality restrictions imposed. By introducing mixtures of Weibull proportional hazards models on a multivariate data set, a parametric clustering approach based on the EM algorithm is carried out. The problem of non-response in the data is considered. The application example is a real life data set stemming from the analysis of a world-wide operating eCommerce application. Sessi...
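
    The abstract describes EM estimation for Weibull proportional hazards mixtures; the sketch below shows the EM mechanics on the simplest special case, a two-component exponential mixture (Weibull with shape fixed at 1) without censoring, where the M-step has a closed form. The data are simulated, and the full Weibull case with censoring would replace the M-step with a weighted numerical maximization.

        import numpy as np

        rng = np.random.default_rng(1)
        # simulated session durations from two latent groups (true rates 1.0 and 0.2)
        t = np.concatenate([rng.exponential(1.0, 300), rng.exponential(5.0, 200)])

        pi, lam = np.array([0.5, 0.5]), np.array([0.5, 2.0])   # initial guesses
        for _ in range(200):
            # E-step: posterior probability that each duration belongs to component k
            dens = pi * lam * np.exp(-np.outer(t, lam))        # shape (n, 2)
            resp = dens / dens.sum(axis=1, keepdims=True)
            # M-step: closed-form weighted maximum likelihood updates
            pi = resp.mean(axis=0)
            lam = resp.sum(axis=0) / (resp * t[:, None]).sum(axis=0)
        print(f"mixing weights {pi.round(2)}, rate parameters {lam.round(2)}")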

  17. Probabilistic modelling of rainfall induced landslide hazard assessment

    S. Kawagoe

    2010-01-01

    Full Text Available To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, such as hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter, a dynamic factor that includes the effect of heavy rainfall and its return period. Using the constructed spatial data-sets, a multiple logistic regression model is applied and landslide susceptibility maps are produced showing the spatio-temporal distribution of landslide hazard susceptibility over Japan. To represent the susceptibility at different temporal scales, extreme precipitation for 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard susceptibility exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the south side of the Chugoku mountain range, the south side of the Kyushu mountains, the Dewa mountain range and the Hokuriku region. The landslide hazard susceptibility maps developed in this study will assist authorities, policy makers and decision makers who are responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.
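
    A minimal sketch of the kind of multiple logistic regression used above, written in Python with scikit-learn: grid-cell predictors are mapped to a landslide probability, which is what a susceptibility map displays. The predictors, coefficients, and data are synthetic placeholders, not the study's data set.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        # columns: hydraulic gradient, slope, geology score (one row per grid cell)
        X = rng.normal(size=(2000, 3))
        true_logit = 1.5 * X[:, 0] + 0.6 * X[:, 1] - 0.3 * X[:, 2] - 1.0
        y = rng.uniform(size=2000) < 1.0 / (1.0 + np.exp(-true_logit))  # synthetic slides

        model = LogisticRegression().fit(X, y)
        susceptibility = model.predict_proba(X)[:, 1]   # per-cell landslide probability
        print(dict(zip(["hydraulic_gradient", "slope", "geology"],
                       model.coef_[0].round(2))))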

  18. Rockfall hazard analysis using LiDAR and spatial modeling

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three-dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.

  19. Estimation of the 2-sample hazard ratio function using a semiparametric model

    Yang, Song; Prentice, Ross L.

    2010-01-01

    The hazard ratio provides a natural target for assessing a treatment effect with survival data, with the Cox proportional hazards model providing a widely used special case. In general, the hazard ratio is a function of time and provides a visual display of the temporal pattern of the treatment effect. A variety of nonproportional hazards models have been proposed in the literature. However, available methods for flexibly estimating a possibly time-dependent hazard ratio are limited. Here, we...

  20. Random weighting method for Cox’s proportional hazards model

    2008-01-01

    Variance of parameter estimate in Cox’s proportional hazards model is based on asymptotic variance. When sample size is small, variance can be estimated by bootstrap method. However, if censoring rate in a survival data set is high, bootstrap method may fail to work properly. This is because bootstrap samples may be even more heavily censored due to repeated sampling of the censored observations. This paper proposes a random weighting method for variance estimation and confidence interval estimation for proportional hazards model. This method, unlike the bootstrap method, does not lead to more severe censoring than the original sample does. Its large sample properties are studied and the consistency and asymptotic normality are proved under mild conditions. Simulation studies show that the random weighting method is not as sensitive to heavy censoring as bootstrap method is and can produce good variance estimates or confidence intervals.
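
    The random weighting idea can be sketched directly: instead of resampling observations (bootstrap), every observation keeps its censoring status and receives an i.i.d. positive random weight with mean one, and the spread of the re-estimated coefficients estimates the sampling variability. The Python snippet below uses the lifelines package and its bundled Rossi recidivism data purely for illustration; neither appears in the paper.

        import numpy as np
        from lifelines import CoxPHFitter
        from lifelines.datasets import load_rossi

        df = load_rossi()                 # durations in "week", events in "arrest"
        rng = np.random.default_rng(0)
        coefs = []
        for _ in range(200):
            dfw = df.assign(w=rng.exponential(1.0, len(df)))  # mean-one random weights
            fit = CoxPHFitter().fit(dfw, duration_col="week", event_col="arrest",
                                    weights_col="w", robust=True)
            coefs.append(fit.params_["age"])  # track one coefficient of interest
        print(f"random-weighting standard error for age: {np.std(coefs):.4f}")

    Because every observation stays in each reweighted fit, heavily censored records are never over-sampled, which is the advantage over the bootstrap noted in the abstract.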

  1. Random weighting method for Cox's proportional hazards model

    CUI WenQuan; LI Kai; YANG YaNing; WU YueHua

    2008-01-01

    Variance of parameter estimate in Cox's proportional hazards model is based on asymptotic variance. When sample size is small, variance can be estimated by bootstrap method. However, if censoring rate in a survival data set is high, bootstrap method may fail to work properly. This is because bootstrap samples may be even more heavily censored due to repeated sampling of the censored observations. This paper proposes a random weighting method for variance estimation and confidence interval estimation for proportional hazards model. This method, unlike the bootstrap method, does not lead to more severe censoring than the original sample does. Its large sample properties are studied and the consistency and asymptotic normality are proved under mild conditions. Simulation studies show that the random weighting method is not as sensitive to heavy censoring as bootstrap method is and can produce good variance estimates or confidence intervals.

  2. Defaultable Game Options in a Hazard Process Model

    Tomasz R. Bielecki

    2009-01-01

    Full Text Available The valuation and hedging of defaultable game options is studied in a hazard process model of credit risk. A convenient pricing formula with respect to a reference filtration is derived. A connection of arbitrage prices with a suitable notion of hedging is obtained. The main result shows that the arbitrage prices are the minimal superhedging prices with sigma-martingale cost under a risk-neutral measure.

  3. An Additive-Multiplicative Cox-Aalen Regression Model

    Scheike, Thomas H.; Zhang, Mei-Jie

    2002-01-01

    Aalen model; additive risk model; counting processes; Cox regression; survival analysis; time-varying effects......Aalen model; additive risk model; counting processes; Cox regression; survival analysis; time-varying effects...

  4. Modelling public risk evaluation of natural hazards: a conceptual approach

    Th. Plattner

    2005-01-01

    Full Text Available In recent years, the dealing with natural hazards in Switzerland has shifted away from being hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources cause an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economical, ecological and social considerations. This modern proceeding requires an approach in which not only technological, engineering or scientific aspects of the definition of the hazard or the computation of the risk are considered, but also the public concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they don't know what risk level the public is willing to accept. Consequently, there exists a need for the authorities to know what the society thinks about risks. A formalized model that allows at least a crude simulation of the public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach of such an evaluation model using perception affecting factors PAF, evaluation criteria EC and several factors without any immediate relation to the risk itself, but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by a comparison of the perceived risk Ri,perc with the acceptable risk Ri,acc.

  5. Modelling public risk evaluation of natural hazards: a conceptual approach

    Plattner, Th.

    2005-04-01

    In recent years, the dealing with natural hazards in Switzerland has shifted away from being hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources cause an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economical, ecological and social considerations. This modern proceeding requires an approach in which not only technological, engineering or scientific aspects of the definition of the hazard or the computation of the risk are considered, but also the public concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they don't know what risk level the public is willing to accept. Consequently, there exists a need for the authorities to know what the society thinks about risks. A formalized model that allows at least a crude simulation of the public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach of such an evaluation model using perception affecting factors PAF, evaluation criteria EC and several factors without any immediate relation to the risk itself, but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by a comparison of the perceived risk Ri,perc with the acceptable risk Ri,acc.

  6. Development of hazard-compatible building fragility and vulnerability models

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
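
    A common functional form for fragility models conditioned on spectral acceleration, of the kind described above, is the lognormal CDF. The sketch below evaluates P(damage state >= ds | Sa) in Python for illustrative median and dispersion values; the numbers are not taken from HAZUS or from the paper.

        import numpy as np
        from scipy.stats import norm

        def fragility(sa, median, beta):
            """Probability of reaching or exceeding a damage state given Sa (in g)."""
            return norm.cdf(np.log(sa / median) / beta)

        sa_levels = np.linspace(0.05, 2.0, 5)     # spectral accelerations in g
        print(fragility(sa_levels, median=0.6, beta=0.65).round(3))

    Multiplying such exceedance probabilities by damage-state loss ratios and summing is the standard route from a fragility model to a vulnerability (loss) model.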

  7. Model averaging for semiparametric additive partial linear models

    2010-01-01

    To improve the prediction accuracy of semiparametric additive partial linear models (APLM) and the coverage probability of confidence intervals of the parameters of interest, we explore a focused information criterion for model selection among APLMs after estimating the nonparametric functions by polynomial spline smoothing, and introduce a general model average estimator. The major advantage of the proposed procedures is that iterative backfitting implementation is avoided, which results in gains in computational simplicity. The resulting estimators are shown to be asymptotically normal. A simulation study and a real data analysis are presented for illustration.

  8. A DNA based model for addition computation

    GAO Lin; YANG Xiao; LIU Wenbin; XU Jin

    2004-01-01

    Much effort has been made to solve computing problems by using DNA, an organic simulating method, which in some cases is preferable to the current electronic computer. However, no one at present has proposed an effective and applicable method to solve the addition problem with a molecular algorithm, due to the difficulty of the carry problem, which is easily solved by the hardware of an electronic computer. In this article, we solve this problem by employing two kinds of DNA strings: one is called the result and operation string, while the other is named the carrier. The result and operation string contains some carry information of its own and denotes the ultimate result, while the carrier is just for carrying use. The significance of this algorithm lies in its original coding, the fairly easy steps to follow, and its feasibility under current molecular biological technology.

  9. On penalized likelihood estimation for a non-proportional hazards regression model

    Devarajan, Karthik; Ebrahimi, Nader

    2013-01-01

    In this paper, a semi-parametric generalization of the Cox model that permits crossing hazard curves is described. A theoretical framework for estimation in this model is developed based on penalized likelihood methods. It is shown that the optimal solution to the baseline hazard, baseline cumulative hazard and their ratio are hyperbolic splines with knots at the distinct failure times.

  10. A decision model for the risk management of hazardous processes

    A decision model for risk management of hazardous processes as an optimisation problem of a point process is formulated in the study. In the approach, the decisions made by the management are divided into three categories: (1) planned process lifetime, (2) selection of the design, and (3) operational decisions. These three controlling methods play quite different roles in practical risk management, which is also reflected in our approach. The optimisation of the process lifetime is related to the licensing problem of the process. It provides a boundary condition for a feasible utility function that is used as the actual objective function, i.e., maximizing the process lifetime utility. By design modifications, the management can affect the inherent accident hazard rate of the process. This is usually a discrete optimisation task. The study particularly concentrates upon the optimisation of the operational strategies given a certain design and licensing time. This is done by a dynamic risk model (marked point process model) representing the stochastic process of events observable or unobservable to the decision maker. An optimal long term control variable guiding the selection of operational alternatives in short term problems is studied. The optimisation problem is solved by the stochastic quasi-gradient procedure. The approach is illustrated by a case study. (23 refs.)

  11. Uncertainties in modeling hazardous gas releases for emergency response

    Baumann-Stanzer, Kathrin; Stenzel, Sirma [Zentralanstalt fuer Meteorologie und Geodynamik, Vienna (Austria)

    2011-02-15

    In case of an accidental release of toxic gases, the emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations of the model results are a measure of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stability and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary by up to a factor of 4 due to different input requirements as well as different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated to in-situ observations at two urban sites in Vienna with a correlation coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 m/s in wind speed, on the order of 50 degrees in wind direction, up to 4 °C in air temperature and up to 10% in relative humidity. The observed air temperature and humidity are well reproduced by INCA with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. Besides real-time data, the INCA short-range forecast for the following hours may support the action planning of the first responders. (orig.)

  12. Modeling techniques for gaining additional urban space

    Thunig, Holger; Naumann, Simone; Siegmund, Alexander

    2009-09-01

    One of the major accompaniments of globalization is the rapid growth of urban areas. Urban sprawl is the main environmental problem affecting cities of different characteristics across continents. Various reasons for the increase in urban sprawl in the last 10 to 30 years have been proposed [1], and often depend on the socio-economic situation of cities. The quantitative reduction and sustainable handling of land consumption should be achieved by inner urban development instead of expanding urban regions. Following the principle "spare the urban fringe, develop the inner suburbs first" requires differentiated tools allowing for quantitative and qualitative appraisals of current building potentials. Using spatially high-resolution remote sensing data within an object-based approach enables the detection of potential areas, while GIS data provide information for the quantitative valuation. This paper presents techniques for modeling the urban environment and opportunities for utilization of the retrieved information by urban planners for their special needs.

  13. Conveying Lava Flow Hazards Through Interactive Computer Models

    Thomas, D.; Edwards, H. K.; Harnish, E. P.

    2007-12-01

    As part of an Information Sciences senior class project, a software package of an interactive version of the FLOWGO model was developed for the Island of Hawaii. The software is intended for use in an ongoing public outreach and hazards awareness program that educates the public about lava flow hazards on the island. The design parameters for the model allow an unsophisticated user to initiate a lava flow anywhere on the island and allow it to flow down-slope to the shoreline while displaying a timer to show the rate of advance of the flow. The user is also able to modify a range of input parameters including eruption rate, the temperature of the lava at the vent, and crystal fraction present in the lava at the source. The flow trajectories are computed using a 30 m digital elevation model for the island and the rate of advance of the flow is estimated using the average slope angle and the computed viscosity of the lava as it cools in either a channel (high heat loss) or lava tube (low heat loss). Even though the FLOWGO model is not intended to, and cannot, accurately predict the rate of advance of a tube-fed or channel-fed flow, the relative rates of flow advance for steep or flat-lying terrain convey critically important hazard information to the public: communities located on the steeply sloping western flanks of Mauna Loa may have no more than a few hours to evacuate in the face of a threatened flow from Mauna Loa's southwest rift whereas communities on the more gently sloping eastern flanks of Mauna Loa and Kilauea may have weeks to months to prepare for evacuation. Further, the model also can show the effects of loss of critical infrastructure with consequent impacts on access into and out of communities, loss of electrical supply, and communications as a result of lava flow emplacement. The interactive model has been well received in an outreach setting and typically generates greater involvement by the participants than has been the case with static maps
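
    The slope dependence that drives the contrast between steep and gentle flanks can be illustrated with the Jeffreys equation for the mean velocity of a laminar sheet flow, a standard first-order relation (not necessarily FLOWGO's actual parameterization). The values below are illustrative only.

        import math

        def jeffreys_velocity(thickness_m, slope_deg, viscosity_pa_s,
                              density=2600.0, g=9.81):
            """Mean velocity (m/s) of a laminar lava sheet flow on a uniform slope."""
            return (density * g * thickness_m ** 2
                    * math.sin(math.radians(slope_deg))) / (3.0 * viscosity_pa_s)

        # Same lava (2 m thick, 1e4 Pa*s) on a steep versus a gentle flank:
        print(f"10 deg slope: {jeffreys_velocity(2.0, 10.0, 1e4):.2f} m/s")
        print(f" 2 deg slope: {jeffreys_velocity(2.0, 2.0, 1e4):.2f} m/s")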

  14. Modeling of seismic hazards for dynamic reliability analysis

    This paper investigates appropriate indices of seismic hazard curves (SHCs) for seismic reliability analysis. In most seismic reliability analyses of structures, the seismic hazards are defined in the form of SHCs of peak ground accelerations (PGAs). PGAs usually play a significant role in characterizing ground motions. However, the PGA is not always a suitable index of seismic motions. When random vibration theory developed in the frequency domain is employed to obtain statistics of responses, it is more convenient for the implementation of dynamic reliability analysis (DRA) to utilize an index which can be determined in the frequency domain. In this paper, we summarize the relationships among the indices which characterize ground motions. The relationships between the indices and the magnitude M are arranged as well. In this consideration, duration time plays an important role in relating two distinct classes, i.e., the energy class and the power class. Fourier and energy spectra belong to the energy class, while power and response spectra and PGAs belong to the power class. These relationships are also investigated using ground motion records. Through these investigations, we have shown the efficiency of employing the total energy as an index of SHCs, which can be determined in the time and frequency domains and has less variance than the other indices. In addition, we have proposed a procedure of DRA based on total energy. (author)

  15. Hazard identification by extended multilevel flow modelling with function roles

    Wu, Jing; Zhang, Laibin; Jørgensen, Sten Bay;

    2014-01-01

    HAZOP studies are widely accepted in chemical and petroleum industries as the method for conducting process hazard analysis related to design, maintenance and operation of the systems. In this paper, a HAZOP reasoning method based on function-oriented modelling, multilevel flow modelling (MFM), is extended with function roles to complete HAZOP studies in principle. A graphical MFM editor, which is combined with the reasoning engine (MFM Workbench) developed by DTU, is applied to automate HAZOP studies. The method is proposed to support the 'brain-storming' sessions in traditional HAZOP analysis. As a case study, the extended MFM-based HAZOP methodology is applied to an offshore three-phase separation process. The results show that the cause-consequence analysis in MFM can infer the cause and effect of a deviation used in HAZOP and be used to fill in the HAZOP worksheet. This paper is the first paper...

  16. Integrated Modeling for Flood Hazard Mapping Using Watershed Modeling System

    Seyedeh S. Sadrolashrafi; Thamer A. Mohamed; Ahmad R.B. Mahmud; Majid K. Kholghi; Amir Samadi

    2008-01-01

    In this study, a new framework which integrates the Geographic Information System (GIS) with the Watershed Modeling System (WMS) for flood modeling is developed. It also interconnects the terrain models and the GIS software with commercial standard hydrological and hydraulic models, including HEC-1, HEC-RAS, etc. The Dez River Basin (about 16213 km2) in Khuzestan province, IRAN, is the study domain because it experiences frequent severe flash flooding. As a case study, a major flood in autumn...

  17. Hazard based models for freeway traffic incident duration.

    Tavassoli Hojati, Ahmad; Ferreira, Luis; Washington, Simon; Charles, Phil

    2013-03-01

    Assessing and prioritising cost-effective strategies to mitigate the impacts of traffic incidents and accidents on non-recurrent congestion on major roads represents a significant challenge for road network managers. This research examines the influence of numerous factors associated with incidents of various types on their duration. It presents a comprehensive traffic incident data mining and analysis by developing an incident duration model based on twelve months of incident data obtained from the Australian freeway network. Parametric accelerated failure time (AFT) survival models of incident duration were developed, including log-logistic, lognormal, and Weibull models, considering both fixed and random parameters, as well as a Weibull model with gamma heterogeneity. The Weibull AFT models with random parameters were appropriate for modelling incident duration arising from crashes and hazards. A Weibull model with gamma heterogeneity was most suitable for modelling the incident duration of stationary vehicles. Significant variables affecting incident duration include characteristics of the incidents (severity, type, towing requirements, etc.), and location, time of day, and traffic characteristics of the incident. Moreover, the findings reveal no significant effects of infrastructure and weather on incident duration. A significant and unique contribution of this paper is that the durations of each type of incident are uniquely different and respond to different factors. The results of this study are useful for traffic incident management agencies to implement strategies to reduce incident duration, leading to reduced congestion, secondary incidents, and the associated human and economic losses. PMID:23333698
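
    A minimal Weibull AFT fit of the kind described above can be sketched with the lifelines package in Python; the paper does not state its software, and the incident data below are simulated stand-ins for the real covariates.

        import numpy as np
        import pandas as pd
        from lifelines import WeibullAFTFitter

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "severity": rng.integers(1, 4, n),   # hypothetical incident severity
            "towing": rng.integers(0, 2, n),     # hypothetical towing requirement
        })
        # AFT structure: covariates rescale the Weibull time scale multiplicatively
        scale = np.exp(3.0 + 0.3 * df["severity"] + 0.5 * df["towing"])
        df["duration"] = scale * rng.weibull(1.5, n)   # incident duration in minutes
        df["observed"] = 1                             # no censoring in this sketch

        aft = WeibullAFTFitter().fit(df, duration_col="duration", event_col="observed")
        aft.print_summary()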

  18. Modeling and mitigating natural hazards: Stationarity is immortal!

    Montanari, Alberto; Koutsoyiannis, Demetris

    2014-12-01

    Environmental change is a cause of significant concern as it is occurring at an unprecedented pace and might increase natural hazards. Moreover, it is deemed to imply a reduced representativeness of past experience and data on extreme hydroclimatic events. The latter concern has been epitomized by the statement that "stationarity is dead." Setting up policies for mitigating natural hazards, including those triggered by floods and droughts, is an urgent priority in many countries, which implies practical activities of management, engineering design, and construction. These activities necessarily need to be properly informed, and therefore the research question on the value of past data is extremely important. We herein argue that there are mechanisms in hydrological systems that are time invariant, which may need to be interpreted through data inference. In particular, hydrological predictions are based on assumptions which should include stationarity. In fact, any hydrological model, including deterministic and nonstationary approaches, is affected by uncertainty and therefore should include a random component that is stationary. Given that an unnecessary resort to nonstationarity may imply a reduction of predictive capabilities, a pragmatic approach based on the exploitation of past experience and data is a necessary prerequisite for setting up mitigation policies for environmental risk.

  19. Extenics Model for Evaluating Vulnerable Degree of Regional Sustaining Hazard Body

    Fan Yunxiao; Luo Yun; Chen Qingshou

    2004-01-01

    The effect of a hazard is determined by the dangerousness of the hazard factors and environment and by the vulnerability of the sustaining body. Research into the latter is important for hazard theory and for the formation of laws on the mitigation of natural hazards. A way to evaluate the degree of vulnerability is the foundation of, and the key to, this research. In this paper, an extenics model is established for this purpose.

  20. A Mathematical Model for the Industrial Hazardous Waste Location-Routing Problem

    Boyer, Omid; Sai Hong, Tang; Pedram, Ali; Mohd Yusuff, Rosnah Bt; Zulkifli, Norzima

    2013-01-01

    Technological progress is a cause of the worldwide increase in industrial hazardous wastes. Management of hazardous waste is a significant issue due to the risk imposed on the environment and human life. This risk can result from the location of undesirable facilities and also from the routing of hazardous waste. In this paper, a bi-objective mixed integer programming model for industrial hazardous waste location-routing is developed. The first objective is total cost minimization, including tra...

  1. The transport exponent in percolation models with additional loops

    Babalievski, F.

    1994-10-01

    Several percolation models with additional loops were studied. The transport exponents for these models were estimated numerically by means of a transfer-matrix approach. It was found that the transport exponent has a drastically changed value for some of the models. This result supports some previous numerical studies on the vibrational properties of similar models (with additional loops).

  2. Practical aspects of modelling of repairable systems data using proportional hazards models

    Cox's Proportional Hazards Model (PHM) has been widely applied in the analysis of lifetime data. The model is semi-parametric, so that weak assumptions are made about the form of the hazard function. There have been medical developments of this model which have aided studies of repairable systems. A review of the practical use of the PHM is given, and particular attention is paid to the use of diagnostic statistics and graphs. Illustrations are given using field data from the semiconductor and electrical industries, and repairable systems data are illustrated by data from the hydrocarbon industry.

  3. Preliminary deformation model for National Seismic Hazard map of Indonesia

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed as a function of block rotation and strain accumulation in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth or in combination with creeping in a shallower part. This research shows that rigid body motion dominates the deformation pattern with magnitudes of more than 15 mm/year, except in the narrow areas near subduction zones and active faults, where significant deformation reaches 25 mm/year

  4. Preliminary deformation model for National Seismic Hazard map of Indonesia

    Meilano, Irwan; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z. [Geodesy Research Division, Faculty of Earth Science and Technology, Institute of Technology Bandung (Indonesia); Susilo,; Efendi, Joni [Agency for Geospatial Information (BIG) (Indonesia)

    2015-04-24

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed as a function of block rotation and strain accumulation in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth or in combination with creeping in a shallower part. This research shows that rigid body motion dominates the deformation pattern with magnitudes of more than 15 mm/year, except in the narrow areas near subduction zones and active faults, where significant deformation reaches 25 mm/year.

  5. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorist's actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  6. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameter that controls earthquake-activity rates, beyond what was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert opinion and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight with equal weighting for the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  7. Regional Integrated Meteorological Forecasting and Warning Model for Geological Hazards Based on Logistic Regression

    XU Jing; YANG Chi; ZHANG Guoping

    2007-01-01

    An information model is adopted to integrate factors of various geosciences to estimate the susceptibility of geological hazards. Further combining dynamic rainfall observations, logistic regression is used for modeling the probabilities of geological hazard occurrences, upon which hierarchical warnings for rainfall-induced geological hazards are produced. The forecasting and warning model takes numerical precipitation forecasts on grid points as its dynamic input, forecasts the probabilities of geological hazard occurrences on the same grid, and translates the results into likelihoods in the form of a 5-level hierarchy. Validation of the model with observational data for the year 2004 shows that 80% of the geological hazards of that year were identified as "likely enough to release warning messages". The model can satisfy the requirements of an operational warning system and is thus an effective way to improve meteorological warnings for geological hazards.

  8. On Model Specification and Selection of the Cox Proportional Hazards Model*

    Lin, Chen-Yen; Halabi, Susan

    2013-01-01

    Prognosis plays a pivotal role in patient management and trial design. A useful prognostic model should correctly identify important risk factors and estimate their effects. In this article, we discuss several challenges in selecting prognostic factors and estimating their effects using the Cox proportional hazards model. Although it has a flexible semiparametric form, the Cox model is not entirely exempt from model misspecification. To minimize possible misspecification, instead of imposing tradi...

  9. Measurements and models for hazardous chemical and mixed wastes. 1998 annual progress report

    Aqueous waste of various chemical compositions constitutes a significant fraction of the total waste produced by industry in the US. A large quantity of the waste generated by the US chemical process industry is wastewater. In addition, the majority of the waste inventory at DOE sites previously used for nuclear weapons production is aqueous waste. Large quantities of additional aqueous waste are expected to be generated during the clean-up of those sites. In order to effectively treat, safely handle, and properly dispose of these wastes, accurate and comprehensive knowledge of basic thermophysical property information is paramount. This knowledge will lead to huge savings by aiding in the design and optimization of treatment and disposal processes. The main objectives of this project are: (1) develop and validate models that accurately predict the phase equilibria and thermodynamic properties of hazardous aqueous systems necessary for the safe handling and successful design of separation and treatment processes for hazardous chemical and mixed wastes; and (2) accurately measure the phase equilibria and thermodynamic properties of a representative system (water + acetone + isopropyl alcohol + sodium nitrate) over the applicable ranges of temperature, pressure, and composition to provide the pure component, binary, ternary, and quaternary experimental data required for model development. As of May 1998, nine months into the first year of a three-year project, the authors have made significant progress in the database development, have begun testing the models, and have been performance testing the apparatus on the pure components.

  10. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and
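
    The final aggregation step described above, combining rupture rates with the simulated intensity measures of each rupture's variations into a site hazard curve, can be sketched in a few lines of Python. The rupture rates and intensity values below are synthetic placeholders, not CyberShake outputs.

        import numpy as np

        rng = np.random.default_rng(0)
        # one (annual_rate, peak-IM array over rupture variations) pair per rupture
        ruptures = [(rng.uniform(1e-4, 1e-2), rng.lognormal(-2.0, 0.6, 50))
                    for _ in range(100)]

        im_levels = np.logspace(-2, 0.5, 30)   # e.g. spectral acceleration in g
        hazard = np.zeros_like(im_levels)
        for rate, ims in ruptures:
            # exceedance rate = rupture rate * fraction of variations above each level
            hazard += rate * (ims[:, None] > im_levels).mean(axis=0)
        print(hazard[:5])   # annual exceedance rates at the lowest IM levels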

  11. A mental models approach to exploring perceptions of hazardous processes

    Based on mental models theory, a decision-analytic methodology is developed to elicit and represent perceptions of hazardous processes. An application to indoor radon illustrates the methodology. Open-ended interviews were used to elicit non-experts' perceptions of indoor radon, with explicit prompts for knowledge about health effects, exposure processes, and mitigation. Subjects then sorted photographs into radon-related and unrelated piles, explaining their rationale aloud as they sorted. Subjects demonstrated a small body of correct but often unspecific knowledge about exposure and effects processes. Most did not mention radon-decay processes, and seemed to rely on general knowledge about gases, radioactivity, or pollution to make inferences about radon. Some held misconceptions about contamination and health effects resulting from exposure to radon. In two experiments, subjects reading brochures designed according to the author's guidelines outperformed subjects reading a brochure distributed by the EPA on a diagnostic test, and did at least as well on an independently designed quiz. In both experiments, subjects who read any one of the brochures had more complete and correct knowledge about indoor radon than subjects who did not, whose knowledge resembled the radon-interview subjects'

  12. Hidden Markov models for estimating animal mortality from anthropogenic hazards

    Carcass searches are a common method for studying the risk of anthropogenic hazards to wildlife, including non-target poisoning and collisions with anthropogenic structures. Typically, numbers of carcasses found must be corrected for scavenging rates and imperfect detection. ...

  13. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code developments is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates the small and medium events, the latter takes into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.

  14. Statistical modeling of ground motion relations for seismic hazard analysis

    Raschke, Mathias

    2012-01-01

    We introduce a new approach for ground motion relations (GMR) in the probabilistic seismic hazard analysis (PSHA), influenced by the extreme value theory of mathematical statistics. Therein, we understand a GMR as a random function. We derive mathematically the principle of area-equivalence, wherein two alternative GMRs have an equivalent influence on the hazard if these GMRs have equivalent area functions. This includes local biases. An interpretation of the difference between these GM...

  15. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures

  16. Conceptual geoinformation model of natural hazards risk assessment

    Kulygin, Valerii

    2016-04-01

    Natural hazards are a major threat to safe interactions between nature and society. The assessment of natural hazard impacts and their consequences is important in spatial planning and resource management. Today there is a challenge to advance our understanding of how socio-economic and climate changes will affect the frequency and magnitude of hydro-meteorological hazards and associated risks. However, the impacts from different types of natural hazards on various marine and coastal economic activities are not of the same type. In this study, a conceptual geomodel of risk assessment is presented to highlight the differentiation by type of economic activity in extreme event risk assessment. The marine and coastal ecosystems are considered as the objects of management, on the one hand, and as the place of origin of natural hazards, on the other. One of the key elements in describing such systems is the spatial characterization of their components. Assessment of ecosystem state is based on ecosystem indicators (indexes), which are used to identify changes over time. The scenario approach is utilized to account for the spatio-temporal dynamics and uncertainty factors. Two types of scenarios are considered: scenarios of the use of ecosystem services by economic activities, and scenarios of extreme events and related hazards. The reported study was funded by RFBR, according to the research project No. 16-35-60043 mol_a_dk.

  17. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    Grasso, S.; Maugeri, M.

    rigorous complex methods of analysis or qualitative procedures. A semi-quantitative procedure based on the definition of a geotechnical hazard index has been applied for the zonation of the seismic geotechnical hazard of the city of Catania. In particular, this procedure has been applied to define the influence of the geotechnical properties of soil in a central area of the city of Catania, where some historical buildings of great importance are sited. An investigation was also performed based on the inspection of more than one hundred historically important ecclesiastical buildings located in the city. Then, in order to identify the amplification effects due to site conditions, a geotechnical survey form was prepared to allow a semi-quantitative evaluation of the seismic geotechnical hazard for all these historical buildings. In addition, to evaluate the time-history response of the foundation soil, a 1-D dynamic soil model was employed for all these buildings, considering the non-linearity of soil behaviour. Using a GIS, a map of the seismic geotechnical hazard, a map of the liquefaction hazard and a preliminary map of the seismic hazard for the city of Catania have been obtained. From the analysis of the obtained results it may be noticed that high hazard zones are mainly clayey sites

  18. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency management. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site, numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have been validated themselves with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes

  19. Modelling the costs of natural hazards in games

    Bostenaru-Dan, M.

    2012-04-01

    Games simulating the city are looked for today, including a development at the University of Torino called SimTorino, which simulates the development of the city in the next 20 years. The connection to another games genre besides video games, board games, will be investigated, since there are games on the construction and reconstruction of a cathedral and its tower and a bridge in an urban environment of the Middle Ages, based on the two novels of Ken Follett, "Pillars of the Earth" and "World Without End", and also more recent games, such as "Urban Sprawl" or the Romanian game "Habitat", dealing with the man-made hazard of demolition. A review of these games will be provided based on first-hand playing experience. In games like "World Without End" or "Pillars of the Earth", just like in the recently popular games of Zynga on social networks, construction management is done by "building" an item out of stylised materials, such as "stone", "sand" or more specific ones such as "nail". Such an approach could also be used for retrofitting buildings for earthquakes, as an "upgrade" rather than just an extension as is currently the case in games, and this is what our research is about. "World Without End" includes a natural disaster not much analysed today but which was judged by the author as the worst of mankind: the Black Death. The Black Death has effects and costs as well, modelled not only through action cards but also on the built environment, by buildings remaining empty. On the other hand, games such as "Habitat" rely on role playing, which has recently been recognised as a way to bring games theory to decision making through the so-called contribution of drama, a way to solve conflicts through balancing instead of weighting, and thus related to the Analytic Hierarchy Process. The presentation also aims to give hints on how to design a game for the problem of earthquake retrofit, translating the aims of the actors in such a process into role playing. Games are also employed in teaching of urban

  20. Complex Modelling Scheme Of An Additive Manufacturing Centre

    Popescu, Liliana Georgeta

    2015-09-01

    This paper presents a modelling scheme supporting the development of an additive manufacturing research centre model and its processes. The modelling is performed using IDEF0, and the resulting process model represents the basic processes required to develop such a centre in any university. While the activities presented in this study are those recommended in general, changes may be required in the specific situation of an existing research centre.

  1. Marginal integration $M-$estimators for additive models

    Boente, Graciela; Martinez, Alejandra

    2015-01-01

    Additive regression models have a long history in multivariate nonparametric regression. They provide a model in which each regression function depends only on a single explanatory variable, allowing one to obtain estimators at the optimal univariate rate. Beyond backfitting, marginal integration is a common procedure to estimate each component. In this paper, we propose a robust estimator of the additive components which combines local polynomials on the component to be estimated and marginal int...
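
    As a rough illustration of the marginal integration idea (not the robust M-estimation the paper actually proposes), the sketch below estimates one additive component by smoothing the response over both covariates and then averaging the fitted surface over the empirical distribution of the other covariate; the data, bandwidth and function names are all hypothetical.

        import numpy as np

        def nw2d(x1, x2, X, y, h=0.3):
            # product Gaussian-kernel Nadaraya-Watson estimate at one point
            w = np.exp(-0.5 * (((X[:, 0] - x1) / h) ** 2 + ((X[:, 1] - x2) / h) ** 2))
            return np.sum(w * y) / np.sum(w)

        def marginal_integration(grid, X, y, h=0.3):
            # average the bivariate smooth over the empirical law of x2
            return np.array([np.mean([nw2d(u, v, X, y, h) for v in X[:, 1]])
                             for u in grid])

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(300, 2))
        y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.2, 300)
        g1 = marginal_integration(np.linspace(-1, 1, 21), X, y)
        g1 -= g1.mean()  # centre the component for identifiability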

  2. Process chain modeling and selection in an additive manufacturing context

    Thompson, Mary Kathryn; Stolfi, Alessandro; Mischkot, Michael

    2016-01-01

    This paper introduces a new two-dimensional approach to modeling manufacturing process chains. This approach is used to consider the role of additive manufacturing technologies in process chains for a part with micro scale features and no internal geometry. It is shown that additive manufacturing...... can compete with traditional process chains for small production runs. Combining both types of technology added cost but no benefit in this case. The new process chain model can be used to explain the results and support process selection, but process chain prototyping is still important for rapidly...... evolving fields like additive manufacturing....

  3. Spatial Distributed Seismicity Model of Seismic Hazard Mapping in the North-China Region: A Comparison with the GSHAP

    Zhong, Q.; Shi, B.; Meng, L.

    2010-12-01

    North China is one of the most seismically active regions in mainland China. Moderate to large earthquakes have occurred there throughout history, resulting in huge losses of human life and property. With the probabilistic seismic hazard analysis (PSHA) approach, we investigate the influence of different seismic environments, incorporating both near-surface soil properties and distributed historical and modern seismicity. A simplified seismic source model, derived with consideration of regional active fault distributions, is presented for the North China region. The spatially distributed seismicity model of PSHA is used to calculate the level of ground motion likely to be exceeded in a given time period. Following the circular Gaussian smoothing procedure of Frankel (1995), we propose a fault-rupture-oriented elliptical Gaussian smoothing for the PSHA calculation, under the assumption that earthquakes occur on faults or in the fault zones of past earthquakes, to delineate potential seismic zones (Lapajine et al., 2003); this is combined with regional active fault strike directions and seismicity distribution patterns. A Next Generation Attenuation (NGA) model (Boore et al., 2007) is used to generate hazard maps for PGA with 2%, 5%, and 10% probability of exceedance in 50 years, and the resulting hazard map is compared with that of the Global Seismic Hazard Assessment Project (GSHAP). There is general agreement in PGA distribution patterns between the results of this study and the GSHAP map that used the same seismic source zones. However, peak ground accelerations predicted in this study are typically 10-20% less than those of the GSHAP, and the seismic source models, such as fault distributions and regional seismicity, used in the GSHAP seem to be oversimplified. We believe this study represents an improvement on prior seismic hazard evaluations for the region. In addition to the updated input data, we believe that, by
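
    The smoothing step described above can be sketched in a few lines: grid the catalogue into cell counts and convolve with a Gaussian kernel, a longer correlation length along the fault strike giving the elliptical variant. The sketch below assumes an axis-aligned strike and a made-up catalogue; a rotated kernel would be needed for an arbitrary strike direction.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        # hypothetical catalogue: epicentres binned onto a 100 x 100 cell grid
        rng = np.random.default_rng(1)
        counts = np.zeros((100, 100))
        np.add.at(counts, (rng.integers(0, 100, 200), rng.integers(0, 100, 200)), 1)

        # circular (Frankel-style) smoothing with a single correlation length
        circular = gaussian_filter(counts, sigma=5.0)

        # elliptical variant: longer correlation length along the strike (x axis)
        elliptical = gaussian_filter(counts, sigma=(8.0, 3.0))

        # normalised smoothed counts act as spatial activity rates in the PSHA
        rate = elliptical / elliptical.sum() * counts.sum()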

  4. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on the models that must be used in these efforts. To identify which models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  5. Seismic hazard methodology for nuclear facilities: modeling input interpretations

    Recent developments of probabilistic seismic hazard methodology, specifically to assess hazard at low annual probabilities (of the order of 10^-3 per year) at locations in the central and eastern US, have been based on input interpretations by multiple experts. In these studies, a number of individual scientists or teams of scientists provide interpretations of seismic sources and their associated seismicity parameters. To express uncertainty, multiple alternative interpretations are provided. Uncertainty about seismic wave attenuation is treated similarly by assigning weights to potentially applicable attenuation relationships. A seismic hazard methodology developed at the Electric Power Research Institute (EPRI) follows this general approach. However, a number of modifications have been incorporated to provide fully traceable interpretations of input parameters based on state-of-the-art earth science practice, to specifically distinguish scientific and information uncertainty, and to make maximum use of historic earthquake data to assess seismicity parameters. The goal of the program has been to develop a procedure that is consistent with earth science practice, that facilitates expression of uncertainty in seismic hazard input interpretations, and that is generally applicable.

  6. Additive Intensity Regression Models in Corporate Default Analysis

    Lando, David; Medhat, Mamdouh; Nielsen, Mads Stenbo;

    2013-01-01

    We consider additive intensity (Aalen) models as an alternative to the multiplicative intensity (Cox) models for analyzing the default risk of a sample of rated, nonfinancial U.S. firms. The setting allows for estimating and testing the significance of time-varying effects. We use a variety of mo...
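
    An Aalen additive intensity model of this kind can be fitted in Python with the lifelines package (named here as a stand-in for whatever software the authors used); the firm covariates and data below are invented for illustration.

        import numpy as np
        import pandas as pd
        from lifelines import AalenAdditiveFitter

        rng = np.random.default_rng(2)
        n = 500
        df = pd.DataFrame({
            "leverage": rng.uniform(0, 1, n),      # hypothetical firm covariates
            "rating": rng.integers(1, 8, n),
            "duration": rng.exponential(10, n),    # years to default or censoring
            "default": rng.integers(0, 2, n),      # 1 = default observed
        })

        aaf = AalenAdditiveFitter(coef_penalizer=0.1)
        aaf.fit(df, duration_col="duration", event_col="default")
        # cumulative regression functions B_k(t); their slopes are the
        # time-varying additive effects whose significance is tested
        print(aaf.cumulative_hazards_.head())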

  7. Comparison of empirical and data driven hydrometeorological hazard models on coastal cities of São Paulo, Brazil

    Koga-Vicente, A.; Friedel, M. J.

    2010-12-01

    Every year thousands of people are affected by flood and landslide hazards caused by rainstorms. The problem is more serious in tropical developing countries because of high susceptibility, a result of the large amount of energy available to form storms, and high vulnerability due to poor economic and social conditions. Predictive models of hazards are important tools to manage this kind of risk. In this study, two different modeling approaches were compared for predicting hydrometeorological hazards in 12 cities on the coast of São Paulo, Brazil, from 1994 to 2003. In the first approach, an empirical multiple linear regression (MLR) model was developed and used; the second approach used a type of unsupervised nonlinear artificial neural network called a self-organizing map (SOM). Using twenty-three independent variables of susceptibility (precipitation, soil type, slope, elevation, and regional atmospheric system scale) and vulnerability (population distribution and totals, income and educational characteristics, poverty intensity, human development index), binary hazard responses were obtained. Model performance assessed by cross-validation indicated that the MLR and SOM model accuracies were about 67% and 80%, respectively. Prediction accuracy can be improved by the addition of information, but the SOM approach is preferred because of sparse data and highly nonlinear relations among the independent variables.
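
    The empirical MLR branch of such a comparison is easy to reproduce in outline: regress the binary hazard response on the predictors, threshold the cross-validated fitted values, and score the accuracy. Everything below (predictor count, threshold, data) is illustrative; the SOM branch would require a separate library such as minisom.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(3)
        n, p = 120, 5                    # e.g. city-years by predictors
        X = rng.normal(size=(n, p))      # susceptibility/vulnerability variables
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 0.5).astype(float)

        # MLR on a binary hazard response, thresholded at 0.5
        yhat = cross_val_predict(LinearRegression(), X, y, cv=10)
        print("cross-validated accuracy:", np.mean((yhat > 0.5) == y))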

  8. A nonparametric dynamic additive regression model for longitudinal data

    Martinussen, Torben; Thomas H. Scheike

    2000-01-01

    In this work we study additive dynamic regression models for longitudinal data. These models provide a flexible and nonparametric method for investigating the time-dynamics of longitudinal data. The methodology is aimed at data where measurements are recorded at random time points. We model the conditional mean of responses given the full internal history and possibly time-varying covariates. We derive the asymptotic distribution for a new nonparametric least squares estimat...

  9. Debris flow hazard modelling on medium scale: Valtellina di Tirano, Italy

    Blahut, J.; Horton, P.; Sterlacchini, S.; Jaboyedoff, M.

    2010-11-01

    Debris flow hazard modelling at medium (regional) scale has been subject of various studies in recent years. In this study, hazard zonation was carried out, incorporating information about debris flow initiation probability (spatial and temporal), and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lacking data limited the use of process-based models for the runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information, and the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). An estimation of the debris flow magnitude was neglected as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model, with a 10 m resolution, was used together with landuse, geology and debris flow hazard initiation maps as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates the information about debris flow spreading direction probabilities, showing areas more likely to be affected by future debris flows. Limitations of the modelling arise mainly from

  11. Linear non-threshold radiation hazards (LNT) model and the evaluation of the current model

    To introduce the linear non-threshold (LNT) model used in studies of the dose effect of radiation hazards and to evaluate current applications of the model. A comprehensive analysis of the literature presents an objective point of view. Results: the LNT model describes the biological effects induced by high doses more accurately than those induced by low doses; the repairable-conditionally repairable model of cell radiation effects can account well for the cell survival curve over high, medium and low radiation dose ranges; the assessment model for effective dose from internal radiation, based on LNT assumptions and individual mean organ equivalent doses, still carries many uncertainties, and to take gender differences into account it is necessary to establish gender-specific voxel human models. Conclusion: the advantages and disadvantages of the various models coexist. Until a new theory and new model emerge, following the current theories and assessing radiation hazards with the LNT model remains the most scientific attitude and a wise choice. (authors)

  12. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.

  13. Investigation of the Effect of Traffic Parameters on Road Hazard Using Classification Tree Model

    Md. Mahmud Hasan

    2012-09-01

    This paper presents a method for the identification of hazardous situations on freeways. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne, Australia was selected as a test bed. Three categories of data, i.e. traffic, weather and accident records, were used for the analysis and modelling. A classification tree based model was developed in this study to estimate crash risk probability. In formulating the model, it was found that weather conditions did not have a significant impact on accident occurrence, so the classification tree was built using only two traffic indices: traffic flow and vehicle speed. The formulated classification tree is able to identify possible hazard and non-hazard situations on the freeway. The outcome of the study will aid hazard mitigation strategies.
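
    A classification tree on just those two indices can be sketched with scikit-learn's CART implementation (a stand-in for the tree algorithm used in the paper); the traffic data and the hazard rule generating the labels below are made up.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(4)
        n = 1000
        flow = rng.uniform(200, 2000, n)    # traffic flow (veh/h), illustrative
        speed = rng.uniform(20, 110, n)     # vehicle speed (km/h), illustrative
        X = np.column_stack([flow, speed])
        y = ((flow > 1400) & (speed < 60)).astype(int)  # hypothetical hazard rule

        tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50)
        print(cross_val_score(tree, X, y, cv=5).mean())  # hazard/non-hazard accuracy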

  14. Using Set Model for Learning Addition of Integers

    Umi Puji Lestari

    2015-07-01

    This study aims to investigate how a set model can help students' understanding of addition of integers in the fourth grade. The study was carried out with 23 students and a teacher of class IVC at SD Iba Palembang in January 2015. This study is design research that also promotes PMRI as the underlying design context and activity. Results showed that the use of a set model, packaged in activities of recording financial transactions with chips in two colours and a card game, can help students to understand the concept of the zero pair, addition with same-coloured chips, and the cancellation strategy.

  15. Modeling Errors in Daily Precipitation Measurements: Additive or Multiplicative?

    Tian, Yudong; Huffman, George J.; Adler, Robert F.; Tang, Ling; Sapiano, Matthew; Maggioni, Viviana; Wu, Huan

    2013-01-01

    The definition and quantification of uncertainty depend on the error model used. For uncertainties in precipitation measurements, two types of error models have been widely adopted: the additive error model and the multiplicative error model. This leads to incompatible specifications of uncertainties and impedes intercomparison and application. In this letter, we assess the suitability of both models for satellite-based daily precipitation measurements in an effort to clarify the uncertainty representation. Three criteria were employed to evaluate the applicability of either model: (1) better separation of the systematic and random errors; (2) applicability to the large range of variability in daily precipitation; and (3) better predictive skills. It is found that the multiplicative error model is a much better choice under all three criteria. It extracted the systematic errors more cleanly, was more consistent with the large variability of precipitation measurements, and produced superior predictions of the error characteristics. The additive error model had several weaknesses, such as nonconstant variance resulting from systematic errors leaking into random errors, and a lack of predictive capability. Therefore, the multiplicative error model is the better choice.
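
    The two error models can be written as Y = X + e (additive) and Y = X · exp(e) (multiplicative). The sketch below, on synthetic rain data, illustrates the heteroscedasticity criterion: the additive residual spread grows with rain rate while the log-ratio error stays roughly constant. All numbers are invented.

        import numpy as np

        rng = np.random.default_rng(5)
        truth = rng.gamma(shape=0.5, scale=8.0, size=2000)       # "true" rain (mm/d)
        est = truth * np.exp(rng.normal(0.2, 0.5, truth.size))   # synthetic retrieval

        add_err = est - truth           # residual under the additive model
        mul_err = np.log(est / truth)   # residual under the multiplicative model

        for lo, hi in [(0, 5), (5, 20), (20, 200)]:
            sel = (truth >= lo) & (truth < hi)
            print(lo, hi, add_err[sel].std().round(2), mul_err[sel].std().round(2))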

  16. A new method for avalanche hazard mapping using a combination of statistical and deterministic models

    M. Barbolini; Keylock, C. J.

    2002-01-01

    The purpose of the present paper is to propose a new method for avalanche hazard mapping using a combination of statistical and deterministic modelling tools. The methodology is based on frequency-weighted impact pressure, and uses an avalanche dynamics model embedded within a statistical framework. The outlined procedure provides a useful way for avalanche experts to produce hazard maps for the typical case of avalanche sites where histor...

  17. Scalable audio separation with light kernel additive modelling

    Liutkus, Antoine; Fitzgerald, Derry; Rafii, Zafar

    2015-01-01

    Recently, Kernel Additive Modelling (KAM) was proposed as a unified framework to achieve multichannel audio source separation. Its main feature is to use kernel models for locally describing the spectrograms of the sources. Such kernels can capture source features such as repetitivity, stability over time and/or frequency, self-similarity, etc. KAM notably subsumes many popular and effective methods from the state of the art, including REPET and harmonic/percussive separation with median filt...
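
    One of the KAM special cases mentioned, median-filter harmonic/percussive separation, is available off the shelf in librosa; a minimal sketch (file name hypothetical, kernel sizes arbitrary):

        import librosa

        y, sr = librosa.load("mixture.wav", sr=None)
        S = librosa.stft(y)
        # kernel_size sets the time/frequency extent of the two source kernels
        H, P = librosa.decompose.hpss(S, kernel_size=(17, 17))
        harmonic = librosa.istft(H)
        percussive = librosa.istft(P)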

  18. A-optimal designs for an additive quadratic mixture model

    Chan, LY; Guan, YN; Zhang, CQ

    1998-01-01

    Quadratic models are widely used in the analysis of experiments involving mixtures. This paper gives A-optimal designs for an additive quadratic mixture model for q ≥ 3 mixture components. It is proved that in these A-optimal designs, vertices of the simplex S^{q-1} are support points, and other support points shift gradually from barycentres of depth 1 to barycentres of depth 3 as q increases. A-optimal designs with minimal support are also discussed.
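
    For concreteness, the additive quadratic mixture model referred to is usually written (our assumption: the Darroch-Waller parameterization, which drops the Scheffé cross-product terms β_ij x_i x_j) as

        E(y) = \sum_{i=1}^{q} \beta_i x_i + \sum_{i=1}^{q} \gamma_i x_i (1 - x_i),
        \qquad \sum_{i=1}^{q} x_i = 1, \quad x_i \ge 0.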

  19. Single-Index Additive Vector Autoregressive Time Series Models

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.
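
    One plausible way to write the model class described above in a formula (our reading, not the authors' exact notation) is

        X_t = \mu + \sum_{j=1}^{p} f_j\!\left(\beta_j^{\top} X_{t-j}\right) + \varepsilon_t,

    where each β_j is a single-index direction for lag j and each f_j is an unknown smooth link estimated by P-splines, so the lag effects enter additively.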

  20. Potential of weight of evidence modelling for gully erosion hazard assessment in Mbire District - Zimbabwe

    Dube, F.; Nhapi, I.; Murwira, A.; Gumindoga, W.; Goldin, J.; Mashauri, D. A.

    Gully erosion is an environmental concern particularly in areas where landcover has been modified by human activities. This study assessed the extent to which the potential of gully erosion could be successfully modelled as a function of seven environmental factors (landcover, soil type, distance from river, distance from road, Sediment Transport Index (STI), Stream Power Index (SPI) and Wetness Index (WI) using a GIS-based Weight of Evidence Modelling (WEM) in the Mbire District of Zimbabwe. Results show that out of the studied seven factors affecting gully erosion, five were significantly correlated (p < 0.05). A gully erosion hazard map showed that 78% of the very high hazard class area is within a distance of 250 m from rivers. Model validation indicated that 70% of the validation set of gullies were in the high hazard and very high hazard class. The resulting map of areas susceptible to gully erosion has a prediction accuracy of 67.8%. The predictive capability of the weight of evidence model in this study suggests that landcover, soil type, distance from river, STI and SPI are useful in creating a gully erosion hazard map but may not be sufficient to produce a valid map of gully erosion hazard.
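
    The weight-of-evidence weights behind such a map reduce to log-ratios of conditional probabilities; a minimal sketch, with an invented binary "distance from river" layer and gully inventory:

        import numpy as np

        def weights_of_evidence(factor, gully):
            # factor, gully: boolean arrays over grid cells
            D, B = gully.astype(bool), factor.astype(bool)
            w_plus = np.log(np.mean(B[D]) / np.mean(B[~D]))      # factor present
            w_minus = np.log(np.mean(~B[D]) / np.mean(~B[~D]))   # factor absent
            return w_plus, w_minus, w_plus - w_minus             # contrast C

        rng = np.random.default_rng(6)
        near_river = rng.random(10000) < 0.3          # e.g. within 250 m of a river
        gully = rng.random(10000) < np.where(near_river, 0.15, 0.03)
        print(weights_of_evidence(near_river, gully))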

  1. Traffic Incident Clearance Time and Arrival Time Prediction Based on Hazard Models

    Yang beibei Ji

    2014-01-01

    Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data were obtained from the Queensland Department of Transport and Main Roads' STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best fitting distributions are drawn for both clearance and arrival time for three types of incident: crash, stationary vehicle, and hazard. The results show that Gamma, Log-logistic, and Weibull are the best fits for crash, stationary vehicle, and hazard incidents, respectively. Significant impact factors are identified for crash clearance time and arrival time, the quantitative influences for crash and hazard incidents are presented for both clearance and arrival, and model accuracy is analyzed at the end.
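
    Choosing among the three candidate families amounts to fitting each by maximum likelihood and comparing fit, e.g. by AIC; a sketch with scipy on synthetic clearance times (the real SIMS data are not reproduced here; scipy's fisk is the log-logistic):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        clearance = rng.weibull(1.5, 400) * 45.0   # synthetic clearance times (min)

        candidates = {"gamma": stats.gamma,
                      "log-logistic": stats.fisk,
                      "weibull": stats.weibull_min}
        for name, dist in candidates.items():
            params = dist.fit(clearance, floc=0)   # location fixed at zero
            loglik = np.sum(dist.logpdf(clearance, *params))
            aic = 2 * len(params) - 2 * loglik
            print(f"{name:12s} AIC = {aic:.1f}")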

  2. Bayesian Analysis of Hazard Regression Models under Order Restrictions on Covariate Effects and Ageing

    Bhattacharjee, Arnab; Bhattacharjee, Madhuchhanda

    2007-01-01

    We propose Bayesian inference in hazard regression models where the baseline hazard is unknown, covariate effects are possibly age-varying (non-proportional), and there is multiplicative frailty with arbitrary distribution. Our framework incorporates a wide variety of order restrictions on covariate dependence and duration dependence (ageing). We propose estimation and evaluation of age-varying covariate effects when covariate dependence is monotone rather than proportional. In particular, we...

  3. An approach for flood hazard modelling and mapping in the medium Valtellina

    Poretti, I; Amicis, M.

    2011-01-01

    In the Lombardy Region, as in many other contexts all over the world, hazard maps do not have precise legislative confirmation. Despite this, they are necessary to support several institutional activities, among them local urban planning. An approach to hazard analysis and mapping that fits the Lombardy Region legislative framework is presented here; it introduces a level of experimental modelling, making use of SOBEK 1-D–2-D as a tool for hydrodynamic simulations. A stretch of 17 km...

  4. 24/7 population modelling for enhanced assessment of exposure to natural hazards

    Smith, Alan; Martin, David; Cockings, Samantha

    2013-01-01

    There is a growing need for accurate spatio-temporal population estimates, free from arbitrary administrative boundaries and temporal divisions, to make enhanced assessments of population exposure to natural hazards. The approach proposed here combines a spatio-temporal gridded population model, used to estimate temporal variations in population, with natural hazard exposure estimations. It has been exemplified through a Southampton (UK) centred application using Environment Agency flood m...

  5. Additive modelling reveals spatiotemporal PCBs trends in marine sediments

    G. EVERAERT; De Laender, F.; Deneudt, K.; Roose, P.; Mees, J.; Goethals, P.L.M.; Janssen, C.R.

    2014-01-01

    We developed generalised additive mixed models (GAMMs) to infer spatiotemporal trends of environmental PCB concentrations from an extensive dataset (n = 1219) of PCB concentrations measured between 1991 and 2010 in sediments of the Belgian Coastal Zone (BCZ) and the Western Scheldt estuary. A GAMM with time, geographical zone, periodicity and the organic carbon - water partition coefficient as covariates explained 49% of the variability in the log transformed PCB sediment concentrations. The ...

  6. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently on-going initiatives like the creation of a suite of tools for the creation of PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.

  7. Ground water flow modeling at a hazardous waste site for regulatory compliance

    The Pacific Northwest Laboratory has developed a model of an unconfined ground water flow system that is located beneath a hazardous waste facility and is subject to regulations outlined in the Resource Conservation and Recovery Act (RCRA). This facility is located on the Hanford Site in southeastern Washington State near the Columbia River. Characterization of the ground water flow system is complicated by continuous river-stage fluctuations. Water-table elevation changes of several feet per day are observed in monitoring wells near the facility because of changes in the river stage that are controlled by discharges from Priest Rapids Dam located upstream. A two-dimensional, finite difference, ground water flow model was calibrated with 6 months of continuous water-level measurements from three wells and continuous river-stage data. Two-hour time steps were used in transient simulations. Measured river elevations were used during each time step as a constant-head boundary. The modeled responses at well locations showed an acceptable match with ground water levels recorded in the field and hydraulic gradients in the vicinity of the facility. Simulations of the unconfined aquifer were made to estimate the probable paths of any transport from the facility, and recommendations were made for the placement of additional monitoring wells for regulatory compliance. The flow model and a planned solute transport model will be used to site and limit the number of wells required later. Ultimately, modeling will assist in additional site characterization and will be used to support the evaluation of future actions
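
    The transient simulation described above can be caricatured in one dimension: an explicit finite-difference update of heads with the measured river stage imposed as a time-varying constant-head boundary at each two-hour step. All parameter values below are invented, not the calibrated Hanford values.

        import numpy as np

        T, S = 500.0, 0.15         # transmissivity (m2/d), storage coefficient
        dx, dt = 50.0, 2.0 / 24.0  # 50 m cells, 2-hour time steps
        nx, nt = 60, 24 * 6        # 3 km slice, 6 simulated days
        alpha = T * dt / (S * dx ** 2)
        assert alpha < 0.5         # explicit-scheme stability

        h = np.full(nx, 105.0)     # initial head (m)
        t = np.arange(nt) * dt
        river = 105.0 + 1.0 * np.sin(2 * np.pi * t)  # daily dam-driven stage swing

        for k in range(nt):
            h[0] = river[k]        # river as constant-head boundary this step
            h[-1] = 105.0          # far-field fixed head
            h[1:-1] += alpha * (h[2:] - 2.0 * h[1:-1] + h[:-2])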

  8. Understanding Rasch Measurement: The Rasch Model, Additive Conjoint Measurement, and New Models of Probabilistic Measurement Theory.

    Karabatsos, George

    2001-01-01

    Describes similarities and differences between additive conjoint measurement and the Rasch model, and formalizes some new nonparametric item response models that are, in a sense, probabilistic measurement theory models. Applies these new models to published and simulated data. (SLD)

  9. LNG fires: A review of experimental results, models and hazard prediction challenges

    A number of experimental investigations of LNG fires (of 35 m diameter and smaller) were undertaken worldwide during the 1970s and 1980s to study their physical and radiative characteristics. This paper reviews the published data from several of these tests, including the largest tests to date, the 35 m Montoir tests. Also reviewed is the state of the art in modeling LNG pool and vapor fires, including thermal radiation hazard modeling. The review is limited to the integral and semi-empirical models (solid flame and point source); CFD models are not reviewed. Several aspects of modeling LNG fires are covered, including physical characteristics such as the (visible) fire size and shape, tilt and drag in windy conditions, smoke production and radiant thermal output, as well as the treatment of experimental data in the models. Comparisons of model results with experimental data are indicated and current deficiencies in modeling are discussed. The requirements in US and European regulations related to LNG fire hazard assessment are reviewed in brief, in the light of model inaccuracies, criteria for hazards to people and structures, and the effects of mitigating circumstances. The paper identifies: (i) critical parameters for which there exist no data, (ii) uncertainties and unknowns in modeling and (iii) deficiencies and gaps in current regulatory recipes for predicting hazards.

  10. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials and develop experimental methods to provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, useable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate. Examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, assessing the impact of process parameters and predicting optimized conditions with numerical modeling as an effective prediction tool is necessary. The targets for the processing are multiple and at different spatial scales, and the physical phenomena associated occur in multiphysics and multiscale. In this project, the research work has been developed to model AAM processes in a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of excessive computing time needed, a parallel computing approach was also tested. In addition

  11. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.

  12. Checking Fine and Gray subdistribution hazards model with cumulative sums of residuals

    Li, Jianing; Scheike, Thomas; Zhang, Mei Jie

    2015-01-01

    Recently, Fine and Gray (J Am Stat Assoc 94:496–509, 1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, failure of model adequacy could lead to severe bias in parameter ... of residuals, which validate the model in three aspects: (1) proportionality of the hazard ratio, (2) the linear functional form and (3) the link function. For each assumption tested, we provide p-values and a visualized plot against the null hypothesis using a simulation-based approach. We also consider ...

  14. Two-stage local M-estimation of additive models

    JIANG JianCheng; LI JianTao

    2008-01-01

    This paper studies local M-estimation of the nonparametric components of additive models. A two-stage local M-estimation procedure is proposed for estimating the additive components and their derivatives. Under very mild conditions, the proposed estimators of each additive component and its derivative are jointly asymptotically normal and share the same asymptotic distributions as they would be if the other components were known. The established asymptotic results also hold for two particular local M-estimations: the local least squares and least absolute deviation estimations. However, for general two-stage local M-estimation with continuous and nonlinear ψ-functions, its implementation is time-consuming. To reduce the computational burden, one-step approximations to the two-stage local M-estimators are developed. The one-step estimators are shown to achieve the same efficiency as the fully iterative two-stage local M-estimators, which makes the two-stage local M-estimation more feasible in practice. The proposed estimators inherit the advantages and at the same time overcome the disadvantages of the local least-squares based smoothers. In addition, the practical implementation of the proposed estimation is considered in detail. Simulations demonstrate the merits of the two-stage local M-estimation, and a real example illustrates the performance of the methodology.

  15. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    the outburst of landslide-dammed lakes) remains a challenge: • The knowledge about the onset of the process is often limited (bathymetry of the lakes, subsurface water, properties of dam (content of ice), type of dam breach, understanding of process chains and interactions). • The size of glacial lakes may change rapidly but continuously, and many lakes break out within a short time after their development. Continuous monitoring is therefore required to keep updated on the existing hazards. • Also the outburst of small glacial lakes may lead to significant debris floods or even debris flows if there is plenty of erodible material available. • The available modeling software packages are of limited suitability for lake outburst floods: e.g. software developed by the hydrological community is specialized to simulate (debris) floods with input hydrographs on moderately steep flow channels and with lower sediment loads. In contrast to this, programs for rapid mass movements are better suited on steeper slopes and sudden onset of the movement. The typical characteristics of GLOFs are in between and vary for different channel sections. In summary, the major bottlenecks remain in deriving realistic or worst case scenarios and predicting their magnitude and area of impact. This mainly concerns uncertainties in the dam break process, involved volumes, erosion rates, changing rheologies, and the limited capabilities of available software packages to simulate process interactions and transformations such as the development of a hyperconcentrated flow into a debris flow. In addition, many areas prone to lake outburst floods are located in developing countries with a limited scope of the threatened population for decision-making and limited resources for mitigation.

  16. Addition Table of Colours: Additive and Subtractive Mixtures Described Using a Single Reasoning Model

    Mota, A. R.; Lopes dos Santos, J. M. B.

    2014-01-01

    Students' misconceptions concerning colour phenomena and the apparent complexity of the underlying concepts--due to the different domains of knowledge involved--make its teaching very difficult. We have developed and tested a teaching device, the addition table of colours (ATC), that encompasses additive and subtractive mixtures in a single…

  18. Three multimedia models used at hazardous and radioactive waste sites

    Moskowitz, P.D.; Pardi, R.; Fthenakis, V.M.; Holtzman, S.; Sun, L.C. [Brookhaven National Lab., Upton, NY (United States); Rambaugh, J.O.; Potter, S. [Geraghty and Miller, Inc., Plainview, NY (United States)

    1996-02-01

    Multimedia models are used commonly in the initial phases of the remediation process where technical interest is focused on determining the relative importance of various exposure pathways. This report provides an approach for evaluating and critically reviewing the capabilities of multimedia models. This study focused on three specific models MEPAS Version 3.0, MMSOILS Version 2.2, and PRESTO-EPA-CPG Version 2.0. These models evaluate the transport and fate of contaminants from source to receptor through more than a single pathway. The presence of radioactive and mixed wastes at a site poses special problems. Hence, in this report, restrictions associated with the selection and application of multimedia models for sites contaminated with radioactive and mixed wastes are highlighted. This report begins with a brief introduction to the concept of multimedia modeling, followed by an overview of the three models. The remaining chapters present more technical discussions of the issues associated with each compartment and their direct application to the specific models. In these analyses, the following components are discussed: source term; air transport; ground water transport; overland flow, runoff, and surface water transport; food chain modeling; exposure assessment; dosimetry/risk assessment; uncertainty; default parameters. The report concludes with a description of evolving updates to the model; these descriptions were provided by the model developers.

  19. Estimation and variable selection for generalized additive partial linear models

    Wang, Li

    2011-08-01

    We study generalized additive partial linear models, proposing the use of polynomial spline smoothing for estimation of nonparametric functions, and deriving quasi-likelihood based estimators for the linear parameters. We establish asymptotic normality for the estimators of the parametric components. The procedure avoids solving large systems of equations as in kernel-based procedures and thus results in gains in computational simplicity. We further develop a class of variable selection procedures for the linear parameters by employing a nonconcave penalized quasi-likelihood, which is shown to have an asymptotic oracle property. Monte Carlo simulations and an empirical example are presented for illustration. © Institute of Mathematical Statistics, 2011.

  20. High energy pp scattering in the additive eikonal quark model

    Our additive eikonal quark model is generalized and applied to elastic pp scattering in the energy range 50-2050 GeV. A new long-range interaction term was called for, in particular by the sharp change of the slope of dσ/dt at very small values of |t|. An alternative mechanism to geometrical scaling, in which the radii are almost fixed and the core strength is mainly responsible for the shift of the dip position in dσ/dt, leads to equally good agreement with experiment. (orig.)

  1. Multiscale Modeling of Powder Bed–Based Additive Manufacturing

    Markl, Matthias; Körner, Carolin

    2016-07-01

    Powder bed fusion processes are additive manufacturing technologies that are expected to induce the third industrial revolution. Components are built up layer by layer in a powder bed by selectively melting confined areas, according to sliced 3D model data. This technique allows for manufacturing of highly complex geometries hardly machinable with conventional technologies. However, the underlying physical phenomena are sparsely understood and difficult to observe during processing. Therefore, an intensive and expensive trial-and-error principle is applied to produce components with the desired dimensional accuracy, material characteristics, and mechanical properties. This review presents numerical modeling approaches on multiple length scales and timescales to describe different aspects of powder bed fusion processes. In combination with tailored experiments, the numerical results enlarge the process understanding of the underlying physical mechanisms and support the development of suitable process strategies and component topologies.

  2. Hazardous Concentrations for Ecosystems (HCE): calculation with CATS models

    Traas TP; Aldenberg T; Janse JH; Brock TCM; Roghair CJ; Rijksinstituut voor Volksgezondheid en Milieu (RIVM), Winand Staring Centrum (SC-DLO); LWD

    1995-01-01

    Dose-response functions were fitted to data from laboratory toxicity tests and used to predict the response of functional groups in food webs. Direct effects of chlorpyrifos (CPF), as observed in microcosm experiments, could be modelled adequately by incorporating dose-response functions in a CATS model. Indirect effects of CPF on functional groups, resulting from direct toxicity, could also be predicted with the model. The ecosystem response to toxicants was used to propose a quality sta...

  3. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) Benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk. (2) Directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited. (3) To identify the key processes and chemical and/or environmental parameters that determine the exposure; thereby allowing the effective prioritization of research or measurements to improve the risk profile. (4) Predicting future time trends including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application

  5. Proposal for a probabilistic local level landslide hazard assessment model: The case of Suluktu, Kyrgyzstan

    Vidar Vangelsten, Bjørn; Fornes, Petter; Cepeda, Jose Mauricio; Ekseth, Kristine Helene; Eidsvig, Unni; Ormukov, Cholponbek

    2015-04-01

    Landslides are a significant threat to human life and the built environment in many parts of Central Asia. To improve understanding of the magnitude of the threat and propose appropriate risk mitigation measures, landslide hazard mapping is needed both at regional and local level. Many different approaches for landslide hazard mapping exist depending on the scale and purpose of the analysis and what input data are available. This paper presents a probabilistic local scale landslide hazard mapping methodology for rainfall triggered landslides, adapted to the relatively dry climate found in South-Western Kyrgyzstan. The GIS based approach makes use of data on topography, geology, land use and soil characteristics to assess landslide susceptibility. Together with a selected rainfall scenario, these data are inserted into a triggering model based on an infinite slope formulation considering pore pressure and suction effects for unsaturated soils. A statistical model based on local landslide data has been developed to estimate landslide run-out. The model links the spatial extension of the landslide to land use and geological features. The model is tested and validated for the town of Suluktu in the Ferghana Valley in South-West Kyrgyzstan. Landslide hazard is estimated for the urban area and the surrounding hillsides. The case makes use of a range of data from different sources, both remote sensing data and in-situ data. Public global data sources are mixed with case specific data obtained from field work. The different data and models have various degrees of uncertainty. To account for this, the hazard model has been inserted into a Monte Carlo simulation framework to produce a probabilistic landslide hazard map identifying areas with high landslide exposure. The research leading to these results has received funding from the European Commission's Seventh Framework Programme [FP7/2007-2013], under grant agreement n° 312972 "Framework to integrate Space-based and in
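
    The triggering core of such a model is the infinite-slope factor of safety evaluated under Monte Carlo sampling of uncertain soil parameters; a minimal sketch of the saturated case (the paper's formulation also carries a suction term for unsaturated soils, and all values here are illustrative):

        import numpy as np

        rng = np.random.default_rng(8)
        n = 100_000
        c = rng.uniform(2e3, 8e3, n)                 # effective cohesion (Pa)
        phi = np.radians(rng.uniform(20, 35, n))     # friction angle
        m = rng.uniform(0, 1, n)                     # saturated fraction of column
        beta = np.radians(30.0)                      # slope angle
        z, gamma, gw = 2.0, 19e3, 9.81e3             # depth (m), unit weights (N/m3)

        tau = gamma * z * np.sin(beta) * np.cos(beta)          # driving stress
        sigma_eff = (gamma - m * gw) * z * np.cos(beta) ** 2   # effective normal stress
        fs = (c + sigma_eff * np.tan(phi)) / tau
        print("P(FS < 1) =", np.mean(fs < 1.0))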

  6. Modeling contractor and company employee behavior in high hazard operation

    Lin, P.H.; Hanea, D.; Ale, B.J.M.

    2013-01-01

    The recent blow-out and subsequent environmental disaster in the Gulf of Mexico have highlighted a number of serious problems in scientific thinking about safety. Risk models have generally concentrated on technical failures, which are easier to model and for which there are more concrete data. Howe

  7. Modeling and Testing Landslide Hazard Using Decision Tree

    Mutasem Sh. Alkhasawneh

    2014-01-01

    This paper proposes a decision tree model for specifying the importance of 21 factors causing landslides in a wide area of Penang Island, Malaysia. These factors are vegetation cover, distance from the fault line, slope angle, cross curvature, slope aspect, distance from road, geology, diagonal length, longitude curvature, rugosity, plan curvature, elevation, rain precipitation, soil texture, surface area, distance from drainage, roughness, land cover, general curvature, tangent curvature, and profile curvature. Decision tree models are used for prediction, classification, and factor importance, and are usually represented by an easy-to-interpret tree-like structure. Four models were created using Chi-square Automatic Interaction Detector (CHAID), Exhaustive CHAID, Classification and Regression Tree (CRT), and Quick-Unbiased-Efficient Statistical Tree (QUEST). Twenty-one factors were extracted using digital elevation models (DEMs) and then used as input variables for the models. A data set of 137570 samples was selected for each variable in the analysis, where 68786 samples represent landslides and 68786 samples represent no landslides. 10-fold cross-validation was employed for testing the models. The highest accuracy was achieved using the Exhaustive CHAID (82.0%) model, compared to the CHAID (81.9%), CRT (75.6%), and QUEST (74.0%) models. Across the four models, five factors were identified as most important: slope angle, distance from drainage, surface area, slope aspect, and cross curvature.

  8. Conceptual model of volcanism and volcanic hazards of the region of Ararat valley, Armenia

    Meliksetian, Khachatur; Connor, Charles; Savov, Ivan; Connor, Laura; Navasardyan, Gevorg; Manucharyan, Davit; Ghukasyan, Yura; Gevorgyan, Hripsime

    2015-04-01

    Armenia and the adjacent volcanically active regions in Iran, Turkey and Georgia are located in the collision zone between the Arabian and Eurasian lithospheric plates. The majority of studies of regional collision-related volcanism use the model proposed by Keskin (2003), where volcanism is driven by Neo-Tethyan slab break-off. In Armenia, >500 Quaternary-Holocene volcanoes from the Gegham, Vardenis and Syunik volcanic fields are hosted within pull-apart structures formed by active faults and their segments (Karakhanyan et al., 2002), while the tectonic position of the large-volume basalt-dacite Aragats volcano and its peripheral volcanic plateaus is different: its location away from major fault lines necessitates a more complex volcano-tectonic setup. Our detailed volcanological, petrological and geochemical studies provide insight into the nature of such volcanic activity in the region of Ararat Valley. Most magmas, such as those erupted in Armenia, are volatile-poor and erupt fairly hot. Here we report newly discovered tephra sequences in Ararat valley that were erupted from the historically active Ararat stratovolcano and provide evidence for explosive eruption of young, medium-K2O calc-alkaline and volatile-rich (>4.6 wt% H2O; amphibole-bearing) magmas. Such young eruptions, in addition to the ignimbrite and lava flow hazards from Gegham and Aragats, present a threat to the >1.4 million people (~½ of the population of Armenia) living in the region. We will report numerical simulations of potential volcanic hazards for the region of Ararat valley near Yerevan, including tephra fallout, lava flows and the opening of new vents. Connor et al. (2012) J. Applied Volcanology 1:3, 1-19; Karakhanian et al. (2002), JVGR, 113, 319-344; Keskin, M. (2003) Geophys. Res. Lett. 30, 24, 8046.

  9. Additive manufacturing for consumer-centric business models

    Bogers, Marcel; Hadar, Ronen; Bilberg, Arne

    2016-01-01

    Digital fabrication—including additive manufacturing (AM), rapid prototyping and 3D printing—has the potential to revolutionize the way in which products are produced and delivered to the customer. Therefore, it challenges companies to reinvent their business model—describing the logic of creating and capturing value. In this paper, we explore the implications that AM technologies have for manufacturing systems in the new business models that they enable. In particular, we consider how a consumer goods manufacturer can organize the operations of a more open business model when moving from a manufacturer-centric to a consumer-centric value logic. A major shift includes a move from centralized to decentralized supply chains, where consumer goods manufacturers can implement a "hybrid" approach with a focus on localization and accessibility, or develop a fully personalized model where the consumer effectively takes over the productive activities of the manufacturer. We discuss some of the main implications for research and practice of consumer-centric business models and the changing decoupling point in consumer goods' manufacturing supply chains.

  10. WATEQ3 geochemical model: thermodynamic data for several additional solids

    Geochemical models such as WATEQ3 can be used to model the concentrations of water-soluble pollutants that may result from the disposal of nuclear waste and retorted oil shale. However, for a model to competently deal with these water-soluble pollutants, an adequate thermodynamic data base must be provided that includes elements identified as important in modeling these pollutants. To this end, several minerals and related solid phases were identified that were absent from the thermodynamic data base of WATEQ3. In this study, the thermodynamic data for the identified solids were compiled and selected from several published tabulations of thermodynamic data. For these solids, an accepted Gibbs free energy of formation, ΔG°f,298, was selected for each solid phase based on the recentness of the tabulated data and on considerations of internal consistency with respect to both the published tabulations and the existing data in WATEQ3. For those solids not included in these published tabulations, Gibbs free energies of formation were calculated from published solubility data (e.g., lepidocrocite), or were estimated (e.g., nontronite) using a free-energy summation method described by Mattigod and Sposito (1978). The accepted or estimated free energies were then combined with internally consistent, ancillary thermodynamic data to calculate equilibrium constants for the hydrolysis reactions of these minerals and related solid phases. Including these values in the WATEQ3 data base increased the competency of this geochemical model in applications associated with the disposal of nuclear waste and retorted oil shale. Additional minerals and related solid phases that need to be added to the solubility submodel will be identified as modeling applications continue in these two programs
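
    The conversion from tabulated free energies of formation to the equilibrium constants used by such a model follows the standard thermodynamic relations (a generic statement added for clarity; the stoichiometric coefficients are those of each hydrolysis reaction):

```latex
\Delta G^{0}_{r,298} \;=\; \sum_{\text{products}} \nu_i \,\Delta G^{0}_{f,298,i}
\;-\; \sum_{\text{reactants}} \nu_j \,\Delta G^{0}_{f,298,j},
\qquad
\log_{10} K \;=\; \frac{-\,\Delta G^{0}_{r,298}}{2.303\,R\,T}
```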

  11. Checking Fine and Gray Subdistribution Hazards Model with Cumulative Sums of Residuals

    Li, Jianing; Scheike, Thomas H.; Zhang, Mei-Jie

    2014-01-01

    Recently, Fine and Gray (1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, lack of model adequacy could lead to severe bias in parameter estimation, and only a limited contribution has been made to checking the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray’s model. T...

  12. Probabilistic Modelling of the Seismic Hazard using the Romanian Earthquake Catalogue

    Bogdan F. Popa; Gabriela-Maria Atanasiu

    2005-01-01

    The current trend in performance-based modelling of the seismic action is to adopt probabilistic models of the seismic hazard. The first part of the paper presents theoretical aspects of defining seismic hazard from the probabilistic point of view, as "a function P(Y > y) that describes the probability that in a given region (M) and for a time interval (T), the value of a parameter, Y (for example: macroseismic intensity, acceleration, velocity and displacement of the soil) to ove...
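
    A standard way to make this definition operational, added here for context since the abstract is truncated, is the usual Poisson assumption:

```latex
P(Y > y \mid T) \;=\; 1 - e^{-\lambda(y)\,T},
```

    where λ(y) is the mean annual rate of events producing values of the ground-motion parameter exceeding y.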

  13. Modeling contractor and company employee behavior in high hazard operation

    Lin, P. H.; Hanea, D.; Ale, B.J.M.

    2013-01-01

    The recent blow-out and subsequent environmental disaster in the Gulf of Mexico have highlighted a number of serious problems in scientific thinking about safety. Risk models have generally concentrated on technical failures, which are easier to model and for which there are more concrete data. However, many primary causes of disasters such as BP’s Texas City and Deepwater Horizon are rooted in management decisions and organizational factors. Therefore, there is a strong need to develop a risk m...

  14. Comparing the European (SHARE) and the reference Italian seismic hazard models

    Visini, Francesco; Meletti, Carlo; D'Amico, Vera; Rovida, Andrea; Stucchi, Massimiliano

    2016-04-01

    A probabilistic seismic hazard evaluation for Europe has recently been released by the SHARE project (www.share-eu.org; Giardini et al., 2013; Woessner et al., 2015). A comparison between the SHARE results for Italy and the official Italian seismic hazard model (MPS04, Stucchi et al., 2011), currently adopted by the building code, has been carried out to identify the main input elements that produce the differences between the two models. The SHARE model shows increased expected values (up to 70%) with respect to the MPS04 model for PGA with 10% probability of exceedance in 50 years. However, looking in detail at all output parameters of both models, we observe that for spectral periods greater than 0.3 s the reference PSHA for Italy proposes higher values than the SHARE model over many large areas. This behaviour is mainly driven by the adoption of recent ground-motion prediction equations (GMPEs) that, with respect to the older GMPEs used in MPS04, estimate higher values for PGA and for accelerations at periods below 0.3 s and lower values at longer periods. Another important set of tests consisted of separately analyzing the PSHA results obtained with the three source models adopted in SHARE (i.e., area sources, fault sources with background, and a refined smoothed seismicity model), whereas MPS04 only used area sources. Results show that, besides the strong impact of the GMPEs, the differences in the seismic hazard estimates among the three source models are relevant and, in particular, for some selected test sites the fault-based model returns the lowest estimates of seismic hazard. This result raises questions about the completeness of the fault database, the parameterization of the faults and the assessment of activity rates, as well as about the impact of the threshold magnitude between faults and background. Giardini D. et al., 2013. Seismic Hazard Harmonization in Europe (SHARE): Online Data Resource, doi:10.12686/SED-00000001-SHARE. Stucchi M. et al., 2011. Seismic Hazard

  15. Geo-additive modelling of malaria in Burundi

    Gebhardt Albrecht

    2011-08-01

    Full Text Available Abstract Background Malaria is a major public health issue in Burundi in terms of both morbidity and mortality, with around 2.5 million clinical cases and more than 15,000 deaths each year. It is still the single main cause of mortality in pregnant women and children below five years of age. Because of the severe health and economic burden of malaria, there is still a growing need for methods that will help to understand the influencing factors. Several studies have been carried out on the subject, yielding different results as to which factors are most responsible for the increase in malaria transmission. This paper considers the modelling of the dependence of malaria cases on spatial determinants and climatic covariates including rainfall, temperature and humidity in Burundi. Methods The analysis carried out in this work exploits real monthly data collected in the area of Burundi over 12 years (1996-2007). Semi-parametric regression models are used. The spatial analysis is based on a geo-additive model using provinces as the geographic units of study. The spatial effect is split into structured (correlated) and unstructured (uncorrelated) components. Inference is fully Bayesian and uses Markov chain Monte Carlo techniques. The effects of the continuous covariates are modelled by cubic P-splines with 20 equidistant knots and a second-order random walk penalty. For the spatially correlated effect, a Markov random field prior is chosen. The spatially uncorrelated effects are assumed to be i.i.d. Gaussian. The effects of climatic covariates and the effects of other spatial determinants are estimated simultaneously in a unified regression framework. Results The results obtained from the proposed model suggest that although malaria incidence in a given month is strongly positively associated with the minimum temperature of the previous months, regional patterns of malaria that are related to factors other than climatic variables have been identified
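
    In the structured additive regression framework described here, the predictor for province s in month t typically takes the form below. This is a generic sketch of the model class, not the exact specification of the paper:

```latex
\eta_{st} \;=\; \beta_0
\;+\; f_1(\text{rain}_{st}) \;+\; f_2(\text{temp}_{st}) \;+\; f_3(\text{humid}_{st})
\;+\; f_{\text{str}}(s) \;+\; f_{\text{unstr}}(s),
```

    where the f_j are P-splines, f_str carries a Markov random field prior and f_unstr is i.i.d. Gaussian.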

  16. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

    The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered, since mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at the regional scale, it is necessary to take data uncertainty into account in the analysis. In this perspective, a new hazard modeling method was developed and integrated into a program named ALICE. The program integrates mechanical stability analysis in a GIS environment, taking data uncertainty into account. This method proposes a quantitative classification of landslide hazard and offers a useful tool to save time and improve efficiency in hazard mapping. However, an expert approach is still necessary to finalize the maps, as it is the only way to account for some influential factors in slope stability, such as the heterogeneity of geological formations or the effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios into the ALICE program, and especially their precipitation component, with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, land-cover map, geology, geotechnical data and so forth, the program classifies hazard zones depending on geotechnics and on different hydrological contexts varying in time. This communication, realized within the framework of the SafeLand project, is supported by the European Commission under the 7th Framework Programme for Research and Technological

  17. Hazardous Concentrations for Ecosystems (HCE): calculation with CATS models

    Traas TP; Aldenberg T; Janse JH; Brock TCM; Roghair CJ

    1995-01-01

    Dose-response functions were fitted on data from laboratory toxicity tests and were used to predict the response of functional groups in food webs. Direct effects of Chlorpyrifos (CPF), as observed in microcosm experiments, could be modelled adequately by incorporating dose-response functions in a C

  18. Bayesian nonparametric estimation of hazard rate in monotone Aalen model

    Timková, Jana

    2014-01-01

    Roč. 50, č. 6 (2014), s. 849-868. ISSN 0023-5954 Institutional support: RVO:67985556 Keywords : Aalen model * Bayesian estimation * MCMC Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.541, year: 2014 http://library.utia.cas.cz/separaty/2014/SI/timkova-0438210.pdf

  19. A NOVEL SOFT COMPUTING MODEL ON LANDSLIDE HAZARD ZONE MAPPING

    Iqbal Quraishi

    2012-11-01

    Full Text Available The effect of landslides is very prominent in India as well as worldwide. In India, the North-East region and all areas beneath the Himalayan range are prone to landslides; state-wise, Uttarakhand, Himachal Pradesh and the northern part of West Bengal are identified as landslide risk zones. In West Bengal, the Darjeeling area is identified as our focus zone. There are several types of landslides depending upon various conditions; the most important triggering factor is earthquakes. Both field and GIS data are very versatile and large in amount, and creating a proper data warehouse includes both remote and field studies. Our proposed soft computing model merges the field and remote sensing data, creates an optimized landslide susceptibility map of the zone and also provides a broad risk assessment. It takes census and economic survey data as input to calculate and predict the probable number of damaged houses, roads and other amenities, including the effect on GDP. The model is highly customizable and tends to provide situation-specific results. A fuzzy logic based approach has been considered to partially implement the model in terms of different parameter data sets to show the effectiveness of the proposed model.
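
    The abstract does not spell out the fuzzy machinery; a minimal sketch of the kind of fuzzy weighted overlay such models typically use is shown below, where the membership functions, weights and layer names are all illustrative assumptions.

```python
import numpy as np

def tri_membership(x, lo, peak, hi):
    """Triangular fuzzy membership on [lo, hi], peaking at `peak`."""
    rising = (x - lo) / (peak - lo)
    falling = (hi - x) / (hi - peak)
    return np.clip(np.minimum(rising, falling), 0.0, 1.0)

# Illustrative raster layers (values per map cell), not the paper's data.
rng = np.random.default_rng(1)
slope = rng.uniform(0, 60, size=(100, 100))       # degrees
rainfall = rng.uniform(0, 300, size=(100, 100))   # mm/month
quake = rng.uniform(0, 1, size=(100, 100))        # normalized seismic factor

# Fuzzify each factor, then combine with assumed weights.
mu_slope = tri_membership(slope, 10, 35, 60)
mu_rain = tri_membership(rainfall, 50, 200, 300)
susceptibility = 0.4 * mu_slope + 0.3 * mu_rain + 0.3 * quake

print("high-susceptibility cells:", int((susceptibility > 0.7).sum()))
```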

  20. Combining multiple nondestructive inspection images with a generalized additive model

    In this paper, multiple nondestructive inspection (NDI) images are combined with a generalized additive model to achieve a more precise and reliable assessment of hidden corrosion in aircraft lap joints. Two inspection techniques are considered in this study: the conventional multi-frequency eddy current testing technique and the pulsed eddy current technique. To characterize the thickness loss, or equivalently to achieve a quantitative measure of corrosion, multiple NDI images are fused to produce a thickness map that reflects the amount of corrosion damage. These results are further compared with corresponding digital x-ray thickness maps, which are obtained by mapping the remaining thickness after the specimen is disassembled and all the corrosion products are cleaned. Experimental results demonstrate that the proposed algorithms outperform the traditional calibration method based on a single testing technique
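
    As a rough illustration of the fusion step (not the authors' implementation; the pyGAM package and the synthetic arrays below are assumptions), a generalized additive model can map two co-registered NDI images to a thickness value per pixel:

```python
import numpy as np
from pygam import LinearGAM, s  # assumes the pyGAM package is installed

# Synthetic stand-ins for two co-registered NDI images and an x-ray
# reference thickness map, flattened to one row per pixel.
rng = np.random.default_rng(7)
eddy_current = rng.normal(size=5000)     # multi-frequency EC response
pulsed_ec = rng.normal(size=5000)        # pulsed EC response
thickness = (1.0 - 0.1 * eddy_current - 0.05 * pulsed_ec
             + rng.normal(scale=0.02, size=5000))

X = np.column_stack([eddy_current, pulsed_ec])
gam = LinearGAM(s(0) + s(1)).fit(X, thickness)  # one smooth term per image
thickness_map = gam.predict(X)                  # fused thickness estimate
print(f"corr(fused, x-ray reference): "
      f"{np.corrcoef(thickness_map, thickness)[0, 1]:.3f}")
```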

  1. Utilizing NASA Earth Observations to Model Volcanic Hazard Risk Levels in Areas Surrounding the Copahue Volcano in the Andes Mountains

    Keith, A. M.; Weigel, A. M.; Rivas, J.

    2014-12-01

    Copahue is a stratovolcano located along the rim of the Caviahue Caldera near the Chile-Argentina border in the Andes Mountain Range. There are several small towns located in proximity of the volcano with the two largest being Banos Copahue and Caviahue. During its eruptive history, it has produced numerous lava flows, pyroclastic flows, ash deposits, and lahars. This isolated region has steep topography and little vegetation, rendering it poorly monitored. The need to model volcanic hazard risk has been reinforced by recent volcanic activity that intermittently released several ash plumes from December 2012 through May 2013. Exposure to volcanic ash is currently the main threat for the surrounding populations as the volcano becomes more active. The goal of this project was to study Copahue and determine areas that have the highest potential of being affected in the event of an eruption. Remote sensing techniques were used to examine and identify volcanic activity and areas vulnerable to experiencing volcanic hazards including volcanic ash, SO2 gas, lava flow, pyroclastic density currents and lahars. Landsat 7 Enhanced Thematic Mapper Plus (ETM+), Landsat 8 Operational Land Imager (OLI), EO-1 Advanced Land Imager (ALI), Terra Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Shuttle Radar Topography Mission (SRTM), ISS ISERV Pathfinder, and Aura Ozone Monitoring Instrument (OMI) products were used to analyze volcanic hazards. These datasets were used to create a historic lava flow map of the Copahue volcano by identifying historic lava flows, tephra, and lahars both visually and spectrally. Additionally, a volcanic risk and hazard map for the surrounding area was created by modeling the possible extent of ash fallout, lahars, lava flow, and pyroclastic density currents (PDC) for future eruptions. These model results were then used to identify areas that should be prioritized for disaster relief and evacuation orders.

  2. Flood Hazard Mapping Combining Hydrodynamic Modeling and Multi Annual Remote Sensing data

    Laura Giustarini; Marco Chini; Renaud Hostache; Florian Pappenberger; Patrick Matgen

    2015-01-01

    This paper explores a method to combine the time and space continuity of a large-scale inundation model with discontinuous satellite microwave observations, for high-resolution flood hazard mapping. The assumption behind this approach is that hydraulic variables computed from continuous spatially-distributed hydrodynamic modeling and observed as discrete satellite-derived flood extents are correlated in time, so that probabilities can be transferred from the model series to the observations...

  3. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazards-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards, along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazards warning systems.

  4. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

    Nielsen, Jan; Parner, Erik

    2010-01-01

    In this paper, we model multivariate time-to-event data by a composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied to two types of family studies using the gamma...
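
    The pairwise composite likelihood idea can be written compactly (generic form, not necessarily the paper's exact notation):

```latex
\ell_c(\theta) \;=\; \sum_{i<j} \log L_{ij}(\theta),
```

    where each L_ij is the likelihood of the pair (i, j), here under a gamma frailty model with marginal hazards modelled by natural cubic splines.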

  5. Evaluation and hydrological modelization in the natural hazard prevention

    Soil degradation negatively affects the soil's functions as a basis for producing food, regulating the hydrological cycle and maintaining environmental quality. All over the world soil degradation is increasing, partly owing to lacks or deficiencies in the evaluation of the processes and causes of this degradation in each specific situation. The processes of soil physical degradation are manifested through several problems such as compaction, runoff, water and wind erosion, and landslides, with collateral effects in situ and at a distance, often with disastrous consequences such as floods, landslides, sedimentation and droughts. These processes are frequently associated with unfavorable changes in the hydrologic processes responsible for the water balance and soil water regimes, mainly derived from changes in soil use, different management practices and climatic change. The evaluation of these processes using simple simulation models, under several scenarios of climatic change, soil properties, and land use and management, would allow prediction of the occurrence of these disastrous processes and, consequently, selection and application of the appropriate soil conservation practices to eliminate or reduce their effects. Such simulation models require, as a base, detailed climatic information and data on hydrologic soil properties. Despite the existence of methodologies and commercial equipment (ever more sophisticated and precise) to measure the different physical and hydrological soil properties related to degradation processes, most of them are only applicable under very specific or laboratory conditions. Often indirect methodologies are used, based on relations or empirical indexes without adequate validation, which frequently leads to expensive mistakes in the evaluation of soil degradation processes and their effects on natural disasters. Simple field methodologies may be preferred, direct and adaptable to different soil types and climates and to the sample size and the spatial variability of the

  6. Global river flood hazard maps: hydraulic modelling methods and appropriate uses

    Townend, Samuel; Smith, Helen; Molloy, James

    2014-05-01

    Flood hazard is not well understood or documented in many parts of the world. Consequently, the (re-)insurance sector now needs to better understand where the potential for considerable river flooding aligns with significant exposure. For example, international manufacturing companies are often attracted to countries with emerging economies, meaning that events such as the 2011 Thailand floods have resulted in many multinational businesses with assets in these regions incurring large, unexpected losses. This contribution addresses and critically evaluates the hydraulic methods employed to develop a consistent, global-scale set of river flood hazard maps, used to fill the knowledge gap outlined above. The basis of the modelling approach is an innovative, bespoke 1D/2D hydraulic model (RFlow) which has been used to model a global river network of over 5.3 million kilometres. Estimated flood peaks at each of these model nodes are determined using an empirically based rainfall-runoff approach linking design rainfall to design river flood magnitudes. The hydraulic model is used to determine the extents and depths of floodplain inundation following river bank overflow. From this, deterministic flood hazard maps are calculated for several design return periods between 20 years and 1,500 years. Firstly, we discuss the rationale behind the hydraulic modelling methods and inputs chosen to produce a consistent, globally scaled river flood hazard map. This highlights how a model designed to work with global datasets can be more favourable for hydraulic modelling at the global scale, and why innovative techniques customised for broad-scale use are preferable to modifying existing hydraulic models. Similarly, the advantages and disadvantages of both 1D and 2D modelling are explored and balanced against the time, computing and human resources available, particularly when using a Digital Surface Model at 30 m resolution. Finally, we will suggest some

  7. Flood Hazard Mapping Combining Hydrodynamic Modeling and Multi Annual Remote Sensing data

    Laura Giustarini

    2015-10-01

    Full Text Available This paper explores a method to combine the time and space continuity of a large-scale inundation model with discontinuous satellite microwave observations, for high-resolution flood hazard mapping. The assumption behind this approach is that hydraulic variables computed from continuous spatially-distributed hydrodynamic modeling and observed as discrete satellite-derived flood extents are correlated in time, so that probabilities can be transferred from the model series to the observations. A prerequisite is, therefore, the existence of a significant correlation between a modeled variable (i.e., flood extent or volume) and the synchronously-observed flood extent. If this is the case, the availability of model simulations over a long time period allows for a robust estimate of non-exceedance probabilities that can be attributed to corresponding synchronously-available satellite observations. The generated flood hazard map has a spatial resolution equal to that of the satellite images, which is higher than that of currently available large-scale inundation models. The method was applied to the Severn River (UK), using the outputs of a global inundation model provided by the European Centre for Medium-range Weather Forecasts and a large collection of ENVISAT ASAR imagery. A comparison between the hazard map obtained with the proposed method and with a more traditional numerical modeling approach supports the hypothesis that combining model results and satellite observations could provide advantages for high-resolution flood hazard mapping, provided that a sufficient number of remote sensing images is available and that a time correlation is present between variables derived from a global model and obtained from satellite observations.
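
    A schematic of the probability-transfer step is sketched below; the variable names, synthetic data and Weibull plotting-position formula are assumptions for illustration, not the authors' code.

```python
import numpy as np

# Daily flood extents from a long model run (synthetic stand-in).
rng = np.random.default_rng(3)
model_extent = rng.gamma(shape=2.0, scale=50.0, size=10_000)  # km^2

# Model extents simulated on the satellite overpass dates.
overpass_extent = rng.choice(model_extent, size=20, replace=False)

def non_exceedance(series, value):
    """Empirical non-exceedance probability (Weibull plotting position)."""
    return np.sum(series <= value) / (len(series) + 1)

# Each SAR-derived flood map inherits the probability of its date; a
# per-pixel hazard map then takes, for every pixel, the highest probability
# among the maps in which that pixel is observed flooded (not shown here).
probs = np.array([non_exceedance(model_extent, v) for v in overpass_extent])
print(np.round(np.sort(probs), 3))
```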

  8. Rainfall Hazards Prevention based on a Local Model Forecasting System

    Buendia, F.; Ojeda, B.; Buendia Moya, G.; Tarquis, A. M.; Andina, D.

    2009-04-01

    Rainfall is one of the most important events of human life and society. Some rainfall phenomena, like floods or hail, are a threat to agriculture, business and even life. Although meteorological observatories have methods to detect and give alarm about such events, prediction techniques based on synoptic measurements still need to be improved to achieve feasible medium-term forecasts. Any deviation in the measurements or in the model description makes the forecast diverge in time from the real evolution of the atmosphere. In this paper, the advances in a local rainfall forecasting system based on time series estimation with General Regression Neural Networks are presented. The system is introduced, explaining the measurements, the methodology and the current state of development. The aim of the work is to provide complementary criteria to the current forecast systems, based on daily atmosphere observation and tracking over a certain place.
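
    A General Regression Neural Network is essentially Nadaraya-Watson kernel regression over stored training patterns; a compact sketch is shown below, where the synthetic series, lag embedding and bandwidth are assumptions, not the paper's configuration.

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """General Regression Neural Network: Gaussian-kernel weighted mean."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma**2))
    return np.dot(w, y_train) / (np.sum(w) + 1e-12)

# Toy rainfall series; predict the next value from the previous 3 values.
rng = np.random.default_rng(5)
series = np.abs(np.sin(np.arange(300) / 10.0)) * 10 + rng.normal(0, 0.5, 300)

lags = 3
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]

x_new = series[-lags:]                 # most recent lag window
print(f"next-step forecast: {grnn_predict(X, y, x_new):.2f}")
```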

  9. Building a risk-targeted regional seismic hazard model for South-East Asia

    Woessner, J.; Nyst, M.; Seyhan, E.

    2015-12-01

    The last decade has tragically shown the social and economic vulnerability of countries in South-East Asia to earthquake hazard and risk. While many disaster mitigation programs and initiatives to improve societal earthquake resilience are under way, with a focus on saving lives and livelihoods, the risk management sector is challenged to develop appropriate models to cope with the economic consequences and the impact on the insurance business. We present the source model and ground-motion model components suitable for a South-East Asia earthquake risk model covering Indonesia, Malaysia, the Philippines and the Indochinese countries. The source model builds upon refined modelling approaches to characterize (1) seismic activity from geologic and geodetic data on crustal faults, (2) seismicity along the interface of subduction zones and within the slabs, and (3) earthquakes not occurring on mapped fault structures. We elaborate on building a self-consistent rate model for the hazardous crustal fault systems (e.g., the Sumatra fault zone, the Philippine fault zone) as well as the subduction zones, and showcase some characteristics and sensitivities, arising from existing uncertainties in the rate and hazard space, using a well-selected suite of ground motion prediction equations. Finally, we analyze the source model by quantifying the contribution by source type (e.g., subduction zone, crustal fault) to typical risk metrics (e.g., return-period losses, average annual loss) and reviewing their relative impact on various lines of business.

  10. Mortality and socio-economic differences in Denmark: a competing risks proportional hazard model.

    Munch, Jakob Roland; Svarer, Michael

    2005-03-01

    This paper explores how mortality is related to such socio-economic factors as education, occupation, skill level and income for the years 1992-1997 using an extensive sample of the Danish population. We employ a competing risks proportional hazard model to allow for different causes of death. This method is important as some factors have unequal (and sometimes opposite) influence on the cause-specific mortality rates. We find that the often-found inverse correlation between socio-economic status and mortality is to a large degree absent among Danish women who die of cancer. In addition, for men the negative correlation between socio-economic status and mortality prevails for some diseases, but for women we find that factors such as being married, income, wealth and education are not significantly associated with higher life expectancy. Marriage increases the likelihood of dying from cancer for women, early retirement prolongs survival for men, and homeownership increases life expectancy in general. PMID:15722260

  11. Large Historical Earthquakes and Tsunami Hazards in the Western Mediterranean: Source Characteristics and Modelling

    Harbi, Assia; Meghraoui, Mustapha; Belabbes, Samir; Maouche, Said

    2010-05-01

    The western Mediterranean region was the site of numerous large earthquakes in the past. Most of these earthquakes are located at the East-West trending Africa-Eurasia plate boundary and along the coastline of North Africa. The most recent recorded tsunamigenic earthquake occurred in 2003 at Zemmouri-Boumerdes (Mw 6.8) and generated a ~2-m-high tsunami wave. The destructive wave affected the Balearic Islands and Almeria in southern Spain and Carloforte in southern Sardinia (Italy). The earthquake provided a unique opportunity to gather instrumental records of seismic waves and tide gauges in the western Mediterranean. A database that includes a historical catalogue of main events, seismic sources and related fault parameters was prepared in order to assess the tsunami hazard of this region. In addition to the analysis of the 2003 records, we study the 1790 Oran and 1856 Jijel historical tsunamigenic earthquakes (Io = IX and X, respectively), which provide detailed observations of the heights and extent of past tsunamis and of damage in coastal zones. We performed modelling of wave propagation using the NAMI-DANCE code and tested different fault sources against synthetic tide gauges. We observe that the characteristics of the seismic sources control the size and directivity of tsunami wave propagation on both the northern and southern coasts of the western Mediterranean.

  12. A nonparametric dynamic additive regression model for longitudinal data

    Martinussen, Torben; Scheike, Thomas H.

    2000-01-01

    dynamic linear models, estimating equations, least squares, longitudinal data, nonparametric methods, partly conditional mean models, time-varying-coefficient models

  13. Mass movement hazard assessment model in the slope profile

    Colangelo, A. C.

    2003-04-01

    The central aim of this work is to assess the spatial behaviour of critical depths for slope stability, and of their correlated variables at the soil-regolith transition, along slope profiles over granite, migmatite and mica-schist parent materials in a humid tropical environment. To this end, we made shear strength measurements on residual soils and regolith materials with a "Cohron Sheargraph" apparatus and evaluated the shear stress behaviour at the soil-regolith boundary along slope profiles in each referred lithology. In the limit equilibrium approach applied here, we adapt the infinite slope model to the whole slope profile by means of a finite slice solution, as in the Fellenius or Bishop methods. In our case, we assume that the potential rupture surface occurs at the soil-regolith or soil-rock boundary in the slope material. For each slice, the factor of safety was calculated considering the shear strength (cohesion and friction) of the material, the soil-regolith boundary depth, the soil moisture content, the slope gradient, the gradient of the top of the subsurface flow, and the apparent soil bulk density. The correlations showed the relative weight of the cohesion, internal friction angle, apparent bulk density and slope gradient variables with respect to the behaviour of the critical depth, for different simulated soil moisture contents, at the slope profile scale. Some important results concern the central role of the soil bulk-density variable along the slope profile, during soil evolution and at the present day, because of the intense clay production, mainly kaolinite and gibbsite, in the B and C horizons of the humid tropical environment. An increase in soil clay content produces a fall in the friction angle and bulk density of the material, especially when some montmorillonite or illite clay is present. We have also observed, at threshold conditions, that a slight change in the soil bulk-density value may drastically disturb the equilibrium of
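
    For reference, the per-slice factor of safety in an infinite-slope limit equilibrium analysis is commonly written as follows (standard textbook form; the paper's exact variant is not given in the abstract):

```latex
FS \;=\; \frac{c' + \left(\gamma z \cos^2\beta - u\right)\tan\varphi'}
              {\gamma z \sin\beta \cos\beta},
```

    where c' and φ' are the effective cohesion and friction angle, γ the unit weight, z the depth of the potential rupture surface, β the slope angle, and u the pore pressure on the rupture surface.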

  14. Time-aggregation effects on the baseline of continuous-time and discrete-time hazard models

    ter Hofstede, F.; Wedel, M.

    1999-01-01

    In this study we reinvestigate the effect of time-aggregation for discrete- and continuous-time hazard models. We reanalyze the results of a previous Monte Carlo study by ter Hofstede and Wedel (1998), in which the effects of time-aggregation on the parameter estimates of hazard models were investig
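
    The link between the two model classes that time-aggregation exploits can be stated as the standard relation (added here for context):

```latex
h_j \;=\; 1 - \exp\!\left(-\int_{t_{j-1}}^{t_j} \lambda(u)\,du\right),
```

    where λ(u) is the continuous-time hazard and h_j the discrete-time hazard of the aggregated interval (t_{j-1}, t_j].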

  15. Advances in National Capabilities for Consequence Assessment Modeling of Airborne Hazards

    Nasstrom, J; Sugiyama, G; Foster, K; Larsen, S; Kosovic, B; Eme, B; Walker, H; Goldstein, P; Lundquist, J; Pobanz, B; Fulton, J

    2007-11-26

    This paper describes ongoing advancement of airborne hazard modeling capabilities in support of multiple agencies through the National Atmospheric Release Advisory Center (NARAC) and the Interagency Atmospheric Modeling and Atmospheric Assessment Center (IMAAC). A suite of software tools developed by Lawrence Livermore National Laboratory (LLNL) and collaborating organizations includes simple stand-alone, local-scale plume modeling tools for end users' computers, Web- and Internet-based software to access advanced 3-D flow and atmospheric dispersion modeling tools and expert analysis from the national center at LLNL, and state-of-the-science high-resolution urban models and event reconstruction capabilities.

  16. Generalized Additive Modelling of Mixed Distribution Markov Models with Application to Melbourne's Rainfall.

    Hyndman, R. J.; Grunwald, G. K.

    1999-01-01

    We consider modelling time series using a generalized additive model with first-order Markov structure and mixed transition density having a discrete component at zero and a continuous component with positive sample space. Such models have application, for example, in modelling daily occurrence and intensity of rainfall, and in modelling the number and size of insurance claims. We show how these methods extend the usual sinusoidal seasonal assumption in standard chain-dependent models by as...
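
    A mixed discrete-continuous transition density can be sketched as a two-part model: a logistic regression for rain occurrence and a Gamma GLM for positive intensity, both conditioned on the previous day. This is an illustrative stand-in for the paper's GAM machinery (seasonal smooth terms are omitted and the data are synthetic).

```python
import numpy as np
import statsmodels.api as sm

# Synthetic daily rainfall with a discrete component at zero (stand-in data).
rng = np.random.default_rng(11)
rain = np.maximum(rng.normal(0, 1, 1000), 0) * rng.gamma(2.0, 2.0, 1000)

prev, today = rain[:-1], rain[1:]
X = sm.add_constant(
    np.column_stack([prev > 0, np.log1p(prev)]).astype(float))

# Part 1: wet/dry occurrence today given yesterday (discrete component).
occ = sm.GLM((today > 0).astype(float), X,
             family=sm.families.Binomial()).fit()

# Part 2: intensity on wet days only (positive continuous component).
wet = today > 0
intensity = sm.GLM(today[wet], X[wet],
                   family=sm.families.Gamma(
                       link=sm.families.links.Log())).fit()

print(occ.params, intensity.params, sep="\n")
```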

  17. A Mathematical Model for the Industrial Hazardous Waste Location-Routing Problem

    Omid Boyer

    2013-01-01

    Full Text Available Technological progress is a cause of the worldwide increase in industrial hazardous wastes. Management of hazardous waste is a significant issue due to the risk imposed on the environment and human life; this risk can result from the location of undesirable facilities and also from the routing of hazardous waste. In this paper a biobjective mixed integer programming model for location-routing of industrial hazardous waste is developed. The first objective is total cost minimization, including transportation cost, operation cost, initial investment cost, and cost savings from selling recycled waste. The second objective is minimization of transportation risk; the risk of population exposure within a bandwidth along the route is used to measure transportation risk. This model can help decision makers to locate treatment, recycling, and disposal centers simultaneously, and also to route waste between these facilities considering risk and cost criteria. The results of the solved problem demonstrate the conflict between the two objectives: it is possible to decrease the cost value by marginally increasing the transportation risk value, and vice versa. A weighted sum method is utilized to combine the two objective functions into a single objective function. To solve the problem, GAMS software with the CPLEX solver is used. The model is applied to Markazi province in Iran.
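
    The weighted sum scalarization mentioned here takes the generic form below, with the weights chosen by the decision maker:

```latex
\min_{x \in \mathcal{X}} \; w_1 \, f_{\text{cost}}(x) \;+\; w_2 \, f_{\text{risk}}(x),
\qquad w_1 + w_2 = 1, \quad w_1, w_2 \ge 0,
```

    where varying (w_1, w_2) traces out different trade-off solutions on the cost-risk Pareto front.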

  18. Analysis of the General Hazards and Health Hazards Suffered by the Locals of Kodungaiyur Using Specially Linked Merged Fuzzy Cognitive Maps (SLMFCMs) Model

    K. THULUKKANAM; R.VASUKI

    2015-01-01

    In this paper we use the Specially Linked Merged Fuzzy Cognitive Maps (SLMFCMs) model to study the problems faced by the Kodungaiyur locals, such as poverty, ecological imbalance, soil and water pollution, problems faced by local rag pickers, literacy rate, health hazards suffered by locals, and so on.

  19. Percolation model with an additional source of disorder

    Kundu, Sumanta; Manna, S. S.

    2016-06-01

    The ranges of transmission of the mobiles in a mobile ad hoc network are not uniform in reality. They are affected by temperature fluctuations in the air, obstruction due to solid objects, even humidity differences in the environment, etc. How the varying range of transmission of the individual active elements affects the global connectivity in the network may be an important practical question to ask. Here a model of percolation phenomena, with an additional source of disorder, is introduced for a theoretical understanding of this problem. As in ordinary percolation, sites of a square lattice are occupied randomly with probability p. Each occupied site is then assigned a circular disk of random value R for its radius. A bond is defined to be occupied if and only if the radii R1 and R2 of the disks centered at its ends satisfy a certain predefined condition. In a very general formulation, one divides the R1-R2 plane into two regions by an arbitrary closed curve; a point within one region represents an occupied bond, otherwise the bond is vacant. The study of three different rules under this general formulation indicates that the percolation threshold always varies continuously. This threshold has two limiting values: one is pc(sq), the percolation threshold for ordinary site percolation on the square lattice, and the other is unity. The approach of the percolation threshold to its limiting values is characterized by two exponents. In a special case, all lattice sites are occupied by disks of random radii R ∈ {0, R0} and a percolation transition is observed with R0 as the control variable, similar to the site occupation probability.
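
    A minimal simulation of one such rule (bond occupied iff the two disks overlap, i.e. R1 + R2 ≥ lattice spacing) is sketched below; the rule choice, lattice size and parameters are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def percolates(L, p, r_max, rng):
    """Site-occupied square lattice; bond open iff disks at its ends overlap."""
    occ = rng.random((L, L)) < p
    radius = rng.random((L, L)) * r_max * occ   # disk radius per occupied site

    parent = np.arange(L * L)                   # union-find over lattice sites
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]       # path halving
            a = parent[a]
        return a
    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for i in range(L):
        for j in range(L):
            if not occ[i, j]:
                continue
            for di, dj in ((0, 1), (1, 0)):     # right and down neighbours
                ni, nj = i + di, j + dj
                if ni < L and nj < L and occ[ni, nj] \
                        and radius[i, j] + radius[ni, nj] >= 1.0:
                    union(i * L + j, ni * L + nj)

    top = {find(j) for j in range(L) if occ[0, j]}
    bottom = {find((L - 1) * L + j) for j in range(L) if occ[L - 1, j]}
    return bool(top & bottom)                   # spanning cluster top -> bottom

rng = np.random.default_rng(0)
hits = sum(percolates(64, 0.75, 1.0, rng) for _ in range(20))
print(f"spanning fraction at p = 0.75: {hits / 20:.2f}")
```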

  20. Proportional-hazards models for improving the analysis of light-water-reactor-component failure data

    The reliability of a power plant component may depend on a variety of factors (or covariates). If a single regression model can be specified to relate these factors to the failure rate, then all available data can be used to estimate and test for the effects of these covariates. One such model is a proportional hazards model that is specified as a product of two terms: a nominal hazard rate that is a function of time, and a second term that is a function of the covariates. The purpose of this paper is to adapt two such models to LWR valve failure rate analysis, to compare the results, and to discuss the strengths and weaknesses of these applications
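
    The product form described here is the familiar proportional hazards structure (generic statement, added for clarity):

```latex
\lambda(t \mid z) \;=\; \lambda_0(t)\,\exp(\beta^{\top} z),
```

    where λ0(t) is the nominal (baseline) hazard rate and z the vector of covariates for a component.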

  1. An Uncertain Wage Contract Model with Adverse Selection and Moral Hazard

    Xiulan Wang

    2014-01-01

    it can be characterized as an uncertain variable. Moreover, the employee's effort is unobservable to the employer, and the employee can select her effort level to maximize her utility. Thus, an uncertain wage contract model with adverse selection and moral hazard is established to maximize the employer's expected profit. The model analysis mainly focuses on the equivalent form of the proposed wage contract model and the optimal solution to this form. The optimal solution indicates that both the employee's effort level and the wage increase with the employee's ability. Lastly, a numerical example is given to illustrate the effectiveness of the proposed model.

  2. Data Model for Multi Hazard Risk Assessment Spatial Support Decision System

    Andrejchenko, Vera; Bakker, Wim; van Westen, Cees

    2014-05-01

    The goal of the CHANGES Spatial Decision Support System is to support end users in making decisions related to risk reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The implementation of the data model is carried out entirely with an open source database management system with a spatial extension. The web application is implemented using open source geospatial technologies, with PostGIS as the database, Python for scripting, and GeoServer and JavaScript libraries for visualization and the client-side user interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy and Poland are considered). Furthermore, the data model handles information about administrative units, projects accessible by different types of users, user-defined hazard types (floods, snow avalanches, debris flows, etc.), hazard intensity maps of different return periods, spatial probability maps, elements-at-risk maps (buildings, land parcels, linear features, etc.), and economic and population vulnerability information dependent on the hazard type and the type of the element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own. Included in the model is the management of a combination of different scenarios (e.g., related to climate change, land use change or population change) and alternatives (possible risk-reduction measures), as well as data structures for saving the calculated economic or population loss or exposure per element at risk, aggregation of the loss and exposure using the administrative unit maps, and finally, producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and multi-criteria evaluation (SMCE). The

  3. The unconvincing product - Consumer versus expert hazard identification: A mental models study of novel foods

    Hagemann, Kit; Scholderer, Joachim

    Novel foods have been the object of intense public debate in recent years. Despite efforts to communicate the outcomes of risk assessments to consumers, public confidence in the management of the potential risks associated with them has been low. Various reasons behind this have been identified, chiefly a disagreement between technical experts and consumers, e.g. over the nature of the hazards on which risk assessments should focus, and perceptions of insufficient openness about uncertainties in risk assessment. The consumer part of the EU project NOFORISK investigates the disagreement by comparing laypeople's and experts' understanding of the benefits and risks associated with three novel foods (a potato, rice and functional food ingredients), using a relatively new methodology for the study of risk perception called mental models. Mental models focus on the way people conceptualise hazardous processes and allows...

  4. Flood and erosion hazard maps based on 2D hydraulic model

    Lovšin, Gregor

    2014-01-01

    In 2007, the European Union adopted the Floods Directive with the aim of better regulation in case of floods. The European Union member states are required to create flood hazard maps and flood risk maps. To achieve these objectives, the Rules on Methodology to Define Flood Risk Areas and Erosion Areas Connected to Floods and Classification of Plots into Risk Classes were adopted in Slovenia. Two-dimensional hydraulic models, which are increasingly in use, represent an irreplaceable tool in pre...

  5. The timing of disability insurance application: a choice-based semiparametric hazard model

    Richard V. Burkhauser; Butler, J. S.; Yang-Woo Kim

    1996-01-01

    We use a choice-based subsample of Social Security Disability Insurance applicants from the 1978 Social Security Survey of Disability and Work to test the importance of policy variables on the timing of application for disability insurance benefits following the onset of a work-limiting health condition. We correct for choice-based sampling by extending the Manski-Lerman (1977) correction to the likelihood function of our continuous-time hazard model defined with semiparametric unmeasured het...
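
    The Manski-Lerman correction reweights the log-likelihood by the ratio of population to sample shares of each sampled outcome; the generic weighted exogenous sampling form, stated here for context rather than as the paper's exact specification, is:

```latex
\ell_w(\theta) \;=\; \sum_{i} \frac{Q(d_i)}{H(d_i)} \,\log f(t_i, d_i \mid x_i; \theta),
```

    where Q(d) is the population share and H(d) the sample share of outcome d.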

  6. Objective assessment of source models for seismic hazard studies : with a worked example from UK data

    R. M. W. Musson; Winter, P.W.

    2012-01-01

    Up to now, the search for increased reliability in probabilistic seismic hazard analysis (PSHA) has concentrated on ways of assessing expert opinion and subjective judgement. Although in some areas of PSHA subjective opinion is unavoidable, there is a danger that assessment procedures and review methods contribute further subjective judgements on top of those already elicited. It is helpful to find techniques for objectively assessing seismic source models that show what the interpretations p...

  7. Combining computational models for landslide hazard assessment of Guantánamo province, Cuba

    Castellanos Abella, E.A.

    2009-01-01

    As part of the Cuban system for landslide disaster management, a methodology was developed for regional-scale landslide hazard assessment, which is a combination of different models. The method was applied in Guantánamo province at 1:100,000 scale. The analysis started with an extensive aerial photo-interpretation to produce a landslide inventory map. Five main types of landslide movements were identified: slides (186), rockfalls (22), debris flows (26), topples (18), and large rockslides (29)....

  8. Flood Hazard Mapping by Using Geographic Information System and Hydraulic Model: Mert River, Samsun, Turkey

    Vahdettin Demir; Ozgur Kisi

    2016-01-01

    In this study, flood hazard maps were prepared for the Mert River Basin, Samsun, Turkey, by using GIS and the Hydrologic Engineering Center's River Analysis System (HEC-RAS). In this river basin, losses of human life and a significant amount of property damage were experienced in the 2012 flood. The preparation of the flood risk maps employed in the study includes the following steps: (1) digitization of topographical data and preparation of a digital elevation model using ArcGIS, (2) simulation of flood flows...

  9. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    Álvarez-Gómez, J. A.; Í. Aniel-Quiroga; O. Q. Gutiérrez-Gutiérrez; J. Larreynaga; González, M.; M. Castro; F. Gavidia; Aguirre-Ayerbe, I.; P. González-Riancho; Carreño, E

    2013-01-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has a length of approximately 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there have been 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and hundreds of victims. The hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached from both Probabilistic and D...

  10. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models.

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; O. Q. Gutiérrez-Gutiérrez; J. Larreynaga; González, M.; M. Castro; F. Gavidia; Aguirre-Ayerbe, I.; P. González-Riancho; Carreño, E

    2013-01-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has an approximate length of 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and resulting in hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and determinis...

  11. A new approach for deriving Flood hazard maps from SAR data and global hydrodynamic models

    Matgen, P.; Hostache, R.; Chini, M.; Giustarini, L.; Pappenberger, F.; Bally, P.

    2014-12-01

    With the flood consequences likely to amplify because of the growing population and the ongoing accumulation of assets in flood-prone areas, global flood hazard and risk maps are needed to improve flood preparedness at large scale. At the same time, with the rapidly growing archives of SAR images of floods, there is high potential in making use of these images for global and regional flood management. In this framework, an original method that integrates global flood inundation modeling and microwave remote sensing is presented. It takes advantage of the combination of the time and space continuity of a global inundation model with the high spatial resolution of satellite observations. The availability of model simulations over a long time period offers opportunities for estimating flood non-exceedance probabilities in a robust way. These probabilities can be attributed to historical satellite observations. Time series of SAR-derived flood extent maps and associated non-exceedance probabilities can then be combined to generate flood hazard maps with a spatial resolution equal to that of the satellite images, which is most of the time higher than that of a global inundation model. In principle, this can be done for any area of interest in the world, provided that a sufficient number of relevant remote sensing images are available. As a test case we applied the method to the Severn River (UK) and the Zambezi River (Mozambique), where large archives of Envisat flood images can be exploited. The global ECMWF flood inundation model is considered for computing the statistics of extreme events. A comparison with flood hazard maps estimated with in situ measured discharge is carried out. The first results confirm the potential of the method. However, further developments on two aspects are required to improve the quality of the hazard map and to ensure the acceptability of the product by potential end-user organizations. On the one hand, it is of paramount importance to

  12. Linear non-threshold (LNT) radiation hazards model and its evaluation

    In order to introduce the linear non-threshold (LNT) model used in studies of the dose effect of radiation hazards and to evaluate its application, a comprehensive analysis of the literature was made. The results show that the LNT model describes the biological effects more accurately at high doses than at low doses. The repairable-conditionally repairable model of cell radiation effects can account well for cell survival curves under all conditions of high, medium and low absorbed dose. There are still many uncertainties in the assessment model of the effective dose of internal radiation based on the LNT assumptions and individual mean organ equivalents, and it is necessary to establish gender-specific voxel human models, taking gender differences into account. In summary, the advantages and disadvantages of the various models coexist. Until new theory and new models are established, the LNT model still represents the most scientific attitude. (author)
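
    The LNT hypothesis itself is a one-line statement (standard form, added for reference):

```latex
R(D) \;=\; \alpha D, \qquad D \ge 0,
```

    i.e. the excess risk R is proportional to the absorbed dose D down to zero dose, with no threshold below which the risk vanishes.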

  13. A seismic source zone model for the seismic hazard assessment of the Italian territory

    Meletti, C.; Galadini, F.; Valensise, G.; Stucchi, M.; Basili, R.; Barba, S.; Vannucci, G.; Boschi, E.

    2008-01-01

    We designed a new seismic source model for Italy to be used as an input for country-wide probabilistic seismic hazard assessment (PSHA) in the frame of the compilation of a new national reference map. We started off by reviewing existing models available for Italy and for other European countries, then discussed the main open issues in the current practice of seismogenic zoning. The new model, termed ZS9, is largely based on data collected in the past 10 years, including historical eart...

  14. Avalanche Hazard Mapping with Satellite Data and a Digital Elevation Model

    Urs Gruber

    1995-04-01

    Full Text Available Today avalanche hazard mapping is a very time-consuming affair. To map large remote areas, a method based on satellite imagery and a digital elevation model has been developed. For this purpose, two test sites in the Swiss Alps were selected. To simulate the avalanche hazard, the existing Salm-Voellmy model was adapted to the computing environment and extended to the characteristics of avalanches in forested terrain. The forests were classified with Landsat-TM data. So far, only a single forest class was established. The separation of forest, shrub, and non-forested area along the timberline poses a problem. On the other hand, a classification of small openings and avalanche tracks within the forest could be achieved. A comparison with the existing avalanche cadastral map revealed that 85 per cent of the risk areas were correctly classified. However, the separation into the defined red and blue danger zones was not satisfactory. For the model's application to become operational, further improvements are needed. Nevertheless, the general approach is very promising, and should lead to more reliable hazard maps for planning purposes, as well as to new and better insights into the mutual effects between snow and forest.
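
    For orientation, a minimal sketch of the classical Voellmy friction relation underlying the Salm-Voellmy approach, in its textbook steady-flow form; the forested-terrain extension developed in the study is not reproduced here, and the parameter values are generic:

        import math

        def voellmy_velocity(h, theta_deg, mu=0.2, xi=1000.0):
            """Steady-state Voellmy flow velocity (m/s) for flow depth h (m)
            on a slope of angle theta_deg, with dry friction mu and turbulent
            friction xi (m/s^2). Returns 0 where friction exceeds the driving
            force, i.e. no steady flow is sustained."""
            theta = math.radians(theta_deg)
            drive = math.sin(theta) - mu * math.cos(theta)
            return math.sqrt(xi * h * drive) if drive > 0 else 0.0

        print(voellmy_velocity(h=1.5, theta_deg=30))  # about 22 m/s here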

  15. Concentration addition, independent action and generalized concentration addition models for mixture effect prediction of sex hormone synthesis in vitro

    Hadrup, Niels; Taxvig, Camilla; Pedersen, Mikael;

    2013-01-01

    of the concentration addition (CA), independent action (IA) and generalized concentration addition (GCA) models. First we measured effects of single chemicals and mixtures thereof on steroid synthesis in H295R cells. Then single chemical data were applied to the models; predictions of mixture effects were calculated...... and compared to the experimental mixture data. Mixture 1 contained environmental chemicals adjusted in ratio according to human exposure levels. Mixture 2 was a potency adjusted mixture containing five pesticides. Prediction of testosterone effects coincided with the experimental Mixture 1 data. In contrast...
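
    For orientation, the two classical prediction rules compared in this study can be stated compactly (GCA generalizes CA to partial agonists; the notation is generic):

        1 / EC_x^{mix} = \sum_i p_i / EC_x^{(i)}            (concentration addition, CA)
        E(c_{mix}) = 1 - \prod_i [ 1 - E(c_i) ]             (independent action, IA)

    where p_i is the fraction of component i in the mixture, EC_x^{(i)} is the concentration of component i alone producing effect level x, and E(c_i) is the effect of component i at its concentration in the mixture.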

  16. Seismic rupture modelling, strong motion prediction and seismic hazard assessment: fundamental and applied approaches

    The defence to obtain the 'Habilitation a Diriger des Recherches' is a synthesis of the research work performed since the end of my PhD thesis in 1997. This synthesis covers the two years spent as a post-doctoral researcher at the Bureau d'Evaluation des Risques Sismiques at the Institut de Protection (BERSSIN), and the seven consecutive years as seismologist and head of the BERSSIN team. This work and the research project are presented in the framework of the seismic risk topic, and particularly with respect to seismic hazard assessment. Seismic risk combines seismic hazard and vulnerability. Vulnerability combines the strength of building structures and the human and economic consequences in case of structural failure. Seismic hazard is usually defined in terms of plausible seismic motion (soil acceleration or velocity) at a site for a given time period. Whether for the regulatory context or for the specificity of the structure (conventional structure or high-risk facility), seismic hazard assessment needs to: identify and locate the seismic sources (zones or faults); characterize their activity; and evaluate the seismic motion the structure has to resist (including site effects). I specialized in numerical strong-motion prediction using high-frequency seismic source modelling, and joining IRSN allowed me to work quickly across the different tasks of seismic hazard assessment. Thanks to this expertise and to participation in the evolution of regulations (nuclear power plants, conventional and chemical facilities), I have also been able to work on empirical strong-motion prediction, including site effects. Specific questions related to the interface between seismologists and structural engineers are also presented, especially the quantification of uncertainties. This is part of the research work initiated to improve the selection of input ground motions for designing or verifying the stability of structures. (author)

  17. An Additive-Multiplicative Restricted Mean Residual Life Model

    Mansourvar, Zahra; Martinussen, Torben; Scheike, Thomas H.

    2016-01-01

    The mean residual life measures the expected remaining life of a subject who has survived up to a particular time. When survival time distribution is highly skewed or heavy tailed, the restricted mean residual life must be considered. In this paper, we propose an additive-multiplicative restricte...

  18. Large scale debris-flow hazard assessment: a geotechnical approach and GIS modelling

    G. Delmonaco

    2003-01-01

    Full Text Available A deterministic distributed model has been developed for large-scale debris-flow hazard analysis in the basin of the River Vezza (Tuscany Region, Italy). This area (51.6 km2) was affected by over 250 landslides, classified as debris/earth flows, mainly involving the metamorphic geological formations outcropping in the area and triggered by the pluviometric event of 19 June 1996. In recent decades, landslide hazard and risk analysis has been favoured by the development of GIS techniques permitting the generalisation, synthesis and modelling of stability conditions at large-scale investigation (>1:10 000). In this work, the main results derived from the application of a geotechnical model coupled with a hydrological model for debris-flow hazard assessment are reported. The analysis was developed through the following steps: landslide inventory map derived from aerial photo interpretation and direct field survey; generation of a database and digital maps; elaboration of a DTM and derived themes (e.g. slope angle map); definition of a superficial soil thickness map; geotechnical soil characterisation through back-analysis on test slopes and laboratory tests; inference of the influence of precipitation, for distinct return periods, on ponding time and pore pressure generation; implementation of a slope stability model (infinite slope model); and generalisation of the safety factor for estimated rainfall events with different return periods. This approach has allowed the identification of potential source areas of debris-flow triggering for precipitation events with estimated return periods of 10, 50, 75 and 100 years. The model shows a dramatic decrease of safety conditions for the simulation related to the 75-year return period rainfall event, corresponding to an estimated cumulated daily intensity of 280–330 mm. This value can be considered the hydrological triggering
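
    A minimal sketch of the infinite slope model named among the steps above, in its common effective-stress form; the parameter values are illustrative and not taken from the Vezza study:

        import math

        def infinite_slope_fs(c, phi_deg, gamma, gamma_w, z, m, beta_deg):
            """Factor of safety for the infinite slope model.
            c: effective cohesion (kPa); phi_deg: friction angle (deg);
            gamma, gamma_w: unit weights of soil and water (kN/m3);
            z: soil thickness (m); m: saturated fraction of z (0..1);
            beta_deg: slope angle (deg)."""
            beta, phi = math.radians(beta_deg), math.radians(phi_deg)
            resisting = c + (gamma - m * gamma_w) * z * math.cos(beta) ** 2 * math.tan(phi)
            driving = gamma * z * math.sin(beta) * math.cos(beta)
            return resisting / driving

        # Fully saturated shallow soil on a 35-degree slope: FS < 1 (unstable)
        print(infinite_slope_fs(c=5.0, phi_deg=30, gamma=19.0, gamma_w=9.81,
                                z=1.2, m=1.0, beta_deg=35))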

  19. Perspectives on open access high resolution digital elevation models to produce global flood hazard layers

    Christopher Charles Sampson

    2016-01-01

    Full Text Available Global flood hazard models have recently become a reality thanks to the release of open access global digital elevation models, the development of simplified and highly efficient flow algorithms, and the steady increase in computational power. In this commentary we argue that although the availability of open access global terrain data has been critical in enabling the development of such models, the relatively poor resolution and precision of these data now significantly limit our ability to estimate flood inundation and risk for the majority of the planet’s surface. The difficulty of deriving an accurate ‘bare-earth’ terrain model due to the interaction of vegetation and urban structures with the satellite-based remote sensors means that global terrain data are often poorest in the areas where people and property (and thus vulnerability) are most concentrated. Furthermore, the current generation of open access global terrain models is over a decade old, and many large floodplains, particularly those in developing countries, have undergone significant change in this time. There is therefore a pressing need for a new generation of high resolution and high vertical precision open access global digital elevation models to allow significantly improved global flood hazard models to be developed.

  20. Geological complexity and performance assessment: Volcanic hazards modeling at Yucca Mountain

    Quantitative performance assessments (PA) are often used by regulatory agencies to evaluate the ability of a designed structure to accomplish its objectives. For various reasons, the physics of failure is sometimes 'abstracted' in these analyses. Because there is no general model for magmatic processes, this approach has some appeal for volcanic hazards assessment. Alternatively, the author has used a simple physical model loosely coupled to a consequence simulator to study the intrusion of a basaltic dike into a presumed high-level radioactive waste repository at Yucca Mountain, NV. The model uses a simplified dike-intrusion model and Monte Carlo-style parameter variation with distributions taken from analogous volcanic systems. Dike contamination follows (1) hydrologic transport of radionuclides and entrainment of lithic fragments of wall rock and (2) direct interaction of waste and magma. The physical model gave average dikes of 0.25 to 1.5 m width, mean = 0.5 m, and 750 to 3,000 m length, mean = 1,500 m. The volume of contaminated dike rock is 100 m3. Under current assumptions, releases due to magmatic activity should not exceed the regulatory limit set by EPA. This work shows that magma physics can constrain volcanic hazards in PA modeling. Stochastically varied input parameters, however, may violate the physical plausibility of the model. The parameter distributions should be categorically biased by the geological and tectonic context, detailed in the PA scenarios
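
    A minimal sketch of the Monte Carlo-style parameter variation described above, with triangular distributions shaped to the quoted width and length ranges; the distribution family and the assumed contact geometry are inventions for illustration, not the author's choices:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Dike width and length spanning the quoted ranges (0.25-1.5 m and
        # 750-3,000 m); the triangular shapes are an assumption made here.
        width = rng.triangular(0.25, 0.5, 1.5, size=n)    # m
        length = rng.triangular(750, 1500, 3000, size=n)  # m

        # Assumed vertical extent of waste/magma contact per intersection (m).
        contact_height = 0.1
        volume = width * length * contact_height          # m3 contaminated rock

        print(f"mean contaminated volume: {volume.mean():,.0f} m3")
        print(f"95th percentile:          {np.percentile(volume, 95):,.0f} m3")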

  1. Modelling dissimilarity: generalizing ultrametric and additive tree representations.

    Hubert, L; Arabie, P; Meulman, J

    2001-05-01

    Methods for the hierarchical clustering of an object set produce a sequence of nested partitions such that object classes within each successive partition are constructed from the union of object classes present at the previous level. Any such sequence of nested partitions can in turn be characterized by an ultrametric. An approach to generalizing an (ultrametric) representation is proposed in which the nested character of the partition sequence is relaxed and replaced by the weaker requirement that the classes within each partition contain objects consecutive with respect to a fixed ordering of the objects. A method for fitting such a structure to a given proximity matrix is discussed, along with several alternative strategies for graphical representation. Using this same ultrametric extension, additive tree representations can also be generalized by replacing the ultrametric component in the decomposition of an additive tree (into an ultrametric and a centroid metric). A common numerical illustration is developed and maintained throughout the paper. PMID:11393895
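
    For orientation, a small sketch of the defining three-point condition that these generalizations relax: a dissimilarity is ultrametric exactly when, in every triple of objects, the two largest distances are equal (illustrative code, not the fitting method of the paper):

        from itertools import combinations

        def is_ultrametric(d, objects):
            """Check d(i,k) <= max(d(i,j), d(j,k)) for all triples, i.e. that
            the two largest distances in every triple are tied."""
            for i, j, k in combinations(objects, 3):
                a, b, c = sorted([d[i][j], d[j][k], d[i][k]])
                if b != c:
                    return False
            return True

        # Distances induced by a hierarchy where {1,2} join at height 1 and
        # everything joins at height 3 -> ultrametric.
        d = {1: {2: 1, 3: 3}, 2: {1: 1, 3: 3}, 3: {1: 3, 2: 3}}
        print(is_ultrametric(d, [1, 2, 3]))  # True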

  2. Primary circuit iodine model addition to IMPAIR-3

    Osetek, D.J.; Louie, D.L.Y. [Los Alamos Technical Associates, Inc., Albuquerque, NM (United States); Guntay, S.; Cripps, R. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1996-12-01

    As part of a continuing effort to provide the U.S. Department of Energy (DOE) Advanced Reactor Severe Accident Program (ARSAP) with complete iodine analysis capability, a task was undertaken to expand the modeling of IMPAIR-3, an iodine chemistry code. The expanded code will enable the DOE to include detailed iodine behavior in the assessment of severe accident source terms used in the licensing of U.S. Advanced Light Water Reactors (ALWRs). IMPAIR-3 was developed at the Paul Scherrer Institute (PSI), Switzerland, and has been used by ARSAP for the past two years to analyze containment iodine chemistry for ALWR source term analyses. IMPAIR-3 is primarily a containment code, but the iodine chemistry inside the primary circuit (the Reactor Coolant System or RCS) may influence the iodine species released into the containment; therefore, a RCS iodine chemistry model must be implemented in IMPAIR-3 to ensure thorough source term analysis. The ARSAP source term team and the PSI IMPAIR-3 developers are working together to accomplish this task. This cooperation is divided into two phases. Phase I, taking place in 1996, involves developing a stand-alone RCS iodine chemistry program called IMPRCS (IMPAIR - Reactor Coolant System). This program models a number of the chemical and physical processes of iodine that are thought to be important at conditions of high temperature and pressure in the RCS. In Phase II, which is tentatively scheduled for 1997, IMPRCS will be implemented as a subroutine in IMPAIR-3. To ensure an efficient calculation, an interface/tracking system will be developed to control the use of the RCS model from the containment model. These two models will be interfaced in such a way that once the iodine is released from the RCS, it will no longer be tracked by the RCS model but will be tracked by the containment model. All RCS thermal-hydraulic parameters will be provided by other codes. (author) figs., tabs., refs.

  4. Energy consumption model of Binder-jetting additive manufacturing processes

    Xu, Xin; METEYER, Simon; PERRY, Nicolas; ZHAO, Yaoyao Fiona

    2014-01-01

    Considering the potential for new product design possibilities and the reduction of environmental impacts, Additive Manufacturing (AM) processes are considered to possess significant advantages for automotive, aerospace and medical equipment industries. One of the commercial AM techniques is Binder-Jetting (BJ). This technique can be used to process a variety of materials including stainless steel, ceramic, polymer and glass. However, there is very limited research about this AM technology on...

  5. Additional Research Needs to Support the GENII Biosphere Models

    Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen

    2013-11-30

    In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved, both to accommodate the locally significant pathways identified and to incorporate newer models. The areas are general data needs for the existing models and improved formulations for the pathway models. It is recommended that priorities be set by NRC staff to guide selection of the most useful improvements in a cost-effective manner. Suggestions are made based on relatively easy and inexpensive changes, and longer-term, more costly studies. In the short term, there are several improved model formulations that could be applied to the GENII suite of codes to make them more generally useful:
    • Implementation of the separation of the translocation and weathering processes
    • Implementation of an improved model for carbon-14 from non-atmospheric sources
    • Implementation of radon exposure pathway models
    • Development of a KML processor for the output report generator module, so that data calculated on a grid could be superimposed upon digital maps for easier presentation and display
    • Implementation of marine mammal models (manatees, seals, walrus, whales, etc.)
    Data needs in the longer term require extensive (and potentially expensive) research. Before picking any one radionuclide or food type, NRC staff should perform an in-house review of current and anticipated environmental analyses to select “dominant” radionuclides of interest to allow setting of cost-effective priorities for radionuclide- and pathway-specific research. These include:
    • soil-to-plant uptake studies for oranges and other citrus fruits, and
    • development of models for evaluation of radionuclide concentrations in highly-processed foods such as oils and sugars.
    Finally, renewed

  6. Additive gamma frailty models with applications to competing risks in related individuals

    Eriksson, Frank; Scheike, Thomas

    2015-01-01

    Epidemiological studies of related individuals are often complicated by the fact that follow-up on the event type of interest is incomplete due to the occurrence of other events. We suggest a class of frailty models with cause-specific hazards for correlated competing events in related individual...

  7. Detailed Flood Modeling and Hazard Assessment from Storm Tides, Rainfall and Sea Level Rise

    Orton, P. M.; Hall, T. M.; Georgas, N.; Conticello, F.; Cioffi, F.; Lall, U.; Vinogradov, S. V.; Blumberg, A. F.

    2014-12-01

    A flood hazard assessment has been conducted for the Hudson River from New York City to Troy at the head of tide, using a three-dimensional hydrodynamic model and merging hydrologic inputs and storm tides from tropical and extra-tropical cyclones, as well as spring freshet floods. Our recent work showed that neglecting freshwater flows leads to underestimation of peak water levels at up-river sites and neglecting stratification (typical with two-dimensional modeling) leads to underestimation all along the Hudson. The hazard assessment framework utilizes a representative climatology of over 1000 synthetic tropical cyclones (TCs) derived from a statistical-stochastic TC model, and historical extra-tropical cyclones and freshets from 1950-present. Hydrodynamic modeling is applied with seasonal variations in mean sea level and ocean and estuary stratification. The model is the Stevens ECOM model and is separately used for operational ocean forecasts on the NYHOPS domain (http://stevens.edu/NYHOPS). For the synthetic TCs, an Artificial Neural Network/ Bayesian multivariate approach is used for rainfall-driven freshwater inputs to the Hudson, translating the TC attributes (e.g. track, SST, wind speed) directly into tributary stream flows (see separate presentation by Cioffi for details). Rainfall intensity has been rising in recent decades in this region, and here we will also examine the sensitivity of Hudson flooding to future climate warming-driven increases in storm precipitation. The hazard assessment is being repeated for several values of sea level, as projected for future decades by the New York City Panel on Climate Change. Recent studies have given widely varying estimates of the present-day 100-year flood at New York City, from 2.0 m to 3.5 m, and special emphasis will be placed on quantifying our study's uncertainty.

  8. Development of hydrogeological modelling approaches for assessment of consequences of hazardous accidents at nuclear power plants

    This paper introduces modeling approaches for predicting the influence of hazardous accidents at nuclear reactors on groundwater quality. Possible pathways for radioactive releases from nuclear power plants were considered in order to conceptualize boundary conditions for solving the subsurface radionuclide transport problems. Approaches to incorporating physical and chemical interactions into transport simulators have been developed. The hydrogeological forecasts were based on numerical and semi-analytical scale-dependent models. They have been applied to assess the possible impact of the nuclear power plants designed in Russia on groundwater reservoirs.

  9. Mathematical Decision Models Applied for Qualifying and Planning Areas Considering Natural Hazards and Human Dealing

    Anton, Jose M.; Grau, Juan B.; Tarquis, Ana M.; Sanchez, Elena; Andina, Diego

    2014-05-01

    The authors have been involved in the use of Mathematical Decision Models (MDM) to improve knowledge and planning for large natural or administrative areas in which natural soils, climate, and agricultural and forest uses were the main factors, but where human resources and outcomes were also important and natural hazards were relevant. In one line of work they contributed to the qualification of lands of the Community of Madrid (CM), an administrative area in the centre of Spain containing a band of mountains to the north, part of the Iberian plateau and river terraces in the centre, and also the Madrid metropolis, starting from an official UPM study for the CM that qualified lands using a FAO model requiring minimums across a whole set of Soil Science criteria. From these criteria the authors first built a complementary additive qualification, and later tried an intermediate qualification combining both using fuzzy logic. The authors were also involved, together with colleagues from Argentina and elsewhere who are in contact with local planners, in the delimitation of regions and the selection of management entities for them. At these general levels they adopted multi-criteria MDM, using a weighted PROMETHEE and also an ELECTRE-I with the same elicited weights for the criteria and data, and in parallel AHP using Expert Choice, based on pairwise comparisons among criteria structured in two levels. The alternatives depend on the case study, and these areas with monsoon climates have natural hazards that are decisive for their selection and qualification, captured in an initial matrix used for ELECTRE and PROMETHEE. In the natural area of Arroyos Menores, south of the town of Rio Cuarto, with the subarea of La Colacha to the north, the loess lands are rich but now suffer from water erosion forming regressive gullies that are spoiling them, and land use alternatives must consider Soil Conservation and Hydraulic Management actions. The soils may be used in diverse, non-compatible ways, such as autochthonous forest, high-value forest, traditional
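
    A minimal sketch of the AHP weighting step mentioned above: criterion weights taken as the normalized principal eigenvector of a Saaty pairwise-comparison matrix (the comparison values below are invented for illustration):

        import numpy as np

        # Saaty pairwise comparisons for three criteria (illustrative):
        # A[i, j] = importance of criterion i relative to criterion j.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)              # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                             # normalized criterion weights
        print("weights:", np.round(w, 3))

        # Saaty consistency index; values near 0 indicate consistent judgements.
        n = A.shape[0]
        print("consistency index:", round((eigvals.real[k] - n) / (n - 1), 4))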

  10. Technical Work Plan for: Additional Multiscale Thermohydrologic Modeling

    The primary objective of Revision 04 of the MSTHM report is to provide TSPA with revised repository-wide MSTHM analyses that incorporate updated percolation flux distributions, revised hydrologic properties, updated IEDs, and information pertaining to the emplacement of transport, aging, and disposal (TAD) canisters. The updated design information is primarily related to the incorporation of TAD canisters, but also includes updates related to superseded IEDs describing emplacement drift cross-sectional geometry and layout. The intended use of the results of Revision 04 of the MSTHM report, as described in this TWP, is to predict the evolution of TH conditions (temperature, relative humidity, liquid-phase saturation, and liquid-phase flux) at specified locations within emplacement drifts and in the adjoining near-field host rock along all emplacement drifts throughout the repository. This information directly supports the TSPA for the nominal and seismic scenarios. The revised repository-wide analyses are required to incorporate updated parameters and design information and to extend those analyses out to 1,000,000 years. Note that the previous MSTHM analyses reported in Revision 03 of Multiscale Thermohydrologic Model (BSC 2005 [DIRS 173944]) only extend out to 20,000 years. The updated parameters are the percolation flux distributions, including incorporation of post-10,000-year distributions, and updated calibrated hydrologic property values for the host-rock units. The applied calibrated hydrologic properties will be an updated version of those available in Calibrated Properties Model (BSC 2004 [DIRS 169857]). These updated properties will be documented in an Appendix of Revision 03 of UZ Flow Models and Submodels (BSC 2004 [DIRS 169861]). The updated calibrated properties are applied because they represent the latest available information. The reasonableness of applying the updated calibrated properties to the prediction of near-field in-drift TH conditions

  11. Sensitivity of earthquake risk models to uncertainties in hazards, exposure and vulnerability parameters

    Sensitivity of earthquake risk models to uncertainties in hazard, exposure and vulnerability components has been investigated. The study is conducted for two test beds with distinctive socio-economic characteristics; Zeytinburnu and Los Angeles. Results are evaluated in terms of overall damage estimates as well as damage estimates of individual building typologies relative to one another. The distribution of damage estimates shows that the respective earthquake risk models are significantly affected by the vulnerability model. Further, the vulnerability model used in the earthquake risk model also influences the ranking of the building typologies according to their damage estimates. In comparison to that, the quality and the level of details of the building exposure database and the selected ground motion prediction equation seem to have less effect on the damage estimates. (author)

  12. Monitoring and forecast of hydro meteorological hazards basing on data of distant assay and mathematical modeling

    Sapunov, Valentin; Dikinis, Alexandr; Voronov, Nikolai

    2014-05-01

    The Russian Federation, with its vast territory, has a low density of land-based meteorological observation points. The monitoring network is not sufficient for effective forecasting of weather dynamics and extreme situations. With the increase of extreme situations and incidents such as hurricanes (a twofold increase since the beginning of the XXI century), a reconstruction and "perestroika" of the monitoring network is needed. The basis for such progress is remote monitoring using aircraft and satellites, complementing ground-based contact monitoring built on the existing points and stations. The interplay of contact and remote observations can make hydrometeorological data and predictions more precise and significant. Traditional physical methods should be complemented by new biological methods of modern study. According to existing research, animals are able to anticipate extreme hazards of natural and anthropogenic origin, presumably through an interaction between biological matter and a physical field that is still under preliminary study. For example, animals reportedly anticipated the fall of the Chelyabinsk meteorite in 2013. Combining biological indication with comprehensive meteorological data may increase the reliability of hazard prediction. Uniting all these data and approaches may become the basis of the proposed mathematical hydrometeorological weather models. Putting the reported combined methods into practice may decrease losses from hydrometeorological risks and hazards and increase the stability of the national economy.

  13. Landslide hazard zonation using AHP model and GIS technique in Khoram Abad City

    R. Hatamifar

    2012-01-01

    Full Text Available Extended abstract. 1. Introduction. The increasing growth of urban and rural development has caused natural anomalies affecting the Earth's inhabitants. Mass movements, especially landslides, are among the most damaging of these, and have gained momentum in recent decades together with human manipulation of natural systems, making them one of the principal geomorphic processes in mountainous areas. Under favourable conditions, landslides occur in many parts of the world, including Iran, causing the destruction of vegetation, orchards and farmland, and even human casualties. According to estimates, landslides impose financial damage of about 500 billion Rials on Iran annually. The burial of Abikar village in Charmahal-o-Bakhtiari Province in spring 1997 is one clear example of landslide-related human loss. Lorestan Province is among the areas of Iran most susceptible to landslide occurrence: about 274 landslides, covering 1400 km2 or 4.8% of its area, confirm this claim. Since the exact prediction of landslide occurrence is not possible, damage from this phenomenon can be prevented by identifying landslide-susceptible areas and prioritizing them. Landslide hazard zonation maps can help environmental designers and engineers to select suitable places for the implementation of development projects. The results of such studies can be used as fundamental information by environmental managers and planners. The purposes of this study are the recognition of the factors effective in landsliding and the zonation of Khoram Abad City with respect to the occurrence of this phenomenon using the AHP model and GIS techniques. Accordingly, the selection of criteria and standards, the preparation of factor raster layers, the determination of relative and final weights of the factors, the overlaying of layers and the preparation of a landslide hazard zonation map are the major objectives of

  14. Web-based Services for Earth Observing and Model Data in National Applications and Hazards

    Kafatos, M.; Boybeyi, Z.; Cervone, G.; di, L.; Sun, D.; Yang, C.; Yang, R.

    2005-12-01

    The ever-growing large volumes of Earth system science data, collected by Earth observing platforms, in situ stations and as model output data, are increasingly being used by discipline scientists and by wider classes of users. In particular, applications of Earth system science data to environmental and hazards applications, as well as other national applications, require tailored or specialized data, as well as web-based tools and infrastructure. The latter are driven by applications and usage drivers which include ease of access, visualization of complex data, ease of producing value-added data, GIS and open source analysis usage, metadata, etc. Here we present different aspects of such web-based services and access, and discuss several applications in the hazards and environmental areas, including earthquake signatures and observations and model runs of hurricanes. Examples and lessons learned from the Mid-Atlantic Geospatial Information Consortium will be presented. We discuss a NASA-funded, open source on-line data analysis system that is being applied to climate studies for the ESIP Federation. Since being enhanced, this project and the next-generation Metadata Integrated Data Analysis System allow users not only to identify data but also to generate new data products on-the-fly. The functionalities extend from limited predefined functions to sophisticated functions described by general-purpose GrADS (Grid Analysis and Display System) commands. The Federation system also allows third party data products to be combined with local data. Software components are available for converting the output from MIDAS (OPenDAP) into OGC compatible software. The on-going Grid efforts at CEOSR and LAITS in the School of Computational Sciences (SCS) include enhancing the functions of Globus to provide support for a geospatial system so the system can share the computing power to handle problems with different peak access times and improve the stability and flexibility of a rapid

  15. Comparison of additive (absolute) risk projection models and multiplicative (relative) risk projection models in estimating radiation-induced lifetime cancer risk

    Lifetime cancer risk estimates depend on risk projection models. While the increasing lengths of follow-up observation periods of atomic bomb survivors in Hiroshima and Nagasaki bring about changes in cancer risk estimates, the validity of the two risk projection models, the additive risk projection model (AR) and multiplicative risk projection model (MR), comes into question. This paper compares the lifetime risk or loss of life-expectancy between the two projection models on the basis of BEIR-III report or recently published RERF report. With Japanese cancer statistics the estimates of MR were greater than those of AR, but a reversal of these results was seen when the cancer hazard function for India was used. When we investigated the validity of the two projection models using epidemiological human data and animal data, the results suggested that MR was superior to AR with respect to temporal change, but there was little evidence to support its validity. (author)
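
    For orientation, the two projection models have the generic forms (schematic; the BEIR-III parameterizations are more detailed):

        \lambda(t, D) = \lambda_0(t) + f(D)            (additive / absolute risk projection, AR)
        \lambda(t, D) = \lambda_0(t) [ 1 + g(D) ]      (multiplicative / relative risk projection, MR)

    where \lambda_0(t) is the baseline cancer hazard at age t and D is the radiation dose. AR projects a dose-dependent excess that is independent of the baseline hazard, whereas MR scales the baseline itself, which is why the ranking of the two estimates can reverse between populations with different baseline hazards, as with the Japanese and Indian cancer statistics above.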

  16. Model estimation of the longevity for cars registered in Sweden using survival analysis and Cox proportional hazards model

    Söderberg, Daniel

    2014-01-01

    Time-to-event data is used in this thesis to analyze private cars’ longevity in Sweden. The dataset is provided by Trafikanalys and contains all registered, deregistered or temporarily deregistered cars in Sweden during the time period 2000 - 2012. A Cox proportional hazards model is fitted, including variables such as car manufacturer and car body. The results show that directly imported cars have a much shorter median survival compared to non-imported cars. The convertible cars have the longest me...
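
    A minimal sketch of this kind of analysis with the Python lifelines package, assuming a one-row-per-car data frame with follow-up time, a deregistration indicator and an import flag; the column names and toy values are invented:

        import pandas as pd
        from lifelines import CoxPHFitter

        # Toy data; the thesis uses the full Swedish registry for 2000-2012.
        df = pd.DataFrame({
            "years_observed": [14.2, 9.1, 17.5, 6.3, 12.8, 4.9, 10.1, 7.7],
            "deregistered":   [1, 1, 0, 1, 0, 1, 1, 0],   # 1 = event, 0 = censored
            "direct_import":  [0, 1, 0, 1, 0, 1, 0, 1],   # 1 = directly imported
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="years_observed", event_col="deregistered")
        cph.print_summary()  # hazard ratio > 1 for direct_import matches the finding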

  17. TRENT2D WG: a smart web infrastructure for debris-flow modelling and hazard assessment

    Zorzi, Nadia; Rosatti, Giorgio; Zugliani, Daniel; Rizzi, Alessandro; Piffer, Stefano

    2016-04-01

    Mountain regions are naturally exposed to geomorphic flows, which involve large amounts of sediments and induce significant morphological modifications. The physical complexity of this class of phenomena represents a challenging issue for modelling, leading to elaborate theoretical frameworks and sophisticated numerical techniques. In general, geomorphic-flow models have proved to be valid tools in hazard assessment and management. However, model complexity seems to represent one of the main obstacles to the diffusion of advanced modelling tools between practitioners and stakeholders, although the EU Flood Directive (2007/60/EC) requires risk management and assessment to be based on "best practices and best available technologies". Furthermore, several cutting-edge models are not particularly user-friendly, and multiple stand-alone software tools are needed to pre- and post-process modelling data. For all these reasons, users often resort to quicker and rougher approaches, possibly leading to unreliable results. Therefore, some effort seems necessary to overcome these drawbacks, with the purpose of supporting and encouraging a widespread diffusion of the most reliable, although sophisticated, modelling tools. With this aim, this work presents TRENT2D WG, a new smart modelling solution for the state-of-the-art model TRENT2D (Armanini et al., 2009, Rosatti and Begnudelli, 2013), which simulates debris flows and hyperconcentrated flows adopting a two-phase description over a mobile bed. TRENT2D WG is a web infrastructure combining the advantages offered by the SaaS (Software as a Service) delivery model and by WebGIS technology, and hosting a complete and user-friendly working environment for modelling. In order to develop TRENT2D WG, the model TRENT2D was converted into a service and exposed on a cloud server, transferring computational burdens from the user hardware to a high-performing server and reducing computational time. Then, the system was equipped with an

  18. A model of the holographic principle: Randomness and additional dimension

    In recent years an idea has emerged that a system in a 3-dimensional space can be described from an information point of view by a system on its 2-dimensional boundary. This mysterious correspondence is called the Holographic Principle and has had profound effects in string theory and our perception of space-time. In this note we describe a purely mathematical model of the Holographic Principle using ideas from nonlinear dynamical systems theory. We show that a random map on the surface S2 of a 3-dimensional open ball B has a natural counterpart in B, and the two maps acting in different dimensional spaces have the same entropy. We can reverse this construction if we start with a special 3-dimensional map in B called a skew product. The key idea is to use the randomness, as imbedded in the parameter of the 2-dimensional random map, to define a third dimension. The main result shows that if we start with an arbitrary dynamical system in B with entropy E we can construct a random map on S2 whose entropy is arbitrarily close to E.

  19. Timing of Effort and Reward: Three-Sided Moral Hazard in a Continuous-Time Model

    Jun Yang

    2010-01-01

    This paper studies a three-sided moral hazard problem with one agent exerting up-front effort and two agents exerting ongoing effort in a continuous-time model. The agents' efforts jointly affect the probability of survival and thus the expected cash flow of the project. In the optimal contract, the timing of payments reflects the timing of effort: payments for up-front effort precede payments for ongoing effort. Several patterns are possible for the cash allocation between the two agents wit...

  20. Tsunami Hazard Preventing Based Land Use Planning Model Using GIS Techniques in Muang Krabi, Thailand

    Abdul Salam Soomro

    2012-10-01

    Full Text Available The terrible tsunami disaster of 26 December 2004 hit Krabi, one of the ecotourism-oriented and very fascinating provinces of southern Thailand, as well as other regions such as Phangna and Phuket, devastating human lives, coastal communications and economically viable activities. This research study aimed to generate a tsunami hazard prevention based land use planning model using GIS (Geographical Information Systems), built on a hazard suitability analysis approach. Different triggering factors, e.g. elevation, proximity to the shoreline, population density, mangrove, forest, stream and road, have been used based on the land use zoning criteria. These criteria were weighted using the Saaty scale of importance, one of the available mathematical techniques. The model has been classified according to the land suitability classification. Various GIS techniques, namely subsetting, spatial analysis, map difference and data conversion, have been used. The model has been generated with five categories, namely high, moderate, low, very low and not suitable regions, each illustrated with an appropriate definition for the decision makers to redevelop the region.

  2. Transferability of regional permafrost disturbance susceptibility modelling using generalized linear and generalized additive models

    Rudy, Ashley C. A.; Lamoureux, Scott F.; Treitz, Paul; van Ewijk, Karin Y.

    2016-07-01

    To effectively assess and mitigate risk of permafrost disturbance, disturbance-prone areas can be predicted through the application of susceptibility models. In this study we developed regional susceptibility models for permafrost disturbances using a field disturbance inventory to test the transferability of the model to a broader region in the Canadian High Arctic. Resulting maps of susceptibility were then used to explore the effect of terrain variables on the occurrence of disturbances within this region. To account for a large range of landscape characteristics, the model was calibrated using two locations: Sabine Peninsula, Melville Island, NU, and Fosheim Peninsula, Ellesmere Island, NU. Spatial patterns of disturbance were predicted with a generalized linear model (GLM) and generalized additive model (GAM), each calibrated using disturbed and randomized undisturbed locations from both sites and GIS-derived terrain predictor variables including slope, potential incoming solar radiation, wetness index, topographic position index, elevation, and distance to water. Each model was validated for the Sabine and Fosheim Peninsulas using independent data sets, while the transferability of the model to an independent site was assessed at Cape Bounty, Melville Island, NU. The regional GLM and GAM validated well for both calibration sites (Sabine and Fosheim) with the area under the receiver operating characteristic curve (AUROC) > 0.79. Both models were applied directly to Cape Bounty without calibration and validated equally with AUROCs of 0.76; however, each model predicted disturbed and undisturbed samples differently. Additionally, the sensitivity of the transferred model was assessed using data sets with different sample sizes. Results indicated that models based on larger sample sizes transferred more consistently and captured the variability within the terrain attributes in the respective study areas. Terrain attributes associated with the initiation of disturbances were
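
    A minimal sketch of the GLM half of such a susceptibility model, fitted with statsmodels as a logistic regression of disturbed/undisturbed points on terrain attributes; the data are synthetic, and a GAM would replace the linear predictor with smooth terms:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 200
        slope = rng.uniform(0, 25, n)   # invented slope angles (deg)
        twi = rng.uniform(2, 12, n)     # invented topographic wetness index
        # Synthetic truth: disturbance more likely on steeper, wetter terrain.
        p = 1 / (1 + np.exp(-(-6 + 0.15 * slope + 0.4 * twi)))
        disturbed = rng.binomial(1, p)

        X = sm.add_constant(np.column_stack([slope, twi]))
        fit = sm.GLM(disturbed, X, family=sm.families.Binomial()).fit()
        print(fit.summary())
        # Predicted probabilities over gridded terrain attributes give the
        # susceptibility map; AUROC on held-out points measures transferability.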

  3. Models of magma-aquifer interactions and their implications for hazard assessment

    Strehlow, Karen; Gottsmann, Jo; Tumi Gudmundsson, Magnús

    2014-05-01

    Kverkfjöll and in October on White Island, New Zealand. The latter is only one example of these natural attractions that are visited by thousands of tourists every year. Additionally, these systems are increasingly used for energy generation. Phreatic explosions pose a serious risk to people and infrastructure nearby, and they are hard to predict. To improve risk assessment in hydrothermal areas, we assessed historical records and literature with regard to the frequency and mechanisms of hydrothermal explosions. Complemented by numerical models, this study aims to answer the question: what determines the change from safe to dangerous behaviour of the system, i.e. from silent degassing to explosions? Our project aims to widen our knowledge base on the complex coupling of magmatic and hydrological systems, which provides further insight into the subsurface processes at volcanic systems and will aid future risk assessment and eruption forecasting.

  4. Additive interaction in survival analysis

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise;

    2012-01-01

    It is a widely held belief in public health and clinical decision-making that interventions or preventive strategies should be aimed at patients or population subgroups where most cases could potentially be prevented. To identify such subgroups, deviation from additivity of absolute effects is the...... relevant measure of interest. Multiplicative survival models, such as the Cox proportional hazards model, are often used to estimate the association between exposure and risk of disease in prospective studies. In Cox models, deviations from additivity have usually been assessed by surrogate measures of...... additive interaction derived from multiplicative models-an approach that is both counter-intuitive and sometimes invalid. This paper presents a straightforward and intuitive way of assessing deviation from additivity of effects in survival analysis by use of the additive hazards model. The model directly...
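
    Since this record returns to the additive hazards theme of this collection, a minimal sketch of fitting Aalen's additive model with the Python lifelines package on invented data; covariate effects enter the hazard as absolute rate differences, so departures from additivity can be examined directly, e.g. by adding a product term of two exposures:

        import pandas as pd
        from lifelines import AalenAdditiveFitter

        # Invented cohort: follow-up time, event indicator, two exposures.
        df = pd.DataFrame({
            "time":      [2.0, 5.5, 8.1, 3.3, 9.7, 4.4, 7.2, 1.6],
            "event":     [1, 0, 1, 1, 0, 1, 0, 1],
            "exposure1": [1, 0, 1, 0, 0, 1, 1, 0],
            "exposure2": [0, 0, 1, 1, 0, 1, 0, 1],
        })
        df["both"] = df["exposure1"] * df["exposure2"]  # interaction term

        aaf = AalenAdditiveFitter()
        aaf.fit(df, duration_col="time", event_col="event")
        # Cumulative regression coefficients: absolute hazard added by each
        # term over time; a non-zero "both" curve indicates deviation from
        # additivity of the two exposure effects.
        print(aaf.cumulative_hazards_.tail())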

  5. A statistical model for seismic hazard assessment of hydraulic-fracturing-induced seismicity

    Hajati, T.; Langenbruch, C.; Shapiro, S. A.

    2015-12-01

    We analyze the interevent time distribution of hydraulic-fracturing-induced seismicity collected during 18 stages at four different regions. We identify a universal statistical process describing the distribution of hydraulic-fracturing-induced events in time. The distribution of waiting times between subsequently occurring events is given by the exponential probability density function of the homogeneous Poisson process. Our findings suggest that hydraulic-fracturing-induced seismicity is directly triggered by the relaxation of stress and pore pressure perturbation initially created by the injection. Therefore, compared to this relaxation, the stress transfer caused by the occurrence of preceding seismic events is mainly insignificant for the seismogenesis of subsequently occurring events. We develop a statistical model to compute the occurrence probability of hydraulic-fracturing-induced seismicity. This model can be used to assess the seismic hazard associated with hydraulic fracturing operations. No aftershock triggering has to be included in the statistical model.
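
    A minimal sketch of the occurrence-probability computation such a model permits: with exponential interevent times, the event rate is estimated from the mean waiting time and converted into the probability of at least one event in a given window (synthetic numbers; magnitude dependence omitted):

        import numpy as np

        rng = np.random.default_rng(7)
        # Synthetic interevent times (s), standing in for one injection stage.
        waiting_times = rng.exponential(scale=1200.0, size=500)

        rate = 1.0 / waiting_times.mean()          # events per second
        window = 3600.0                            # one hour
        p_any = 1.0 - np.exp(-rate * window)       # homogeneous Poisson process
        print(f"estimated rate: {rate * 3600:.1f} events/hour")
        print(f"P(at least one event in the next hour) = {p_any:.3f}")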

  6. CalTOX, a multimedia total exposure model for hazardous-waste sites

    CalTOX has been developed as a spreadsheet model to assist in health-risk assessments that address contaminated soils and the contamination of adjacent air, surface water, sediments, and ground water. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify and reduce uncertainty in multimedia, multiple-pathway exposure models. This report provides an overview of the CalTOX model components, lists the objectives of the model, describes the philosophy under which the model was developed, identifies the chemical classes for which the model can be used, and describes critical sensitivities and uncertainties. The multimedia transport and transformation model is a dynamic model that can be used to assess time-varying concentrations of contaminants introduced initially to soil layers or for contaminants released continuously to air or water. This model assists the user in examining how chemical and landscape properties impact both the ultimate route and quantity of human contact. Multimedia, multiple-pathway exposure models are used in the CalTOX model to estimate average daily potential doses within a human population in the vicinity of a hazardous substance release site. The exposure models encompass twenty-three exposure pathways. The exposure assessment process consists of relating contaminant concentrations in the multimedia model compartments to contaminant concentrations in the media with which a human population has contact (personal air, tap water, foods, household dusts, soils, etc.). The average daily dose is the product of the exposure concentrations in these contact media and an intake or uptake factor that relates the concentrations to the distributions of potential dose within the population

  7. "Developing a multi hazard air quality forecasting model for Santiago, Chile"

    Mena, M. A.; Delgado, R.; Hernandez, R.; Saide, P. E.; Cienfuegos, R.; Pinochet, J. I.; Molina, L. T.; Carmichael, G. R.

    2013-05-01

    Santiago, Chile has reduced annual particulate matter from 69ug/m3 (in 1989) to 25ug/m3 (in 2012), mostly by forcing industry, the transport sector, and the residential heating sector to adopt stringent emission standards to be able to operate under bad air days. Statistical forecasting has been used to predict bad air days, and pollution control measures in Santiago, Chile, for almost two decades. Recently an operational PM2.5 deterministic model has been implemented using WRF-Chem. The model was developed by the University of Iowa and is run at the Chilean Meteorological Office. Model configuration includes high resolution emissions gridding (2km) and updated population distribution using 2008 data from LANDSCAN. The model is run using a 2 day spinup with a 5 day forecast. This model has allowed a preventive approach in pollution control measures, as episodes are the results of multiple days of bad dispersion. Decreeing air pollution control measures in advance of bad air days resulted in a reduction of 40% of alert days (80ug/m3 mean 24h PM2.5) and 66% of "preemergency days" (110ug/m3 mean 24h PM2.5) from 2011 to 2012, despite similar meteorological conditions. This model will be deployed under a recently funded Center for Natural Disaster Management, and include other meteorological hazards such as flooding, high temperature, storm waves, landslides, UV radiation, among other parameters. This paper will present the results of operational air quality forecasting, and the methodology that will be used to transform WRF-Chem into a multi hazard forecasting system.

  8. THE TRANSBOUNDARY MOVEMENT OF HAZARDOUS WASTE: A CONTRIBUTION TO IMPROVE THE PRESENT TECHNICAL-JUDICIAL MODEL

    Giancarlo Carosso

    2012-01-01

    Full Text Available The expression “the transboundary movement of hazardous waste” refers, from the technical point of view, to a series of phases that begin with the creation of a hazardous waste and includes its identification according to international, regional and national standards, its delivery to specialized companies and its movement from the exporting country to the importing country; these phases end with the final destination of the waste (disposal or recovery. From the judicial point of view, this same expression means a set of international regulations, at the UN level, a judicial model constituted by various Multilateral Environmental Agreements-MEAs and in particular the Basel, Rotterdam and Stockholm Conventions. In spite of this model, an international market, dominated in certain strategic points by a chain of organized crime, has become established over the last 30 years, with consequent adverse effects on human health and the environment as well as on trade and competition. The illegal movement and transport of hazardous goods and waste undermine international policies and enforcement efforts and put law-abiding businesses at an economic disadvantage. It is here taken into account that there are in fact two judicial models that have to be referred to. Apart from the previously mentioned one, it is necessary to take into consideration the international judicial hazardous goods and waste transport model. The former model is examined (1 from a general point of view and during the various phases; (2 as far as the controls that the international regulations foresee along the chain are concerned; (3 with a case study concerning the PCB category, a category which is very vast and whose contents are sometimes considered as hazardous goods and sometimes as wastes. Because of the great complexity of the problems and in order to have a clear picture, a specific second study, which is closely connected to the first, has been conducted on the

  9. A "mental models" approach to the communication of subsurface hydrology and hazards

    Gibson, Hazel; Stewart, Iain S.; Pahl, Sabine; Stokes, Alison

    2016-05-01

    Communicating information about geological and hydrological hazards relies on appropriately worded communications targeted at the needs of the audience. But what are these needs, and how does the geoscientist discern them? This paper adopts a psychological "mental models" approach to assess the public perception of the geological subsurface, presenting the results of attitudinal studies and surveys in three communities in the south-west of England. The findings reveal important preconceptions and misconceptions regarding the impact of hydrological systems and hazards on the geological subsurface, notably in terms of the persistent conceptualisation of underground rivers and the inferred relations between flooding and human activity. The study demonstrates how such mental models can provide geoscientists with empirical, detailed and generalised data on perceptions surrounding an issue, as well as reveal unexpected outliers in perception that they may not have considered relevant, but which nevertheless may locally influence communication. Using this approach, geoscientists can develop information messages that more directly engage local concerns and create open engagement pathways based on dialogue, which in turn allow both geoscience "experts" and local "non-experts" to come together and understand each other more effectively.

  10. Household hazardous waste disposal to landfill: Using LandSim to model leachate migration

    Municipal solid waste (MSW) landfill leachate contains a number of aquatic pollutants. A specific MSW stream often referred to as household hazardous waste (HHW) can be considered to contribute a large proportion of these pollutants. This paper describes the use of the LandSim (Landfill Performance Simulation) modelling program to assess the environmental consequences of leachate release from a generic MSW landfill in receipt of co-disposed HHW. Heavy metals and organic pollutants were found to migrate into the zones beneath a model landfill site over a 20,000-year period. Arsenic and chromium were found to exceed European Union and US-EPA drinking water standards at the unsaturated zone/aquifer interface, with levels of mercury and cadmium exceeding minimum reporting values (MRVs). The findings demonstrate the pollution potential arising from HHW disposal with MSW. - Aquatic pollutants linked to the disposal of household hazardous waste in municipal landfills have the potential to exist in soil and groundwater for many years

  11. Seismically induced landslide hazard and exposure modelling in Southern California based on the 1994 Northridge, California earthquake event

    Budimir, M.E.A.; Atkinson, P. M.; Lewis, H.G.

    2015-01-01

    Quantitative modelling of landslide hazard, as opposed to landslide susceptibility, as a function of the earthquake trigger is vital in understanding and assessing future potential exposure to landsliding. Logistic regression analysis is a method commonly used to assess susceptibility to landsliding; however, estimating probability of landslide hazard as a result of an earthquake trigger is rarely undertaken. This paper utilises a very detailed landslide inventory map and a comprehensive data...

  12. Application of a Data Mining Model and It's Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis. These parameters are topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. Then the landslide hazard indices were calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using the artificial neural network model trained not only on the data for that area but also using the parameter weights calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.

  13. Two statistical models for long term seismic hazard assessment in Vrancea, Romania

    Intermediate-depth earthquakes have occurred frequently in Vrancea, Romania, and have caused severe damage. To understand the regularity of earthquake occurrence and to predict future earthquakes, we analyzed M ≥ 7.0 earthquakes during the period 1500-2000 using the ROMPLUS earthquake catalogue. Firstly, we attempted to assess the long-term seismic hazards in Vrancea using a stress-release (SR) model, which casts the elastic rebound theory as a stochastic process. Renewal models were also applied to the same data set, but these did not perform as well as the SR model. The SR model indicates that the probability of an M ≥ 7.0 earthquake occurring in Vrancea in a 5-year period exceeds 40% by the end of this decade. Secondly, we have proposed the periodic upward migration model, in which 1) the first M7 earthquake occurs at a deeper segment of the seismic region at the beginning of each century, 2) the second occurs at a middle segment in the middle of each century, 3) the third occurs at a shallower segment at the end of each century, and 4) the above activity repeats every century. Using AIC, we demonstrated that this model is better than a uniform Poisson model in time and space. (authors)
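
    For reference, a standard stochastic formulation of the stress-release model (an assumed textbook parameterisation, not necessarily the exact form fitted by the authors) writes the conditional intensity of earthquake occurrence as an exponential function of the stress accumulated by tectonic loading and released by past events:

```latex
% Stress-release model (assumed standard form): the conditional intensity
% lambda(t) rises with tectonic loading rho*t and drops at each event t_i
% by the stress S(t) that the event releases.
\lambda(t) = \exp\{a + b\,X(t)\}, \qquad
X(t) = X(0) + \rho\,t - S(t), \qquad
S(t) = \sum_{t_i < t} 10^{0.75\,(M_i - M_0)}
```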

  14. ADVANCES IN RENEWAL DECISION-MAKING UTILISING THE PROPORTIONAL HAZARDS MODEL WITH VIBRATION COVARIATES

    Pieter-Jan Vlok

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Increased competitiveness in the production world necessitates improved maintenance strategies to increase availability and drive down cost. The maintenance engineer is thus faced with the need to make more intelligent preventive renewal decisions. Two of the main techniques to achieve this are Condition Monitoring (such as vibration monitoring and oil analysis) and Statistical Failure Analysis (typically using probabilistic techniques). The present paper discusses these techniques, their uses and weaknesses, and then presents the Proportional Hazards Model as a solution to most of these weaknesses. It then goes on to compare the results of the different techniques in monetary terms, using a South African case study. This comparison shows clearly that the Proportional Hazards Model is superior to the present techniques and should be the preferred model for many actual maintenance situations.

    AFRIKAANSE OPSOMMING (translated): Increased levels of competition in the production environment necessitate improved maintenance strategies to increase equipment availability and minimise cost. Maintenance engineers consequently have to make more intelligent preventive renewal decisions. Two prominent techniques for achieving this goal are Condition Monitoring (such as vibration monitoring or oil analysis) and Statistical Failure Analysis (usually by means of probabilistic methods). In this article we consider both of these techniques, their uses and shortcomings, and then propose the Proportional Hazards Model as a solution to most of the shortcomings. The article also compares the different techniques in monetary terms by making use of a South African case study. This comparison clearly shows that the Proportional Hazards Model holds greater promise than the current techniques and that it should be the preferred solution in many actual maintenance situations.
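
    As a hedged sketch of the core idea (hypothetical data and column names, using the open-source lifelines library rather than the authors' implementation), a proportional hazards model with a condition-monitoring covariate can be fitted as follows:

```python
# Sketch, assuming made-up renewal data: Cox proportional hazards with a
# vibration covariate, via the lifelines library.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "hours":   [1200, 800, 1500, 400, 950, 1100, 600, 1300],  # operating hours
    "failed":  [1,    1,   0,    1,   1,   0,    1,   1],     # 1 = failure, 0 = preventive renewal
    "vib_rms": [4.2,  6.1, 2.8,  8.0, 5.5, 3.1,  7.2, 4.9],   # vibration RMS (mm/s)
})

cph = CoxPHFitter()
cph.fit(df, duration_col="hours", event_col="failed")
cph.print_summary()  # hazard ratio per unit of vibration RMS
```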

  15. Socio-economic vulnerability to natural hazards - proposal for an indicator-based model

    Eidsvig, U.; McLean, A.; Vangelsten, B. V.; Kalsnes, B.; Ciurean, R. L.; Argyroudis, S.; Winter, M.; Corominas, J.; Mavrouli, O. C.; Fotopoulou, S.; Pitilakis, K.; Baills, A.; Malet, J. P.

    2012-04-01

    Vulnerability assessment, with respect to natural hazards, is a complex process that must consider multiple dimensions of vulnerability, including both physical and social factors. Physical vulnerability refers to conditions of physical assets, and may be modeled by the intensity and magnitude of the hazard, the degree of physical protection provided by the natural and built environment, and the physical robustness of the exposed elements. Social vulnerability refers to the underlying factors leading to the inability of people, organizations, and societies to withstand impacts from natural hazards. Social vulnerability models can be used in combination with physical vulnerability models to estimate both direct losses, i.e. losses that occur during and immediately after the impact, and indirect losses, i.e. long-term effects of the event. The direct impact of a landslide typically includes casualties and damage to buildings and infrastructure, while indirect losses may include, for example, business closures or limitations in public services. The direct losses are often assessed using physical vulnerability indicators (e.g. construction material, height of buildings), while indirect losses are mainly assessed using social indicators (e.g. economic resources, demographic conditions). Within the EC-FP7 SafeLand research project, an indicator-based method was proposed to assess relative socio-economic vulnerability to landslides. The indicators represent the underlying factors which influence a community's ability to prepare for, deal with, and recover from the damage associated with landslides. The proposed model includes indicators representing demographic, economic and social characteristics as well as indicators representing the degree of preparedness and recovery capacity. Although the model focuses primarily on the indirect losses, it could easily be extended to include more physical indicators which account for the direct losses. Each indicator is individually

  16. A multi-objective model for the hazardous materials transportation problem based on lane reservation

    Zhou, Zhen; Chu, Feng; Che, Ada; Mammar, Saïd

    2012-01-01

    This paper presents an application of the lane reservation strategy to hazardous materials transportation. Once a hazardous materials transportation accident happens, its effects are significant. Lane reservation can reduce the hazardous materials transportation risk enormously; however, it also impacts normal traffic. The proposed problem is to choose lanes to be reserved on the network and select the path for each hazardous materials shipment among the reserved lanes in ord...

  17. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy to a greater degree than uniform rock, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  18. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy to a greater degree than uniform rock, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  19. A seismic source zone model for the seismic hazard assessment of the Italian territory

    Meletti, Carlo; Galadini, Fabrizio; Valensise, Gianluca; Stucchi, Massimiliano; Basili, Roberto; Barba, Salvatore; Vannucci, Gianfranco; Boschi, Enzo

    2008-04-01

    We designed a new seismic source model for Italy to be used as an input for country-wide probabilistic seismic hazard assessment (PSHA) in the frame of the compilation of a new national reference map. We started off by reviewing existing models available for Italy and for other European countries, then discussed the main open issues in the current practice of seismogenic zoning. The new model, termed ZS9, is largely based on data collected in the past 10 years, including historical earthquakes and instrumental seismicity, active faults and their seismogenic potential, and seismotectonic evidence from recent earthquakes. This information allowed us to propose new interpretations for poorly understood areas where the new data are in conflict with assumptions made in designing the previous and widely used model ZS4. ZS9 comprises 36 zones where earthquakes with Mw ≥ 5 are expected. It also assumes that earthquakes with Mw up to 5 may occur anywhere outside the seismogenic zones, although the associated probability is rather low. Special care was taken to ensure that each zone sampled a large enough number of earthquakes so that we could compute reliable earthquake production rates. Although it was drawn following criteria that are standard practice in PSHA, ZS9 is also innovative in that every zone is characterised by its mean seismogenic depth (the depth of the crustal volume that will presumably release future earthquakes) and predominant focal mechanism (the most likely rupture mechanism). These properties were determined using instrumental data, and only in a limited number of cases did we resort to geologic constraints and expert judgment to cope with lack of data or conflicting indications. These attributes allow ZS9 to be used with more accurate regionalized depth-dependent attenuation relations, and are ultimately expected to significantly increase the reliability of seismic hazard estimates.

  20. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models.

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; Gutiérrez-Gutiérrez, O. Q.; Larreynaga, J.; González, M.; Castro, M.; Gavidia, F.; Aguirre-Ayerbe, I.; González-Riancho, P.; Carreño, E.

    2013-11-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has an approximate length of 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and resulting in hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand the characterization of the threat over the entire coast of El Salvador, and on the other the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near shore we computed an estimation of the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite differences-finite volumes numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. Our results show that at the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas most exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results

  1. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    J. A. Álvarez-Gómez

    2013-05-01

    Full Text Available El Salvador is the smallest and most densely populated country in Central America; its coast is approximately 320 km long, with 29 municipalities and more than 700 000 inhabitants. In El Salvador there have been 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand the characterization of the threat over the entire coast of El Salvador, and on the other the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near shore we computed an estimation of the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite differences-finite volumes numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. On the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas most exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results obtained

  2. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    Kirschbaum, Dalia B.; Adler, Robert; Hong, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship and results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for the algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited nonlandslide event data for more comprehensive evaluation. Additional factors that may improve

  3. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne disease for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbial hazards posed by the pathogens previously found in Nham, sodium nitrite and metal were identified as the chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points: the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm, a level high enough to control Clostridium botulinum without posing a chemical hazard to the consumer. The physical hazard from metal clips can be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham can be reduced in the fermentation process; the critical limit of the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product. PMID:11570169

  4. Development of models to inform a national Daily Landslide Hazard Assessment for Great Britain

    Dijkstra, Tom A.; Reeves, Helen J.; Dashwood, Claire; Pennington, Catherine; Freeborough, Katy; Mackay, Jonathan D.; Uhlemann, Sebastian S.; Chambers, Jonathan E.; Wilkinson, Paul B.

    2015-04-01

    were combined with records of observed landslide events to establish which antecedent effective precipitation (AEP) signatures of different durations could be used as a pragmatic proxy for the occurrence of landslides. It was established that 1-, 7-, and 90-day AEP provided the most significant correlations, and these were used to calculate the probability of at least one landslide occurring. The method was then extended over the period 2006 to 2014 and the results evaluated against observed occurrences. It is recognised that AEP is a relatively poor proxy for simulating effective stress conditions along potential slip surfaces. However, the temporal pattern of landslide probability compares well to the observed occurrences and provides a potential benefit to assist with the DLHA. Further work is continuing to fine-tune the model for landslide type, better spatial resolution of effective precipitation input and cross-reference to models that capture changes in water balance and conditions along slip surfaces. The latter is facilitated by intensive research at several field laboratories, such as the Hollin Hill site in Yorkshire, England. At this site, a decade of activity has generated a broad range of research and a wealth of data. This paper reports on one example of recent work: the characterisation of near-surface hydrology using infiltration experiments in which hydrological pathways are captured, among other methods, by electrical resistivity tomography. This research, which has further developed our understanding of soil moisture movement in a heterogeneous landslide complex, has highlighted the importance of establishing detailed ground models to enable determination of landslide potential at high resolution. In turn, the knowledge gained through this research is used to enhance the expertise for the daily landslide hazard assessments at a national scale.
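
    A minimal illustrative sketch of the AEP-based probability idea (synthetic data and assumed coefficients, not the model described above) might compute rolling 1-, 7- and 90-day AEP sums and fit a logistic regression for the daily probability of at least one landslide:

```python
# Hypothetical sketch: rolling AEP signatures as predictors of the daily
# probability of at least one landslide. All data are synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
days = pd.date_range("2006-01-01", "2014-12-31", freq="D")
eff_precip = rng.gamma(0.6, 4.0, len(days))  # daily effective precipitation (mm)

aep = pd.DataFrame(index=days)
for window in (1, 7, 90):                    # the three AEP durations used above
    aep[f"aep_{window}d"] = (
        pd.Series(eff_precip, index=days).rolling(window, min_periods=1).sum()
    )

# Synthetic "landslide day" labels, just to make the sketch runnable.
risk = 0.02 * aep["aep_1d"] + 0.004 * aep["aep_90d"]
landslide_day = (rng.random(len(days)) < 1 / (1 + np.exp(-(risk - 4)))).astype(int)

model = LogisticRegression().fit(aep.values, landslide_day)
p_landslide = model.predict_proba(aep.values)[:, 1]  # P(>= 1 landslide) per day
print("highest-probability day:", days[p_landslide.argmax()].date())
```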

  5. A general additive-multiplicative rates model for recurrent event data

    2009-01-01

    In this article, we propose a general additive-multiplicative rates model for recurrent event data. The proposed model includes the additive rates and multiplicative rates models as special cases. For inference on the model parameters, estimating equation approaches are developed, and asymptotic properties of the proposed estimators are established through modern empirical process theory. In addition, an illustration with multiple-infection data from a clinical study on chronic granulomatous disease is provided.
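
    For orientation, one standard way of writing such a general additive-multiplicative rates model (assumed notation, chosen only to be consistent with the special cases named above) is:

```latex
% Assumed general form: marginal rate of the recurrent-event counting
% process N(t) given covariate vectors X and Z,
\lambda(t \mid X, Z) = g\{\beta^{\mathsf{T}} X(t)\} + \lambda_0(t)\, h\{\gamma^{\mathsf{T}} Z(t)\}
% with known link functions g and h and an unspecified baseline rate
% \lambda_0(t). Taking h \equiv 1 recovers the additive rates model;
% taking g \equiv 0 and h = \exp recovers the multiplicative rates model.
```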

  6. Stiffness Model of a 3-DOF Parallel Manipulator with Two Additional Legs

    Yu, Guang; Wu, Jun; Wang, Liping

    2014-01-01

    This paper investigates the stiffness modelling of a 3-DOF parallel manipulator with two additional legs. The stiffness model in six directions of the 3-DOF parallel manipulator with two additional legs is derived by performing condensation of DOFs for the joint connection and treatment of the fixed-end connections. Moreover, this modelling method is used to derive the stiffness model of the manipulator with zero/one additional legs. Two performance indices are given to compare the stiffness ...

  7. Developing a functional model for cities impacted by a natural hazard: application to a city affected by flooding

    Bambara, G.; Peyras, L.; Felix, H.; Serre, D.

    2015-01-01

    The experience feedback on a crisis that hit a city is frequently used as a "recollection" tool. To capitalise on the information from experience feedback from cities that have been affected by a natural hazard, the authors propose in this study a functional model for modelling scenarios of city crises. In this model, the city, considered as a complex system, was modelled using a functional analysis method. Based on such modelling, two risk analysis methods (Failure Mode and Eff...

  8. Exploration of land-use scenarios for flood hazard modeling - the case of Santiago de Chile

    Müller, A.; Reinstorf, F.

    2011-04-01

    Urban expansion leads to modifications in land use and land cover and to the loss of vegetated areas. In some regions of the world these developments are accelerated by a changing regional climate. As a consequence, major changes in the amount of green spaces can be observed in many urban regions. Among other factors, the amount of green space determines the availability of retention areas in a watershed. The goal of this research is to develop possible land-use and land-cover scenarios for a watershed and to explore the influence of land-use and land-cover changes on its runoff behavior using the distributed hydrological model HEC-HMS. The study area for this research is a small peri-urban watershed in the eastern area of Santiago de Chile. Three spatially explicit exploratory land-use/land-cover scenario alternatives were developed based on the analysis of previous land-use developments using high-resolution satellite data, on the analysis of urban planning laws, on the analysis of climate change predictions, and on expert interviews. Modeling the resulting changes in runoff allows making predictions about the changes in flood hazard that the adjacent urban areas face after heavy winter precipitation events. The paper shows how HEC-HMS was used with a distributed event-modeling approach. The derived runoff values are combined with existing flood hazard maps and can be regarded as an important source of information for the adaptation to changing conditions in the study area. The most significant finding is that the land-use changes that have to be expected after long drought periods pose the highest risk with respect to floods.
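
    As a hedged aside, the effect of a land-use change on storm runoff can be illustrated with the SCS curve-number loss relation, one of the standard loss methods available in HEC-HMS; the curve numbers and storm depth below are assumptions for illustration, not values from the study:

```python
# Sketch: SCS curve-number direct runoff. A land-use change expressed as a
# higher curve number (CN) raises runoff from the same storm.
def scs_runoff_mm(precip_mm: float, curve_number: float) -> float:
    """Direct runoff depth (mm) from storm depth via the SCS-CN relation."""
    s = 25400.0 / curve_number - 254.0   # potential retention (mm)
    ia = 0.2 * s                         # initial abstraction
    if precip_mm <= ia:
        return 0.0
    return (precip_mm - ia) ** 2 / (precip_mm + 0.8 * s)

storm = 60.0  # mm of winter precipitation (assumed)
for label, cn in [("vegetated slopes", 70.0), ("urbanised slopes", 88.0)]:
    print(f"{label}: CN={cn:.0f} -> runoff {scs_runoff_mm(storm, cn):.1f} mm")
```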

  9. Weather modeling for hazard and consequence assessment operations during the 2006 Winter Olympic Games

    Hayes, P.; Trigg, J. L.; Stauffer, D.; Hunter, G.; McQueen, J.

    2006-05-01

    Consequence assessment (CA) operations are those processes that attempt to mitigate negative impacts of incidents involving hazardous materials such as chemical, biological, radiological, nuclear, and high explosive (CBRNE) agents, facilities, weapons, or transportation. Incident types range from accidental spillage of chemicals at/en route to/from a manufacturing plant, to the deliberate use of radiological or chemical material as a weapon in a crowded city. The impacts of these incidents are highly variable, from little or no impact to catastrophic loss of life and property. Local- and regional-scale atmospheric conditions strongly influence atmospheric transport and dispersion processes in the boundary layer, and the extent and scope of the spread of dangerous materials in the lower levels of the atmosphere. Therefore, CA personnel charged with managing the consequences of CBRNE incidents must have detailed knowledge of current and future weather conditions to accurately model potential effects. A meteorology team was established at the U.S. Defense Threat Reduction Agency (DTRA) to provide weather support to CA personnel operating DTRA's CA tools, such as the Hazard Prediction and Assessment Capability (HPAC) tool. The meteorology team performs three main functions: 1) regular provision of meteorological data for use by personnel using HPAC, 2) determination of the best-performing medium-range model forecast for the 12-48 hour timeframe, and 3) provision of real-time help-desk support to users regarding acquisition and use of weather data in HPAC CA applications. Normal meteorology team operations were expanded during a recent modeling project which took place during the 2006 Winter Olympic Games. The meteorology team took advantage of special weather observation datasets available in the domain of the Winter Olympic venues and undertook a project to improve weather modeling at high resolution. The varied and complex terrain provided a special challenge to the

  10. Three-dimensional displays for natural hazards analysis, using classified Landsat Thematic Mapper digital data and large-scale digital elevation models

    Butler, David R.; Walsh, Stephen J.; Brown, Daniel G.

    1991-01-01

    Methods are described for using Landsat Thematic Mapper digital data and digital elevation models for the display of natural hazard sites in a mountainous region of northwestern Montana, USA. Hazard zones can be easily identified on the three-dimensional images. Proximity of facilities such as highways and building locations to hazard sites can also be easily displayed. A temporal sequence of Landsat TM (or similar) satellite data sets could also be used to display landscape changes associated with dynamic natural hazard processes.

  11. Probabilistic forecasts of debris-flow hazard at the regional scale with a combination of models.

    Malet, Jean-Philippe; Remaître, Alexandre

    2015-04-01

    Debris flows are one of the many active slope-forming processes in the French Alps, where rugged and steep slopes mantled by various slope deposits offer a great potential for triggering hazardous events. A quantitative assessment of debris-flow hazard requires the estimation, in a probabilistic framework, of the spatial probability of occurrence of source areas, the spatial probability of runout areas, the temporal frequency of events, and their intensity. The main objective of this research is to propose a pipeline for the estimation of these quantities at the regional scale using a chain of debris-flow models. The work uses the experimental site of the Barcelonnette Basin (South French Alps), where 26 active torrents have produced more than 150 debris-flow events since 1850, to develop and validate the methodology. First, a susceptibility assessment is performed to identify the debris-flow-prone source areas. The most frequently used approach is the combination of environmental factors with GIS procedures and statistical techniques, whether or not detailed event inventories are integrated. Based on a 5 m DEM and its derivatives, and information on slope lithology, engineering soils and landcover, the possible source areas are identified with a statistical logistic regression model. The performance of the statistical model is evaluated with the observed distribution of debris-flow events recorded after 1850 in the study area. The source areas in the three most active torrents (Riou-Bourdoux, Faucon, Sanières) are well identified by the model. Results are less convincing for three other active torrents (Bourget, La Valette and Riou-Chanal); this could be related to the type of debris-flow triggering mechanism, as the model seems to better spot the open-slope debris-flow source areas (e.g. scree slopes) but appears to be less efficient for the identification of landslide-induced debris flows. Second, a susceptibility assessment is performed to estimate the possible runout distance

  12. Analysis of risk indicators and issues associated with applications of screening model for hazardous and radioactive waste sites

    Risk indicators, such as population risk, maximum individual risk, time of arrival of contamination, and maximum water concentrations, were analyzed to determine their effect on results from a screening model for hazardous and radioactive waste sites. The analysis of risk indicators is based on calculations of exposure to airborne and waterborne contamination predicted with the Multimedia Environmental Pollutant Assessment System (MEPAS) model. The different risk indicators were analyzed based on constituent type and on transport and exposure pathways. Three of the specific comparisons that were made are (1) population-based versus maximum-individual-based risk indicators, (2) time of arrival of contamination, and (3) comparison of different threshold assumptions for noncarcinogenic impacts. Comparison of indicators for population-based and maximum-individual-based human health risk suggests that these two parameters are highly correlated, but for a given problem, one may be more important than the other. The results indicate that the arrival distribution for different levels of contamination reaching a receptor can also be helpful in decisions regarding the use of resources for remediating short- and long-term environmental problems. The addition of information from a linear model for noncarcinogenic impacts allows interpretation of results below reference dose (RfD) levels, which might help in decisions for certain applications. The analysis of risk indicators suggests that important information may be lost by the use of a single indicator to represent public health risk and that multiple indicators should be considered. 15 refs., 8 figs., 1 tab

  13. Release of hazardous substances in flood events: Damage model for horizontal cylindrical vessels

    Severe accidents may be triggered by the impact of floods on process and storage equipment containing hazardous substances. The present study analyses the possible damage to horizontal cylindrical equipment operating either at atmospheric or at higher pressures. A mechanical damage model was developed and validated against available literature data on past accidents. Simplified correlations were then obtained to calculate the critical flooding conditions leading to vessel failure. A fragility model was proposed for the straightforward assessment of equipment damage probability in the framework of the quantitative risk assessment of NaTech scenarios triggered by floods. A case study is discussed to test the potential of the method. - Highlights: • A model for the failure of horizontal vessels caused by floods was developed. • The model was validated with available literature data. • Simplified correlations were obtained to allow the straightforward calculation of vessel failure probability. • The model is suitable for the assessment of NaTech scenarios triggered by floods. • The assessment of a case study evidenced that high failure frequencies may be obtained if flood-prone areas
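
    As an illustrative sketch of what such a fragility model looks like in practice (a generic lognormal fragility curve with an assumed median depth and dispersion, not the correlations derived in the paper):

```python
# Sketch: lognormal fragility curve for vessel failure as a function of
# flood water depth. Median and dispersion are illustrative assumptions.
from math import log, sqrt, erf

def failure_probability(depth_m: float, median_m: float = 1.5, beta: float = 0.4) -> float:
    """P(failure | flood depth) = Phi((ln(depth) - ln(median)) / beta)."""
    if depth_m <= 0:
        return 0.0
    z = (log(depth_m) - log(median_m)) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF

for depth in (0.5, 1.0, 1.5, 2.0, 3.0):
    print(f"depth {depth:.1f} m -> P(failure) = {failure_probability(depth):.2f}")
```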

  14. Doubly stochastic models for volcanic hazard assessment at Campi Flegrei caldera

    Bevilacqua, Andrea

    2016-01-01

    This study provides innovative mathematical models for assessing the eruption probability and associated volcanic hazards, and applies them to the Campi Flegrei caldera in Italy. Throughout the book, significant attention is devoted to quantifying the sources of uncertainty affecting the forecast estimates. The Campi Flegrei caldera is certainly one of the world’s highest-risk volcanoes, with more than 70 eruptions over the last 15,000 years, prevalently explosive ones of varying magnitude, intensity and vent location. In the second half of the twentieth century the volcano apparently once again entered a phase of unrest that continues to the present. Hundreds of thousands of people live inside the caldera and over a million more in the nearby city of Naples, making a future eruption of Campi Flegrei an event with potentially catastrophic consequences at the national and European levels.

  15. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two

  16. Accelerated life models modeling and statistical analysis

    Bagdonavicius, Vilijandas

    2001-01-01

    Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia

  17. Development of a Probabilistic Tornado Wind Hazard Model for the Continental United States Volume I: Main Report

    Since the mid-1980s, assessment of the wind and tornado risks at the Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels, at a given location, based on the methodology in UCRL-53526, are different than those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.

  18. A class of additive-accelerated means regression models for recurrent event data

    2010-01-01

    In this article, we propose a class of additive-accelerated means regression models for analyzing recurrent event data. The class includes the proportional means model, the additive rates model, the accelerated failure time model, the accelerated rates model and the additive-accelerated rate model as special cases. The new model offers great flexibility in formulating the effects of covariates on the mean functions of counting processes while leaving the stochastic structure completely unspecified. For the inference on the model parameters, estimating equation approaches are derived and asymptotic properties of the proposed estimators are established. In addition, a technique is provided for model checking. The finite-sample behavior of the proposed methods is examined through Monte Carlo simulation studies, and an application to a bladder cancer study is illustrated.

  19. CFD model for large hazardous dense cloud spread predictions, with particular reference to Bhopal disaster

    Mishra, Kirti Bhushan

    2015-09-01

    A volumetric source based CFD (Computational Fluid Dynamics) model for estimating the wind- and gravity-driven spread of an elevated release of a dense hazardous cloud on a flat terrain, without and with obstacles, is demonstrated. The model considers the development of a worst-case scenario similar to that which occurred at Bhopal. Fully developed clouds of a dense gas having different densities, under an ABL (Atmospheric Boundary Layer) with calm ground wind conditions, are first obtained. These clouds are then allowed to spread under the ABL with different ground wind speeds and gravity conditions. The developed model is validated by a grid-independence study, fluid dynamical evidence, post-disaster facts, the downwind MIC (Methyl Isocyanate) concentrations estimated by earlier models, and experiments on dense plume trajectories. It is shown that in the case of active dispersion under calm wind conditions the lateral spread would prevail over the downwind spread. The presence of a dense medium acts like a weak porous medium and initiates turbulence at much smaller downwind distances than would normally occur without the dense medium. The safety distances from toxic exposures of MIC are predicted by specifying an isosurface of a minimum concentration above the ground surface. Discrepancies in near-field predictions still exist. However, the far-field predictions agree well with data published before.

  20. Effectiveness of water infrastructure for river flood management – Part 1: Flood hazard assessment using hydrological models in Bangladesh

    M. A. Gusyev; Kwak, Y.; Khairul, M. I.; Arifuzzaman, M. B.; Magome, J.; Sawano, H.; Takeuchi, K

    2015-01-01

    This study introduces the flood hazard assessment part of the global flood risk assessment (Part 2) conducted with a distributed hydrological Block-wise TOP (BTOP) model and a GIS-based Flood Inundation Depth (FID) model. In this study, the 20 km grid BTOP model was developed with globally available data and applied to the Ganges, Brahmaputra and Meghna (GBM) river basin. The BTOP model was calibrated with observed river discharges in Bangladesh and was applied for climate...

  1. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development Project

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  2. Methodologies, models and parameters for environmental, impact assessment of hazardous and radioactive contaminants; Metodologias, modelos y parametros para evaluacion del impacto ambiental de contaminantes peligrosos y radiactivos

    Aguero, A.; Cancio, D.; Garcia-Olivares, A.; Romero, L.; Pinedo, P.; Robles, B.; Rodriguez, J.; Simon, I.; Suanez, A.

    2003-07-01

    An Environmental Impact Assessment Methodology to assess the impact arising from contaminants present in hazardous and radioactive wastes has been developed. Taking into account the background information on legislation, waste categories and contaminant inventories, and disposal, recycling and waste treatment options, an Environmental Impact Assessment Methodology (MEIA) is proposed. This is applicable to (i) several types of solid wastes (hazardous, radioactive and mixed wastes); (ii) several management options (recycling, and temporary and final storage (in shallow and deep disposal)); and (iii) several levels of data availability. Conceptual and mathematical models and software tools needed for the application of the MEIA have been developed. Bearing in mind that this is a complex process, both the models and tools have to be developed following an iterative approach, involving refinement of the models so as to better correspond to the described system. The selection of suitable parameters for the models is based on information derived from field and laboratory measurements and experiments, and then applying a data elicitation protocol. An application is shown for a hypothetical shallow radioactive waste disposal facility (test case), with all the steps of the MEIA applied sequentially. In addition, the methodology is applied to an actual case of waste management for hazardous wastes from the coal fuel cycle, demonstrating several possibilities for application of the MEIA from a practical perspective. The experience obtained in the development of the work shows that the use of the MEIA for the assessment of management options for hazardous and radioactive wastes gives important advantages, simplifying the execution of the assessment, its traceability and the dissemination of assessment results to other interested parties. (Author)

  3. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

  4. Local models for rainstorm-induced hazard analysis on Mediterranean river-torrential geomorphological systems

    N. Diodato

    2004-01-01

    Full Text Available Damaging hydrogeomorphological events are defined as one or more simultaneous phenomena (e.g. accelerated erosion, landslides, flash floods and river floods), occurring in a spatially and temporally random way and triggered by rainfall of varying intensity and extent. Storm rainfall values are highly dependent on weather conditions and relief. However, the impact of rainstorms in Mediterranean mountain environments depends mainly on climatic fluctuations in the short and long term, especially in rainfall quantity. An algorithm for the characterisation of this impact, called the Rainfall Hazard Index (RHI), is developed with a less expensive methodology. In RHI modelling, we assume that the river-torrential system has adapted to the natural hydrological regime, and that a sudden fluctuation in this regime, especially one exceeding the thresholds of an acceptable range of flexibility, may have disastrous consequences for the mountain environment. RHI integrates two rainfall variables based upon current and historical storm depth data, both of a fixed duration, and one dimensionless parameter representative of the degree of ecosystem flexibility. The approach was applied to a test site in the Benevento river-torrential landscape, Campania (Southern Italy). A database including data from 27 events which occurred during a 77-year period (1926-2002) was compared with the Benevento-station RHI(24h) for a qualitative validation. Trends in RHIx for annual maximum storms of durations 1, 3 and 24 h were also examined. Little change is observed for storms of 3- and 24-h duration, but a significant increase results in the hazard of short and intense storms (RHIx(1h)), in agreement with a reduction in return period for extreme rainfall events.

  5. Quantitative hazard assessment at Vulcano (Aeolian islands): integration of geology, event statistics and physical modelling

    Dellino, Pierfrancesco; de Astis, Gianfilippo; La Volpe, Luigi; Mele, Daniela; Sulpizio, Roberto

    2010-05-01

    The analysis of stratigraphy and of the particle features of pyroclastic deposits allowed the reconstruction of the volcanic history of La Fossa di Vulcano. An eruptive scenario driven by superficial phreatomagmatic explosions emerged. A statistical analysis of the pyroclastic successions led to the definition of a repetitive sequence of dilute pyroclastic density currents as the most probable events in the short term, followed by fallout of dense ballistic blocks. The scale of such events is related to the amount of magma involved in each explosion. Events involving a million cubic meters of magma are probable in view of what happened in the most recent eruptions. They led to the formation of hundreds-of-meters-thick dilute pyroclastic density currents, moving down the volcano slope at velocities exceeding 50 m/s. The dispersion of density currents affected the whole Vulcano Porto area and the Vulcanello area, and also overrode the Fossa Caldera's rim, spreading over the Piano area. Similarly, older pyroclastic deposits erupted at different times (Piano Grotte dei Rossi formation, ~20-7.7 ka) from vents within La Fossa Caldera and before La Fossa Cone formation. They were also phreatomagmatic in origin and fed dilute pyroclastic density currents (PDCs). They represent the eruptions with the highest magnitude on the island. Therefore, for hazard assessment, these deposits from La Fossa Cone and La Fossa Caldera were used to depict eruptive scenarios at short and long term. On the basis of physical models that make use of the particle features of pyroclastic deposits, the impact parameters for each scenario have been calculated: the dynamic pressure and particle volumetric concentration of density currents, and the impact energy of ballistic blocks. On this basis, a quantitative hazard map is presented, which could be of direct use for territory planning and for the calculation of the expected damage.

  6. Stiffness Model of a 3-DOF Parallel Manipulator with Two Additional Legs

    Guang Yu

    2014-10-01

    Full Text Available This paper investigates the stiffness modelling of a 3-DOF parallel manipulator with two additional legs. The stiffness model in six directions of the 3-DOF parallel manipulator with two additional legs is derived by performing condensation of DOFs for the joint connections and treatment of the fixed-end connections. Moreover, this modelling method is used to derive the stiffness model of the manipulator with zero/one additional legs. Two performance indices are given to compare the stiffness of the parallel manipulator with two additional legs with those of the manipulators with zero/one additional legs. The method can be used not only to derive the stiffness model of a redundant parallel manipulator, but also to model the stiffness of non-redundant parallel manipulators.
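
    A hedged sketch of the generic operation behind "condensation of DOFs" (static, or Guyan, condensation applied to a made-up stiffness matrix, not the manipulator's actual model):

```python
# Sketch: static (Guyan) condensation of a stiffness matrix. The 6x6
# matrix is an arbitrary symmetric positive-definite stand-in.
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((6, 6))
K = A @ A.T + 6 * np.eye(6)         # symmetric positive-definite stand-in

master = [0, 1, 2]                  # DOFs kept (e.g. platform directions)
slave = [3, 4, 5]                   # DOFs condensed out (e.g. joint DOFs)

Kmm = K[np.ix_(master, master)]
Kms = K[np.ix_(master, slave)]
Ksm = K[np.ix_(slave, master)]
Kss = K[np.ix_(slave, slave)]

# Condensed stiffness seen at the master DOFs: Kmm - Kms Kss^{-1} Ksm
K_condensed = Kmm - Kms @ np.linalg.solve(Kss, Ksm)
print(np.round(K_condensed, 3))
```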

  7. Identifying model pollutants to investigate biodegradation of hazardous XOCs in WWTPs

    Press-Kristensen, Kaare; Ledin, Anna; Schmidt, Jens Ejbye; Henze, Mogens [Department of Environment and Resources, Technical University of Denmark Building 115, 2800 Lyngby (Denmark)

    2007-02-01

    Xenobiotic organic compounds (XOCs) in wastewater treatment plant (WWTP) effluents might cause toxic effects in ecosystems. Several investigations have emphasized biodegradation as an important removal mechanism to reduce pollution with XOCs from WWTP effluents. The aim of the study was to design a screening tool to identify and select hazardous model pollutants for the further investigation of biodegradation in WWTPs. The screening tool consists of three criteria: the XOC is present in WWTP effluents, the XOC constitutes an intolerable risk in drinking water or the environment, and the XOC is expected to be biodegradable in WWTPs. The screening tool was tested on bisphenol A (BPA), carbamazepine (CBZ), di(2-ethylhexyl) phthalate (DEHP), 17β-estradiol (E2), estrone (E1), 17α-ethinylestradiol (EE2), ibuprofen, naproxen, nonylphenol (NP), and octylphenol (OP). BPA, DEHP, E2, E1, EE2, and NP passed all criteria in the screening tool and were selected as model pollutants. OP did not pass the screening and was rejected as a model pollutant. CBZ, ibuprofen, and naproxen were not finally evaluated due to insufficient data. (author)

  8. Mediation Analysis with Survival Outcomes: Accelerated Failure Time Versus Proportional Hazards Models

    Lois A Gelfand

    2016-03-01

    Full Text Available Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing the SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained by combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, the effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
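
    As a hedged illustration (hypothetical data, using the open-source lifelines library in place of the SAS procedures PHREG and LIFEREG used in the paper), both model families can be fitted to the same censored data set:

```python
# Sketch: semi-parametric PH vs. fully parametric Weibull AFT on the same
# censored data, via lifelines. Data and column names are assumptions.
import pandas as pd
from lifelines import CoxPHFitter, WeibullAFTFitter

df = pd.DataFrame({
    "time":      [5, 8, 12, 3, 9, 15, 7, 11, 4, 14],
    "event":     [1, 1, 0, 1, 1, 0, 1, 1, 1, 0],    # 0 = censored
    "treatment": [0, 1, 1, 0, 1, 1, 0, 1, 0, 1],
    "mediator":  [2.1, 3.4, 4.0, 1.8, 3.0, 2.2, 2.5, 3.8, 3.1, 4.2],
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
aft = WeibullAFTFitter().fit(df, duration_col="time", event_col="event")
print(cph.params_)   # log hazard ratios (semi-parametric PH)
print(aft.params_)   # AFT coefficients on log survival time
```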

  9. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships

  10. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    Harper, Bryan [Oregon State University (United States); Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan [Pacific Northwest National Laboratory (United States); Tang, Kaizhi [Intelligent Automation, Inc. (United States); Heredia-Langner, Alejandro [Pacific Northwest National Laboratory (United States); Lins, Roberto [CPqAM, Oswaldo Cruz Foundation, FIOCRUZ-PE (Brazil); Harper, Stacey, E-mail: stacey.harper@oregonstate.edu [Oregon State University (United States)

    2015-06-15

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships.

  11. A forecasting and forewarning model for methane hazard in working face of coal mine based on LS-SVM

    CAO Shu-gang; LIU Yan-bao; WANG Yan-ping

    2008-01-01

    To improve the precision and reliability in predicting the methane hazard in the working face of a coal mine, we have proposed a forecasting and forewarning model for methane hazard based on the least squares support vector machine (LS-SVM) multi-classifier and regression machine. For the forecasting model, the methane concentration can be considered as a nonlinear time series, and time series analysis is adopted to predict the change in methane concentration using LS-SVM regression. For the forewarning model, which builds on the forecasting results, the LS-SVM multi-classification method classifies the methane hazard into four grades: normal, attention, warning and danger. According to the forewarning results, corresponding measures are taken. The model was used to forecast and forewarn the K9 working face. The results show that the LS-SVM regression forecasting has high precision and that the forewarning results based on an LS-SVM multi-classifier are credible. Therefore, it is an effective model-building method for continuous prediction of methane concentration and hazard forewarning in the working face.
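
    As a rough, self-contained illustration of the two-stage scheme (a sketch under stated assumptions, not the authors' code): scikit-learn has no LS-SVM estimator, so KernelRidge, which solves a closely related least-squares kernel problem, stands in for LS-SVM regression, and the four forewarning grades are assigned by thresholds invented here.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
# Synthetic methane concentration series (%), a stand-in for sensor readings.
series = 0.6 + 0.2 * np.sin(np.arange(300) / 10.0) + 0.02 * rng.standard_normal(300)

# Time-series embedding: predict c[t] from the previous p readings.
p = 5
X = np.array([series[i:i + p] for i in range(len(series) - p)])
y = series[p:]

model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1.0).fit(X[:-50], y[:-50])
forecast = model.predict(X[-50:])

# Forewarning step: map forecast concentration to the four grades named above.
# Thresholds are illustrative, not the paper's calibrated values.
grade = np.digitize(forecast, bins=[0.5, 0.75, 1.0])  # 0=normal ... 3=danger
print(forecast[:3], grade[:3])
```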

  12. A forecasting and forewarning model for methane hazard in working face of coal mine based on LS-SVM

    Shu-gang Cao; Yan-bao Liu; Yan-ping Wang [Chongqing University, Chongqing (China). Key Laboratory for the Exploitation of Southwest Resources and the Environmental Disaster Control Engineering, Ministry of Education

    2008-06-15

    To improve the precision and reliability in predicting the methane hazard in a working face of a coal mine, a forecasting and forewarning model for methane hazard is proposed based on the least squares support vector machine (LS-SVM) multi-classifier and regression machine. For the forecasting model, the methane concentration can be considered as a nonlinear time series, and time series analysis is adopted to predict the change in methane concentration using LS-SVM regression. For the forewarning model, which builds on the forecasting results, the LS-SVM multi-classification method classifies the methane hazard into four grades: normal, attention, warning and danger. According to the forewarning results, corresponding measures are taken. The model was used to forecast and forewarn the K9 working face. The results show that the LS-SVM regression forecasting has high precision and that the forewarning results based on an LS-SVM multi-classifier are credible. Therefore, it is an effective model-building method for continuous prediction of methane concentration and hazard forewarning in the working face. 20 refs., 2 figs., 3 tabs.

  13. Multiprocessing and Correction Algorithm of 3D-models for Additive Manufacturing

    Anamova, R. R.; Zelenov, S. V.; Kuprikov, M. U.; Ripetskiy, A. V.

    2016-07-01

    This article addresses matters related to additive manufacturing preparation. A layer-by-layer model representation was developed on the basis of a routing method, and methods for correcting errors in the layer-by-layer representation were developed. A multiprocessing algorithm for forming an additive manufacturing batch file was implemented.

  14. Validation analysis of probabilistic models of dietary exposure to food additives.

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent brands or the per cent eating occasions within a food group that contained an additive. Since the three model components assumed two possible modes of input, the validity of eight (2³) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty. PMID:14555358
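
    The model structure described above lends itself to a compact Monte Carlo sketch: per eating occasion, intake is food amount × presence indicator × concentration. All distribution parameters below are invented for illustration; the study derives them from a brand-level reference database.

```python
import numpy as np

rng = np.random.default_rng(1)
n_consumers, n_occasions = 10_000, 14             # 14 eating occasions per week

food_g = rng.lognormal(mean=4.0, sigma=0.5, size=(n_consumers, n_occasions))
present = rng.random((n_consumers, n_occasions)) < 0.30   # additive present?
conc_mg_g = rng.lognormal(mean=-3.0, sigma=0.4, size=(n_consumers, n_occasions))

# Weekly total divided by 7 gives a daily intake estimate per consumer.
daily_mg = (food_g * present * conc_mg_g).sum(axis=1) / 7.0
print("P95 intake (mg/day):", round(float(np.percentile(daily_mg, 95)), 2))
```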

  15. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunami, because of the proximity between lakes, rivers, sea shores and potential instabilities. The concentration of the population and infrastructures on the water body shores and downstream valleys could lead to catastrophic consequences. In order to assess this phenomenon comprehensively, together with the induced risks, we have developed a tool which allows the construction of the landslide geometry, and which is able to simulate its propagation, the generation and propagation of the wave, and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab© environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is performed by the combination of the two latter sets of equations. The intensity map is based on the criterion of flooding in Switzerland provided by the OFEG and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for realistic construction of the geometrical model. In less-known cases, various failure plane geometries can be automatically built within a given range and thus a multi-scenario approach is used. In any case, less-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi
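
    For the wave step, a one-dimensional toy version of the shallow water equations with the Lax-Friedrichs scheme named above (flat bed, reflective walls, invented initial hump; the tool itself works in 2D with wet/dry transitions) might look like this:

```python
import numpy as np

g, dx, dt = 9.81, 10.0, 0.2          # gravity, grid spacing (m), time step (s)
h = np.ones(200); h[90:110] = 2.0    # water depth with an initial hump (the "wave")
hu = np.zeros(200)                   # momentum h*u

def flux(h, hu):
    """Fluxes of the conservative shallow water system [h, hu]."""
    u = hu / h
    return hu, hu * u + 0.5 * g * h**2

for _ in range(100):
    f1, f2 = flux(h, hu)
    # Lax-Friedrichs update: average of neighbours minus centered flux difference.
    h[1:-1] = 0.5 * (h[2:] + h[:-2]) - dt / (2 * dx) * (f1[2:] - f1[:-2])
    hu[1:-1] = 0.5 * (hu[2:] + hu[:-2]) - dt / (2 * dx) * (f2[2:] - f2[:-2])
    hu[0] = hu[-1] = 0.0             # reflective boundaries

print(h.max(), h.min())
```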

  16. Long-Term Slip History Discriminates Among Occurrence Models for Seismic Hazard Assessment

    Fitzenz, D. D.; Ferry, M. A.; Jalobeanu, A.

    2010-12-01

    Today, the probabilistic seismic hazard assessment (PSHA) community relies on one or a combination of stochastic models to compute occurrence probabilities for large earthquakes. Considerable efforts have been devoted to extracting the maximum information from long catalogues of large earthquakes (CLE) based on instrumental, historical, archeological and paleoseismological data (Biasi et al., 2009; Parsons, 2008; Rhoades and Van Dissen, 2003). However, the models remain insufficiently constrained by these rare single-event data. Therefore, the selection of the models and their respective weights is necessarily left to the appreciation of a panel of experts (WGCEP, 2003). Since cumulative slip data with high temporal and spatial resolution are now available, we propose here a new approach to incorporate these pieces of evidence of mid- to long-term fault behavior into the next generation of PSHA: the Cumulative Offset-Based Bayesian Recurrence Analysis (COBBRA). Applied to the Jordan Valley segment of the Dead Sea Fault, the method yields the best combination of occurrence models for full-segment ruptures given the available single-event and cumulative data. Not only does our method provide data-driven, objective weights for the competing models, but it also makes it possible to rule out time-independence and to compute the cumulative probability of occurrence of the next full-segment event reflecting all available data. References: Biasi, G. P. & Weldon, R. J., II. Bull. Seism. Soc. Am. 99, 471-498, doi:10.1785/0120080287 (2009). Parsons, T., J. Geophys. Res., 113, doi:10.1029/2007JB004998 (2008). Rhoades, D. A., and R. J. V. Dissen, New Zealand Journal of Geology & Geophysics, 46, 479-488 (2003). Working Group On California Earthquake Probabilities. Earthquake Probabilities in the San Francisco Bay Region: 2002-2031 (2003).

  17. Structured Additive Synthesis: Towards a Model of Sound Timbre and Electroacoustic Music Forms

    Desainte-Catherine, M.; Marchand, Sylvain

    1999-01-01

    We have developed a sound model used for exploring sound timbre. This model is called Structured Additive Synthesis, or SAS for short. It has the flexibility of additive synthesis while addressing the fact that basic additive synthesis is extremely difficult to use directly for creating and editing sounds. SAS consists of a complete abstraction of sounds according to only four parameters: amplitude, frequency, color, and warping. These parameters are inspired by the vocabulary of composers of elec...
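
    For readers unfamiliar with the underlying layer that SAS abstracts, plain additive synthesis is simply a sum of sinusoidal partials; a minimal sketch with two static partials follows (SAS's color and warping parameters are not modeled here).

```python
import numpy as np

sr, dur = 44_100, 1.0                             # sample rate (Hz), duration (s)
t = np.arange(int(sr * dur)) / sr
partials = [(0.8, 440.0), (0.3, 880.0)]           # (amplitude, frequency in Hz)

# Additive synthesis: the signal is the sum of its sinusoidal partials.
signal = sum(a * np.sin(2 * np.pi * f * t) for a, f in partials)
print(signal[:4])
```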

  18. Effects of Additional Foods to Predators on Nutrient-Consumer-Predator Food Chain Model

    Banshidhar Sahoo

    2012-01-01

    We have proposed a nutrient-consumer-predator model with additional food to predator, at variable nutrient enrichment levels. The boundedness property and the conditions for local stability of boundary and interior equilibrium points of the system are derived. Bifurcation analysis is done with respect to quality and quantity of additional food and consumer’s death rate for the model. The system has stable as well as unstable dynamics depending on supply of additional food to predator. This mo...

  19. Modeling and hazard mapping of complex cascading mass movement processes: the case of glacier lake 513, Carhuaz, Peru

    Schneider, Demian; Huggel, Christian; García, Javier; Ludeña, Sebastian; Cochachin, Alejo

    2013-04-01

    The Cordilleras in Peru are especially vulnerable to, and affected by, impacts from climate change. Local communities and cities often lie directly within the reach of major hazards such as lake outburst floods (aluviones), mud-/debris flows (huaycos) or large rock-/ice avalanches. These hazards have repeatedly and strongly affected the region over the last century, and thousands of people have been killed. One of the most recent events in the Cordillera Blanca occurred on 11 April 2010, when a rock/ice avalanche from the top of Hualcán mountain, NE of the town of Carhuaz, impacted glacier lake 513 (Laguna 513), caused displacement waves and triggered an outburst flood wave. The flow repeatedly transformed between debris flow and hyperconcentrated flow and eventually caused significant damage in Carhuaz. This event motivated early warning and prevention efforts to reduce risks related to ice/rock avalanches and glacier lake outburst floods (GLOF). One of the basic components of an early warning system is the assessment, understanding and communication of the relevant hazards and risks. Here we report on the methodology and results of generating GLOF-related hazard maps for Carhuaz based on numerical modeling and field work. This exercise required an advanced concept and implementation of different mass movement models. Specifically, numerical models were applied for simulating avalanche flow, avalanche impact on the lake, displacement wave generation and lake overtopping, and eventually flow propagation of the outburst flood with changing rheology between debris flow and hyperconcentrated flow. We adopted a hazard mapping procedure slightly adjusted from guidelines developed in Switzerland and in the Andes region. A methodology has thereby been developed to translate results from numerical mass movement modeling into hazard maps. The resulting hazard map was verified and adjusted during field work. This study shows

  20. Reliability estimation and remaining useful lifetime prediction for bearing based on proportional hazard model

    王鹭; 张利; 王学芝

    2015-01-01

    As the central component of rotating machines, the performance reliability assessment and remaining useful lifetime prediction of bearings are of crucial importance in condition-based maintenance to reduce the maintenance cost and improve reliability. A prognostic algorithm to assess the reliability and forecast the remaining useful lifetime (RUL) of bearings was proposed, consisting of three phases. Online vibration and temperature signals of bearings in normal state were measured during the manufacturing process, and the most useful time-dependent features of the vibration signals were extracted based on correlation analysis (feature selection step). Time series analysis based on a neural network, as an identification model, was used to predict the features of bearing vibration signals at any horizon (feature prediction step). Furthermore, a degradation factor was defined according to the features. The proportional hazards model was generated to estimate the survival function and forecast the RUL of the bearing (RUL prediction step). The positive results demonstrate the plausibility and effectiveness of the proposed approach, which can facilitate bearing reliability estimation and RUL prediction.
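
    A hedged sketch of the final RUL-prediction step, assuming the degradation feature has already been extracted and forecast (phases 1-2 above); it uses the lifelines implementation of the proportional hazards model, and all column names and values are invented.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Historical run-to-failure records: lifetime (h), failure flag, degradation factor.
df = pd.DataFrame({
    "lifetime_h":  [310, 450, 290, 520, 610, 380],
    "failed":      [1,   1,   1,   0,   1,   1],
    "degradation": [0.82, 0.55, 0.91, 0.40, 0.35, 0.70],
})

cph = CoxPHFitter().fit(df, duration_col="lifetime_h", event_col="failed")

# Survival curve for a bearing whose forecast degradation factor is 0.75;
# the RUL can be read off as, e.g., the median of this curve.
new = pd.DataFrame({"degradation": [0.75]})
surv = cph.predict_survival_function(new)
print(cph.predict_median(new))
```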

  1. Extended FRAM by Integrating with Model Checking to Effectively Explore Hazard Evolution

    Guihuan Duan

    2015-01-01

    Full Text Available Functional Resonance Analysis Method (FRAM), which defines a systemic framework to model complex systems from the perspective of function and views accidents as an emergent phenomenon of functions' variability, is playing an increasingly significant role in the development of systemic accident theory. However, as FRAM is typically taken as a theoretic method, there is a lack of specific approaches or supportive tools to bridge the theory and practice. To fill the gap and contribute to the development of FRAM, (1) function variability was described further, with the rules of interaction among the variability of different functions being determined, and (2) the technology of model checking (MC) was used for the analysis of function variability to automatically search the potential paths that could lead to hazards. By means of MC, the system's behaviors (normal or abnormal) are simulated, and the counterexample(s) that violate the safety constraints and requirements can be provided, if any exist, to improve the system design. The extended FRAM approach was applied to a typical air accident analysis, yielding more detail than the conclusions in the accident report issued officially by Agenzia Nazionale per la Sicurezza del Volo (ANSV).

  2. Class 1 Permit Modification Notification Addition of Structures within Technical Area 54, Area G, Pad 11, Dome 375 Los Alamos National Laboratory Hazardous Waste Facility Permit, July 2012

    The purpose of this letter is to notify the New Mexico Environment Department-Hazardous Waste Bureau (NMED-HWB) of a Class 1 Permit Modification to the Los Alamos National Laboratory (LANL) Hazardous Waste Facility Permit issued to the Department of Energy (DOE) and Los Alamos National Security, LLC (LANS) in November 2010. The modification adds structures to the container storage unit at Technical Area (TA) 54, Area G, Pad 11. Permit Section 3.1(3) requires that changes to the location of a structure that does not manage hazardous waste shall be made within the Permit as a Class 1 modification without prior approval, in accordance with Code of Federal Regulations, Title 40 (40 CFR), §270.42(a)(1). Structures have been added within Dome 375, located at TA-54, Area G, Pad 11, that will be used in support of waste management operations within Dome 375 and the modular panel containment structure located within Dome 375, but will not be used as waste management structures. The Class 1 Permit Modification revises Figure 36 in Attachment N, Figures; and Figure G.12-1 in Attachment G.12, Technical Area 54, Area G, Pad 11 Outdoor Container Storage Unit Closure Plan. Descriptions of the structures have also been added to Section A.4.2.9 in Attachment A, TA - Unit Descriptions; and Section 2.0 in Attachment G.12, Technical Area 54, Area G, Pad 11 Outdoor Container Storage Unit Closure Plan. A full description of the permit modification and the necessary changes are included in Enclosure 1. The modification has been prepared in accordance with 40 CFR §270.42(a)(1). This package includes this letter and an enclosure containing a description of the permit modification, text edits of the Permit sections, and the revised figures (collectively LA-UR-12-22808). Accordingly, a signed certification page is also enclosed. Three hard copies and one electronic copy of this submittal will be delivered to the NMED-HWB.

  3. Bankruptcy prediction: static logit and discrete hazard models incorporating macroeconomic dependencies and industry effects

    Sheikh, Suleman; Yahya, Muhammad

    2015-01-01

    In this thesis, we present firm default prediction models based on firm financial statements and macroeconomic variables. We seek to develop reliable models to forecast out-of-sample default probability, and we are particularly interested in exploring the impact of incorporating macroeconomic variables and industry effects. To the best of our knowledge, this is the first study to account for both macroeconomic dependencies and industry effects in one analysis. Additionally, we ...

  4. Time Series Forecasting by using Seasonal Autoregressive Integrated Moving Average: Subset, Multiplicative or Additive Model

    Suhartono

    2011-01-01

    Full Text Available Problem statement: Most Seasonal Autoregressive Integrated Moving Average (SARIMA) models used for forecasting seasonal time series are multiplicative SARIMA models. These models assume that there is a significant parameter resulting from the multiplication between nonseasonal and seasonal parameters, without testing this by an appropriate statistical test. Moreover, the most popular statistical software packages, such as MINITAB and SPSS, only provide facilities for fitting a multiplicative model. The aim of this research is to propose a new procedure for identifying the most appropriate order of SARIMA model, whether it involves subset, multiplicative or additive order. In particular, the study examined whether a multiplicative parameter existed in the SARIMA model. Approach: The theoretical Autocorrelation (ACF) and Partial Autocorrelation (PACF) functions of subset, multiplicative and additive SARIMA models were first derived, and the R program was used to create graphics of these theoretical ACF and PACF. Then, two monthly datasets were used as case studies, i.e., the international airline passenger data and a series on the number of tourist arrivals to Bali, Indonesia. The model identification step to determine the order of the ARIMA model was done using the MINITAB program, and the model estimation step used the SAS program to test whether the model consisted of subset, multiplicative or additive order. Results: The theoretical ACF and PACF showed that subset, multiplicative and additive SARIMA models have different patterns, especially at the lags resulting from multiplication between non-seasonal and seasonal lags. Modeling of the airline data yielded a subset SARIMA model as the best model, whereas an additive SARIMA model is the best model for forecasting the number of tourist arrivals to Bali. Conclusion: Both case studies showed that a multiplicative SARIMA model was not the best model for forecasting these data. The comparison evaluation showed that subset
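
    As a hedged illustration of the estimation step (the study itself uses MINITAB and SAS), the multiplicative SARIMA(0,1,1)(0,1,1)12 "airline model" can be fitted to the airline passenger data and its parameters tested with statsmodels:

```python
import numpy as np
import statsmodels.api as sm

# Log of the classic airline passenger series used in the study.
y = np.log(sm.datasets.get_rdataset("AirPassengers").data["value"].to_numpy())

# Multiplicative SARIMA(0,1,1)(0,1,1)_12.
res = sm.tsa.SARIMAX(y, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)

# z-tests on the nonseasonal MA(1) and seasonal MA(12) coefficients; the implied
# multiplicative cross term (their product) can then be checked against a
# subset/additive specification, as the procedure above recommends.
print(res.summary().tables[1])
```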

  5. Tsunami hazard assessment along the French Mediterranean coast : detailed modeling of tsunami impacts for the ALDES project

    Quentel, E.; Loevenbruck, A.; Hébert, H.

    2012-04-01

    The catastrophic 2004 tsunami drew the international community's attention to tsunami risk in all basins where tsunamis occurred but no warning system exists. Consequently, under the coordination of UNESCO, France decided to create a regional center, called CENALT, for the north-east Atlantic and the western Mediterranean. This warning system, which should be operational by 2012, is set up by the CEA in collaboration with the SHOM and the CNRS. The French authorities are in charge of the top-down alert system, including the local alert dissemination. In order to prepare the appropriate means and measures, they initiated the ALDES (Alerte Descendante) project, to which the CEA also contributes. It aims at examining, along the French Mediterranean coast, the tsunami risk related to earthquakes and landslides. In addition to the evaluation at regional scale, it includes detailed studies of 3 selected sites; the local alert system will be designed for one of them: the French Riviera. In this project, our main task at CEA consists in assessing tsunami hazard related to seismic sources using numerical modeling. Past tsunamis have affected the west Mediterranean coast but are too few and poorly documented to provide a suitable database. Thus, a synthesis of earthquakes representative of the tsunamigenic seismic activity and prone to induce the largest impact on the French coast is performed based on historical data, seismotectonics and first-order models. The North Africa Margin, the Ligurian and the South Tyrrhenian Seas are considered as the main tsunamigenic zones. In order to forecast the most important plausible effects, the magnitudes are estimated by enhancing to some extent the largest known values. Our hazard estimation is based on the simulation of the induced tsunami scenarios performed with the CEA code. The 3 sites have been chosen according to the regional hazard studies, coastal typology elements and the appropriate DTMs (Digital Terrain Models). The

  6. Modeling of Natural Coastal Hazards in Puerto Rico in Support of Emergency Management and Coastal Planning

    Mercado, A., Jr.

    2015-12-01

    The island of Puerto Rico is not only located in the so-called Caribbean hurricane alley, but also in a tsunami-prone region, and both phenomena have affected the island. For the past few years we have undertaken the task of upgrading the available coastal flood maps for storm surges and tsunamis. This has been done taking advantage of new Lidar-derived, high-resolution topography and bathymetry and state-of-the-art models (MOST for tsunamis and ADCIRC/SWAN for storm surges). The tsunami inundation maps have been converted into evacuation maps. For tsunamis we are also preparing hazard maps for tsunami currents inside ports, bays, and marinas. The storm surge maps include two scenarios of sea level rise: 0.5 and 1.0 m above Mean High Water. All maps have been adopted by the Puerto Rico State Emergency Management Agency and are publicly available through the Internet. The purpose of this presentation is to summarize how this has been done, the spin-off applications it has generated, and how we plan to improve coastal flooding predictions.

  7. On the predictive information criteria for model determination in seismic hazard analysis

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed a Bayesian analysis of Italian data sets using four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate their performance as evaluated by the Bayes factor, predictive information criteria and retrospective predictive analysis.
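
    For reference, the Watanabe (2010) criterion cited above has the following standard form, where lppd is the log pointwise predictive density computed from S posterior draws θ^(s) for data y_1, ..., y_n:

```latex
\mathrm{lppd} = \sum_{i=1}^{n} \log\!\Big( \frac{1}{S} \sum_{s=1}^{S} p\big(y_i \mid \theta^{(s)}\big) \Big),
\qquad
p_{\mathrm{WAIC}} = \sum_{i=1}^{n} \operatorname{Var}_{s}\!\big[ \log p\big(y_i \mid \theta^{(s)}\big) \big],
\qquad
\mathrm{WAIC} = -2\,\big( \mathrm{lppd} - p_{\mathrm{WAIC}} \big).
```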

  8. Evaluation of model-predicted hazardous air pollutants (HAPs) near a mid-sized U.S. airport

    Vennam, Lakshmi Pradeepa; Vizuete, William; Arunachalam, Saravanan

    2015-10-01

    Accurate modeling of aircraft-emitted pollutants in the vicinity of airports is essential to study the impact on local air quality and to answer policy- and health-impact-related questions. To quantify the air quality impacts of airport-related hazardous air pollutants (HAPs), we carried out a fine-scale (4 × 4 km horizontal resolution) Community Multiscale Air Quality (CMAQ) model simulation at the T.F. Green airport in Providence (PVD), Rhode Island. We considered temporally and spatially resolved aircraft emissions from the new Aviation Environmental Design Tool (AEDT). These model predictions were then evaluated with observations from a field campaign focused on assessing HAPs near the PVD airport. The annual normalized mean error (NME) was in the range of 36-70% for all HAPs except acrolein (>70%). The addition of highly resolved aircraft emissions showed only marginally incremental improvements in performance (1-2% decrease in NME) for some HAPs (formaldehyde, xylene). When compared to a coarser 36 × 36 km grid resolution, the 4 × 4 km grid resolution did improve performance by up to 5-20% NME for formaldehyde and acetaldehyde. The change in power setting (from the traditional International Civil Aviation Organization (ICAO) 7% to the observation-based 4%) doubled the aircraft idling emissions of HAPs, but led to only a 2% decrease in NME. Overall, modeled aircraft-attributable contributions are in the range of 0.5-28% near a mid-sized airport grid-cell, with maximum impacts seen only within 4-16 km of the airport grid-cell. Comparison of CMAQ predictions with HAP estimates from EPA's National Air Toxics Assessment (NATA) did show similar annual mean concentrations and equally poor performance. Current estimates of HAPs for PVD are a challenge for modeling systems, and refinements in our ability to simulate aircraft emissions have made only incremental improvements. Even with unrealistic increases in HAPs aviation emissions the model
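
    The evaluation statistic quoted above is straightforward to compute; a small sketch with invented concentrations:

```python
import numpy as np

def normalized_mean_error(model, obs):
    """NME (%) = 100 * sum(|model - obs|) / sum(obs), over paired values."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return 100.0 * np.abs(model - obs).sum() / obs.sum()

# e.g. modelled vs observed formaldehyde (invented values, ug/m3):
print(normalized_mean_error([2.1, 1.4, 3.0], [1.8, 1.9, 2.5]))
```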

  9. Simulating floods : On the application of a 2D-hydraulic model for flood hazard and risk assessment

    Alkema, D.

    2007-01-01

    Over the last decades, river floods in Europe seem to occur more frequently and are causing more and more economic and emotional damage. Understanding the processes causing flooding and the development of simulation models to evaluate countermeasures to control that damage are important issues. This study deals with the application of a 2D hydraulic flood propagation model for flood hazard and risk assessment. It focuses on two components: 1) how well does it predict the spatial-dynamic chara...

  10. A framework for modeling clustering in natural hazard catastrophe risk management and the implications for re/insurance loss perspectives

    Khare, S.; Bonazzi, A.; Mitas, C.; Jewson, S.

    2014-01-01

    In this paper, we present a novel framework for modelling clustering in natural hazard risk models. The framework we present is founded on physical principles, where large-scale oscillations in the physical system are the source of non-Poissonian (clustered) frequency behaviour. We focus on a particular mathematical implementation of the "Super-Cluster" methodology that we introduce. This mathematical framework has a number of advantages including tunability

  11. Predictive Modeling of Chemical Hazard by Integrating Numerical Descriptors of Chemical Structures and Short-term Toxicity Assay Data

    Rusyn, Ivan; Sedykh, Alexander; Low, Yen; Guyton, Kathryn Z.; Tropsha, Alexander

    2012-01-01

    Quantitative structure-activity relationship (QSAR) models are widely used for in silico prediction of in vivo toxicity of drug candidates or environmental chemicals, adding value to candidate selection in drug development or in a search for less hazardous and more sustainable alternatives for chemicals in commerce. The development of traditional QSAR models is enabled by numerical descriptors representing the inherent chemical properties that can be easily defined for any number of molecules...

  12. Numerical Stress Field Modelling: from geophysical observations toward volcano hazard assessment

    Currenti, Gilda; Coco, Armando; Privitera, Emanuela

    2015-04-01

    . Numerical results show the contribution of groundwater head gradients associated with topographically induced flow and pore-pressure changes, providing a quantitative estimate for deformation and failure of volcano edifice. The comparison between the predictions of the model and the observations can provide valuable insights about the stress state of the volcano and, hence, about the likelihood of an impending eruption. This innovative approach opens up new perspectives in geodetic inverse modelling and poses the basis for future development in a volcano hazard assessment based on a critical combination of geophysical observations and numerical modelling.

  13. Modelling poverty by not modelling poverty: an application of a simultaneous hazards approach to the UK

    Aassve, Arnstein; Burgess, Simon; Dickson, Matt; Propper, Carol

    2006-01-01

    We pursue an economic approach to analysing poverty. This requires a focus on the variables that individuals can influence, such as forming or dissolving a union or having children. We argue that this indirect approach to modelling poverty is the right way to bring economic tools to bear on the issue. In our implementation of this approach, we focus on endogenous demographic and employment transitions as the driving forces behind changes in poverty. We construct a dataset covering event histo...

  14. Considering the Epistemic Uncertainties of the Variogram Model in Locating Additional Exploratory Drillholes

    Saeed Soltani

    2015-06-01

    Full Text Available To enhance the certainty of the grade block model, it is necessary to increase the number of exploratory drillholes and collect more data from the deposit. The inputs of the process of locating these additional drillholes include the variogram model parameters, the locations of the samples taken from the initial drillholes, and the geological block model. The uncertainties of these inputs will lead to uncertainties in the optimal locations of additional drillholes. The locations of the initial data are crisp, but the variogram model parameters and the geological model carry uncertainties due to the limited number of initial data. In this paper, effort has been made to consider the effects of variogram uncertainties on the optimal location of additional drillholes using fuzzy kriging, and to solve the locating problem with the genetic algorithm (GA) optimization method. A bauxite deposit case study demonstrates the efficiency of the proposed model.
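
    A greatly simplified stand-in for this workflow (the study uses fuzzy kriging plus a GA; here the variogram range is treated as an uncertain interval, candidate collars are scored by a distance-based kriging-variance proxy under the worst case, and the search is exhaustive rather than genetic):

```python
import numpy as np

rng = np.random.default_rng(2)
samples = rng.uniform(0, 1000, size=(40, 2))      # existing drillhole collars (m)

# Candidate collar locations on a regular grid.
g = np.linspace(0, 1000, 21)
candidates = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)

def variance_proxy(pt, a, sill=1.0):
    """Spherical-variogram-style proxy: grows with distance to nearest sample."""
    h = min(np.linalg.norm(samples - pt, axis=1).min() / a, 1.0)
    return sill * (1.5 * h - 0.5 * h**3)

ranges = (150.0, 300.0)                           # uncertain interval for the range a
scores = [min(variance_proxy(c, a) for a in ranges) for c in candidates]
print("best additional collar location:", candidates[int(np.argmax(scores))])
```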

  15. Flow-R, a model for susceptibility mapping of debris flows and other gravitational hazards at a regional scale

    P. Horton

    2013-04-01

    Full Text Available The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org) and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, which is less sensitive to small variations of the DEM, avoids over-channelization, and thus produces more realistic extents. The choices of the datasets and the algorithms are open to the user, which makes the model adaptable to various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is strictly needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution as a good compromise between processing time
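
    A sketch of Holmgren's multiple-flow-direction weighting, the basis of the modified spreading algorithm described above (the Flow-R modifications, such as the added smoothing height, are omitted for brevity):

```python
import numpy as np

def holmgren_weights(z_center, z_neighbors, distances, x=4.0):
    """Fraction of flow routed to each of the 8 neighbours: w_i ~ (tan b_i)^x.
    x = 1 gives the classic FD8 multiple-flow algorithm; large x tends to
    single-direction D8 flow."""
    slopes = (z_center - np.asarray(z_neighbors, float)) / np.asarray(distances, float)
    slopes = np.clip(slopes, 0.0, None)      # route only to downslope neighbours
    w = slopes ** x                          # slope = tan(beta) on a raster
    total = w.sum()
    return w / total if total > 0 else w     # all-zero on pits and flats

d = 10.0                                     # cell size; diagonals are d*sqrt(2)
print(holmgren_weights(105.0,
                       [104.0, 103.0, 105.0, 106.0, 102.0, 104.5, 105.0, 107.0],
                       [d, d*2**0.5, d, d*2**0.5, d, d*2**0.5, d, d*2**0.5]))
```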

  16. A spatiotemporal optimization model for the evacuation of the population exposed to flood hazard

    Alaeddine, H.; Serrhini, K.; Maizia, M.

    2015-03-01

    Managing the crisis caused by natural disasters, and especially by floods, requires the development of effective evacuation systems. An effective evacuation system must take into account certain constraints, including those related to the traffic network, accessibility, human resources and material equipment (vehicles, collecting points, etc.). The main objective of this work is to provide assistance to technical services and rescue forces in terms of accessibility by offering itineraries for the rescue and evacuation of people and property. We consider in this paper the evacuation of a medium-sized urban area exposed to flood hazard. In case of inundation, most people will be evacuated using their own vehicles. Two evacuation types are addressed in this paper: (1) a preventive evacuation based on a flood forecasting system and (2) an evacuation during the disaster based on flooding scenarios. The two study sites to which the developed evacuation model is applied are the Tours valley (Fr, 37), which is protected by a set of dikes (preventive evacuation), and the Gien valley (Fr, 45), which benefits from a low rate of flooding (evacuation before and during the disaster). Our goal is to construct, for each of these two sites, a chronological evacuation plan, i.e., computing for each individual the departure date and the path to reach the assembly point (also called shelter) according to a priority list established for this purpose. The evacuation plan must avoid congestion on the road network. Here we present a spatiotemporal optimization model (STOM) dedicated to the evacuation of populations exposed to natural disasters, and more specifically to flood risk.

  17. Regularization for Generalized Additive Mixed Models by Likelihood-Based Boosting

    Groll, Andreas; Tutz, Gerhard

    2012-01-01

    With the emergence of semi- and nonparametric regression the generalized linear mixed model has been expanded to account for additive predictors. In the present paper an approach to variable selection is proposed that works for generalized additive mixed models. In contrast to common procedures it can be used in high-dimensional settings where many covariates are available and the form of the influence is unknown. It is constructed as a componentwise boosting method and hence is able to pe...

  18. Numerical modeling of debris avalanches at Nevado de Toluca (Mexico): implications for hazard evaluation and mapping

    Grieco, F.; Capra, L.; Groppelli, G.; Norini, G.

    2007-05-01

    The present study concerns the numerical modeling of debris avalanches on Nevado de Toluca volcano (Mexico) using the TITAN2D simulation software, and its application to the creation of hazard maps. Nevado de Toluca is an andesitic to dacitic stratovolcano of Late Pliocene-Holocene age, located in central Mexico near the cities of Toluca and Mexico City; its past activity has endangered an area inhabited today by more than 25 million people. The present work is based upon data collected during extensive field work aimed at producing the geological map of Nevado de Toluca at 1:25,000 scale. The activity of the volcano developed from 2.6 Ma until 10.5 ka with both effusive and explosive events; Nevado de Toluca has had long phases of inactivity characterized by erosion and emplacement of debris flow and debris avalanche deposits on its flanks. The largest epiclastic events in the history of the volcano are wide debris flows and debris avalanches that occurred between 1 Ma and 50 ka, during a prolonged hiatus in eruptive activity. Other minor events occurred mainly during the most recent volcanic activity (less than 50 ka), characterized by magmatic and tectonically induced instability of the summit dome complex. According to the most recent tectonic analysis, the active transtensive kinematics of the E-W Tenango Fault System had a strong influence on the preferential directions of the last three documented lateral collapses, which generated the Arroyo Grande and Zaguán debris avalanche deposits towards the east and the Nopal debris avalanche deposit towards the west. The analysis of the data collected during the field work made it possible to create a detailed GIS database of the spatial and temporal distribution of debris avalanche deposits on the volcano. Flow models, performed with the TITAN2D software developed by GMFG at Buffalo, were based entirely upon the information stored in the geological database. The modeling software is built upon equations

  19. The chaos and control of a food chain model supplying additional food to top-predator

    Highlights: • We propose a chaotic food chain model supplying additional food to the top predator. • Local and global stability conditions are derived in the presence of additional food. • Chaos is controlled only by increasing the quantity of additional food. • The system enters a periodic region and exhibits Hopf bifurcations when additional food is supplied. • This is an application of non-chemical methods for controlling chaos. -- Abstract: The control and management of chaotic populations is one of the main objectives of constructing mathematical models in ecology today. In this paper, we apply a technique for controlling chaotic predator–prey population dynamics by supplying additional food to the top predator. We formulate a three-species predator–prey model supplying additional food to the top predator. Existence conditions and local stability criteria of the equilibrium points are determined analytically. Persistence conditions for the system are derived. Global stability conditions for the interior equilibrium point are calculated. Theoretical results are verified through numerical simulations. Phase diagrams are presented for various quality and quantity of additional food. One-parameter bifurcation analysis is done with respect to the quality and quantity of additional food separately, keeping the other fixed. Using the MATCONT package, we derive the bifurcation scenarios when the quality and quantity of additional food vary together. We predict the existence of Hopf points (H), limit points (LP) and branch points (BP) in the model for a suitable supply of additional food. We have computed the regions of different dynamical behaviour in the quantity–quality parametric plane. From our study we conclude that the chaotic population dynamics of a predator–prey system can be controlled to obtain regular population dynamics simply by supplying additional food to the top predator. This study aims to introduce a new non-chemical chaos-control mechanism in a predator–prey system with the
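
    Purely for illustration, a three-species chain of this general kind can be integrated with SciPy; the functional forms and parameter values below are generic additional-food-style terms, not the paper's equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

def chain(t, s, A=0.5, alpha=1.0):
    """A = quantity, alpha = quality of additional food to the top predator."""
    x, y, z = s
    g1 = 5.0 * x / (1.0 + 3.0 * x)                # consumer feeding on resource
    g2 = y / (1.0 + y + alpha * A)                # predation, diluted by extra food
    dx = x * (1.0 - x) - g1 * y
    dy = 0.5 * g1 * y - g2 * z - 0.1 * y
    dz = 0.4 * (y + alpha * A) / (1.0 + y + alpha * A) * z - 0.1 * z
    return [dx, dy, dz]

sol = solve_ivp(chain, (0, 2000), [0.8, 0.2, 0.5], max_step=0.5)
print(sol.y[:, -1])   # vary A and alpha to explore the resulting dynamics
```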

  20. Modelling of Hazards Effect on Safety Integrity of Open Transmission Systems

    Karol Rástočný; Mária Franeková; Peter Holečko; Iveta Zolotová

    2016-01-01

    The paper is concerned with the safety appraisal of safety-related communication systems (SRComSs) with an open transmission system, where, in addition to message transmission integrity, confidentiality is also recommended to be provided. The authors focus on the safety analysis of safety-related message transmission secured using cryptographic and safety-code mechanisms, and on the possibilities of modelling a safety-related industrial communication system, where a high safety integrity level SIL3

  1. Do female researchers face a glass ceiling in France? A hazard model of promotions

    Sabatier, Mareva; Carrère, Myriam

    2010-01-01

    Abstract The present article examines whether French female researchers face a glass ceiling, an invisible barrier to promotion. Using an original database from the National Institute for Agricultural Research, we estimate duration models for promotions. The methodology used allowed us to take into account censored observations and unobserved heterogeneity. Our results show a significant gender effect that does not contradict the glass-ceiling hypothesis. In addition, factors that ...

  2. Model of quasi-ideal cascade with an additional feed flow and losses of working substances

    A mathematical model for the quasi-ideal cascade with an additional feed flow and losses of working substance was established. Analytical relations for calculating the relative product and waste flows, the component concentrations in the product and waste flows, and the total substance flow in this cascade model were obtained by solving the cascade equations. Cascade calculations were performed for the separation of recycled uranium. The effects of the loss factor and of the ratio between the base and additional flows on the product concentration were analyzed for a cascade in which natural uranium was fed as the base feed flow and recycled uranium as the additional one. (authors)

  3. ADDITIVE-MULTIPLICATIVE MODEL FOR RISK ESTIMATION IN THE PRODUCTION OF ROCKET AND SPACE TECHNICS

    Orlov A. I.

    2014-10-01

    Full Text Available For the first time we have developed a general additive-multiplicative model of risk estimation (to estimate the probabilities of risk events). In the two-level system, at the lower level the risk estimates are combined additively, and at the top level multiplicatively. The additive-multiplicative model was used for risk estimation for (1) implementation of innovative projects at universities (with external partners), (2) the production of new innovative products, and (3) projects for the creation of rocket and space equipment.
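
    A direct, minimal transcription of the two-level structure described above, with invented weights and scores; how the top-level product is interpreted (e.g. as a joint probability) is left to the model.

```python
def additive_multiplicative_risk(groups):
    """groups: lists of elementary risk estimates in [0, 1], one list per
    lower-level group. Estimates are summed within a group (capped at 1)
    and the group scores are combined multiplicatively at the top level."""
    top = 1.0
    for g in groups:
        top *= min(sum(g), 1.0)
    return top

# e.g. three invented risk groups for an innovation project:
print(additive_multiplicative_risk([[0.05, 0.10], [0.20], [0.02, 0.03, 0.01]]))
```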

  4. Digital elevation models in the marine domain: investigating the offshore tsunami hazard from submarine landslides

    Tappin, David R.

    2015-04-01

    the resolution necessary to identify the hazard from landslides, particularly along convergent margins where this hazard is greatest. Multibeam mapping of the deep seabed requires low-frequency sound sources that, because of their correspondingly low resolution, cannot produce the detail required to identify the finest-scale features. In addition, in most countries there are no repeat surveys that would allow seabed changes to be identified; perhaps only Japan has such data. In the near future, as research budgets shrink and ship time becomes ever more expensive, new strategies will have to be used to make the best use of the vessels available. Remote AUV technology is almost certainly the answer, and should be increasingly utilised to map the seabed while the mother ship is better used for other duties, such as sampling or seismic data acquisition. In the deep ocean this will have the advantage of acquiring higher-resolution data from high-frequency multibeams. This talk presents a number of projects that show the evolution of the use of MBES in mapping submarine landslides since the PNG tsunami. Data from PNG are presented, together with data from Japan, Hawaii and the NE Atlantic. New multibeam acquisition methodologies are also discussed.

  5. Modeling the Use of Sulfate Additives for Potassium Chloride Destruction in Biomass Combustion

    Wu, Hao; Pedersen, Morten Nedergaard; Jespersen, Jacob Boll;

    2014-01-01

    Potassium chloride, KCl, formed from biomass combustion may lead to ash deposition and corrosion problems in boilers. Sulfates are effective additives for converting KCl to the less harmful K2SO4 and HCl. In the present study, the rate constants for decomposition of ammonium sulfate and aluminum......-dependent distribution of SO2 and SO3 from ammonium sulfate decomposition. On the basis of these data as well as earlier results, a detailed chemical kinetic model for sulfation of KCl by a range of sulfate additives was established. Modeling results were compared to biomass combustion experiments in a bubbling...... fluidized-bed reactor using ammonium sulfate, aluminum sulfate, and ferric sulfate as additives. The simulation results for ammonium sulfate and ferric sulfate addition compared favorably to the experiments. The predictions for aluminum sulfate addition were only partly in agreement with the experimental...

  6. Assessment of erosion hazard after recurrence fires with the RUSLE 3D MODEL

    Vecín-Arias, Daniel; Palencia, Covadonga; Fernández Raga, María

    2016-04-01

    The objective of this work is to determine whether soil erosion increases after recurrent forest fires in an area. To that end, an area of 22 130 ha in the northwest of the Iberian Peninsula was studied because of its high frequency of fires. The assessment of erosion hazard was calculated at several times using Geographic Information Systems (GIS). The area was divided into several plots according to the number of times they have burnt in the past 15 years. Because a detailed study of such a large area is complex, and because information is not available annually, it was necessary to select the most relevant moments. In August 2012 the most aggressive and extensive fire in the area occurred, so the study focused on the erosion hazard for 2011 and 2014, the dates before and after the 2012 fire for which orthophotos are available. The RUSLE3D model (Revised Universal Soil Loss Equation) was used to calculate erosion loss maps. This model improves on the traditional USLE (Wischmeier and Smith, 1965) because it accounts for the influence of concavity/convexity (Renard et al., 1997) and improves the estimation of the slope factor LS (Renard et al., 1991). It is also one of the most commonly used models in the literature (Mitasova et al., 1996; Terranova et al., 2009). The tools used are free and accessible, using the GIS "gvSIG" (http://www.gvsig.com/es), and the metadata were taken from the Spatial Data Infrastructure of Spain webpage (IDEE, 2016). However, the RUSLE model has many critics, some of whom suggest that it serves only to carry out comparisons between areas, and not for the calculation of absolute soil loss, arguing that in field measurements the actual recovered eroded soil can amount to about one-third of the values obtained with the model (Šúri et al., 2002). The study of the area shows that the error detected by the critics could come from
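
    The core of the RUSLE computation is a cell-by-cell factor product, A = R · K · LS · C · P; a toy raster version with invented values (NumPy arrays standing in for gvSIG layers):

```python
import numpy as np

R  = np.full((4, 4), 800.0)                       # rainfall erosivity
K  = np.full((4, 4), 0.03)                        # soil erodibility
LS = np.array([[1.2], [2.5], [4.0], [6.3]]) * np.ones((1, 4))  # RUSLE3D slope factor
C  = np.where(np.eye(4, dtype=bool), 0.45, 0.05)  # burnt cells lose protective cover
P  = np.ones((4, 4))                              # no support practices

A = R * K * LS * C * P                            # soil loss per cell (t/ha/yr)
print(A.round(1))
```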

  7. System Dynamics Model to develop resilience management strategies for lifelines exposed to natural hazards

    Pagano, Alessandro; Pluchinotta, Irene; Giordano, Raffaele; Vurro, Michele

    2016-04-01

    Resilience has recently become a key concept, and a crucial paradigm in the analysis of the impacts of natural disasters, mainly concerning Lifeline Systems (LS). Indeed, the traditional risk management approaches require a precise knowledge of all potential hazards and a full understanding of the interconnections among different infrastructures, based on past events and trends analysis. Nevertheless, due to the inner complexity of LS, their interconnectedness and the dynamic context in which they operate (i.e. technology, economy and society), it is difficult to gain a complete comprehension of the processes influencing vulnerabilities and threats. Therefore, resilience thinking addresses the complexities of large integrated systems and the uncertainty of future threats, emphasizing the absorbing, adapting and responsive behavior of the system. Resilience thinking approaches are focused on the capability of the system to deal with the unforeseeable. The increasing awareness of the role played by LS, has led governmental agencies and institutions to develop resilience management strategies. Risk prone areas, such as cities, are highly dependent on infrastructures providing essential services that support societal functions, safety, economic prosperity and quality of life. Among the LS, drinking water supply is critical for supporting citizens during emergency and recovery, since a disruption could have a range of serious societal impacts. A very well-known method to assess LS resilience is the TOSE approach. The most interesting feature of this approach is the integration of four dimensions: Technical, Organizational, Social and Economic. Such issues are all concurrent to the resilience level of an infrastructural system, and should be therefore quantitatively assessed. Several researches underlined that the lack of integration among the different dimensions, composing the resilience concept, may contribute to a mismanagement of LS in case of natural disasters

  8. Robust Estimation of Mean and Dispersion Functions in Extended Generalized Additive Models

    Croux, C.; Gijbels, I.; Prosdocimi, I.

    2010-01-01

    Generalized Linear Models are a widely used method to obtain parametric estimates for the mean function. They have been further extended to allow the relationship between the mean function and the covariates to be more flexible via Generalized Additive Models. However the fixed variance structur

  9. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  11. A Proportional Hazards Regression Model for the Subdistribution with Covariates-adjusted Censoring Weight for Competing Risks Data

    He, Peng; Eriksson, Frank; Scheike, Thomas H.; Zhang, Mei Jie

    2016-01-01

    With competing risks data, one often needs to assess the treatment and covariate effects on the cumulative incidence function. Fine and Gray proposed a proportional hazards regression model for the subdistribution of a competing risk with the assumption that the censoring distribution and the...... covariates are independent. Covariate-dependent censoring sometimes occurs in medical studies. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with proper adjustments for covariate-dependent censoring. We consider a covariate-adjusted weight...... approach works well for the variance estimator as well. We illustrate our methods with bone marrow transplant data from the Center for International Blood and Marrow Transplant Research. Here, cancer relapse and death in complete remission are two competing risks....
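
    For reference, in standard notation the subdistribution hazard for cause 1 that underlies the Fine-Gray model is

```latex
\lambda_1(t \mid Z) \;=\; \lim_{\Delta t \to 0} \frac{1}{\Delta t}
  \Pr\!\big\{ t \le T < t + \Delta t,\ \epsilon = 1 \,\big|\,
        T \ge t \ \text{or} \ (T < t,\ \epsilon \ne 1),\ Z \big\}
  \;=\; \lambda_{10}(t)\, e^{\beta^{\top} Z},
\qquad
F_1(t \mid Z) \;=\; 1 - \exp\!\big\{ -\Lambda_1(t \mid Z) \big\},
```

    so the covariate effects act directly on the cumulative incidence function F_1.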

  12. Hazard rate function in dynamic environment

    The hazard rate function is commonly applied to devise maintenance policies, but the usual hazard rate function is computed from failure-time data of systems working in a constant environment, so it cannot be applied directly to systems working in a dynamic environment. In this paper, the hazard rate function of a system in a dynamic environment is computed, and the effects of the current environment status and of the environmental history on the hazard rate function are explicitly presented. For a system with a known degradation process, the hazard rate function is studied via the Markov additive process. The environment evolution is modeled as a stochastic process with two states, one representing a normal environment and the other a severe environment; the system degrades more quickly under the severe environment than under the normal one. The relationship between the hazard rate functions of a system in time-invariant and dynamic environments is investigated, from which three important facts are revealed: firstly, the hazard rate function jumps as the environment jumps; secondly, the form of the hazard rate function is determined by the current environment state; and thirdly, the effective age of the system is determined by the environmental history. For a system with an unknown degradation process, based on the above facts, this paper derives the hazard rate function in a dynamic environment and proposes a method to compute the effective age under a given environmental history. Finally, the optimal maintenance policy for a system in a dynamic environment is studied. - Highlights: ●Compute hazard rate function (HRF) in dynamic environment with the Markov additive process. ●The history of the environment is taken into consideration when computing HRF. ●Compute HRF in dynamic environment with estimated classical HRF. ●Effective age of the system is computed when the environment changes.
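
    A small numerical sketch of the three facts listed in the highlights, with an invented baseline hazard, ageing rates and multipliers (not the paper's model):

```python
import numpy as np

def hazard(t, switch_times, states, h0=lambda age: 0.002 * age,
           aging=(1.0, 2.5), mult=(1.0, 3.0)):
    """Hazard at time t given an environmental history.

    states[i] (0 = normal, 1 = severe) holds on [switch_times[i], switch_times[i+1]).
    The effective age accrues at rate aging[state]; the hazard is the baseline
    h0 at the effective age, scaled by the current state's multiplier."""
    bounds = list(switch_times) + [np.inf]
    age, current = 0.0, states[0]
    for t0, t1, s in zip(bounds[:-1], bounds[1:], states):
        if t < t0:
            break
        age += aging[s] * (min(t, t1) - t0)
        if t0 <= t < t1:
            current = s
    return mult[current] * h0(age)

# A severe spell on [4, 7): the hazard jumps up at t = 4 and drops at t = 7,
# but the extra ageing accrued during the spell persists afterwards.
for t in (3.9, 4.1, 6.9, 7.1):
    print(t, round(hazard(t, [0.0, 4.0, 7.0], [0, 1, 0]), 4))
```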

  13. Between and beyond additivity and non-additivity; the statistical modelling of genotype by environment interaction in plant breeding.

    Eeuwijk, van F.A.

    1996-01-01

    In plant breeding it is a common observation to see genotypes react differently to environmental changes. This phenomenon is called genotype by environment interaction. Many statistical approaches for analysing genotype by environment interaction rely heavily on the analysis of variance model. Genot

  14. A framework for modeling clustering in natural hazard catastrophe risk management and the implications for re/insurance loss perspectives

    S. Khare

    2014-08-01

    In this paper, we present a novel framework for modelling clustering in natural hazard risk models. The framework we present is founded on physical principles, where large-scale oscillations in the physical system are the source of non-Poissonian (clustered) frequency behaviour. We focus on a particular mathematical implementation of the "Super-Cluster" methodology that we introduce. This mathematical framework has a number of advantages, including tunability to the problem at hand as well as the ability to model cross-event correlation. Using European windstorm data as an example, we show that the historical data give strong evidence of clustering. We then develop Poisson and clustered simulation models for the data, demonstrating clearly the superiority of the clustered model, which we have implemented using the Poisson-mixtures approach. We then discuss the implications of including clustering in models of prices on catXL contracts, one of the most commonly used mechanisms for transferring risk between primary insurers and reinsurers. This paper provides a number of new insights into the impact clustering has on modelled catXL contract prices. The simple model presented in this paper provides an insightful starting point for practitioners of natural hazard risk modelling.

  15. A framework for modeling clustering in natural hazard catastrophe risk management and the implications for re/insurance loss perspectives

    Khare, S.; Bonazzi, A.; Mitas, C.; Jewson, S.

    2014-08-01

    In this paper, we present a novel framework for modelling clustering in natural hazard risk models. The framework we present is founded on physical principles, where large-scale oscillations in the physical system are the source of non-Poissonian (clustered) frequency behaviour. We focus on a particular mathematical implementation of the "Super-Cluster" methodology that we introduce. This mathematical framework has a number of advantages, including tunability to the problem at hand as well as the ability to model cross-event correlation. Using European windstorm data as an example, we show that the historical data give strong evidence of clustering. We then develop Poisson and clustered simulation models for the data, demonstrating clearly the superiority of the clustered model, which we have implemented using the Poisson-mixtures approach. We then discuss the implications of including clustering in models of prices on catXL contracts, one of the most commonly used mechanisms for transferring risk between primary insurers and reinsurers. This paper provides a number of new insights into the impact clustering has on modelled catXL contract prices. The simple model presented in this paper provides an insightful starting point for practitioners of natural hazard risk modelling.
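
    The Poisson-mixtures idea can be demonstrated in a few lines: letting the annual rate itself be random (here gamma-distributed, giving negative-binomial counts) keeps the mean fixed but inflates the variance and the tail, which is the clustered behaviour described above. All numbers are invented.

```python
import numpy as np

# Toy comparison of plain Poisson counts against a Poisson-gamma mixture.
rng = np.random.default_rng(42)
years, mean_rate = 10_000, 4.0

poisson_counts = rng.poisson(mean_rate, size=years)

shape = 2.0                                              # lower shape -> stronger clustering
rates = rng.gamma(shape, mean_rate / shape, size=years)  # E[rate] = mean_rate
mixed_counts = rng.poisson(rates)

for name, c in [("Poisson", poisson_counts), ("Poisson-gamma mixture", mixed_counts)]:
    print(f"{name:22s} mean={c.mean():.2f} var={c.var():.2f} "
          f"P(N>=10)={np.mean(c >= 10):.4f}")
# The mixture keeps the same mean but inflates the variance and the tail
# probability of very active years, which is what drives clustered losses.
```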

  16. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    Álvarez-Gómez, José Antonio; Aniel-Quiroga Zorrilla, Íñigo; Gutiérrez Gutiérrez, Omar Quetzalcóatl; Larreynaga Murcia, Jeniffer; González Rodríguez, Ernesto Mauricio; M. Castro; Gavidia Medina, Francisco; Aguirre Ayerbe, Ignacio; González-Riancho Calzada, Pino; Carreño Herrero, Emilio

    2013-01-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has an approximate length of 320 km, 29 municipalities and more than 700,000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and resulting in hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic m...

  17. Using a fire propagation model to assess the efficiency of prescribed burning in reducing the fire hazard

    Cassagne, Nathalie; Pimont, François; Dupuy, Jean-Luc; Linn, Rodman R.; Marell, Anders; Oliveri, Chloe; Rigolot, Eric

    2011-01-01

    We examined how fire hazard was affected by prescribed burning and fuel recovery over the first six years following treatment. Eight common Mediterranean fuel complexes managed by means of prescribed burning in limestone Provence (south-eastern France) were studied, illustrating forest and woodland, garrigue and grassland situations. The coupled atmosphere-wildfire behaviour model FIRETEC was used to simulate fire behaviour (rate of spread, intensity) in these complex vegetation types. The temporal threshold...

  18. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

    Flooding, a common environmental hazard worldwide, has increased in recent times as a result of climate change and urbanization, with the effects felt more in developing countries. As a result, the exposure of Tenaga Nasional Berhad (TNB) substations to flooding has increased rapidly, because existing substations are located in flood-prone areas. Understanding the impact of floods on its substations has allowed TNB to provide non-structural mitigation by integrating the Flood Hazard Map with its substation planning. Hydrological analysis is the important first part, providing runoff as the input for the hydraulic modelling.

  19. Characterizing the danger of in-channel river hazards using LIDAR and a 2D hydrodynamic model

    Strom, M. A.; Pasternack, G. B.

    2014-12-01

    Despite many injuries and deaths each year worldwide, no analytically rigorous attempt exists to characterize and quantify the dangers to boaters, swimmers, fishermen, and other river enthusiasts. While designed by expert boaters, the International Scale of River Difficulty provides a whitewater classification that uses qualitative descriptions and subjective scoring. The purpose of this study was to develop an objective characterization of in-channel hazard dangers across spatial scales from a single boulder to an entire river segment for application over a wide range of discharges and use in natural hazard assessment and mitigation, recreational boating safety, and river science. A process-based conceptualization of river hazards was developed, and algorithms were programmed in R to quantify the associated dangers. Danger indicators included the passage proximity and reaction time posed to boats and swimmers in a river by three hazards: emergent rocks, submerged rocks, and hydraulic jumps or holes. The testbed river was a 12.2 km mixed bedrock-alluvial section of the upper South Yuba River between Lake Spaulding and Washington, CA in the Sierra Mountains. The segment has a mean slope of 1.63%, with 8 reaches varying from 1.07% to 3.30% slope and several waterfalls. Data inputs to the hazard analysis included sub-decimeter aerial color imagery, airborne LIDAR of the river corridor, bathymetric data, flow inputs, and a stage-discharge relation for the end of the river segment. A key derived data product was the location and configuration of boulders and boulder clusters as these were potential hazards. Two-dimensional hydrodynamic modeling was used to obtain the meter-scale spatial pattern of depth and velocity at discharges ranging from baseflow to modest flood stages. Results were produced for four discharges and included the meter-scale spatial pattern of the passage proximity and reaction time dangers for each of the three hazards investigated. These results
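
    As a rough illustration of the danger indicators described above, the toy sketch below derives a reaction-time surface from hypothetical depth-averaged velocity and distance-to-hazard rasters; the actual definitions and thresholds used in the study are not reproduced here.

```python
import numpy as np

# Hypothetical proxies for the two danger indicators, on toy meter-scale
# rasters of the kind a 2D hydrodynamic model would produce.
rng = np.random.default_rng(7)
velocity = rng.uniform(0.5, 4.0, (50, 50))        # m/s, depth-averaged flow
dist_to_hazard = rng.uniform(0.0, 30.0, (50, 50)) # m to nearest rock or hole

reaction_time = dist_to_hazard / velocity         # seconds available to react
min_reaction = 3.0                                # assumed reaction threshold (s)
dangerous = reaction_time < min_reaction

print(f"fraction of wetted cells with <{min_reaction:.0f}s to react: "
      f"{dangerous.mean():.2%}")
```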

  20. Introducing Geoscience Students to Numerical Modeling of Volcanic Hazards: The example of Tephra2 on VHub.org

    Leah M. Courtland

    2012-07-01

    The Tephra2 numerical model for tephra fallout from explosive volcanic eruptions is specifically designed to enable students to probe ideas in model literacy, including code validation and verification, the role of simplifying assumptions, and the concepts of uncertainty and forecasting. This numerical model is implemented on the VHub.org website, a venture in cyberinfrastructure that brings together volcanological models and educational materials. The VHub.org resource provides students with the ability to explore and execute sophisticated numerical models like Tephra2. We present a strategy for using this model to introduce university students to key concepts in the use and evaluation of Tephra2 for probabilistic forecasting of volcanic hazards. Through this critical examination students are encouraged to develop a deeper understanding of the applicability and limitations of hazard models. Although the model and applications are intended for use in both introductory and advanced geoscience courses, they could easily be adapted to work in other disciplines, such as astronomy, physics, computational methods, data analysis, or computer science.

  1. A hydro-sedimentary modelling system for flash flood propagation and hazard estimation under different agricultural practices

    N. N. Kourgialas

    2013-10-01

    A modelling system for the estimation of flash flood flow characteristics and sediment transport is developed in this study. The system comprises three components: (a) a modelling framework based on the hydrological model HSPF, (b) the hydrodynamic module of the hydraulic model MIKE 11 (quasi-2-D), and (c) the advection-dispersion module of MIKE 11 as a sediment transport model. An important parameter in hydraulic modelling is Manning's coefficient, an indicator of channel resistance which is directly dependent on changes in riparian vegetation. The effect of riparian vegetation on flood propagation parameters such as water depth (inundation), discharge, flow velocity, and sediment transport load is investigated in this study. Based on the obtained results, when the weed-cutting percentage is increased, the flood wave depth decreases while flow discharge, velocity and sediment transport load increase. The proposed modelling system is used to evaluate and illustrate the flood hazard for different riparian vegetation cutting scenarios. For the estimation of flood hazard, a combination of the flood propagation characteristics of water depth, flow velocity and sediment load was used. Next, an optimal selection of the most appropriate agricultural cutting practices for riparian vegetation was performed. Ultimately, the model results obtained for the different agricultural cutting practice scenarios can be employed to create flood protection measures for flood-prone areas. The proposed methodology was applied to the downstream part of a small Mediterranean river basin in Crete, Greece.
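
    The sensitivity to riparian vegetation enters through Manning's roughness coefficient n. A minimal worked example, with invented channel geometry, shows why cutting vegetation (lower n) raises velocity for the same slope and hydraulic radius:

```python
# Manning's equation, V = (1/n) * R^(2/3) * S^(1/2) (SI units).
# Channel numbers below are hypothetical.
R_h = 1.2      # hydraulic radius (m)
S = 0.004      # channel slope (m/m)

for n, label in [(0.035, "weeds cut"), (0.07, "dense riparian vegetation")]:
    V = (1.0 / n) * R_h ** (2.0 / 3.0) * S ** 0.5
    print(f"n={n:.3f} ({label}): mean velocity = {V:.2f} m/s")
# Velocity is linear in 1/n, so halving the roughness doubles the velocity:
# cutting vegetation lowers flow depth but raises velocity and transport capacity.
```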

  2. Genomic prediction of growth in pigs based on a model including additive and dominance effects.

    Lopes, M S; Bastiaansen, J W M; Janss, L; Knol, E F; Bovenhuis, H

    2016-06-01

    Independent of whether prediction is based on pedigree or genomic information, the focus of animal breeders has been on additive genetic effects or 'breeding values'. However, when predicting phenotypes rather than breeding values of an animal, models that account for both additive and dominance effects might be more accurate. Our aim with this study was to compare the accuracy of predicting phenotypes using a model that accounts for only additive effects (MA) and a model that accounts for both additive and dominance effects simultaneously (MAD). Lifetime daily gain (DG) was evaluated in three pig populations (1424 Pietrain, 2023 Landrace, and 2157 Large White). Animals were genotyped using the Illumina SNP60K Beadchip and assigned to either a training data set to estimate the genetic parameters and SNP effects, or to a validation data set to assess the prediction accuracy. Models MA and MAD applied random regression on SNP genotypes and were implemented in the program Bayz. The additive heritability of DG across the three populations and the two models was very similar at approximately 0.26. The proportion of phenotypic variance explained by dominance effects ranged from 0.04 (Large White) to 0.11 (Pietrain), indicating that importance of dominance might be breed-specific. Prediction accuracies were higher when predicting phenotypes using total genetic values (sum of breeding values and dominance deviations) from the MAD model compared to using breeding values from both MA and MAD models. The highest increase in accuracy (from 0.195 to 0.222) was observed in the Pietrain, and the lowest in Large White (from 0.354 to 0.359). Predicting phenotypes using total genetic values instead of breeding values in purebred data improved prediction accuracy and reduced the bias of genomic predictions. Additional benefit of the method is expected when applied to predict crossbred phenotypes, where dominance levels are expected to be higher. PMID:26676611
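
    The MA-versus-MAD comparison can be mimicked on simulated data: with known additive and dominance effects, predicting phenotypes from total genetic values beats predicting from breeding values alone. The sketch below is a toy version with invented sizes and effect distributions, not the Bayz random-regression analysis of the paper.

```python
import numpy as np

# Simulate SNP genotypes with both additive and dominance effects, then
# compare phenotype prediction from breeding values vs total genetic values.
rng = np.random.default_rng(3)
n_ind, n_snp = 2000, 500
p = rng.uniform(0.1, 0.9, n_snp)                 # allele frequencies
G = rng.binomial(2, p, size=(n_ind, n_snp))      # genotypes coded 0/1/2

a = rng.normal(0, 0.05, n_snp)                   # additive effects
d = rng.normal(0, 0.03, n_snp)                   # dominance (heterozygote) effects
breeding_value = G @ a
dominance_dev = (G == 1).astype(float) @ d       # deviation only for heterozygotes
total_genetic = breeding_value + dominance_dev
phenotype = total_genetic + rng.normal(0, total_genetic.std(), n_ind)

# With the true effects known, accuracy = corr(predictor, phenotype):
for name, pred in [("breeding values (MA-like)", breeding_value),
                   ("total genetic values (MAD-like)", total_genetic)]:
    r = np.corrcoef(pred, phenotype)[0, 1]
    print(f"{name:32s} accuracy = {r:.3f}")
```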

  3. Effects of oxygen addition to argon glow discharges: A hybrid Monte Carlo-fluid modeling investigation

    A hybrid model is developed for describing the effects of oxygen addition to argon glow discharges. The species taken into account in the model include Ar atoms in the ground state and the metastable level, O2 gas molecules in the ground state and two metastable levels, O atoms in the ground state and one metastable level, O3 molecules, Ar+, O+, O2+ and O- ions, as well as the electrons. The hybrid model consists of a Monte Carlo model for electrons and fluid models for the other plasma species. In total, 87 different reactions between the various plasma species are taken into account. Calculation results include the species densities and the importance of their production and loss processes, as well as the dissociation degree of oxygen. The effect of different O2 additions on these calculation results, as well as on the sputtering rates, is discussed.

  4. International conference and workshop on modeling and mitigating the consequences of accidental releases of hazardous materials

    This conference was held September 26--29, 1995 in New Orleans, Louisiana. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on the consequences of accidental releases of hazardous materials. Attention is focused on air dispersion of vapors. Individual papers have been processed separately for inclusion in the appropriate data bases

  5. Modelling risk in high hazard operations: integrating technical, organisational and cultural factors

    Ale, B.J.M.; Hanea, D.M.; Sillem, S.; Lin, P.H.; Van Gulijk, C.; Hudson, P.T.W.

    2012-01-01

    Recent disasters in high hazard industries such as Oil and Gas Exploration (The Deepwater Horizon) and Petrochemical production (Texas City) have been found to have causes that range from direct technical failures through organizational shortcomings right up to weak regulation and inappropriate comp

  6. Structured Additive Regression Models: An R Interface to BayesX

    Nikolaus Umlauf

    2015-02-01

    Structured additive regression (STAR) models provide a flexible framework for modeling possible nonlinear effects of covariates: they contain the well-established frameworks of generalized linear models and generalized additive models as special cases but also allow a wider class of effects, e.g., for geographical or spatio-temporal data, allowing for the specification of complex and realistic models. BayesX is a standalone software package for fitting a general class of STAR models. Based on a comprehensive open-source regression toolbox written in C++, BayesX estimates STAR models using Bayesian inference based on Markov chain Monte Carlo simulation techniques, a mixed-model representation of STAR models, or stepwise regression techniques combining penalized least squares estimation with model selection. BayesX not only covers models for responses from univariate exponential families, but also models from less standard regression situations such as models for multi-categorical responses with either ordered or unordered categories, continuous-time survival data, or continuous-time multi-state models. This paper presents a new fully interactive R interface to BayesX: the R package R2BayesX. With the new package, STAR models can be conveniently specified using R's formula language (with some extended terms), fitted using the BayesX binary, represented in R with objects of suitable classes, and finally printed/summarized/plotted. This makes BayesX much more accessible to users familiar with R and adds extensive graphics capabilities for visualizing fitted STAR models. Furthermore, R2BayesX complements the already impressive capabilities for semiparametric regression in R with a comprehensive toolbox comprising in particular more complex response types and alternative inferential procedures such as simulation-based Bayesian inference.

  7. A model used to derive hazardous waste concentration limits aiming at the reduction of toxic and hazardous wastes. Applications to illustrate the discharge of secondary categories types B and C

    This report describes a model which may be used to derive hazardous waste concentration limits in order to prevent groundwater pollution from landfill disposal. First, the leachate concentration limits are determined taking into account the attenuation capacity of the landfill site as a whole; waste concentrations are then derived through an elution model which assumes a constant ratio between liquid and solid concentrations. In the example, two types of landfill have been considered and in each case concentration limits have been calculated for some hazardous substances and compared with the corresponding regulatory limits. (author)
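
    A minimal numerical version of the two-step derivation, with invented numbers: a leachate limit is first backed out from an assumed site attenuation factor, and the waste (solid) limit then follows from the assumed constant liquid-solid ratio Kd.

```python
# Two-step limit derivation, hypothetical numbers throughout.
C_groundwater_limit = 0.01   # mg/L, regulatory target at the receptor
attenuation_factor = 50.0    # site-wide dilution/attenuation (assumed)
C_leachate_limit = C_groundwater_limit * attenuation_factor   # 0.5 mg/L

Kd = 120.0                   # L/kg, assumed constant solid/liquid ratio
C_waste_limit = Kd * C_leachate_limit                         # mg/kg
print(f"leachate limit {C_leachate_limit} mg/L -> waste limit {C_waste_limit} mg/kg")
```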

  8. Feature based cost and carbon emission modelling for wire and arc additive manufacturing

    Guo, Jianing

    2012-01-01

    Wire and arc additive manufacturing (WAAM) is a CNC- and welding-deposition-based additive manufacturing method. This novel manufacturing technique has potential cost and environmental advantages and was developed as an alternative for industrial sustainable development. The aim of this project is to develop a cost and carbon emission model, primarily for calculating WAAM manufacturing cost (£) and secondly for estimating WAAM carbon emissions (kgCO2e), which can be used by the...

  9. Incorporating descriptive metadata into seismic source zone models for seismic-hazard assessment : a case study of the Azores-West Iberian Region

    Vilanova, S. P.; Nemser, E.S.; Besana-Ostman, G.M.; Bezzeghoud, M.; Borges, J. F.; Brum da Silveira, A.; Cabral, J.; Carvalho, J.; Cunha, P. P.; R.P. Dias; J. Madeira; Lopes, F.C.; C. S. Oliveira; Perea, H.; García‐Mayordomo, J.

    2014-01-01

    In probabilistic seismic-hazard analysis (PSHA), seismic source zone (SSZ) models are widely used to account for the contribution to the hazard from earthquakes not directly correlated with geological structures. Notwithstanding the impact of SSZ models in PSHA, the theoretical framework underlying SSZ models and the criteria used to delineate the SSZs are seldom explicitly stated and suitably documented. In this paper, we propose a methodological framework to develop and document SSZ m...

  10. A guide to generalized additive models in crop science using SAS and R

    Josefine Liew

    2015-06-01

    Linear models and generalized linear models are well known and are used extensively in crop science. Generalized additive models (GAMs) are less well known. GAMs extend generalized linear models through the inclusion of smoothing functions of explanatory variables, e.g., spline functions, allowing the curves to bend to better describe the observed data. This article provides an introduction to GAMs in the context of crop science experiments. This is exemplified using a dataset consisting of four populations of perennial sow-thistle (Sonchus arvensis L.), originating from two regions, for which emergence of shoots over time was compared.
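
    For readers working in Python rather than SAS or R, an analogous GAM fit can be sketched with the third-party pygam package (an assumption here; the article itself uses SAS and R). The emergence data below are simulated to loosely mimic a sigmoidal shoot-emergence curve:

```python
import numpy as np
from pygam import LinearGAM, s   # third-party package: pip install pygam

# Smooth shoot emergence over time with a spline term; data are simulated.
rng = np.random.default_rng(5)
days = rng.uniform(0, 60, 200)
true_curve = 100 / (1 + np.exp(-(days - 25) / 6))        # sigmoidal emergence
emergence = true_curve + rng.normal(0, 5, days.size)

gam = LinearGAM(s(0, n_splines=15)).fit(days.reshape(-1, 1), emergence)
grid = np.linspace(0, 60, 7).reshape(-1, 1)
for t, yhat in zip(grid.ravel(), gam.predict(grid)):
    print(f"day {t:4.1f}: fitted emergence {yhat:6.1f}")
```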

  11. Predicting mastitis in dairy cows using neural networks and generalized additive models: a comparison

    Anantharama Ankinakatte, Smitha; Norberg, Elise; Løvendahl, Peter; Edwards, David; Højsgaard, Søren

    2013-01-01

    To develop and verify the model, the data are randomly divided into training and validation data sets. To predict the occurrence of mastitis, neural network models (NNs) and generalized additive models (GAMs) are developed using the training ... combines residual components into a score to improve the model ... classification with all indicators, using individual residuals rather than factor scores. When SCS is excluded, GAMs show better classification results when milk yield is also excluded. In conclusion, the study shows that NNs and GAMs are similar in their ability to detect mastitis, with a sensitivity of almost 75...

  12. Modeling retrospective attribution of responsibility to hazard-managing institutions: an example involving a food contamination incident.

    Johnson, Branden B; Hallman, William K; Cuite, Cara L

    2015-03-01

    Perceptions of institutions that manage hazards are important because they can affect how the public responds to hazard events. Antecedents of trust judgments have received far more attention than antecedents of attributions of responsibility for hazard events. We build upon a model of retrospective attribution of responsibility to individuals to examine these relationships regarding five classes of institutions that bear responsibility for food safety: producers (e.g., farmers), processors (e.g., packaging firms), watchdogs (e.g., government agencies), sellers (e.g., supermarkets), and preparers (e.g., restaurants). A nationally representative sample of 1,200 American adults completed an Internet-based survey in which a hypothetical scenario involving contamination of diverse foods with Salmonella served as the stimulus event. Perceived competence and good intentions of the institution moderately decreased attributions of responsibility. A stronger factor was whether an institution was deemed (potentially) aware of the contamination and free to act to prevent or mitigate it. Responsibility was rated higher the more aware and free the institution. This initial model for attributions of responsibility to impersonal institutions (as opposed to individual responsibility) merits further development. PMID:25516461

  13. Evaluating a multi-criteria model for hazard assessment in urban design. The Porto Marghera case study

    The aim of this paper is to describe a new approach to major industrial hazard assessment, which has been recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, the quantitative risk analysis (QRA), only provided a list of individual quantitative risk values, related to single locations. The experimental model is based on a multi-criteria approach--the Analytic Hierarchy Process--which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with quantitative traditional analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, on the present situation and on three other future scenarios, and use of this information as indirect quantitative measures, which could be aggregated for obtaining the global risk rate. This approach is in line with the main concepts proposed by the last European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable)
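
    The quantitative core of the Analytic Hierarchy Process is small enough to show directly: a reciprocal pairwise-comparison matrix of expert judgments is reduced to priority weights via its principal eigenvector, with a consistency check. The judgments below are invented for three hypothetical risk criteria, not taken from the Porto Marghera study.

```python
import numpy as np

# AHP priority weights from a reciprocal pairwise-comparison matrix
# (judgments on Saaty's 1-9 scale; values here are made up).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # priority weights

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)       # consistency index
RI = 0.58                                  # Saaty's random index for n = 3
print(f"weights = {np.round(w, 3)}, consistency ratio = {CI / RI:.3f}")
# A consistency ratio below ~0.1 is conventionally taken as acceptable.
```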

  14. An integrated approach to flood hazard assessment on alluvial fans using numerical modeling, field mapping, and remote sensing

    Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.

    2005-01-01

    Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, fieldbased mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In

  15. Vector generalized additive models for extreme rainfall data analysis (study case rainfall data in Indramayu)

    Utami, Eka Putri Nur; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall pattern are good indicators for potential disasters. Global Circulation Model (GCM) contains global scale information that can be used to predict the rainfall data. Statistical downscaling (SD) utilizes the global scale information to make inferences in the local scale. Essentially, SD can be used to predict local scale variables based on global scale variables. SD requires a method to accommodate non linear effects and extreme values. Extreme value Theory (EVT) can be used to analyze the extreme value. One of methods to identify the extreme events is peak over threshold that follows Generalized Pareto Distribution (GPD). The vector generalized additive model (VGAM) is an extension of the generalized additive model. It is able to accommodate linear or nonlinear effects by involving more than one additive predictors. The advantage of VGAM is to handle multi response models. The key idea of VGAM are iteratively reweighted least square for maximum likelihood estimation, penalized smoothing, fisher scoring and additive models. This works aims to analyze extreme rainfall data in Indramayu using VGAM. The results show that the VGAM with GPD is able to predict extreme rainfall data accurately. The prediction in February is very close to the actual value at quantile 75.

  16. GIS-Based Spatial Analysis and Modeling for Landslide Hazard Assessment: A Case Study in Upper Minjiang River Basin

    FENG Wenlan; ZHOU Qigang; ZHANG Baolei; ZHOU Wancun; LI Ainong; ZHANG Haizhen; XIAN Wei

    2006-01-01

    By analyzing the topographic features of past landslides since the 1980s and the main land-cover types (including change information) in the landslide-prone area, the spatial distribution of landslide hazard in the upper Minjiang River Basin was modelled using GIS-based spatial analysis. The results of the GIS analysis showed that landslide occurrence in this region is closely related to topographic features. Most areas with high hazard probability were deeply sheared gorges. Most of the investigated landslides occurred in areas with elevations below 3,000 m, due to fragile topographic conditions and intensive human disturbance. Land-cover type, including its change information, is likely an important environmental factor in triggering landslides. Destruction of vegetation, driven by population growth and its demands, increased the probability of landslides on steep slopes.

  17. Debris flow hazard assessment by integrated modeling of landslide triggering and propagation: application to the Messina Province, Italy

    Stancanelli, L. M.; Peres, D. J.; Cavallaro, L.; Cancelliere, A.; Foti, E.

    2014-12-01

    In recent decades an increase in catastrophic debris flow events has been recorded across Italy, mainly due to the growth of settlements and human activities in mountain areas. Considering the large extent of debris-flow-prone areas, non-structural protection strategies should preferably be implemented because of the economic constraints associated with structural mitigation measures. In such a framework, hazard assessment methodologies play a key role, representing useful tools for the development of emergency management policies. The aim of the present study is to apply an integrated debris flow hazard assessment methodology in which rainfall probabilistic analysis and physically based landslide triggering and propagation models are combined. In particular, the probabilistic rainfall analysis provides the forcing scenarios of different return periods, which are then used as input to a model based on a combination of the USGS TRIGRS and the FLO-2D codes. The TRIGRS model (Baum et al., 2008, 2010), developed for analyzing shallow landslide triggering, is based on an analytical solution of linearized forms of the Richards' infiltration equation and an infinite-slope stability calculation to estimate the timing and locations of slope failures, while FLO-2D (O'Brien, 1986) is a two-dimensional finite difference model that simulates debris flow propagation following a mono-phase approach, based on the empirical quadratic rheological relation developed by O'Brien and Julien (1985). Various aspects of the combination of the models are analyzed, with a particular focus on the possible variations of triggered amounts compatible with a given return period. The methodology is applied to the case study area of the Messina Province in Italy, which has recently been struck by severe events, such as that of 1 October 2009, which hit the village of Giampilieri causing 37 fatalities. Results are analyzed to assess the potential hazard that may affect the densely
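
    The triggering half of such a chain rests on the infinite-slope stability calculation. A minimal factor-of-safety computation with assumed soil properties shows how rising pore pressure (the link to rainfall return periods) pushes FS below 1:

```python
import numpy as np

# Infinite-slope factor of safety for assumed soil properties, varying the
# water table ratio m = h_w / z. All values below are illustrative.
c, phi = 5e3, np.deg2rad(32)      # cohesion (Pa), friction angle
gamma, gamma_w = 19e3, 9.81e3     # unit weights of soil and water (N/m^3)
z, beta = 2.0, np.deg2rad(35)     # failure depth (m), slope angle

for m in (0.0, 0.5, 1.0):         # dry, half-saturated, fully saturated
    normal = (gamma - m * gamma_w) * z * np.cos(beta) ** 2   # effective normal stress
    FS = (c + normal * np.tan(phi)) / (gamma * z * np.sin(beta) * np.cos(beta))
    print(f"water table ratio m={m:.1f}: FS = {FS:.2f}")
# FS drops below 1 as infiltration raises pore pressure, which is how
# rainfall scenarios of different return periods map to triggered areas.
```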

  18. Artificial geochemical barriers for additional recovery of non-ferrous metals and reduction of ecological hazard from the mining industry waste.

    Chanturiya, Valentine; Masloboev, Vladimir; Makarov, Dmitriy; Mazukhina, Svetlana; Nesterov, Dmitriy; Men'shikov, Yuriy

    2011-01-01

    Laboratory tests and physical-chemical modeling have determined that mixtures of activated silica and carbonatite, and of serpophite and carbonatite, show considerable promise for developing artificial geochemical barriers. The obtained average contents of nickel and copper deposited on the geochemical barriers in the resulting mining-induced ores are acceptable for subsequent cost-efficient processing using either pyro- or hydrometallurgical methods. Some tests of the geochemical barriers have been carried out using polluted water from the impact zone of the "Kol'skaya GMK" JSC. The possibility of purifying water of heavy metals down to the MAC level for fishery water bodies has been demonstrated. PMID:22029700

  19. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    Doligez, B.; Eschard, R. [Institut Francais du Petrole, Rueil Malmaison (France); Geffroy, F. [Centre de Geostatistique, Fontainebleau (France)] [and others

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model which is informed with petrophysical properties. Scaling-up techniques then allow a reservoir model to be obtained which is compatible with fluid flow simulators. Geostatistical modelling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data and whose variability is the same as the variability computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with excellent areal coverage but poor vertical resolution. New advances in modelling techniques now allow this type of additional external information to be integrated in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps coming from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.

  20. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models

    Fan, Jianqing; Song, Rui

    2011-01-01

    A variable screening procedure via correlation learning was proposed by Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening is called NIS, a specific member of the sure independence screening. Several closely related variable screening procedures are proposed. Under the nonparametric additive models, it is shown that under some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, an iterative nonparametric independence screening (INIS) is also proposed to enhance the finite sample performance for fitting sparse additive models. The simulation results and a real data a...
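
    The idea of NIS can be illustrated crudely: score each predictor by the fit of a marginal nonparametric regression (below, a cubic polynomial stands in for a spline smoother) and keep the top-ranked ones. In the simulated example the response depends on feature 1 only through its square, so plain correlation screening would miss it.

```python
import numpy as np

# Crude marginal nonparametric screening on simulated ultra-high dim. data.
rng = np.random.default_rng(8)
n, p = 400, 1000
X = rng.normal(size=(n, p))
y = 2 * X[:, 0] + 1.5 * X[:, 1] ** 2 + rng.normal(size=n)  # feature 1 is nonlinear

def marginal_r2(x, y):
    coef = np.polyfit(x, y, deg=3)         # cubic fit as a smoother stand-in
    resid = y - np.polyval(coef, x)
    return 1 - resid.var() / y.var()

scores = np.array([marginal_r2(X[:, j], y) for j in range(p)])
top = np.argsort(scores)[::-1][:5]         # keep the top-ranked predictors
print("top-ranked features:", top, "scores:", np.round(scores[top], 3))
```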

  1. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    Ross, G.

    2015-12-01

    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process where each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, which captures the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However, recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions, and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, a modern class of Bayesian prior distributions which allows for efficient inference over an infinite-dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as being either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
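
    A minimal self-exciting process of the kind underlying ETAS can be simulated with Ogata's thinning algorithm; the exponential kernel below is a simplification of the Omori-type kernel discussed above, and all parameters are invented.

```python
import numpy as np

# Hawkes (self-exciting) process via Ogata's thinning, exponential kernel.
# Stability requires the branching ratio alpha/beta < 1.
rng = np.random.default_rng(12)
mu, alpha, beta, horizon = 0.5, 0.8, 1.2, 200.0

def intensity(t, events):
    return mu + alpha * np.sum(np.exp(-beta * (t - np.asarray(events))))

events, t = [], 0.0
while t < horizon:
    lam_bar = intensity(t, events)            # valid bound: intensity decays between events
    t += rng.exponential(1.0 / lam_bar)
    if t < horizon and rng.uniform() <= intensity(t, events) / lam_bar:
        events.append(t)                      # accepted event (mainshock or aftershock)

expected = mu * horizon / (1 - alpha / beta)  # branching-ratio formula
print(f"{len(events)} events; expected ~ {expected:.0f}")
```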

  2. Modeling Flood Hazard Zones at the Sub-District Level with the Rational Model Integrated with GIS and Remote Sensing Approaches

    Daniel Asare-Kyei

    2015-07-01

    Robust risk assessment requires accurate flood intensity area mapping to allow for the identification of populations and elements at risk. However, available flood maps in West Africa lack spatial variability, while global datasets have resolutions too coarse to be relevant for local-scale risk assessment. Consequently, local disaster managers are forced to use traditional methods such as watermarks on buildings and media reports to identify flood hazard areas. In this study, remote sensing and Geographic Information System (GIS) techniques were combined with hydrological and statistical models to delineate the spatial limits of flood hazard zones in selected communities in Ghana, Burkina Faso and Benin. The approach involves estimating peak runoff concentrations at different elevations and then applying statistical methods to develop a Flood Hazard Index (FHI). Results show that about half of the study areas fall into high-intensity flood zones. Empirical validation using a statistical confusion matrix and the principles of participatory GIS shows that flood hazard areas could be mapped at an accuracy ranging from 77% to 81%. This was supported by local expert knowledge, which accurately classified 79% of communities deemed to be highly susceptible to flood hazard. The results will assist disaster managers in reducing the risk of flood disasters at the community level, where risk outcomes first materialize.
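
    The peak-runoff step described above is usually the rational formula. A hedged sketch with hypothetical zone values (using the SI form Q = 0.278·C·i·A) and a min-max rescaling into a 0-1 hazard index:

```python
import numpy as np

# Rational-method peak discharge per zone (Q in m^3/s for i in mm/h and
# A in km^2), rescaled to a 0-1 Flood Hazard Index. Values are hypothetical.
C = np.array([0.30, 0.55, 0.75, 0.90])    # runoff coefficients (land cover)
i = 80.0                                  # design rainfall intensity (mm/h)
A = np.array([2.1, 1.4, 3.6, 0.8])        # contributing areas (km^2)

Q = 0.278 * C * i * A                     # peak discharge per zone
FHI = (Q - Q.min()) / (Q.max() - Q.min()) # min-max normalised hazard index
for k, (q, f) in enumerate(zip(Q, FHI)):
    print(f"zone {k}: Q = {q:6.1f} m^3/s, FHI = {f:.2f}")
```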

  3. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    Antoun, T; Harris, D; Lay, T; Myers, S C; Pasyanos, M E; Richards, P; Rodgers, A J; Walter, W R; Zucca, J J

    2008-02-11

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes a path by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas.

  4. Determinants of Low Birth Weight in Malawi: Bayesian Geo-Additive Modelling

    Ngwira, Alfred; Stanley, Christopher C.

    2015-01-01

    Studies on factors of low birth weight in Malawi have neglected the flexible approach of using smooth functions for some covariates in models. Such flexible approach reveals detailed relationship of covariates with the response. The study aimed at investigating risk factors of low birth weight in Malawi by assuming a flexible approach for continuous covariates and geographical random effect. A Bayesian geo-additive model for birth weight in kilograms and size of the child at birth (less than ...

  5. Business models for additive manufacturing:exploring digital technologies, consumer roles, and supply chains

    Hadar, Ronen; Bilberg, Arne; Bogers, Marcel

    2015-01-01

    Digital fabrication — including additive manufacturing (AM), rapid prototyping and 3D printing — has the potential to revolutionize the way in which products are produced and delivered to the customer. Therefore, it challenges companies to reinvent their business model — describing the logic of creating and capturing value. In this paper, we explore the implications that AM technologies have for manufacturing systems in the new business models that they enable. In particular, we consider how ...

  6. Modeling ancient and modern arithmetic practices : addition and multiplication with Arabic and Roman numerals

    Schlimm, Dirk; Neth, Hansjörg

    2008-01-01

    To analyze the task of mental arithmetic with external representations in different number systems we model algorithms for addition and multiplication with Arabic and Roman numerals. This demonstrates that Roman numerals are not only informationally equivalent to Arabic ones but also computationally similar - a claim that is widely disputed. An analysis of our models' elementary processing steps reveals intricate trade-offs between problem representation, algorithm, and interactive resources....
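
    Addition directly in Roman numerals, of the kind the paper models, reduces to a concatenate-sort-carry routine. The sketch below is a generic textbook version of that algorithm, not the authors' implementation:

```python
# Roman-numeral addition without converting to integers: expand subtractive
# forms, concatenate, sort by symbol value, carry repeated symbols upward,
# then restore subtractive notation.
SUBTRACTIVE = [("CM", "DCCCC"), ("CD", "CCCC"), ("XC", "LXXXX"),
               ("XL", "XXXX"), ("IX", "VIIII"), ("IV", "IIII")]
GROUPS = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"),
          ("LL", "C"), ("CCCCC", "D"), ("DD", "M")]
ORDER = "MDCLXVI"   # descending symbol values

def roman_add(a: str, b: str) -> str:
    # 1) rewrite subtractive pairs additively
    for compact, expanded in SUBTRACTIVE:
        a, b = a.replace(compact, expanded), b.replace(compact, expanded)
    # 2) merge and sort symbols from largest to smallest
    s = "".join(sorted(a + b, key=ORDER.index))
    # 3) carry: collapse repeated symbols into the next larger one
    for run, sym in GROUPS:
        s = s.replace(run, sym)
    # 4) restore subtractive notation
    for compact, expanded in SUBTRACTIVE:
        s = s.replace(expanded, compact)
    return s

print(roman_add("XIX", "XXIV"))   # 19 + 24 -> XLIII (43)
```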

  7. Antimicrobial combinations: Bliss independence and Loewe additivity derived from mechanistic multi-hit models.

    Baeder, Desiree Y; Yu, Guozhi; Hozé, Nathanaël; Rolff, Jens; Regoes, Roland R

    2016-05-26

    Antimicrobial peptides (AMPs) and antibiotics reduce the net growth rate of bacterial populations they target. It is relevant to understand if effects of multiple antimicrobials are synergistic or antagonistic, in particular for AMP responses, because naturally occurring responses involve multiple AMPs. There are several competing proposals describing how multiple types of antimicrobials add up when applied in combination, such as Loewe additivity or Bliss independence. These additivity terms are defined ad hoc from abstract principles explaining the supposed interaction between the antimicrobials. Here, we link these ad hoc combination terms to a mathematical model that represents the dynamics of antimicrobial molecules hitting targets on bacterial cells. In this multi-hit model, bacteria are killed when a certain number of targets are hit by antimicrobials. Using this bottom-up approach reveals that Bliss independence should be the model of choice if no interaction between antimicrobial molecules is expected. Loewe additivity, on the other hand, describes scenarios in which antimicrobials affect the same components of the cell, i.e. are not acting independently. While our approach idealizes the dynamics of antimicrobials, it provides a conceptual underpinning of the additivity terms. The choice of the additivity term is essential to determine synergy or antagonism of antimicrobials.This article is part of the themed issue 'Evolutionary ecology of arthropod antimicrobial peptides'. PMID:27160596
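
    Both reference models can be written out for Hill-type dose-response curves. The sketch below computes the Bliss-independent combined effect in closed form and solves the Loewe additivity condition a/D_A(E) + b/D_B(E) = 1 numerically; all parameters are illustrative, not taken from the paper.

```python
import numpy as np

def effect(dose, ec50, h):
    """Fractional kill effect, E in [0, 1), Hill dose-response."""
    return dose**h / (ec50**h + dose**h)

ec50_a, h_a = 1.0, 2.0
ec50_b, h_b = 2.0, 1.5
a, b = 0.6, 0.8                       # doses of antimicrobials A and B

# Bliss independence: independent targets, so survival fractions multiply.
E_bliss = 1 - (1 - effect(a, ec50_a, h_a)) * (1 - effect(b, ec50_b, h_b))

# Loewe additivity: the predicted effect E solves a/D_A(E) + b/D_B(E) = 1,
# where D_X(E) is the dose of X alone producing E (inverse Hill function).
def inv_effect(E, ec50, h):
    return ec50 * (E / (1 - E)) ** (1 / h)

grid = np.linspace(1e-4, 1 - 1e-4, 100_000)
loewe_idx = a / inv_effect(grid, ec50_a, h_a) + b / inv_effect(grid, ec50_b, h_b)
E_loewe = grid[np.argmin(np.abs(loewe_idx - 1.0))]   # index decreases in E: unique root

print(f"Bliss-predicted combined effect: {E_bliss:.3f}")
print(f"Loewe-predicted combined effect: {E_loewe:.3f}")
```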

  8. Efficiency modeling of solidification/stabilization of multi-metal contaminated industrial soil using cement and additives

    Highlights: → We assess the feasibility of using soil S/S for industrial land reclamation. → Retarders, accelerators, plasticizers were used in S/S cementitious formulation. → We proposed novel S/S efficiency model for multi-metal contaminated soils. - Abstract: In a laboratory study, formulations of 15% (w/w) of ordinary Portland cement (OPC), calcium aluminate cement (CAC) and pozzolanic cement (PC) and additives: plasticizers cementol delta ekstra (PCDE) and cementol antikorodin (PCA), polypropylene fibers (PPF), polyoxyethylene-sorbitan monooleate (Tween 80) and aqueous acrylic polymer dispersion (Akrimal) were used for solidification/stabilization (S/S) of soils from an industrial brownfield contaminated with up to 157, 32,175, 44,074, 7614, 253 and 7085 mg kg-1 of Cd, Pb, Zn, Cu, Ni and As, respectively. Soils formed solid monoliths with all cementitious formulations tested, with a maximum mechanical strength of 12 N mm-2 achieved after S/S with CAC + PCA. To assess the S/S efficiency of the used formulations for multi-element contaminated soils, we propose an empirical model in which data on equilibrium leaching of toxic elements into deionized water and TCLP (toxicity characteristic leaching procedure) solution and the mass transfer of elements from soil monoliths were weighed against the relative potential hazard of the particular toxic element. Based on the model calculation, the most efficient S/S formulation was CAC + Akrimal, which reduced soil leachability of Cd, Pb, Zn, Cu, Ni and As into deionized water below the limit of quantification and into TCLP solution by up to 55, 185, 8750, 214, 4.7 and 1.2-times, respectively; and the mass transfer of elements from soil monoliths by up to 740, 746, 104,000, 4.7, 343 and 181-times, respectively.

  9. Efficiency modeling of solidification/stabilization of multi-metal contaminated industrial soil using cement and additives

    Voglar, Grega E. [RDA - Regional Development Agency Celje, Kidriceva ulica 25, 3000 Celje (Slovenia); Lestan, Domen, E-mail: domen.lestan@bf.uni-lj.si [Agronomy Department, Centre for Soil and Environmental Science, Biotechnical Faculty, University of Ljubljana, Jamnikarjeva 101, 1000 Ljubljana (Slovenia)

    2011-08-30

    Highlights: → We assess the feasibility of using soil S/S for industrial land reclamation. → Retarders, accelerators, plasticizers were used in S/S cementitious formulation. → We proposed novel S/S efficiency model for multi-metal contaminated soils. - Abstract: In a laboratory study, formulations of 15% (w/w) of ordinary Portland cement (OPC), calcium aluminate cement (CAC) and pozzolanic cement (PC) and additives: plasticizers cementol delta ekstra (PCDE) and cementol antikorodin (PCA), polypropylene fibers (PPF), polyoxyethylene-sorbitan monooleate (Tween 80) and aqueous acrylic polymer dispersion (Akrimal) were used for solidification/stabilization (S/S) of soils from an industrial brownfield contaminated with up to 157, 32,175, 44,074, 7614, 253 and 7085 mg kg-1 of Cd, Pb, Zn, Cu, Ni and As, respectively. Soils formed solid monoliths with all cementitious formulations tested, with a maximum mechanical strength of 12 N mm-2 achieved after S/S with CAC + PCA. To assess the S/S efficiency of the used formulations for multi-element contaminated soils, we propose an empirical model in which data on equilibrium leaching of toxic elements into deionized water and TCLP (toxicity characteristic leaching procedure) solution and the mass transfer of elements from soil monoliths were weighed against the relative potential hazard of the particular toxic element. Based on the model calculation, the most efficient S/S formulation was CAC + Akrimal, which reduced soil leachability of Cd, Pb, Zn, Cu, Ni and As into deionized water below the limit of quantification and into TCLP solution by up to 55, 185, 8750, 214, 4.7 and 1.2-times, respectively; and the mass transfer of elements from soil monoliths by up to 740, 746, 104,000, 4.7, 343 and 181-times, respectively.

  10. A fast global tsunami modeling suite as a trans-oceanic tsunami hazard prediction and mitigation tool

    Mohammed, F.; Li, S.; Jalali Farahani, R.; Williams, C. R.; Astill, S.; Wilson, P. S.; B, S.; Lee, R.

    2014-12-01

    The past decade has witnessed two mega-tsunami events, the 2004 Indian Ocean tsunami and the 2011 Japan tsunami, and multiple major tsunami events: 2006 Java and Kuril Islands, 2007 Solomon Islands, 2009 Samoa and 2010 Chile, to name a few. These events generated both local and far-field tsunami inundations, with runup ranging from a few meters to around 40 m in the coastal impact regions. With a majority of the coastal population at risk, there is a need for a sophisticated outlook on catastrophe risk estimation and a quick mitigation response. At the same time, tools and information are needed to aid advanced tsunami hazard prediction. There is an increased need for insurers, reinsurers and federal hazard management agencies to quantify coastal inundations and the vulnerability of coastal habitat to tsunami inundations. A novel tool is developed to model local and far-field tsunami generation, propagation and inundation to estimate tsunami hazards. The tool is a combination of the NOAA MOST propagation database and an efficient and fast GPU (Graphics Processing Unit)-based nonlinear shallow water wave model solver. The tsunamigenic seismic sources are mapped onto the NOAA unit source distribution along subduction zones in the ocean basin. Slip models are defined for tsunamigenic seismic sources through a slip distribution on the unit sources while maintaining limits of fault areas. A GPU-based finite volume solver is used to simulate nonlinear shallow water wave propagation, inundation and runup. Deformation on the unit sources provides initial conditions for modeling local impacts, while the wave history from the propagation database provides boundary conditions for far-field impacts. The modeling suite shows good agreement for basin-wide tsunami propagation, validating both local and far-field tsunami inundations.

  11. Multi-Spatial Criteria Modelling of Fire Risk and Hazard in the West Gonja Area of Ghana

    I. Yakubu

    2013-05-01

    About 30% of the West Gonja Area (WGA) of Ghana is occupied by three major forest reserves, which hold a rich array of plants and animals. The ecosystem in the WGA has been experiencing changes as a result of activities such as lumbering, farming, poaching and ritual bush burning, as well as wildfire. Of particular concern is wildfire, which has a devastating effect on the ecological system and rural livelihoods in the WGA. Therefore, prevention and control of wildfire in the WGA is important to the sustainability of its natural resources. This paper uses a multi-spatial criteria technique to model fire risk and hazard in order to enhance the WGA's ability to prevent and control wildfires in the fragile ecosystem. The input data included: topography (slope, elevation, aspect); vegetation (fuel quality, fuel size and shape); weather (rainfall, temperature, humidity, wind); a land cover/use map; landform; accessibility data; fire history; culture; and the population density of the WGA. Fuel risk, detection risk and response risk were modeled and used as inputs to model the final fire risk and hazard for the WGA. From the model, forest, agricultural land and shrub cover types were identified as the major fuel-contributing loads, whereas water bodies, roads and settlements were considered minor fuel-contributing loads. Steeply sloping areas, areas facing the sun, low-lying areas and forests far from the fire service stations were found to be more susceptible to fire. The fire risk and hazard model will assist decision makers and inhabitants of the area to know where there is the highest possibility of fire outbreak and to adopt prudent ways of preventing, and managing incidences of, wildfires in the WGA.

  12. Learning from Nature - Mapping of Complex Hydrological and Geomorphological Process Systems for More Realistic Modelling of Hazard-related Maps

    Chifflard, Peter; Tilch, Nils

    2010-05-01

    Introduction: Hydrological and geomorphological processes in nature are often very diverse and complex. This is partly due to regional characteristics which vary over time and space, as well as to changeable process-initiating and -controlling factors. Although this complexity is well known, such aspects are usually neglected in the modelling of hazard-related maps for several reasons. Yet particularly when it comes to creating more realistic maps, this would be an essential component to consider. The first important step towards solving this problem would be to collect data relating to regional conditions which vary over time and geographical location, along with indicators of complex processes. Data should be acquired promptly during and after events, and subsequently digitally combined and analysed. Study area: In June 2009, considerable damage occurred in the residential area of Klingfurth (Lower Austria) as a result of great pre-event wetness and repeatedly heavy rainfall, leading to flooding, debris flow deposits and gravitational mass movements. One of the causes is the fact that the meso-scale watershed (16 km²) of the Klingfurth stream is characterised by adverse geological and hydrological conditions. Additionally, the river network, with its discharge concentration within the residential zone, contributes considerably to flooding, particularly during excessive rainfall across the entire region, as the flood peaks from different parts of the catchment area are superposed. First results of mapping: Hydro(geo)logical surveys across the entire catchment area have shown that over 600 gravitational mass movements of various types and stages have occurred; 516 of those have acted as a bed load source, while 325 mass movements had not yet reached the final stage and could thus supply bed load in the future. It should be noted that large mass movements in the initial or intermediate stage were predominantly found in clayey-silty areas and weathered material

  13. Vector generalized linear and additive models with an implementation in R

    Yee, Thomas W

    2015-01-01

    This book presents a statistical framework that expands generalized linear models (GLMs) for regression modelling. The framework shared in this book allows analyses based on many semi-traditional applied statistics models to be performed as a coherent whole. This is possible through the approximately half-a-dozen major classes of statistical models included in the book and the software infrastructure component, which makes the models easily operable.    The book’s methodology and accompanying software (the extensive VGAM R package) are directed at these limitations, and this is the first time the methodology and software are covered comprehensively in one volume. Since their advent in 1972, GLMs have unified important distributions under a single umbrella with enormous implications. The demands of practical data analysis, however, require a flexibility that GLMs do not have. Data-driven GLMs, in the form of generalized additive models (GAMs), are also largely confined to the exponential family. This book ...

  14. Estimating interaction on an additive scale between continuous determinants in a logistic regression model

    Knol, Mirjam J.; van der Tweel, Ingeborg; Grobbee, Diederick E.; Numans, Mattijs E.; Geerlings, Mirjam I.

    2007-01-01

    Background: To determine the presence of interaction in epidemiologic research, typically a product term is added to the regression model. In linear regression, the regression coefficient of the product term reflects interaction as departure from additivity. However, in logistic regression it refers
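
    On the relative-risk scale, departure from additivity is commonly summarised by the Relative Excess Risk due to Interaction, RERI = RR11 - RR10 - RR01 + 1 (with odds ratios as rare-outcome approximations). A two-line worked example with invented numbers:

```python
# RERI from (odds) ratios relative to the doubly unexposed group.
OR10 = 1.8        # exposure A only
OR01 = 2.2        # exposure B only
OR11 = 5.1        # both exposures

RERI = OR11 - OR10 - OR01 + 1
print(f"RERI = {RERI:.1f}")   # > 0: super-additive (positive) interaction
# RERI = 0 would mean the joint effect equals the sum of the separate
# excess risks, i.e. no interaction on the additive scale.
```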

  15. Declarations pursuant to the Articles 2 and 3 of the Model Additional Protocol

    Articles 2 and 3 of the Model Additional Protocol specify the content and the time limits of the information to be provided by States within the framework of their Safeguards Agreements. To standardize the presentation of this information, the IAEA has prepared guidelines for the preparation of the documents. A detailed explanation of the guidelines is given in the paper.

  16. Additional interfacial force in lattice Boltzmann models for incompressible multiphase flows

    Li, Q; Gao, Y J

    2011-01-01

    The existing lattice Boltzmann models for incompressible multiphase flows are mostly constructed with two distribution functions, one is the order parameter distribution function, which is used to track the interface between different phases, and the other is the pressure distribution function for solving the velocity field. In this brief report, it is shown that in these models the recovered momentum equation is inconsistent with the target one: an additional interfacial force is included in the recovered momentum equation. The effects of the additional force are investigated by numerical simulations of droplet splashing on a thin liquid film and falling droplet under gravity. In the former test, it is found that the formation and evolution of secondary droplets are greatly affected, while in the latter the additional force is found to increase the falling velocity and limit the stretch of the droplet.

  17. Hazard function theory for nonstationary natural hazards

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
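
    For POT magnitudes following a generalized Pareto distribution, the hazard function of the exceedance magnitude is h(x) = f(x) / (1 - F(x)). A minimal sketch with scipy; the parameter values are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.stats import genpareto

# Illustrative GPD parameters: shape (xi), location, scale
xi, loc, scale = 0.2, 0.0, 1.0
x = np.linspace(0.1, 10, 100)

# Hazard function h(x) = f(x) / S(x): instantaneous exceedance intensity
hazard = genpareto.pdf(x, xi, loc, scale) / genpareto.sf(x, xi, loc, scale)

# Average return period of exceeding level x_c, assuming one POT event per year
x_c = 5.0
return_period = 1.0 / genpareto.sf(x_c, xi, loc, scale)
print(f"h({x_c}) = {hazard[np.argmin(abs(x - x_c))]:.3f}, T = {return_period:.1f} yr")
```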

  18. Utilization of sulfate additives in biomass combustion: fundamental and modeling aspects

    Wu, Hao; Jespersen, Jacob Boll; Grell, Morten Nedergaard;

    2013-01-01

    Sulfates, such as ammonium sulfate, aluminum sulfate and ferric sulfate, are effective additives for converting the alkali chlorides released from biomass combustion to the less harmful alkali sulfates. Optimization of the use of these additives requires knowledge of their decomposition rate and...... from the decomposition were investigated experimentally in a tube reactor under different conditions, revealing that the ratio of SO3/SO2 released varied for the different sulfates and could be influenced by the decomposition temperature. The proposed decomposition model of ferric sulfate was...... elemental sulfur were used as additives. The results indicated that the SO3 released from ferric sulfate decomposition was the main contributor to KCl sulfation and that the effectiveness of ferric sulfate addition was sensitive to the applied temperature conditions. Comparison of the effectiveness of...

  19. Effectiveness of water infrastructure for river flood management - Part 1: Flood hazard assessment using hydrological models in Bangladesh

    Gusyev, M. A.; Kwak, Y.; Khairul, M. I.; Arifuzzaman, M. B.; Magome, J.; Sawano, H.; Takeuchi, K.

    2015-06-01

    This study introduces the flood hazard assessment part of a global flood risk assessment (Part 2) conducted with a distributed hydrological Block-wise TOP (BTOP) model and a GIS-based Flood Inundation Depth (FID) model. In this study, the 20 km grid BTOP model was developed with globally available data and applied to the Ganges, Brahmaputra and Meghna (GBM) river basin. The BTOP model was calibrated with observed river discharges in Bangladesh and was applied for climate change impact assessment to produce flood discharges at each BTOP cell under present and future climates. For Bangladesh, cumulative flood inundation maps were produced using the FID model with the BTOP-simulated flood discharges, which allowed us to consider levee effectiveness for the reduction of flood inundation. Under climate change, the flood hazard increased in both flood discharge and inundation area for the 50- and 100-year floods. From these preliminary results, the proposed methodology can partly overcome the limitation of data unavailability and produces flood maps that can be used for the nationwide flood risk assessment presented in Part 2 of this study.

  20. Gaussian mixture models for measuring local change down-track in LWIR imagery for explosive hazard detection

    Spain, Christopher J.; Anderson, Derek T.; Keller, James M.; Popescu, Mihail; Stone, Kevin E.

    2011-06-01

    Burying objects below the ground can potentially alter their thermal properties. Moreover, there is often soil disturbance associated with recently buried objects. An intensity video frame image generated by an infrared camera in the medium and long wavelengths often locally varies in the presence of buried explosive hazards. Our approach to automatically detecting these anomalies is to estimate a background model of the image sequence. Pixel values that do not conform to the background model may represent local changes in thermal or soil signature caused by buried objects. Herein, we present a Gaussian mixture model-based technique to estimate the statistical model of background pixel values. The background model is used to detect anomalous pixel values on the road while a vehicle is moving. Foreground pixel confidence values are projected into the UTM coordinate system and a UTM confidence map is built. Different operating levels are explored and the connected component algorithm is then used to extract islands that are subjected to size, shape and orientation filters. We are currently using this approach as a feature in a larger multi-algorithm fusion system. However, in this article we also present results for using this algorithm as a stand-alone detector algorithm in order to further explore its value in detecting buried explosive hazards.
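
    The background-model idea can be prototyped with an off-the-shelf Gaussian mixture: fit the mixture to background pixel values, then flag pixels whose likelihood under the model is low. A hedged sketch with scikit-learn; the intensities and threshold are made up:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Background pixel intensities drawn from two modes (illustrative)
background = np.concatenate([rng.normal(100, 5, 5000),
                             rng.normal(140, 8, 3000)]).reshape(-1, 1)

# Fit a 2-component GMM as the statistical background model
gmm = GaussianMixture(n_components=2, random_state=0).fit(background)

# Score new pixels: low log-likelihood => does not conform to background
new_pixels = np.array([[102.0], [170.0], [138.0]])
scores = gmm.score_samples(new_pixels)
threshold = np.percentile(gmm.score_samples(background), 1)  # 1st percentile
print(scores < threshold)  # True marks candidate anomalies
```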

  1. User's manual of a computer code for seismic hazard evaluation for assessing the threat to a facility by fault model. SHEAT-FM

    To establish the reliability evaluation method for aged structural component, we developed a probabilistic seismic hazard evaluation code SHEAT-FM (Seismic Hazard Evaluation for Assessing the Threat to a facility site - Fault Model) using a seismic motion prediction method based on fault model. In order to improve the seismic hazard evaluation, this code takes the latest knowledge in the field of earthquake engineering into account. For example, the code involves a group delay time of observed records and an update process model of active fault. This report describes the user's guide of SHEAT-FM, including the outline of the seismic hazard evaluation, specification of input data, sample problem for a model site, system information and execution method. (author)

  2. Landslide Hazard Assessment and Mapping in the Guil Catchment (Queyras, Southern French Alps): From Landslide Inventory to Susceptibility Modelling

    Roulleau, Louise; Bétard, François; Carlier, Benoît; Lissak, Candide; Fort, Monique

    2016-04-01

    Landslides are common natural hazards in the Southern French Alps, where they may affect human lives and cause severe damage to infrastructure. As part of the SAMCO research project dedicated to risk evaluation in mountain areas, this study focuses on the Guil river catchment (317 km2), Queyras, to assess landslide hazard, which has been poorly studied until now. In that area, landslides are mainly occasional, low-amplitude phenomena, with limited direct impacts when compared to other hazards such as floods or snow avalanches. However, when interacting with floods during extreme rainfall events, landslides may have indirect consequences of greater importance because of strong hillslope-channel connectivity along the Guil River and its tributaries (i.e. positive feedbacks). This specific morphodynamic functioning reinforces the need for a better understanding of landslide hazards and their spatial distribution at the catchment scale to protect the local population from disasters of multi-hazard origin. The aim of this study is to produce a landslide susceptibility map at 1:50 000 scale as a first step towards a global estimation of landslide hazard and risk. The three main methodologies used for assessing landslide susceptibility are qualitative (i.e. expert opinion), deterministic (i.e. physics-based models) and statistical (i.e. probabilistic models). Owing to the rapid development of geographical information systems (GIS) during the last two decades, statistical methods are today widely used because they offer greater objectivity and reproducibility at large scales. Among them, multivariate analyses are considered the most robust techniques, especially the logistic regression method commonly used in landslide susceptibility mapping. However, this method, like others, is strongly dependent on the accuracy of the input data to avoid significant errors in the final results. In particular, a complete and accurate landslide inventory is required before the modelling
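
    The logistic-regression approach the authors favor maps readily onto standard tooling: rasters of predisposing factors become a feature matrix and the inventory becomes the binary response. A sketch with scikit-learn; all variables below are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical per-cell predictors: slope, curvature, distance to stream, ...
rng = np.random.default_rng(7)
X = rng.normal(size=(5000, 4))
p = 1 / (1 + np.exp(-(-2 + 1.2 * X[:, 0] + 0.6 * X[:, 2])))
y = rng.binomial(1, p)  # 1 = mapped landslide cell, 0 = stable cell

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# Susceptibility = predicted probability; AUC checks discrimination
susceptibility = model.predict_proba(X_te)[:, 1]
print("AUC:", roc_auc_score(y_te, susceptibility))
```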

  3. Formation and reduction of carcinogenic furan in various model systems containing food additives.

    Kim, Jin-Sil; Her, Jae-Young; Lee, Kwang-Geun

    2015-12-15

    The aim of this study was to analyse and reduce furan in various model systems. Furan model systems consisting of monosaccharides (0.5M glucose and ribose), amino acids (0.5M alanine and serine) and/or 1.0M ascorbic acid were heated at 121°C for 25 min. The effects of food additives (each 0.1M) such as metal ions (iron sulphate, magnesium sulphate, zinc sulphate and calcium sulphate), antioxidants (BHT and BHA), and sodium sulphite on the formation of furan were measured. The level of furan formed in the model systems was 6.8-527.3 ng/ml. The level of furan in the model systems of glucose/serine and glucose/alanine increased by 7-674% when food additives were added. In contrast, the level of furan decreased by 18-51% in the Maillard reaction model systems that included ribose and alanine/serine with food additives, except zinc sulphate. PMID:26190608

  4. A hazard-based duration model for analyzing crossing behavior of cyclists and electric bike riders at signalized intersections.

    Yang, Xiaobao; Huan, Mei; Abdel-Aty, Mohamed; Peng, Yichuan; Gao, Ziyou

    2015-01-01

    This paper presents a hazard-based duration approach to investigate riders' waiting times, violation hazards, associated risk factors, and their differences between cyclists and electric bike riders at signalized intersections. A total of 2322 two-wheeled riders approaching the intersections during red-light periods were observed in Beijing, China. The data were classified into censored and uncensored observations to distinguish between safe crossing and red-light running behavior. The results indicated that the red-light crossing behavior of most riders was dependent on waiting time: riders were inclined to terminate waiting and run against the traffic light as the waiting duration increased. Over half of the observed riders could not wait 49 s or longer, while 25% of the riders could wait 97 s or longer. Rider type, gender, waiting position, conformity tendency and crossing traffic volume were identified as having significant effects on riders' waiting times and violation hazards. Electric bike riders were found to be more sensitive than cyclists to external risk factors such as other riders' crossing behavior and crossing traffic volume. Moreover, unobserved heterogeneity was examined in the proposed models. The findings of this paper explain when and why cyclists and electric bike riders run against the red light at intersections, and the results are useful for traffic design and management agencies implementing strategies to enhance the safety of riders. PMID:25463942
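
    A hazard-based duration analysis with right-censoring (riders who crossed safely are censored at the signal change) can be sketched with the Python lifelines package; the columns and effect sizes below are invented, standing in for the paper's observational data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "waiting_time": rng.exponential(60, n),  # seconds until crossing/censoring
    "violated": rng.binomial(1, 0.6, n),     # 1 = ran the red light, 0 = censored
    "ebike": rng.binomial(1, 0.5, n),        # rider type covariate
    "traffic_volume": rng.normal(0, 1, n),
})

# Cox proportional hazards model of the violation hazard over waiting time
cph = CoxPHFitter()
cph.fit(df, duration_col="waiting_time", event_col="violated")
cph.print_summary()
```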

  5. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  6. NB-PLC channel modelling with cyclostationary noise addition & OFDM implementation for smart grid

    Thomas, Togis; Gupta, K. K.

    2016-03-01

    Power line communication (PLC) technology can be a viable solution for future ubiquitous networks because it provides a cheaper alternative to other wired technologies currently being used for communication. In smart grids, PLC is used to support low-rate communication on the low-voltage (LV) distribution network. In this paper, we propose a channel model of narrowband (NB) PLC in the frequency range 5 kHz to 500 kHz by using ABCD parameters with cyclostationary noise addition. The behaviour of the channel was studied by adding an 11 kV/230 V transformer and by varying the load and its location. Bit error rate (BER) versus signal-to-noise ratio (SNR) was plotted for the proposed model by employing OFDM. Our simulation results based on the proposed channel model show an acceptable performance in terms of bit error rate versus signal-to-noise ratio, which enables the communication required for smart grid applications.
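
    A BER-versus-SNR curve of the kind such a study reports can be reproduced in miniature for a single QPSK subcarrier over AWGN; note this is a simplification for illustration, not the paper's cyclostationary NB-PLC channel:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits = 200_000
bits = rng.integers(0, 2, n_bits)

# Gray-mapped QPSK: two bits per symbol, unit average symbol energy
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

for snr_db in [0, 4, 8, 12]:
    # Complex AWGN with per-dimension variance N0/2 (SNR taken as Es/N0)
    n0 = 10 ** (-snr_db / 10)
    noise = np.sqrt(n0 / 2) * (rng.normal(size=symbols.size)
                               + 1j * rng.normal(size=symbols.size))
    rx = symbols + noise
    # Hard decisions on I and Q recover the two interleaved bit streams
    bits_hat = np.empty(n_bits, dtype=int)
    bits_hat[0::2] = (rx.real < 0).astype(int)
    bits_hat[1::2] = (rx.imag < 0).astype(int)
    print(f"SNR {snr_db:2d} dB: BER = {np.mean(bits_hat != bits):.4f}")
```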

  7. Results of development and field tests of a radar-tracer system providing meteorological support to modeling hazardous technological releases

    radiation monitoring laboratory and the LDU was on the bank of the cooling pond. The main characteristics of a cloud of substances are the standard deviations of the spatial concentration distributions in the cloud along mutually perpendicular directions, which are representative spatial scales of the cloud in these directions. The squares of these values are the relative variances of the cloud. These characteristics were the first to be determined and are among the most important input parameters in different models assessing transport and dispersion of hazardous substances. Dispersion of a cloud of substances is influenced by turbulence and by vertical shifts of the mean wind speed, with the patterns of dispersion differing depending on diffusion time (or distance to a source). The results of the conducted experiments have revealed an important advantage of the radar complex used in terms of support to predictions of possible contamination in case of release of hazardous materials. As the chaff cloud moves with the wind and is dispersed, it echoes the meteorological situation at a given moment (vertical distribution of wind and air temperature, turbulence level, boundary layer thickness, type of underlying surface) and any changes occurring in the conditions on its way. Based on the obtained radar data, the radar complex is capable of assessing the dispersion parameters, which can then be used in mathematical transport and dispersion models. Moreover, determination of these parameters does not require any meteorological measurements. In addition, using data about the movement of the cloud centroid, the wind speed and direction at the release height are estimated by the radar complex itself. The conducted experiments have shown that the developed technology makes it possible to measure key parameters of a cloud of substances at the height of dispersion and to take into account local features of the distribution of meteorological quantities occurring in the vicinity of a source. This

  8. Estimating the phenology of elk brucellosis transmission with hierarchical models of cause-specific and baseline hazards

    Cross, Paul C.; Maichak, Eric J.; Rogerson, Jared D.; Irvine, Kathryn M.; Jones, Jennifer D; Heisey, Dennis M.; Edwards, William H.; Scurlock, Brandon M.

    2015-01-01

    Understanding the seasonal timing of disease transmission can lead to more effective control strategies, but the seasonality of transmission is often unknown for pathogens transmitted directly. We inserted vaginal implant transmitters (VITs) in 575 elk (Cervus elaphus canadensis) from 2006 to 2014 to assess when reproductive failures (i.e., abortions or stillbirths) occur, which is the primary transmission route of Brucella abortus, the causative agent of brucellosis in the Greater Yellowstone Ecosystem. Using a survival analysis framework, we developed a Bayesian hierarchical model that simultaneously estimated the total baseline hazard of a reproductive event as well as its 2 mutually exclusive parts (abortions or live births). Approximately 16% (95% CI = 0.10, 0.23) of the pregnant seropositive elk had reproductive failures, whereas 2% (95% CI = 0.01, 0.04) of the seronegative elk had probable abortions. Reproductive failures could have occurred as early as 13 February and as late as 10 July, peaking from March through May. Model results suggest that less than 5% of likely abortions occurred after 6 June each year and abortions were approximately 5 times more likely in March, April, or May compared to February or June. In western Wyoming, supplemental feeding of elk begins in December and ends during the peak of elk abortions and brucellosis transmission (i.e., Mar and Apr). Years with more snow may enhance elk-to-elk transmission on supplemental feeding areas because elk are artificially aggregated for the majority of the transmission season. Elk-to-cattle transmission will depend on the transmission period relative to the end of the supplemental feeding season, elk seroprevalence, population size, and the amount of commingling. Our statistical approach allowed us to estimate the probability density function of different event types over time, which may be applicable to other cause-specific survival analyses. It is often challenging to assess the
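
    Outside a full Bayesian treatment, cause-specific event probabilities of this kind are often estimated with a competing-risks estimator. A minimal sketch using lifelines' Aalen-Johansen fitter on invented data, with abortion and live birth as the competing event types (the API usage here is an assumption based on current lifelines documentation):

```python
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(5)
n = 300
durations = rng.uniform(30, 150, n)          # days until reproductive event
# 0 = censored, 1 = abortion (transmission event), 2 = live birth
event_type = rng.choice([0, 1, 2], size=n, p=[0.1, 0.15, 0.75])

# Cumulative incidence of abortion in the presence of the competing risk
ajf = AalenJohansenFitter()
ajf.fit(durations, event_type, event_of_interest=1)
print(ajf.cumulative_density_.tail())
```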

  9. Estimation for an additive growth curve model with orthogonal design matrices

    Hu, Jianhua; You, Jinhong; 10.3150/10-BEJ315

    2012-01-01

    An additive growth curve model with orthogonal design matrices is proposed in which observations may have different profile forms. The proposed model allows us to fit data and then estimate parameters in a more parsimonious way than the traditional growth curve model. Two-stage generalized least-squares estimators for the regression coefficients are derived where a quadratic estimator for the covariance of observations is taken as the first-stage estimator. Consistency, asymptotic normality and asymptotic independence of these estimators are investigated. Simulation studies and a numerical example are given to illustrate the efficiency and parsimony of the proposed model for model specifications in the sense of minimizing Akaike's information criterion (AIC).
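
    The two-stage idea (estimate the covariance structure first, then plug it into generalized least squares) can be sketched in miniature with statsmodels. This is generic feasible GLS on toy data, not the paper's quadratic covariance estimator for orthogonal designs:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 200
X = sm.add_constant(rng.normal(size=(n, 2)))
beta = np.array([1.0, 0.5, -0.3])
# Heteroskedastic errors make plain OLS inefficient
sigma = 0.5 + np.abs(X[:, 1])
y = X @ beta + rng.normal(scale=sigma, size=n)

# Stage 1: OLS residuals yield a first-stage variance estimate
resid = sm.OLS(y, X).fit().resid
var_hat = sm.OLS(np.log(resid**2), X).fit().fittedvalues

# Stage 2: weighted (generalized) least squares with estimated weights
fgls = sm.WLS(y, X, weights=1.0 / np.exp(var_hat)).fit()
print(fgls.params)
```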

  10. Efficient semiparametric estimation in generalized partially linear additive models for longitudinal/clustered data

    Cheng, Guang

    2014-02-01

    We consider efficient estimation of the Euclidean parameters in generalized partially linear additive models for longitudinal/clustered data when multiple covariates need to be modeled nonparametrically, and propose an estimation procedure based on a spline approximation of the nonparametric part of the model and the generalized estimating equations (GEE). Although the model in consideration is natural and useful in many practical applications, the literature on this model is very limited because of challenges in dealing with dependent data for nonparametric additive models. We show that the proposed estimators are consistent and asymptotically normal even if the covariance structure is misspecified. An explicit consistent estimate of the asymptotic variance is also provided. Moreover, we derive the semiparametric efficiency score and information bound under general moment conditions. By showing that our estimators achieve the semiparametric information bound, we effectively establish their efficiency in a stronger sense than what is typically considered for GEE. The derivation of our asymptotic results relies heavily on the empirical processes tools that we develop for the longitudinal/clustered data. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2014 ISI/BS.
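
    The core recipe (spline-expand the nonparametric covariate, then estimate via GEE with a working correlation) can be sketched with statsmodels and patsy; the data and spline basis below are invented for illustration:

```python
import numpy as np
import statsmodels.api as sm
from patsy import dmatrix

rng = np.random.default_rng(2)
n_subj, n_obs = 100, 5
groups = np.repeat(np.arange(n_subj), n_obs)
z = rng.uniform(0, 1, n_subj * n_obs)   # covariate modeled nonparametrically
x = rng.normal(size=n_subj * n_obs)     # parametric (Euclidean) covariate
b = np.repeat(rng.normal(scale=0.5, size=n_subj), n_obs)  # cluster dependence
y = 0.8 * x + np.sin(2 * np.pi * z) + b + rng.normal(size=n_subj * n_obs)

# Spline approximation of the nonparametric part plus the linear term
X = dmatrix("x + bs(z, df=6)", {"x": x, "z": z})

# GEE with an exchangeable working correlation; point estimates remain
# consistent even if this correlation structure is misspecified
fit = sm.GEE(y, np.asarray(X), groups=groups,
             family=sm.families.Gaussian(),
             cov_struct=sm.cov_struct.Exchangeable()).fit()
print(fit.params[1])  # estimate of the Euclidean parameter on x
```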

  11. On the development of a seismic source zonation model for seismic hazard assessment in western Saudi Arabia

    Zahran, Hani M.; Sokolov, Vladimir; Roobol, M. John; Stewart, Ian C. F.; El-Hadidy Youssef, Salah; El-Hadidy, Mahmoud

    2016-07-01

    A new seismic source model has been developed for the western part of the Arabian Peninsula, which has experienced considerable earthquake activity in the historical past and in recent times. The data used for the model include an up-to-date seismic catalog, results of recent studies of Cenozoic faulting in the area, aeromagnetic anomaly and gravity maps, geological maps, and miscellaneous information on volcanic activity. The model includes 18 zones ranging along the Red Sea and the Arabian Peninsula from the Gulf of Aqaba and the Dead Sea in the north to the Gulf of Aden in the south. The seismic source model developed in this study may be considered as one of the basic branches in a logic tree approach for seismic hazard assessment in Saudi Arabia and adjacent territories.

  12. The Effect of Ignoring Statistical Interactions in Regression Analyses Conducted in Epidemiologic Studies: An Example with Survival Analysis Using Cox Proportional Hazards Regression Model

    Vatcheva, KP; Lee, M; McCormick, JB; Rahbar, MH

    2016-01-01

    Objective To demonstrate the adverse impact of ignoring statistical interactions in regression models used in epidemiologic studies. Study design and setting Based on different scenarios involving known values for the coefficient of the interaction term in Cox regression models, we generated 1000 samples of size 600 each. The simulated samples and a real-life data set from the Cameron County Hispanic Cohort were used to evaluate the effect of ignoring statistical interactions in these models. Results Compared to correctly specified Cox regression models with interaction terms, misspecified models without interaction terms resulted in up to 8.95-fold bias in estimated regression coefficients. When data were generated from a perfectly additive Cox proportional hazards regression model, including the interaction between the two covariates resulted in only 2% estimated bias in the main-effect regression coefficients and did not alter the main finding of no significant interaction. Conclusions When the effects are synergistic, failure to account for an interaction effect could lead to bias and misinterpretation of the results, and in some instances to incorrect policy decisions. Best practices in regression analysis must include identification of interactions, including in the analysis of data from epidemiologic studies.
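
    Checking for the interaction the authors warn about is straightforward in any Cox implementation: fit nested models with and without the product term and compare. A hedged sketch with lifelines on simulated data (effect sizes invented):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 600
x1 = rng.binomial(1, 0.5, n)
x2 = rng.normal(size=n)
# Simulate survival times whose hazard includes a true interaction
hazard = 0.1 * np.exp(0.5 * x1 + 0.3 * x2 + 0.6 * x1 * x2)
t = rng.exponential(1 / hazard)
c = rng.exponential(10, n)  # independent censoring times
df = pd.DataFrame({"T": np.minimum(t, c), "E": (t <= c).astype(int),
                   "x1": x1, "x2": x2, "x1_x2": x1 * x2})

# Misspecified (no interaction) vs. correctly specified model
m0 = CoxPHFitter().fit(df[["T", "E", "x1", "x2"]], "T", "E")
m1 = CoxPHFitter().fit(df, "T", "E")
print(m0.params_, m1.params_, sep="\n")  # note the bias in m0's main effects
```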

  13. Use of additive technologies for practical working with complex models for foundry technologies

    Olkhovik, E.; Butsanets, A. A.; Ageeva, A. A.

    2016-07-01

    The article presents the results of research on the application of additive technology (3D printing) for developing geometrically complex models of cast parts. Investment casting is a well-known and widely used technology for the production of complex parts. The work proposes the use of 3D printing technology for manufacturing model parts, which are removed by thermal destruction. Traditional methods of equipment production for investment casting involve manual labor, which has problems with dimensional accuracy, and CNC machining, which is less used; such schemes have low productivity and demand considerable time. We have offered an alternative method which consists in printing the main components using a 3D printer (PLA and ABS) with subsequent production of casting models from them. In this article, the main technological methods are considered and their problems are discussed. The dimensional accuracy of the models in comparison with investment casting technology is considered as the main aspect.

  14. Modeling the use of sulfate additives for potassium chloride destruction in biomass combustion

    Wu, Hao; Grell, Morten Nedergaard; Jespersen, Jacob Boll;

    2013-01-01

    Potassium chloride, KCl, formed from biomass combustion may lead to ash deposition and corrosion problems in boilers. Sulfates are effective additives for converting KCl to the less harmful K2SO4. In the present study, the decomposition of ammonium sulfate, aluminum sulfate and ferric sulfate was...... studied respectively in a fast-heating rate thermogravimetric analyzer (TGA) to derive a kinetic model. The yields of SO2 and SO3 from the decomposition were studied in a tube reactor, revealing that the ratio of SO3/SO2 released varied for the different sulfates and for ammonium sulfate the ratio was...... affected by the decomposition temperature. Based on the experimental data, a model was proposed to simulate the sulfation of KCl by different sulfate addition, and the simulation results were compared with pilot-scale experiments conducted in a bubbling fluidized bed reactor. The simulation results of...

  15. Modeling hydrologic and geomorphic hazards across post-fire landscapes using a self-organizing map approach

    Friedel, M.J.

    2011-01-01

    Few studies attempt to model the range of possible post-fire hydrologic and geomorphic hazards because of the sparseness of data and the coupled, nonlinear, spatial, and temporal relationships among landscape variables. In this study, a type of unsupervised artificial neural network, called a self-organizing map (SOM), is trained using data from 540 burned basins in the western United States. The sparsely populated data set includes variables from independent numerical landscape categories (climate, land surface form, geologic texture, and post-fire condition), independent landscape classes (bedrock geology and state), and dependent initiation processes (runoff, landslide, and runoff and landslide combination) and responses (debris flows, floods, and no events). Pattern analysis of the SOM-based component planes is used to identify and interpret relations among the variables. Application of the Davies-Bouldin criterion following k-means clustering of the SOM neurons identified eight conceptual regional models for focusing future research and empirical model development. A split-sample validation on 60 independent basins (not included in the training) indicates that simultaneous predictions of initiation process and response types are at least 78% accurate. As climate shifts from wet to dry conditions, forecasts across the burned landscape reveal a decreasing trend in the total number of debris flow, flood, and runoff events with considerable variability among individual basins. These findings suggest the SOM may be useful in forecasting real-time post-fire hazards, and long-term post-recovery processes and effects of climate change scenarios. © 2011.
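
    The SOM training-and-clustering pipeline can be prototyped with the Python minisom package; the feature set below is invented, standing in for the paper's climate and landscape variables:

```python
import numpy as np
from minisom import MiniSom
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
# Hypothetical records: rows = burned basins, cols = landscape variables
data = rng.normal(size=(540, 10))
data = (data - data.mean(0)) / data.std(0)  # SOMs expect scaled inputs

# Train a 12x12 SOM on the basin data
som = MiniSom(12, 12, input_len=10, sigma=1.5, learning_rate=0.5,
              random_seed=0)
som.train_random(data, num_iteration=5000)

# k-means on the SOM neuron weights to find conceptual regional models
weights = som.get_weights().reshape(-1, 10)
clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit(weights)
print(np.bincount(clusters.labels_))  # neurons assigned to each regional model
```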

  16. Model for Assembly Line Re-Balancing Considering Additional Capacity and Outsourcing to Face Demand Fluctuations

    Samadhi, TMAA; Sumihartati, Atin

    2016-02-01

    The most critical stage in a garment factory is the sewing process, because it generally consists of a number of operations and a large number of sewing machines for each operation. It therefore requires a balancing method that can assign tasks to workstations with balanced workloads. Many studies on assembly line balancing assume a new assembly line, but in reality, due to demand fluctuations and increases, re-balancing is needed. To cope with such fluctuating demand, additional capacity can be provided by investing in spare sewing machines and by paying for sewing services through outsourcing. This study develops an assembly line balancing (ALB) model on an existing line to cope with fluctuating demand. Capacity redesign is decided if the fluctuating demand exceeds the available capacity, through a combination of investment in new machines and outsourcing, while minimizing the cost of idle capacity in the future. The objective of the model is to minimize the total cost of the assembly line, which consists of operating costs, machine costs, capacity-addition costs, losses due to idle capacity and outsourcing costs. The model developed is an integer programming model. The model is tested on one year of demand data with an existing fleet of 41 sewing machines. The result shows that a maximum additional capacity of up to 76 machines is required when demand increases by 60% over the average, given equal cost parameters.
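
    The cost-minimizing capacity decision can be sketched as a small integer program. The PuLP model below is a toy with invented parameters (machine cost, outsourcing cost, idle penalty), not the paper's full ALB formulation:

```python
from pulp import LpProblem, LpMinimize, LpVariable, value

demand = 1200           # units per period (hypothetical)
cap_per_machine = 25    # units per machine per period (hypothetical)
owned = 41              # existing sewing machines
c_machine, c_outsrc, c_idle = 500.0, 8.0, 3.0  # invented cost parameters

prob = LpProblem("rebalancing_capacity", LpMinimize)
buy = LpVariable("new_machines", lowBound=0, cat="Integer")
out = LpVariable("outsourced_units", lowBound=0)
idle = LpVariable("idle_units", lowBound=0)

# Capacity plus outsourcing must cover demand; idle measures surplus capacity
prob += cap_per_machine * (owned + buy) + out - idle == demand
# Objective: machine investment + outsourcing + idle-capacity losses
prob += c_machine * buy + c_outsrc * out + c_idle * idle

prob.solve()
print(value(buy), value(out), value(idle))
```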

  17. Modeling the use of sulfate additives for potassium chloride destruction in biomass combustion

    Wu, Hao; Grell, Morten Nedergaard; Jespersen, Jacob Boll; Aho, Martti; Jappe Frandsen, Flemming; Glarborg, Peter

    2013-01-01

    Potassium chloride, KCl, formed from biomass combustion may lead to ash deposition and corrosion problems in boilers. Sulfates are effective additives for converting KCl to the less harmful K2SO4. In the present study, the decomposition of ammonium sulfate, aluminum sulfate and ferric sulfate was studied respectively in a fast-heating rate thermogravimetric analyzer (TGA) for deriving a kinetic model. The yields of SO2 and SO3 from the decomposition were studied in a tube reactor, revealing t...
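
    The decomposition-rate submodel such studies derive from TGA data is typically first-order with an Arrhenius rate constant. A generic sketch with scipy; the kinetic parameters below are placeholders, not the fitted values from this work:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder Arrhenius parameters for sulfate decomposition (illustrative)
A, Ea, R = 1.0e8, 1.6e5, 8.314  # 1/s, J/mol, J/(mol K)

def dalpha_dt(t, alpha, T):
    """First-order conversion of the sulfate: d(alpha)/dt = k(T) (1 - alpha)."""
    k = A * np.exp(-Ea / (R * T))
    return k * (1.0 - alpha)

for T in (900.0, 1000.0, 1100.0):  # furnace temperatures in kelvin
    sol = solve_ivp(dalpha_dt, (0.0, 5.0), [0.0], args=(T,), max_step=0.01)
    print(f"T = {T:.0f} K: conversion after 5 s = {sol.y[0, -1]:.3f}")
```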

  18. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    Plesko, Catherine S [Los Alamos National Laboratory; Clement, R Ryan [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  19. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the pumped quality upon transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations to an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics that can vary between stakeholders? In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept and mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.
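
    The aggregation-plus-Monte-Carlo core of such a framework fits in a few lines: sample stochastic spill events per hazard, superpose their impacts at the receptor, and read risk metrics off the ensemble. All rates and impact distributions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(9)
n_sim, horizon = 10_000, 10.0  # Monte Carlo runs, years

# Hypothetical hazards: (spill rate per year, lognormal impact parameters)
hazards = [(0.2, 0.0, 0.5), (0.05, 1.0, 0.8), (0.5, -1.0, 0.4)]

total_impact = np.zeros(n_sim)
for rate, mu, sd in hazards:
    n_spills = rng.poisson(rate * horizon, n_sim)  # stochastic spill counts
    for i in np.nonzero(n_spills)[0]:
        # Aggregate the impacts of all of this hazard's spills at the receptor
        total_impact[i] += rng.lognormal(mu, sd, n_spills[i]).sum()

# Stakeholder-specific risk metrics from the same ensemble
threshold = 5.0  # hypothetical acceptable impact level
print("P(exceed threshold):", np.mean(total_impact > threshold))
print("95th percentile impact:", np.quantile(total_impact, 0.95))
```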

  20. CalTOX, a multimedia total exposure model for hazardous-waste sites; Part 1, Executive summary

    McKone, T.E.

    1993-06-01

    CalTOX has been developed as a spreadsheet model to assist in health-risk assessments that address contaminated soils and the contamination of adjacent air, surface water, sediments, and ground water. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify and reduce uncertainty in multimedia, multiple-pathway exposure models. This report provides an overview of the CalTOX model components, lists the objectives of the model, describes the philosophy under which the model was developed, identifies the chemical classes for which the model can be used, and describes critical sensitivities and uncertainties. The multimedia transport and transformation model is a dynamic model that can be used to assess time-varying concentrations of contaminants introduced initially to soil layers or for contaminants released continuously to air or water. This model assists the user in examining how chemical and landscape properties impact both the ultimate route and quantity of human contact. Multimedia, multiple-pathway exposure models are used in the CalTOX model to estimate average daily potential doses within a human population in the vicinity of a hazardous substances release site. The exposure models encompass twenty-three exposure pathways. The exposure assessment process consists of relating contaminant concentrations in the multimedia model compartments to contaminant concentrations in the media with which a human population has contact (personal air, tap water, foods, household dusts, soils, etc.). The average daily dose is the product of the exposure concentrations in these contact media and an intake or uptake factor that relates the concentrations to the distributions of potential dose within the population.

  1. Integrating multidisciplinary science, modelling and impact data into evolving, syn-event volcanic hazard mapping and communication: A case study from the 2012 Tongariro eruption crisis, New Zealand

    Leonard, Graham S.; Stewart, Carol; Wilson, Thomas M.; Procter, Jonathan N.; Scott, Bradley J.; Keys, Harry J.; Jolly, Gill E.; Wardman, Johnny B.; Cronin, Shane J.; McBride, Sara K.

    2014-10-01

    New Zealand's Tongariro National Park volcanoes produce hazardous eruptions every few years to decades. On 6 August 2012 the Te Maari vent of Tongariro Volcano erupted, producing a series of explosions and a fine ash of minor volume which was dispersed rapidly to the east. This manuscript presents a summary of the eruption impacts and the way these supported science communication during the crisis, particularly in terms of hazard map development. The most significant proximal impact was damage from pyroclastic surges and ballistics to the popular and economically-important Tongariro Alpine Crossing track. The only hazard to affect the medial impact zone was a few millimetres of ashfall with minor impacts. Field testing indicated that the Te Maari ash had extremely low resistivity when wetted, implying a very high potential to cause disruption to nationally-important power transmission networks via the mechanism of insulator flashover. This was not observed, presumably due to insufficient ash accumulation on insulators. Virtually no impacts from distal ashfall were reported. Post-event analysis of PM10 data demonstrates the additional value of regional air quality monitoring networks in quantifying population exposure to airborne respirable ash. While the eruption was minor, it generated a high level of public interest and a demand for information on volcanic hazards and impacts from emergency managers, the public, critical infrastructure managers, health officials, and the agriculture sector. Meeting this demand fully taxed available resources. We present here aspects of the New Zealand experience which may have wider applicability in moving towards improved integration of hazard impact information, mapping, and communication. These include wide use of a wiki technical clearinghouse and email listservs, a focus on multi-agency consistent messages, and a recently developed environment of collaboration and alignment of both research funding and technical science advice

  2. CAirTOX, An inter-media transfer model for assessing indirect exposures to hazardous air contaminants

    Risk assessment is a quantitative evaluation of information on potential health hazards of environmental contaminants and the extent of human exposure to these contaminants. As applied to toxic chemical emissions to air, risk assessment involves four interrelated steps. These are (1) determination of source concentrations or emission characteristics, (2) exposure assessment, (3) toxicity assessment, and (4) risk characterization. These steps can be carried out with assistance from analytical models in order to estimate the potential risk associated with existing and future releases. CAirTOX has been developed as a spreadsheet model to assist in making these types of calculations. CAirTOX follows an approach that has been incorporated into the CalTOX model, which was developed for the California Department of Toxic Substances Control, With CAirTOX, we can address how contaminants released to an air basin can lead to contamination of soil, food, surface water, and sediments. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify uncertainty in multimedia, multiple-pathway exposure assessments. The capacity to explicitly address uncertainty has been incorporated into the model in two ways. First, the spreadsheet form of the model makes it compatible with Monte-Carlo add-on programs that are available for uncertainty analysis. Second, all model inputs are specified in terms of an arithmetic mean and coefficient of variation so that uncertainty analyses can be carried out

  3. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  5. A model for the additional dissipation in transients: a volume viscosity coefficient

    During transients with water column separation, the formation of a two-phase mixture in more or less extended conduit lengths, where pressures are close to vapour pressure, produces, as is well known, additional dissipation beyond that normally encountered during transients without cavitation. These effects, certainly related to the presence of a gaseous phase, have been represented in several models, among which those of Streeter (1970), Kalkwijk and de Vries (1971), Safwat (1972), and Kalkwijk (1974). The model developed in this paper, already presented at the Vallombrosa meeting (1974), introduces a global volume viscosity coefficient. This model is compared with others and with some experimental results: this formulation completes previous contributions and, by means of a specific analytic representation of the dissipative term, gives a correct damping o

  6. Evaporation model for beam based additive manufacturing using free surface lattice Boltzmann methods

    Klassen, Alexander; Scharowsky, Thorsten; Körner, Carolin

    2014-07-01

    Evaporation plays an important role in many technical applications including beam-based additive manufacturing processes, such as selective electron beam or selective laser melting (SEBM/SLM). In this paper, we describe an evaporation model which we employ within the framework of a two-dimensional free surface lattice Boltzmann method. With this method, we solve the hydrodynamics as well as thermodynamics of the molten material taking into account the mass and energy losses due to evaporation and the recoil pressure acting on the melt pool. Validation of the numerical model is performed by measuring maximum melt depths and evaporative losses in samples of pure titanium and Ti-6Al-4V molten by an electron beam. Finally, the model is applied to create processing maps for an SEBM process. The results predict that the penetration depth of the electron beam, which is a function of the acceleration voltage, has a significant influence on evaporation effects.

  7. Local Tsunami Hazard in the Marquesas Islands (French Polynesia): Numerical Modeling of the 1999 Fatu Hiva Landslide and Tsunami

    Hébert, H.; Schindelé, F.; Heinrich, P.; Piatanesi, A.; Okal, E. A.

    In French Polynesia, the Marquesas Islands are particularly prone to amplification of tsunamis generated at the Pacific Rim, due to relatively mild submarine slopes and to large open bays not protected by any coral reef. These islands are also threatened by local tsunamis, as shown by the recent 1999 event on Fatu Hiva. On September 13, 1999, Omoa Bay was struck by 2 to 5 m high water waves: several buildings, among them the school, were flooded and destroyed, but no lives were lost. Observations gathered during a post-event survey revealed the recent collapse into the sea of a 300x300 m, at least 20-m thick, cliff located 5 km southeast of Omoa. This cliff failure most certainly triggered the tsunami waves, since the cliff was reported intact 45 min earlier. We simulate the tsunami generation due to a subaerial landslide, using a finite-difference model assimilating the landslide to a flow of granular material. Numerical modeling shows that a 0.0024-km3 landslide located in the presumed source area accounts well for the tsunami waves reported in Omoa Bay. We show that the striking amplification observed in Omoa Bay is related to the trapping of waves by the shallow submarine shelf surrounding the island. These results stress the local tsunami hazard that should be taken into account in the natural hazard assessment and mitigation of the area, where historical cliff collapses can be observed and may happen again.

  8. Hazardous wastes

    The dangers and problems of hazardous wastes are described in this pictorial booklet that is part of the EPA solid waste management publication series. It is shown that how the nation's hazardous wastes are managed or mismanaged is a crucial environmental issue with vast implications for public health and for the integrity of the ecological systems on which man depends. The environmental folly of dumping or burning these wastes is emphasized, along with the economic imprudence of continuing to throw away valuable resources as wastes. The public as well as industry must pay the costs of safe hazardous waste management

  9. Numerical modeling of microstructure evolution during laser additive manufacturing of a nickel-based superalloy

    A multi-scale model that combines the finite element method and stochastic analysis is developed to simulate the evolution of the microstructure of an Nb-bearing nickel-based superalloy during laser additive manufacturing solidification. Through the use of this model, the nucleation and growth of dendrites, the segregation of niobium (Nb) and the formation of Laves phase particles during the solidification are investigated to provide the relationship between the solidification conditions and the resultant microstructure, especially in the morphology of Laves phase particles. The study shows that small equiaxed dendrite arm spacing under a high cooling rate and low temperature gradient to growth rate (G/R) ratio is beneficial for forming discrete Laves phase particles. In contrast, large columnar dendrite arm spacing under a low cooling rate and high G/R ratio tends to produce continuously distributed coarse Laves phase particles, which are known to be detrimental to mechanical properties. In addition, the improvement of hot cracking resistance by controlling the morphology of Laves phase particles is discussed by analyzing the cracking pattern and microstructure in the laser deposited material. This work provides valuable understanding of solidification microstructure development in Nb-bearing nickel-based superalloys, like IN 718, during laser additive manufacturing and constitutes a fundamental basis for controlling the microstructure to minimize the formation of deleterious Laves phase particles

  10. Internal structure and volcanic hazard potential of Mt Tongariro, New Zealand, from 3D gravity and magnetic models

    Miller, Craig A.; Williams-Jones, Glyn

    2016-06-01

    A new 3D geophysical model of the Mt Tongariro Volcanic Massif (TgVM), New Zealand, provides a high resolution view of the volcano's internal structure and hydrothermal system, from which we derive implications for volcanic hazards. Geologically constrained 3D inversions of potential field data provides a greater level of insight into the volcanic structure than is possible from unconstrained models. A complex region of gravity highs and lows (± 6 mGal) is set within a broader, ~ 20 mGal gravity low. A magnetic high (1300 nT) is associated with Mt Ngauruhoe, while a substantial, thick, demagnetised area occurs to the north, coincident with a gravity low and interpreted as representing the hydrothermal system. The hydrothermal system is constrained to the west by major faults, interpreted as an impermeable barrier to fluid migration and extends to basement depth. These faults are considered low probability areas for future eruption sites, as there is little to indicate they have acted as magmatic pathways. Where the hydrothermal system coincides with steep topographic slopes, an increased likelihood of landslides is present and the newly delineated hydrothermal system maps the area most likely to have phreatic eruptions. Such eruptions, while small on a global scale, are important hazards at the TgVM as it is a popular hiking area with hundreds of visitors per day in close proximity to eruption sites. The model shows that the volume of volcanic material erupted over the lifespan of the TgVM is five to six times greater than previous estimates, suggesting a higher rate of magma supply, in line with global rates of andesite production. We suggest that our model of physical property distribution can be used to provide constraints for other models of dynamic geophysical processes occurring at the TgVM.

  11. Bi-Objective Modelling for Hazardous Materials Road–Rail Multimodal Routing Problem with Railway Schedule-Based Space–Time Constraints

    Sun, Yan; Lang, Maoxiang; Wang, Danzhu

    2016-01-01

    The transportation of hazardous materials is always accompanied by considerable risk that will impact public and environment security. As an efficient and reliable transportation organization, a multimodal service should participate in the transportation of hazardous materials. In this study, we focus on transporting hazardous materials through the multimodal service network and explore the hazardous materials multimodal routing problem from the operational level of network planning. To formulate this problem more practicably, minimizing the total generalized costs of transporting the hazardous materials and the social risk along the planned routes are set as the optimization objectives. Meanwhile, the following formulation characteristics will be comprehensively modelled: (1) specific customer demands; (2) multiple hazardous material flows; (3) capacitated schedule-based rail service and uncapacitated time-flexible road service; and (4) environmental risk constraint. A bi-objective mixed integer nonlinear programming model is first built to formulate the routing problem that combines the formulation characteristics above. Then linear reformations are developed to linearize and improve the initial model so that it can be effectively solved by exact solution algorithms on standard mathematical programming software. By utilizing the normalized weighted sum method, we can generate the Pareto solutions to the bi-objective optimization problem for a specific case. Finally, a large-scale empirical case study from the Beijing–Tianjin–Hebei Region in China is presented to demonstrate the feasibility of the proposed methods in dealing with the practical problem. Various scenarios are also discussed in the case study. PMID:27483294
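
    The normalized weighted-sum step the authors use can be illustrated independently of the full MILP: sweep the weight between the two normalized objectives and keep the distinct minimizers. A toy sketch over an enumerated route set with invented cost/risk pairs:

```python
import numpy as np

# Hypothetical candidate routes: (generalized cost, social risk)
routes = np.array([[100.0, 9.0], [120.0, 5.0], [150.0, 3.0],
                   [135.0, 4.0], [110.0, 8.0]])

# Normalize each objective to [0, 1] so the weights are comparable
norm = (routes - routes.min(0)) / (routes.max(0) - routes.min(0))

pareto = set()
for w in np.linspace(0.0, 1.0, 21):
    # Weighted-sum scalarization of the bi-objective problem
    best = np.argmin(w * norm[:, 0] + (1 - w) * norm[:, 1])
    pareto.add(int(best))

for i in sorted(pareto):
    print(f"route {i}: cost = {routes[i, 0]:.0f}, risk = {routes[i, 1]:.0f}")
```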

  13. "Bunched Black Swans" in Complex Geosystems: Cross-Disciplinary Approaches to the Additive and Multiplicative Modelling of Correlated Extreme Bursts

    Watkins, N. W.; Rypdal, M.; Lovsletten, O.

    2012-12-01

    For all natural hazards, the question of when the next "extreme event" (cf. Taleb's "black swans") is expected is of obvious importance. In the environmental sciences users often frame such questions in terms of average "return periods", e.g. "is an X meter rise in the Thames water level a 1-in-Y year event?". Frequently, however, we also care about the emergence of correlation, and whether the probability of several big events occurring in close succession is truly independent, i.e. are the black swans "bunched". A "big event", or a "burst", defined by its integrated signal above a threshold, might be a single, very large, event, or, instead, could in fact be a correlated series of "smaller" (i.e. less wildly fluctuating) events. Several available stochastic approaches provide quantitative information about such bursts, including Extreme Value Theory (EVT); the theory of records; level sets; sojourn times; and models of space-time "avalanches" of activity in non-equilibrium systems. Some focus more on the probability of single large events. Others are more concerned with extended dwell times above a given spatiotemporal threshold. However, the state of the art is not yet fully integrated, and the above-mentioned approaches differ in fundamental aspects. EVT is perhaps the best known in the geosciences. It is concerned with the distribution obeyed by the extremes of datasets, e.g. the 100 values obtained by considering the largest daily temperature recorded in each of the years of a century. However, the pioneering work from the 1920s on which EVT originally built was based on independent identically distributed samples, and took no account of memory and correlation that characterise many natural hazard time series. Ignoring this would fundamentally limit our ability to forecast; so much subsequent activity has been devoted to extending EVT to encompass dependence. A second group of approaches, by contrast, has notions of time and thus possible non
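
    The classical EVT calculation behind a "1-in-Y year" statement (fit a GEV to block maxima, then invert for the return level) takes a few lines with scipy; note, as the record stresses, that this rests on i.i.d. maxima and breaks down under the correlation the authors study. Data below are simulated:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(6)
# Simulated 100 years of annual-maximum water levels (i.i.d. by construction)
annual_max = genextreme.rvs(-0.1, loc=3.0, scale=0.5, size=100,
                            random_state=rng)

# Fit the GEV distribution to the block maxima
shape, loc, scale = genextreme.fit(annual_max)

# The "X meter" level of the 1-in-Y-year event is the (1 - 1/Y) quantile
for Y in (10, 100, 1000):
    level = genextreme.isf(1.0 / Y, shape, loc, scale)
    print(f"1-in-{Y}-year level: {level:.2f} m")
```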

  14. Automated Standard Hazard Tool

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  15. The impact of hazardous industrial facilities on housing prices: A comparison of parametric and semiparametric hedonic price models

    Grislain-Letrémy, Céline; Katossky, Arthur

    2014-01-01

    The willingness of households to pay for prevention against industrial risks can be revealed by real estate markets. By using very rich microdata, we study housing prices in the vicinity of hazardous industries near three important French cities. We show that the impact of hazardous plants on the...... important biases in the estimated value of the impact of hazardous plants on housing values....

  16. Topsoil organic carbon content of Europe, a new map based on a generalised additive model

    de Brogniez, Delphine; Ballabio, Cristiano; Stevens, Antoine; Jones, Robert J. A.; Montanarella, Luca; van Wesemael, Bas

    2014-05-01

    There is an increasing demand for up-to-date, spatially continuous organic carbon (OC) data for global environmental and climate modeling. Whilst the current map of topsoil organic carbon content for Europe (Jones et al., 2005) was produced by applying expert-knowledge-based pedo-transfer rules to large soil mapping units, the aim of this study was to replace it by applying digital soil mapping techniques to the first European harmonised geo-referenced topsoil (0-20 cm) database, which arises from the LUCAS (land use/cover area frame statistical survey) survey. A generalized additive model (GAM) was calibrated on 85% of the dataset (ca. 17 000 soil samples), and a backward stepwise approach selected slope, land cover, temperature, net primary productivity, latitude and longitude as environmental covariates (500 m resolution). Validation of the model (applied to the remaining 15% of the dataset) gave an R2 of 0.27. We observed that most organic soils were under-predicted by the model and that soils of Scandinavia were also poorly predicted. The model showed an RMSE of 42 g kg-1 for mineral soils and of 287 g kg-1 for organic soils. The map of predicted OC content showed the lowest values in Mediterranean countries and in croplands across Europe, whereas the highest OC contents were predicted in wetlands, woodlands and mountainous areas. The map of the standard error of the OC model predictions showed high values in northern latitudes, wetlands, moors and heathlands, whereas low uncertainty was mostly found in croplands. A comparison of our results with the map of Jones et al. (2005) showed general agreement on the prediction of mineral soils' OC content, most probably because the models share some covariates, namely land cover and temperature. Our model, however, failed to predict values of OC content greater than 200 g kg-1, which we explain by the imposed unimodal distribution of our model, whose mean is tilted towards the majority of soils, which are mineral. Finally, average
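
    A schematic of the model-fitting step, written here with the pyGAM library on synthetic stand-ins for a few of the LUCAS covariates; the study's own implementation, covariate set and data are richer, so this is only the shape of the calculation.

      import numpy as np
      from pygam import LinearGAM, s, f

      rng = np.random.default_rng(1)
      n = 2000
      slope = rng.uniform(0, 30, n)            # slope, degrees (synthetic)
      temp = rng.uniform(-2, 18, n)            # mean annual temperature, deg C
      landcov = rng.integers(0, 4, n)          # 4 coded land-cover classes
      oc = (40 - 1.5 * temp + 0.3 * slope
            + 15 * (landcov == 2) + rng.normal(0, 8, n))   # OC, g kg-1

      X = np.column_stack([slope, temp, landcov])
      gam = LinearGAM(s(0) + s(1) + f(2)).fit(X, oc)       # smooths + factor term

      pred = gam.predict(X)
      r2 = 1 - np.sum((oc - pred) ** 2) / np.sum((oc - oc.mean()) ** 2)
      print(f"in-sample R2: {r2:.2f}")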

  17. Mixed butanols addition to gasoline surrogates: Shock tube ignition delay time measurements and chemical kinetic modeling

    AlRamadan, Abdullah S.

    2015-10-01

    The demand for fuels with high anti-knock quality has historically been rising, and will continue to increase with the development of downsized and turbocharged spark-ignition engines. Butanol isomers, such as 2-butanol and tert-butanol, have high octane ratings (RON of 105 and 107, respectively), and thus mixed butanols (68.8% by volume of 2-butanol and 31.2% by volume of tert-butanol) can be added to conventional petroleum-derived gasoline fuels to improve octane performance. In the present work, the effect of mixed butanols addition to gasoline surrogates has been investigated in a high-pressure shock tube facility. The ignition delay times of stoichiometric mixed butanols mixtures were measured at 20 and 40 bar over a temperature range of 800-1200 K. Next, 10 vol% and 20 vol% of mixed butanols (MB) were blended with two different toluene/n-heptane/iso-octane (TPRF) fuel blends having octane ratings of RON 90/MON 81.7 and RON 84.6/MON 79.3. These MB/TPRF mixtures were investigated under shock tube conditions similar to those mentioned above. A chemical kinetic model was developed to simulate the low- and high-temperature oxidation of mixed butanols and MB/TPRF blends. The proposed model is in good agreement with the experimental data, with some deviations at low temperatures. The effect of mixed butanols addition to TPRFs is marginal when examining the ignition delay times at high temperatures. However, when extended to lower temperatures (T < 850 K), the model shows that mixed butanols addition to TPRFs causes the ignition delay times to increase and hence acts like an octane booster at engine-like conditions. © 2015 The Combustion Institute.
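
    Shock-tube ignition delay data of this kind are commonly summarized by an Arrhenius-type correlation tau = A*exp(Ea/(R*T)) in the high-temperature regime; a small fitting sketch with synthetic numbers (not the paper's measurements) follows.

      import numpy as np

      R = 8.314                                         # J/(mol K)
      T = np.array([1200.0, 1100.0, 1000.0, 900.0])     # K
      tau = np.array([80.0, 220.0, 700.0, 2600.0])      # microseconds, synthetic

      # linear fit of ln(tau) against 1/T gives ln(A) and Ea/R
      slope, intercept = np.polyfit(1.0 / T, np.log(tau), 1)
      A = np.exp(intercept)                             # pre-exponential, microseconds
      Ea = slope * R                                    # apparent activation energy
      print(f"A = {A:.3g} us, apparent Ea = {Ea / 1000:.0f} kJ/mol")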

  18. Obtaining 3D PLY Part from DEM Surface Data for Terrain Modeling by Additive Fabrication

    YASHWANT KUMAR MODI

    2014-04-01

    Physical modeling of the earth's terrain has been gaining popularity among architects and land-use planners in the last few years, partly because of the limitations of cartographic maps and virtual reality techniques, and partly because of the availability of rapid manufacturing processes to produce physical models of terrain with accurate freeform surfaces. Recently, many researchers have employed Additive Manufacturing (AM) processes to fabricate physical scale models of terrains. However, they obtained the physical model in several steps, using more than one software package to translate surface DEM data into faceted models and losing data in intermediate file-format conversions. This paper presents a methodology which converts surface DEM data directly into PLY format in a single step, eliminating the data loss associated with translation through intermediate file formats. Two data formats, DEM ASCII XYZ and Surfer Grid, have been converted directly into PLY format. The results of the program are verified and validated with the help of sample data files as well as real-world DEM data.
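
    A minimal single-step converter in the spirit of the paper: read an ASCII XYZ DEM and write the points directly as an ASCII PLY vertex list. The paper's tool also constructs faces for the terrain surface; this sketch omits them and is purely illustrative.

      def xyz_to_ply(xyz_path, ply_path, z_scale=1.0):
          points = []
          with open(xyz_path) as fin:
              for line in fin:
                  parts = line.split()
                  if len(parts) >= 3:                     # skip headers and blanks
                      x, y, z = (float(v) for v in parts[:3])
                      points.append((x, y, z * z_scale))  # optional vertical exaggeration

          with open(ply_path, "w") as fout:
              fout.write("ply\nformat ascii 1.0\n")
              fout.write(f"element vertex {len(points)}\n")
              fout.write("property float x\nproperty float y\nproperty float z\n")
              fout.write("end_header\n")
              for x, y, z in points:
                  fout.write(f"{x} {y} {z}\n")

      # xyz_to_ply("terrain.xyz", "terrain.ply", z_scale=2.0)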

  19. Updated Colombian Seismic Hazard Map

    Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.

    2013-05-01

    The Colombian seismic hazard map used by the National Building Code (NSR-98), in effect until 2009, was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics, which, together with destructive seismic activity in Colombia, has motivated the interest in and the need for a new seismic hazard assessment for the country. Taking advantage of new instrumental information sources such as new broadband stations of the National Seismological Network, new historical seismicity data, standardized global databases and, in general, advances in models and techniques, a new Colombian seismic hazard map was developed. A probabilistic seismic hazard assessment (PSHA) model was applied because it incorporates the effects of all seismic sources that may affect a particular site, resolving the uncertainties arising from the parameters and assumptions made in this kind of study. First, the seismic source geometries and a complete, homogeneous seismic catalog were defined, and the occurrence-rate parameters of each seismic source were calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México (UNAM, National Autonomous University of Mexico). A uniform grid with 0.1° spacing was used to calculate the peak ground acceleration (PGA) and response spectral values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated. With the results, it is
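
    The exceedance-rate logic behind such a map can be sketched for one site and a single point source; all source and attenuation parameters below are invented, and a real PSHA integrates over many sources, relationships and epistemic branches.

      import numpy as np

      rng = np.random.default_rng(2)
      nu = 0.2                    # events/yr with M >= 5 (hypothetical source)
      b = 1.0                     # Gutenberg-Richter b-value
      m_min, m_max = 5.0, 8.0
      R_km = 40.0                 # source-to-site distance

      n = 200_000
      beta = b * np.log(10)       # truncated G-R sampling via inverse CDF
      u = rng.random(n)
      m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

      # toy attenuation: ln PGA[g] = -3.5 + 0.9*M - 1.2*ln(R) + eps
      ln_pga = -3.5 + 0.9 * m - 1.2 * np.log(R_km) + rng.normal(0, 0.6, n)
      pga = np.exp(ln_pga)

      for a in (0.05, 0.1, 0.2, 0.4):
          lam = nu * np.mean(pga > a)       # annual exceedance rate
          p50 = 1 - np.exp(-lam * 50)       # Poissonian probability in 50 years
          print(f"PGA > {a:0.2f} g: rate = {lam:.4f}/yr, P(50 yr) = {p50:.3f}")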

  20. EFFECT OF MODE OF ADDITION OF DISINTEGRANTS ON DISSOLUTION OF MODEL DRUG FROM WET GRANULATION TABLETS

    Md. Mofizur Rahman

    2011-02-01

    The purpose of the study was to formulate immediate release tablets using various types of disintegrants (crospovidone, sodium starch glycolate and sodium carboxymethylcellulose) in order to investigate the effect of the mode of incorporation of disintegrants on the release mechanism from tablets. Acetaminophen, a poorly soluble drug, was used as a model drug to evaluate its release characteristics from different formulations. Dissolution profiles were obtained with USP apparatus 2 (paddle) at 50 rpm in 900 ml phosphate buffer pH 5.8. Successive dissolution times, i.e. the times required for 25%, 50% and 80% of the drug to be released (T25%, T50%, T80%), were used to compare the dissolution results, and a one-way analysis of variance (ANOVA) was used to interpret them. Statistically significant differences were found among the drug release profiles of all the formulations except for the mode of addition of crospovidone. At a fixed amount of disintegrant, the extragranular mode of addition appeared to be the best mode of incorporation. The best release was achieved with the crospovidone-containing formulations: the T50% and T80% values indicated that drug release was faster from tablets containing crospovidone, and two formulations showed very small T50% and T80% values, indicating much faster release. Drug release was essentially unaffected by the mode of crospovidone addition. Overall, the results suggest that an appropriate mode of incorporation of disintegrants can enhance the release of poorly soluble drugs.
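
    The T25%, T50% and T80% values are read off a measured dissolution profile by interpolation; a small utility of that kind, with invented data, is shown below.

      import numpy as np

      t_min = np.array([0, 5, 10, 15, 20, 30, 45, 60])        # sampling times, min
      released = np.array([0, 18, 35, 52, 64, 80, 91, 96])    # % drug released

      for target in (25, 50, 80):
          # np.interp wants an increasing x-axis, so interpolate time against release
          t = np.interp(target, released, t_min)
          print(f"T{target}% = {t:.1f} min")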

  1. Assessing the effect, on animal model, of mixture of food additives, on the water balance

    Mariola Friedrich

    2013-03-01

    Purpose. The purpose of this study was to determine, in an animal model, the effect of modification of diet composition and administration of selected food additives on the body's water balance. Material and methods. The study was conducted on 48 male and 48 female Wistar rats (analysed separately for each sex) divided into four groups. For drinking, the animals from groups I and III received water, whereas the animals from groups II and IV were administered 5 ml of a solution of selected food additives (potassium nitrate – E 252, sodium nitrite – E 250, benzoic acid – E 210, sorbic acid – E 200, and monosodium glutamate – E 621). Doses of the administered food additives were computed from the average intake by men, expressed per unit of body mass. Having drunk the solution, the animals were provided water for drinking. Results. The mixture of selected food additives applied in the experiment was found to facilitate water retention in the body in both male and female rats, and the differences observed between the volume of ingested fluids and the volume of excreted urine were statistically significant in the animals fed the basal diet. The type of feed mixture provided to the animals affected the site of water retention: in animals receiving the basal diet, analyses demonstrated a significant increase in water content in the liver tissue, whereas in the animals fed the modified diet water accumulated in the vascular bed. Conclusion. Given this water retention in the vascular bed, the effects of food additive intake may be more adverse in females.

  2. ByMuR model: interaction among risks and uncertainty treatment in long-term multi-hazard/risk assessments

    Selva, J.

    2012-12-01

    Multi-risk approaches have recently been proposed to assess and compare different risks in the same target area. The key points of multi-risk assessment are the development of homogeneous risk definitions and the treatment of risk interaction. Neglecting interaction may lead to significant biases and thus to an erroneous risk hierarchy, which is one of the primary outputs of risk assessments for decision makers. Within the framework of the Italian project "ByMuR - Bayesian Multi-Risk assessment", a formal model (the ByMuR model) to assess multi-risk for a target area is under development, aiming (i) to perform multi-risk analyses that treat interaction between different hazardous phenomena, accounting for possible effects of interaction at the hazard, vulnerability and exposure levels, and (ii) to account explicitly for all uncertainties (aleatory and epistemic) through a Bayesian approach, allowing a meaningful comparison among different risks. The model is meant to be general, but it is targeted at the assessment of volcanic, seismic and tsunami risks for the city of Naples (Italy). Here, the preliminary development of the ByMuR model is presented. The applicability of the methodology is demonstrated through illustrative examples, in which the effects of uncertainties, and the bias in single-risk estimates induced by the assumption of independence among risks, are explicitly assessed. An extensive application of this methodology at regional and sub-regional scales would allow identification of where a given interaction has significant effects on long-term risk assessments, and thus of when multi-risk analyses should be considered in order to provide unbiased risk estimates.
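
    As a back-of-the-envelope illustration of the bias the abstract warns about (not the ByMuR model itself), the following Monte Carlo compares the joint probability of two interacting hazards with the naive independence estimate; every probability used here is hypothetical.

      import numpy as np

      rng = np.random.default_rng(3)
      n_years = 1_000_000
      p_a = 0.02                  # annual probability of hazard A
      p_b_given_a = 0.30          # A triggers B with this probability
      p_b_alone = 0.01            # background probability of B

      a = rng.random(n_years) < p_a
      b = np.where(a, rng.random(n_years) < p_b_given_a,
                      rng.random(n_years) < p_b_alone)

      p_joint = np.mean(a & b)              # interaction-aware estimate
      p_indep = np.mean(a) * np.mean(b)     # naive independence assumption
      print(f"P(A and B) with interaction:   {p_joint:.5f}")
      print(f"P(A and B) under independence: {p_indep:.5f} "
            f"(about {p_joint / p_indep:.0f}x too small)")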

  3. An Empirical Research on the Model of the Right in Additional Allocation of Stocks

    2002-01-01

    How to define the value of the Right in Additional Allocation of Stocks (RAAS) plays an important role in stock markets, whether or not the shareholders exercise the right. Moreover, the valuation of the RAAS and the exercise price (K) are mutually cause and effect. Based on the literature on this subject, this paper presents a model valuing the RAAS per share. Using public information from the Shenzhen Stock Market, we simulate the RAAS value of shenwuye, which is a shareholding corp...

  4. Thermodynamic network model for predicting effects of substrate addition and other perturbations on subsurface microbial communities

    Jack Istok; Melora Park; James McKinley; Chongxuan Liu; Lee Krumholz; Anne Spain; Aaron Peacock; Brett Baldwin

    2007-04-19

    The overall goal of this project is to develop and test a thermodynamic network model for predicting the effects of substrate additions and environmental perturbations on microbial growth, community composition and system geochemistry. The hypothesis is that a thermodynamic analysis of the energy-yielding growth reactions performed by defined groups of microorganisms can be used to make quantitative and testable predictions of the change in microbial community composition that will occur when a substrate is added to the subsurface or when environmental conditions change.

  5. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
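
    To make the four relationship types concrete, a small numerical reading of how the joint occurrence probability of two hazards A and B could differ across them is given below; the probabilities and conditional terms are invented, and the paper's exact formulas may differ.

      pA, pB = 0.10, 0.05      # annual occurrence probabilities (hypothetical)
      pT = 0.20                # probability of a shared trigger (parallel case)
      pA_T, pB_T = 0.6, 0.4    # P(A|trigger), P(B|trigger), assumed cond. independent
      pB_A = 0.50              # P(B|A) when A triggers B (series case)

      joint = {
          "independent": pA * pB,          # unrelated triggers
          "mutex":       0.0,              # triggers cannot coincide
          "parallel":    pT * pA_T * pB_T, # both driven by the same trigger
          "series":      pA * pB_A,        # A occurs, then sets off B
      }
      for rel, p in joint.items():
          print(f"{rel:12s} P(A and B) = {p:.4f}")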

  6. Hybrid 2D-3D modelling of GTA welding with filler wire addition

    Traidia, Abderrazak

    2012-07-01

    A hybrid 2D-3D model for the numerical simulation of Gas Tungsten Arc welding is proposed in this paper. It offers the possibility to predict the temperature field as well as the shape of the solidified weld joint for different operating parameters, with relatively good accuracy and reasonable computational cost. Also, an original approach to simulate the effect of immersing a cold filler wire in the weld pool is presented. The simulation results reveal two important observations. First, the weld pool depth is locally decreased in the presence of filler metal, which is due to the energy absorption by the cold feeding wire from the hot molten pool. In addition, the weld shape, maximum temperature and thermal cycles in the workpiece are relatively well predicted even when a 2D model for the arc plasma region is used. © 2012 Elsevier Ltd. All rights reserved.

  8. Guarana provides additional stimulation over caffeine alone in the planarian model.

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R; Constable, Mic Andre; Mulligan, Margaret E; Voura, Evelyn B

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians with substance exposure has been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose. PMID:25880065

  9. Quantifying spatial disparities in neonatal mortality using a structured additive regression model.

    Lawrence N Kazembe

    BACKGROUND: Neonatal mortality contributes a large proportion of early childhood mortality in developing countries, with considerable geographical variation at small-area level within countries. METHODS: A geo-additive logistic regression model is proposed for quantifying small-scale geographical variation in neonatal mortality and for estimating risk factors of neonatal mortality. Random effects are introduced to capture spatial correlation and heterogeneity. The spatial correlation can be modelled using Markov random fields (MRF) when data are aggregated, while two-dimensional P-splines apply when exact locations are available; unstructured spatial effects are assigned an independent Gaussian prior. Socio-economic and bio-demographic factors which may affect the risk of neonatal mortality are simultaneously estimated as fixed effects and as nonlinear effects for continuous covariates. The smooth effects of continuous covariates are modelled by second-order random walk priors. Modelling and inference use an empirical Bayesian approach via the penalized likelihood technique. The methodology is applied to analyse the likelihood of neonatal deaths, using data from the 2000 Malawi demographic and health survey. The spatial effects are quantified through MRF and two-dimensional P-spline priors. RESULTS: Findings indicate that both fixed and spatial effects are associated with neonatal mortality. CONCLUSIONS: Our study therefore suggests that the challenge of reducing neonatal mortality goes beyond addressing individual factors, and also requires an understanding of unmeasured covariates for potentially effective interventions.

  10. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models.

    Fan, Jianqing; Feng, Yang; Song, Rui

    2011-06-01

    A variable screening procedure via correlation learning was proposed in Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend correlation learning to marginal nonparametric learning. Our nonparametric independence screening, called NIS, is a specific member of the sure independence screening class. Several closely related variable screening procedures are proposed. Under general nonparametric models and mild technical conditions, the proposed independence screening methods are shown to enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, data-driven thresholding and iterative nonparametric independence screening (INIS) are also proposed to enhance finite-sample performance for fitting sparse additive models. The simulation results and a real data analysis demonstrate that the proposed procedure works well with moderate sample size and large dimension and performs better than competing methods. PMID:22279246
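
    A toy version of the screening idea is easy to write down, with a marginal cubic-polynomial fit standing in for the spline smoothers used in NIS; data and dimensions are synthetic.

      import numpy as np

      rng = np.random.default_rng(4)
      n, p = 400, 1000
      X = rng.normal(size=(n, p))
      y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.5, n)   # sparse truth

      def marginal_r2(x, y):
          coef = np.polyfit(x, y, 3)            # marginal cubic fit
          resid = y - np.polyval(coef, x)
          return 1 - resid.var() / y.var()

      scores = np.array([marginal_r2(X[:, j], y) for j in range(p)])
      top = np.argsort(scores)[::-1][:10]       # keep the top-ranked variables
      print("selected indices:", sorted(top.tolist()))   # typically contains 0 and 1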

  11. Mathematical modelling of the composition of a high-strength composite concrete containing blended carbonate additive

    A new high-strength concrete has been developed based on the use of blended carbonates as an active additive (BCA). The main technological features are the preliminary mechanical-chemical activation of this natural mineral product and a staged method of production. A three-parameter polynomial model has been developed for determining the main formulation parameters (Portland cement content, BCA content and water/cement ratio) by evaluating their influence on the evolution of compressive strength over a one-year hardening period. The experimental plan contains 27 tests. Regression equations have been calculated for five ages, and the regression coefficients, reflecting the trend and the effect of the three factors on the output data over the investigated period, have been analyzed. The compressive strength as a function of two factors, with the third held constant, has been plotted for the ages of 28 and 365 days. These plots are important for construction practice, as they display the whole spectrum of possibilities for varying the formulation parameters while achieving the specified design strength. Key words: high-strength composite concrete, blended carbonate additive, polynomial model, regression, compressive strength
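
    A generic sketch of this kind of three-factor polynomial response surface, fitted to simulated data on a coded 3^3 plan (the authors' data and coefficients are not reproduced here), is shown below.

      import numpy as np
      from itertools import product
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import PolynomialFeatures

      levels = [-1.0, 0.0, 1.0]                      # coded factor levels, 3^3 = 27 runs
      X = np.array(list(product(levels, repeat=3)))  # cement, BCA, w/c (coded)

      rng = np.random.default_rng(5)
      strength = (60 + 8 * X[:, 0] + 5 * X[:, 1] - 10 * X[:, 2]
                  - 3 * X[:, 2] ** 2 + rng.normal(0, 1.5, len(X)))  # MPa, simulated

      poly = PolynomialFeatures(degree=2, include_bias=False)
      model = LinearRegression().fit(poly.fit_transform(X), strength)
      print("R2:", round(model.score(poly.transform(X), strength), 3))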

  12. Model Scramjet Inlet Unstart Induced by Mass Addition and Heat Release

    Im, Seong-Kyun; Baccarella, Damiano; McGann, Brendan; Liu, Qili; Wermer, Lydiy; Do, Hyungrok

    2015-11-01

    Inlet unstart phenomena in a model scramjet are investigated in an arc-heated hypersonic wind tunnel. Unstart events induced by nitrogen or ethylene jets at low- or high-enthalpy Mach 4.5 freestream conditions are compared. The jet injection pressurizes the downstream flow by mass addition and flow blockage; in the case of ethylene injection, heat release from combustion increases the backpressure further. Time-resolved schlieren imaging is performed at the jet and at the lip of the model inlet to visualize the flow features during unstart, and high-frequency pressure measurements provide information on pressure fluctuations at the scramjet wall. In both the mass-driven and heat-release-driven cases, similar transient and quasi-steady behaviors of the unstart shock system are observed during the unstart process. Combustion-driven unstart induces severe oscillatory motions of the jet and of the unstart shock at the lip of the scramjet inlet after the unstart process is complete, while the unstarted flow induced solely by mass addition remains relatively steady. The discrepancies between mass-driven and heat-release-driven unstart are explained by the flow choking mechanism.

  13. Exploration of land-use scenarios for flood hazard modeling – the case of Santiago de Chile

    A. Müller

    2011-04-01

    Urban expansion leads to modifications in land use and land cover and to the loss of vegetated areas. In some regions of the world these developments are accelerated by a changing regional climate. As a consequence, major changes in the amount of green space can be observed in many urban regions. Among other factors, the amount of green space determines the availability of retention areas in a watershed. The goal of this research is to develop possible land-use and land-cover scenarios for a watershed and to explore the influence of land-use and land-cover changes on its runoff behavior using the distributed hydrological model HEC-HMS. The study area for this research is a small peri-urban watershed in the eastern area of Santiago de Chile.

    Three spatially explicit exploratory land-use/land-cover scenario alternatives were developed based on the analysis of previous land-use developments using high-resolution satellite data, on the analysis of urban planning laws, on the analysis of climate change predictions, and on expert interviews. Modeling the resulting changes in runoff allows predictions of the changes in flood hazard that the adjacent urban areas face after heavy winter precipitation events. The paper shows how HEC-HMS was used in a distributed event-modeling approach. The derived runoff values are combined with existing flood hazard maps and can be regarded as an important source of information for adaptation to changing conditions in the study area. The most significant finding is that the land-use changes to be expected after long drought periods pose the highest flood risk.

  14. Determinants of Low Birth Weight in Malawi: Bayesian Geo-Additive Modelling.

    Alfred Ngwira

    Studies of the factors behind low birth weight in Malawi have neglected the flexible approach of using smooth functions for some covariates in models. Such a flexible approach reveals the detailed relationship of covariates with the response. The study aimed to investigate risk factors of low birth weight in Malawi, assuming a flexible approach for continuous covariates and a geographical random effect. A Bayesian geo-additive model for birth weight in kilograms and for size of the child at birth (less than average, or average and higher), with district as a spatial effect, was adopted using the 2010 Malawi demographic and health survey data. A Gaussian model for birth weight in kilograms and a binary logistic model for the binary outcome (size of child at birth) were fitted. Continuous covariates were modelled by penalized splines (P-splines) and spatial effects were smoothed by two-dimensional P-splines. The study found that child birth order, mother's weight and mother's height are significant predictors of birth weight. Secondary education of the mother, birth-order categories 2-3 and 4-5, a wealth index of richer family, and mother's height were significant predictors of child size at birth. The area associated with low birth weight was Chitipa, and the areas with increased risk of less-than-average size at birth were Chitipa and Mchinji. The study found support for the flexible modelling of some covariates that clearly have nonlinear influences, but no strong support for the inclusion of geographical spatial analysis. The spatial patterns, though, point to the influence of omitted variables with some spatial structure, or possibly epidemiological processes that account for this spatial structure, and the maps generated could be used for targeting development efforts at a glance.

  15. Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios

    Custer, Rocco; Nishijima, Kazuyoshi

    disaggregation model that considers the uncertainty in the disaggregation, taking basis in the scaled Dirichlet distribution. The proposed probabilistic disaggregation model is applied to a portfolio of residential buildings in the Canton Bern, Switzerland, subject to flood risk. Thereby, the model is verified...

  16. U.S. Department of Energy Workers' mental models of radiation and chemical hazards in the workplace

    A pilot study was performed to test the mental models methodology regarding knowledge and perceptions of U.S. Department of Energy contractor radiation workers about ionizing radiation and hazardous chemicals. The mental models methodology establishes a target population's beliefs about risks and compares them with current scientific knowledge. The ultimate intent is to develop risk communication guidelines that address information gaps or misperceptions that could affect decisions and behavior. In this study, 15 radiation workers from the Hanford Site in Washington State were interviewed about radiation exposure processes and effects. Their beliefs were mapped onto a science model of the same topics to see where differences occurred. In general, workers' mental models covered many of the high-level parts of the science model but did not have the same level of detail. The following concepts appeared to be well understood by most interviewees: types, form, and properties of workplace radiation; administrative and physical controls to reduce radiation exposure risk; and the relationship of dose and effects. However, several concepts were rarely mentioned by most interviewees, indicating potential gaps in worker understanding. Most workers did not discuss the wide range of measures for neutralizing or decontaminating individuals following internal contamination. Few noted specific ways of measuring dose or factors that affect dose. Few mentioned the range of possible effects, including genetic effects, birth defects, or high dose effects. Variables that influence potential effects were rarely discussed. Workers rarely mentioned how basic radiation principles influenced the source, type, or mitigation of radiation risk in the workplace

  17. A habitat suitability model for Chinese sturgeon determined using the generalized additive method

    Yi, Yujun; Sun, Jie; Zhang, Shanghong

    2016-03-01

    The Chinese sturgeon is a type of large anadromous fish that migrates between the ocean and rivers. Because of the construction of dams, this sturgeon's migration path has been cut off, and this species currently is on the verge of extinction. Simulating suitable environmental conditions for spawning followed by repairing or rebuilding its spawning grounds are effective ways to protect this species. Various habitat suitability models based on expert knowledge have been used to evaluate the suitability of spawning habitat. In this study, a two-dimensional hydraulic simulation is used to inform a habitat suitability model based on the generalized additive method (GAM). The GAM is based on real data. The values of water depth and velocity are calculated first via the hydrodynamic model and later applied in the GAM. The final habitat suitability model is validated using the catch per unit effort (CPUEd) data of 1999 and 2003. The model results show that a velocity of 1.06-1.56 m/s and a depth of 13.33-20.33 m are highly suitable ranges for the Chinese sturgeon to spawn. The hydraulic habitat suitability indexes (HHSI) for seven discharges (4000; 9000; 12,000; 16,000; 20,000; 30,000; and 40,000 m3/s) are calculated to evaluate integrated habitat suitability. The results show that the integrated habitat suitability reaches its highest value at a discharge of 16,000 m3/s. This study is the first to apply a GAM to evaluate the suitability of spawning grounds for the Chinese sturgeon. The study provides a reference for the identification of potential spawning grounds in the entire basin.

  18. Modeling particulate matter concentrations measured through mobile monitoring in a deletion/substitution/addition approach

    Su, Jason G.; Hopke, Philip K.; Tian, Yilin; Baldwin, Nichole; Thurston, Sally W.; Evans, Kristin; Rich, David Q.

    2015-12-01

    Land use regression (LUR) modeling with local-scale circular modeling domains has been used to predict traffic-related air pollution such as nitrogen oxides (NOX). LUR modeling for fine particulate matter (PM), which generally has smaller spatial gradients than NOX, has typically been applied in studies involving multiple study regions. To increase the spatial coverage for fine PM and key constituent concentrations, we designed a mobile monitoring network in Monroe County, New York to measure pollutant concentrations of black carbon (BC, wavelength at 880 nm), ultraviolet black carbon (UVBC, wavelength at 370 nm) and Delta-C (the difference between the UVBC and BC concentrations) using the Clarkson University Mobile Air Pollution Monitoring Laboratory (MAPL). A Deletion/Substitution/Addition (D/S/A) algorithm was applied, using circular buffers as the basis for the statistics. The algorithm maximizes prediction accuracy for locations without measurements using V-fold cross-validation, and it reduces overfitting compared to other approaches. We found that the D/S/A LUR modeling approach achieved good results, with prediction powers of 60%, 63%, and 61%, respectively, for BC, UVBC, and Delta-C. The advantage of mobile monitoring is that it can monitor pollutant concentrations at hundreds of spatial points in a region, rather than the fewer than 100 points typical of a fixed-site saturation monitoring network. This research indicates that a mobile saturation sampling network, when combined with proper modeling techniques, can uncover small-area variations (e.g., 10 m) in particulate matter concentrations.
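
    A stripped-down illustration of the model-selection idea: V-fold cross-validation used to choose the buffer radius whose covariate best predicts the measured concentrations. The single-covariate setup and data are hypothetical; the actual D/S/A algorithm searches a far larger model space by deleting, substituting and adding terms.

      import numpy as np

      rng = np.random.default_rng(6)
      n, V = 300, 10
      true_radius = 200.0
      covariate = {r: rng.normal(size=n) for r in (50.0, 100.0, 200.0, 500.0)}
      y = 2.0 * covariate[true_radius] + rng.normal(0, 1, n)  # synthetic BC levels

      folds = np.arange(n) % V
      for r, x in covariate.items():
          sse = 0.0
          for v in range(V):
              tr, te = folds != v, folds == v
              slope, intercept = np.polyfit(x[tr], y[tr], 1)  # simple linear fit
              sse += np.sum((y[te] - (slope * x[te] + intercept)) ** 2)
          print(f"radius {r:5.0f} m: CV risk = {sse / n:.3f}")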

  20. Generalized Additive Models Used to Predict Species Abundance in the Gulf of Mexico: An Ecosystem Modeling Tool

    Drexler, Michael; Ainsworth, Cameron H.

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries independent data set (SEAMAP) and climate scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duroarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data was inputted into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223

  1. Remediation of metal-contaminated soils with the addition of materials - Part I: characterization and viability studies for the selection of non-hazardous waste materials and silicates.

    González-Núñez, R; Alba, M D; Orta, M M; Vidal, M; Rigol, A

    2011-11-01

    Contamination episodes in soils require interventions to attenuate their impact. These actions are often based on the addition of materials to increase contaminant retention in the soil and to dilute the contaminant concentration. Here, non-hazardous wastes (such as sugar foam, fly ash and a material produced by the zeolitization of fly ash) and silicates (including bentonites) were tested and fully characterized in the laboratory to select suitable materials for remediating metal-contaminated soils. Data from X-ray fluorescence (XRF), N2 adsorption/desorption isotherms, X-ray diffraction (XRD) and scanning electron microscopy/energy-dispersive X-ray spectroscopy (SEM-EDX) analyses revealed the chemical composition, specific surface area and the phases appearing in the materials. A pH titration test allowed the calculation of their acid neutralization capacity (ANC). The metal sorption and desorption capacities of the waste materials and silicates were also estimated. Sugar foam, fly ash and the zeolitic material were the best candidate materials. Sugar foam was selected because of its high ANC (17000 meq kg-1), and the others were selected because of their larger distribution coefficients and lower sorption reversibilities than those predicted in the contaminated soils. PMID:22018740
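
    The batch-test arithmetic behind distribution coefficients such as these is straightforward; a sketch with invented numbers (not the paper's measurements) follows.

      def kd(c0_mg_l, ceq_mg_l, volume_l, mass_kg):
          """Kd in L/kg: sorbed amount per kg of solid over equilibrium concentration."""
          sorbed_mg_per_kg = (c0_mg_l - ceq_mg_l) * volume_l / mass_kg
          return sorbed_mg_per_kg / ceq_mg_l

      # e.g. 25 mL of a 10 mg/L metal solution shaken with 0.5 g of material:
      print(f"Kd = {kd(10.0, 0.8, 0.025, 0.0005):.0f} L/kg")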

  2. Nonlinear feedback in a six-dimensional Lorenz model: impact of an additional heating term

    Shen, B.-W.

    2015-12-01

    In this study, a six-dimensional Lorenz model (6DLM) is derived, based on a recent study using a five-dimensional (5-D) Lorenz model (LM), in order to examine the impact of an additional mode and its accompanying heating term on solution stability. The new mode added to improve the representation of the streamfunction is referred to as a secondary streamfunction mode, while the two additional modes that appear in both the 6DLM and 5DLM, but not in the original LM, are referred to as secondary temperature modes. Two energy conservation relationships of the 6DLM are first derived in the dissipationless limit. The impact of the three additional modes on solution stability is examined by comparing numerical solutions and ensemble Lyapunov exponents of the 6DLM and 5DLM as well as the original LM. For the onset of chaos, the critical value of the normalized Rayleigh number (rc) is determined to be 41.1. The critical value is larger than that in the 3DLM (rc ~ 24.74), but slightly smaller than that in the 5DLM (rc ~ 42.9). A stability analysis and numerical experiments obtained using generalized LMs, with or without simplifications, suggest the following: (1) negative nonlinear feedback associated with the secondary temperature modes, as first identified using the 5DLM, plays a dominant role in improving the solution stability of the 6DLM; (2) the additional heating term associated with the secondary streamfunction mode may destabilize the solution; and (3) the overall feedback due to the secondary streamfunction mode is much smaller than the feedback due to the secondary temperature modes, so the critical Rayleigh number of the 6DLM is comparable to that of the 5DLM. The 5DLM and 6DLM collectively suggest different roles for small-scale processes (i.e., stabilization vs. destabilization), consistent with the following statement by Lorenz (1972): "If the flap of a butterfly's wings can be instrumental in generating a tornado, it can
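
    For orientation, the original three-mode Lorenz system that the 5DLM and 6DLM extend can be integrated in a few lines; r is the normalized Rayleigh number whose critical value the abstract compares across models. The 6DLM's additional modes and heating term are not reproduced here.

      import numpy as np
      from scipy.integrate import solve_ivp

      sigma, r, b = 10.0, 28.0, 8.0 / 3.0   # r > ~24.74 gives chaos in the 3DLM

      def lorenz(t, state):
          x, y, z = state
          return [sigma * (y - x), x * (r - z) - y, x * y - b * z]

      sol = solve_ivp(lorenz, (0, 40), [1.0, 1.0, 1.0],
                      rtol=1e-9, atol=1e-9)
      print("state at t = 40:", sol.y[:, -1])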

  3. Horizontal deployment of the search for potentially hazardous facilities on digital plant model. Application of Heinrich's law

    This paper proposes an innovative method for finding potentially hazardous facilities that share conditions with facilities that have had accidents. By providing a digital plant model equipped with the knowledge contained in drawings, specification sheets, databases, and design, maintenance and safety standards, a logical search for risky facilities is conducted, and the results are visualized by color on knowledge-based drawings. Heinrich's law holds that a severe accident is accompanied by roughly 30 moderate and 300 minor incidents; to prevent critical accidents it is therefore necessary to prevent the minor ones. The proposed digital search helps prevent critical accidents by finding similar causes among minor incidents. Sharing information on problems among plant owners, plant makers, and third parties, including municipal authorities, is also important. (author)

  4. Climate change impact assessment on Veneto and Friuli plain groundwater. Part I: An integrated modeling approach for hazard scenario construction

    Baruffi, F.; Cisotto, A.; Cimolino, A.; Ferri, M.; Monego, M.; Norbiato, D.; Cappelletto, M.; Bisaglia, M.; Pretner, A.; Galli, A.; Scarinci, A.; Marsala, V.; Panelli, C.; Gualdi, S.; Bucchignani, E.; Torresan, S.; Pasini, S.; Critto, A.; and others

    2012-12-01

    Climate change impacts on water resources, particularly groundwater, are a highly debated topic worldwide, triggering international attention and interest from both researchers and policy makers due to their relevant link with European water policy directives (e.g. 2000/60/EC and 2007/118/EC) and related environmental objectives. Understanding the long-term impacts of climate variability and change is therefore a key challenge in order to address effective protection measures and to implement sustainable management of water resources. This paper presents the modeling approach adopted within the Life+ project TRUST (Tool for Regional-scale assessment of groUndwater Storage improvement in adaptation to climaTe change) in order to provide climate change hazard scenarios for the shallow groundwater of the high Veneto and Friuli Plain, Northern Italy. Given the aim of evaluating potential impacts on water quantity and quality (e.g. groundwater level variation, decrease of water availability for irrigation, variations of nitrate infiltration processes), the modeling approach integrated an ensemble of climate, hydrologic and hydrogeologic models running from the global to the regional scale. Global and regional climate models and downscaling techniques were used to make climate simulations for the reference period 1961-1990 and the projection period 2010-2100. The simulation of the recent climate was performed using observed radiative forcings, whereas the projections were done prescribing the radiative forcings according to the IPCC A1B emission scenario. The climate simulations and the downscaling then provided the precipitation, temperature and evapo-transpiration fields used for the impact analysis. Based on downscaled climate projections, 3 reference scenarios for the period 2071-2100 (i.e. the driest, the wettest and the mild year) were selected and used to run a regional geomorphoclimatic and hydrogeological model. The final output of the model ensemble produced

  6. A Model of Incentive Compatibility under Moral Hazard in Livestock Disease Outbreak Response

    Gramig, Benjamin M.; Horan, Richard D.; Wolf, Christopher A.

    2005-01-01

    This paper uses a principal-agent model to examine incentive compatibility in the presence of information asymmetry between the government and individual producers. Prior models of livestock disease have not incorporated information asymmetry between livestock managers and social planners. By incorporating the asymmetry, we investigate the role of incentives in producer behavior that influences the duration and magnitude of a disease epidemic.

  7. A Coupled Damage and Reaction Model for Simulating Energetic Material Response to Impact Hazards

    BAER,MELVIN R.; DRUMHELLER,D.S.; MATHESON,E.R.

    1999-09-01

    The Baer-Nunziato multiphase reactive theory for a granulated bed of energetic material is extended to allow for dynamic damage processes that generate new surfaces as well as porosity. The Second Law of Thermodynamics is employed to constrain the constitutive forms of the mass, momentum, and energy exchange functions, as well as those of the mechanical damage model, ensuring that the models are dissipative. The focus here is on the constitutive forms of the exchange functions; the mechanical constitutive modeling is discussed in a companion paper. The mechanical damage model provides the dynamic surface area and porosity information needed by the exchange functions to compute combustion rates and interphase momentum and energy exchange rates. The models are implemented in the CTH shock physics code and used to simulate delayed detonations due to impacts in a bed of granulated energetic material and an undamaged cylindrical sample.

  8. Distributional modeling and short-term forecasting of electricity prices by Generalized Additive Models for Location, Scale and Shape

    In the context of liberalized and deregulated electricity markets, price forecasting has become increasingly important for energy companies' plans and market strategies. Within the class of time series models used to perform price forecasting, the subclasses of methods based on stochastic time series and causal models commonly provide point forecasts, whereas the corresponding uncertainty is quantified by approximate or simulation-based confidence intervals. Aiming to improve the uncertainty assessment, this study introduces Generalized Additive Models for Location, Scale and Shape (GAMLSS) to model the dynamically varying distribution of prices. GAMLSS allow fitting a variety of distributions whose parameters change according to covariates via a number of linear and nonlinear relationships. In this way, price periodicities, trends and abrupt changes characterizing both the position parameter (linked to the expected value of prices) and the scale and shape parameters (related to price volatility, skewness, and kurtosis) can be explicitly incorporated in the model setup. Relying on the past behavior of prices and exogenous variables, GAMLSS enable short-term (one-day-ahead) forecasts of the entire distribution of prices. The approach was tested on two datasets from the widely studied California Power Exchange (CalPX) market and the less mature Italian Power Exchange (IPEX). The CalPX data allow comparison of the GAMLSS forecasting performance with published results obtained with different models. The study points out that the GAMLSS framework can be a flexible alternative to several linear and nonlinear stochastic models.
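
    The core location-scale idea can be sketched as a Gaussian maximum-likelihood fit in which both the mean and the log standard deviation depend on a covariate. Real GAMLSS fits richer distributions with smooth terms and more covariates, so the following is only the skeleton, on simulated data.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(7)
      x = rng.uniform(0, 1, 500)                       # e.g. a normalized load proxy
      y = 30 + 20 * x + np.exp(0.2 + 1.5 * x) * rng.normal(size=500)

      def nll(theta):
          b0, b1, c0, c1 = theta
          mu = b0 + b1 * x                             # location model
          sigma = np.exp(c0 + c1 * x)                  # scale model, kept positive
          return -np.sum(norm.logpdf(y, mu, sigma))

      start = [y.mean(), 0.0, np.log(y.std()), 0.0]
      fit = minimize(nll, start, method="Nelder-Mead",
                     options={"maxiter": 5000})
      print("estimates (b0, b1, c0, c1):", np.round(fit.x, 2))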

  9. Hazardous Chemicals

    2007-04-10

    Chemicals are a part of our daily lives, providing many products and modern conveniences. With more than three decades of experience, the Centers for Disease Control and Prevention (CDC) has been at the forefront of efforts to protect people from, and assess their exposure to, environmental and hazardous chemicals. This report provides information about hazardous chemicals and useful tips on how to protect yourself and your family from harmful exposure.  Created: 4/10/2007 by CDC National Center for Environmental Health.   Date Released: 4/13/2007.

  10. Estimation of the lag time in a subsequent monomer addition model for fibril elongation.

    Shoffner, Suzanne K; Schnell, Santiago

    2016-08-01

    Fibrillogenesis, the production or development of protein fibers, has been linked to protein folding diseases. The progress curve of fibrils or aggregates typically takes on a sigmoidal shape with a lag phase, a rapid growth phase, and a final plateau regime. The study of the lag phase and the estimation of its critical timescale provide insight into the factors regulating the fibrillation process. However, methods to estimate a quantitative expression for the lag time rely on empirical expressions, which cannot connect the lag time to kinetic parameters associated with the reaction mechanisms of protein fibrillation. Here we introduce an approach for the estimation of the lag time using the governing rate equations of the elementary reactions of a subsequent monomer addition model for protein fibrillation as a case study. We show that the lag time is given by the sum of the critical timescales for each fibril intermediate in the subsequent monomer addition mechanism and therefore reveals causal connectivity between intermediate species. Furthermore, we find that single-molecule assays of protein fibrillation can exhibit a lag phase without a nucleation process, while dyes and extrinsic fluorescent probe bulk assays of protein fibrillation do not exhibit an observable lag phase during template-dependent elongation. Our approach could be valuable for investigating the effects of intrinsic and extrinsic factors to the protein fibrillation reaction mechanism and provides physicochemical insights into parameters regulating the lag phase. PMID:27250246
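
    The abstract's central claim can be checked numerically on a toy subsequent-monomer-addition chain: the characteristic completion time is on the order of the sum of the intermediate timescales 1/k_i. The rate constants below are invented.

      import numpy as np
      from scipy.integrate import solve_ivp

      k = np.array([2.0, 1.0, 0.5, 0.25])      # per-step rate constants, 1/h

      def chain(t, c):
          # c[0..3] are successive intermediates, c[4] is the final fibril
          dc = np.zeros(5)
          dc[0] = -k[0] * c[0]
          for i in range(1, 4):
              dc[i] = k[i - 1] * c[i - 1] - k[i] * c[i]
          dc[4] = k[3] * c[3]
          return dc

      sol = solve_ivp(chain, (0, 40), [1, 0, 0, 0, 0], dense_output=True)
      t = np.linspace(0, 40, 4001)
      fibril = sol.sol(t)[4]                    # monotone completion curve
      t_half = t[np.searchsorted(fibril, 0.5)]  # half-completion time
      print(f"sum of 1/k_i = {np.sum(1 / k):.1f} h, "
            f"half-completion at ~{t_half:.1f} h (same order)")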

  11. Supra-additive effects of tramadol and acetaminophen in a human pain model.

    Filitz, Jörg; Ihmsen, Harald; Günther, Werner; Tröster, Andreas; Schwilden, Helmut; Schüttler, Jürgen; Koppert, Wolfgang

    2008-06-01

    The combination of analgesic drugs with different pharmacological properties may show better efficacy with fewer side effects. The aim of this study was to examine the analgesic and antihyperalgesic properties of the weak opioid tramadol and the non-opioid acetaminophen, alone as well as in combination, in an experimental pain model in humans. After approval by the local Ethics Committee, 17 healthy volunteers were enrolled in this double-blind, placebo-controlled, cross-over study. Transcutaneous electrical stimulation at high current densities (29.6 ± 16.2 mA) induced spontaneous acute pain (NRS = 6 of 10) and distinct areas of hyperalgesia for painful mechanical stimuli (pinprick hyperalgesia). Pain intensities as well as the extent of the areas of hyperalgesia were assessed before, during and 150 min after a 15-min intravenous infusion of acetaminophen (650 mg), tramadol (75 mg), a combination of both (325 mg acetaminophen and 37.5 mg tramadol), or saline 0.9%. Tramadol led to a maximum pain reduction of 11.7 ± 4.2% with negligible antihyperalgesic properties. In contrast, acetaminophen led to a similar pain reduction (9.8 ± 4.4%) but a sustained antihyperalgesic effect (34.5 ± 14.0% reduction of the hyperalgesic area). The combination of both analgesics at half doses led to a supra-additive pain reduction of 15.2 ± 5.7% and an enhanced antihyperalgesic effect (41.1 ± 14.3% reduction of the hyperalgesic areas) as compared with single administration of acetaminophen. Our study provides the first results on interactions of tramadol and acetaminophen on experimental pain and hyperalgesia in humans. Pharmacodynamic modeling combined with the isobolographic technique showed supra-additive effects of the combination of acetaminophen and tramadol concerning both analgesia and antihyperalgesia. The results may serve as a rationale for combining the two analgesics. PMID:17709207

  12. Applications of the seismic hazard model of Italy: from a new building code to the L'Aquila trial against seismologists

    Meletti, C.

    2013-05-01

    In 2003, a large national project for updating the seismic hazard map and the seismic zoning in Italy started, according to the rules fixed by an Ordinance of the Italian Prime Minister. New input elements for probabilistic seismic hazard assessment were compiled: the earthquake catalogue, the seismogenic zonation, the catalogue completeness, and a set of new attenuation relationships. The map of expected PGA on rock soil conditions with 10% probability of exceedance is the new reference seismic hazard map for Italy (http://zonesismiche.mi.ingv.it). Subsequently, maps for nine further probabilities of exceedance, uniform hazard spectra up to 2 s, and the disaggregation of the PGA were also released. A comprehensive seismic hazard model that fully describes the seismic hazard in Italy thus became available, accessible through a webGIS application (http://esse1-gis.mi.ingv.it/en.php). The detailed information makes it possible to change the approach for evaluating the proper seismic action for design: from a zone-dependent approach (in Italy there were 4 seismic zones, each one with a single design spectrum) to a site-dependent approach in which the design spectrum is defined at each site of a grid of about 11000 points covering the whole national territory. The new building code became mandatory only after the 6 April 2009 L'Aquila earthquake, the first strong event in Italy after the release of the seismic hazard map. The large number of recordings and the values of the experienced accelerations suggested comparisons between the recorded spectra and the spectra defined in the seismic codes. Even if such comparisons could be robust only after several consecutive 50-year periods of observation, and in a probabilistic approach it is not a single observation that can validate or reject the hazard estimate, some of the comparisons that can be undertaken between the observed ground motions and the hazard model used for the seismic code have been performed and have shown that the

  13. Thermal stress modeling of in situ vitrified barriers for hazardous waste containment

    Development of In Situ Vitrification technology has included the concept of subsurface barriers. Structural integrity of vitrified soil bodies is important to barrier performance. Analytical methods are under development for predicting thermal-structural performance during melt cooldown. A thermal modeling capability has been developed for predicting the cooling transient of subsurface molten masses using the finite element method. A computationally efficient 'instant freezing' model was demonstrated to give qualitative agreement with a more sophisticated creep model for predicted stresses. A method for predicting stress relief due to cracking, as a preliminary step toward predicting crack densities, has been demonstrated. 8 refs., 10 figs

  14. Earthquake forecasting and seismic hazard analysis: some insights on the testing phase and the modeling

    Taroni, Matteo

    2014-01-01

    This thesis is divided into three chapters. In the first chapter we analyse the results of the world forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests that...

  15. Modelling by Petri Nets of an active product for the security management of hazardous products

    Zouinkhi, Ahmed; Bajic, Eddy; Zidi, Raja; Ben Gayed, Mohamed; Rondeau, Eric; Abdelkrim, Naceur

    2008-01-01

    This paper proposes a coloured Petri net model of an innovative concept of active products, which are equipped with a wireless sensor network platform and ambient communication capabilities in order to increase security, in an ambient-intelligence context, for a chemical substance warehouse. The active product concept, supported by the model we propose, enables objects to interact with each other in an autonomous, transparent and intelligent way, without...

  16. Two state model for a constant disease hazard in paratuberculosis (and other bovine diseases).

    Louzoun, Yoram; Mitchell, Rebecca; Behar, Hilla; Schukken, Ynte

    2015-01-01

    Many diseases are characterized by a long and varying sub-clinical period. Two main mechanisms can explain such periods: a slow progression toward disease or a sudden transition from a healthy state to a disease state induced by internal or external events. Here we survey epidemiological features of the amount of bacteria shed during Mycobacterium avium subspecies paratuberculosis (MAP) infection to test which of these two models, slow progression or sudden transition (or a combination of the two), better explains the transition from intermittent and low shedding to high shedding. Often, but not always, high shedding is associated with the occurrence of clinical signs. In the case of MAP, the clinical signs include diarrhea, low milk production, poor fertility and eventually emaciation and death. We propose a generic model containing bacterial growth, immune control and fluctuations. This generic model can represent the two hypothesized types of transitions in different parameter regimes. The results show that the sudden transition model provides a simpler explanation of the data, but it also suffers from some limitations. We discuss the different immunological mechanisms that can explain and support the sudden transition model and the interpretation of each term in the studied model. These conclusions are applicable to a wide variety of diseases, and MAP serves as a good test case thanks to the large-scale measurements of single-cow longitudinal profiles available for this disease. PMID:26092587
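
    The "sudden transition" mechanism can be illustrated with a toy bistable stochastic model (not the authors' equations): a low-shedding and a high-shedding state separated by an unstable threshold, with fluctuations occasionally driving the system across it. All parameter values below are illustrative.

```python
# Euler-Maruyama sketch of a bistable "sudden transition" toy model.
import numpy as np

rng = np.random.default_rng(1)
a, sigma, dt, n = 0.3, 0.08, 0.01, 200_000   # threshold, noise, step, steps
b = np.empty(n)
b[0] = 0.05                                   # start near the low state
for i in range(1, n):
    # double-well drift: b = 0 (low shedding) and b = 1 (high shedding)
    # are stable, b = a is the unstable threshold between them
    drift = b[i-1] * (b[i-1] - a) * (1.0 - b[i-1])
    b[i] = max(b[i-1] + drift*dt + sigma*np.sqrt(dt)*rng.normal(), 0.0)

crossing = np.argmax(b > 0.8) * dt            # first visit to the high state
print("first passage to high-shedding state at t ≈", crossing)
```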

  17. Modelling of C2 addition route to the formation of C60

    Khan, Sabih D

    2016-01-01

    To understand the phenomenon of fullerene growth during its synthesis, an attempt is made to model a minimum-energy growth route using a semi-empirical quantum mechanics code. C2 addition leading to C60 was modelled and three main routes, i.e. cyclic ring growth, the pentagon road and the fullerene road, were studied. The growth starts with linear chains and, at n = 10, ring structures begin to dominate. The rings continue to grow and, at some point n > 30, they transform into closed-cage fullerenes, and the growth is shown to progress by the fullerene road until C60 is formed. The computer simulations predict a transition from a C38 ring to a fullerene. Other growth mechanisms could also occur in the energetic environment commonly encountered in fullerene synthesis, but our purpose was to identify a minimum-energy route, which yields the most probable structures. Our results also indicate that, at n = 20, the corannulene structure is energetically more stable than the corresponding fullerene and graphene sheet; however a ring str...

  18. Independent screening for single-index hazard rate models with ultrahigh dimensional features

    Gorst-Rasmussen, Anders; Scheike, Thomas

    2013-01-01

    In data sets with many more features than observations, independent screening based on all univariate regression models leads to a computationally convenient variable selection method. Recent efforts have shown that, in the case of generalized linear models, independent screening may suffice to capture all relevant features with high probability, even in ultrahigh dimension. It is unclear whether this formal sure screening property is attainable when the response is a right-censored survival time. We propose a computationally very efficient independent screening method for survival data which...
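
    A minimal version of marginal screening for censored data can be sketched as follows: each feature is ranked by its univariate Cox Wald statistic using the lifelines package, as a stand-in for the authors' additive-hazards-based screening statistic. All data are synthetic.

```python
# Independent (marginal) screening for survival data: rank each feature
# by its univariate Cox score and keep the top d.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n, p = 200, 500                                # n << p (ultrahigh dimension)
X = rng.normal(size=(n, p))
hazard = np.exp(0.8*X[:, 0] - 0.8*X[:, 1])     # only two features matter
T = rng.exponential(1.0/hazard)
C = rng.exponential(2.0, n)                    # independent censoring
df = pd.DataFrame({"time": np.minimum(T, C), "event": T <= C})

scores = np.empty(p)
for j in range(p):
    d = df.assign(x=X[:, j])
    cph = CoxPHFitter().fit(d, duration_col="time", event_col="event")
    scores[j] = abs(cph.summary.loc["x", "z"])  # marginal Wald statistic
keep = np.argsort(scores)[::-1][:20]            # screened feature set
print("top-ranked features:", sorted(keep[:5]))
```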

  19. Citizens' Perceptions of Flood Hazard Adjustments: An Application of the Protective Action Decision Model

    Terpstra, Teun; Lindell, Michael K.

    2013-01-01

    Although research indicates that adoption of flood preparations among Europeans is low, only a few studies have attempted to explain citizens' preparedness behavior. This article applies the Protective Action Decision Model (PADM) to explain flood preparedness intentions in the Netherlands. Survey data (N = 1,115) showed that…

  20. Life-Stage Physiologically-Based Pharmacokinetic (PBPK) Model Applications to Screen Environmental Hazards.

    This presentation discusses methods used to extrapolate from in vitro high-throughput screening (HTS) toxicity data for an endocrine pathway to in vivo for early life stages in humans, and the use of a life stage PBPK model to address rapidly changing physiological parameters. A...

  1. Hazard rate model and statistical analysis of a compound point process

    Volf, Petr

    2005-01-01

    Roč. 41, č. 6 (2005), s. 773-786. ISSN 0023-5954 R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * Cox regression model * intensity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.343, year: 2005
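
    For illustration, a compound point process of the kind studied here can be simulated by drawing event times from a nonhomogeneous intensity via thinning and summing i.i.d. marks; the intensity function and mark distribution below are arbitrary choices, not those of the paper.

```python
# Simulate a compound point process: event times from a nonhomogeneous
# Poisson process (via Lewis-Ogata thinning) with i.i.d. marks summed.
import numpy as np

rng = np.random.default_rng(3)
lam = lambda t: 1.0 + 0.8*np.sin(t)       # hypothetical intensity (events/unit)
lam_max, T = 1.8, 50.0                    # dominating rate and horizon

t, times = 0.0, []
while True:
    t += rng.exponential(1.0/lam_max)     # candidate event from rate lam_max
    if t > T:
        break
    if rng.uniform() < lam(t)/lam_max:    # accept with probability lam/lam_max
        times.append(t)

marks = rng.exponential(2.0, size=len(times))   # i.i.d. jump sizes
print(len(times), "events; compound process value at T:", marks.sum())
```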

  2. Development of a high-fidelity numerical model for hazard prediction in the urban environment

    The release of chemical, biological, radiological, or nuclear (CBRN) agents by terrorists or rogue states in a North American city (densely populated urban centre) and the subsequent exposure, deposition, and contamination are emerging threats in an uncertain world. The transport, dispersion, deposition, and fate of a CBRN agent released in an urban environment is an extremely complex problem that encompasses potentially multiple space and time scales. The availability of high-fidelity, time-dependent models for predicting a CBRN agent's movement and fate in a complex urban environment provides the strongest technical and scientific foundation for Canada's broader effort at advancing counter-terrorism planning and operational capabilities. The objective of this paper is to report the progress of developing and validating an integrated, state-of-the-art, high-fidelity, multi-scale, multi-physics modeling system for the accurate and efficient prediction of urban flow and dispersion of CBRN materials. Development of this proposed multi-scale modeling system will provide the real-time modeling and simulation tool required to predict injuries, casualties, and contamination, and to make relevant decisions in order to minimize the consequences of a CBRN incident within a pre-determined decision-making framework. (author)

  3. Modelling extreme flood hazard events on the middle Yellow River using DFLOW-flexible mesh approach

    M. Castro Gama

    2013-11-01

    DFLOW-FM (beta) gives the opportunity to enhance the understanding of the behavior of the Yellow River during extreme events. Modeling approaches based on discretizing the domain into square or rectangular grids are of great importance in river management, but they usually present two drawbacks: the meandering of wide, long rivers is not represented with the required accuracy, and computational runtimes are slow because many grid cells are needed. A new tool developed by Deltares, based on a flexible mesh discretization of the domain, offers the advantage that both drawbacks can be overcome. The approach combines different grids in order to properly represent the river and compute the flooding extent accurately. The method is checked and demonstrated on the Yellow River case. Along with the test of the newly proposed modeling method, new characteristics of the spatial flooding process in the Yellow River emerge and are presented in the paper, showing the capabilities of the software tool in modeling such a complex environment as the one studied.

  4. Applying Distributed, Coupled Hydrological Slope-Stability Models for Landslide Hazard Assessments

    Godt, J. W.; Baum, R. L.; Lu, N.; Savage, W. Z.; McKenna, J. P.

    2006-12-01

    Application of distributed, coupled hydrological slope-stability models requires knowledge of hydraulic and material-strength properties at the scale of landslide processes. We describe results from a suite of laboratory and field tests that were used to define the soil-water characteristics of landslide-prone colluvium on the steep coastal bluffs in the Seattle, Washington area, and then use these results in a coupled model. Many commonly used tests to determine soil-water characteristics are performed for the drying process. Because most soils display a pronounced hysteresis in the relation between moisture content and matric suction, results from such tests may not accurately describe the soil-water characteristics of the wetting process during rainfall infiltration. Open-tube capillary-rise and constant-flow permeameter tests on bluff colluvium were performed in the laboratory to determine the soil-water characteristic curves (SWCC) and unsaturated hydraulic conductivity functions (HCF) for the wetting process. Field tests using a borehole permeameter were used to determine the saturated hydraulic conductivity of the colluvial materials. Measurements of pore-water response to rainfall were used in an inverse numerical modeling procedure to determine the in-situ hydraulic parameters of hillside colluvium at the scale of the instrument installation. Comparison of laboratory and field results shows that although both techniques generally produce SWCCs and HCFs with similar shapes, differences in bulk density between field and lab tests yield differences in saturated moisture content and saturated hydraulic conductivity. We use these material properties in an application of a new version of a distributed transient slope stability model (TRIGRS) that accounts for the effects of the unsaturated zone on the infiltration process. Applied over a LiDAR-based digital landscape of part of the Seattle area for an hourly rainfall history known to trigger shallow landslides, the
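
    Soil-water characteristic curves and hydraulic conductivity functions of the kind discussed above are commonly parameterized with the van Genuchten-Mualem equations; the sketch below uses that standard form with illustrative parameters, not the measured Seattle colluvium values.

```python
# van Genuchten soil-water characteristic curve (SWCC) and the
# corresponding Mualem hydraulic conductivity function (HCF).
import numpy as np

theta_r, theta_s = 0.05, 0.45       # residual / saturated moisture content
alpha, n_vg, K_s = 0.5, 1.8, 1e-5   # 1/m, -, m/s (hypothetical values)
m = 1.0 - 1.0/n_vg

def swcc(psi):
    """Volumetric moisture content as a function of matric suction psi (m)."""
    Se = (1.0 + (alpha*np.abs(psi))**n_vg) ** (-m)   # effective saturation
    return theta_r + (theta_s - theta_r)*Se

def hcf(psi):
    """Mualem unsaturated hydraulic conductivity (m/s)."""
    Se = (1.0 + (alpha*np.abs(psi))**n_vg) ** (-m)
    return K_s * np.sqrt(Se) * (1.0 - (1.0 - Se**(1.0/m))**m)**2

psi = np.logspace(-2, 2, 5)          # suctions from 0.01 to 100 m
print(np.c_[psi, swcc(psi), hcf(psi)])
```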

  5. Individual-level space-time analyses of emergency department data using generalized additive modeling

    Vieira Verónica M

    2012-08-01

    Background: Although daily emergency department (ED) data is a source of information that often includes residence, its potential for space-time analyses at the individual level has not been fully explored. We propose that ED data collected for surveillance purposes can also be used to inform spatial and temporal patterns of disease using generalized additive models (GAMs). This paper describes the methods for adapting GAMs so they can be applied to ED data. Methods: GAMs are an effective approach for modeling spatial and temporal distributions of point-wise data, producing smoothed surfaces of continuous risk while adjusting for confounders. In addition to disease mapping, the method allows for global and pointwise hypothesis testing and selection of a statistically optimum degree of smoothing using standard statistical software. We applied a two-dimensional GAM for location to ED data of overlapping calendar time using a locally-weighted regression smoother. To illustrate our methods, we investigated the association between participants' address and the risk of gastrointestinal illness in Cape Cod, Massachusetts over time. Results: The GAM space-time analyses simultaneously smooth in units of distance and time by using the optimum degree of smoothing to create data frames of overlapping time periods and then spatially analyzing each data frame. When the resulting maps are viewed in series, each data frame contributes a movie frame, allowing us to visualize changes in magnitude, geographic size, and location of elevated risk smoothed over space and time. In our example data, we observed an underlying geographic pattern of gastrointestinal illness with risks consistently higher in the eastern part of our study area over time and intermittent variations of increased risk during brief periods. Conclusions: Spatial-temporal analysis of emergency department data with GAMs can be used to map underlying disease risk at the individual level and view
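
    A minimal spatial-GAM sketch in the spirit of this approach is shown below. It assumes the pygam package and uses a tensor-product smooth over location with a logistic link on synthetic case-control data; the authors used a locally weighted regression smoother and overlapping time windows on real ED records.

```python
# Spatial GAM: two-dimensional smooth over (x, y) residence coordinates
# with a logistic link for case/control status.
import numpy as np
from pygam import LogisticGAM, te

rng = np.random.default_rng(4)
n = 2000
xy = rng.uniform(0, 10, size=(n, 2))          # residence coordinates
risk = 1/(1 + np.exp(-(xy[:, 0] - 6)))        # higher risk in the east
case = rng.uniform(size=n) < 0.2*risk

gam = LogisticGAM(te(0, 1)).fit(xy, case)     # tensor smooth in space
# Smoothed risk surface on a grid (one "movie frame" of the analysis):
gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])
surface = gam.predict_mu(grid).reshape(25, 25)
print("risk range over map:", surface.min(), surface.max())
```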

  6. Cadmium-hazard mapping using a general linear regression model (Irr-Cad) for rapid risk assessment.

    Simmons, Robert W; Noble, Andrew D; Pongsakul, P; Sukreeyapongse, O; Chinabut, N

    2009-02-01

    Research undertaken over the last 40 years has identified the irrefutable relationship between long-term consumption of cadmium (Cd)-contaminated rice and human Cd disease. In order to protect public health and livelihood security, the ability to accurately and rapidly determine spatial Cd contamination is a high priority. During 2001-2004, a general linear regression model, Irr-Cad, was developed to predict the spatial distribution of soil Cd in a Cd/Zn co-contaminated, cascading, irrigated rice-based system in Mae Sot District, Tak Province, Thailand (longitude 98.59°-98.63° E, latitude 16.66°-16.67° N). The results indicate that Irr-Cad accounted for 98% of the variance in mean Field Order total soil Cd. Preliminary validation indicated that Irr-Cad-'predicted' mean Field Order total soil Cd was significantly related to measured (aqua regia-digested) total soil Cd for a given Field Order. In 2004-2005, Irr-Cad was utilized to evaluate the spatial distribution of total soil Cd in a 'high-risk' area of Mae Sot District. Secondary validation on six randomly selected field groups verified the Irr-Cad predictions of mean Field Order total soil Cd, and an extended form of the model accounted for over 79% of the variation in mean Field Order bio-available (DTPA (diethylenetriaminepentaacetic acid)-extractable) soil Cd. Rice is the staple food of the countries of the Greater Mekong Sub-region (Vietnam, Myanmar, Lao PDR, Thailand and Yunnan Province, China). These countries also have actively and historically mined Zn, Pb, and Cu deposits where Cd is likely to be a potential hazard if uncontrolled discharge/runoff enters areas of rice cultivation. As such, it is envisaged that the Irr-Cad model could be applied for Cd hazard assessment and effectively form the basis of intervention options and policy decisions to protect public health, livelihoods, and export security. PMID:18311588

  7. The impact of hazardous industrial facilities on housing prices: A comparison of parametric and semiparametric hedonic price models

    Grislain-Letrémy, Céline; Katossky, Arthur

    2014-01-01

    The willingness of households to pay for prevention against industrial risks can be revealed by real estate markets. By using very rich microdata, we study housing prices in the vicinity of hazardous industries near three important French cities. We show that the impact of hazardous plants on the...

  8. Seismic hazard assessment of the Hanford region, Eastern Washington State

    A probabilistic seismic hazard assessment was made for a site within the Hanford region of eastern Washington state, which is characterized as an intraplate region having a relatively low rate of seismic activity. Probabilistic procedures, such as logic trees, were utilized to account for the uncertainties in identifying and characterizing the potential seismic sources in the region. Logic trees provide a convenient, flexible means of assessing the values and relative likelihoods of input parameters to the hazard model that may be dependent upon each other. Uncertainties accounted for in this way include the tectonic model, segmentation, capability, fault geometry, maximum earthquake magnitude, and earthquake recurrence rate. The computed hazard results are expressed as a distribution from which confidence levels are assessed. Analysis of the results shows the contributions to the total hazard from the various seismic sources and from various earthquake magnitudes. In addition, the contributions of uncertainties in the various source parameters to the uncertainty in the computed hazard are assessed. For this study, the major contributions to uncertainty in the computed hazard are due to uncertainties in the applicable tectonic model and the earthquake recurrence rate. This analysis serves to illustrate some of the probabilistic tools that are available for conducting seismic hazard assessments and for analyzing the results of these studies. 5 references, 7 figures
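
    The logic-tree mechanics described above, weighted alternative parameter values combined into a distribution of hazard curves, can be sketched as follows. The branch values, weights, and the toy hazard-curve formula are purely illustrative, not the Hanford inputs.

```python
# Logic-tree sketch for probabilistic seismic hazard: enumerate branch
# combinations, weight the resulting hazard curves, and read off the
# weighted mean and a confidence level.
import numpy as np
from itertools import product

pga = np.logspace(-2, 0, 40)                         # PGA grid (g)

def hazard_curve(rate, b, mmax):
    """Toy annual exceedance curve from recurrence and magnitude params."""
    return rate * np.exp(-2.0*b*np.log10(pga/0.01)) * (1 - np.exp(-(mmax - 5)))

branches = {                                         # (value, weight) pairs
    "rate": [(0.02, 0.3), (0.05, 0.4), (0.10, 0.3)],
    "b":    [(0.8, 0.5), (1.0, 0.5)],
    "mmax": [(6.5, 0.6), (7.0, 0.4)],
}
curves, weights = [], []
for (r, wr), (b, wb), (m, wm) in product(*branches.values()):
    curves.append(hazard_curve(r, b, m))
    weights.append(wr*wb*wm)
curves, weights = np.array(curves), np.array(weights)

mean = weights @ curves                              # weighted mean hazard

def wpct(vals, w, q):
    """Weighted percentile of the branch values at one PGA level."""
    idx = np.argsort(vals)
    return vals[idx][np.searchsorted(np.cumsum(w[idx]), q)]

p85 = np.array([wpct(curves[:, j], weights, 0.85) for j in range(len(pga))])
i = pga.searchsorted(0.1)
print("mean / 85th-percentile exceedance rate at 0.1 g:", mean[i], p85[i])
```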

  9. Probabilistic seismic hazard in the San Francisco Bay area based on a simplified viscoelastic cycle model of fault interactions

    Pollitz, F.F.; Schwartz, D.P.

    2008-01-01

    We construct a viscoelastic cycle model of plate boundary deformation that includes the effect of time-dependent interseismic strain accumulation, coseismic strain release, and viscoelastic relaxation of the substrate beneath the seismogenic crust. For a given fault system, time-averaged stress changes at any point (not on a fault) are constrained to zero; that is, kinematic consistency is enforced for the fault system. The dates of last rupture, mean recurrence times, and the slip distributions of the (assumed) repeating ruptures are key inputs into the viscoelastic cycle model. This simple formulation allows construction of stress evolution at all points in the plate boundary zone for purposes of probabilistic seismic hazard analysis (PSHA). Stress evolution is combined with a Coulomb failure stress threshold at representative points on the fault segments to estimate the times of their respective future ruptures. In our PSHA we consider uncertainties in a four-dimensional parameter space: the rupture periodicities, slip distributions, times of last earthquake (for prehistoric ruptures) and Coulomb failure stress thresholds. We apply this methodology to the San Francisco Bay region using a recently determined fault chronology of area faults. Assuming single-segment rupture scenarios, we find that future rupture probabilities of area faults in the coming decades are highest for the southern Hayward, Rodgers Creek, and northern Calaveras faults. This conclusion is qualitatively similar to that of the Working Group on California Earthquake Probabilities, but the probabilities derived here are significantly higher. Given that fault rupture probabilities are highly model-dependent, no single model should be used to assess time-dependent rupture probabilities. We suggest that several models, including the present one, be used in a comprehensive PSHA methodology, as was done by the Working Group on California Earthquake Probabilities.

  10. Predictive modeling of hazardous waste landfill total above-ground biomass using passive optical and LIDAR remotely sensed data

    Hadley, Brian Christopher

    This dissertation assessed remotely sensed data and geospatial modeling techniques to map the spatial distribution of total above-ground biomass present on the surface of the Savannah River National Laboratory's (SRNL) Mixed Waste Management Facility (MWMF) hazardous waste landfill. Ordinary least squares (OLS) regression, regression kriging, and tree-structured regression were employed to model the empirical relationship between in-situ measured Bahia (Paspalum notatum Flugge) and Centipede [Eremochloa ophiuroides (Munro) Hack.] grass biomass and an assortment of explanatory variables extracted from fine-spatial-resolution passive optical and LIDAR remotely sensed data. Explanatory variables included: (1) discrete channels of visible, near-infrared (NIR), and short-wave infrared (SWIR) reflectance, (2) spectral vegetation indices (SVI), (3) spectral mixture analysis (SMA) modeled fractions, (4) narrow-band derivative-based vegetation indices, and (5) LIDAR-derived topographic variables (i.e. elevation, slope, and aspect). Results showed that a linear combination of the first- (1DZ_DGVI), second- (2DZ_DGVI), and third-derivative green vegetation indices (3DZ_DGVI) calculated from hyperspectral data recorded over the 400-960 nm wavelengths of the electromagnetic spectrum explained the largest percentage of statistical variation (R2 = 0.5184) in the total above-ground biomass measurements. In general, the topographic variables did not correlate well with the MWMF biomass data, accounting for less than five percent of the statistical variation. It was concluded that tree-structured regression represented the optimum geospatial modeling technique due to a combination of model performance and efficiency/flexibility factors.
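
    The comparison of OLS with tree-structured regression can be sketched with scikit-learn on synthetic stand-in data (the MWMF spectra and biomass measurements are not reproduced here):

```python
# OLS versus tree-structured regression for biomass prediction from
# spectral indices, compared by cross-validated R^2.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 300
X = rng.normal(size=(n, 3))                    # stand-ins for 1DZ/2DZ/3DZ_DGVI
# piecewise relationship: a tree can capture the regime split, OLS cannot
biomass = np.where(X[:, 0] > 0, 2.0 + X[:, 1], 0.5) + 0.3*rng.normal(size=n)

for name, model in [("OLS", LinearRegression()),
                    ("tree", DecisionTreeRegressor(max_depth=3, random_state=0))]:
    r2 = cross_val_score(model, X, biomass, cv=5, scoring="r2").mean()
    print(f"{name}: cross-validated R^2 = {r2:.2f}")
```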

  11. On the proportional hazards model for occupational and environmental case-control analyses

    Gauvin, Héloïse; Lacourt, Aude; Leffondré, Karen

    2013-01-01

    Background: Case-control studies are generally designed to investigate the effect of exposures on the risk of a disease. Detailed information on past exposures is collected at the time of the study. However, only the cumulated value of the exposure at the index date is usually used in logistic regression. A weighted Cox (WC) model has been proposed to estimate the effects of time-dependent exposures. The weights depend on the age-conditional probabilities of developing the disease in the source popul...
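
    In practice a weighted Cox model can be fitted directly; the sketch below uses the lifelines package with an arbitrary weight column standing in for the age-conditional probability weights described in the abstract. Data are synthetic.

```python
# Fitting a weighted Cox model with lifelines; robust variance is used
# because the weights are non-integer sampling weights.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 500
exposure = rng.binomial(1, 0.4, n)
T = rng.exponential(1.0/np.exp(0.5*exposure))   # true log-HR = 0.5
C = rng.exponential(2.0, n)
df = pd.DataFrame({
    "time": np.minimum(T, C),
    "event": T <= C,
    "exposure": exposure,
    "w": rng.uniform(0.5, 2.0, n),              # hypothetical weights
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event",
        weights_col="w", robust=True)
print(cph.summary[["coef", "se(coef)"]])
```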

  12. Modeling of hazardous air pollutant removal in the pulsed corona discharge

    Derakhshesh, Marzie [Department of Chemical and Petroleum Engineering, University of Calgary, Schulich School of Engineering, 2500 University Drive, N.W., Calgary, AB, T2N 1N4 (Canada); Abedi, Jalal [Department of Chemical and Petroleum Engineering, University of Calgary, Schulich School of Engineering, 2500 University Drive, N.W., Calgary, AB, T2N 1N4 (Canada)], E-mail: jabedi@ucalgary.ca; Omidyeganeh, Mohammad [Department of Chemical and Petroleum Engineering, University of Calgary, Schulich School of Engineering, 2500 University Drive, N.W., Calgary, AB, T2N 1N4 (Canada)

    2009-03-09

    This study investigated the effects of two parts of the performance equation of the pulsed corona reactor, which is one of the non-thermal, atmospheric-pressure plasma processing tools for eliminating pollutant streams. First, the effect of axial dispersion in the diffusion term and then the effect of different reaction orders in the decomposition rate term were considered. The mathematical model was developed to predict the effluent concentration of the pulsed corona reactor using a mass balance that considers axial dispersion, linear velocity and the decomposition rate of the pollutant. The steady-state form of this equation was subsequently solved assuming different reaction orders. For the derivation of the performance equation of the reactor, it was assumed that the decomposition rate of the pollutant is directly proportional to the discharge power and the concentration of the pollutant. The results were validated and compared with another predictive model using its experimental data. The model developed in this study was also validated against two other experimental data sets in the literature for N2O.
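
    The steady-state form of such a performance equation, with axial dispersion and an n-th-order decomposition term, can be solved numerically as a boundary-value problem. The sketch below uses Danckwerts boundary conditions and illustrative parameter values, not the fitted corona-reactor constants.

```python
# Steady-state axial-dispersion performance equation
#   D c'' - u c' - k c^n = 0
# with Danckwerts boundary conditions, solved with scipy's solve_bvp.
import numpy as np
from scipy.integrate import solve_bvp

D, u, k, order, L, c_in = 1e-2, 0.1, 0.5, 1.0, 1.0, 1.0   # illustrative

def rhs(x, y):
    c, dc = y
    return np.vstack([dc, (u*dc + k*np.abs(c)**order) / D])

def bc(ya, yb):
    # Danckwerts: u*c_in = u*c(0) - D*c'(0)  and  c'(L) = 0
    return np.array([u*c_in - (u*ya[0] - D*ya[1]), yb[1]])

x = np.linspace(0.0, L, 101)
y0 = np.vstack([np.full_like(x, c_in), np.zeros_like(x)])
sol = solve_bvp(rhs, bc, x, y0)
print("outlet/inlet concentration ratio:", sol.y[0, -1] / c_in)
```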

  13. Structured additive regression modeling of age of menarche and menopause in a breast cancer screening program.

    Duarte, Elisa; de Sousa, Bruno; Cadarso-Suarez, Carmen; Rodrigues, Vitor; Kneib, Thomas

    2014-05-01

    Breast cancer risk is believed to be associated with several reproductive factors, such as early menarche and late menopause. This study is based on registry records from the first time a woman enters the screening program and presents a spatio-temporal analysis of the variables age of menarche and age of menopause, along with other reproductive and socioeconomic factors. The database was provided by the Portuguese Cancer League (LPCC), a private nonprofit organization dealing with multiple issues related to oncology, of which the Breast Cancer Screening Program is one of the main activities. The registry consists of 259,652 records of women who entered the screening program for the first time between 1990 and 2007 (45-69-year age group). Structured Additive Regression (STAR) models were used to explore spatial and temporal correlations with a wide range of covariates. These models are flexible enough to deal with a variety of complex datasets, allowing us to reveal possible relationships among the variables considered in this study. The analysis shows that early menarche occurs in younger women and in municipalities located in the interior of central Portugal. Women living in inland municipalities register later ages of menopause, and those born in central Portugal after 1933 show a decreasing trend in the age of menopause. Younger ages of menarche and late menopause are observed in municipalities with a higher purchasing power index. The analysis performed in this study portrays the time evolution of the age of menarche and age of menopause and their spatial characterization, adding to the identification of factors that could be of the utmost importance in future breast cancer incidence research. PMID:24615881

  14. A combined M5P tree and hazard-based duration model for predicting urban freeway traffic accident durations.

    Lin, Lei; Wang, Qian; Sadek, Adel W

    2016-06-01

    The duration of freeway traffic accidents is an important factor affecting traffic congestion, environmental pollution, and secondary accidents. Among previous studies, the M5P algorithm has been shown to be an effective tool for predicting incident duration. M5P builds a tree-based model, like the traditional classification and regression tree (CART) method, but with multiple linear regression models as its leaves. The problem with M5P for accident duration prediction, however, is that whereas linear regression assumes that the conditional distribution of accident durations is normal, the distribution of a time-to-event is almost certainly nonsymmetrical. A hazard-based duration model (HBDM) is a better choice for this kind of time-to-event modeling scenario, and HBDMs have accordingly been applied to analyze and predict traffic accident durations. Previous research, however, has not applied HBDMs for accident duration prediction in association with clustering or classification of the dataset to minimize data heterogeneity. The current paper proposes a novel approach to accident duration prediction, which improves on the original M5P tree algorithm through the construction of an M5P-HBDM model in which the leaves of the M5P tree are HBDMs instead of linear regression models. Such a model offers the advantage of minimizing data heterogeneity through dataset classification and avoids the incorrect assumption of normality for traffic accident durations. The proposed model was tested on two freeway accident datasets. For each dataset, the first 500 records were used to train the following three models: (1) an M5P tree; (2) an HBDM; and (3) the proposed M5P-HBDM; the remainder of the data were used for testing. The results show that the proposed M5P-HBDM managed to identify more significant and meaningful variables than either M5P or HBDMs. Moreover, the M5P-HBDM had the lowest overall mean
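
    The M5P-HBDM construction, partition first and then fit a hazard-based duration model in each leaf, can be approximated with off-the-shelf tools: a shallow sklearn tree for the partition and a lifelines Weibull fitter per leaf. Data, features, and parameters below are synthetic stand-ins.

```python
# Tree partition + per-leaf parametric duration model, in the spirit of
# M5P-HBDM (the real method uses the M5P algorithm and richer HBDMs).
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from lifelines import WeibullFitter

rng = np.random.default_rng(7)
n = 600
X = pd.DataFrame({"lanes_blocked": rng.integers(0, 3, n),
                  "peak_hour": rng.binomial(1, 0.5, n).astype(float)})
scale = np.exp(0.5*X["lanes_blocked"] + 0.3*X["peak_hour"])
dur = rng.weibull(1.5, n) * 30 * scale          # accident duration (min)

tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=80,
                             random_state=0).fit(X, dur)
leaf = tree.apply(X)                            # leaf id for each record

for lid in np.unique(leaf):
    mask = leaf == lid
    # leaf-level hazard-based model (all events observed in this toy data)
    wf = WeibullFitter().fit(dur[mask], event_observed=np.ones(mask.sum()))
    print(f"leaf {lid}: n={mask.sum()}, "
          f"median duration ≈ {wf.median_survival_time_:.1f} min")
```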

  15. Mapping hazard from urban non-point pollution: a screening model to support sustainable urban drainage planning.

    Mitchell, Gordon

    2005-01-01

    Non-point sources of pollution are difficult to identify and control, and are one of the main reasons that urban rivers fail to reach the water quality objectives set for them. Whilst sustainable drainage systems (SuDS) are available to help combat this diffuse pollution, they are mostly installed in areas of new urban development. However, SuDS must also be installed in existing built areas if diffuse loadings are to be reduced. Advice on where best to locate SuDS within existing built areas is limited, hence a semi-distributed stochastic GIS-model was developed to map small-area basin-wide loadings of 18 key stormwater pollutants. Load maps are combined with information on surface water quality objectives to permit mapping of diffuse pollution hazard to beneficial uses of receiving waters. The model thus aids SuDS planning and strategic management of urban diffuse pollution. The identification of diffuse emission 'hot spots' within a water quality objectives framework is consistent with the 'combined' (risk assessment) approach to pollution control advocated by the EU Water Framework Directive. PMID:15572076

  16. Seismic Hazard of the Uttarakhand Himalaya, India, from Deterministic Modeling of Possible Rupture Planes in the Area

    Anand Joshi

    2013-01-01

    This paper presents the use of a semi-empirical method for seismic hazard zonation. The seismotectonically important region of the Uttarakhand Himalaya is considered in this work. Ruptures along the lineaments in the area, identified from the tectonic map, are modeled deterministically using the semi-empirical approach of Midorikawa (1993). This approach makes use of an attenuation relation for peak ground acceleration to simulate strong ground motion at any site. Strong-motion data collected over a span of three years in this region have been used to develop an attenuation relation for peak ground acceleration of limited magnitude and distance applicability. The developed attenuation relation is used in the semi-empirical method to predict peak ground acceleration from the modeled rupture planes in the area. The set of peak ground acceleration values from possible ruptures at the point of investigation is then used to compute the probability of exceedance of peak ground accelerations of 100 and 200 gal. The prepared map shows that regions such as Tehri, Chamoli, Almora, Srinagar, Devprayag, Bageshwar, and Pauri fall in a zone with a 10% probability of exceedance of a peak ground acceleration of 200 gal.
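
    The basic computation, an attenuation relation applied to a set of modeled ruptures to obtain the probability of exceeding a PGA threshold at a site, can be sketched as follows. The functional form, coefficients, scatter, and rupture list are illustrative assumptions, not the relation developed in the paper.

```python
# Toy attenuation relation + Monte Carlo exceedance probability at a site.
import numpy as np

rng = np.random.default_rng(8)

def pga_gal(M, R_km):
    """Toy relation: log10(PGA) = 0.4*M - 1.3*log10(R + 10) + 1.0."""
    return 10 ** (0.4*M - 1.3*np.log10(R_km + 10.0) + 1.0)

ruptures = [(6.5, 30.0), (7.0, 55.0), (6.0, 15.0), (7.5, 90.0)]  # (M, R)
sigma_log = 0.25                     # aleatory scatter in log10 units
n_sim = 10_000

exceed_200 = 0.0
for M, R in ruptures:
    sims = pga_gal(M, R) * 10**(sigma_log*rng.normal(size=n_sim))
    exceed_200 += (sims > 200.0).mean() / len(ruptures)
print("site probability of exceeding 200 gal, given one event:", exceed_200)
```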

  17. Enhancement of colour stability of anthocyanins in model beverages by gum arabic addition.

    Chung, Cheryl; Rojanasasithara, Thananunt; Mutilangi, William; McClements, David Julian

    2016-06-15

    This study investigated the potential of gum arabic to improve the stability of anthocyanins that are used in commercial beverages as natural colourants. The degradation of purple carrot anthocyanin in model beverage systems (pH 3.0) containing L-ascorbic acid proceeded with a first-order reaction rate during storage (40 °C for 5 days in light). The addition of gum arabic (0.05-5.0%) significantly enhanced the colour stability of anthocyanin, with the most stable systems observed at intermediate levels (1.5%). A further increase in concentration (>1.5%) reduced its efficacy due to a change in the conformation of the gum arabic molecules that hindered their exposure to the anthocyanins. Fluorescence quenching measurements showed that the anthocyanin could have interacted with the glycoprotein fractions of the gum arabic through hydrogen bonding, resulting in enhanced stability. Overall, this study provides valuable information about enhancing the stability of anthocyanins in beverage systems using natural ingredients. PMID:26868542
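
    The reported first-order degradation kinetics imply that the rate constant can be recovered from a log-linear fit of colour retention against time; a small sketch with hypothetical data:

```python
# First-order degradation fit, C(t) = C0 * exp(-k*t), via log-linear
# regression on synthetic colour-retention data.
import numpy as np

t_days = np.array([0, 1, 2, 3, 4, 5], dtype=float)
C = np.array([100.0, 82.0, 66.0, 55.0, 44.0, 37.0])   # % colour remaining

k, neg_lnC0 = np.polyfit(t_days, -np.log(C), 1)       # slope = rate constant
half_life = np.log(2)/k
print(f"k = {k:.3f} per day, half-life = {half_life:.1f} days")
```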

  18. A random field model for the estimation of seismic hazard. Final report for the period 1 January 1990 - 31 December 1990

    The general theory of stationary random functions is utilized to assess the seismic hazard associated with a linearly extending seismic source. The past earthquake occurrence data associated with a portion of the North Anatolian fault are used to demonstrate the implementation of the proposed model. 18 refs, figs and tabs

  19. Vulnerability of the Dover Strait to coseismic tsunami hazards: insights from numerical modelling

    Roger, J.; Gunnell, Y.

    2012-02-01

    On 6 April 1580, a large earthquake shook the eastern English Channel and its shores, with numerous casualties and significant destruction documented. Some reports suggest that it was followed by a tsunami. Meanwhile, earthquake magnitudes of MW = 7 have been deemed possible on intraplate fault systems in neighbouring Benelux. This study aims to determine the possibility of an earthquake of magnitude MW > 5.5 generating a tsunami in the Dover Strait, one of the world's busiest seaways. In a series of numerical models focusing on sensitivity analysis, earthquake source parameters for the Dover Strait are constrained by palaeoseismological evidence and historical accounts, producing maps of wave heights and analyses of frequencies based on six strategically located virtual tide gauges. Of potential concern to engineering geologists, a maximum credible scenario is also tested for MW = 6.9. For earthquakes with MW of 5.5, none of the fault models we tested produced a tsunami on neighbouring shores. However, for earthquakes with MW = 6.9, both extensional and thrusting events produced tsunami waves with open-water amplitudes of up to 1.5 m, and higher amplitudes might be expected in regions where waves are amplified by regional nearshore bathymetry. Sensitivity to parameter choice is emphasized, but a pattern of densely inhabited coastal hotspots liable to tsunami-related damage because of bathymetric forcing factors is consistently obtained.

  20. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    Zucca, J J; Walter, W R; Rodgers, A J; Richards, P; Pasyanos, M E; Myers, S C; Lay, T; Harris, D; Antoun, T

    2008-11-19

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of Earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D Earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes two specific paths by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas. Seismic monitoring agencies are tasked with detection, location, and characterization of seismic activity in near real time. In the case of nuclear explosion monitoring or seismic hazard, decisions to further investigate a suspect event or to launch disaster relief efforts may rely heavily on real-time analysis and results. Because these are weighty decisions, monitoring agencies are regularly called upon to meticulously document and justify every aspect of their monitoring system. In order to meet this level of scrutiny and maintain operational robustness requirements, only mature technologies are considered for operational monitoring systems, and operational technology necessarily lags

  1. Potential hazards to embryo implantation: A human endometrial in vitro model to identify unwanted antigestagenic actions of chemicals

    Fischer, L.; Deppert, W.R. [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany); Pfeifer, D. [Department of Hematology and Oncology, University Hospital Freiburg (Germany); Stanzel, S.; Weimer, M. [Department of Biostatistics, German Cancer Research Center, Heidelberg (Germany); Hanjalic-Beck, A.; Stein, A.; Straßer, M.; Zahradnik, H.P. [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany); Schaefer, W.R., E-mail: wolfgang.schaefer@uniklinik-freiburg.de [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany)

    2012-05-01

    Embryo implantation is a crucial step in human reproduction and depends on the timely development of a receptive endometrium. The human endometrium is unique among adult tissues due to its dynamic alterations during each menstrual cycle. It hosts the implantation process, which is governed by progesterone, whereas 17β-estradiol regulates the preceding proliferation of the endometrium. The receptors for both steroids are targets for drugs and endocrine-disrupting chemicals. Chemicals with unwanted antigestagenic actions are potentially hazardous to embryo implantation, since many pharmaceutical antiprogestins adversely affect endometrial receptivity. This risk can be addressed by human tissue-specific in vitro assays. As a working basis, we compiled data on chemicals interacting with the progesterone receptor (PR). In our experimental work, we developed a flexible in vitro model based on human endometrial Ishikawa cells. Effects of antiprogestin compounds on pre-selected target genes were characterized by sigmoidal concentration-response curves obtained by RT-qPCR. The estrogen sulfotransferase (SULT1E1) was identified as the most responsive target gene by microarray analysis. The agonistic effect of progesterone on SULT1E1 mRNA was concentration-dependently antagonized by RU486 (mifepristone) and ZK137316 and, with lower potency, by 4-nonylphenol, bisphenol A and apigenin. The negative control methyl acetoacetate showed no effect. The effects of progesterone and RU486 were confirmed on the protein level by Western blotting. We demonstrated proof of principle that our Ishikawa model is suitable for quantitatively studying the effects of antiprogestin-like chemicals on endometrial target genes in comparison with pharmaceutical reference compounds. This test is useful for hazard identification and may contribute to reducing animal studies. -- Highlights: ► We compare progesterone receptor-mediated endometrial effects of chemicals and drugs. ► 4-Nonylphenol, bisphenol A and apigenin exert weak
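
    The sigmoidal concentration-response characterization can be reproduced with a standard four-parameter logistic fit; the sketch below uses scipy and synthetic fold-induction values, not the reported RT-qPCR data.

```python
# Four-parameter logistic (4PL) fit of a concentration-response curve.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(c, bottom, top, ec50, hill):
    """Sigmoidal concentration-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50/c)**hill)

conc = np.array([1e-9, 1e-8, 1e-7, 1e-6, 1e-5])       # mol/L (hypothetical)
resp = np.array([1.05, 1.4, 3.1, 5.8, 6.1])           # fold induction (toy)

p0 = [resp.min(), resp.max(), 1e-7, 1.0]              # sensible start values
params, _ = curve_fit(four_pl, conc, resp, p0=p0, maxfev=10_000)
print("bottom, top, EC50, Hill slope:", params)
```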

  2. Mathematical modeling and experimental validation of Phaeodactylum tricornutum microalgae growth rate with glycerol addition

    Morais, Keli Cristiane Correia; Ribeiro, Robert Luis Lara; Santos, Kassiana Ribeiro dos; Mariano, Andre Bellin [Center for Research and Development of Sustainable Energy (NPDEAS), Curitiba, PR (Brazil)]; Vargas, Jose Viriato Coelho [Department of Mechanical Engineering, Federal University of Parana (UFPR), Curitiba, PR (Brazil)]

    2010-07-01

    The Brazilian National Program for Biofuel Production has been encouraging the diversification of feedstocks for biofuel production. One of the most promising alternatives is the use of microalgae biomass. The cultivation of microalgae is conducted in aquatic systems; therefore microalgae oil production does not compete for agricultural land. Microalgae have greater photosynthetic efficiency than higher plants and are efficient at fixing CO2. The challenge is to reduce production costs, which can be minimized by increasing biomass and oil productivity. Aiming to increase the production of microalgae biomass, mixotrophic cultivation with the addition of glycerol has been shown to be very promising. During the production of biodiesel from microalgae, glycerol is available as a by-product of the transesterification reaction and could be used as an organic carbon source for mixotrophic microalgae growth, resulting in increased biomass productivity. In this paper, to study the effect of glycerol under experimental conditions, a batch culture of the diatom Phaeodactylum tricornutum was performed in a 2-liter flask in a temperature- and light-intensity-controlled room. During 16 days of cultivation, the number of cells per ml was counted periodically in a Neubauer chamber. The dry biomass in the control experiment (without glycerol) was determined every two days by vacuum filtration. In the mixotrophic experiment with a glycerol concentration of 1.5 M, dry biomass was assessed similarly on the 10th and 14th days of cultivation. Through a volume element methodology, a mathematical model was written to calculate the microalgae growth rate, using an equation that describes the influence of irradiation and nutrient concentration on the growth of microalgae. A simulation time of 16 days was used in the computations, with an initial concentration of 0.1 g/L. In order to compare
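
    A growth equation of the general type described, with the specific growth rate limited jointly by nutrient concentration (Monod kinetics) and irradiance, can be sketched as a small ODE system; all parameter values here are illustrative assumptions, not the fitted Phaeodactylum tricornutum values.

```python
# Light- and nutrient-limited microalgae growth as a two-state ODE model.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_N, K_I = 1.2, 0.02, 60.0   # 1/day, g/L, umol photons/m2/s
I, Y = 150.0, 0.5                    # irradiance; biomass yield on nutrient

def rhs(t, y):
    X, N = y                         # biomass and nutrient (g/L)
    mu = mu_max * (N/(K_N + N)) * (I/(K_I + I))   # Monod x light factor
    return [mu*X, -mu*X/Y]

sol = solve_ivp(rhs, (0, 16), [0.1, 0.5], dense_output=True)
print("biomass after 16 days (g/L):", sol.y[0, -1])
```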

  3. Coastal Digital Elevation Models (DEMs) for tsunami hazard assessment on the French coasts

    Maspataud, Aurélie; Biscara, Laurie; Hébert, Hélène; Schmitt, Thierry; Créach, Ronan

    2015-04-01

    Building precise and up-to-date coastal DEMs is a prerequisite for accurate modeling and forecasting of hydrodynamic processes at the local scale. Marine flooding, originating from tsunamis, storm surges or waves, is one of them. High-resolution DEMs are being generated for multiple coast configurations (gulf, embayment, strait, estuary, harbor approaches, low-lying areas…) along the French Atlantic and Channel coasts. This work is undertaken within the framework of the TANDEM project (Tsunamis in the Atlantic and the English ChaNnel: Definition of the Effects through numerical Modeling) (2014-2017). DEM boundaries were defined considering the vicinity of French civil nuclear facilities, site-effect considerations and potential tsunamigenic sources, the latter identified from available historical observations. Seamless integrated topographic and bathymetric coastal DEMs will be used by the institutions taking part in the study to simulate expected wave heights at regional and local scales on the French coasts for a set of defined scenarios. The main tasks were (1) developing a new DEM production capacity, (2) aiming at the release of high-resolution, high-precision digital terrain models referred to vertical reference frames, which requires (3) horizontal and vertical datum conversions (all source elevation data need to be transformed to a common datum), on the basis of (4) building national and/or local conversion grids from known datum relationships. Challenges in coastal DEM development concern good practices throughout model development that can help minimize uncertainties. This is particularly true as scattered elevation data with variable density come from multiple sources (national hydrographic services, state and local government agencies, research organizations and private engineering companies) and are of many different types (paper fieldsheets to be digitized, single-beam echo sounder, multibeam sonar, airborne laser

  4. Update of the tectonic model for the Pannonian basin: a contribution to the seismic hazard reassessment of the Paks NPP (Hungary)

    Horváth, Ferenc; Tóth, Tamás; Wórum, Géza; Koroknai, Balázs; Kádi, Zoltán; Kovács, Gábor; Balázs, Attila; Visnovitz, Ferenc

    2015-04-01

    The planned construction of two new units at the site of the Paks NPP requires a comprehensive site investigation, including a complete reassessment of the seismic hazard according to Hungarian as well as international standards. Following the regulations of Specific Safety Guide no. 9 (IAEA 2010), the approved Hungarian Geological Investigation Program (HGIP) includes integrated geological-geophysical studies at different scales. The regional study aims to elaborate a new synthesis of all published data for the whole Pannonian basin. This task is nearly completed and the main outcomes have already been published (Horváth et al. 2015). The near-regional study is in progress and addresses the construction of a new tectonic model for the circular area of 50 km radius around the NPP, using a wealth of unpublished oil-company seismic and borehole data. The site vicinity study has also been started, with a core activity of 300 km² of 3D seismic data acquisition, processing and interpretation, assisted by a series of additional geophysical surveys, new drillings and geological mapping. This lecture will present a few important results of the near-regional study, which shed new light on the intricate tectonic evolution of the Mid-Hungarian Fault Zone (MHFZ), a strongly deformed belt between the Alcapa and Tisza-Dacia megatectonic units. The nuclear power plant is located at the margin of the Tisza unit, near the southern edge of the MHFZ. Reassessment of the seismic hazard at the site of the NPP requires a better understanding of the Miocene-to-Recent tectonic evolution of this region in the central part of the Pannonian basin. The Early to Middle Miocene was a period of rifting, with the formation of 1 to 3 km deep half-grabens filled with terrestrial to marine deposits and a large amount of rift-related volcanic material. The graben fill became strongly deformed as a consequence of the juxtaposition of the two megatectonic units, leading to strong compression and development of

  5. Experimental study of methane hydrate formation kinetics with or without additives and modeling based on chemical affinity

    Highlights: • Applying chemical affinity for investigating the effects of additives. • Effects of thermodynamic additives on methane hydrate formation kinetics. • Determining kinetic parameters for methane hydrate formation with additives. • A unique path for methane hydrate formation with aqueous solutions of additives. - Abstract: In this work, the methane hydrate formation process (a process for energy conversion and cool-energy storage) with or without additives was investigated. First, the effects of initial pressure, three surfactants (sodium dodecyl sulfate (SDS), dodecyltrimethyl ammonium bromide (DTAB) and Triton X-100 (TX-100)) and two thermodynamic additives (tetrahydrofuran (THF) and tetrabutyl ammonium bromide (TBAB)) on methane hydrate formation kinetics were experimentally studied. Then, macroscopic modeling of methane hydrate formation kinetics with and without additives, based on chemical affinity, was performed. The kinetic parameters of the chemical affinity model were determined for methane hydrate formation with and without additives. The effects of initial pressure and additives on the chemical affinity model parameters were also investigated. In addition, the results of the model were in good agreement with the experimental data

  6. Improvement of ash plume monitoring, modeling and hazard assessment in the MED-SUV project

    Coltelli, Mauro; Andronico, Daniele; Boselli, Antonella; Corradini, Stefano; Costa, Antonio; Donnadieu, Franck; Leto, Giuseppe; Macedonio, Giovanni; Merucci, Luca; Neri, Augusto; Pecora, Emilio; Prestifilippo, Michele; Scarlato, Piergiorgio; Scollo, Simona; Spinelli, Nicola; Spata, Gaetano; Taddeucci, Jacopo; Wang, Xuan; Zanmar Sanchez, Ricardo

    2014-05-01

    Volcanic ash clouds produced by explosive eruptions pose a serious problem for civil aviation, road transportation and other human activities, and Etna has produced more than 200 small- and medium-size explosive eruptions in the last 35 years. The INGV, responsible for monitoring the volcano, has therefore developed since 2006 a specific system for forecasting and monitoring Etna's volcanic ash plumes, in collaboration with several national and international institutions. Between 12 January 2011 and 31 December 2013, Etna produced forty-six basaltic lava fountains. Every paroxysm produced an eruption column ranging from a few up to eleven kilometers in height above sea level. The ash clouds contaminated the controlled airspace (CTR) of the Catania and Reggio Calabria airports and caused tephra fallout on eastern Sicily, sometimes disrupting the operations of these airports. In order to give prompt and detailed warnings to the Aviation and Civil Protection authorities, ash plume monitoring at Osservatorio Etneo, the INGV department in Catania, is carried out using multispectral (visible to infrared) satellite and ground-based video-surveillance images; seismic and infrasound signals processed in real time; a Doppler RADAR (Voldorad IIB) able to detect the eruption column in all weather conditions; and a LIDAR (AMPLE) for retrieving backscattering and depolarization values of the ash clouds. Forecasting is performed by running tephra dispersal models with weather forecast data and plotting the results on maps published on a dedicated website. Thus 24/7 Control Room operators were able to promptly inform Aviation and Civil Protection operators for effective aviation safety management. A variety of multidisciplinary activities are planned in the MED-SUV project with reference to volcanic ash observations and studies. These include: 1) physical and analogue laboratory experiments on ash dispersal and aggregation; 2) integration of satellite data (e.g. METEOSAT, MODIS) and ground

  7. The 2007 Bengkulu earthquake, its rupture model and implications for seismic hazard

    A Ambikapathy; J K Catherine; V K Gahalaut; M Narsaiah; A Bansal; P Mahesh

    2010-08-01

    The 12 September 2007 great Bengkulu earthquake (Mw 8.4) occurred on the west coast of Sumatra, about 130 km SW of Bengkulu. The earthquake was followed by two strong aftershocks of Mw 7.9 and 7.0. We estimate coseismic offsets due to the mainshock, derived from near-field Global Positioning System (GPS) measurements from nine continuous SuGAr sites operated by the California Institute of Technology (Caltech) group. Using a forward modelling approach, we estimated the slip distribution on the causative rupture of the 2007 Bengkulu earthquake and found two patches of large slip, one located north of the mainshock epicenter and the other under the Pagai Islands. Both patches of large slip on the rupture occurred under the island belt and shallow water. Thus, despite its great magnitude, this earthquake did not generate a major tsunami. Further, we suggest that the occurrence of great earthquakes in the subduction zone on either side of the Siberut Island region might have increased static stress in that region, where the last great earthquake occurred in 1797 and where there is evidence of strain accumulation.

  8. GeoClaw-STRICHE: A coupled model for Sediment TRansport In Coastal Hazard Events

    Tang, Hui

    2016-01-01

    GeoClaw-STRICHE is designed for simulating the physical impacts of tsunamis as they relate to erosion, transport and deposition. GeoClaw-STRICHE comprises three components: (1) nonlinear shallow water equations; (2) an advection-diffusion equation; (3) an equation for morphology updating. Multiple grain sizes and sediment layers are included in GeoClaw-STRICHE to simulate grain-size distribution and add the capability to develop grain-size trends from the bottom to the top of a simulated deposit as well as along the inundation. Unlike previous models based on empirical equations or the sediment concentration gradient, the standard van Leer method is applied to calculate sediment flux. We tested and verified GeoClaw-STRICHE against the flume experiment of Johnson et al. (2016) and against published data from the 2004 Indian Ocean tsunami at Kuala Meurisi. The comparison with experimental data shows GeoClaw-STRICHE's capability to simulate sediment thickness and grain-size distribution in experimenta...
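
    The van Leer flux construction named above can be illustrated on scalar 1D advection, the simplest setting for a limited second-order upwind flux; this is a generic MUSCL-type sketch, not code from GeoClaw-STRICHE.

```python
# 1D finite-volume advection update with the van Leer flux limiter
# (illustrative scalar "sediment concentration" transport, u > 0).
import numpy as np

def van_leer(r):
    """van Leer limiter: smooth and TVD, between minmod and superbee."""
    return (r + np.abs(r)) / (1.0 + np.abs(r))

nx, dx, u, dt = 200, 1.0, 1.0, 0.5                # CFL = u*dt/dx = 0.5
c = np.where(np.arange(nx) < 50, 1.0, 0.0)        # initial step profile

for _ in range(100):
    dcm = np.diff(c, prepend=c[0])                # backward differences
    dcp = np.diff(c, append=c[-1])                # forward differences
    r = np.divide(dcm, dcp, out=np.ones_like(c), where=np.abs(dcp) > 1e-12)
    nu = u*dt/dx
    # limited second-order upwind flux at the right face of each cell:
    flux = u*c + 0.5*u*(1.0 - nu)*van_leer(r)*dcp
    c -= dt/dx * np.diff(flux, prepend=flux[0])   # conservative update

print("front position ≈ cell", int(np.argmin(np.abs(c - 0.5))))
```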

  9. Efficient Semiparametric Marginal Estimation for the Partially Linear Additive Model for Longitudinal/Clustered Data

    Carroll, Raymond

    2009-04-23

    We consider the efficient estimation of a regression parameter in a partially linear additive nonparametric regression model from repeated measures data when the covariates are multivariate. To date, while there is some literature in the scalar covariate case, the problem has not been addressed in the multivariate additive model case. Ours represents a first contribution in this direction. As part of this work, we first describe the behavior of nonparametric estimators for additive models with repeated measures when the underlying model is not additive. These results are critical when one considers variants of the basic additive model. We apply them to the partially linear additive repeated-measures model, deriving an explicit consistent estimator of the parametric component; if the errors are in addition Gaussian, the estimator is semiparametric efficient. We also apply our basic methods to a unique testing problem that arises in genetic epidemiology; in combination with a projection argument we develop an efficient and easily computed testing scheme. Simulations and an empirical example from nutritional epidemiology illustrate our methods.
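
    To make the model concrete, here is a minimal Python sketch of a working-independence backfitting fit for a scalar-covariate partially linear model y = x*beta + f(z) + error; it ignores the within-subject correlation and the efficiency refinements that are the paper's actual contribution, and all data are simulated.

    import numpy as np

    def ksmooth(z, y, h):
        # Nadaraya-Watson smoother with a Gaussian kernel of bandwidth h
        w = np.exp(-0.5 * ((z[:, None] - z[None, :]) / h) ** 2)
        return (w @ y) / w.sum(axis=1)

    def partially_linear_fit(x, z, y, h=0.08, iters=50):
        """Backfitting for y = x*beta + f(z) + noise under working
        independence (no repeated-measures correlation structure)."""
        beta, f = 0.0, np.zeros_like(y)
        for _ in range(iters):
            beta = np.sum(x * (y - f)) / np.sum(x * x)  # OLS given current f
            f = ksmooth(z, y - beta * x, h)             # smooth partial residuals
            f -= f.mean()                               # centre f for identifiability
        return beta, f

    rng = np.random.default_rng(0)
    n = 300
    x = rng.normal(size=n)
    z = rng.uniform(0.0, 1.0, size=n)
    y = 1.5 * x + np.sin(2.0 * np.pi * z) + 0.3 * rng.normal(size=n)
    beta_hat, f_hat = partially_linear_fit(x, z, y)
    print(beta_hat)   # close to the true slope 1.5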

  10. Drivers' hazard perception modeling and experimental analysis

    杨京帅; 王文亮; 苏薇; 杨得婷; 孙正一

    2014-01-01

    To quantify differences in drivers' hazard perception and identify their causes, drivers' hazard perception was modeled and tested experimentally. The hazard perception model was built by analogy with the natural phenomenon of biological populations surviving in polluted environments. Hazard detection time and hazard reaction time were measured in different traffic situations. The model analysis and experimental results show that a driver's hazard perception threshold is negatively correlated with the input rate of hazards in the traffic situation, and positively correlated with the driver's correct-reaction ratio and hazard detection ratio. Traffic situations of different risk levels have a significant effect on drivers' overall reaction time and hazard detection time (p<0.001). Driving experience has no significant effect on hazard detection time (p=0.080) but a significant effect on hazard reaction time (p=0.003). Experienced drivers show a higher level of hazard perception than inexperienced drivers; the difference lies mainly in that experienced drivers can more quickly and accurately anticipate and evaluate the hazards in a traffic situation and make sound judgments.

  12. Leaching of hazardous substances from a composite construction product – An experimental and modelling approach for fibre-cement sheets

    Lupsea, Maria [University of Toulouse, INSA, UPS, INP, LISBP, 135 Avenue de Rangueil, F-31077 Toulouse (France); INRA, UMR 792, F-31400 Toulouse (France); CNRS, UMR 5504, F-31400 Toulouse (France); Paris-Est University, CSTB-Scientific and Technical Centre for the Building Industry, DEE/Environment and Life Cycle Engineering Team, 24 rue Joseph Fourier, F-38400 Saint Martin d'Hères (France)]; Tiruta-Barna, Ligia, E-mail: ligia.barna@insa-toulouse.fr [University of Toulouse, INSA, UPS, INP, LISBP, 135 Avenue de Rangueil, F-31077 Toulouse (France); INRA, UMR 792, F-31400 Toulouse (France); CNRS, UMR 5504, F-31400 Toulouse (France)]; Schiopu, Nicoleta [Paris-Est University, CSTB-Scientific and Technical Centre for the Building Industry, DEE/Environment and Life Cycle Engineering Team, 24 rue Joseph Fourier, F-38400 Saint Martin d'Hères (France)]

    2014-01-15

    Highlights: • Biocide and heavy metals leaching from a fibre-cement sheet was investigated. • Equilibrium and dynamic leaching tests were used as modelling support. • The chemical-transport model identifies the main fixation/solubilisation mechanisms. • Biocides such as terbutryn and boron were released by the commercial product. • FCS exhibits cement-like leaching behaviour with high organic carbon release. -- Abstract: The leaching behaviour of a commercial fibre-cement sheet (FCS) product has been investigated. A static pH-dependency test and a dynamic surface leaching test were performed at lab scale. These tests allowed the development of a chemical-transport model capable of predicting the release of major and trace elements over the entire pH range as a function of time. FCS exhibits cement-type leaching behaviour with respect to the mineral species. Potentially hazardous species are released in significant quantities when compared to their total content. These are mainly heavy metals commonly encountered in cement matrixes, and boron (probably added as a biocide). Organic compounds, measured as global dissolved carbon, are released in significant concentrations, probably originating from the partial degradation of the organic fibres. The pesticide terbutryn (probably added during the preservative treatment of the organic fibres) was systematically identified in the leachates. The simulation of an upscaled runoff scenario allowed the evaluation of the cumulative release over long periods and its distribution in time as a function of the local exposure conditions. After 10 years of exposure the release reaches significant fractions of the species' total content, going from 4% for Cu to near 100% for B.
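
    The paper's chemical-transport model is far richer, but a common first approximation for dynamic surface leaching is Fickian, diffusion-controlled release, M(t) = 2*C0*sqrt(De*t/pi) per unit area. The Python sketch below uses hypothetical boron parameters chosen only to illustrate how a species can approach full depletion within about a decade.

    import numpy as np

    YEAR = 365.25 * 24 * 3600.0   # seconds per year

    def cumulative_release(t, c0, de):
        """Diffusion-controlled cumulative release per unit surface (mg/m2)
        from a monolith: M(t) = 2*c0*sqrt(de*t/pi)."""
        return 2.0 * c0 * np.sqrt(de * t / np.pi)

    # Hypothetical values, for illustration only
    c0 = 2.0e4         # leachable boron, mg per m3 of product
    de = 1.0e-13       # effective diffusion coefficient, m2/s
    thickness = 0.006  # sheet thickness, m (release from one face)
    available = c0 * thickness   # mg/m2 that can ever leach out

    for years in (1, 5, 10):
        m = cumulative_release(years * YEAR, c0, de)
        # cap the semi-infinite model at depletion of the available stock
        frac = min(m, available) / available
        print(f"{years:>2} yr: ~{frac:.0%} of available B released")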

  13. Precarious Rock Methodology for Seismic Hazard: Physical Testing, Numerical Modeling and Coherence Studies

    This report covers the following projects: shake table tests of the precarious rock methodology, field tests of precarious rocks at Yucca Mountain and comparison of the results with PSHA predictions, a study of the coherence of the wave field in the ESF, and a limited survey of precarious rocks south of the proposed repository footprint. A series of shake table experiments has been carried out at the University of Nevada, Reno Large Scale Structures Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1g up to the point where the objects on the shake table overturned a specified number of times. The results of these experiments have been compared with numerical overturning predictions. Numerical predictions for toppling of large objects with simple contact conditions (e.g., I-beams with sharp basal edges) agree well with the shake-table results. The numerical model slightly underpredicts the overturning of small rectangular blocks and overpredicts the overturning PGA for asymmetric granite boulders with complex basal contact conditions. In general the results confirm the approximate predictions of previous studies. Field testing of several rocks at Yucca Mountain has approximately confirmed the preliminary results from previous studies, suggesting that the PSHA predictions are too high, possibly because of uncertainty in the mean of the attenuation relations. Study of the coherence of wavefields in the ESF has provided results which will be very important in the design of the canister distribution, in particular a preliminary estimate of the wavelengths at which the wavefields become incoherent. No evidence was found for extreme focusing by lens-like inhomogeneities. A limited survey for precarious rocks confirmed that they extend south of the repository, and one of these has been field tested.
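
    For intuition about such overturning predictions, the quasi-static rocking criterion for a rigid block is a one-line formula; the Python sketch below applies it to a hypothetical block and should be read as a lower-bound estimate, not as the study's numerical model.

    def quasi_static_toppling_pga(b, h):
        """Quasi-static horizontal acceleration (as a fraction of g) that
        initiates rocking of a rigid block: a/g = b/h = tan(alpha), where
        b is the horizontal and h the vertical distance from the basal
        rocking edge to the centre of mass. Actual overturning requires
        sustained or repeated pulses, so this is a lower-bound estimate."""
        return b / h

    # Example: a 1.0 m wide, 3.0 m tall rectangular block (b=0.5, h=1.5)
    print(f"rocking initiates near {quasi_static_toppling_pga(0.5, 1.5):.2f} g")
    # -> 0.33 g; dynamic overturning typically needs a somewhat higher PGA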

  14. Additive factors do not imply discrete processing stages: a worked example using models of the Stroop task

    Tom Stafford

    2011-11-01

    Previously, it has been shown that the psychophysical law known as Piéron's Law holds for colour intensity and that the size of the effect is additive with that of Stroop condition (Stafford, Gurney & Ingram, in press). According to the additive factors method (Donders, 1868-9/1969; Sternberg, 1998), additivity is assumed to indicate independent and discrete processing stages. We present computational modelling work that demonstrates that these results can be successfully accounted for by existing single-stage models of the Stroop effect. Consequently, it is not valid to infer either discrete stages or separate loci of effects from additive factors. Further, our modelling work suggests that information binding may be a more important architectural property than discrete stages for producing additive factors.
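
    Piéron's Law states that mean response time falls as a power function of stimulus intensity, RT = t0 + k*I^(-beta). As a hedged illustration (synthetic data, made-up parameter values, not the authors' experiment), it can be fit directly:

    import numpy as np
    from scipy.optimize import curve_fit

    def pieron(I, t0, k, beta):
        # Pieron's law: mean response time as a power function of intensity
        return t0 + k * I ** (-beta)

    # Synthetic RTs (seconds) at several colour-intensity levels
    I = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
    rt = pieron(I, 0.35, 0.08, 0.5) + np.random.default_rng(1).normal(0, 0.005, I.size)

    (t0, k, beta), _ = curve_fit(pieron, I, rt, p0=(0.3, 0.1, 0.5))
    print(f"t0={t0:.3f}s  k={k:.3f}  beta={beta:.2f}")
    # Additivity with Stroop condition would appear as a shift in t0
    # across conditions with k and beta unchanged.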

  15. Earthquake Hazard and Risk Assessment for Turkey

    Betul Demircioglu, Mine; Sesetyan, Karin; Erdik, Mustafa

    2010-05-01

    Presented in a GIS environment, seismic risk analysis is a helpful tool for supporting decision making when planning and prioritizing seismic retrofit intervention programs at large scale. The main ingredients of seismic risk analysis are seismic hazard, a regional inventory of buildings, and vulnerability analysis. In this study, we first assess the national earthquake hazard based on the NGA ground motion prediction models and compare the results with those of previous models. We then evaluate seismic risk based on probabilistic intensity ground motion predictions for Turkey. Following the macroseismic approach of Giovinazzi and Lagomarsino (2005), two alternative vulnerability models have been used to estimate building damage. The vulnerability and ductility indices for Turkey have been taken from the study of Giovinazzi (2005). These two vulnerability models have been compared with the observed earthquake damage database, with good agreement between the curves. In addition to building damage, casualty estimates based on three different methods, for each return period and each vulnerability model, are presented to evaluate earthquake losses. Using three different models of building replacement costs, the average annual loss (AAL) and probable maximum loss ratio (PMLR) due to regional earthquake hazard are provided to form a basis for the improvement of the parametric insurance model and the determination of premium rates for the compulsory earthquake insurance in Turkey.
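
    As an illustration of the loss metrics mentioned here, AAL can be computed as the area under the annual loss exceedance-frequency curve; the Python sketch below uses entirely hypothetical portfolio numbers.

    import numpy as np

    def average_annual_loss(loss_levels, annual_exceed_rate):
        """AAL as the area under the loss exceedance-frequency curve:
        the integral over loss x of the annual rate of exceeding x."""
        return np.trapz(annual_exceed_rate, loss_levels)

    # Hypothetical exceedance-frequency points for a portfolio (loss in M$)
    loss = np.array([0.0, 10.0, 50.0, 100.0, 500.0, 1000.0])
    rate = np.array([0.20, 0.05, 0.01, 0.004, 0.0005, 0.0001])

    print(f"AAL = {average_annual_loss(loss, rate):.2f} M$ per year")
    # The probable maximum loss ratio (PMLR) instead reads the curve at a
    # fixed rate, e.g. the loss at a 1/475-per-year exceedance frequency.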

  16. Modeling of sulfation of potassium chloride by ferric sulfate addition during grate-firing of biomass

    Wu, Hao; Jespersen, Jacob Boll; Aho, Martti;

    2013-01-01

    Potassium chloride, KCl, formed from critical ash-forming elements released during combustion may lead to severe ash deposition and corrosion problems in biomass-fired boilers. Ferric sulfate, Fe2(SO4)3, is an effective additive which produces sulfur oxides (SO2 and SO3) that convert KCl to the less... ...order to simulate the sulfation of KCl by ferric sulfate addition during grate-firing of biomass. The simulation results show good agreement with the experimental data obtained in a pilot-scale biomass grate-firing reactor, where different amounts of ferric sulfate were injected on the grate or into the... ...freeboard. In addition, the simulations of elemental sulfur addition on the grate fit well with the experimental data. The results suggest that the SO3 released from ferric sulfate decomposition is the main contributor to KCl sulfation, and that the effectiveness of the ferric sulfate addition is sensitive...
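
    The underlying stoichiometry (2 KCl + SO3 + H2O -> K2SO4 + 2 HCl, with up to three moles of SO3 per mole of Fe2(SO4)3) suggests a simple dosing estimate; the sketch below is illustrative arithmetic only, with an assumed SO3 utilisation fraction, not the chemical kinetic model used in the paper.

    M_KCL = 74.55        # g/mol
    M_FE2SO43 = 399.88   # g/mol

    def ferric_sulfate_demand(m_kcl_g, so3_utilisation=0.5):
        """Grams of Fe2(SO4)3 needed to sulfate m_kcl_g grams of KCl,
        assuming only a fraction of the released SO3 reacts with KCl
        (the utilisation value here is an illustrative assumption)."""
        n_kcl = m_kcl_g / M_KCL
        n_so3_needed = 0.5 * n_kcl                  # 1 SO3 per 2 KCl
        n_fs = n_so3_needed / (3.0 * so3_utilisation)
        return n_fs * M_FE2SO43

    print(f"{ferric_sulfate_demand(100.0):.0f} g Fe2(SO4)3 per 100 g KCl")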

  17. Comment on "Polynomial cointegration tests of anthropogenic impact on global warming" by Beenstock et al. (2012) – some hazards in econometric modelling of climate change

    F. Pretis; Hendry, D. F.

    2013-01-01

    We outline six important hazards that can be encountered in econometric modelling of time-series data, and apply that analysis to demonstrate errors in the empirical modelling of climate data in Beenstock et al. (2012). We show that the claim made in Beenstock et al. (2012) as to the different degrees of integrability of CO2 and temperature is incorrect. In particular, the level of integration is not constant and not intrinsic to the process. Further, we illustrate that the ...
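
    One of the hazards at issue, that an inferred order of integration is specification- and sample-dependent, can be illustrated with a standard augmented Dickey-Fuller test on simulated data (this Python sketch is not the authors' analysis):

    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(42)
    n = 150
    # A random walk with drift: classified as I(1) in levels
    x = np.cumsum(0.02 + rng.normal(size=n))

    for name, series in [("levels", x), ("first differences", np.diff(x))]:
        stat, pvalue, *_ = adfuller(series)
        print(f"ADF on {name}: stat={stat:.2f}, p={pvalue:.3f}")
    # The apparent integration order can flip with the deterministic terms
    # included (the 'regression' option), the lag length and the sample
    # window, which is one reason such classifications are not intrinsic
    # to the process.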

  18. Hazardous Air Pollutants

    Hazardous air pollutants are those known or suspected to cause cancer or other serious health effects; standards for them are set to protect against adverse health and environmental effects.

  19. Improving the all-hazards homeland security enterprise through the use of an emergency management intelligence model

    Schulz, William N.

    2013-01-01

    As the all-hazards approach takes hold in our national Emergency Management and Homeland Security efforts and continues to seek greater collaboration between these two fields, an area that has yet to be explored to its fullest extent is the utilization of an intelligence process to enhance EM operations. Despite the existence of multiple Federal-level policies that outline the importance of intelligence and information sharing across the all-hazards community, EM is still ...

  20. A review of successful aging models: proposing proactive coping as an important additional strategy.

    Ouwehand, C.; Ridder, D.T.D. de; Bensing, J.

    2007-01-01

    Successful aging is an important concept, and one that has been the subject of much research. During the last 15 years, the emphasis of this research has shifted from formulating criteria for successful aging to describing the processes involved in successful aging. The main purpose of the present article is to review psychological models of successful aging. The model of Selective Optimization with Compensation (SOC-model) proves to be one of the leading models in this field. Although eviden...

  1. Sustainable manufacturing: evaluation and modeling of environmental impacts in additive manufacturing

    Le Bourhis, Florent; Kerbrat, Olivier; Hascoët, Jean-Yves; Mognol, Pascal

    2013-01-01

    Cleaner production and sustainability are of crucial importance in the field of manufacturing processes, where great amounts of energy and materials are consumed. Nowadays, additive manufacturing technologies such as direct additive laser manufacturing allow us to manufacture functional products with high added value. As environmental considerations become an important issue in our society and environmental legislation becomes more prominent (N...

  2. Utilization of sulfate additives in biomass combustion: fundamental and modeling aspects

    Wu, Hao; Jespersen, Jacob Boll; Grell, Morten Nedergaard; Aho, Martti; Jappe Frandsen, Flemming; Glarborg, Peter

    2013-01-01

    Sulfates, such as ammonium sulfate, aluminum sulfate and ferric sulfate, are effective additives for converting the alkali chlorides released from biomass combustion to the less harmful alkali sulfates. Optimization of the use of these additives requires knowledge of their decomposition rates and product distributions under high-temperature conditions. In the present work, the decomposition of ammonium sulfate, aluminum sulfate and ferric sulfate was studied in a fast-heating rate t...

  3. Can an energy balance model provide additional constraints on how to close the energy imbalance?

    Wohlfahrt, Georg; Widmoser, Peter

    2013-01-01

    Elucidating the causes for the energy imbalance, i.e. the phenomenon that eddy covariance latent and sensible heat fluxes fall short of available energy, is an outstanding problem in micrometeorology. This paper tests the hypothesis that the full energy balance, through incorporation of additional independent measurements which determine the driving forces of and resistances to energy transfer, provides further insights into the causes of the energy imbalance and additional constraints on ene...
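
    The imbalance itself is usually summarised by the closure ratio (H + LE)/(Rn - G); a minimal Python sketch with hypothetical half-hourly fluxes:

    import numpy as np

    def closure_ratio(H, LE, Rn, G):
        """Energy balance closure ratio (H + LE) / (Rn - G); values below 1
        indicate the imbalance discussed above."""
        H, LE, Rn, G = map(np.asarray, (H, LE, Rn, G))
        return (H + LE).sum() / (Rn - G).sum()

    # Hypothetical half-hourly fluxes in W/m2
    H  = np.array([120.0, 150.0,  90.0])   # sensible heat
    LE = np.array([200.0, 260.0, 180.0])   # latent heat
    Rn = np.array([420.0, 520.0, 360.0])   # net radiation
    G  = np.array([ 40.0,  60.0,  30.0])   # ground heat flux

    print(f"closure = {closure_ratio(H, LE, Rn, G):.2f}")  # ~0.85 here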

  4. Tracking hazardous air pollutants from a refinery fire by applying on-line and off-line air monitoring and back trajectory modeling

    Highlights: • An industrial fire can emit hazardous air pollutants into the surrounding areas. • Both on- and off-line monitoring are needed to study air pollution from fires. • Back trajectory and dispersion modeling can trace emission sources of fire-related pollution. -- Abstract: The air monitors used by most regulatory authorities are designed to track the daily emissions of conventional pollutants and are not well suited for measuring hazardous air pollutants that are released from accidents such as refinery fires. By applying a wide variety of air-monitoring systems, including on-line Fourier transform infrared spectroscopy, gas chromatography with a flame ionization detector, and off-line gas chromatography–mass spectrometry, during and after a fire at a petrochemical complex in central Taiwan on May 12, 2011, we were able to detect significantly higher levels of combustion-related gaseous and particulate pollutants, refinery-related hydrocarbons, and chlorinated hydrocarbons, such as 1,2-dichloroethane, vinyl chloride monomer, and dichloromethane, inside the complex and 10 km downwind from the fire than those measured during normal operation periods. Back trajectories and dispersion models further confirmed that the high levels of hazardous air pollutants in the neighboring communities were carried by air masses flowing from the 22 plants that were shut down by the fire. This study demonstrates that hazardous air pollutants from industrial accidents can be identified and traced back to their emission sources by applying a timely and comprehensive air-monitoring campaign together with back-trajectory air flow models
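
    A back trajectory in its simplest kinematic form just integrates a parcel backwards through a wind field; the toy Python sketch below uses an idealised constant wind and stands in for the full 3-D gridded-meteorology analysis (e.g. HYSPLIT-style) used in studies like this one.

    def back_trajectory(x0, y0, wind, hours, dt=0.1):
        """Trace an air parcel backwards in time from receptor (x0, y0) by
        integrating dx/dt = -u, dy/dt = -v. `wind(x, y, t)` must return
        (u, v) in km/h; positions are in km, times in hours."""
        path = [(hours, x0, y0)]
        x, y = x0, y0
        for k in range(int(round(hours / dt))):
            t = hours - k * dt
            u, v = wind(x, y, t)
            x -= u * dt          # step backwards along the wind
            y -= v * dt
            path.append((t - dt, x, y))
        return path

    # Idealised south-westerly flow of ~20 km/h as a stand-in wind field
    wind = lambda x, y, t: (14.0, 14.0)
    path = back_trajectory(0.0, 0.0, wind, hours=2.0)
    print(path[-1])  # parcel position ~2 h upwind of the receptor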

  5. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Hu, Qinhong

    2016-07-31

    The additivity model assumes that field-scale reaction properties in a sediment, including surface area, reactive site concentration, and reaction rate, can be predicted from the field-scale grain-size distribution by linearly adding reaction properties estimated in the laboratory for individual grain-size fractions. This study evaluated the additivity model in scaling mass transfer-limited, multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of the rate constants for individual grain-size fractions, which were then used to predict rate-limited U(VI) desorption in the composite sediment. The results indicated that the additivity model with respect to the rate of U(VI) desorption provided a good prediction of U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, so an approximate additivity model for directly scaling rate constants was proposed and evaluated; it reproduced the experimental results within statistical uncertainty. This study also found that a gravel-size fraction (2 to 8 mm), which is often ignored in modeling U(VI) sorption and desorption, contributes significantly to U(VI) desorption in the sediment.
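
    The additivity model itself is a mass-fraction-weighted sum; a minimal Python sketch with hypothetical grain-size fractions and site concentrations:

    import numpy as np

    def composite_property(mass_fractions, fraction_properties):
        """Additivity model: a field-scale reactive property (surface area,
        site concentration, rate) predicted as the mass-fraction-weighted
        sum of values measured on individual grain-size fractions."""
        f = np.asarray(mass_fractions, dtype=float)
        p = np.asarray(fraction_properties, dtype=float)
        assert np.isclose(f.sum(), 1.0), "mass fractions must sum to 1"
        return float(f @ p)

    # Hypothetical reactive site concentrations (umol/g) for three
    # grain-size fractions, including the often-ignored gravel class
    fractions = [0.30, 0.50, 0.20]      # <0.5 mm, 0.5-2 mm, 2-8 mm
    sites     = [1.80, 0.60, 0.15]
    print(f"composite: {composite_property(fractions, sites):.2f} umol/g")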

  6. Earthquake Hazard and Risk in Alaska

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model updates several key source parameters, including extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rates. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This increases the number of crustal faults from ten in the 2007 model to 91 in the 2015 model, including the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth; in this update, we model the intraslab as a series of deep stepping events and use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent magnitude 7 to 8 events (Gutenberg-Richter distribution), while large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as much to the overall risk. We review these recurrence rates and present the results and their impact for Anchorage. We compare our hazard update to the 2007 USGS hazard map, and discuss the changes and their drivers. Finally, we examine the impact the model changes have on Alaska earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the
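
    For the recurrence question raised here, the Gutenberg-Richter relation gives annual exceedance rates directly; the Python sketch below uses hypothetical a and b values, not the RMS model's parameters.

    def gr_annual_rate(m, a, b):
        """Gutenberg-Richter cumulative rate: N(>=m) = 10**(a - b*m)
        events per year (a and b below are illustrative only)."""
        return 10.0 ** (a - b * m)

    a, b = 4.5, 1.0   # hypothetical regional catalog parameters
    for m in (7.0, 8.0, 9.0):
        rate = gr_annual_rate(m, a, b)
        print(f"M>={m}: {rate:.5f}/yr, return period ~{1.0/rate:,.0f} yr")
    # Whether M8+ events follow this extrapolation or a separate
    # characteristic model strongly changes their contribution to risk.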

  7. Earthquake Hazard and Risk in New Zealand

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

    To quantify risk in New Zealand we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), updates several key source parameters. These updates include: implementation of a new set of crustal faults, including multi-segment ruptures; updated subduction zone geometry and recurrence rates; and new background rates with a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 relative to the 2002 model; the 2012 model now includes over 500 individual fault sources, with the addition of many offshore faults in the northern, east-central, and southwestern regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard in the updated model with that in the previous version, and discuss the changes as well as their drivers. We then examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city with the majority of the country's risk exposure (Auckland) lies in the region of lowest hazard, where little is known about the location of faults and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. Thus small changes to the background rates

  8. Seismic hazard: UK continental shelf

    NONE

    2002-07-01

    An evaluation of seismic hazard for offshore UK waters has been completed by a joint UK/Norwegian team. For the first time, consistency of hazard mapping has been achieved for the northern North Sea, and peak acceleration hazard contour maps have been drawn for return periods of 100, 200, 475, 1000 and 10,000 years. The report describes (a) the spatial pattern of seismicity; (b) the seismic hazard source model; (c) seismic ground motion and (d) the seismic hazard computation. Diagrams show (1) a map of the structural framework of the UK and Norwegian North Sea; (2) epicentres; (3) zonation models; (4) peak ground acceleration contours and (5) generic offshore spectral shapes.
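
    Return-period maps translate into design-life exceedance probabilities through the usual Poisson assumption, P = 1 - exp(-T/RP); a short Python sketch for the return periods listed above:

    import math

    def exceedance_probability(return_period_yr, exposure_yr):
        """Poisson probability of at least one exceedance of the
        return-period ground motion during an exposure time."""
        return 1.0 - math.exp(-exposure_yr / return_period_yr)

    for rp in (100, 200, 475, 1000, 10000):
        p = exceedance_probability(rp, 50.0)
        print(f"RP {rp:>6} yr -> {p:5.1%} chance in a 50-yr design life")
    # The familiar 475-yr map corresponds to ~10% in 50 years.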

  9. Inclusion of Additional Plant Species and Trait Information in Dynamic Vegetation Modeling of Arctic Tundra and Boreal Forest Ecosystem

    Euskirchen, E. S.; Patil, V.; Roach, J.; Griffith, B.; McGuire, A. D.

    2015-12-01

    Dynamic vegetation models (DVMs) have been developed to model the ecophysiological characteristics of plant functional types in terrestrial ecosystems. They have frequently been used to answer questions pertaining to processes such as disturbance, plant succession, and community composition under historical and future climate scenarios. While DVMs have proved useful in these types of applications, it has often been questioned if additional detail, such as including plant dynamics at the species-level and/or including species-specific traits would make these models more accurate and/or broadly applicable. A sub-question associated with this issue is, 'How many species, or what degree of functional diversity, should we incorporate to sustain ecosystem function in modeled ecosystems?' Here, we focus on how the inclusion of additional plant species and trait information may strengthen dynamic vegetation modeling in applications pertaining to: (1) forage for caribou in northern Alaska, (2) above- and belowground carbon storage in the boreal forest and lake margin wetlands of interior Alaska, and (3) arctic tundra and boreal forest leaf phenology. While the inclusion of additional information generally proved valuable in these three applications, this additional detail depends on field data that may not always be available and may also result in increased computational complexity. Therefore, it is important to assess these possible limitations against the perceived need for additional plant species and trait information in the development and application of dynamic vegetation models.

  10. Mechanisms and modeling of the effects of additives on the nitrogen oxides emission

    Kundu, Krishna P.; Nguyen, Hung Lee; Kang, M. Paul

    1991-01-01

    A theoretical study of the emission of the oxides of nitrogen in the combustion of hydrocarbons is presented. The current understanding of the mechanisms and rate parameters for gas-phase reactions was used to calculate the NO(x) emission. The possible effects of different chemical species on thermal NO(x) over long time scales are discussed. The mixing of these additives at various stages of combustion was considered and NO(x) concentrations were calculated; the effects of temperature were also considered. Chemicals such as hydrocarbons, H2, CH3OH, NH3, and other nitrogen species were chosen as additives in this discussion. The results of these calculations can be used to evaluate the effects of these additives on NO(x) emission in industrial combustion systems.
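
    Thermal NO(x) chemistry is dominated by the strongly temperature-dependent Zeldovich step N2 + O -> NO + N; the Python sketch below uses a commonly quoted textbook rate expression and made-up concentrations, purely to show the temperature sensitivity that additives exploit (treat the numbers as order-of-magnitude only).

    import math

    def thermal_no_rate(T, O_conc, N2_conc):
        """Initial thermal-NO formation rate d[NO]/dt ~ 2*k1*[O][N2]
        (mol/cm3/s), with the rate-limiting Zeldovich step
        N2 + O -> NO + N. The k1 expression is a commonly quoted
        textbook value; treat it as order-of-magnitude only."""
        k1 = 1.8e14 * math.exp(-38370.0 / T)   # cm3 mol-1 s-1
        return 2.0 * k1 * O_conc * N2_conc

    # Anything that lowers peak flame temperature suppresses thermal NOx
    for T in (1600.0, 1800.0, 2000.0, 2200.0):
        r = thermal_no_rate(T, O_conc=1e-10, N2_conc=1e-5)
        print(f"T={T:.0f} K: d[NO]/dt ~ {r:.2e} mol/cm3/s")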

  11. High-resolution and Monte Carlo additions to the SASKTRAN radiative transfer model

    D. J. Zawada

    2015-06-01

    The Optical Spectrograph and InfraRed Imaging System (OSIRIS) instrument on board the Odin spacecraft has been measuring limb-scattered radiance since 2001. The vertical radiance profiles measured as the instrument nods are inverted, with the aid of the SASKTRAN radiative transfer model, to obtain vertical profiles of trace atmospheric constituents. Here we describe two newly developed modes of the SASKTRAN radiative transfer model: a high-spatial-resolution mode and a Monte Carlo mode. The high-spatial-resolution mode is a successive-orders model capable of modelling the multiply scattered radiance when the atmosphere is not spherically symmetric; the Monte Carlo mode is intended for use as a highly accurate reference model. It is shown that the two models agree in a wide variety of solar conditions to within 0.2 %. As an example case for both models, Odin–OSIRIS scans were simulated with the Monte Carlo model and retrieved using the high-resolution model. A systematic bias of up to 4 % in retrieved ozone number density between scans where the instrument is scanning up or scanning down was identified. The bias is largest when the sun is near the horizon and the solar scattering angle is far from 90°. It was found that calculating the multiply scattered diffuse field at five discrete solar zenith angles is sufficient to eliminate the bias for typical Odin–OSIRIS geometries.

  12. TURBHO - Higher order turbulence modeling for industrial applications. Design document: Module Test Phase (MTP). Software engineering module: Additional physical models

    Grotjans, H.

    1998-04-01

    In the current Software Engineering Module (SEM2) three additional test cases have been investigated, as listed in Chapter 2. For all test cases it has been shown that the computed results are grid-independent; this was done by systematic grid refinement studies. The main objective of the current SEM2 was the verification and validation of the new wall function implementation for the k-{epsilon} model and the SMC model. Analytical relations and experimental data have been used for comparison with the computational results, and the agreement is good. The correct implementation of the new wall function has therefore been demonstrated. As the results in this report show, a consistent grid refinement can be done for any test case. This is an important improvement for industrial applications, as no model-specific requirements must be considered during grid generation. (orig.)

  13. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 0.0 m: wave-hazard projections

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  14. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 0.5 m: wave-hazard projections

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  15. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 1.0 m: wave-hazard projections

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  16. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 2.0 m: wave-hazard projections

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  17. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 1.5 m: wave-hazard projections

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  18. Hydrodynamic Modeling of Flash Floods in an Andean Stream: Challenges for Assessing Flood Hazards in Mountain Rivers

    Contreras, M. T.; Escauriaza, C. R.

    2015-12-01

    Rain-induced flash floods are common events in regions close to the southern Andes, in northern and central Chile. Rapid urban development combined with the changing climate and ENSO effects has resulted in an alarming proximity of flood-prone streams to densely populated areas in the Andean foothills, increasing the risk for cities and infrastructure. Simulations of rapid floods in these complex watersheds are particularly challenging, especially where geomorphological and hydrometeorological data are insufficient. In the Quebrada de Ramón, an Andean stream that passes through a highly populated area in the eastern part of Santiago, Chile, previous events have demonstrated that sediment concentration, flow resistance, and the characteristic temporal and spatial scales of the hydrograph are important variables for predicting the arrival time of the peak discharge, flow velocities and the extent of inundated areas. The objective of this investigation is to improve our understanding of the dynamics of flash floods in the Quebrada de Ramón, quantifying the effects of these factors on flood propagation. We implement a two-dimensional model based on the shallow water equations (Guerra et al. 2014), modified to account for hyperconcentrated flows over natural topography. We evaluate events of specific return periods and sediment concentrations, using different methodologies to quantify the flow resistance in the channel and floodplains. Through this work we provide a framework for future studies aimed at improving hazard assessment, urban planning, and early warning systems in urban areas near mountain streams with limited data and affected by rapid flood events. Work supported by Fondecyt grant 1130940 and CONICYT/FONDAP grant 15110017.

  19. Potentially Hazardous Asteroid (85989) 1999 JD6: Radar, Infrared, and Lightcurve Observations and a Preliminary Shape Model

    Marshall, Sean E.; Howell, Ellen S.; Brozović, Marina; Taylor, Patrick A.; Campbell, Donald B.; Benner, Lance A. M.; Naidu, Shantanu P.; Giorgini, Jon D.; Jao, Joseph S.; Lee, Clement G.; Richardson, James E.; Rodriguez-Ford, Linda A.; Rivera-Valentin, Edgard G.; Ghigo, Frank; Kobelski, Adam; Busch, Michael W.; Pravec, Petr; Warner, Brian D.; Reddy, Vishnu; Hicks, Michael D.; Crowell, Jenna L.; Fernandez, Yanga R.; Vervack, Ronald J.; Nolan, Michael C.; Magri, Christopher; Sharkey, Benjamin; Bozek, Brandon

    2015-11-01

    We report observations of potentially hazardous asteroid (85989) 1999 JD6, which passed 0.048 AU from Earth (19 lunar distances) during its close approach on July 25, 2015. During eleven days between July 15 and August 4, 2015, we observed 1999 JD6 with the Goldstone Solar System Radar and with Arecibo Observatory's planetary radar, including bistatic reception of some Goldstone echoes at Green Bank. We obtained delay-Doppler radar images at a wide range of latitudes, with range resolutions varying from 7.5 to 150 meters per pixel, depending on the observing conditions. We acquired near-infrared spectra from the NASA InfraRed Telescope Facility (IRTF) on two nights in July 2015, at wavelengths from 0.75 to 5.0 microns, showing JD6's thermal emission. We also obtained optical lightcurves from Ondrejov Observatory (in 1999), Table Mountain Observatory (in 2000), and Palmer Divide Station (in 2015). Previous observers had suggested that 1999 JD6 was most likely an elongated object, based on its large lightcurve amplitude of 1.2 magnitudes (Szabo et al. 2001; Polishook and Brosch 2008; Warner 2014). The radar images reveal an elongated peanut-shaped object, with two lobes separated by a sharp concavity. JD6's maximum diameter is about two kilometers, and its larger lobe is approximately 50% longer than its smaller lobe. The larger lobe has a concavity on its end. We will present more details on the shape and rotation state of 1999 JD6, as well as its surface properties from optical and infrared data and thermal modeling.

  20. Generalized Additive Models for Location Scale and Shape (GAMLSS) in R

    D. Mikis Stasinopoulos

    2007-11-01

    GAMLSS is a general framework for fitting regression-type models in which the distribution of the response variable does not have to belong to the exponential family and can include highly skewed and kurtotic continuous and discrete distributions. GAMLSS allows all the parameters of the distribution of the response variable to be modelled as linear/non-linear or smooth functions of the explanatory variables. This paper starts by defining the statistical framework of GAMLSS, then describes the current implementation of GAMLSS in R, and finally gives four different data examples to demonstrate how GAMLSS can be used for statistical modelling.
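
    The core GAMLSS idea, modelling several distribution parameters at once, can be illustrated outside R as well; the Python sketch below (an illustration of the idea, not the gamlss package) fits a normal response whose mean and log standard deviation both depend on a covariate.

    import numpy as np
    from scipy.optimize import minimize

    # Simulate a heteroscedastic normal response
    rng = np.random.default_rng(7)
    n = 400
    x = rng.uniform(-1.0, 1.0, n)
    y = 1.0 + 2.0 * x + np.exp(-0.5 + 1.0 * x) * rng.normal(size=n)

    def negloglik(theta):
        b0, b1, g0, g1 = theta
        mu = b0 + b1 * x               # location model
        sigma = np.exp(g0 + g1 * x)    # scale model (log link)
        return np.sum(0.5 * ((y - mu) / sigma) ** 2 + np.log(sigma))

    fit = minimize(negloglik, x0=np.zeros(4), method="BFGS")
    print(fit.x)   # ~[1.0, 2.0, -0.5, 1.0]
    # Full GAMLSS additionally models skewness/kurtosis parameters and
    # replaces the linear predictors with smooth functions.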