WorldWideScience

Sample records for additive hazards model

  1. A flexible additive multiplicative hazard model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas H.

    2002-01-01

    Aalen's additive model; Counting process; Cox regression; Hazard model; Proportional excess hazard model; Time-varying effect

  2. Further Results on Dynamic Additive Hazard Rate Model

    Directory of Open Access Journals (Sweden)

    Zhengcheng Zhang

    2014-01-01

    Full Text Available In the past, the proportional and additive hazard rate models have been investigated in the literature. Nanda and Das (2011) introduced and studied the dynamic proportional (reversed) hazard rate model. In this paper we study the dynamic additive hazard rate model, and investigate its aging properties for different aging classes. The closure of the model under some stochastic orders has also been investigated. Some examples are also given to illustrate different aging properties and stochastic comparisons of the model.
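
    For orientation, a minimal sketch of the additive hazard rate structure, assuming the common reliability formulation in which a baseline hazard rate is shifted by a nonnegative additive term (the dynamic model lets this term vary; the paper's exact definition may differ):

        \lambda^{*}(t) = \lambda(t) + \delta, \quad \delta \geq 0, \qquad \text{so} \qquad S^{*}(t) = S(t)\, e^{-\delta t}.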

  3. Coordinate descent methods for the penalized semiparametric additive hazards model

    DEFF Research Database (Denmark)

    Gorst-Rasmussen, Anders; Scheike, Thomas

    For survival data with a large number of explanatory variables, lasso penalized Cox regression is a popular regularization strategy. However, a penalized Cox model may not always provide the best fit to data and can be difficult to estimate in high dimension because of its intrinsic nonlinearity. The semiparametric additive hazards model is a flexible alternative which is a natural survival analogue of the standard linear regression model. Building on this analogy, we develop a cyclic coordinate descent algorithm for fitting the lasso and elastic net penalized additive hazards model. The algorithm requires...

  4. Coordinate descent methods for the penalized semiparametric additive hazards model

    DEFF Research Database (Denmark)

    Gorst-Rasmussen, Anders; Scheike, Thomas

    2012-01-01

    For survival data with a large number of explanatory variables, lasso penalized Cox regression is a popular regularization strategy. However, a penalized Cox model may not always provide the best fit to data and can be difficult to estimate in high dimension because of its intrinsic nonlinearity. The semiparametric additive hazards model is a flexible alternative which is a natural survival analogue of the standard linear regression model. Building on this analogy, we develop a cyclic coordinate descent algorithm for fitting the lasso and elastic net penalized additive hazards model. The algorithm requires...
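
    Because the Lin-Ying additive hazards loss is quadratic in the coefficients, the lasso fit reduces to cyclic soft-thresholding. The following is a minimal sketch (our construction, not the authors' code), assuming the matrix D and vector d have already been assembled from the additive hazards estimating equations:

        import numpy as np

        def soft_threshold(z, lam):
            # S(z, lam) = sign(z) * max(|z| - lam, 0)
            return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

        def lasso_additive_hazards(D, d, lam, max_iter=500, tol=1e-8):
            """Cyclic coordinate descent for 0.5*b'Db - d'b + lam*||b||_1,
            where D (p x p, positive semidefinite with positive diagonal) and
            d (p,) come from the Lin-Ying additive hazards estimating
            equations; elastic net would add a ridge term to the diagonal."""
            p = d.shape[0]
            beta = np.zeros(p)
            for _ in range(max_iter):
                max_change = 0.0
                for j in range(p):
                    # j-th partial residual, excluding beta_j's own contribution
                    r_j = d[j] - D[j] @ beta + D[j, j] * beta[j]
                    b_new = soft_threshold(r_j, lam) / D[j, j]
                    max_change = max(max_change, abs(b_new - beta[j]))
                    beta[j] = b_new
                if max_change < tol:
                    break
            return beta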

  5. Additive Hazard Regression Models: An Application to the Natural History of Human Papillomavirus

    OpenAIRE

    Xianhong Xie; Howard D. Strickler; Xiaonan Xue

    2013-01-01

    There are several statistical methods for time-to-event analysis, among which the Cox proportional hazards model is the most commonly used. However, when the absolute change in risk, instead of the risk ratio, is of primary interest or when the proportional hazards assumption for the Cox model is violated, an additive hazard regression model may be more appropriate. In this paper, we give an overview of this approach and then apply a semiparametric as well as a nonparametric...

  6. The additive hazards model with high-dimensional regressors

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas

    2009-01-01

    ... model. A standard PLS algorithm can also be constructed, but it turns out that the resulting predictor can only be related to the original covariates via time-dependent coefficients. The methods are applied to a breast cancer data set with gene expression recordings and to the well known primary biliary...

  7. Additive Hazard Regression Models: An Application to the Natural History of Human Papillomavirus

    Directory of Open Access Journals (Sweden)

    Xianhong Xie

    2013-01-01

    Full Text Available There are several statistical methods for time-to-event analysis, among which the Cox proportional hazards model is the most commonly used. However, when the absolute change in risk, instead of the risk ratio, is of primary interest or when the proportional hazards assumption for the Cox model is violated, an additive hazard regression model may be more appropriate. In this paper, we give an overview of this approach and then apply a semiparametric as well as a nonparametric additive model to a data set from a study of the natural history of human papillomavirus (HPV) in HIV-positive and HIV-negative women. The results from the semiparametric model indicated on average an additional 14 oncogenic HPV infections per 100 woman-years related to CD4 count < 200 relative to HIV-negative women, and those from the nonparametric additive model showed an additional 40 oncogenic HPV infections per 100 women over 5 years of follow-up, while the estimated hazard ratio in the Cox model was 3.82. Although the Cox model can provide a better understanding of the exposure-disease association, the additive model is often more useful for public health planning and intervention.
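
    Using the figures quoted in the abstract, the two effect scales can be written side by side (notation ours): the additive fit reports an absolute excess that translates directly into case counts for planning, the Cox fit a relative effect.

        \text{additive: } \lambda_1(t) = \lambda_0(t) + 0.14 \text{ per woman-year}, \qquad \text{Cox: } \lambda_1(t) = 3.82\,\lambda_0(t).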

  8. The additive hazards model with high-dimensional regressors

    DEFF Research Database (Denmark)

    Martinussen, Torben

    2009-01-01

    This paper considers estimation and prediction in the Aalen additive hazards model in the case where the covariate vector is high-dimensional such as gene expression measurements. Some form of dimension reduction of the covariate space is needed to obtain useful statistical analyses. We study the ... model. A standard PLS algorithm can also be constructed, but it turns out that the resulting predictor can only be related to the original covariates via time-dependent coefficients...

  9. Asymptotics on Semiparametric Analysis of Multivariate Failure Time Data Under the Additive Hazards Model

    Institute of Scientific and Technical Information of China (English)

    Huan-bin Liu; Liu-quan Sun; Li-xing Zhu

    2005-01-01

    Many survival studies record the times to two or more distinct failures on each subject. The failures may be events of different natures or may be repetitions of the same kind of event. In this article, we consider the regression analysis of such multivariate failure time data under the additive hazards model. Simple weighted estimating functions for the regression parameters are proposed, and the asymptotic distribution theory of the resulting estimators is derived. In addition, a class of generalized Wald and generalized score statistics for hypothesis testing and model selection is presented, and the asymptotic properties of these statistics are examined.
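
    In outline, the estimating functions are of the weighted Lin-Ying form (our notation; the paper's exact weights and indexing may differ), with K failure types per subject:

        U(\beta) = \sum_{i=1}^{n} \sum_{k=1}^{K} \int_0^\tau w_k(t)\,\{Z_{ik}(t) - \bar{Z}_k(t)\}\,\{dN_{ik}(t) - Y_{ik}(t)\,\beta^\top Z_{ik}(t)\,dt\}.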

  10. Estimation of direct effects for survival data by using the Aalen additive hazards model

    DEFF Research Database (Denmark)

    Martinussen, T.; Vansteelandt, S.; Gerster, M.

    2011-01-01

    We extend the definition of the controlled direct effect of a point exposure on a survival outcome, other than through some given, time-fixed intermediate variable, to the additive hazard scale. We propose two-stage estimators for this effect when the exposure is dichotomous and randomly assigned and when the association between the intermediate variable and the survival outcome is confounded only by measured factors, which may themselves be affected by the exposure. The first stage of the estimation procedure involves assessing the effect of the intermediate variable on the survival outcome via Aalen's additive regression for the event time, given exposure, intermediate variable and confounders. The second stage involves applying Aalen's additive model, given the exposure alone, to a modified stochastic process (i.e. a modification of the observed counting process based on the first...

  11. Education and risk of coronary heart disease: Assessment of mediation by behavioural risk factors using the additive hazards model

    DEFF Research Database (Denmark)

    Nordahl, H.; Rod, N. H.; Frederiksen, B. L.

    2013-01-01

    ... seven Danish cohort studies were linked to registry data on education and incidence of CHD. Mediation by smoking, low physical activity, and body mass index (BMI) on the association between education and CHD was estimated by applying newly proposed methods for mediation based on the additive hazards model. ... (95 % CI: 12, 22) cases for women and 37 (95 % CI: 28, 46) for men could be ascribed to the pathway through smoking. Further, 39 (95 % CI: 30, 49) cases for women and 94 (95 % CI: 79, 110) cases for men could be ascribed to the pathway through BMI. The effects of low physical activity were negligible. Using...

  12. Additive Hazards Regression with Random Effects for Clustered Failure Times

    Institute of Scientific and Technical Information of China (English)

    Deng PAN; Yan Yan LIU; Yuan Shan WU

    2015-01-01

    Additive hazards model with random effects is proposed for modelling the correlated failure time data when focus is on comparing the failure times within clusters and on estimating the correlation between failure times from the same cluster, as well as the marginal regression parameters. Our model features that, when marginalized over the random effect variable, it still enjoys the structure of the additive hazards model. We develop the estimating equations for inferring the regression parameters. The proposed estimators are shown to be consistent and asymptotically normal under appropriate regularity conditions. Furthermore, the estimator of the baseline hazards function is proposed and its asymptotic properties are also established. We propose a class of diagnostic methods to assess the overall fitting adequacy of the additive hazards model with random effects. We conduct simulation studies to evaluate the finite sample behaviors of the proposed estimators in various scenarios. Analysis of the Diabetic Retinopathy Study is provided as an illustration for the proposed method.
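
    A specification consistent with this description (our notation, not taken from the paper) is, for subject j in cluster i,

        \lambda_{ij}(t \mid b_i) = \lambda_0(t) + b_i + \beta^\top Z_{ij},

    and marginalizing over b_i gives \tilde{\lambda}_0(t) + \beta^\top Z_{ij} with \tilde{\lambda}_0(t) = \lambda_0(t) - \frac{d}{dt} \log E[e^{-b_i t}], so the additive structure survives with the same \beta and a modified baseline.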

  13. Model Additional Protocol

    International Nuclear Information System (INIS)

    Since the end of the cold war a series of events has changed the circumstances and requirements of the safeguards system. The discovery of a clandestine nuclear weapons program in Iraq, the continuing difficulty in verifying the initial report of the Democratic People's Republic of Korea upon entry into force of their safeguards agreement, and the decision of the South African Government to give up its nuclear weapons program and join the Treaty on the Non-Proliferation of Nuclear Weapons have all played a role in an ambitious effort by IAEA Member States and the Secretariat to strengthen the safeguards system. A major milestone in this effort was reached in May 1997 when the IAEA Board of Governors approved a Model Protocol Additional to Safeguards Agreements. The Model Additional Protocol was negotiated over a period of less than a year by an open-ended committee of the Board involving some 70 Member States and two regional inspectorates. The IAEA is now in the process of negotiating additional protocols, State by State, and implementing them. These additional protocols will provide the IAEA with rights of access to information about all activities related to the use of nuclear material in States with comprehensive safeguards agreements and greatly expanded physical access for IAEA inspectors to confirm or verify this information. In conjunction with this, the IAEA is working on the integration of these measures with those provided for in comprehensive safeguards agreements, with a view to maximizing, within available resources, the effectiveness and efficiency of the implementation of safeguards. Details concerning the Model Additional Protocol are given. (author)

  14. Computer Model Locates Environmental Hazards

    Science.gov (United States)

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.

  15. Crossing Hazard Functions in Common Survival Models.

    Science.gov (United States)

    Zhang, Jiajia; Peng, Yingwei

    2009-10-15

    Crossing hazard functions have extensive applications in modeling survival data. However, existing studies in the literature mainly focus on comparing crossed hazard functions and estimating the time at which the hazard functions cross, and there is little theoretical work on conditions under which hazard functions from a model will have a crossing. In this paper, we investigate crossing status of hazard functions from the proportional hazards (PH) model, the accelerated hazard (AH) model, and the accelerated failure time (AFT) model. We provide and prove conditions under which the hazard functions from the AH and the AFT models have no crossings or a single crossing. A few examples are also provided to demonstrate how the conditions can be used to determine crossing status of hazard functions from the three models.
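
    A worked example of the kind of condition involved (standard Weibull algebra, not a result quoted from the paper): two Weibull hazards \lambda_i(t) = k_i \rho_i t^{k_i - 1} with k_1 \neq k_2 cross at exactly one positive time, found by equating the two hazards and solving for t:

        t^{*} = \left(\frac{k_2 \rho_2}{k_1 \rho_1}\right)^{1/(k_1 - k_2)}.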

  16. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of the hazard modeling distribution that approximates different distributions.

  17. Satellite image collection modeling for large area hazard emergency response

    Science.gov (United States)

    Liu, Shufan; Hodgson, Michael E.

    2016-08-01

    Timely collection of critical hazard information is the key to intelligent and effective hazard emergency response decisions. Satellite remote sensing imagery provides an effective way to collect critical information. Natural hazards, however, often have large impact areas - larger than a single satellite scene. Additionally, the hazard impact area may be discontinuous, particularly in flooding or tornado hazard events. In this paper, a spatial optimization model is proposed to solve the large area satellite image acquisition planning problem in the context of hazard emergency response. In the model, a large hazard impact area is represented as multiple polygons and image collection priorities for different portions of the impact area are addressed. The optimization problem is solved with an exact algorithm. Application results demonstrate that the proposed method can address the satellite image acquisition planning problem. A spatial decision support system supporting the optimization model was developed. Several examples of image acquisition problems are used to demonstrate the complexity of the problem and derive optimized solutions.

  18. Modeling and Hazard Analysis Using STPA

    Science.gov (United States)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, a category that describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions, rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis...

  19. Mixed additive models

    Science.gov (United States)

    Carvalho, Francisco; Covas, Ricardo

    2016-06-01

    We consider mixed models $y = \sum_{i=0}^{w} X_i \beta_i$ with $V(y) = \sum_{i=1}^{w} \theta_i M_i$, where $M_i = X_i X_i^\top$, $i = 1, \dots, w$, and $\mu = X_0 \beta_0$. For these we will estimate the variance components $\theta_1, \dots, \theta_w$, as well as estimable vectors, through the decomposition of the initial model into sub-models $y(h)$, $h \in \Gamma$, with $V(y(h)) = \gamma(h) I_{g(h)}$, $h \in \Gamma$. Moreover we will consider $L$ extensions of these models, i.e., $\mathring{y} = L y + \epsilon$, where $L = D(1_{n_1}, \dots, 1_{n_w})$ and $\epsilon$, independent of $y$, has null mean vector and variance-covariance matrix $\theta_{w+1} I_n$, where $n = \sum_{i=1}^{w} n_i$.

  20. Validation of a heteroscedastic hazards regression model.

    Science.gov (United States)

    Wu, Hong-Dar Isaac; Hsieh, Fushing; Chen, Chen-Hsin

    2002-03-01

    A Cox-type regression model accommodating heteroscedasticity, with a power factor of the baseline cumulative hazard, is investigated for analyzing data with crossing hazards behavior. Since the approach of partial likelihood cannot eliminate the baseline hazard, an overidentified estimating equation (OEE) approach is introduced in the estimation procedure. Its by-product, a model checking statistic, is presented to test for the overall adequacy of the heteroscedastic model. Further, under the heteroscedastic model setting, we propose two statistics to test the proportional hazards assumption. Implementation of this model is illustrated in a data analysis of a cancer clinical trial. PMID:11878222
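
    A form consistent with the abstract's "power factor of the baseline cumulative hazard" (our reconstruction; the paper's parameterization may differ) is

        \Lambda(t \mid Z) = \Lambda_0(t)^{\exp(\gamma^\top Z)} \exp(\beta^\top Z),

    where \gamma = 0 recovers the Cox model, so tests of \gamma = 0 act as tests of the proportional hazards assumption.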

  1. POTENTIAL HAZARDS DUE TO FOOD ADDITIVES IN ORAL HYGIENE PRODUCTS

    OpenAIRE

    Damla TUNCER-BUDANUR; Murat Cengizhan YAŞ; Elif SEPET

    2016-01-01

    Food additives used to preserve flavor or to enhance the taste and appearance of foods are also available in oral hygiene products. The aim of this review is to provide information concerning food additives in oral hygiene products and their adverse effects. A great many food additives in oral hygiene products are potential allergens and they may lead to allergic reactions such as urticaria, contact dermatitis, rhinitis, and angioedema. Dental practitioners, as well as health care providers, must be aware of the possibility of allergic reactions due to food additives in oral hygiene products...

  2. POTENTIAL HAZARDS DUE TO FOOD ADDITIVES IN ORAL HYGIENE PRODUCTS

    Directory of Open Access Journals (Sweden)

    Damla TUNCER-BUDANUR

    2016-04-01

    Full Text Available Food additives used to preserve flavor or to enhance the taste and appearance of foods are also available in oral hygiene products. The aim of this review is to provide information concerning food additives in oral hygiene products and their adverse effects. A great many food additives in oral hygiene products are potential allergens and they may lead to allergic reactions such as urticaria, contact dermatitis, rhinitis, and angioedema. Dental practitioners, as well as health care providers, must be aware of the possibility of allergic reactions due to food additives in oral hygiene products. Proper dosage levels, delivery vehicles, frequency, potential benefits, and adverse effects of oral health products should be explained completely to the patients. There is a necessity to raise the awareness among dental professionals on this subject and to develop a data gathering system for possible adverse reactions.

  3. Hazard Warning: model misuse ahead

    DEFF Research Database (Denmark)

    Dickey-Collas, M.; Payne, M.; Trenkel, V.

    2014-01-01

    The use of modelling approaches in marine science, and in particular fisheries science, is explored. We highlight that the choice of model used for an analysis should account for the question being posed or the context of the management problem. We examine a model-classification scheme based on R...

  4. A conflict model for the international hazardous waste disposal dispute

    Energy Technology Data Exchange (ETDEWEB)

    Hu Kaixian, E-mail: k2hu@engmail.uwaterloo.ca [Department of Systems Design Engineering, University of Waterloo, 200 University Avenue West, Waterloo, Ontario, N2L 3G1 (Canada); Hipel, Keith W., E-mail: kwhipel@uwaterloo.ca [Department of Systems Design Engineering, University of Waterloo, 200 University Avenue West, Waterloo, Ontario, N2L 3G1 (Canada); Fang, Liping, E-mail: lfang@ryerson.ca [Department of Mechanical and Industrial Engineering, Ryerson University, 350 Victoria Street, Toronto, Ontario, M5B 2K3 (Canada)

    2009-12-15

    A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.

  5. Spatial extended hazard model with application to prostate cancer survival.

    Science.gov (United States)

    Li, Li; Hanson, Timothy; Zhang, Jiajia

    2015-06-01

    This article develops a Bayesian semiparametric approach to the extended hazard model, with generalization to high-dimensional spatially grouped data. County-level spatial correlation is accommodated marginally through the normal transformation model of Li and Lin (2006, Journal of the American Statistical Association 101, 591-603), using a correlation structure implied by an intrinsic conditionally autoregressive prior. Efficient Markov chain Monte Carlo algorithms are developed, especially applicable to fitting very large, highly censored areal survival data sets. Per-variable tests for proportional hazards, accelerated failure time, and accelerated hazards are efficiently carried out with and without spatial correlation through Bayes factors. The resulting reduced, interpretable spatial models can fit significantly better than a standard additive Cox model with spatial frailties. PMID:25521422

  6. Business models for additive manufacturing

    DEFF Research Database (Denmark)

    Hadar, Ronen; Bilberg, Arne; Bogers, Marcel

    2015-01-01

    Digital fabrication — including additive manufacturing (AM), rapid prototyping and 3D printing — has the potential to revolutionize the way in which products are produced and delivered to the customer. Therefore, it challenges companies to reinvent their business model — describing the logic of creating and capturing value. In this paper, we explore the implications that AM technologies have for manufacturing systems in the new business models that they enable. In particular, we consider how a consumer goods manufacturer can organize the operations of a more open business model when moving from a manufacturer-centric to a consumer-centric value logic. A major shift includes a move from centralized to decentralized supply chains, where consumer goods manufacturers can implement a "hybrid" approach with a focus on localization and accessibility or develop a fully personalized model where the consumer...

  7. On multiple agent models of moral hazard

    OpenAIRE

    Andrea Attar; Eloisa Campioni; Gwenaël Piaser; Uday Rajan

    2006-01-01

    In multiple principal, multiple agent models of moral hazard, we provide conditions under which the outcomes of equilibria in direct mechanisms are preserved when principals can offer indirect communication schemes. We discuss the role of random allocations and recommendations and relate the result to the existing literature.

  8. Hazard identification based on plant functional modelling

    International Nuclear Information System (INIS)

    A major objective of the present work is to provide means for representing a process plant as a socio-technical system, so as to allow hazard identification at a high level. The method includes technical, human and organisational aspects and is intended to be used for plant level hazard identification so as to identify critical areas and the need for further analysis using existing methods. The first part of the method is the preparation of a plant functional model where a set of plant functions link together hardware, software, operations, work organisation and other safety related aspects of the plant. The basic principle of the functional modelling is that any aspect of the plant can be represented by an object (in the sense that this term is used in computer science) based upon an Intent (or goal); associated with each Intent are Methods, by which the Intent is realized, and Constraints, which limit the Intent. The Methods and Constraints can themselves be treated as objects and decomposed into lower-level Intents (hence the procedure is known as functional decomposition) so giving rise to a hierarchical, object-oriented structure. The plant level hazard identification is carried out on the plant functional model using the Concept Hazard Analysis method. In this, the user will be supported by checklists and keywords and the analysis is structured by pre-defined worksheets. The preparation of the plant functional model and the performance of the hazard identification can be carried out manually or with computer support. (au) (4 tabs., 10 ills., 7 refs.)

  9. Proportional hazards models with discrete frailty.

    Science.gov (United States)

    Caroni, Chrys; Crowder, Martin; Kimber, Alan

    2010-07-01

    We extend proportional hazards frailty models for lifetime data to allow a negative binomial, Poisson, geometric, or other discrete distribution of the frailty variable. This might represent, for example, the unknown number of flaws in an item under test. Zero frailty corresponds to a limited failure model containing a proportion of units that never fail (long-term survivors). Ways of modifying the model to avoid this are discussed. The models are illustrated on a previously published set of data on failures of printed circuit boards and on new data on breaking strengths of samples of cord.
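
    In outline (our notation, consistent with the abstract but not quoted from the paper): with conditional hazard \lambda(t \mid Z) = Z\,\lambda_0(t) and discrete frailty Z with probability generating function G_Z, the marginal survivor function is

        S(t) = E\left[e^{-Z \Lambda_0(t)}\right] = G_Z\!\left(e^{-\Lambda_0(t)}\right) \longrightarrow G_Z(0) = P(Z = 0) \quad \text{as } t \to \infty,

    so a point mass at zero frailty is precisely the long-term survivor fraction mentioned in the abstract.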

  10. Ground-level ozone following astrophysical ionizing radiation events: an additional biological hazard?

    CERN Document Server

    Thomas, Brian C

    2015-01-01

    Astrophysical ionizing radiation events such as supernovae, gamma-ray bursts, and solar proton events have been recognized as a potential threat to life on Earth, primarily through depletion of stratospheric ozone and subsequent increase in solar UV radiation at Earth's surface and in the upper levels of the ocean. Other work has also considered the potential impact of nitric acid rainout, concluding that no significant threat is likely. Not yet studied to date is the potential impact of ozone produced in the lower atmosphere following an ionizing radiation event. Ozone is a known irritant to organisms on land and in water and therefore may be a significant additional hazard. Using previously completed atmospheric chemistry modeling we have examined the amount of ozone produced in the lower atmosphere for the case of a gamma-ray burst and find that the values are too small to pose a significant additional threat to the biosphere. These results may be extended to other ionizing radiation events, including supernovae and extreme solar proton events.

  11. Ground-Level Ozone Following Astrophysical Ionizing Radiation Events: An Additional Biological Hazard?

    Science.gov (United States)

    Thomas, Brian C; Goracke, Byron D

    2016-01-01

    Astrophysical ionizing radiation events such as supernovae, gamma-ray bursts, and solar proton events have been recognized as a potential threat to life on Earth, primarily through depletion of stratospheric ozone and subsequent increase in solar UV radiation at Earth's surface and in the upper levels of the ocean. Other work has also considered the potential impact of nitric acid rainout, concluding that no significant threat is likely. Not yet studied to date is the potential impact of ozone produced in the lower atmosphere following an ionizing radiation event. Ozone is a known irritant to organisms on land and in water and therefore may be a significant additional hazard. Using previously completed atmospheric chemistry modeling, we examined the amount of ozone produced in the lower atmosphere for the case of a gamma-ray burst and found that the values are too small to pose a significant additional threat to the biosphere. These results may be extended to other ionizing radiation events, including supernovae and extreme solar proton events. PMID:26745353

  12. Application of a hazard-based visual predictive check to evaluate parametric hazard models.

    Science.gov (United States)

    Huh, Yeamin; Hutmacher, Matthew M

    2016-02-01

    Parametric models used in time-to-event analyses are typically evaluated by survival-based visual predictive checks (VPCs). Kaplan-Meier survival curves for the observed data are compared with those estimated using model-simulated data. Because the derivative of the log of the survival curve is related to the hazard (the typical quantity modeled in parametric analysis), isolation, interpretation and correction of deficiencies in the hazard model determined by inspection of survival-based VPCs is indirect and thus more difficult. The purpose of this study is to assess the performance of nonparametric estimators of hazard functions to evaluate their viability as VPC diagnostics. Histogram-based and kernel-smoothing estimators were evaluated in terms of bias in estimating the hazard for Weibull and bathtub-shape hazard scenarios. After the evaluation of bias, these nonparametric estimators were assessed as a method for VPC evaluation of the hazard model. The results showed that nonparametric hazard estimators performed reasonably at the sample sizes studied, with greater bias near the boundaries (time equal to 0 and last observation), as expected. Flexible bandwidth and boundary correction methods reduced these biases. All the nonparametric estimators indicated a misfit of the Weibull model when the true hazard was a bathtub shape. Overall, hazard-based VPC plots enabled more direct interpretation of the VPC results compared to survival-based VPC plots. PMID:26563504
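
    As an illustration of the kernel-smoothing idea, a minimal sketch (our construction, not the authors' code) that smooths Nelson-Aalen increments with an Epanechnikov kernel; the bandwidth flexibility and boundary corrections discussed in the abstract are omitted:

        import numpy as np

        def kernel_hazard(grid, times, events, bandwidth):
            """Ramlau-Hansen style estimate: smooth the Nelson-Aalen increments
            dL(t_i) = 1 / n_at_risk(t_i) with an Epanechnikov kernel.
            times: follow-up times; events: 1 = event, 0 = censored."""
            order = np.argsort(times)
            times, events = np.asarray(times)[order], np.asarray(events)[order]
            n = times.shape[0]
            at_risk = n - np.arange(n)          # sorted, so i..n-1 still at risk
            event_times = times[events == 1]
            increments = 1.0 / at_risk[events == 1]
            u = (grid[:, None] - event_times[None, :]) / bandwidth
            kern = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)
            return (kern * increments[None, :]).sum(axis=1) / bandwidth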

  13. Integrated Modeling for Flood Hazard Mapping Using Watershed Modeling System

    Directory of Open Access Journals (Sweden)

    Seyedeh S. Sadrolashrafi

    2008-01-01

    Full Text Available In this study, a new framework which integrates the Geographic Information System (GIS) with the Watershed Modeling System (WMS) for flood modeling is developed. It also interconnects the terrain models and the GIS software with standard commercial hydrological and hydraulic models, including HEC-1, HEC-RAS, etc. The Dez River Basin (about 16213 km2) in Khuzestan province, Iran, is the domain of study because of its frequent severe flash flooding. As a case study, a major flood in autumn of 2001 is chosen to examine the modeling framework. The model consists of a rainfall-runoff model (HEC-1) that converts excess precipitation to overland flow and channel runoff, and a hydraulic model (HEC-RAS) that simulates steady-state flow through the river channel network based on the HEC-1 peak hydrographs. In addition, it delineates maps of potential flood zonation for the Dez River Basin. These are achieved with state-of-the-art GIS using WMS software. Watershed parameters are calibrated manually to perform a good simulation of discharge at three sub-basins. With the calibrated discharge, WMS is capable of producing a flood hazard map. The modeling framework presented in this study demonstrates the accuracy and usefulness of the WMS software for flash flooding control. The results of this research will benefit future modeling efforts by providing validated hydrological software to forecast flooding on a regional scale. This model is designed for the Dez River Basin, but may be used as a prototype for model applications in other areas.

  14. Lahar Hazard Modeling at Tungurahua Volcano, Ecuador

    Science.gov (United States)

    Sorensen, O. E.; Rose, W. I.; Jaya, D.

    2003-04-01

    ... lahar-hazard-zones using a digital elevation model (DEM), was used to construct a hazard map for the volcano. The 10 meter resolution DEM was constructed for Tungurahua Volcano using scanned topographic lines obtained from the GIS Department at the Escuela Politécnica Nacional, Quito, Ecuador. The steep topographic gradients and rapid downcutting of most rivers draining the edifice prevent the deposition of lahars on the lower flanks of Tungurahua. Modeling confirms the high degree of flow channelization in the deep Tungurahua canyons. Inundation zones observed and modeled by LAHARZ at Baños identify safe zones within the city that would provide safety from even the largest magnitude lahar expected.

  15. Reactive Additive Stabilization Process (RASP) for hazardous and mixed waste vitrification

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, C.M.; Pickett, J.B.; Ramsey, W.G.

    1993-07-01

    Solidification of hazardous/mixed wastes into glass is being examined at the Savannah River Site (SRS) for (1) nickel plating line (F006) sludges and (2) incinerator wastes. Vitrification of these wastes using high surface area additives, the Reactive Additive Stabilization Process (RASP), has been determined to greatly enhance the dissolution and retention of hazardous, mixed, and heavy metal species in glass. RASP lowers melt temperatures (typically 1050--1150 °C), thereby minimizing volatility concerns during vitrification. RASP maximizes waste loading (typically 50--75 wt% on a dry oxide basis) by taking advantage of the glass forming potential of the waste. RASP vitrification thereby minimizes waste disposal volume (typically 86--97 vol. %), and maximizes cost savings. Solidification of the F006 plating line sludges containing depleted uranium has been achieved in both soda-lime-silica (SLS) and borosilicate glasses at 1150 °C up to waste loadings of 75 wt%. Solidification of incinerator blowdown and mixtures of incinerator blowdown and bottom kiln ash have been achieved in SLS glass at 1150 °C up to waste loadings of 50% using RASP. These waste loadings correspond to volume reductions of 86 and 94 volume %, respectively, with large associated savings in storage costs.

  16. Reactive Additive Stabilization Process (RASP) for hazardous and mixed waste vitrification

    International Nuclear Information System (INIS)

    Solidification of hazardous/mixed wastes into glass is being examined at the Savannah River Site (SRS) for (1) nickel plating line (F006) sludges and (2) incinerator wastes. Vitrification of these wastes using high surface area additives, the Reactive Additive Stabilization Process (RASP), has been determined to greatly enhance the dissolution and retention of hazardous, mixed, and heavy metal species in glass. RASP lowers melt temperatures (typically 1050--1150 degrees C), thereby minimizing volatility concerns during vitrification. RASP maximizes waste loading (typically 50--75 wt% on a dry oxide basis) by taking advantage of the glass forming potential of the waste. RASP vitrification thereby minimizes waste disposal volume (typically 86--97 vol. %), and maximizes cost savings. Solidification of the F006 plating line sludges containing depleted uranium has been achieved in both soda-lime-silica (SLS) and borosilicate glasses at 1150 degrees C up to waste loadings of 75 wt%. Solidification of incinerator blowdown and mixtures of incinerator blowdown and bottom kiln ash have been achieved in SLS glass at 1150 degrees C up to waste loadings of 50% using RASP. These waste loadings correspond to volume reductions of 86 and 94 volume %, respectively, with large associated savings in storage costs.

  17. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Full Text Available Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.

  18. Incident duration modeling using flexible parametric hazard-based models.

    Science.gov (United States)

    Li, Ruimin; Shang, Pan

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
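
    All of the named families share the accelerated failure time form (standard notation, not specific to this paper):

        \log T = \beta^\top x + \sigma \varepsilon \quad \Longleftrightarrow \quad S(t \mid x) = S_0\!\left(t\, e^{-\beta^\top x}\right),

    with Weibull, log-logistic, log-normal, and generalized gamma corresponding to different choices for the distribution of \varepsilon.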

  19. A quantitative model for volcanic hazard assessment

    OpenAIRE

    Marzocchi, W.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Roma1, Roma, Italia; Sandri, L.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Bologna, Bologna, Italia; Furlan, C.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Bologna, Bologna, Italia

    2006-01-01

    Volcanic hazard assessment is a basic ingredient for risk-based decision-making in land-use planning and emergency management. Volcanic hazard is defined as the probability of any particular area being affected by a destructive volcanic event within a given period of time (Fournier d’Albe 1979). The probabilistic nature of such an important issue derives from the fact that volcanic activity is a complex process, characterized by several and usually unknown degrees of...

  20. Nonparametric and semiparametric dynamic additive regression models

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder; Martinussen, Torben

    Dynamic additive regression models provide a flexible class of models for analysis of longitudinal data. The approach suggested in this work is suited for measurements obtained at random time points and aims at estimating time-varying effects. Both fully nonparametric and semiparametric models can...

  1. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    Science.gov (United States)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena generally represent a range of model complexities, from simplified physics based conceptual models to highly coupled thermo fluid dynamical approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented with the aim to provide a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  2. A model based on crowdsourcing for detecting natural hazards

    Science.gov (United States)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides a new method for the detection, early warning, mitigation and relief of natural hazards. Given the suddenness and the unpredictability of the location of natural hazards, as well as the actual demands of hazards work, this article proposes an evaluation model for remote sensing detection of natural hazards based on crowdsourcing. Firstly, using the crowdsourcing model and with the help of the Internet and the power of hundreds of millions of Internet users, this evaluation model provides visual interpretation of high-resolution remote sensing images of hazard areas and collects massive valuable disaster data; secondly, this evaluation model adopts the strategy of dynamic voting consistency to evaluate the disaster data provided by the crowdsourcing workers; thirdly, this evaluation model pre-estimates the disaster severity with the disaster pre-evaluation model based on regional buffers; lastly, the evaluation model actuates the corresponding expert system work according to the forecast results. The idea of this model breaks the boundaries between geographic information professionals and the public, allows public participation and citizen science to be realized, and improves the accuracy and timeliness of hazards assessment results.

  3. A Moral Hazard Model of Parental Care

    OpenAIRE

    Baomin Dong; Tianpeng Zhou

    2013-01-01

    One perplexing observation is that although men and women have different comparative advantages, cooperation is often only seen during child-bearing and rearing periods. One interpretation is that the juvenile offspring serves as an indivisible public good that facilitates cooperation between opposite sexes of adults. We show that moral hazard in maternal parental care will either force the father to pay the mother a rent in order to induce optimal care (when the child is of intrinsic high quality...

  4. 2015 USGS Seismic Hazard Model for Induced Seismicity

    Science.gov (United States)

    Petersen, M. D.; Mueller, C. S.; Moschetti, M. P.; Hoover, S. M.; Ellsworth, W. L.; Llenos, A. L.; Michael, A. J.

    2015-12-01

    Over the past several years, the seismicity rate has increased markedly in multiple areas of the central U.S. Studies have tied the majority of this increased activity to wastewater injection in deep wells and hydrocarbon production. These earthquakes are induced by human activities that change rapidly based on economic and policy decisions, making them difficult to forecast. Our 2014 USGS National Seismic Hazard Model and previous models are intended to provide the long-term hazard (2% probability of exceedance in 50 years) and are based on seismicity rates and patterns observed mostly from tectonic earthquakes. However, potentially induced earthquakes were identified in 14 regions that were not included in the earthquake catalog used for constructing the 2014 model. We recognized the importance of considering these induced earthquakes in a separate hazard analysis, and as a result in April 2015 we released preliminary models that explored the impact of this induced seismicity on the hazard. Several factors are important in determining the hazard from induced seismicity: period of the catalog that optimally forecasts the next year's activity, earthquake magnitude-rate distribution, earthquake location statistics, maximum magnitude, ground motion models, and industrial drivers such as injection rates. The industrial drivers are not currently available in a form that we can implement in a 1-year model. Hazard model inputs have been evaluated by a broad group of scientists and engineers to assess the range of acceptable models. Results indicate that next year's hazard is significantly higher by more than a factor of three in Oklahoma, Texas, and Colorado compared to the long-term 2014 hazard model. These results have raised concern about the impacts of induced earthquakes on the built environment and have led to many engineering and policy discussions about how to mitigate these effects for the more than 7 million people that live near areas of induced seismicity.

  5. Flood hazard maps from SAR data and global hydrodynamic models

    Science.gov (United States)

    Giustarini, Laura; Chini, Marci; Hostache, Renaud; Matgen, Patrick; Pappenberger, Florian; Bally, Phillippe

    2015-04-01

    With flood consequences likely to amplify because of growing population and ongoing accumulation of assets in flood-prone areas, global flood hazard and risk maps are greatly needed for improving flood preparedness at large scale. At the same time, with the rapidly growing archives of SAR images of floods, there is a high potential of making use of these images for global and regional flood management. In this framework, an original method is presented to integrate global flood inundation modeling and microwave remote sensing. It takes advantage of the combination of the time and space continuity of a global inundation model with the high spatial resolution of satellite observations. The availability of model simulations over a long time period offers the opportunity to estimate flood non-exceedance probabilities in a robust way. The probabilities can later be attributed to historical satellite observations. SAR-derived flood extent maps with their associated non-exceedance probabilities are then combined to generate flood hazard maps with a spatial resolution equal to that of the satellite images, which is most of the time higher than that of a global inundation model. The method can be applied to any area of interest in the world, provided that a sufficient number of relevant remote sensing images are available. We applied the method on the Severn River (UK) and on the Zambezi River (Mozambique), where large archives of Envisat flood images can be exploited. The global ECMWF flood inundation model is considered for computing the statistics of extreme events. A comparison with flood hazard maps estimated with in situ measured discharge is carried out. An additional analysis has been performed on the Severn River, using high resolution SAR data from the COSMO-SkyMed SAR constellation, acquired for a single flood event (one flood map per day between 27/11/2012 and 4/12/2012). The results showed that it is vital to observe the peak of the flood. However, a single...

  6. VHub - Cyberinfrastructure for volcano eruption and hazards modeling and simulation

    Science.gov (United States)

    Valentine, G. A.; Jones, M. D.; Bursik, M. I.; Calder, E. S.; Gallo, S. M.; Connor, C.; Carn, S. A.; Rose, W. I.; Moore-Russo, D. A.; Renschler, C. S.; Pitman, B.; Sheridan, M. F.

    2009-12-01

    Volcanic risk is increasing as populations grow in active volcanic regions, and as national economies become increasingly intertwined. In addition to their significance to risk, volcanic eruption processes form a class of multiphase fluid dynamics with rich physics on many length and time scales. Risk significance, physics complexity, and the coupling of models to complex dynamic spatial datasets all demand the development of advanced computational techniques and interdisciplinary approaches to understand and forecast eruption dynamics. Innovative cyberinfrastructure is needed to enable global collaboration and novel scientific creativity, while simultaneously enabling computational thinking in real-world risk mitigation decisions - an environment where quality control, documentation, and traceability are key factors. Supported by NSF, we are developing a virtual organization, referred to as VHub, to address this need. Overarching goals of the VHub project are: Dissemination. Make advanced modeling and simulation capabilities and key data sets readily available to researchers, students, and practitioners around the world. Collaboration. Provide a mechanism for participants not only to be users but also co-developers of modeling capabilities, and contributors of experimental and observational data sets for use in modeling and simulation, in a collaborative environment that reaches far beyond local work groups. Comparison. Facilitate comparison between different models in order to provide the practitioners with guidance for choosing the "right" model, depending upon the intended use, and provide a platform for multi-model analysis of specific problems and incorporation into probabilistic assessments. Application. Greatly accelerate access and application of a wide range of modeling tools and related data sets to agencies around the world that are charged with hazard planning, mitigation, and response. Education. Provide resources that will promote the training of the...

  7. RELIABILITY AND HAZARD RATE ESTIMATION OF A LIFE TESTING MODEL

    Directory of Open Access Journals (Sweden)

    Vinod Kumar

    2010-01-01

    Full Text Available The present paper deals with the reliability and hazard rate estimation of a Weibull-type life testing model. Its use as a life testing model has also been illustrated. The proposed model has been found better than the exponential for several sets of lifetime data. Some characteristics of the model have also been investigated.

  8. A high-resolution global flood hazard model

    Science.gov (United States)

    Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul B.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-09-01

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ~90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ~1 km, mean absolute error in flooded fraction falls to ~5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.

  9. Regional landslide hazard assessment based on Distance Evaluation Model

    Institute of Scientific and Technical Information of China (English)

    Jiacun LI; Yan QIN; Jing LI

    2008-01-01

    There are many factors influencing landslide occurrence. The key for landslide control is to identify the regional landslide hazard factors. The Cameron Highlands of Malaysia was selected as the study area. Using a bivariate statistical analysis method with GIS software, the authors analyzed the relationships among landslides and environmental factors such as lithology, geomorphology, elevation, road and land use. A Distance Evaluation Model was developed with Landslide Density (LD), and the assessment of landslide hazard of the Cameron Highlands was performed. The result shows that the model has high prediction precision.

  10. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    Science.gov (United States)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.

  11. Research collaboration, hazard modeling and dissemination in volcanology with Vhub

    Science.gov (United States)

    Palma Lizana, J. L.; Valentine, G. A.

    2011-12-01

    Vhub (online at vhub.org) is a cyberinfrastructure for collaboration in volcanology research, education, and outreach. One of the core objectives of this project is to accelerate the transfer of research tools to organizations and stakeholders charged with volcano hazard and risk mitigation (such as observatories). Vhub offers a clearinghouse for computational models of volcanic processes and data analysis, documentation of those models, and capabilities for online collaborative groups focused on issues such as code development, configuration management, benchmarking, and validation. A subset of simulations is already available for online execution, eliminating the need to download and compile locally. In addition, Vhub is a platform for sharing presentations and other educational material in a variety of media formats, which are useful in teaching university-level volcanology. VHub also has wikis, blogs and group functions around specific topics to encourage collaboration and discussion. In this presentation we provide examples of the vhub capabilities, including: (1) tephra dispersion and block-and-ash flow models; (2) shared educational materials; (3) online collaborative environment for different types of research, including field-based studies and plume dispersal modeling; (4) workshops. Future goals include implementation of middleware to allow access to data and databases that are stored and maintained at various institutions around the world. All of these capabilities can be exercised with a user-defined level of privacy, ranging from completely private (only shared and visible to specified people) to completely public. The volcanological community is encouraged to use the resources of vhub and also to contribute models, datasets, and other items that authors would like to disseminate. The project is funded by the US National Science Foundation and includes a core development team at University at Buffalo, Michigan Technological University, and University

  12. A generalized additive regression model for survival times

    DEFF Research Database (Denmark)

    Scheike, Thomas H.

    2001-01-01

    Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models......Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models...

  13. Analysis of time to event outcomes in randomized controlled trials by generalized additive models.

    Directory of Open Access Journals (Sweden)

    Christos Argyropoulos

    Full Text Available Randomized controlled trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazards model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking. By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAM) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks or even differences in restricted mean survival time between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO (a long-duration study conducted under evolving standards of care on a heterogeneous patient population). PGAM can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, they supported not only unadjusted (overall) treatment effect analyses but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial results under proportional and
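
    A minimal sketch of the time-splitting idea is given below, using a piecewise-exponential Poisson GLM with a spline in time as a stand-in for the authors' PGAM machinery; equally spaced split points replace the Gauss-Lobatto nodes, and all data are simulated.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 300
    df = pd.DataFrame({
        "time": rng.exponential(2.0, n),    # follow-up time (simulated)
        "event": rng.integers(0, 2, n),     # 1 = event, 0 = censored
        "arm": rng.integers(0, 2, n),       # treatment arm
    })

    # Split each subject's follow-up at fixed nodes (Gauss-Lobatto nodes in
    # the paper; equally spaced here for simplicity).
    nodes = np.linspace(0, df["time"].max(), 8)
    rows = []
    for _, r in df.iterrows():
        prev = 0.0
        for t in nodes[1:]:
            stop = min(r["time"], t)
            if stop <= prev:
                break
            rows.append({
                "tmid": 0.5 * (prev + stop),              # interval midpoint
                "exposure": stop - prev,                  # time at risk in interval
                "event": int((r["event"] == 1) and (stop == r["time"])),
                "arm": r["arm"],
            })
            prev = stop
    long = pd.DataFrame(rows)

    # Poisson model for the hazard: smooth function of time (natural cubic
    # spline via patsy's cr()) plus treatment effect, with log-exposure offset.
    fit = smf.glm("event ~ cr(tmid, df=4) + arm", data=long,
                  family=sm.families.Poisson(),
                  offset=np.log(long["exposure"])).fit()
    print(fit.params["arm"])   # log hazard ratio for the treatment arm
    ```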

  14. Generalized Additive Models in Business and Economics

    Directory of Open Access Journals (Sweden)

    Sunil K Sapra

    2013-06-01

    Full Text Available The paper presents applications of a class of semi-parametric models called generalized additive models (GAMs to several business and economic datasets. Applications include analysis of wage-education relationship, brand choice, and number of trips to a doctor’s office. The dependent variable may be continuous, categorical or count.  These semi-parametric models are flexible and robust extensions of Logit, Poisson, Negative Binomial and other generalized linear models. The GAMs are represented using penalized regression splines and are estimated by penalized regression methods. The degree of smoothness for the unknown functions in the linear predictor part of the GAM is estimated using cross validation. The GAMs allow us to build a regression surface as a sum of lower-dimensional nonparametric terms circumventing the curse of dimensionality: the slow convergence of an estimator to the true value in high dimensions. For each application studied in the paper, several GAMs are compared and the best model is selected using AIC, UBRE score, deviances, and R-sq (adjusted. The econometric techniques utilized in the paper are widely applicable to the analysis of count, binary response and duration types of data encountered in business and economics.
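
    For a flavour of the penalized-spline GAMs described above, here is a small Python sketch with the pygam package (simulated wage-style data; pygam's grid search over the smoothing penalty stands in for the paper's cross-validation/UBRE-based smoothness selection).

    ```python
    import numpy as np
    from pygam import LinearGAM, s, f

    rng = np.random.default_rng(1)
    n = 500
    education = rng.uniform(8, 20, n)      # years of schooling (simulated)
    industry = rng.integers(0, 4, n)       # categorical industry code
    wage = 2.0 * np.log(education) + 0.3 * industry + rng.normal(0, 0.2, n)

    X = np.column_stack([education, industry])
    # Smooth term for the continuous covariate, factor term for the
    # categorical one; gridsearch picks the smoothing penalty automatically.
    gam = LinearGAM(s(0) + f(1)).gridsearch(X, wage)
    gam.summary()
    ```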

  15. Toward Building a New Seismic Hazard Model for Mainland China

    Science.gov (United States)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China was generated by Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data using the methodology recommended by Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPE) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonics. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and business and land-use planning.
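
    The tapered Gutenberg-Richter (TGR) simulation step lends itself to a compact rejection sampler: propose seismic moments from the pure power law and accept with the exponential taper probability. The sketch below is illustrative only; the threshold and corner moments are placeholders, not values from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_tgr_moments(n, m_t, m_c, beta=2/3):
        """Sample seismic moments from a tapered Gutenberg-Richter (TGR)
        distribution by rejection: propose from the pure power law (Pareto)
        and accept with the taper probability exp((m_t - m) / m_c)."""
        out = []
        while len(out) < n:
            m = m_t * rng.uniform() ** (-1.0 / beta)   # Pareto (pure GR) proposal
            if rng.uniform() < np.exp((m_t - m) / m_c):
                out.append(m)
        return np.array(out)

    # Hypothetical threshold/corner moments (N*m); values for illustration only.
    moments = sample_tgr_moments(10_000, m_t=10**16.0, m_c=10**19.5)
    mw = (2.0 / 3.0) * (np.log10(moments) - 9.05)      # moment magnitude
    print(mw.mean(), mw.max())
    ```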

  16. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until layer-by-layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost -many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  17. Rockfall hazard analysis using LiDAR and spatial modeling

    Science.gov (United States)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.

  18. Random weighting method for Cox's proportional hazards model

    Institute of Scientific and Technical Information of China (English)

    CUI WenQuan; LI Kai; YANG YaNing; WU YueHua

    2008-01-01

    Variance of parameter estimate in Cox's proportional hazards model is based on asymptotic variance. When sample size is small, variance can be estimated by bootstrap method. However, if censoring rate in a survival data set is high, bootstrap method may fail to work properly. This is because bootstrap samples may be even more heavily censored due to repeated sampling of the censored observations. This paper proposes a random weighting method for variance estimation and confidence interval estimation for proportional hazards model. This method, unlike the bootstrap method, does not lead to more severe censoring than the original sample does. Its large sample properties are studied and the consistency and asymptotic normality are proved under mild conditions. Simulation studies show that the random weighting method is not as sensitive to heavy censoring as bootstrap method is and can produce good variance estimates or confidence intervals.

  19. Random weighting method for Cox’s proportional hazards model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Variance of parameter estimate in Cox’s proportional hazards model is based on asymptotic variance. When sample size is small, variance can be estimated by bootstrap method. However, if censoring rate in a survival data set is high, bootstrap method may fail to work properly. This is because bootstrap samples may be even more heavily censored due to repeated sampling of the censored observations. This paper proposes a random weighting method for variance estimation and confidence interval estimation for proportional hazards model. This method, unlike the bootstrap method, does not lead to more severe censoring than the original sample does. Its large sample properties are studied and the consistency and asymptotic normality are proved under mild conditions. Simulation studies show that the random weighting method is not as sensitive to heavy censoring as bootstrap method is and can produce good variance estimates or confidence intervals.
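
    A minimal sketch of the random-weighting idea (not necessarily the authors' exact scheme) is shown below using Python's lifelines: i.i.d. Exp(1) weights replace bootstrap resampling, so censored observations are never duplicated and censoring is never made artificially heavier.

    ```python
    import numpy as np
    from lifelines import CoxPHFitter
    from lifelines.datasets import load_rossi

    df = load_rossi()
    rng = np.random.default_rng(7)

    estimates = []
    for _ in range(200):
        # Random weighting: positive i.i.d. weights stand in for the
        # resampling step of the bootstrap, so no observation is dropped.
        w = rng.exponential(1.0, size=len(df))
        cph = CoxPHFitter()
        cph.fit(df.assign(w=w), duration_col="week", event_col="arrest",
                weights_col="w", robust=True)
        estimates.append(cph.params_["age"])

    print(np.std(estimates))   # random-weighting standard error for 'age'
    ```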

  20. Estimation of the 2-sample hazard ratio function using a semiparametric model

    OpenAIRE

    Yang, Song; Prentice, Ross L.

    2010-01-01

    The hazard ratio provides a natural target for assessing a treatment effect with survival data, with the Cox proportional hazards model providing a widely used special case. In general, the hazard ratio is a function of time and provides a visual display of the temporal pattern of the treatment effect. A variety of nonproportional hazards models have been proposed in the literature. However, available methods for flexibly estimating a possibly time-dependent hazard ratio are limited. Here, we...

  1. Defaultable Game Options in a Hazard Process Model

    Directory of Open Access Journals (Sweden)

    Tomasz R. Bielecki

    2009-01-01

    Full Text Available The valuation and hedging of defaultable game options is studied in a hazard process model of credit risk. A convenient pricing formula with respect to a reference filtration is derived. A connection of arbitrage prices with a suitable notion of hedging is obtained. The main result shows that the arbitrage prices are the minimal superhedging prices with sigma martingale cost under a risk neutral measure.

  2. Modelling public risk evaluation of natural hazards: a conceptual approach

    Directory of Open Access Journals (Sweden)

    Th. Plattner

    2005-01-01

    Full Text Available In recent years, the dealing with natural hazards in Switzerland has shifted away from being hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources cause an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economical, ecological and social considerations. This modern proceeding requires an approach in which not only technological, engineering or scientific aspects of the definition of the hazard or the computation of the risk are considered, but also the public concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they don't know what risk level the public is willing to accept. Consequently, there exists a need for the authorities to know what the society thinks about risks. A formalized model that allows at least a crude simulation of the public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach of such an evaluation model using perception affecting factors PAF, evaluation criteria EC and several factors without any immediate relation to the risk itself, but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by a comparison of the perceived risk R_i,perc with the acceptable risk R_i,acc.

  3. Modelling public risk evaluation of natural hazards: a conceptual approach

    Science.gov (United States)

    Plattner, Th.

    2005-04-01

    In recent years, the dealing with natural hazards in Switzerland has shifted away from being hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources cause an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economical, ecological and social considerations. This modern proceeding requires an approach in which not only technological, engineering or scientific aspects of the definition of the hazard or the computation of the risk are considered, but also the public concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they don't know what risk level the public is willing to accept. Consequently, there exists a need for the authorities to know what the society thinks about risks. A formalized model that allows at least a crude simulation of the public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach of such an evaluation model using perception affecting factors PAF, evaluation criteria EC and several factors without any immediate relation to the risk itself, but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by a comparison of the perceived risk R_i,perc with the acceptable risk R_i,acc.
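
    In pseudo-formal terms, the model's final step compares a perceived risk, obtained by scaling the objective risk with perception-affecting factors, against an acceptable level derived from the evaluation criteria. The toy sketch below (all factor names and numbers hypothetical) shows that comparison; it is one reading of the conceptual structure, not the paper's calibrated model.

    ```python
    # Minimal sketch, assuming perceived risk = objective risk scaled by
    # multiplicative perception-affecting factors (PAF).
    def perceived_risk(objective_risk, paf_weights):
        scale = 1.0
        for weight in paf_weights.values():
            scale *= weight          # e.g. dread or unfamiliarity inflate risk
        return objective_risk * scale

    def accepts(objective_risk, paf_weights, acceptable_risk):
        # Acceptance Acc: perceived risk R_i,perc vs acceptable risk R_i,acc.
        return perceived_risk(objective_risk, paf_weights) <= acceptable_risk

    paf = {"voluntariness": 1.2, "catastrophic_potential": 1.5}  # hypothetical
    print(accepts(objective_risk=1e-4, paf_weights=paf, acceptable_risk=2e-4))
    ```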

  4. Development of hazard-compatible building fragility and vulnerability models

    Science.gov (United States)

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
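
    The core objects described above, fragility curves conditioned on spectral acceleration and the vulnerability (loss) curve assembled from them, can be sketched compactly. Below, lognormal fragilities with placeholder medians and dispersions (not HAZUS values) are combined with damage-state loss ratios into an expected repair-cost ratio.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Lognormal fragility: P(DS >= ds | Sa) = Phi(ln(Sa / theta) / beta).
    # Medians (theta, in g) and dispersions (beta) are hypothetical.
    damage_states = {"slight": (0.2, 0.6), "moderate": (0.4, 0.6),
                     "extensive": (0.8, 0.7), "complete": (1.5, 0.7)}
    loss_ratio = {"slight": 0.02, "moderate": 0.1,
                  "extensive": 0.4, "complete": 1.0}

    def vulnerability(sa):
        """Expected repair-cost ratio at spectral acceleration sa, turning
        exceedance fragilities into discrete damage-state probabilities."""
        exceed = {ds: norm.cdf(np.log(sa / th) / b)
                  for ds, (th, b) in damage_states.items()}
        order = list(damage_states)  # slight -> complete
        loss = 0.0
        for i, ds in enumerate(order):
            p_next = exceed[order[i + 1]] if i + 1 < len(order) else 0.0
            loss += (exceed[ds] - p_next) * loss_ratio[ds]
        return loss

    print(vulnerability(0.5))   # expected loss ratio at Sa = 0.5 g
    ```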

  5. An Additive-Multiplicative Cox-Aalen Regression Model

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2002-01-01

    Aalen model; additive risk model; counting processes; Cox regression; survival analysis; time-varying effects......Aalen model; additive risk model; counting processes; Cox regression; survival analysis; time-varying effects...

  6. On penalized likelihood estimation for a non-proportional hazards regression model

    OpenAIRE

    Devarajan, Karthik; Ebrahimi, Nader

    2013-01-01

    In this paper, a semi-parametric generalization of the Cox model that permits crossing hazard curves is described. A theoretical framework for estimation in this model is developed based on penalized likelihood methods. It is shown that the optimal solution to the baseline hazard, baseline cumulative hazard and their ratio are hyperbolic splines with knots at the distinct failure times.

  7. On penalized likelihood estimation for a non-proportional hazards regression model.

    Science.gov (United States)

    Devarajan, Karthik; Ebrahimi, Nader

    2013-07-01

    In this paper, a semi-parametric generalization of the Cox model that permits crossing hazard curves is described. A theoretical framework for estimation in this model is developed based on penalized likelihood methods. It is shown that the optimal solution to the baseline hazard, baseline cumulative hazard and their ratio are hyperbolic splines with knots at the distinct failure times.

  8. A decision model for the risk management of hazardous processes

    International Nuclear Information System (INIS)

    A decision model for risk management of hazardous processes as an optimisation problem of a point process is formulated in the study. In the approach, the decisions made by the management are divided into three categories: (1) planned process lifetime, (2) selection of the design and, (3) operational decisions. These three controlling methods play quite different roles in the practical risk management, which is also reflected in our approach. The optimisation of the process lifetime is related to the licensing problem of the process. It provides a boundary condition for a feasible utility function that is used as the actual objective function, i.e., maximizing the process lifetime utility. By design modifications, the management can affect the inherent accident hazard rate of the process. This is usually a discrete optimisation task. The study particularly concentrates upon the optimisation of the operational strategies given a certain design and licensing time. This is done by a dynamic risk model (marked point process model) representing the stochastic process of events observable or unobservable to the decision maker. An optimal long term control variable guiding the selection of operational alternatives in short term problems is studied. The optimisation problem is solved by the stochastic quasi-gradient procedure. The approach is illustrated by a case study. (23 refs.)

  9. Model averaging for semiparametric additive partial linear models

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    To improve the prediction accuracy of semiparametric additive partial linear models (APLM) and the coverage probability of confidence intervals of the parameters of interest, we explore a focused information criterion for model selection among APLMs after estimating the nonparametric functions by polynomial spline smoothing, and introduce a general model average estimator. The major advantage of the proposed procedures is that iterative backfitting implementation is avoided, which results in gains in computational simplicity. The resulting estimators are shown to be asymptotically normal. A simulation study and a real data analysis are presented for illustration.

  10. Jackknifed random weighting for Cox proportional hazards model

    Institute of Scientific and Technical Information of China (English)

    LI Xiao; WU YaoHua; TU DongSheng

    2012-01-01

    The Cox proportional hazards model is the most used statistical model in the analysis of survival time data. Recently, a random weighting method was proposed to approximate the distribution of the maximum partial likelihood estimate for the regression coefficient in the Cox model. This method was shown to be less sensitive to heavy censoring than the bootstrap method in simulation studies, but it may not be second-order accurate, as was shown for the bootstrap approximation. In this paper, we propose an alternative random weighting method based on one-step linear jackknife pseudo values and prove the second-order accuracy of the proposed method. Monte Carlo simulations are also performed to evaluate the proposed method for fixed sample sizes.

  11. A DNA based model for addition computation

    Institute of Scientific and Technical Information of China (English)

    GAO Lin; YANG Xiao; LIU Wenbin; XU Jin

    2004-01-01

    Much effort has been made to solve computing problems using DNA, an organic simulation method that in some cases is preferable to the current electronic computer. However, no one has yet proposed an effective and applicable method to solve the addition problem with a molecular algorithm, owing to the difficulty of the carry problem, which is easily solved by the hardware of an electronic computer. In this article, we solve this problem by employing two kinds of DNA strings: one is called the result and operation string, while the other is named the carrier. The result and operation string contains some carry information of its own and denotes the ultimate result, while the carrier is used only for carrying. The significance of this algorithm lies in its original encoding, its fairly easy steps, and its feasibility under current molecular biological technology.

  12. Hazard based models for freeway traffic incident duration.

    Science.gov (United States)

    Tavassoli Hojati, Ahmad; Ferreira, Luis; Washington, Simon; Charles, Phil

    2013-03-01

    Assessing and prioritising cost-effective strategies to mitigate the impacts of traffic incidents and accidents on non-recurrent congestion on major roads represents a significant challenge for road network managers. This research examines the influence of numerous factors associated with incidents of various types on their duration. It presents a comprehensive traffic incident data mining and analysis by developing an incident duration model based on twelve months of incident data obtained from the Australian freeway network. Parametric accelerated failure time (AFT) survival models of incident duration were developed, including log-logistic, lognormal, and Weibull, considering both fixed and random parameters, as well as a Weibull model with gamma heterogeneity. The Weibull AFT models with random parameters were appropriate for modelling incident duration arising from crashes and hazards. A Weibull model with gamma heterogeneity was most suitable for modelling the incident duration of stationary vehicles. Significant variables affecting incident duration include characteristics of the incidents (severity, type, towing requirements, etc.), and the location, time of day, and traffic characteristics of the incident. Moreover, the findings reveal no significant effects of infrastructure and weather on incident duration. A significant and unique contribution of this paper is that the durations of each type of incident are uniquely different and respond to different factors. The results of this study are useful for traffic incident management agencies to implement strategies to reduce incident duration, leading to reduced congestion, secondary incidents, and the associated human and economic losses.
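
    A hedged illustration of fitting a parametric AFT model to incident-duration data with Python's lifelines follows; the data-generating process and covariates are simulated stand-ins, and lifelines' fixed-parameter Weibull AFT does not reproduce the paper's random-parameter or gamma-heterogeneity variants.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import WeibullAFTFitter

    rng = np.random.default_rng(3)
    n = 300
    severity = rng.integers(1, 5, n)
    towing = rng.integers(0, 2, n)
    # Hypothetical data-generating process: towing and severity lengthen incidents.
    duration = rng.weibull(1.5, n) * 30 * np.exp(0.3 * severity + 0.5 * towing)

    df = pd.DataFrame({
        "duration": duration,                # incident duration (minutes)
        "cleared": np.ones(n, dtype=int),    # all incidents eventually cleared
        "severity": severity,
        "towing": towing,
    })

    aft = WeibullAFTFitter()
    aft.fit(df, duration_col="duration", event_col="cleared")
    aft.print_summary()   # exp(coef) acts multiplicatively on expected duration
    ```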

  13. BEYOND FLOOD HAZARD MAPS: DETAILED FLOOD CHARACTERIZATION WITH REMOTE SENSING, GIS AND 2D MODELLING

    Directory of Open Access Journals (Sweden)

    J. R. Santillan

    2016-09-01

    Full Text Available Flooding is considered to be one of the most destructive among many natural disasters, such that understanding floods and assessing the risks associated with them are becoming more important nowadays. In the Philippines, Remote Sensing (RS) and Geographic Information System (GIS) are two main technologies used in the nationwide modelling and mapping of flood hazards. Although the currently available high resolution flood hazard maps have become very valuable, their use for flood preparedness and mitigation can be maximized by enhancing the layers of information these maps portray. In this paper, we present an approach based on RS, GIS and two-dimensional (2D) flood modelling to generate new flood layers (in addition to the usual flood depths and hazard layers) that are also very useful in flood disaster management, such as flood arrival times, flood velocities, flood duration, flood recession times, and the percentage within a given flood event period a particular location is inundated. The availability of these new layers of flood information is crucial for better decision making before, during, and after the occurrence of a flood disaster. The generation of these new flood characteristic layers is illustrated using the Cabadbaran River Basin in Mindanao, Philippines as the case study area. It is envisioned that these detailed maps can be considered as additional inputs in flood disaster risk reduction and management in the Philippines.

  14. Beyond Flood Hazard Maps: Detailed Flood Characterization with Remote Sensing, GIS and 2d Modelling

    Science.gov (United States)

    Santillan, J. R.; Marqueso, J. T.; Makinano-Santillan, M.; Serviano, J. L.

    2016-09-01

    Flooding is considered to be one of the most destructive among many natural disasters, such that understanding floods and assessing the risks associated with them are becoming more important nowadays. In the Philippines, Remote Sensing (RS) and Geographic Information System (GIS) are two main technologies used in the nationwide modelling and mapping of flood hazards. Although the currently available high resolution flood hazard maps have become very valuable, their use for flood preparedness and mitigation can be maximized by enhancing the layers of information these maps portray. In this paper, we present an approach based on RS, GIS and two-dimensional (2D) flood modelling to generate new flood layers (in addition to the usual flood depths and hazard layers) that are also very useful in flood disaster management, such as flood arrival times, flood velocities, flood duration, flood recession times, and the percentage within a given flood event period a particular location is inundated. The availability of these new layers of flood information is crucial for better decision making before, during, and after the occurrence of a flood disaster. The generation of these new flood characteristic layers is illustrated using the Cabadbaran River Basin in Mindanao, Philippines as the case study area. It is envisioned that these detailed maps can be considered as additional inputs in flood disaster risk reduction and management in the Philippines.
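
    Given gridded depth time series from a 2D model run, the extra layers named above (arrival time, recession time, duration, percent-of-event inundation) reduce to simple array operations. A minimal numpy sketch on synthetic output follows; the 0.10 m wet threshold is an assumption.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    # Synthetic hourly depth stack (time, row, col) standing in for 2D output.
    depth = np.clip(rng.normal(0, 0.3, (48, 100, 100)).cumsum(axis=0), 0, None)
    wet = depth > 0.10                          # assumed wet/dry threshold (m)

    never = ~wet.any(axis=0)
    arrival = np.where(never, np.nan, np.argmax(wet, axis=0))   # first wet hour
    recession = np.where(never, np.nan,
                         wet.shape[0] - 1 - np.argmax(wet[::-1], axis=0))  # last wet hour
    duration = wet.sum(axis=0)                                  # wet hours per cell
    percent_inundated = 100.0 * duration / wet.shape[0]         # % of event wet
    ```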

  15. Extenics Model for Evaluating Vulnerable Degree of Regional Sustaining Hazard Body

    Institute of Scientific and Technical Information of China (English)

    Fan Yunxiao; Luo Yun; Chen Qingshou

    2004-01-01

    The effect of a hazard is determined by the dangerousness of the hazard factors and environment and by the vulnerability of the hazard-sustaining body. Research into the latter is important for hazard theory and for the formulation of laws on the mitigation of natural hazards. How to evaluate the degree of vulnerability is the foundation of and the key to this research. In this paper, an extenics model is established to do this job.

  16. A Mathematical Model for the Industrial Hazardous Waste Location-Routing Problem

    OpenAIRE

    Boyer, Omid; Sai Hong, Tang; Pedram, Ali; Mohd Yusuff, Rosnah Bt; Zulkifli, Norzima

    2013-01-01

    Technological progress has caused industrial hazardous waste to increase throughout the world. Management of hazardous waste is a significant issue due to the risk imposed on the environment and human life. This risk can be a result of the location of undesirable facilities and also of routing hazardous waste. In this paper, a bi-objective mixed integer programming model for location-routing industrial hazardous waste is developed. The first objective is total cost minimization, including tra...
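
    Although the abstract is truncated, the model class it names (a bi-objective location-routing MIP) can be sketched in a few lines. Below is a toy, scalarised variant in Python with PuLP: cost plus a weighted route-risk term, with assignment and facility-opening constraints; every name and number is hypothetical.

    ```python
    from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

    sites = ["s1", "s2"]                     # candidate treatment facilities
    sources = ["w1", "w2", "w3"]             # hazardous-waste generation points
    open_cost = {"s1": 100, "s2": 140}
    ship_cost = {("w1", "s1"): 4, ("w1", "s2"): 7, ("w2", "s1"): 6,
                 ("w2", "s2"): 3, ("w3", "s1"): 5, ("w3", "s2"): 4}
    route_risk = {("w1", "s1"): 9, ("w1", "s2"): 2, ("w2", "s1"): 4,
                  ("w2", "s2"): 3, ("w3", "s1"): 8, ("w3", "s2"): 1}
    risk_weight = 5.0   # scalarisation weight tracing the cost/risk trade-off

    prob = LpProblem("hazwaste_location_routing", LpMinimize)
    y = {s: LpVariable(f"open_{s}", cat=LpBinary) for s in sites}
    x = {(w, s): LpVariable(f"ship_{w}_{s}", cat=LpBinary)
         for w in sources for s in sites}

    prob += (lpSum(open_cost[s] * y[s] for s in sites)
             + lpSum((ship_cost[k] + risk_weight * route_risk[k]) * x[k]
                     for k in x))
    for w in sources:                        # every waste source assigned once
        prob += lpSum(x[(w, s)] for s in sites) == 1
    for (w, s) in x:                         # only ship to opened facilities
        prob += x[(w, s)] <= y[s]

    prob.solve()
    print({s: y[s].value() for s in sites})
    ```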

  17. Preliminary deformation model for National Seismic Hazard map of Indonesia

    International Nuclear Information System (INIS)

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed as a block rotation and strain accumulation function in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth or in combination with creeping in a shallower part. This research shows that rigid body motion dominates the deformation pattern with magnitudes of more than 15 mm/year, except in narrow areas near subduction zones and active faults, where significant deformation reaches 25 mm/year.

  18. Preliminary deformation model for National Seismic Hazard map of Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Meilano, Irwan; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z. [Geodesy Research Division, Faculty of Earth Science and Technology, Institute of Technology Bandung (Indonesia); Susilo,; Efendi, Joni [Agency for Geospatial Information (BIG) (Indonesia)

    2015-04-24

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed as a block rotation and strain accumulation function in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth or in combination with creeping in a shallower part. This research shows that rigid body motion dominates the deformation pattern with magnitudes of more than 15 mm/year, except in narrow areas near subduction zones and active faults, where significant deformation reaches 25 mm/year.
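
    The rigid-block part of such a deformation model is just v = ω × r about an Euler pole. A self-contained sketch follows; the pole position and rotation rate are illustrative assumptions, not values from the Indonesian block model.

    ```python
    import numpy as np

    def plate_velocity(lat, lon, pole_lat, pole_lon, omega_deg_myr):
        """Horizontal speed (mm/yr) of a point on a rigid block rotating
        about an Euler pole: v = omega x r (pole and rate are assumptions
        for illustration)."""
        R = 6.371e6  # Earth radius, m

        def unit(latd, lond):
            la, lo = np.radians([latd, lond])
            return np.array([np.cos(la) * np.cos(lo),
                             np.cos(la) * np.sin(lo),
                             np.sin(la)])

        r = R * unit(lat, lon)
        omega = np.radians(omega_deg_myr) * unit(pole_lat, pole_lon) / 1e6  # rad/yr
        v = np.cross(omega, r)               # m/yr in ECEF coordinates
        return np.linalg.norm(v) * 1000.0    # mm/yr

    print(plate_velocity(-6.2, 106.8, pole_lat=50.0, pole_lon=-100.0,
                         omega_deg_myr=0.6))
    ```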

  19. Optimization of maintenance policy using the proportional hazard model

    Energy Technology Data Exchange (ETDEWEB)

    Samrout, M. [Information Sciences and Technologies Institute, University of Technology of Troyes, 10000 Troyes (France)], E-mail: mohamad.el_samrout@utt.fr; Chatelet, E. [Information Sciences and Technologies Institute, University of Technology of Troyes, 10000 Troyes (France)], E-mail: chatelt@utt.fr; Kouta, R. [M3M Laboratory, University of Technology of Belfort Montbeliard (France); Chebbo, N. [Industrial Systems Laboratory, IUT, Lebanese University (Lebanon)

    2009-01-15

    The evolution of system reliability depends on its structure as well as on the evolution of its components' reliability. The latter is a function of component age during a system's operating life. Component aging is strongly affected by maintenance activities performed on the system. In this work, we consider two categories of maintenance activities: corrective maintenance (CM) and preventive maintenance (PM). Maintenance actions are characterized by their ability to reduce this age. PM consists of actions applied on components while they are operating, whereas CM actions occur when the component breaks down. In this paper, we expound a new method to integrate the effect of CM while planning for the PM policy. The proportional hazard function was used as a modeling tool for that purpose. Interesting results were obtained when comparing policies that take the CM effect into consideration with those that do not.
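
    As a rough sketch of how a proportional hazards formulation can fold maintenance effects into planning, the code below combines a Weibull baseline hazard, covariate scaling, and an age-reduction model of imperfect PM; the age-reduction scheme and all parameter values are assumptions for illustration, not the authors' method.

    ```python
    import numpy as np

    def weibull_hazard(t, shape=2.0, scale=1000.0):
        """Weibull baseline hazard (illustrative shape/scale)."""
        return (shape / scale) * (t / scale) ** (shape - 1)

    def phm_hazard(t, z, beta):
        """Proportional hazards model: covariates z (e.g. condition signals)
        scale the baseline hazard multiplicatively."""
        return weibull_hazard(t) * np.exp(np.dot(beta, z))

    def effective_age(t, pm_times, improvement=0.4):
        """Imperfect PM as an age-reduction factor (assumption): each
        intervention rewinds a fraction of the age accumulated since the
        previous one."""
        age, prev = 0.0, 0.0
        for tp in [p for p in pm_times if p <= t]:
            age += (tp - prev) * (1 - improvement)
            prev = tp
        return age + (t - prev)

    beta = np.array([0.8])   # hypothetical covariate effect
    z = np.array([0.5])      # hypothetical covariate value
    print(phm_hazard(effective_age(1500, pm_times=[500, 1000]), z, beta))
    ```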

  20. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorist's actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  1. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorist's actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  2. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate more uncertainty in the fault slip-rate parameter that controls earthquake-activity rates than was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. The models considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, with equal weighting for the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  3. Regional Integrated Meteorological Forecasting and Warning Model for Geological Hazards Based on Logistic Regression

    Institute of Scientific and Technical Information of China (English)

    XU Jing; YANG Chi; ZHANG Guoping

    2007-01-01

    An information model is adopted to integrate various geoscience factors to estimate the susceptibility to geological hazards. Further combined with dynamic rainfall observations, logistic regression is used to model the probabilities of geological hazard occurrences, from which hierarchical warnings for rainfall-induced geological hazards are produced. The forecasting and warning model takes numerical precipitation forecasts on grid points as its dynamic input, forecasts the probabilities of geological hazard occurrences on the same grid, and translates the results into likelihoods in the form of a 5-level hierarchy. Validation of the model with observational data for the year 2004 shows that 80% of the geological hazards of that year were identified as "likely enough to release warning messages". The model satisfies the requirements of an operational warning system and is thus an effective way to improve meteorological warnings for geological hazards.
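
    A compact sketch of the statistical core, logistic regression on a static susceptibility score plus dynamic rainfall, translated into a 5-level warning hierarchy, is shown below with scikit-learn; the data are simulated and the probability cut points are placeholders, not the operational thresholds.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(11)
    n = 2000
    susceptibility = rng.uniform(0, 1, n)     # static geo-factor score per cell
    rain_24h = rng.gamma(2.0, 20.0, n)        # forecast 24-h rainfall (mm)
    # Simulated occurrences from an assumed logistic relationship.
    logit = -6 + 4 * susceptibility + 0.05 * rain_24h
    occurred = rng.uniform(0, 1, n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([susceptibility, rain_24h])
    model = LogisticRegression().fit(X, occurred)

    p = model.predict_proba(X)[:, 1]
    # Translate probabilities into a 5-level warning hierarchy; cut points
    # here are placeholders, not the operational thresholds.
    levels = np.digitize(p, [0.02, 0.05, 0.1, 0.2]) + 1   # 1 (low) .. 5 (high)
    print(np.bincount(levels))
    ```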

  4. The Pedestrian Evacuation Analyst: geographic information systems software for modeling hazard evacuation potential

    Science.gov (United States)

    Jones, Jeanne M.; Ng, Peter; Wood, Nathan J.

    2014-01-01

    time map) throughout a hazard zone. Model results provide a general, static view of the evacuation landscape at different pedestrian travel speeds and can be used to identify areas outside the reach of naturally occurring high ground. In addition, data on the size and location of different populations within the hazard zone can be integrated with travel-time maps to create tables and graphs of at-risk population counts as a function of travel time to safety. As a decision-support tool, the Pedestrian Evacuation Analyst provides the capability to evaluate the effectiveness of various vertical-evacuation structures within a study area, both through time maps of the modeled travel-time landscape with a potential structure in place and through comparisons of population counts within reach of safety. The Pedestrian Evacuation Analyst is designed for use by researchers examining the pedestrian-evacuation potential of an at-risk community. In communities where modeled evacuation times exceed the event (for example, tsunami wave) arrival time, researchers can use the software with emergency managers to assess the area and population served by potential vertical-evacuation options. By automating and managing the modeling process, the software allows researchers to concentrate efforts on providing crucial and timely information on community vulnerability to sudden-onset hazards.
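
    The travel-time (least cost distance) map at the heart of the tool can be approximated with a multi-source Dijkstra over a grid graph. The sketch below uses networkx on synthetic terrain; cell size, walking speed, slowdown factors, and the safe-zone location are all illustrative assumptions.

    ```python
    import networkx as nx
    import numpy as np

    rng = np.random.default_rng(2)
    rows, cols, cell = 50, 50, 10.0        # 10 m cells (assumption)
    speed = 1.1                            # base walking speed, m/s
    slope_factor = 1 + rng.uniform(0, 1.5, (rows, cols))   # terrain slowdown

    G = nx.grid_2d_graph(rows, cols)
    for (a, b) in G.edges:
        # Edge traversal time: half a cell in each of the two cells' terrain.
        G.edges[a, b]["time"] = 0.5 * cell / speed * (slope_factor[a] + slope_factor[b])

    safe_cells = [(0, c) for c in range(cols)]   # e.g. high ground along one edge
    times = nx.multi_source_dijkstra_path_length(G, safe_cells, weight="time")

    travel_time = np.full((rows, cols), np.inf)
    for (r, c), t in times.items():
        travel_time[r, c] = t                    # seconds to reach safety
    ```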

  5. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of a leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the amount of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and upon other pathways from the building, such as doorways, both open and closed. This study shows how the multiple LPFs within the building interior can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain exposures as low as reasonably achievable (ALARA). This study also briefly addresses particle characteristics that affect atmospheric particle dispersion and compares this dispersion with the LPF methodology.
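
    The combinatory evaluation replacing the flat 0.5 × 0.5 assumption can be stated in a few lines: LPFs along a serial path multiply, while parallel pathways out of the same volume combine weighted by the flow fraction taking each path. The numbers below are hypothetical stand-ins for MELCOR-derived values.

    ```python
    # Minimal sketch, assuming serial paths multiply and parallel paths
    # combine by flow-fraction weighting (values are hypothetical).
    def serial(*lpfs):
        total = 1.0
        for f in lpfs:
            total *= f
        return total

    def parallel(paths):
        # paths: list of (flow_fraction, lpf) leaving the same volume
        return sum(frac * lpf for frac, lpf in paths)

    room_to_corridor = 0.3
    corridor_out = parallel([(0.7, 0.05),   # filtered ventilation route
                             (0.3, 0.60)])  # open doorway route
    total_lpf = serial(room_to_corridor, corridor_out)
    print(total_lpf)   # compare against the assumed 0.5 * 0.5 = 0.25
    ```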

  6. On Model Specification and Selection of the Cox Proportional Hazards Model*

    OpenAIRE

    Lin, Chen-Yen; Halabi, Susan

    2013-01-01

    Prognosis plays a pivotal role in patient management and trial design. A useful prognostic model should correctly identify important risk factors and estimate their effects. In this article, we discuss several challenges in selecting prognostic factors and estimating their effects using the Cox proportional hazards model. Although a flexible semiparametric form, the Cox’s model is not entirely exempt from model misspecification. To minimize possible misspecification, instead of imposing tradi...

  7. Physically Based Landslide Hazard Model: Method and Issues

    Science.gov (United States)

    Dhakal, A. S.; Sidle, R. C.

    An Integrated Dynamic Slope Stability Model (IDSSM) that integrates GIS with topographic, distributed hydrologic, and vegetation models to assess slope stability at the basin scale is described, addressing the issues involved in predicting landslide hazards with physically based models. Data limitations, one of the major problems, range from a lack of spatially distributed data on soil depth, soil physical and engineering properties, and vegetation root strength to the need for better digital elevation models to characterize topography. Often, point data and their averages, such as for soil depth and soil cohesion, must be used as the representative values at the element scale. These factors introduce a great degree of uncertainty into the simulation results. Since the factors related to landsliding differ in their importance in causing landslides, the introduced uncertainties may not be identical across variables. The sensitivities of the different parameters associated with landsliding were examined using the IDSSM. Since many variables are important for landslide occurrence, the effects of most of the soil and vegetation parameters were evaluated. To test for parameter uncertainty, one variable was altered while the others were held constant, and cumulative areas (percentage of the drainage area) with a safety factor less than certain values were compared. The sensitivity analysis suggests that the safety factor is most sensitive to changes in soil cohesion, soil depth, and internal friction angle. Changes in hydraulic conductivity greatly influence the ground water table and thus slope stability. Parameters such as soil unit weight and tree surcharge are less influential for landsliding. Considering the possibly fine spatial variation of soil depth and hydraulic conductivity in a forest soil, these two factors seem to produce large uncertainties. In forest soil, the presence of macropores and preferential flow presents
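
    The stability kernel of such a model is typically the infinite-slope factor of safety with pore-pressure reduction, and the abstract's one-at-a-time sensitivity test is easy to reproduce around it. A sketch with illustrative parameter values (not the IDSSM's implementation) follows.

    ```python
    import numpy as np

    def infinite_slope_fs(cohesion, soil_depth, water_depth, slope_deg,
                          phi_deg, gamma=18.0, gamma_w=9.81):
        """Infinite-slope factor of safety with pore pressure, the kind of
        stability kernel used in distributed models (units: kPa, m, degrees,
        kN/m^3; all values illustrative)."""
        s = np.radians(slope_deg)
        phi = np.radians(phi_deg)
        resisting = (cohesion
                     + (gamma * soil_depth - gamma_w * water_depth)
                     * np.cos(s) ** 2 * np.tan(phi))
        driving = gamma * soil_depth * np.sin(s) * np.cos(s)
        return resisting / driving

    # One-at-a-time sensitivity, mirroring the abstract's analysis.
    base = dict(cohesion=5.0, soil_depth=1.5, water_depth=0.5,
                slope_deg=35.0, phi_deg=33.0)
    for k in ("cohesion", "soil_depth", "phi_deg"):
        hi = dict(base, **{k: base[k] * 1.2})
        print(k, infinite_slope_fs(**base), "->", infinite_slope_fs(**hi))
    ```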

  8. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    Science.gov (United States)

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i. e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and

  9. Modelling Inland Flood Events for Hazard Maps in Taiwan

    Science.gov (United States)

    Ghosh, S.; Nzerem, K.; Sassi, M.; Hilberts, A.; Assteerawatt, A.; Tillmanns, S.; Mathur, P.; Mitas, C.; Rafique, F.

    2015-12-01

    Taiwan experiences significant inland flooding, driven by torrential rainfall from plum rain storms and typhoons during summer and fall. Based on the last 13 to 16 years of data, about 3,000 buildings were damaged by such floods annually, with losses of US$0.41 billion (Water Resources Agency). This long, narrow island nation with mostly hilly/mountainous topography is located in a tropical-subtropical zone, with an annual average typhoon-hit frequency of 3-4 (Central Weather Bureau) and annual average precipitation of 2502 mm (WRA), 2.5 times the world average. Spatial and temporal distributions of countrywide precipitation are uneven, with very high local extreme rainfall intensities. Annual average precipitation is 3000-5000 mm in the mountainous regions, 78% of it falls in May-October, and the 1-hour to 3-day maximum rainfalls are about 85 to 93% of the world records (WRA). Rivers in Taiwan are short, with small upstream areas and high watershed runoff coefficients. These rivers have the steepest slopes, the shortest response times with rapid flows, and the largest peak flows as well as specific flood peak discharges (WRA) in the world. RMS has recently developed a countrywide inland flood model for Taiwan, producing hazard return period maps at 1 arcsec grid resolution. These can be the basis for evaluating and managing flood risk, its economic impacts, and insured flood losses. The model is initiated with sub-daily historical meteorological forcings and calibrated to daily discharge observations at about 50 river gauges over the period 2003-2013. Simulations of hydrologic processes, via rainfall-runoff and routing models, are subsequently performed based on a 10,000-year set of stochastic forcing. The rainfall-runoff model is a physically based, continuous, semi-distributed model for catchment hydrology. The 1-D wave propagation hydraulic model considers catchment runoff in routing and describes large-scale transport processes along the river. It also accounts for reservoir storage

  10. Statistical modeling of ground motion relations for seismic hazard analysis

    OpenAIRE

    Raschke, Mathias

    2012-01-01

    We introduce a new approach for ground motion relations (GMR) in the probabilistic seismic hazard analysis (PSHA), influenced by the extreme value theory of mathematical statistics. Therein, we understand a GMR as a random function. We mathematically derive the principle of area equivalence, wherein two alternative GMRs have an equivalent influence on the hazard if these GMRs have equivalent area functions. This includes local biases. An interpretation of the difference between these GM...

  11. Conceptual geoinformation model of natural hazards risk assessment

    Science.gov (United States)

    Kulygin, Valerii

    2016-04-01

    Natural hazards are the major threat to safe interactions between nature and society. The assessment of the natural hazards impacts and their consequences is important in spatial planning and resource management. Today there is a challenge to advance our understanding of how socio-economical and climate changes will affect the frequency and magnitude of hydro-meteorological hazards and associated risks. However, the impacts from different types of natural hazards on various marine and coastal economic activities are not of the same type. In this study, the conceptual geomodel of risk assessment is presented to highlight the differentiation by the type of economic activities in extreme events risk assessment. The marine and coastal ecosystems are considered as the objects of management, on the one hand, and as the place of natural hazards' origin, on the other hand. One of the key elements in describing of such systems is the spatial characterization of their components. Assessment of ecosystem state is based on ecosystem indicators (indexes). They are used to identify the changes in time. The scenario approach is utilized to account for the spatio-temporal dynamics and uncertainty factors. Two types of scenarios are considered: scenarios of using ecosystem services by economic activities and scenarios of extreme events and related hazards. The reported study was funded by RFBR, according to the research project No. 16-35-60043 mol_a_dk.

  12. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    International Nuclear Information System (INIS)

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures

  13. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    Science.gov (United States)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increased number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is a risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira (Greece), and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with laboratory and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes

  14. Modelling the costs of natural hazards in games

    Science.gov (United States)

    Bostenaru-Dan, M.

    2012-04-01

    Simulation games of the city are sought today, including a development at the University of Torino called SimTorino, which simulates the development of the city over the next 20 years. The connection to another games genre beyond video games, namely board games, will be investigated, since there are games on the construction and reconstruction of a cathedral, its tower, and a bridge in an urban environment of the Middle Ages, based on the two novels of Ken Follett, "Pillars of the Earth" and "World Without End", and also more recent games, such as "Urban Sprawl" or the Romanian game "Habitat", dealing with the man-made hazard of demolition. A review of these games will be provided based on first-hand playing experience. In games like "World Without End" or "Pillars of the Earth", just like in the recently popular games of Zynga on social networks, construction management is done by "building" an item out of stylised materials, such as "stone", "sand" or more specific ones such as "nail". Such an approach could also be used for retrofitting buildings against earthquakes, in the sense of "upgrade" rather than just extension as is currently the case in games, and this is what our research is about. "World Without End" includes a natural disaster not much analysed today but judged by the author as the worst in human history: the Black Death. The Black Death has effects and costs as well, modelled not only through action cards but also on the built environment, through buildings remaining empty. On the other hand, games such as "Habitat" rely on role playing, which has recently been recognised as a way to bring game theory to decision making through the so-called contribution of drama, a way to solve conflicts through balancing instead of weighting, and thus related to the Analytic Hierarchy Process. The presentation also aims to give hints on how to design a game for the problem of earthquake retrofit, translating the aims of the actors in such a process into role playing. Games are also employed in teaching of urban

  15. A non-additive negotiation model for utility computing markets

    OpenAIRE

    Macías Lloret, Mario; Guitart Fernández, Jordi

    2009-01-01

    Market-based resource allocation is a promising model for dealing with the growing Utility Computing environments, such as Grid or Cloud Computing. Agents that represent both service clients and providers meet in a market to negotiate the terms of the sale of resources. Additive negotiation models are widespread because they are simple, but they are not valid for negotiations whose terms are not mutually independent. This paper proposes a simple non-additive model for performing negotiat...

  16. Expert elicitation for a national-level volcano hazard model

    Science.gov (United States)

    Bebbington, Mark; Stirling, Mark; Cronin, Shane; Wang, Ting; Jolly, Gill

    2016-04-01

    The quantification of volcanic hazard at the national level is a vital prerequisite to placing volcanic risk on a platform that permits meaningful comparison with other hazards such as earthquakes. New Zealand has up to a dozen dangerous volcanoes, with the usual mixed degrees of knowledge concerning their temporal and spatial eruptive history. Information on the 'size' of the eruptions, be it in terms of VEI, volume or duration, is sketchy at best. These limitations and the need for a uniform approach lend themselves to a subjective hazard analysis via expert elicitation. Approximately 20 New Zealand volcanologists provided estimates for the size of the next eruption from each volcano and, conditional on this, its location, timing and duration. Opinions were likewise elicited from a control group of statisticians, seismologists and (geo)chemists, all of whom had at least heard the term 'volcano'. The opinions were combined via Cooke's classical method. We will report on the preliminary results of the exercise.

  17. DEVELOPMENT OF A CRASH RISK PROBABILITY MODEL FOR FREEWAYS BASED ON HAZARD PREDICTION INDEX

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2014-12-01

    Full Text Available This study presents a method for the identification of hazardous situations on freeways. The hazard identification is done using a crash risk probability model. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne (Australia) is selected as a test bed. Two categories of data, i.e. traffic data and accident records, are used for the analysis and modelling. In developing the crash risk probability model, a Hazard Prediction Index is formulated from the differences between traffic parameters and their threshold values. Seven different prediction indices are examined and the best one is selected as the crash risk probability model based on prediction error minimisation.
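
    A minimal sketch of the index idea under stated assumptions: invented flow/speed observations, a hypothetical threshold pair, and one candidate index scored by prediction error (the paper's seven actual index formulations are not reproduced here).

    ```python
    import numpy as np

    # Hypothetical 5-minute observations: [flow (veh/h/lane), speed (km/h)].
    obs = np.array([[1900, 48], [1100, 95], [2200, 40], [1300, 88]])
    crash = np.array([1, 0, 1, 0])         # observed hazard outcomes
    thresholds = np.array([1600, 70])      # assumed threshold values

    # One candidate index: normalised deviations from thresholds, positive
    # when flow is above and speed below their thresholds.
    dev = (obs - thresholds) / thresholds
    index = dev[:, 0] - dev[:, 1]
    pred = (index > 0).astype(int)

    print("prediction error:", np.mean(pred != crash))
    ```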

  18. Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    In natural hazard risk assessment, a resolution mismatch between hazard data and aggregated exposure data is often observed. A possible solution to this issue is the disaggregation of exposure data to match the spatial resolution of hazard data. Disaggregation models available in the literature...... disaggregation model that considers the uncertainty in the disaggregation, based on the scaled Dirichlet distribution. The proposed probabilistic disaggregation model is applied to a portfolio of residential buildings in the Canton of Bern, Switzerland, subject to flood risk. Thereby, the model is verified

  19. Spatial Distributed Seismicity Model of Seismic Hazard Mapping in the North-China Region: A Comparison with the GSHAP

    Science.gov (United States)

    Zhong, Q.; Shi, B.; Meng, L.

    2010-12-01

    North China is one of the most seismically active regions in mainland China. Moderate to large earthquakes have occurred there throughout history, resulting in huge losses of human life and property. With the probabilistic seismic hazard analysis (PSHA) approach, we investigate the influence of different seismic environments, incorporating both near-surface soil properties and distributed historical and modern seismicity. A simplified seismic source model, derived with consideration of regional active fault distributions, is presented for the North China region. The spatial distributed seismicity model of PSHA is used to calculate the level of ground motion likely to be exceeded in a given time period. Following Frankel's (1995) circular Gaussian smoothing procedure, in the PSHA calculation we propose fault-rupture-oriented elliptical Gaussian smoothing, with the assumption that earthquakes occur on faults or fault zones of past earthquakes, to delineate the potential seismic zones (Lapajine et al., 2003). This is combined with regional active fault strike directions and the seismicity distribution patterns. A Next Generation Attenuation model (NGA; Boore et al., 2007) is used in generating hazard maps for PGA with 2%, 5%, and 10% probability of being exceeded in 50 years, and the resulting hazard map is compared with the result given by the Global Seismic Hazard Assessment Project (GSHAP). There is general agreement in PGA distribution patterns between the results of this study and the GSHAP map that used the same seismic source zones. However, peak ground accelerations predicted in this study are typically 10-20% less than those of the GSHAP, and the seismic source models, such as fault distributions and regional seismicity, used in the GSHAP seem to be oversimplified. We believe this study represents an improvement on prior seismic hazard evaluations for the region. In addition to the updated input data, we believe that, by

  20. Delayed geochemical hazard: Concept, digital model and case study

    Institute of Scientific and Technical Information of China (English)

    CHEN Ming; FENG Liu; Jacques Yvon

    2005-01-01

    Delayed Geochemical Hazard (DGH) denotes the whole process of a serious ecological and environmental hazard caused by the sudden reactivation and sharp release of a long-term accumulated pollutant, from stable species to active ones, in a soil or sediment system, due to a change in physical-chemical conditions (such as temperature, pH, Eh, moisture, or the concentration of organic matter) or a decrease in environmental capacity. The characteristics of DGH are discussed. The process of a typical DGH can be expressed as a nonlinear polynomial. The points where the derivative functions of the first and second orders of the polynomial reach zero, minimum and maximum are keys for risk assessment and hazard prediction. The process and mechanism of the hazard are principally due to the transformation of pollutants among different species. The concepts of "total releasable content of pollutant" (TRCP) and "total concentration of active species" (TCAS) are defined to describe the mechanism of DGH. The possibility of temporal and spatial propagation is discussed. A case study shows that there exists a transformation mechanism of "gradual release" and "chain reaction" among the species of the exchangeable fraction and the fractions bound to carbonates, iron and manganese oxides, and organic matter, thus causing the delayed geochemical hazard.
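
    As a hedged illustration of the derivative analysis described above, the key points can be located numerically from a fitted release polynomial; the coefficients below are invented for demonstration, not taken from the case study.

    ```python
    import numpy as np

    # Hypothetical cubic release curve c(t) standing in for the fitted
    # nonlinear polynomial; coefficients are invented for demonstration.
    c = np.polynomial.Polynomial([0.1, -0.6, 0.9, 0.35])

    dc = c.deriv(1)    # first derivative: rate of pollutant release
    d2c = c.deriv(2)   # second derivative: acceleration of release

    # Zeros of the derivatives mark the key points for risk assessment
    # and hazard prediction described in the abstract.
    print("stationary points (c' = 0):", dc.roots())
    print("inflection points (c'' = 0):", d2c.roots())
    ```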

  1. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    International Nuclear Information System (INIS)

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced definitive guidance on which models must be used in these efforts. To identify which models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by the EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  2. Comparison of empirical and data driven hydrometeorological hazard models on coastal cities of São Paulo, Brazil

    Science.gov (United States)

    Koga-Vicente, A.; Friedel, M. J.

    2010-12-01

    Every year thousands of people are affected by flood and landslide hazards caused by rainstorms. The problem is more serious in tropical developing countries because of the susceptibility arising from the high amount of energy available to form storms, and the high vulnerability due to poor economic and social conditions. Predictive models of hazards are important tools for managing this kind of risk. In this study, a comparison of two different modeling approaches was made for predicting hydrometeorological hazards in 12 cities on the coast of São Paulo, Brazil, from 1994 to 2003. In the first approach, an empirical multiple linear regression (MLR) model was developed and used; the second approach used a type of unsupervised nonlinear artificial neural network called a self-organized map (SOM). By using twenty-three independent variables of susceptibility (precipitation, soil type, slope, elevation, and regional atmospheric system scale) and vulnerability (distribution and total population, income and educational characteristics, poverty intensity, human development index), binary hazard responses were obtained. Model performance by cross-validation indicated that the respective MLR and SOM model accuracies were about 67% and 80%. Prediction accuracy can be improved by the addition of information, but the SOM approach is preferred because of sparse data and highly nonlinear relations among the independent variables.
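
    A minimal sketch of the cross-validated MLR baseline on a synthetic binary hazard response (the SOM side is omitted; variables, sizes and coefficients are invented):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 5))   # synthetic susceptibility/vulnerability variables
    y = (X @ np.array([1.0, -0.5, 0.8, 0.0, 0.3])
         + rng.normal(size=120) > 0).astype(int)

    # MLR on the binary hazard response, thresholded at 0.5; accuracy is
    # estimated by cross-validation, mirroring the paper's procedure.
    pred = cross_val_predict(LinearRegression(), X, y, cv=10)
    print("MLR accuracy:", np.mean((pred > 0.5).astype(int) == y))
    ```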

  3. A Quasi-Poisson Approach on Modeling Accident Hazard Index for Urban Road Segments

    Directory of Open Access Journals (Sweden)

    Lu Ma

    2014-01-01

    Full Text Available In light of recently emphasized studies on the risk evaluation of crashes, accident counts for specific transportation facilities are adopted to reflect the chance of crash occurrence. The current study introduces a more comprehensive measure, which supplements accident counts with information on accident harmfulness, named the Accident Hazard Index (AHI) in the following context. Before the statistical analysis, datasets from various sources are integrated under a GIS platform, and the corresponding procedures are presented as an illustrative example for similar analyses. Then, a quasi-Poisson regression model is suggested for the analysis, and the results show that the model is appropriate for dealing with overdispersed count data; several key explanatory variables were found to have a significant impact on the estimation of the AHI. In addition, the effect of weights on different severity levels of accidents is examined, and the selection of the weight is discussed.
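
    A hedged sketch of quasi-Poisson estimation as commonly done with statsmodels: the Poisson mean structure is kept and the dispersion is estimated from the Pearson chi-square. The AHI counts and covariates below are invented placeholders.

    ```python
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical segment-level data: AHI counts plus illustrative covariates.
    df = pd.DataFrame({
        "ahi":    [3, 0, 7, 2, 11, 5, 1, 9],
        "aadt":   [12.1, 8.4, 19.6, 10.2, 25.3, 15.8, 7.9, 22.4],  # 1000 veh/day
        "length": [0.8, 0.5, 1.2, 0.6, 1.5, 1.0, 0.4, 1.3],        # km
    })
    X = sm.add_constant(df[["aadt", "length"]])

    # Quasi-Poisson: Poisson family with dispersion estimated from the
    # Pearson chi-square ("X2") instead of being fixed at 1.
    fit = sm.GLM(df["ahi"], X, family=sm.families.Poisson()).fit(scale="X2")
    print(fit.params)
    print("estimated dispersion:", fit.scale)
    ```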

  4. An Additive-Multiplicative Restricted Mean Residual Life Model

    DEFF Research Database (Denmark)

    Mansourvar, Zahra; Martinussen, Torben; Scheike, Thomas H.

    2016-01-01

    The mean residual life measures the expected remaining life of a subject who has survived up to a particular time. When the survival time distribution is highly skewed or heavy-tailed, the restricted mean residual life must be considered. In this paper, we propose an additive-multiplicative restricted...... mean residual life model to study the association between the restricted mean residual life function and potential regression covariates in the presence of right censoring. This model extends the proportional mean residual life model using an additive model as its covariate dependent baseline
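
    As background (standard definitions, not quoted from the paper): for a restriction time τ, the restricted mean residual life and the proportional MRL model being extended can be written as

    ```latex
    \[
    m_\tau(t \mid \mathbf{Z}) = E\left[ \min(T, \tau) - t \mid T > t, \mathbf{Z} \right],
    \qquad
    m_\tau(t \mid \mathbf{Z}) = m_{\tau,0}(t) \exp\left( \boldsymbol{\beta}^{\mathsf{T}} \mathbf{Z} \right).
    \]
    ```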

  5. Complex Modelling Scheme Of An Additive Manufacturing Centre

    Science.gov (United States)

    Popescu, Liliana Georgeta

    2015-09-01

    This paper presents a modelling scheme supporting the development of an additive manufacturing research centre model and its processes. The modelling is performed using IDEF0; the resulting process model represents the basic processes required in developing such a centre in any university. While the activities presented in this study are those recommended in general, changes may occur in specific existing situations in a research centre.

  6. Marginal integration $M-$estimators for additive models

    OpenAIRE

    Boente, Graciela; Martinez, Alejandra

    2015-01-01

    Additive regression models have a long history in multivariate nonparametric regression. They provide a model in which each regression function depends only on a single explanatory variable, allowing estimators to be obtained at the optimal univariate rate. Beyond backfitting, marginal integration is a common procedure for estimating each component. In this paper, we propose a robust estimator of the additive components which combines local polynomials on the component to be estimated and marginal int...
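
    For reference, the additive regression model in question has the standard form, with smooth component functions m_j subject to the usual identifiability constraints:

    ```latex
    \[
    Y = \mu + \sum_{j=1}^{d} m_j(X_j) + \varepsilon,
    \qquad E\left[ m_j(X_j) \right] = 0, \quad j = 1, \dots, d.
    \]
    ```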

  7. Debris flow hazard modelling on medium scale: Valtellina di Tirano, Italy

    Directory of Open Access Journals (Sweden)

    J. Blahut

    2010-11-01

    Full Text Available Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for the runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). An estimation of the debris flow magnitude was neglected, as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model with a 10 m resolution was used, together with landuse, geology and debris flow hazard initiation maps, as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates the information about debris flow spreading direction probabilities, showing areas more likely to be affected by future debris flows. Limitations of the modelling arise

  8. Linear non-threshold radiation hazards (LNT) model and the evaluation of the current model

    International Nuclear Information System (INIS)

    The aim is to introduce the linear non-threshold (LNT) model used in studying the dose effect of radiation hazards and to evaluate current applications of the model, through a comprehensive analysis of the literature presenting an objective point of view. Results: the LNT model describes the biological effects induced by high doses more accurately than those induced by low doses; the repairable-conditionally repairable model of cell radiation effects can fit the cell survival curve well under high, medium and low radiation doses; the assessment model of effective dose of internal radiation based on the LNT assumptions and individual mean organ equivalent dose still involves many uncertainties, and taking gender differences into account makes it necessary to establish gender-specific voxel human models. Conclusion: the advantages and disadvantages of the various models coexist. Until the birth of a new theory and new models, following the current theories and assessing radiation hazards with the LNT model is still the most scientific attitude and a wise choice. (authors)
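
    The LNT assumption itself is compactly stated as a linear dose-response with no threshold (a textbook formulation rather than one from this paper), with excess effect E proportional to dose D through a risk coefficient α:

    ```latex
    \[
    E(D) = \alpha D, \qquad E(0) = 0 .
    \]
    ```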

  9. A double moral hazard model of organization design

    OpenAIRE

    Berkovitch, Elazar; Israel, Ronen; Spiegel, Yossi

    2007-01-01

    We develop a theory of organization design in which the firm's structure is chosen to mitigate moral hazard problems in the selection and the implementation of projects. For a given set of projects, the 'divisional structure' which gives each agent the full responsibility over a subset of projects is in general more efficient than the functional structure under which projects are implemented by teams of agents, each of whom specializes in one task. However, the ex post efficiency of the divis...

  10. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.

  11. Investigation of the Effect of Traffic Parameters on Road Hazard Using Classification Tree Model

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2012-09-01

    Full Text Available This paper presents a method for the identification of hazardous situations on freeways. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne, Australia was selected as a test bed. Three categories of data, i.e. traffic, weather and accident records, were used for the analysis and modelling. In developing the crash risk probability model, a classification tree based model was formulated. It was found that weather conditions did not have a significant impact on accident occurrence, so the classification tree was built using two traffic indices only: traffic flow and vehicle speed. The formulated classification tree is able to identify possible hazard and non-hazard situations on the freeway. The outcome of the study will aid hazard mitigation strategies.
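
    A minimal sketch of such a two-feature classification tree on invented flow/speed observations; scikit-learn grows CART-style trees, whereas the paper does not name its exact tree algorithm:

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Invented observations: [traffic flow (veh/h/lane), speed (km/h)] -> hazard flag.
    X = np.array([[1800, 62], [900, 98], [2100, 45], [1200, 90],
                  [2300, 38], [700, 102], [1950, 55], [1100, 95]])
    y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

    clf = DecisionTreeClassifier(max_depth=3, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, y, cv=4).mean())
    ```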

  12. A Mixture Proportional Hazards Model with Random Effects for Response Times in Tests

    Science.gov (United States)

    Ranger, Jochen; Kuhn, Jörg-Tobias

    2016-01-01

    In this article, a new model for test response times is proposed that combines latent class analysis and the proportional hazards model with random effects, in a similar vein to the mixture factor model. The model assumes the existence of different latent classes. In each latent class, the response times are distributed according to a…

  13. Additive Intensity Regression Models in Corporate Default Analysis

    DEFF Research Database (Denmark)

    Lando, David; Medhat, Mamdouh; Nielsen, Mads Stenbo;

    2013-01-01

    We consider additive intensity (Aalen) models as an alternative to the multiplicative intensity (Cox) models for analyzing the default risk of a sample of rated, nonfinancial U.S. firms. The setting allows for estimating and testing the significance of time-varying effects. We use a variety...... of model checking techniques to identify misspecifications. In our final model, we find evidence of time-variation in the effects of distance-to-default and short-to-long term debt. Also we identify interactions between distance-to-default and other covariates, and the quick ratio covariate is significant...
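
    The Aalen additive intensity model underlying the analysis has the standard form, with time-varying regression functions β_j(t) acting additively on the intensity (standard notation, not quoted from the paper):

    ```latex
    \[
    \lambda_i(t) = \beta_0(t) + \sum_{j=1}^{p} \beta_j(t) X_{ij}(t) .
    \]
    ```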

  14. Using Set Model for Learning Addition of Integers

    Directory of Open Access Journals (Sweden)

    Umi Puji Lestari

    2015-07-01

    Full Text Available This study aims to investigate how a set model can help students' understanding of the addition of integers in fourth grade. The study was carried out with 23 students and a teacher of class IVC, SD Iba Palembang, in January 2015. This study is design research that also promotes PMRI as the underlying design context and activity. Results showed that the use of the set model, packaged in activities of recording financial transactions with two-colour chips and a card game, can help students to understand the concept of the zero pair, addition with same-coloured chips, and the cancellation strategy.
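
    A small code sketch of the zero-pair and cancellation ideas; the chip representation and function name are illustrative only:

    ```python
    # Positive chips count +1, negative chips count -1; a (+1, -1) pair is a
    # "zero pair" and cancels out.
    def add_with_chips(a: int, b: int) -> int:
        pos = max(a, 0) + max(b, 0)      # positive chips laid on the table
        neg = max(-a, 0) + max(-b, 0)    # negative chips laid on the table
        pairs = min(pos, neg)            # cancellation strategy: remove zero pairs
        return (pos - pairs) - (neg - pairs)

    assert add_with_chips(5, -3) == 2
    assert add_with_chips(-4, -2) == -6
    ```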

  15. Process chain modeling and selection in an additive manufacturing context

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Stolfi, Alessandro; Mischkot, Michael

    2016-01-01

    This paper introduces a new two-dimensional approach to modeling manufacturing process chains. This approach is used to consider the role of additive manufacturing technologies in process chains for a part with micro scale features and no internal geometry. It is shown that additive manufacturing...... can compete with traditional process chains for small production runs. Combining both types of technology added cost but no benefit in this case. The new process chain model can be used to explain the results and support process selection, but process chain prototyping is still important for rapidly...

  16. Electroacoustics modeling of piezoelectric welders for ultrasonic additive manufacturing processes

    Science.gov (United States)

    Hehr, Adam; Dapino, Marcelo J.

    2016-04-01

    Ultrasonic additive manufacturing (UAM) is a recent 3D metal printing technology which utilizes ultrasonic vibrations from high-power piezoelectric transducers to additively weld similar and dissimilar metal foils. CNC machining is used intermittently with welding to create internal channels, to embed temperature-sensitive components, sensors, and materials, and to net shape parts. Structural dynamics of the welder and work piece influence the performance of the welder and part quality. To understand the impact of structural dynamics on UAM, a linear time-invariant model is used to relate the system inputs of shear force and electric current to the system outputs of welder velocity and voltage. Frequency response measurements are combined with in-situ operating measurements of the welder to identify model parameters and to verify model assumptions. The proposed LTI model can enhance process consistency and performance, and guide the development of improved quality monitoring and control strategies.
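
    A toy sketch of a two-input (shear force, current), two-output (velocity, voltage) LTI representation; the state-space matrices below are invented for illustration and are not identified from a real welder:

    ```python
    import numpy as np
    from scipy import signal

    A = np.array([[0.0, 1.0], [-4.0e8, -2.0e3]])   # one lightly damped mode
    B = np.array([[0.0, 0.0], [1.0, 5.0e2]])       # columns: force, current
    C = np.array([[0.0, 1.0], [0.0, 2.0e-3]])      # rows: velocity, voltage
    D = np.zeros((2, 2))

    # Frequency response of the force -> velocity channel, the kind of FRF
    # that would be compared against measured welder responses.
    siso = signal.StateSpace(A, B[:, :1], C[:1, :], D[:1, :1])
    w, mag, phase = signal.bode(siso)
    print("peak magnitude (dB):", mag.max())
    ```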

  17. A-optimal designs for an additive quadratic mixture model

    OpenAIRE

    Chan, LY; Guan, YN; Zhang, CQ

    1998-01-01

    Quadratic models are widely used in the analysis of experiments involving mixtures. This paper gives A-optimal designs for an additive quadratic mixture model for q ≥ 3 mixture components. It is proved that in these A-optimal designs, the vertices of the simplex S^{q-1} are support points, and other support points shift gradually from barycentres of depth 1 to barycentres of depth 3 as q increases. A-optimal designs with minimal support are also discussed.
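
    For context, A-optimality selects the design ξ that minimizes the trace of the inverse information matrix (a textbook criterion, in standard notation):

    ```latex
    \[
    \xi^{*} = \arg\min_{\xi} \operatorname{tr}\left\{ M(\xi)^{-1} \right\},
    \qquad
    M(\xi) = \int_{\mathcal{X}} f(x) f(x)^{\mathsf{T}} \, \xi(\mathrm{d}x) .
    \]
    ```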

  18. Potential of weight of evidence modelling for gully erosion hazard assessment in Mbire District - Zimbabwe

    Science.gov (United States)

    Dube, F.; Nhapi, I.; Murwira, A.; Gumindoga, W.; Goldin, J.; Mashauri, D. A.

    Gully erosion is an environmental concern, particularly in areas where landcover has been modified by human activities. This study assessed the extent to which the potential for gully erosion could be successfully modelled as a function of seven environmental factors (landcover, soil type, distance from river, distance from road, Sediment Transport Index (STI), Stream Power Index (SPI) and Wetness Index (WI)) using GIS-based Weight of Evidence Modelling (WEM) in the Mbire District of Zimbabwe. Results show that, of the seven factors studied, five were significantly correlated with gully erosion (p < 0.05). A gully erosion hazard map showed that 78% of the very high hazard class area is within a distance of 250 m from rivers. Model validation indicated that 70% of the validation set of gullies fell in the high hazard and very high hazard classes. The resulting map of areas susceptible to gully erosion has a prediction accuracy of 67.8%. The predictive capability of the weight of evidence model in this study suggests that landcover, soil type, distance from river, STI and SPI are useful in creating a gully erosion hazard map but may not be sufficient to produce a valid map of gully erosion hazard.
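
    The weights of evidence themselves are computed from conditional probabilities of factor presence B given gully presence D (the standard Bayesian WoE formulation; notation assumed here):

    ```latex
    \[
    W^{+} = \ln\frac{P(B \mid D)}{P(B \mid \bar{D})},
    \qquad
    W^{-} = \ln\frac{P(\bar{B} \mid D)}{P(\bar{B} \mid \bar{D})},
    \qquad
    C = W^{+} - W^{-} .
    \]
    ```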

  19. Traffic Incident Clearance Time and Arrival Time Prediction Based on Hazard Models

    Directory of Open Access Journals (Sweden)

    Yang beibei Ji

    2014-01-01

    Full Text Available Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data are obtained from the Queensland Department of Transport and Main Roads' STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best fitting distributions are identified for both clearance and arrival time for 3 types of incident: crash, stationary vehicle, and hazard. The results show that the Gamma, Log-logistic, and Weibull distributions are the best fits for crash, stationary vehicle, and hazard incidents, respectively. The significant influencing factors are given for crash clearance time and arrival time, and the quantitative influences for crash and hazard incidents are presented for both clearance and arrival. The model accuracy is analyzed at the end.
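
    A hedged sketch of the distribution-comparison step on synthetic clearance times, ranking candidates by maximized log-likelihood; scipy's fisk is the log-logistic distribution:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    clearance = rng.gamma(shape=2.0, scale=20.0, size=200)   # synthetic minutes

    # Rank candidate duration distributions by maximised log-likelihood.
    for dist in (stats.gamma, stats.fisk, stats.weibull_min):
        params = dist.fit(clearance, floc=0)
        loglik = np.sum(dist.logpdf(clearance, *params))
        print(f"{dist.name:12s} log-likelihood = {loglik:.1f}")
    ```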

  20. Lava Flow Hazard Modeling during the 2014-2015 Fogo eruption, Cape Verde

    Science.gov (United States)

    Del Negro, C.; Cappello, A.; Ganci, G.; Calvari, S.; Perez, N. M.; Hernandez Perez, P. A.; Victoria, S. S.; Cabral, J.

    2015-12-01

    Satellite remote sensing techniques and lava flow forecasting models have been combined to allow an ensemble response during effusive crises at poorly monitored volcanoes. Here, we use the HOTSAT volcano hot spot detection system that works with satellite thermal infrared data and the MAGFLOW lava flow emplacement model that considers the way in which effusion rate changes during an eruption, to forecast lava flow hazards during the 2014-2015 Fogo eruption. In many ways this was one of the major effusive eruption crises of recent years, since the lava flows actually invaded populated areas. HOTSAT is used to promptly analyze MODIS and SEVIRI data to output hot spot location, lava thermal flux, and effusion rate estimation. We use this output to drive the MAGFLOW simulations of lava flow paths and to update continuously flow simulations. Satellite-derived TADR estimates can be obtained in real time and lava flow simulations of several days of eruption can be calculated in a few minutes, thus making such a combined approach of paramount importance to provide timely forecasts of the areas that a lava flow could possibly inundate. In addition, such forecasting scenarios can be continuously updated in response to changes in the eruptive activity as detected by satellite imagery. We also show how Landsat-8 OLI and EO-1 ALI images complement the field observations for tracking the flow front position through time, and add considerable data on lava flow advancement to validate the results of numerical simulations. Our results thus demonstrate how the combination of satellite remote sensing and lava flow modeling can be effectively used during eruptive crises to produce realistic lava flow hazard scenarios and for assisting local authorities in making decisions during a volcanic eruption.

  1. Implications of different digital elevation models and preprocessing techniques to delineate debris flow inundation hazard zones in El Salvador

    Science.gov (United States)

    Anderson, E. R.; Griffin, R.; Irwin, D.

    2013-12-01

    . Optimized pit filling techniques use both cut and fill operations to minimize modifications of the original DEM. Satellite image interpretation and field surveying provide the baseline upon which to test the accuracy of each model simulation. By outlining areas that could potentially be inundated by debris flows, these efforts can be used to more accurately identify the places and assets immediately exposed to landslide hazards. We contextualize the results of the previous and ongoing efforts into how they may be incorporated into decision support systems. We also discuss if and how these analyses would have provided additional knowledge in the past, and identify specific recommendations as to how they could contribute to a more robust decision support system in the future.

  2. Snakes as hazards: modelling risk by chasing chimpanzees.

    Science.gov (United States)

    McGrew, William C

    2015-04-01

    Snakes are presumed to be hazards to primates, including humans, by the snake detection hypothesis (Isbell in J Hum Evol 51:1-35, 2006; Isbell, The fruit, the tree, and the serpent. Why we see so well, 2009). Quantitative, systematic data to test this idea are lacking for the behavioural ecology of living great apes and human foragers. An alternative proxy is snakes encountered by primatologists seeking, tracking, and observing wild chimpanzees. We present 4 years of such data from Mt. Assirik, Senegal. We encountered 14 species of snakes a total of 142 times. Almost two-thirds of encounters were with venomous snakes. Encounters occurred most often in forest and least often in grassland, and more often in the dry season. The hypothesis seems to be supported, if frequency of encounter reflects selective risk of morbidity or mortality.

  3. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    Science.gov (United States)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently on-going initiatives like the creation of a suite of tools for the creation of PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.

  4. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.
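
    One plausible way to write the model class described here (notation assumed: β_k are the single-index directions and f_k the unknown smooth functions estimated with P-splines):

    ```latex
    \[
    \mathbf{X}_t = \boldsymbol{\mu}
      + \sum_{k=1}^{p} \mathbf{f}_k\left( \boldsymbol{\beta}_k^{\mathsf{T}} \mathbf{X}_{t-k} \right)
      + \boldsymbol{\varepsilon}_t .
    \]
    ```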

  5. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    Science.gov (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    location data. These results show a high concordance between the landslide inventory and the estimated high-susceptibility zone, with an adjustment of 95.1% for the ANN model and 89.4% for the LR model. In addition, we make a comparative analysis of both techniques using the Receiver Operating Characteristic (ROC) curve, a graphical plot of sensitivity vs. (1 - specificity) for a binary classifier system as a function of its discrimination threshold, calculating the Area Under the ROC (AUROC) value for each model. Finally, the previous models are used to develop a new probabilistic landslide hazard map for future events, obtained by combining the expected triggering factor (calculated earthquake ground motion) for a return period of 475 years with the susceptibility map.
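
    The AUROC comparison reduces to a few lines; the labels and model scores below are invented placeholders, not the study's data:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])          # landslide occurrence
    scores = {
        "LR":  np.array([0.2, 0.4, 0.7, 0.6, 0.3, 0.8, 0.5, 0.9]),
        "ANN": np.array([0.1, 0.3, 0.8, 0.7, 0.2, 0.9, 0.4, 0.95]),
    }
    for name, p in scores.items():
        print(name, "AUROC =", roc_auc_score(y_true, p))
    ```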

  6. AN INSTRUCTURAL SYSTEM MODEL OF COASTAL MANAGEMENT TO THE WATER RELATED HAZARDS IN CHINA

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Coastal lowlands have large areas of hazard impact and a relatively low capacity for prevention of water-related hazards, as indicated by widespread flood hazards and high percentages of land with high flood vulnerability. Increasing population pressure and the shift of resource exploitation from land to sea will force more and more coastal lowlands to be developed in the future, further increasing the danger of water-related hazards. In this paper, the coastal lowlands in the northern Jiangsu province, China, were selected as a case study. The Interpretive Structural Model (ISM) was employed to analyze the direct and indirect impacts among the elements within the system and, thereby, to identify the causal elements, middle linkages, their expressions, and relations.

  7. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

    Full Text Available Abstract Genomic selection refers to the use of genome-wide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to successfully cope with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled for each marker or pair of flanking markers (i.e. the predictors) separately. The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the next-to-last generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. A predictor-specific determination of the degree of smoothing increased the accuracy.

  8. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    Science.gov (United States)

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    the outburst of landslide-dammed lakes) remains a challenge: • The knowledge about the onset of the process is often limited (bathymetry of the lakes, subsurface water, properties of dam (content of ice), type of dam breach, understanding of process chains and interactions). • The size of glacial lakes may change rapidly but continuously, and many lakes break out within a short time after their development. Continuous monitoring is therefore required to keep updated on the existing hazards. • Also the outburst of small glacial lakes may lead to significant debris floods or even debris flows if there is plenty of erodible material available. • The available modeling software packages are of limited suitability for lake outburst floods: e.g. software developed by the hydrological community is specialized to simulate (debris) floods with input hydrographs on moderately steep flow channels and with lower sediment loads. In contrast to this, programs for rapid mass movements are better suited on steeper slopes and sudden onset of the movement. The typical characteristics of GLOFs are in between and vary for different channel sections. In summary, the major bottlenecks remain in deriving realistic or worst case scenarios and predicting their magnitude and area of impact. This mainly concerns uncertainties in the dam break process, involved volumes, erosion rates, changing rheologies, and the limited capabilities of available software packages to simulate process interactions and transformations such as the development of a hyperconcentrated flow into a debris flow. In addition, many areas prone to lake outburst floods are located in developing countries with a limited scope of the threatened population for decision-making and limited resources for mitigation.

  9. Three multimedia models used at hazardous and radioactive waste sites

    International Nuclear Information System (INIS)

    Multimedia models are used commonly in the initial phases of the remediation process where technical interest is focused on determining the relative importance of various exposure pathways. This report provides an approach for evaluating and critically reviewing the capabilities of multimedia models. This study focused on three specific models: MEPAS Version 3.0, MMSOILS Version 2.2, and PRESTO-EPA-CPG Version 2.0. These models evaluate the transport and fate of contaminants from source to receptor through more than a single pathway. The presence of radioactive and mixed wastes at a site poses special problems. Hence, in this report, restrictions associated with the selection and application of multimedia models for sites contaminated with radioactive and mixed wastes are highlighted. This report begins with a brief introduction to the concept of multimedia modeling, followed by an overview of the three models. The remaining chapters present more technical discussions of the issues associated with each compartment and their direct application to the specific models. In these analyses, the following components are discussed: source term; air transport; ground water transport; overland flow, runoff, and surface water transport; food chain modeling; exposure assessment; dosimetry/risk assessment; uncertainty; default parameters. The report concludes with a description of evolving updates to the models; these descriptions were provided by the model developers.

  10. Three multimedia models used at hazardous and radioactive waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P.D.; Pardi, R.; Fthenakis, V.M.; Holtzman, S.; Sun, L.C. [Brookhaven National Lab., Upton, NY (United States); Rambaugh, J.O.; Potter, S. [Geraghty and Miller, Inc., Plainview, NY (United States)

    1996-02-01

    Multimedia models are used commonly in the initial phases of the remediation process where technical interest is focused on determining the relative importance of various exposure pathways. This report provides an approach for evaluating and critically reviewing the capabilities of multimedia models. This study focused on three specific models: MEPAS Version 3.0, MMSOILS Version 2.2, and PRESTO-EPA-CPG Version 2.0. These models evaluate the transport and fate of contaminants from source to receptor through more than a single pathway. The presence of radioactive and mixed wastes at a site poses special problems. Hence, in this report, restrictions associated with the selection and application of multimedia models for sites contaminated with radioactive and mixed wastes are highlighted. This report begins with a brief introduction to the concept of multimedia modeling, followed by an overview of the three models. The remaining chapters present more technical discussions of the issues associated with each compartment and their direct application to the specific models. In these analyses, the following components are discussed: source term; air transport; ground water transport; overland flow, runoff, and surface water transport; food chain modeling; exposure assessment; dosimetry/risk assessment; uncertainty; default parameters. The report concludes with a description of evolving updates to the models; these descriptions were provided by the model developers.

  11. Understanding Rasch Measurement: The Rasch Model, Additive Conjoint Measurement, and New Models of Probabilistic Measurement Theory.

    Science.gov (United States)

    Karabatsos, George

    2001-01-01

    Describes similarities and differences between additive conjoint measurement and the Rasch model, and formalizes some new nonparametric item response models that are, in a sense, probabilistic measurement theory models. Applies these new models to published and simulated data. (SLD)
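
    The dichotomous Rasch model referenced here has the standard form, whose additive conjoint structure is visible in the difference of person ability θ_p and item difficulty b_i:

    ```latex
    \[
    P\left( X_{pi} = 1 \mid \theta_p, b_i \right)
      = \frac{\exp(\theta_p - b_i)}{1 + \exp(\theta_p - b_i)} .
    \]
    ```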

  12. Hazardous Concentrations for Ecosystems (HCE): calculation with CATS models

    OpenAIRE

    Traas TP; Aldenberg T; Janse JH; Brock TCM; Roghair CJ; Rijksinstituut voor Volksgezondheid en Milieu (RIVM), Winand Staring Centrum (SC-DLO); LWD

    1995-01-01

    Dose-response functions were fitted to data from laboratory toxicity tests and were used to predict the response of functional groups in food webs. Direct effects of Chlorpyrifos (CPF), as observed in microcosm experiments, could be modelled adequately by incorporating dose-response functions in a CATS model. Indirect effects of CPF on functional groups, resulting from direct toxicity, could also be predicted with the model. The ecosystem response to toxicants was used to propose a quality sta...

  13. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate the significant adverse effects of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) benchmarking the ratio of exposure and emissions of candidate chemicals against the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; (2) directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited; (3) identifying the key processes and chemical and/or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and (4) predicting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model, the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application.

  14. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    Science.gov (United States)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials, and to develop experimental methods that provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate; examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions with numerical modeling as an effective prediction tool. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are multiphysics and multiscale. In this project, the research work has been developed to model AAM processes in a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time needed, a parallel computing approach was also tested. In addition

  15. Modeling and Testing Landslide Hazard Using Decision Tree

    Directory of Open Access Journals (Sweden)

    Mutasem Sh. Alkhasawneh

    2014-01-01

    Full Text Available This paper proposes a decision tree model for specifying the importance of 21 factors causing landslides in a wide area of Penang Island, Malaysia. These factors are vegetation cover, distance from the fault line, slope angle, cross curvature, slope aspect, distance from road, geology, diagonal length, longitude curvature, rugosity, plan curvature, elevation, rain precipitation, soil texture, surface area, distance from drainage, roughness, land cover, general curvature, tangent curvature, and profile curvature. Decision tree models are used for prediction, classification, and factor importance, and are usually represented by an easy-to-interpret tree-like structure. Four models were created using the Chi-square Automatic Interaction Detector (CHAID), Exhaustive CHAID, Classification and Regression Tree (CRT), and Quick-Unbiased-Efficient Statistical Tree (QUEST). Twenty-one factors were extracted using digital elevation models (DEMs) and then used as input variables for the models. A data set of 137570 samples was selected for each variable in the analysis, where 68786 samples represent landslides and 68786 samples represent no landslides. 10-fold cross-validation was employed for testing the models. The highest accuracy was achieved using Exhaustive CHAID (82.0%), compared to CHAID (81.9%), CRT (75.6%), and QUEST (74.0%). Across the four models, five factors were identified as the most important: slope angle, distance from drainage, surface area, slope aspect, and cross curvature.
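
    A hedged sketch of the factor-importance step using a CART-style tree; scikit-learn implements neither CHAID nor QUEST, and the data and factor names below are illustrative:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 5))   # five illustrative factors (of the paper's 21)
    y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=500) > 0).astype(int)

    tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)
    names = ["slope angle", "dist. from drainage", "surface area",
             "slope aspect", "cross curvature"]
    for n, imp in sorted(zip(names, tree.feature_importances_),
                         key=lambda t: -t[1]):
        print(f"{n:20s} {imp:.3f}")
    ```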

  16. Modeling contractor and company employee behavior in high hazard operation

    NARCIS (Netherlands)

    Lin, P.H.; Hanea, D.; Ale, B.J.M.

    2013-01-01

    The recent blow-out and subsequent environmental disaster in the Gulf of Mexico have highlighted a number of serious problems in scientific thinking about safety. Risk models have generally concentrated on technical failures, which are easier to model and for which there are more concrete data. Howe

  17. Testing exclusion restrictions and additive separability in sample selection models

    DEFF Research Database (Denmark)

    Huber, Martin; Mellace, Giovanni

    2014-01-01

    Standard sample selection models with non-randomly censored outcomes assume (i) an exclusion restriction (i.e., a variable affecting selection, but not the outcome) and (ii) additive separability of the errors in the selection process. This paper proposes tests for the joint satisfaction of these assumptions by applying the approach of Huber and Mellace (Testing instrument validity for LATE identification based on inequality moment constraints, 2011) (for testing instrument validity under treatment endogeneity) to the sample selection framework. We show that the exclusion restriction and additive separability imply two testable inequality constraints that come from both point identifying and bounding the outcome distribution of the subpopulation that is always selected/observed. We apply the tests to two variables for which the exclusion restriction is frequently invoked in female wage regressions: non...

  18. Two-stage local M-estimation of additive models

    Institute of Scientific and Technical Information of China (English)

    JIANG JianCheng; LI JianTao

    2008-01-01

    This paper studies local M-estimation of the nonparametric components of additive models. A two-stage local M-estimation procedure is proposed for estimating the additive components and their derivatives. Under very mild conditions, the proposed estimators of each additive component and its derivative are jointly asymptotically normal and share the same asymptotic distributions as they would be if the other components were known. The established asymptotic results also hold for two particular local M-estimations: the local least squares and least absolute deviation estimations. However, for general two-stage local M-estimation with continuous and nonlinear ψ-functions, its implementation is time-consuming. To reduce the computational burden, one-step approximations to the two-stage local M-estimators are developed. The one-step estimators are shown to achieve the same efficiency as the fully iterative two-stage local M-estimators, which makes the two-stage local M-estimation more feasible in practice. The proposed estimators inherit the advantages and at the same time overcome the disadvantages of the local least-squares based smoothers. In addition, the practical implementation of the proposed estimation is considered in detail. Simulations demonstrate the merits of the two-stage local M-estimation, and a real example illustrates the performance of the methodology.

  20. Checking Fine and Gray Subdistribution Hazards Model with Cumulative Sums of Residuals

    OpenAIRE

    Li, Jianing; Scheike, Thomas H.; Zhang, Mei-Jie

    2014-01-01

    Recently, Fine and Gray (1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function, which has been used extensively for analyzing competing risks data. However, model misspecification can lead to severe bias in parameter estimation, and only limited work has addressed checking the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray’s model. T...

  1. Addition Table of Colours: Additive and Subtractive Mixtures Described Using a Single Reasoning Model

    Science.gov (United States)

    Mota, A. R.; Lopes dos Santos, J. M. B.

    2014-01-01

    Students' misconceptions concerning colour phenomena and the apparent complexity of the underlying concepts--due to the different domains of knowledge involved--make its teaching very difficult. We have developed and tested a teaching device, the addition table of colours (ATC), that encompasses additive and subtractive mixtures in a single…

  2. Efficient pan-European flood hazard modelling through a combination of statistical and physical models

    Science.gov (United States)

    Paprotny, Dominik; Morales Nápoles, Oswaldo

    2016-04-01

    Low-resolution hydrological models are often applied to calculate extreme river discharges and delineate flood zones on a continental and global scale. Still, the computational expense is very large and often limits the extent and depth of such studies. Here, we present a quick yet similarly accurate procedure for flood hazard assessment in Europe. Firstly, a statistical model based on Bayesian Networks is used. It describes the joint distribution of annual maxima of daily discharges of European rivers with variables describing the geographical characteristics of their catchments. It was quantified with 75,000 station-years of river discharge, as well as climate, terrain and land use data. The model's predictions of average annual maxima or discharges with certain return periods are of similar performance to physical rainfall-runoff models applied at continental scale. A database of discharge scenarios - return periods under present and future climate - was prepared for the majority of European rivers. Secondly, those scenarios were used as boundary conditions for the one-dimensional (1D) hydrodynamic model SOBEK. Utilizing 1D instead of 2D modelling saved computational time, yet gave satisfactory results. The resulting pan-European flood map was contrasted with several local high-resolution studies. Indeed, the comparison shows that, overall, the methods presented here gave similar or better alignment with local studies than the previously released pan-European flood map.
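
    The statistical step of converting annual discharge maxima into design discharges for given return periods can be illustrated with a simple extreme-value fit. The Python sketch below, with entirely made-up data, fits a Gumbel (EV-I) distribution and reads off return-period quantiles; the paper's Bayesian Network machinery is not reproduced here.

        import numpy as np
        from scipy import stats

        # Hypothetical annual maxima of daily discharge (m^3/s) at one station.
        rng = np.random.default_rng(42)
        annual_maxima = rng.gumbel(800.0, 250.0, size=75)

        # Fit a Gumbel (EV-I) distribution to the annual maxima.
        loc, scale = stats.gumbel_r.fit(annual_maxima)

        # Discharge with return period T years = (1 - 1/T) quantile of the fit.
        for T in (20, 100, 500, 1500):
            q = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
            print(f"T = {T:4d} yr -> design discharge ~ {q:6.0f} m^3/s")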

  3. Comparing the European (SHARE) and the reference Italian seismic hazard models

    Science.gov (United States)

    Visini, Francesco; Meletti, Carlo; D'Amico, Vera; Rovida, Andrea; Stucchi, Massimiliano

    2016-04-01

    A probabilistic seismic hazard evaluation for Europe has recently been released by the SHARE project (www.share-eu.org, Giardini et al., 2013; Woessner et al., 2015). A comparison between the SHARE results for Italy and the official Italian seismic hazard model (MPS04, Stucchi et al., 2011), currently adopted by the building code, has been carried out to identify the main input elements that produce the differences between the two models. The SHARE model shows increased expected values (up to 70%) with respect to the MPS04 model for PGA with 10% probability of exceedance in 50 years. However, looking in detail at all output parameters of both models, we observe that for spectral periods greater than 0.3 s, the reference PSHA for Italy proposes higher values than the SHARE model over many large areas. This behaviour is mainly driven by the adoption of recent ground-motion prediction equations (GMPEs) that, with respect to the older GMPEs used in MPS04, estimate higher values for PGA and for accelerations with periods below 0.3 s and lower values for longer periods. Another important set of tests consisted of analyzing separately the PSHA results obtained with the three source models adopted in SHARE (i.e., area sources, fault sources with background, and a refined smoothed seismicity model), whereas MPS04 used only area sources. Results show that, besides the strong impact of the GMPEs, the differences in the seismic hazard estimates among the three source models are relevant and, in particular, for some selected test sites, the fault-based model returns the lowest estimates of seismic hazard. This result raises questions about the completeness of the fault database, its parameterization and the assessment of activity rates, as well as about the impact of the threshold magnitude between faults and background. Giardini D. et al., 2013. Seismic Hazard Harmonization in Europe (SHARE): Online Data Resource, doi:10.12686/SED-00000001-SHARE. Stucchi M. et al., 2011. Seismic Hazard

  4. Estimation and variable selection for generalized additive partial linear models

    KAUST Repository

    Wang, Li

    2011-08-01

    We study generalized additive partial linear models, proposing the use of polynomial spline smoothing for estimation of nonparametric functions, and deriving quasi-likelihood based estimators for the linear parameters. We establish asymptotic normality for the estimators of the parametric components. The procedure avoids solving large systems of equations as in kernel-based procedures and thus results in gains in computational simplicity. We further develop a class of variable selection procedures for the linear parameters by employing a nonconcave penalized quasi-likelihood, which is shown to have an asymptotic oracle property. Monte Carlo simulations and an empirical example are presented for illustration. © Institute of Mathematical Statistics, 2011.
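
    A penalized-spline fit of a partial linear model can be sketched in a few lines. The toy example below uses a truncated-power basis and a ridge-type penalty on the spline block as a stand-in for the paper's polynomial spline estimator; the data, basis and smoothing parameter are all illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 300
        x_lin = rng.normal(size=(n, 2))            # parametric covariates
        t = rng.uniform(0, 1, n)                   # nonparametric covariate
        y = (x_lin @ np.array([1.5, -2.0]) + np.sin(2 * np.pi * t)
             + rng.normal(0, 0.3, n))

        # Truncated-power spline basis for f(t), centered for identifiability.
        knots = np.quantile(t, np.linspace(0.1, 0.9, 8))
        B = np.column_stack([t, t ** 2, t ** 3]
                            + [np.maximum(t - k, 0.0) ** 3 for k in knots])
        B -= B.mean(axis=0)

        # Penalized least squares; only the spline block is penalized.
        Z = np.hstack([x_lin, B])
        P = np.zeros((Z.shape[1], Z.shape[1]))
        P[2:, 2:] = np.eye(B.shape[1])
        coef = np.linalg.solve(Z.T @ Z + 1e-3 * n * P, Z.T @ y)
        print("linear part ~", np.round(coef[:2], 2))   # should be near (1.5, -2.0)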

  5. Additive manufacturing for consumer-centric business models

    DEFF Research Database (Denmark)

    Bogers, Marcel; Hadar, Ronen; Bilberg, Arne

    2016-01-01

    Digital fabrication—including additive manufacturing (AM), rapid prototyping and 3D printing—has the potential to revolutionize the way in which products are produced and delivered to the customer. Therefore, it challenges companies to reinvent their business model—describing the logic of creating......-centric to a consumer-centric value logic. A major shift includes a move from centralized to decentralized supply chains, where consumer goods manufacturers can implement a “hybrid” approach with a focus on localization and accessibility or develop a fully personalized model where the consumer effectively takes over...

  6. Multiscale Modeling of Powder Bed–Based Additive Manufacturing

    Science.gov (United States)

    Markl, Matthias; Körner, Carolin

    2016-07-01

    Powder bed fusion processes are additive manufacturing technologies that are expected to induce the third industrial revolution. Components are built up layer by layer in a powder bed by selectively melting confined areas, according to sliced 3D model data. This technique allows for manufacturing of highly complex geometries hardly machinable with conventional technologies. However, the underlying physical phenomena are sparsely understood and difficult to observe during processing. Therefore, an intensive and expensive trial-and-error principle is applied to produce components with the desired dimensional accuracy, material characteristics, and mechanical properties. This review presents numerical modeling approaches on multiple length scales and timescales to describe different aspects of powder bed fusion processes. In combination with tailored experiments, the numerical results enlarge the process understanding of the underlying physical mechanisms and support the development of suitable process strategies and component topologies.

  7. Multiscale Modeling of Powder Bed-Based Additive Manufacturing

    Science.gov (United States)

    Markl, Matthias; Körner, Carolin

    2016-07-01

    Powder bed fusion processes are additive manufacturing technologies that are expected to induce the third industrial revolution. Components are built up layer by layer in a powder bed by selectively melting confined areas, according to sliced 3D model data. This technique allows for manufacturing of highly complex geometries hardly machinable with conventional technologies. However, the underlying physical phenomena are sparsely understood and difficult to observe during processing. Therefore, an intensive and expensive trial-and-error principle is applied to produce components with the desired dimensional accuracy, material characteristics, and mechanical properties. This review presents numerical modeling approaches on multiple length scales and timescales to describe different aspects of powder bed fusion processes. In combination with tailored experiments, the numerical results enlarge the process understanding of the underlying physical mechanisms and support the development of suitable process strategies and component topologies.

  8. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    Science.gov (United States)

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

    The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered. Indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take the uncertainty into account in the analysis. In this perspective, a new hazard modeling method was developed and integrated into a program named ALICE. The program integrates mechanical stability analysis in GIS software, taking data uncertainty into account. This method proposes a quantitative classification of landslide hazard and offers a useful tool for saving time and improving efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps, since it is the only way to take into account some influential factors in slope stability, such as the heterogeneity of the geological formations or the effects of anthropogenic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data and so forth, the program classifies hazard zones depending on geotechnics and on different hydrological contexts varying in time. This communication, realized within the framework of the Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological
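
    The mechanical core of this kind of regional slope-stability program is typically an infinite-slope limit-equilibrium calculation. The Python sketch below computes the classical infinite-slope factor of safety for a hypothetical slope cell under increasing saturation; this is the generic textbook formula, not the actual ALICE implementation, and all parameter values are invented.

        import numpy as np

        def factor_of_safety(c_eff, phi_deg, gamma, z, beta_deg, m):
            """Infinite-slope factor of safety; m is the relative water table
            depth (0 = dry, 1 = fully saturated), gamma_w = 9.81 kN/m^3."""
            gamma_w = 9.81
            beta, phi = np.radians(beta_deg), np.radians(phi_deg)
            tau = gamma * z * np.sin(beta) * np.cos(beta)     # driving stress
            sigma_n = gamma * z * np.cos(beta) ** 2           # total normal stress
            u = m * gamma_w * z * np.cos(beta) ** 2           # pore pressure
            return (c_eff + (sigma_n - u) * np.tan(phi)) / tau

        # Hypothetical cell: c' = 5 kPa, phi' = 30 deg, gamma = 19 kN/m^3,
        # soil depth z = 2 m, slope angle 35 deg, increasing saturation.
        for m in (0.0, 0.5, 1.0):
            fs = factor_of_safety(5.0, 30.0, 19.0, 2.0, 35.0, m)
            print(f"saturation m = {m:.1f} -> FS = {fs:.2f}")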

  9. A NOVEL SOFT COMPUTING MODEL ON LANDSLIDE HAZARD ZONE MAPPING

    Directory of Open Access Journals (Sweden)

    Iqbal Quraishi

    2012-11-01

    Full Text Available The effects of landslides are very prominent in India as well as the world over. In India, the north-east region and all the areas along the Himalayan range are prone to landslides. At the state level, Uttarakhand, Himachal Pradesh and the northern part of West Bengal are identified as landslide risk zones. Within West Bengal, the Darjeeling area is identified as our focus zone. There are several types of landslides, depending upon various conditions; earthquakes are among the most important contributing factors. Both the field and the GIS data are very diverse and large in amount, and creating a proper data warehouse requires both remote and field studies. Our proposed soft computing model merges the field and remote sensing data, creates an optimized landslide susceptibility map of the zone, and also provides a broad risk assessment. It takes census and economic survey data as input to calculate and predict the probable number of damaged houses, roads and other amenities, including the effect on GDP. The model is highly customizable and tends to provide situation-specific results. A fuzzy logic based approach has been considered to partially implement the model in terms of different parameter data sets to show the effectiveness of the proposed model.

  10. Hazardous Concentrations for Ecosystems (HCE): calculation with CATS models

    NARCIS (Netherlands)

    Traas TP; Aldenberg T; Janse JH; Brock TCM; Roghair CJ; Rijksinstituut voor; LWD

    1995-01-01

    Dose-response functions were fitted on data from laboratory toxicity tests and were used to predict the response of functional groups in food webs. Direct effects of Chlorpyrifos (CPF), as observed in microcosm experiments, could be modelled adequately by incorporating dose-response functions in a C

  11. Model Checking Vector Addition Systems with one zero-test

    CERN Document Server

    Bonet, Rémi; Leroux, Jérôme; Zeitoun, Marc

    2012-01-01

    We design a variation of the Karp-Miller algorithm to compute, in a forward manner, a finite representation of the cover (i.e., the downward closure of the reachability set) of a vector addition system with one zero-test. This algorithm yields decision procedures for several problems for these systems, open until now, such as place-boundedness or LTL model-checking. The proof techniques to handle the zero-test are based on two new notions of cover: the refined and the filtered cover. The refined cover is a hybrid between the reachability set and the classical cover. It inherits properties of the reachability set: equality of two refined covers is undecidable, even for usual Vector Addition Systems (with no zero-test), but the refined cover of a Vector Addition System is a recursive set. The second notion of cover, called the filtered cover, is the central tool of our algorithms. It inherits properties of the classical cover, and in particular, one can effectively compute a finite representation of this set, e...
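
    To make the notion of cover concrete, the sketch below implements the classical Karp-Miller construction for a plain vector addition system, without the zero-test refinement that the paper handles; math.inf plays the role of the omega symbol. The transition vectors and initial marking are toy values.

        import math

        def karp_miller_cover(init, transitions):
            """Karp-Miller coverability construction for a plain vector addition
            system (no zero-test); math.inf stands in for the omega symbol."""
            root = (tuple(init), None)                     # node = (marking, parent)
            frontier, seen = [root], {tuple(init)}
            while frontier:
                node = frontier.pop()
                marking = node[0]
                for delta in transitions:
                    succ = [m + d for m, d in zip(marking, delta)]
                    if any(v < 0 for v in succ):
                        continue                           # transition not enabled
                    anc = node                             # accelerate along ancestors
                    while anc is not None:
                        am = anc[0]
                        if tuple(succ) != am and all(s >= a for s, a in zip(succ, am)):
                            succ = [math.inf if s > a else s
                                    for s, a in zip(succ, am)]
                        anc = anc[1]
                    if tuple(succ) not in seen:
                        seen.add(tuple(succ))
                        frontier.append((tuple(succ), node))
            return seen

        # Toy 2-counter VAS: transitions (1, -1) and (-1, 2), initial marking (1, 0).
        print(sorted(karp_miller_cover((1, 0), [(1, -1), (-1, 2)])))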

  12. Additive functions in boolean models of gene regulatory network modules.

    Science.gov (United States)

    Darabos, Christian; Di Cunto, Ferdinando; Tomassini, Marco; Moore, Jason H; Provero, Paolo; Giacobini, Mario

    2011-01-01

    Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual regulatory interactions of each gene are believed to play a key role in the stability of the structure. With advances in biology, some effort has been made to develop update functions in boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a boolean model. We use two sub-networks of biological organisms, the yeast cell-cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we replace the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred boolean update functions, to validate the proposed update function. Results of this validation hint at increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos through the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, that also allows regimes to be discriminated in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful to guide experimental research. The update function confers additional realism to the model, while reducing the complexity
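
    A threshold-based boolean update and a Derrida-style perturbation measurement are easy to sketch. The Python example below uses a random signed interaction matrix as a stand-in for the biological networks; the network density, tie-breaking rule and trial counts are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        N = 50
        # Random signed regulatory network: +1 activation, -1 repression, 0 none.
        A = rng.choice([-1, 0, 1], size=(N, N), p=[0.05, 0.9, 0.05])

        def step(state):
            """Threshold update: a gene turns on when its active activators
            outweigh its active repressors; ties keep the current state."""
            field = A @ state
            return np.where(field > 0, 1, np.where(field < 0, 0, state))

        def derrida_point(dist, trials=200):
            """One Derrida-plot point: mean normalized Hamming distance after
            one update, for state pairs initially differing in `dist` genes."""
            total = 0.0
            for _ in range(trials):
                s = rng.integers(0, 2, N)
                t = s.copy()
                t[rng.choice(N, size=dist, replace=False)] ^= 1
                total += np.mean(step(s) != step(t))
            return total / trials

        for d in (1, 5, 10, 25):
            print(f"H(t) = {d / N:.2f} -> H(t+1) ~ {derrida_point(d):.3f}")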

  13. Utilizing NASA Earth Observations to Model Volcanic Hazard Risk Levels in Areas Surrounding the Copahue Volcano in the Andes Mountains

    Science.gov (United States)

    Keith, A. M.; Weigel, A. M.; Rivas, J.

    2014-12-01

    Copahue is a stratovolcano located along the rim of the Caviahue Caldera near the Chile-Argentina border in the Andes Mountain Range. There are several small towns located in proximity to the volcano, the two largest being Banos Copahue and Caviahue. During its eruptive history, it has produced numerous lava flows, pyroclastic flows, ash deposits, and lahars. This isolated region has steep topography and little vegetation, rendering it poorly monitored. The need to model volcanic hazard risk has been reinforced by recent volcanic activity that intermittently released several ash plumes from December 2012 through May 2013. Exposure to volcanic ash is currently the main threat to the surrounding populations as the volcano becomes more active. The goal of this project was to study Copahue and determine the areas with the highest potential of being affected in the event of an eruption. Remote sensing techniques were used to examine and identify volcanic activity and areas vulnerable to volcanic hazards, including volcanic ash, SO2 gas, lava flow, pyroclastic density currents and lahars. Landsat 7 Enhanced Thematic Mapper Plus (ETM+), Landsat 8 Operational Land Imager (OLI), EO-1 Advanced Land Imager (ALI), Terra Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Shuttle Radar Topography Mission (SRTM), ISS ISERV Pathfinder, and Aura Ozone Monitoring Instrument (OMI) products were used to analyze volcanic hazards. These datasets were used to create a historic lava flow map of the Copahue volcano by identifying historic lava flows, tephra, and lahars both visually and spectrally. Additionally, a volcanic risk and hazard map for the surrounding area was created by modeling the possible extent of ash fallout, lahars, lava flow, and pyroclastic density currents (PDC) for future eruptions. These model results were then used to identify areas that should be prioritized for disaster relief and evacuation orders.

  14. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    Science.gov (United States)

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazards warning systems.

  15. Geo-additive modelling of malaria in Burundi

    Directory of Open Access Journals (Sweden)

    Gebhardt Albrecht

    2011-08-01

    Full Text Available Abstract Background Malaria is a major public health issue in Burundi in terms of both morbidity and mortality, with around 2.5 million clinical cases and more than 15,000 deaths each year. It is still the single main cause of mortality in pregnant women and children below five years of age. Because of the severe health and economic burden of malaria, there is still a growing need for methods that will help to understand the influencing factors. Several studies have been done on the subject, yielding different results as to which factors are most responsible for the increase in malaria transmission. This paper considers the modelling of the dependence of malaria cases on spatial determinants and climatic covariates including rainfall, temperature and humidity in Burundi. Methods The analysis carried out in this work exploits real monthly data collected in the area of Burundi over 12 years (1996-2007). Semi-parametric regression models are used. The spatial analysis is based on a geo-additive model using provinces as the geographic units of study. The spatial effect is split into structured (correlated) and unstructured (uncorrelated) components. Inference is fully Bayesian and uses Markov chain Monte Carlo techniques. The effects of the continuous covariates are modelled by cubic P-splines with 20 equidistant knots and a second-order random walk penalty. For the spatially correlated effect, a Markov random field prior is chosen. The spatially uncorrelated effects are assumed to be i.i.d. Gaussian. The effects of climatic covariates and the effects of other spatial determinants are estimated simultaneously in a unified regression framework. Results The results obtained from the proposed model suggest that although malaria incidence in a given month is strongly positively associated with the minimum temperature of the previous months, regional patterns of malaria that are related to factors other than climatic variables have been identified
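
    The second-order random-walk penalty mentioned here has a very compact matrix form. The Python sketch below builds the RW2 penalty from a second-difference operator and uses it in a penalized basis-function fit, a frequentist stand-in for the paper's Bayesian P-spline machinery; the basis (Gaussian bumps rather than B-splines), data and smoothing parameter are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        x = np.linspace(0, 1, 200)
        y = np.sin(3 * np.pi * x) + rng.normal(0, 0.2, 200)   # toy covariate effect

        K = 20
        centers = np.linspace(0, 1, K)
        B = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / 0.06) ** 2)

        # Second-order random walk penalty: D2 is the second-difference
        # operator, so the penalty term is coef' (D2' D2) coef.
        D2 = np.diff(np.eye(K), n=2, axis=0)
        P = D2.T @ D2

        lam = 10.0   # smoothing parameter (estimated via MCMC/REML in the paper)
        coef = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
        fitted = B @ coef
        print("residual std ~", np.round(np.std(y - fitted), 3))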

  16. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

    DEFF Research Database (Denmark)

    Nielsen, Jan; Parner, Erik

    2010-01-01

    In this paper, we model multivariate time-to-event data by composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied on two types of family studies using the gamma...

  17. Evaluation and hydrological modelization in the natural hazard prevention

    International Nuclear Information System (INIS)

    Soil degradation negatively affects the soil's functions as a basis for producing food, regulating the hydrological cycle and maintaining environmental quality. All over the world soil degradation is increasing, partly due to gaps or deficiencies in the evaluation of the processes and causes of this degradation in each specific situation. The processes of soil physical degradation manifest themselves through several problems such as compaction, runoff, water and wind erosion, and landslides, with collateral effects in situ and at a distance, often with disastrous consequences such as floods, landslides, sedimentation and droughts. These processes are frequently associated with unfavorable changes in the hydrologic processes responsible for the water balance and soil hydric regimes, mainly resulting from changes in soil use and management practices and from climatic change. The evaluation of these processes using simple simulation models, under several scenarios of climatic change, soil properties and land use and management, would allow the prediction of these disastrous processes and consequently the selection and application of appropriate soil conservation practices to eliminate or reduce their effects. Such simulation models require, as a base, detailed climatic information and data on hydrologic soil properties. Despite the existence of methodologies and commercial equipment (increasingly sophisticated and precise) to measure the different physical and hydrological soil properties related to degradation processes, most of them are applicable only under very specific or laboratory conditions. Often indirect methodologies are used, based on relations or empirical indexes without adequate validation, which frequently leads to expensive mistakes in the evaluation of soil degradation processes and their effects on natural disasters. Preference could be given to simple field methodologies, direct and adaptable to different soil types and climates and to the sample size and the spatial variability of the

  18. Global river flood hazard maps: hydraulic modelling methods and appropriate uses

    Science.gov (United States)

    Townend, Samuel; Smith, Helen; Molloy, James

    2014-05-01

    Flood hazard is not well understood or documented in many parts of the world. Consequently, the (re-)insurance sector now needs to better understand where the potential for considerable river flooding aligns with significant exposure. For example, international manufacturing companies are often attracted to countries with emerging economies, meaning that events such as the 2011 Thailand floods have resulted in many multinational businesses with assets in these regions incurring large, unexpected losses. This contribution addresses and critically evaluates the hydraulic methods employed to develop a consistent, global-scale set of river flood hazard maps, used to fill the knowledge gap outlined above. The basis of the modelling approach is an innovative, bespoke 1D/2D hydraulic model (RFlow) which has been used to model a global river network of over 5.3 million kilometres. Estimated flood peaks at each of these model nodes are determined using an empirically based rainfall-runoff approach linking design rainfall to design river flood magnitudes. The hydraulic model is used to determine extents and depths of floodplain inundation following river bank overflow. From this, deterministic flood hazard maps are calculated for several design return periods between 20 and 1,500 years. Firstly, we will discuss the rationale behind the hydraulic modelling methods and inputs chosen to produce a consistent, globally scaled river flood hazard map. This will highlight how a model designed to work with global datasets can be more favourable for hydraulic modelling at the global scale and why innovative techniques customised for broad-scale use are preferable to modifying existing hydraulic models. Similarly, the advantages and disadvantages of both 1D and 2D modelling will be explored and balanced against the time, computer and human resources available, particularly when using a Digital Surface Model at 30 m resolution. Finally, we will suggest some

  19. Rainfall Hazards Prevention based on a Local Model Forecasting System

    Science.gov (United States)

    Buendia, F.; Ojeda, B.; Buendia Moya, G.; Tarquis, A. M.; Andina, D.

    2009-04-01

    Rainfall is one of the most important events of human life and society. Some rainfall phenomena, like floods or hailstorms, are a threat to agriculture, business and even life. Although meteorological observatories have methods to detect and warn about these kinds of events, prediction techniques based on synoptic measurements still need to be improved to achieve feasible medium-term forecasts. Any deviation in the measurements or in the model description causes the forecast to diverge in time from the real evolution of the atmosphere. In this paper the advances in a local rainfall forecasting system based on time series estimation with General Regression Neural Networks are presented. The system is introduced, explaining the measurements, the methodology and the current state of the development. The aim of the work is to provide a complementary criterion to the current forecast systems, based on daily observation and tracking of the atmosphere over a certain place.
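
    A General Regression Neural Network is essentially a Nadaraya-Watson kernel-weighted average of training targets, which makes it short to sketch. The Python example below applies it to an invented lagged rainfall series; the lag length, kernel width and synthetic data are assumptions, not the authors' configuration.

        import numpy as np

        def grnn_predict(X_train, y_train, X_query, sigma=1.0):
            """General Regression Neural Network (Specht, 1991): a kernel-
            weighted (Nadaraya-Watson) average of the training targets."""
            d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
            w = np.exp(-d2 / (2.0 * sigma ** 2))
            return (w @ y_train) / w.sum(axis=1)

        # Hypothetical use: predict next-day rainfall from the last 3 daily values.
        rng = np.random.default_rng(5)
        series = rng.gamma(0.3, 8.0, 500)          # synthetic daily rainfall (mm)
        lag = 3
        X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
        y = series[lag:]
        pred = grnn_predict(X[:-50], y[:-50], X[-50:], sigma=2.0)
        print("hold-out MAE ~", np.round(np.mean(np.abs(pred - y[-50:])), 2), "mm")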

  20. Flood Hazard Mapping Combining Hydrodynamic Modeling and Multi Annual Remote Sensing data

    Directory of Open Access Journals (Sweden)

    Laura Giustarini

    2015-10-01

    Full Text Available This paper explores a method to combine the time and space continuity of a large-scale inundation model with discontinuous satellite microwave observations, for high-resolution flood hazard mapping. The assumption behind this approach is that hydraulic variables computed from continuous spatially-distributed hydrodynamic modeling and observed as discrete satellite-derived flood extents are correlated in time, so that probabilities can be transferred from the model series to the observations. A prerequisite is, therefore, the existence of a significant correlation between a modeled variable (i.e., flood extent or volume) and the synchronously-observed flood extent. If this is the case, the availability of model simulations over a long time period allows for a robust estimate of non-exceedance probabilities that can be attributed to the corresponding synchronously-available satellite observations. The generated flood hazard map has a spatial resolution equal to that of the satellite images, which is higher than that of currently available large-scale inundation models. The method was applied on the Severn River (UK), using the outputs of a global inundation model provided by the European Centre for Medium-range Weather Forecasts and a large collection of ENVISAT ASAR imagery. A comparison between the hazard map obtained with the proposed method and with a more traditional numerical modeling approach supports the hypothesis that combining model results and satellite observations could provide advantages for high-resolution flood hazard mapping, provided that a sufficient number of remote sensing images is available and that a time correlation is present between variables derived from a global model and obtained from satellite observations.
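
    The probability transfer at the heart of this method can be sketched with an empirical CDF: the long modeled series defines non-exceedance probabilities, which are then attached to the modeled values on satellite overpass dates. The Python example below uses invented data and a daily return-period conversion purely for illustration.

        import numpy as np

        rng = np.random.default_rng(11)

        # Long daily series of modeled flood extent (km^2) from the global model.
        modeled = rng.gamma(2.0, 50.0, size=20 * 365)

        # Modeled extents on the few days with a satellite overpass.
        overpass = rng.choice(modeled.size, size=12, replace=False)

        # Non-exceedance probability of each observation = empirical CDF of the
        # long model series evaluated at the synchronous modeled value.
        sorted_model = np.sort(modeled)
        p = np.searchsorted(sorted_model, modeled[overpass],
                            side="right") / modeled.size

        for extent, prob in sorted(zip(modeled[overpass], p)):
            rp = 1.0 / (1.0 - prob) if prob < 1.0 else float("inf")
            print(f"extent {extent:6.1f} km^2 -> non-exceedance {prob:.3f} "
                  f"(~{rp:.0f}-day return period)")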

  1. Building a risk-targeted regional seismic hazard model for South-East Asia

    Science.gov (United States)

    Woessner, J.; Nyst, M.; Seyhan, E.

    2015-12-01

    The last decade has tragically shown the social and economic vulnerability of countries in South-East Asia to earthquake hazard and risk. While many disaster mitigation programs and initiatives to improve societal earthquake resilience are under way, with a focus on saving lives and livelihoods, the risk management sector is challenged to develop appropriate models to cope with the economic consequences and the impact on the insurance business. We present the source model and ground motion model components suitable for a South-East Asia earthquake risk model covering Indonesia, Malaysia, the Philippines and the countries of Indochina. The source model builds upon refined modelling approaches to characterize (1) seismic activity on crustal faults from geologic and geodetic data, (2) seismicity along the interface of subduction zones and within the slabs, and (3) earthquakes not occurring on mapped fault structures. We elaborate on building a self-consistent rate model for the hazardous crustal fault systems (e.g. the Sumatra fault zone, the Philippine fault zone) as well as the subduction zones, and showcase some characteristics and sensitivities, due to existing uncertainties in the rate and hazard space, using a well-selected suite of ground motion prediction equations. Finally, we analyze the source model by quantifying the contribution by source type (e.g., subduction zone, crustal fault) to typical risk metrics (e.g., return period losses, average annual loss) and reviewing their relative impact on various lines of business.

  2. [Critical of the additive model of the randomized controlled trial].

    Science.gov (United States)

    Boussageon, Rémy; Gueyffier, François; Bejan-Angoulvant, Theodora; Felden-Dominiak, Géraldine

    2008-01-01

    Randomized, double-blind, placebo-controlled clinical trials are currently the best way to demonstrate the clinical effectiveness of drugs. Its methodology relies on the method of difference (John Stuart Mill), through which the observed difference between two groups (drug vs placebo) can be attributed to the pharmacological effect of the drug being tested. However, this additive model can be questioned in the event of statistical interactions between the pharmacological and the placebo effects. Evidence in different domains has shown that the placebo effect can influence the effect of the active principle. This article evaluates the methodological, clinical and epistemological consequences of this phenomenon. Topics treated include extrapolating results, accounting for heterogeneous results, demonstrating the existence of several factors in the placebo effect, the necessity to take these factors into account for given symptoms or pathologies, as well as the problem of the "specific" effect.

  3. Combining multiple nondestructive inspection images with a generalized additive model

    International Nuclear Information System (INIS)

    In this paper, multiple nondestructive inspection (NDI) images are combined with a generalized additive model to achieve a more precise and reliable assessment of hidden corrosion in aircraft lap joints. Two inspection techniques are considered in this study. One is the conventional multi-frequency eddy current testing technique and the other is the pulsed eddy current technique. To characterize the thickness loss, or equivalently to achieve a quantitative measure of corrosion, multiple NDI images are fused to produce a thickness map that reflects the amount of corrosion damage. These results are further compared with corresponding digital x-ray thickness maps, which are obtained by mapping the remaining thickness after the specimen is disassembled and all the corrosion products are cleaned. Experimental results demonstrate that the proposed algorithms outperform the traditional calibration method aligned with a single testing approach

  4. Large Historical Earthquakes and Tsunami Hazards in the Western Mediterranean: Source Characteristics and Modelling

    Science.gov (United States)

    Harbi, Assia; Meghraoui, Mustapha; Belabbes, Samir; Maouche, Said

    2010-05-01

    The western Mediterranean region was the site of numerous large earthquakes in the past. Most of these earthquakes are located at the east-west trending Africa-Eurasia plate boundary and along the coastline of North Africa. The most recent recorded tsunamigenic earthquake occurred in 2003 at Zemmouri-Boumerdes (Mw 6.8) and generated a ~2-m-high tsunami wave. The destructive wave affected the Balearic Islands and Almeria in southern Spain and Carloforte in southern Sardinia (Italy). The earthquake provided a unique opportunity to gather instrumental records of seismic waves and tide gauges in the western Mediterranean. A database that includes a historical catalogue of main events, seismic sources and related fault parameters was prepared in order to assess the tsunami hazard of this region. In addition to the analysis of the 2003 records, we study the 1790 Oran and 1856 Jijel historical tsunamigenic earthquakes (Io = IX and X, respectively), which provide detailed observations on the heights and extension of past tsunamis and on the damage in coastal zones. We performed the modelling of wave propagation using the NAMI-DANCE code and tested different fault sources against synthetic tide gauges. We observe that the characteristics of the seismic sources control the size and directivity of tsunami wave propagation on both the northern and southern coasts of the western Mediterranean.

  5. Using the RBFN model and GIS technique to assess wind erosion hazards of Inner Mongolia, China

    Science.gov (United States)

    Shi, Huading; Liu, Jiyuan; Zhuang, Dafang; Hu, Yunfeng

    2006-08-01

    Soil wind erosion is the primary process of, and the main driving force for, land desertification and sand-dust storms in the arid and semi-arid areas of northern China, and many researchers have paid attention to this issue. This paper selects the Inner Mongolia autonomous region as the research area, quantifies the various indicators affecting soil wind erosion, uses GIS technology to extract the spatial data, and constructs an RBFN (Radial Basis Function Network) model for the assessment of wind erosion hazard. After training on the sample data of the different levels of wind erosion hazard, we obtain the parameters of the model and then assess the wind erosion hazard. The result shows that the wind erosion hazard is very severe in the southern parts of Inner Mongolia, varies from moderate to severe in the counties of the middle regions, and is slight in the east. The comparison of the result with other research shows that it is in conformity with actual conditions, demonstrating the reasonableness and applicability of the RBFN model.
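
    An RBFN of the kind described reduces to a Gaussian hidden layer plus a linear output layer, which can be trained in closed form by least squares. The Python sketch below classifies invented per-cell indicators into hazard levels; the indicators, targets, number of centers and kernel width are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(8)

        # Hypothetical per-cell indicators: wind speed, soil moisture, vegetation
        # cover, soil erodibility (rescaled to [0, 1]); target = hazard level 0-3.
        X = rng.uniform(0, 1, size=(300, 4))
        y = np.minimum(3, (2.5 * X[:, 0] + 1.5 * X[:, 3] - X[:, 2]
                           + rng.normal(0, 0.2, 300)).clip(0).round()).astype(int)

        # RBF network: Gaussian hidden layer on k centers, linear output layer
        # fitted by least squares (a common closed-form alternative to backprop).
        k, sigma = 25, 0.35
        centers = X[rng.choice(len(X), k, replace=False)]

        def hidden(Z):
            d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2 * sigma ** 2))

        W, *_ = np.linalg.lstsq(hidden(X), y, rcond=None)
        pred = hidden(X) @ W
        print("training accuracy ~", np.mean(pred.round().clip(0, 3) == y))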

  6. Versatility of cooperative transcriptional activation: a thermodynamical modeling analysis for greater-than-additive and less-than-additive effects.

    Directory of Open Access Journals (Sweden)

    Till D Frank

    Full Text Available We derive a statistical model of transcriptional activation using equilibrium thermodynamics of chemical reactions. We examine to what extent this statistical model predicts synergy effects of cooperative activation of gene expression. We determine parameter domains in which greater-than-additive and less-than-additive effects are predicted for cooperative regulation by two activators. We show that the statistical approach can be used to identify different causes of synergistic greater-than-additive effects: nonlinearities of the thermostatistical transcriptional machinery and three-body interactions between RNA polymerase and two activators. In particular, our model-based analysis suggests that at low transcription factor concentrations cooperative activation cannot yield synergistic greater-than-additive effects, i.e., DNA transcription can only exhibit less-than-additive effects. Accordingly, transcriptional activity turns from synergistic greater-than-additive responses at relatively high transcription factor concentrations into less-than-additive responses at relatively low concentrations. In addition, two types of re-entrant phenomena are predicted. First, our analysis predicts that under particular circumstances transcriptional activity will feature a sequence of less-than-additive, greater-than-additive, and eventually less-than-additive effects when for fixed activator concentrations the regulatory impact of activators on the binding of RNA polymerase to the promoter increases from weak, to moderate, to strong. Second, for appropriate promoter conditions when activator concentrations are increased then the aforementioned re-entrant sequence of less-than-additive, greater-than-additive, and less-than-additive effects is predicted as well. Finally, our model-based analysis suggests that even for weak activators that individually induce only negligible increases in promoter activity, promoter activity can exhibit greater-than-additive
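
    The equilibrium-statistical reasoning can be made concrete with a toy partition-function calculation. The Python sketch below computes the probability that RNA polymerase occupies a promoter in the presence of two activators with pairwise and three-body interaction weights, and classifies the joint response as greater- or less-than-additive; all weights and concentrations are invented, and the state enumeration is a simplification of the paper's model.

        import numpy as np

        def pol_occupancy(P, A1, A2, q1=5.0, q2=5.0, w=1.0):
            """Equilibrium probability that RNA polymerase is promoter-bound.
            P, A1, A2: statistical weights (concentration x binding constant);
            q1, q2: activator-polymerase couplings; w: three-body coupling."""
            z_off = (1 + A1) * (1 + A2)                    # polymerase absent
            z_on = P * (1 + q1 * A1) * (1 + q2 * A2)       # pairwise couplings
            z_on += P * A1 * A2 * q1 * q2 * (w - 1.0)      # extra triple term
            return z_on / (z_on + z_off)

        base = pol_occupancy(0.01, 0, 0)
        for A in (0.01, 1.0, 10.0):                        # low to high activator level
            d1 = pol_occupancy(0.01, A, 0) - base          # activator 1 alone
            d2 = pol_occupancy(0.01, 0, A) - base          # activator 2 alone
            d12 = pol_occupancy(0.01, A, A) - base         # joint, pairwise only
            d12w = pol_occupancy(0.01, A, A, w=5.0) - base # with three-body term
            kind = "greater" if d12 > d1 + d2 else "less"
            print(f"A={A:5.2f}: joint {d12:.4f} (w=5: {d12w:.4f}) "
                  f"vs additive {d1 + d2:.4f} -> {kind}-than-additive")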

  7. Time-aggregation effects on the baseline of continuous-time and discrete-time hazard models

    NARCIS (Netherlands)

    ter Hofstede, F.; Wedel, M.

    1999-01-01

    In this study we reinvestigate the effect of time-aggregation for discrete- and continuous-time hazard models. We reanalyze the results of a previous Monte Carlo study by ter Hofstede and Wedel (1998), in which the effects of time-aggregation on the parameter estimates of hazard models were investig

  8. Advances in National Capabilities for Consequence Assessment Modeling of Airborne Hazards

    Energy Technology Data Exchange (ETDEWEB)

    Nasstrom, J; Sugiyama, G; Foster, K; Larsen, S; Kosovic, B; Eme, B; Walker, H; Goldstein, P; Lundquist, J; Pobanz, B; Fulton, J

    2007-11-26

    This paper describes ongoing advancement of airborne hazard modeling capabilities in support of multiple agencies through the National Atmospheric Release Advisory Center (NARAC) and the Interagency Atmospheric Modeling and Atmospheric Assessment Center (IMAAC). A suite of software tools developed by Lawrence Livermore National Laboratory (LLNL) and collaborating organizations includes simple stand-alone, local-scale plume modeling tools for end users' computers, Web- and Internet-based software to access advanced 3-D flow and atmospheric dispersion modeling tools and expert analysis from the national center at LLNL, and state-of-the-science high-resolution urban models and event reconstruction capabilities.

  9. A Mathematical Model for the Industrial Hazardous Waste Location-Routing Problem

    Directory of Open Access Journals (Sweden)

    Omid Boyer

    2013-01-01

    Full Text Available Technological progress has caused industrial hazardous waste to increase throughout the world. Management of hazardous waste is a significant issue due to the risk imposed on the environment and on human life. This risk can result from the location of undesirable facilities and from the routing of hazardous waste. In this paper a biobjective mixed integer programming model for the location-routing of industrial hazardous waste is developed. The first objective is the minimization of total cost, including transportation cost, operation cost, initial investment cost, and the cost saving from selling recycled waste. The second objective is the minimization of transportation risk, where the risk of population exposure within a bandwidth along the route is used to measure transportation risk. This model can help decision makers to locate treatment, recycling, and disposal centers simultaneously and also to route waste between these facilities considering risk and cost criteria. The results of the solved problem demonstrate the conflict between the two objectives: it is possible to decrease the cost value by marginally increasing the transportation risk value and vice versa. A weighted sum method is utilized to combine the two objective functions into one objective function. To solve the problem, GAMS software with the CPLEX solver is used. The problem is applied in Markazi province in Iran.
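
    The abstract names GAMS/CPLEX, but the weighted-sum idea is easy to sketch with any MIP toolkit. The Python example below (assuming the open-source pulp package with its bundled CBC solver) builds a tiny, invented location-routing instance with binary opening decisions and continuous flows, and scalarizes cost and risk with a weight w; sweeping w over [0, 1] traces an approximation of the Pareto front.

        import pulp

        sources = ["s1", "s2", "s3"]
        sites = ["t1", "t2"]
        supply = {"s1": 40, "s2": 25, "s3": 35}              # tonnes of waste
        open_cost = {"t1": 500, "t2": 400}                   # facility opening cost
        ship_cost = {("s1", "t1"): 4, ("s1", "t2"): 7, ("s2", "t1"): 6,
                     ("s2", "t2"): 3, ("s3", "t1"): 5, ("s3", "t2"): 5}
        ship_risk = {("s1", "t1"): 9, ("s1", "t2"): 2, ("s2", "t1"): 4,
                     ("s2", "t2"): 6, ("s3", "t1"): 3, ("s3", "t2"): 8}

        w = 0.6                  # weight on cost; sweep w to trace the Pareto front
        prob = pulp.LpProblem("hazmat_location_routing", pulp.LpMinimize)
        y = pulp.LpVariable.dicts("open", sites, cat="Binary")
        x = pulp.LpVariable.dicts("flow", list(ship_cost), lowBound=0)

        cost = (pulp.lpSum(open_cost[t] * y[t] for t in sites)
                + pulp.lpSum(ship_cost[k] * x[k] for k in ship_cost))
        risk = pulp.lpSum(ship_risk[k] * x[k] for k in ship_risk)
        prob += w * cost + (1 - w) * risk

        for s in sources:                        # every source ships all its waste
            prob += pulp.lpSum(x[(s, t)] for t in sites) == supply[s]
        for s, t in ship_cost:                   # flows allowed only to open sites
            prob += x[(s, t)] <= supply[s] * y[t]

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print("open:", [t for t in sites if y[t].value() > 0.5],
              "| cost:", pulp.value(cost), "| risk:", pulp.value(risk))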

  10. Critical load analysis in hazard assessment of metals using a Unit World Model.

    Science.gov (United States)

    Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L

    2011-09-01

    A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), and the chemicals have a singular ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries for a UWM. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (K(d)) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations. Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic

  11. Techniques, advances, problems and issues in numerical modelling of landslide hazard

    CERN Document Server

    Van Asch, Theo; Van Beek, Ludovicus; Amitrano, David

    2007-01-01

    Slope movements (e.g. landslides) are dynamic systems that are complex in time and space and closely linked to both inherited and current preparatory and triggering controls. It is not yet possible to assess in all cases conditions for failure, reactivation and rapid surges and successfully simulate their transient and multi-dimensional behaviour and development, although considerable progress has been made in isolating many of the key variables and elementary mechanisms and to include them in physically-based models for landslide hazard assessments. Therefore, the objective of this paper is to review the state-of-the-art in the understanding of landslide processes and to identify some pressing challenges for the development of our modelling capabilities in the forthcoming years for hazard assessment. This paper focuses on the special nature of slope movements and the difficulties related to simulating their complex time-dependent behaviour in mathematical, physically-based models. It analyses successively th...

  12. A nonparametric dynamic additive regression model for longitudinal data

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas H.

    2000-01-01

    dynamic linear models, estimating equations, least squares, longitudinal data, nonparametric methods, partly conditional mean models, time-varying-coefficient models

  13. An Uncertain Wage Contract Model with Adverse Selection and Moral Hazard

    Directory of Open Access Journals (Sweden)

    Xiulan Wang

    2014-01-01

    it can be characterized as an uncertain variable. Moreover, the employee's effort is unobservable to the employer, and the employee can select her effort level to maximize her utility. Thus, an uncertain wage contract model with adverse selection and moral hazard is established to maximize the employer's expected profit. The model analysis mainly focuses on the equivalent form of the proposed wage contract model and the optimal solution to this form. The optimal solution indicates that both the employee's effort level and the wage increase with the employee's ability. Lastly, a numerical example is given to illustrate the effectiveness of the proposed model.

  14. Generalized Additive Modelling of Mixed Distribution Markov Models with Application to Melbourne's Rainfall.

    OpenAIRE

    Hyndman, R. J.; Grunwald, G. K.

    1999-01-01

    We consider modelling time series using a generalized additive model with first-order Markov structure and mixed transition density having a discrete component at zero and a continuous component with positive sample space. Such models have application, for example, in modelling daily occurrence and intensity of rainfall, and in modelling the number and size of insurance claims. We show how these methods extend the usual sinusoidal seasonal assumption in standard chain-dependent models by as...

  15. Data Model for Multi Hazard Risk Assessment Spatial Support Decision System

    Science.gov (United States)

    Andrejchenko, Vera; Bakker, Wim; van Westen, Cees

    2014-05-01

    The goal of the CHANGES Spatial Decision Support System is to support end-users in making decisions related to risk reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The implementation of the data model is carried out entirely with an open source database management system with a spatial extension. The web application is implemented using open source geospatial technologies with PostGIS as the database, Python for scripting, and Geoserver and javascript libraries for visualization and the client-side user-interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy and Poland are considered). Furthermore, the data model handles information about administrative units, projects accessible by different types of users, user-defined hazard types (floods, snow avalanches, debris flows, etc.), hazard intensity maps of different return periods, spatial probability maps, elements at risk maps (buildings, land parcels, linear features etc.), and economic and population vulnerability information dependent on the hazard type and the type of the element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own ones. Included in the model is the management of a combination of different scenarios (e.g. related to climate change, land use change or population change) and alternatives (possible risk-reduction measures), as well as data-structures for saving the calculated economic or population loss or exposure per element at risk, aggregation of the loss and exposure using the administrative unit maps, and finally, producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and multi-criteria evaluation (SMCE). The

  16. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    Science.gov (United States)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks, which are generally omitted from hazard assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates makes them difficult to forecast even on time-scales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations, and changes in rate can be detected by applying change point analysis in ETAS transformed time with methods already developed for Poisson processes.
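
    The clustering component referred to here is the ETAS conditional intensity, which is easy to write down explicitly. The Python sketch below evaluates a textbook ETAS rate for a toy catalogue; the parameter values (mu, K, alpha, c, p) are invented for illustration and are not those of the USGS model.

        import numpy as np

        def etas_intensity(t, cat_times, cat_mags, mu, K, alpha, c, p, m0=2.5):
            """ETAS conditional intensity lambda(t): background rate mu plus the
            Omori-Utsu aftershock contribution of every earlier event."""
            past = cat_times < t
            trig = (K * np.exp(alpha * (cat_mags[past] - m0))
                    / (t - cat_times[past] + c) ** p)
            return mu + trig.sum()

        # Toy catalogue: an M4.8 event at t = 10 days amid M~3 background events.
        times = np.array([2.0, 5.5, 10.0, 10.3, 11.2, 14.0])
        mags = np.array([3.0, 3.2, 4.8, 3.1, 3.4, 3.0])
        for t in (9.9, 10.1, 12.0, 30.0):
            lam = etas_intensity(t, times, mags, mu=0.2, K=0.05,
                                 alpha=1.5, c=0.01, p=1.1)
            print(f"day {t:5.1f}: lambda ~ {lam:.3f} events/day")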

  17. Percolation model with an additional source of disorder

    Science.gov (United States)

    Kundu, Sumanta; Manna, S. S.

    2016-06-01

    In reality, the transmission ranges of the mobiles in a mobile ad hoc network are not uniform. They are affected by temperature fluctuations in the air, obstruction by solid objects, even humidity differences in the environment, etc. How the varying transmission range of the individual active elements affects the global connectivity of the network may be an important practical question to ask. Here a model of percolation phenomena, with an additional source of disorder, is introduced for a theoretical understanding of this problem. As in ordinary percolation, sites of a square lattice are occupied randomly with probability p. Each occupied site is then assigned a circular disk of random value R for its radius. A bond is defined to be occupied if and only if the radii R1 and R2 of the disks centered at the ends satisfy a certain predefined condition. In a very general formulation, one divides the R1-R2 plane into two regions by an arbitrary closed curve. One defines a point within one region as representing an occupied bond; otherwise it is a vacant bond. The study of three different rules under this general formulation indicates that the percolation threshold always varies continuously. This threshold has two limiting values: one is pc(sq), the percolation threshold for ordinary site percolation on the square lattice, and the other is unity. The approach of the percolation threshold to its limiting values is characterized by two exponents. In a special case, all lattice sites are occupied by disks of random radii R ∈ {0, R0} and a percolation transition is observed with R0 as the control variable, similar to the site occupation probability.
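
    A minimal simulation of this disk-radius rule is sketched below in Python: sites are occupied with probability p, each carries a uniform random radius, a bond is occupied when R1 + R2 exceeds a threshold (one possible choice of the closed-curve condition), and union-find detects a spanning cluster. Lattice size, threshold and trial counts are arbitrary illustrative choices.

        import numpy as np

        def spans(L, p, rule, rng):
            """Site percolation on an L x L lattice: occupied sites carry a
            random radius in (0, 1]; a nearest-neighbour bond is occupied iff
            rule(R1, R2) holds. Returns True if a cluster spans top to bottom."""
            occ = rng.random((L, L)) < p
            R = rng.random((L, L))
            parent = list(range(L * L))

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]   # path halving
                    i = parent[i]
                return i

            for xx in range(L):
                for yy in range(L):
                    if not occ[xx, yy]:
                        continue
                    for dx, dy in ((1, 0), (0, 1)):
                        u, v = xx + dx, yy + dy
                        if u < L and v < L and occ[u, v] and rule(R[xx, yy], R[u, v]):
                            parent[find(xx * L + yy)] = find(u * L + v)

            top = {find(yy) for yy in range(L) if occ[0, yy]}
            bottom = {find((L - 1) * L + yy) for yy in range(L) if occ[L - 1, yy]}
            return bool(top & bottom)

        rng = np.random.default_rng(13)
        rule = lambda r1, r2: r1 + r2 >= 0.8   # one choice of closed-curve condition
        for p in (0.6, 0.75, 0.9):
            hits = sum(spans(64, p, rule, rng) for _ in range(20))
            print(f"p = {p:.2f}: spanning in {hits}/20 runs")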

  18. Model and Method for Multiobjective Time-Dependent Hazardous Material Transportation

    Directory of Open Access Journals (Sweden)

    Zhen Zhou

    2014-01-01

    Full Text Available In most hazardous material transportation problems, risk factors are assumed to be constant, which ignores the fact that they can vary with time throughout the day. In this paper, we deal with a novel time-dependent hazardous material transportation problem via lane reservation, in which the dynamic nature of transportation risk in the real-life traffic environment is taken into account. We first develop a multiobjective mixed integer programming (MIP) model with two conflicting objectives: minimizing the impact on normal traffic resulting from lane reservation and minimizing the total transportation risk. We then present a cut-and-solve based ε-constraint method to solve this model. Computational results indicate that our method outperforms the ε-constraint method based on the optimization software package CPLEX.

  19. The Impact of the Subduction Modeling Beneath Calabria on Seismic Hazard

    Science.gov (United States)

    Morasca, P.; Johnson, W. J.; Del Giudice, T.; Poggi, P.; Traverso, C.; Parker, E. J.

    2014-12-01

    The aim of this work is to better understand the influence of subduction beneath Calabria on seismic hazard, as very little is known about the present-day kinematics and the seismogenic potential of the slab interface in the Calabrian Arc region. This evaluation is significant because, depending on stress conditions, subduction zones can vary from being fully coupled to almost entirely decoupled, with important consequences for seismic hazard assessment. Although the debate is still open about the current kinematics of the plates and microplates lying in the region and the degree of coupling of the Ionian lithosphere beneath Calabria, GPS data suggest that this subduction is locked in its interface sector. The lack of instrumentally recorded thrust earthquakes also suggests this zone is locked. The current seismotectonic model developed for the Italian national territory is simplified in this area and does not reflect the possibility of locked subduction beneath Calabria that could produce infrequent but very large earthquakes associated with the subduction interface. Because of this, we have conducted an independent seismic source analysis to take into account the influence of subduction as part of a regional seismic hazard analysis. Our final model includes two separate provinces for the subduction beneath Calabria: inslab and interface. From a geometrical point of view, the interface province is modeled with a depth between 20 and 50 km and a dip of 20°, while the inslab one dips 70° between 50 and 100 km. Following recent interpretations, we take into account that the interface subduction is possibly locked and that, in such a case, large events could occur as characteristic earthquakes. The results of the PSHA show that the subduction beneath the Calabrian region has an influence on the total hazard for this region, especially for long return periods. Regional seismotectonic models for this region should account for subduction.

  20. Forward induced seismic hazard assessment: application to a synthetic seismicity catalogue from hydraulic stimulation modelling

    Science.gov (United States)

    Hakimhashemi, Amir Hossein; Yoon, Jeoung Seok; Heidbach, Oliver; Zang, Arno; Grünthal, Gottfried

    2014-07-01

    The Mw 3.2 induced seismic event in 2006 due to fluid injection at the Basel geothermal site in Switzerland was the starting point for an ongoing discussion in Europe on the potential risk of hydraulic stimulation in general. In particular, further development of mitigation strategies for induced seismic events of economic concern became a hot topic in geosciences and geoengineering. Here, we present a workflow to assess the hazard of induced seismicity in terms of the occurrence rate of induced seismic events. The workflow is called Forward Induced Seismic Hazard Assessment (FISHA) as it combines the results of forward hydromechanical-numerical models with methods of time-dependent probabilistic seismic hazard assessment. To exemplify FISHA, we use simulations of four different fluid injection types with various injection parameters, i.e. injection rate, duration and style of injection. The hydromechanical-numerical model applied in this study represents a geothermal reservoir with pre-existing fractures in which a routine of viscous fluid flow in porous media is implemented; flow- and pressure-driven failures of the rock matrix and pre-existing fractures are simulated, and the corresponding seismic moment magnitudes are computed. The resulting synthetic catalogues of induced seismicity, including event location, occurrence time and magnitude, are used to calibrate the magnitude of completeness Mc and the parameters a and b of the frequency-magnitude relation. These are used to estimate the time-dependent occurrence rate of induced seismic events for each fluid injection scenario. In contrast to other mitigation strategies that rely on real-time data or already obtained catalogues, we can perform various synthetic experiments with the same initial conditions. Thus, the advantage of FISHA is that it can quantify hazard from numerical experiments and recommend a priori a stimulation type that lowers the occurrence rate of induced seismic events. The FISHA workflow is rather
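    The calibration step named in the abstract (Mc, a and b of the frequency-magnitude relation) can be sketched in a few lines. This is a generic Gutenberg-Richter fit on a synthetic catalogue using Aki's maximum-likelihood b-value estimator, not the FISHA implementation; the completeness magnitude and observation window below are assumed values.

```python
# Minimal sketch: a- and b-value estimation from a synthetic catalogue, then
# conversion to an occurrence rate above a target magnitude.
import numpy as np

rng = np.random.default_rng(1)
m_c = 0.5                                   # magnitude of completeness (assumed known)
# synthetic catalogue with true b = 1 (exponential magnitudes above m_c)
mags = m_c + rng.exponential(scale=1 / (1.0 * np.log(10)), size=2000)
t_obs_years = 0.5                           # duration of the stimulation experiment (assumed)

b = np.log10(np.e) / (mags.mean() - m_c)    # Aki (1965) maximum-likelihood estimator
n_above_mc = len(mags)
a = np.log10(n_above_mc) + b * m_c          # a-value for the observation window

m_target = 3.0
rate = 10 ** (a - b * m_target) / t_obs_years   # expected events/yr with M >= m_target
print(f"b = {b:.2f}, a = {a:.2f}, rate(M>={m_target}) = {rate:.3f} per year")
```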

  1. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    OpenAIRE

    Álvarez-Gómez, J. A.; Í. Aniel-Quiroga; O. Q. Gutiérrez-Gutiérrez; J. Larreynaga; González, M.; M. Castro; F. Gavidia; Aguirre-Ayerbe, I.; P. González-Riancho; Carreño, E

    2013-01-01

    El Salvador is the smallest and most densely populated country in Central America; its coast is approximately 320 km long and includes 29 municipalities and more than 700 000 inhabitants. In El Salvador there have been 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and hundreds of victims. The hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached from both Probabilistic and D...

  2. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models.

    OpenAIRE

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; O. Q. Gutiérrez-Gutiérrez; J. Larreynaga; González, M.; M. Castro; F. Gavidia; Aguirre-Ayerbe, I.; P. González-Riancho; Carreño, E

    2013-01-01

    El Salvador is the smallest and most densely populated country in Central America; its coast, approximately 320 km long, includes 29 municipalities and more than 700 000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and resulting in hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and determinis...

  3. The timing of disability insurance application: a choice-based semiparametric hazard model

    OpenAIRE

    Richard V. Burkhauser; Butler, J. S.; Yang-Woo Kim

    1996-01-01

    We use a choice-based subsample of Social Security Disability Insurance applicants from the 1978 Social Security Survey of Disability and Work to test the importance of policy variables for the timing of application for disability insurance benefits following the onset of a work-limiting health condition. We correct for choice-based sampling by extending the Manski-Lerman (1977) correction to the likelihood function of our continuous-time hazard model defined with semiparametric unmeasured het...

  4. Objective assessment of source models for seismic hazard studies : with a worked example from UK data

    OpenAIRE

    R. M. W. Musson; Winter, P.W.

    2012-01-01

    Up to now, the search for increased reliability in probabilistic seismic hazard analysis (PSHA) has concentrated on ways of assessing expert opinion and subjective judgement. Although in some areas of PSHA subjective opinion is unavoidable, there is a danger that assessment procedures and review methods contribute further subjective judgements on top of those already elicited. It is helpful to find techniques for objectively assessing seismic source models that show what the interpretations p...

  5. Linear non-threshold (LNT) radiation hazards model and its evaluation

    International Nuclear Information System (INIS)

    To introduce the linear non-threshold (LNT) model used in studies of the dose effect of radiation hazards and to evaluate its application, a comprehensive literature analysis was carried out. The results show that the LNT model describes biological effects more accurately at high doses than at low doses. The repairable-conditionally repairable model of cell radiation effects can account well for cell survival curves across the full high, medium and low absorbed dose range. There are still many uncertainties in the assessment model of effective dose of internal radiation based on the LNT assumptions and individual mean organ equivalents, and it is necessary to establish a gender-specific voxel human model that takes gender differences into account. In summary, the advantages and disadvantages of the various models coexist. Until a new theory and a new model are established, the LNT model remains the most scientific stance. (author)
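    For concreteness, here is a minimal numeric sketch (not from the paper) of how an LNT dose-response differs from a threshold alternative at low dose; the risk coefficient alpha and threshold d0 are arbitrary placeholders.

```python
# Minimal sketch: linear non-threshold vs. threshold excess-risk models.
def lnt_excess_risk(dose_sv, alpha=0.05):
    # alpha: illustrative excess risk per Sv; linear with no threshold
    return alpha * dose_sv

def threshold_excess_risk(dose_sv, alpha=0.05, d0=0.1):
    # zero excess risk below an assumed threshold dose d0
    return alpha * max(dose_sv - d0, 0.0)

for d in (0.01, 0.1, 0.5, 1.0):   # doses in Sv
    print(d, lnt_excess_risk(d), threshold_excess_risk(d))
```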

  6. A seismic source zone model for the seismic hazard assessment of the Italian territory

    OpenAIRE

    Meletti, C.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Milano-Pavia, Milano, Italia; Galadini, F.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Milano-Pavia, Milano, Italia; Valensise, G.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Roma1, Roma, Italia; Stucchi, M.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Milano-Pavia, Milano, Italia; Basili, R.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Roma1, Roma, Italia; Barba, S.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Roma1, Roma, Italia; Vannucci, G.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Bologna, Bologna, Italia; Boschi, E.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione AC, Roma, Italia

    2008-01-01

    We designed a new seismic source model for Italy to be used as an input for country-wide probabilistic seismic hazard assessment (PSHA) in the frame of the compilation of a new national reference map. We started off by reviewing existing models available for Italy and for other European countries, then discussed the main open issues in the current practice of seismogenic zoning. The new model, termed ZS9, is largely based on data collected in the past 10 years, including historical eart...

  7. Checking Fine and Gray subdistribution hazards model with cumulative sums of residuals

    DEFF Research Database (Denmark)

    Li, Jianing; Scheike, Thomas; Zhang, Mei Jie

    2015-01-01

    estimation, and only a limited contribution has been made to check the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray’s model. The proposed goodness-of-fit test procedures are based on the cumulative sums...... of residuals, which validate the model in three aspects: (1) proportionality of the hazard ratio, (2) the linear functional form and (3) the link function. For each assumption test, we provide a p-value and a visualized plot against the null hypothesis using a simulation-based approach. We also consider...

  8. Avalanche Hazard Mapping with Satellite Data and a Digital Elevation Model

    Directory of Open Access Journals (Sweden)

    Urs Gruber

    1995-04-01

    Full Text Available Today avalanche hazard mapping is a very time-consuming affair. To map large remote areas, a method based on satellite imagery and a digital elevation model has been developed. For this purpose, two test sites in the Swiss Alps were selected. To simulate the avalanche hazard, the existing Salm-Voellmy model was adapted to the computer environment and extended to the characteristics of avalanches in forested terrain. The forests were classified with Landsat-TM data. So far, only a single forest class has been established. The separation of forest, shrub, and non-forested area along the timberline poses a problem. On the other hand, a classification of small openings and avalanche tracks within the forest could be achieved. A comparison with the existing avalanche cadastral map revealed that 85 per cent of the risk areas were correctly classified. However, the separation into the defined red and blue danger zones was not satisfactory. For the model's application to become operational, further improvements are needed. Nevertheless, the general approach is very promising, and should lead to more reliable hazard maps for planning purposes, as well as to new and better insights into the mutual effects between snow and forest.

  9. Seismic rupture modelling, strong motion prediction and seismic hazard assessment: fundamental and applied approaches

    International Nuclear Information System (INIS)

    The defence to obtain the 'Habilitation a Diriger des Recherches' is a synthesis of the research work performed since the end of my PhD thesis in 1997. This synthesis covers the two years as a post-doctoral researcher at the Bureau d'Evaluation des Risques Sismiques at the Institut de Protection (BERSSIN), and the seven consecutive years as seismologist and head of the BERSSIN team. This work and the research project are presented in the framework of the seismic risk topic, and particularly with respect to seismic hazard assessment. Seismic risk combines seismic hazard and vulnerability. Vulnerability combines the strength of building structures and the human and economic consequences in case of structural failure. Seismic hazard is usually defined in terms of plausible seismic motion (soil acceleration or velocity) at a site for a given time period. Whether for the regulatory context or for structural specificity (conventional structures or high-risk constructions), seismic hazard assessment needs: to identify and locate the seismic sources (zones or faults), to characterize their activity, and to evaluate the seismic motion to which the structure has to resist (including site effects). I specialized in the field of numerical strong-motion prediction using high-frequency seismic source modelling, and being part of IRSN allowed me to work rapidly on the different tasks of seismic hazard assessment. Thanks to expertise practice and participation in the evolution of regulations (nuclear power plants, conventional and chemical structures), I have been able to work on empirical strong-motion prediction, including site effects. Specific questions related to the interface between seismologists and structural engineers are also presented, especially the quantification of uncertainties. This is part of the research work initiated to improve the selection of the input ground motion when designing or verifying the stability of structures. (author)

  10. Large scale debris-flow hazard assessment: a geotechnical approach and GIS modelling

    Directory of Open Access Journals (Sweden)

    G. Delmonaco

    2003-01-01

    Full Text Available A deterministic distributed model has been developed for large-scale debris-flow hazard analysis in the basin of the River Vezza (Tuscany Region, Italy). This area (51.6 km²) was affected by over 250 landslides, classified as debris/earth flows, mainly involving the metamorphic geological formations outcropping in the area and triggered by the pluviometric event of 19 June 1996. In recent decades landslide hazard and risk analysis have been favoured by the development of GIS techniques permitting the generalisation, synthesis and modelling of stability conditions in large-scale investigations (>1:10 000). In this work, the main results derived from the application of a geotechnical model coupled with a hydrological model for debris-flow hazard assessment are reported. The analysis was developed through the following steps: landslide inventory map derived from aerial photo interpretation and direct field survey; generation of a database and digital maps; elaboration of a DTM and derived themes (i.e. slope angle map); definition of a superficial soil thickness map; geotechnical soil characterisation through back-analysis on test slopes and laboratory test analysis; inference of the influence of precipitation, for distinct return periods, on ponding time and pore pressure generation; implementation of a slope stability model (infinite slope model); and generalisation of the safety factor for estimated rainfall events with different return periods. This approach has allowed the identification of potential source areas of debris-flow triggering for precipitation events with estimated return periods of 10, 50, 75 and 100 years. The model shows a dramatic decrease in safety conditions for the simulation related to a 75-year return period rainfall event, corresponding to an estimated cumulative daily intensity of 280–330 mm. This value can be considered the hydrological triggering
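    The slope stability component named in this abstract is the classical infinite slope model, which is short enough to sketch. The soil parameters below are illustrative placeholders, not the calibrated Vezza basin values; m is the pore-pressure ratio h_w/z that rises as rainfall accumulates.

```python
# Minimal sketch of the infinite-slope factor of safety:
# FS = [c' + (gamma - m*gamma_w) * z * cos^2(beta) * tan(phi')] / [gamma * z * sin(beta) * cos(beta)]
import numpy as np

def factor_of_safety(slope_deg, z=2.0, c=5e3, phi_deg=30.0,
                     gamma=18e3, gamma_w=9.81e3, m=0.5):
    """Infinite-slope FS with partial saturation m = h_w / z (all units SI)."""
    beta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    num = c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)
    den = gamma * z * np.sin(beta) * np.cos(beta)
    return num / den

for m in (0.0, 0.5, 1.0):   # rising pore pressure as rainfall accumulates
    print(m, round(float(factor_of_safety(35.0, m=m)), 2))   # FS < 1 => failure
```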

  11. Progress in NTHMP Hazard Assessment

    Science.gov (United States)

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents that are at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  12. Perspectives on open access high resolution digital elevation models to produce global flood hazard layers

    Directory of Open Access Journals (Sweden)

    Christopher Charles Sampson

    2016-01-01

    Full Text Available Global flood hazard models have recently become a reality thanks to the release of open access global digital elevation models, the development of simplified and highly efficient flow algorithms, and the steady increase in computational power. In this commentary we argue that although the availability of open access global terrain data has been critical in enabling the development of such models, the relatively poor resolution and precision of these data now significantly limit our ability to estimate flood inundation and risk for the majority of the planet’s surface. The difficulty of deriving an accurate ‘bare-earth’ terrain model, due to the interaction of vegetation and urban structures with the satellite-based remote sensors, means that global terrain data are often poorest in the areas where people, property (and thus vulnerability) are most concentrated. Furthermore, the current generation of open access global terrain models are over a decade old, and many large floodplains, particularly those in developing countries, have undergone significant change in this time. There is therefore a pressing need for a new generation of high resolution and high vertical precision open access global digital elevation models to allow significantly improved global flood hazard models to be developed.

  13. Concentration addition, independent action and generalized concentration addition models for mixture effect prediction of sex hormone synthesis in vitro

    DEFF Research Database (Denmark)

    Hadrup, Niels; Taxvig, Camilla; Pedersen, Mikael;

    2013-01-01

    of the concentration addition (CA), independent action (IA) and generalized concentration addition (GCA) models. First we measured effects of single chemicals and mixtures thereof on steroid synthesis in H295R cells. Then single chemical data were applied to the models; predictions of mixture effects were calculated......, for which the models could not be applied, was identified. In addition, the data indicate that in non-potency adjusted mixtures the effects cannot always be accounted for by single chemicals.......Humans are concomitantly exposed to numerous chemicals. An infinite number of combinations and doses thereof can be imagined. For toxicological risk assessment the mathematical prediction of mixture effects, using knowledge on single chemicals, is therefore desirable. We investigated pros and cons...
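    The CA and IA predictions themselves are compact formulas. The sketch below (placeholder EC50s and mixture fractions, not the H295R data of the study) computes the CA-predicted mixture EC50, ECx,mix = (Σ pi/ECx,i)⁻¹, and an IA response assuming a unit-slope Hill curve for each chemical.

```python
# Minimal sketch: concentration addition (CA) and independent action (IA)
# mixture predictions from single-chemical potencies (illustrative values).
import numpy as np

ec50 = np.array([2.0, 10.0, 50.0])      # individual EC50s (uM), placeholders
frac = np.array([0.2, 0.3, 0.5])        # mixture fractions, summing to 1

ec50_mix_ca = 1.0 / np.sum(frac / ec50) # CA-predicted mixture EC50

def hill(c, ec):
    # fractional effect of one chemical, assumed unit-slope Hill curve
    return c / (c + ec)

c_mix = 5.0                             # total mixture concentration
# IA: combine effects by response multiplication of the components
e_ia = 1.0 - np.prod(1.0 - hill(c_mix * frac, ec50))

print(round(float(ec50_mix_ca), 3), round(float(e_ia), 3))
```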

  14. Additive interaction in survival analysis

    DEFF Research Database (Denmark)

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise;

    2012-01-01

    is the relevant measure of interest. Multiplicative survival models, such as the Cox proportional hazards model, are often used to estimate the association between exposure and risk of disease in prospective studies. In Cox models, deviations from additivity have usually been assessed by surrogate measures...... of additive interaction derived from multiplicative models-an approach that is both counter-intuitive and sometimes invalid. This paper presents a straightforward and intuitive way of assessing deviation from additivity of effects in survival analysis by use of the additive hazards model. The model directly...... estimates the absolute size of the deviation from additivity and provides confidence intervals. In addition, the model can accommodate both continuous and categorical exposures and models both exposures and potential confounders on the same underlying scale. To illustrate the approach, we present...
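    A minimal sketch of fitting an additive hazards (Aalen) model in Python follows, assuming the lifelines package; the data frame and its columns are simulated stand-ins. The slope of a cumulative regression coefficient is the covariate's absolute (additive) effect on the hazard, which is what makes deviation from additivity directly interpretable on the hazard scale.

```python
# Minimal sketch: Aalen additive hazards model on simulated survival data.
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "exposure": rng.integers(0, 2, n),   # binary exposure of interest (toy)
    "age": rng.normal(60, 10, n),
    "T": rng.exponential(10, n),          # follow-up time (toy data)
    "E": rng.integers(0, 2, n),           # event indicator (toy data)
})

aaf = AalenAdditiveFitter(coef_penalizer=0.1)
aaf.fit(df, duration_col="T", event_col="E")
# Cumulative regression coefficients: slopes are additive effects on the
# hazard, i.e. absolute rate differences in events per unit time.
print(aaf.cumulative_hazards_.tail())
```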

  15. Modeling continuous self-report measures of perceived emotion using generalized additive mixed models.

    Science.gov (United States)

    McKeown, Gary J; Sneddon, Ian

    2014-03-01

    Emotion research has long been dominated by the "standard method" of displaying posed or acted static images of facial expressions of emotion. While this method has been useful, it is unable to investigate the dynamic nature of emotion expression. Although continuous self-report traces have enabled the measurement of dynamic expressions of emotion, a consensus has not been reached on the correct statistical techniques that permit inferences to be made with such measures. We propose generalized additive models and generalized additive mixed models as techniques that can account for the dynamic nature of such continuous measures. These models allow us to hold constant shared components of responses that are due to perceived emotion across time, while enabling inference concerning linear differences between groups. The generalized additive mixed model approach is preferred, as it can account for autocorrelation in time series data and allows emotion decoding participants to be modeled as random effects. To increase confidence in linear differences, we assess the methods that address interactions between categorical variables and dynamic changes over time. In addition, we provide comments on the use of generalized additive models to assess the effect size of shared perceived emotion and discuss sample sizes. Finally, we address additional uses, the inference of feature detection, continuous variable interactions, and measurement of ambiguity.
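    A minimal sketch of the GAM part of this approach follows, assuming the pyGAM package: a smooth shared time trend plus a linear group term. The mixed-model (GAMM) extension with random effects per decoding participant would need other tooling (e.g. R's mgcv); the traces below are simulated, not the authors' rating data.

```python
# Minimal sketch: GAM with a smooth time trend and a linear group difference.
import numpy as np
from pygam import LinearGAM, s, l

rng = np.random.default_rng(3)
t = np.tile(np.linspace(0, 10, 200), 2)                 # time within clip
group = np.repeat([0, 1], 200)                          # two decoder groups
y = np.sin(t) + 0.3 * group + rng.normal(0, 0.2, 400)   # shared trend + offset

X = np.column_stack([t, group])
gam = LinearGAM(s(0) + l(1)).fit(X, y)   # s(): smooth of time, l(): linear group term
gam.summary()                            # inspect the fitted terms
```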

  16. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    Science.gov (United States)

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to the 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, implant design and simulation, showing the potential of FDM technology in the medical field. It will also improve communication between medical staff and patients. The current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models.
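    A minimal sketch of the volume-to-STL step of such a pipeline follows (the paper used the InVesalius program; the sketch instead assumes a CT volume already assembled into a NumPy array, plus scikit-image and numpy-stl): threshold near bone density and write a printable mesh. The file names and the 300 HU level are illustrative assumptions.

```python
# Minimal sketch: extract a bone isosurface from a CT volume and save an STL.
import numpy as np
from skimage import measure
from stl import mesh  # numpy-stl package

volume = np.load("ct_volume.npy")      # 3-D array of Hounsfield units (placeholder path)
verts, faces, _, _ = measure.marching_cubes(volume, level=300)  # ~bone threshold in HU

skull = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
for i, f in enumerate(faces):
    skull.vectors[i] = verts[f]        # each triangle is 3 vertex coordinates
skull.save("skull.stl")                # ready for slicing and FDM printing
```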

  17. Richly parameterized linear models additive, time series, and spatial models using random effects

    CERN Document Server

    Hodges, James S

    2013-01-01

    A First Step toward a Unified Theory of Richly Parameterized Linear ModelsUsing mixed linear models to analyze data often leads to results that are mysterious, inconvenient, or wrong. Further compounding the problem, statisticians lack a cohesive resource to acquire a systematic, theory-based understanding of models with random effects.Richly Parameterized Linear Models: Additive, Time Series, and Spatial Models Using Random Effects takes a first step in developing a full theory of richly parameterized models, which would allow statisticians to better understand their analysis results. The aut

  18. Mathematical Decision Models Applied for Qualifying and Planning Areas Considering Natural Hazards and Human Dealing

    Science.gov (United States)

    Anton, Jose M.; Grau, Juan B.; Tarquis, Ana M.; Sanchez, Elena; Andina, Diego

    2014-05-01

    The authors have been involved in the use of Mathematical Decision Models (MDM) to improve knowledge and planning for large natural or administrative areas in which natural soils, climate, and agricultural and forest uses were the main factors, but where human resources and outcomes were also important and natural hazards relevant. In one line of work they contributed to the qualification of lands of the Community of Madrid (CM), an administrative area in the centre of Spain containing a band of mountains to the north, part of the Iberian plateau and river terraces in the centre, and also the Madrid metropolis, building on an official UPM study for the CM that qualified lands using a FAO model requiring minimums across a whole set of Soil Science criteria. From these criteria the authors first constructed a complementary additive qualification, and later attempted an intermediate qualification combining both using fuzzy logic. The authors were also involved, together with colleagues from Argentina and elsewhere who are in contact with local planners, in the delimitation of regions and the selection of management entities for them. At these general levels they adopted multi-criteria MDM, using a weighted PROMETHEE, and also ELECTRE-I with the same elicited weights for the criteria and data, alongside AHP using Expert Choice with pairwise comparisons among similar criteria structured in two levels. The alternatives depend on the case study, and these areas with monsoon climates face natural hazards that are decisive for their selection and qualification, entering the initial matrix used for ELECTRE and PROMETHEE. For the natural area of Arroyos Menores south of Rio Cuarto town, with the La Colacha subarea to the north, the loess lands are rich but now suffer from water erosion forming regressive gullies that are spoiling them, and land-use alternatives must consider Soil Conservation and Hydraulic Management actions. The soils may be used in diverse, mutually incompatible ways, such as autochthonous forest, high-value forest, traditional
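    A minimal sketch of a weighted PROMETHEE-II ranking with the simple "usual" preference function follows; the score matrix and weights are illustrative, not the values elicited in the study.

```python
# Minimal sketch: weighted PROMETHEE-II net outranking flows.
import numpy as np

# rows: alternatives (e.g. autochthonous forest, high-value forest, crops)
# cols: criteria scores, all assumed already re-scaled so higher is better
scores = np.array([[7.0, 5.0, 8.0],
                   [6.0, 8.0, 4.0],
                   [9.0, 3.0, 6.0]])
w = np.array([0.5, 0.3, 0.2])              # elicited criterion weights (sum to 1)

n = scores.shape[0]
pref = np.zeros((n, n))                    # aggregated preference of a over b
for a in range(n):
    for b in range(n):
        if a != b:
            # "usual" criterion: preference 1 per criterion where a beats b
            pref[a, b] = w @ (scores[a] > scores[b])

phi_plus = pref.sum(axis=1) / (n - 1)      # leaving flow
phi_minus = pref.sum(axis=0) / (n - 1)     # entering flow
phi = phi_plus - phi_minus                 # PROMETHEE-II net flow
print(np.argsort(-phi))                    # ranking of alternatives, best first
```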

  19. Primary circuit iodine model addition to IMPAIR-3

    Energy Technology Data Exchange (ETDEWEB)

    Osetek, D.J.; Louie, D.L.Y. [Los Alamos Technical Associates, Inc., Albuquerque, NM (United States); Guntay, S.; Cripps, R. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1996-12-01

    As part of a continuing effort to provide the U.S. Department of Energy (DOE) Advanced Reactor Severe Accident Program (ARSAP) with complete iodine analysis capability, a task was undertaken to expand the modeling of IMPAIR-3, an iodine chemistry code. The expanded code will enable the DOE to include detailed iodine behavior in the assessment of severe accident source terms used in the licensing of U.S. Advanced Light Water Reactors (ALWRs). IMPAIR-3 was developed at the Paul Scherrer Institute (PSI), Switzerland, and has been used by ARSAP for the past two years to analyze containment iodine chemistry for ALWR source term analyses. IMPAIR-3 is primarily a containment code, but the iodine chemistry inside the primary circuit (the Reactor Coolant System, or RCS) may influence the iodine species released into the containment; therefore, an RCS iodine chemistry model must be implemented in IMPAIR-3 to ensure thorough source term analysis. The ARSAP source term team and the PSI IMPAIR-3 developers are working together to accomplish this task. This cooperation is divided into two phases. Phase I, taking place in 1996, involves developing a stand-alone RCS iodine chemistry program called IMPRCS (IMPAIR - Reactor Coolant System). This program models a number of the chemical and physical processes of iodine that are thought to be important at conditions of high temperature and pressure in the RCS. In Phase II, which is tentatively scheduled for 1997, IMPRCS will be implemented as a subroutine in IMPAIR-3. To ensure an efficient calculation, an interface/tracking system will be developed to control the use of the RCS model from the containment model. These two models will be interfaced in such a way that once the iodine is released from the RCS, it will no longer be tracked by the RCS model but will be tracked by the containment model. All RCS thermal-hydraulic parameters will be provided by other codes. (author) figs., tabs., refs.

  20. MODELING OF THE BUILDING LOCAL PROTECTION (SHELTER – IN PLACE INCLUDING SORBTION OF THE HAZARDOUS CONTAMINANT ON INDOOR SURFACES

    Directory of Open Access Journals (Sweden)

    N. N. Belyayev

    2014-05-01

    Full Text Available Purpose. Chemically hazardous facilities, where toxic substances are used, manufactured and stored, as well as transport routes on which hazardous materials are carried, are potential sources of accidental atmospheric pollution. The purpose is the development of a CFD model for evaluating the efficiency of local protection of a building against the ingress of hazardous substances by using an air curtain, accounting for sorption/desorption of the hazardous substance on indoor surfaces. Methodology. To solve the problem of the hydrodynamic interaction of the air curtain with the wind flow, and to account for the influence of the building on this process, an ideal fluid model is used. To calculate the transport of the hazardous substance in the atmosphere, an equation of convective-diffusive transport of the contaminant is applied. To calculate indoor air pollution under infiltration of contaminated air, the Karlsson & Huber model is used; this model takes into account the sorption of the hazardous substance on various indoor surfaces. For the numerical integration of the model equations, finite-difference methods are used. Findings. An efficient CFD model for evaluating the effectiveness of protecting buildings against the ingress of hazardous substances through the use of an air curtain has been constructed. On the basis of this model, a computational experiment was carried out to assess the effectiveness of this protection method while varying the location of the air curtain relative to the building. Originality. A new model was developed to compute the effectiveness of an air curtain in reducing the toxic chemical concentration inside a building. Practical value. The developed model can be used for the design of local protection of buildings against the ingress of hazardous substances.
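    A one-dimensional toy version of the transport-with-sorption calculation (not the paper's CFD model) can illustrate the interplay of the convective-diffusive equation and the indoor sorption sink; all coefficients below are invented, and an explicit upwind finite-difference scheme is used.

```python
# Minimal sketch: 1-D advection-diffusion of a contaminant with a first-order
# sorption sink on indoor surfaces, explicit finite differences.
import numpy as np

nx, dx = 100, 0.1                    # 10 m domain
u, D, k_sorb = 0.05, 1e-3, 5e-4      # advection (m/s), diffusion (m^2/s), sorption (1/s)
dt = min(0.4 * dx**2 / D, 0.5 * dx / u)   # respect diffusion and CFL limits
c = np.zeros(nx)
c[0] = 1.0                           # contaminated air entering at the opening

for _ in range(20000):
    adv = -u * (c[1:-1] - c[:-2]) / dx                   # upwind advection (u > 0)
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2     # diffusion
    c[1:-1] += dt * (adv + dif - k_sorb * c[1:-1])       # sorption acts as a sink
    c[0], c[-1] = 1.0, c[-2]                             # inlet / zero-gradient outlet

print(c[::10].round(3))              # quasi-steady indoor concentration profile
```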

  1. Modelling dissimilarity: generalizing ultrametric and additive tree representations.

    Science.gov (United States)

    Hubert, L; Arabie, P; Meulman, J

    2001-05-01

    Methods for the hierarchical clustering of an object set produce a sequence of nested partitions such that object classes within each successive partition are constructed from the union of object classes present at the previous level. Any such sequence of nested partitions can in turn be characterized by an ultrametric. An approach to generalizing an (ultrametric) representation is proposed in which the nested character of the partition sequence is relaxed and replaced by the weaker requirement that the classes within each partition contain objects consecutive with respect to a fixed ordering of the objects. A method for fitting such a structure to a given proximity matrix is discussed, along with several alternative strategies for graphical representation. Using this same ultrametric extension, additive tree representations can also be generalized by replacing the ultrametric component in the decomposition of an additive tree (into an ultrametric and a centroid metric). A common numerical illustration is developed and maintained throughout the paper. PMID:11393895
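    The link between a dendrogram and an ultrametric is easy to demonstrate: the cophenetic distance of a hierarchical clustering is exactly a fitted ultrametric. Below is a minimal SciPy sketch on random data; the generalized, consecutive-class structures of the paper are not reproduced.

```python
# Minimal sketch: hierarchical clustering, its cophenetic (ultrametric)
# distances, and a check of the strong (ultrametric) triangle inequality.
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(4)
X = rng.normal(size=(8, 3))
d = pdist(X)                           # observed proximities
Z = linkage(d, method="average")       # nested partition sequence
c, coph = cophenet(Z, d)               # c: cophenetic correlation with the data

U = squareform(coph)                   # ultrametric matrix of merge heights
i, j, k = 0, 1, 2                      # d(i,k) <= max(d(i,j), d(j,k)) must hold
assert U[i, k] <= max(U[i, j], U[j, k]) + 1e-12
print(round(float(c), 3))
```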

  2. Non-additive model for specific heat of electrons

    Science.gov (United States)

    Anselmo, D. H. A. L.; Vasconcelos, M. S.; Silva, R.; Mello, V. D.

    2016-10-01

    By using the non-additive Tsallis entropy we demonstrate numerically that one-dimensional quasicrystals, whose energy spectra are multifractal Cantor sets, are characterized by an entropic parameter, and we calculate the electronic specific heat considering a non-additive entropy Sq. In our method we consider an energy spectrum calculated using the one-dimensional tight-binding Schrödinger equation, and its bands (or levels) are scaled onto the [0, 1] interval. The Tsallis formalism is applied to the energy spectra of Fibonacci and double-period one-dimensional quasiperiodic lattices. We analytically obtain an expression for the specific heat that we consider to be more appropriate to calculate this quantity in those quasiperiodic structures.
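    A minimal sketch of the non-additive Tsallis entropy Sq over a discrete set of level probabilities follows, showing the q → 1 recovery of the Boltzmann-Gibbs form; it does not reproduce the tight-binding spectra of the paper.

```python
# Minimal sketch: Tsallis entropy S_q = (1 - sum(p^q)) / (q - 1).
import numpy as np

def tsallis_entropy(p, q):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))        # Boltzmann-Gibbs limit as q -> 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.full(8, 1 / 8)                         # uniform level occupation (toy)
for q in (0.5, 0.9, 1.0, 1.1, 2.0):
    print(q, round(float(tsallis_entropy(p, q)), 4))
```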

  3. Energy consumption model of Binder-jetting additive manufacturing processes

    OpenAIRE

    Xu, Xin; METEYER, Simon; PERRY, Nicolas; ZHAO, Yaoyao Fiona

    2014-01-01

    Considering the potential for new product design possibilities and the reduction of environmental impacts, Additive Manufacturing (AM) processes are considered to possess significant advantages for automotive, aerospace and medical equipment industries. One of the commercial AM techniques is Binder-Jetting (BJ). This technique can be used to process a variety of materials including stainless steel, ceramic, polymer and glass. However, there is very limited research about this AM technology on...

  4. Measurement Error in Proportional Hazards Models for Survival Data with Long-term Survivors

    Institute of Scientific and Technical Information of China (English)

    Xiao-bing ZHAO; Xian ZHOU

    2012-01-01

    This work studies a proportional hazards model for survival data with "long-term survivors", in which covariates are subject to linear measurement error. It is well known that the naïve estimators from both partial and full likelihood methods are inconsistent under this measurement error model. For measurement error models, methods of unbiased estimating function and corrected likelihood have been proposed in the literature. In this paper, we apply the corrected partial and full likelihood approaches to estimate the model and obtain statistical inference from survival data with long-term survivors. The asymptotic properties of the estimators are established. Simulation results illustrate that the proposed approaches provide useful tools for the models considered.

  5. Additional Research Needs to Support the GENII Biosphere Models

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen

    2013-11-30

    In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved both to accommodate the locally significant pathways identified and also to incorporate newer models. The areas are general data needs for the existing models and improved formulations for the pathway models. It is recommended that priorities be set by NRC staff to guide selection of the most useful improvements in a cost-effective manner. Suggestions are made based on relatively easy and inexpensive changes, and longer-term more costly studies. In the short term, there are several improved model formulations that could be applied to the GENII suite of codes to make them more generally useful. • Implementation of the separation of the translocation and weathering processes • Implementation of an improved model for carbon-14 from non-atmospheric sources • Implementation of radon exposure pathways models • Development of a KML processor for the output report generator module data that are calculated on a grid that could be superimposed upon digital maps for easier presentation and display • Implementation of marine mammal models (manatees, seals, walrus, whales, etc.). Data needs in the longer term require extensive (and potentially expensive) research. Before picking any one radionuclide or food type, NRC staff should perform an in-house review of current and anticipated environmental analyses to select “dominant” radionuclides of interest to allow setting of cost-effective priorities for radionuclide- and pathway-specific research. These include • soil-to-plant uptake studies for oranges and other citrus fruits, and • Development of models for evaluation of radionuclide concentration in highly-processed foods such as oils and sugars. Finally, renewed

  6. The impact of hazardous industrial facilities on housing prices: A comparison of parametric and semiparametric hedonic price models

    DEFF Research Database (Denmark)

    Grislain-Letrémy, Céline; Katossky, Arthur

    2014-01-01

    The willingness of households to pay for prevention against industrial risks can be revealed by real estate markets. By using very rich microdata, we study housing prices in the vicinity of hazardous industries near three important French cities. We show that the impact of hazardous plants...... on the housing values strongly differs among these three areas, even if the areas all surround chemical and petrochemical industries. We compare the results from both standard parametric and more flexible, semiparametric models of hedonic property. We show that the parametric model might structurally lead...... to important biases in the estimated value of the impact of hazardous plants on housing values....

  7. Web-based Services for Earth Observing and Model Data in National Applications and Hazards

    Science.gov (United States)

    Kafatos, M.; Boybeyi, Z.; Cervone, G.; di, L.; Sun, D.; Yang, C.; Yang, R.

    2005-12-01

    The ever-growing large volumes of Earth system science data, collected by Earth observing platforms and in situ stations and produced as model output, are increasingly being used by discipline scientists and by wider classes of users. In particular, applications of Earth system science data to environmental and hazards as well as other national applications require tailored or specialized data, as well as web-based tools and infrastructure. The latter are driven by applications and usage drivers which include ease of access, visualization of complex data, ease of producing value-added data, GIS and open source analysis usage, metadata, etc. Here we present different aspects of such web-based services and access, and discuss several applications in the hazards and environmental areas, including earthquake signatures and observations and model runs of hurricanes. Examples and lessons learned from the consortium Mid-Atlantic Geospatial Information Consortium will be presented. We discuss a NASA-funded, open source on-line data analysis system that is being applied to climate studies for the ESIP Federation. Since enhanced, this project and the next-generation Metadata Integrated Data Analysis System allow users not only to identify data but also to generate new data products on-the-fly. The functionalities extend from limited predefined functions to sophisticated functions described by general-purpose GrADS (Grid Analysis and Display System) commands. The Federation system also allows third-party data products to be combined with local data. Software components are available for converting the output from MIDAS (OPeNDAP) into OGC-compatible software. The on-going Grid efforts at CEOSR and LAITS in the School of Computational Sciences (SCS) include enhancing the functions of Globus to provide support for a geospatial system so the system can share the computing power to handle problems with different peak access times and improve the stability and flexibility of a rapid

  8. TRENT2D WG: a smart web infrastructure for debris-flow modelling and hazard assessment

    Science.gov (United States)

    Zorzi, Nadia; Rosatti, Giorgio; Zugliani, Daniel; Rizzi, Alessandro; Piffer, Stefano

    2016-04-01

    Mountain regions are naturally exposed to geomorphic flows, which involve large amounts of sediment and induce significant morphological modifications. The physical complexity of this class of phenomena represents a challenging issue for modelling, leading to elaborate theoretical frameworks and sophisticated numerical techniques. In general, geomorphic-flow models have proved to be valid tools in hazard assessment and management. However, model complexity seems to represent one of the main obstacles to the diffusion of advanced modelling tools among practitioners and stakeholders, although the EU Flood Directive (2007/60/EC) requires risk management and assessment to be based on "best practices and best available technologies". Furthermore, several cutting-edge models are not particularly user-friendly, and multiple stand-alone software tools are needed to pre- and post-process modelling data. For all these reasons, users often resort to quicker and rougher approaches, possibly leading to unreliable results. Therefore, some effort seems necessary to overcome these drawbacks, with the purpose of supporting and encouraging a widespread diffusion of the most reliable, although sophisticated, modelling tools. With this aim, this work presents TRENT2D WG, a new smart modelling solution for the state-of-the-art model TRENT2D (Armanini et al., 2009, Rosatti and Begnudelli, 2013), which simulates debris flows and hyperconcentrated flows adopting a two-phase description over a mobile bed. TRENT2D WG is a web infrastructure combining the advantages offered by the Software as a Service (SaaS) delivery model and by WebGIS technology, and hosting a complete and user-friendly working environment for modelling. In order to develop TRENT2D WG, the model TRENT2D was converted into a service and exposed on a cloud server, transferring computational burdens from the user hardware to a high-performing server and reducing computational time. Then, the system was equipped with an

  9. Technical Work Plan for: Additional Multiscale Thermohydrologic Modeling

    International Nuclear Information System (INIS)

    The primary objective of Revision 04 of the MSTHM report is to provide TSPA with revised repository-wide MSTHM analyses that incorporate updated percolation flux distributions, revised hydrologic properties, updated IEDs, and information pertaining to the emplacement of transport, aging, and disposal (TAD) canisters. The updated design information is primarily related to the incorporation of TAD canisters, but also includes updates related to superseded IEDs describing emplacement drift cross-sectional geometry and layout. The intended use of the results of Revision 04 of the MSTHM report, as described in this TWP, is to predict the evolution of TH conditions (temperature, relative humidity, liquid-phase saturation, and liquid-phase flux) at specified locations within emplacement drifts and in the adjoining near-field host rock along all emplacement drifts throughout the repository. This information directly supports the TSPA for the nominal and seismic scenarios. The revised repository-wide analyses are required to incorporate updated parameters and design information and to extend those analyses out to 1,000,000 years. Note that the previous MSTHM analyses reported in Revision 03 of Multiscale Thermohydrologic Model (BSC 2005 [DIRS 173944]) only extend out to 20,000 years. The updated parameters are the percolation flux distributions, including incorporation of post-10,000-year distributions, and updated calibrated hydrologic property values for the host-rock units. The applied calibrated hydrologic properties will be an updated version of those available in Calibrated Properties Model (BSC 2004 [DIRS 169857]). These updated properties will be documented in an Appendix of Revision 03 of UZ Flow Models and Submodels (BSC 2004 [DIRS 169861]). The updated calibrated properties are applied because they represent the latest available information. The reasonableness of applying the updated calibrated properties to the prediction of near-field in-drift TH conditions

  10. Timing of Effort and Reward: Three-Sided Moral Hazard in a Continuous-Time Model

    OpenAIRE

    Jun Yang

    2010-01-01

    This paper studies a three-sided moral hazard problem with one agent exerting up-front effort and two agents exerting ongoing effort in a continuous-time model. The agents' efforts jointly affect the probability of survival and thus the expected cash flow of the project. In the optimal contract, the timing of payments reflects the timing of effort: payments for up-front effort precede payments for ongoing effort. Several patterns are possible for the cash allocation between the two agents wit...

  11. Tsunami Hazard Preventing Based Land Use Planning Model Using GIS Techniques in Muang Krabi, Thailand

    Directory of Open Access Journals (Sweden)

    Abdul Salam Soomro

    2012-10-01

    Full Text Available The terrible tsunami disaster of 26 December 2004 hit Krabi, one of the ecotourism and most fascinating provinces of southern Thailand, as well as other regions such as Phangna and Phuket, devastating human lives, coastal communities and economic activities. This research study aimed to generate a tsunami-hazard-preventing land use planning model using GIS (Geographical Information Systems), based on the hazard suitability analysis approach. Different triggering factors, e.g. elevation, proximity to the shoreline, population density, mangrove, forest, stream and road, have been used based on the land use zoning criteria. These criteria have been weighted using the Saaty scale of importance, one of the mathematical techniques. The model has been classified according to the land suitability classification. The various techniques of GIS, namely subsetting, spatial analysis, map difference and data conversion, have been used. The model has been generated with five categories, namely high, moderate, low, very low and not suitable regions, illustrated with their appropriate definitions for the decision makers to redevelop the region.

  12. Tsunami hazard preventing based land use planning model using GIS technique in Muang Krabi, Thailand

    International Nuclear Information System (INIS)

    The terrible tsunami disaster of 26 December 2004 hit Krabi, one of the ecotourism and most fascinating provinces of southern Thailand, as well as other regions such as Phangna and Phuket, devastating human lives, coastal communities and economic activities. This research study aimed to generate a tsunami-hazard-preventing land use planning model using GIS (Geographical Information Systems), based on the hazard suitability analysis approach. Different triggering factors, e.g. elevation, proximity to the shoreline, population density, mangrove, forest, stream and road, have been used based on the land use zoning criteria. These criteria have been weighted using the Saaty scale of importance, one of the mathematical techniques. The model has been classified according to the land suitability classification. The various techniques of GIS, namely subsetting, spatial analysis, map difference and data conversion, have been used. The model has been generated with five categories, namely high, moderate, low, very low and not suitable regions, illustrated with their appropriate definitions for the decision makers to redevelop the region. (author)
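    The Saaty weighting step mentioned in both records above can be sketched directly: derive criterion weights from a pairwise comparison matrix via its principal eigenvector and check consistency. The comparison values below are illustrative, not those elicited in the study.

```python
# Minimal sketch: AHP criterion weights from a Saaty pairwise comparison matrix.
import numpy as np

# pairwise relative importance (Saaty 1-9 scale), reciprocal by construction;
# criteria here (assumed): elevation, proximity to shoreline, population density
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)                       # principal eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                                   # normalised priority weights

ci = (vals.real[k] - len(A)) / (len(A) - 1)    # consistency index
print(w.round(3), round(float(ci), 3))         # CI near 0 => consistent judgements
```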

  13. Models of magma-aquifer interactions and their implications for hazard assessment

    Science.gov (United States)

    Strehlow, Karen; Gottsmann, Jo; Tumi Gudmundsson, Magnús

    2014-05-01

    Kverkfjöll and in October on White Island, New Zealand. The latter is only one example of these natural attractions that are visited by thousands of tourists every year. Additionally, these systems are increasingly used for energy generation. Phreatic explosions pose a serious risk to people and infrastructure nearby, and they are hard to predict. To improve risk assessment in hydrothermal areas, we assessed historical records and literature with regard to the frequency and mechanisms of hydrothermal explosions. Complemented by numerical models, this study aims to answer the question: what determines the change from safe to dangerous behaviour of the system, i.e. the change from silent degassing to explosions? Our project aims to widen our knowledge base on the complex coupling of magmatic and hydrological systems, which provides further insight into the subsurface processes at volcanic systems and will aid future risk assessment and eruption forecasting.

  14. Comparison of additive (absolute) risk projection models and multiplicative (relative) risk projection models in estimating radiation-induced lifetime cancer risk

    International Nuclear Information System (INIS)

    Lifetime cancer risk estimates depend on risk projection models. While the increasing lengths of follow-up observation periods of atomic bomb survivors in Hiroshima and Nagasaki bring about changes in cancer risk estimates, the validity of the two risk projection models, the additive risk projection model (AR) and multiplicative risk projection model (MR), comes into question. This paper compares the lifetime risk or loss of life-expectancy between the two projection models on the basis of BEIR-III report or recently published RERF report. With Japanese cancer statistics the estimates of MR were greater than those of AR, but a reversal of these results was seen when the cancer hazard function for India was used. When we investigated the validity of the two projection models using epidemiological human data and animal data, the results suggested that MR was superior to AR with respect to temporal change, but there was little evidence to support its validity. (author)
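    A minimal numeric sketch (all rates invented) can show why the two projections diverge: the same observed excess is carried forward either as an absolute hazard increment (AR) or as a fraction of the baseline hazard (MR), so populations with different baseline cancer rates, such as the Japanese and Indian statistics cited above, give different orderings.

```python
# Minimal sketch: additive (AR) vs multiplicative (MR) risk projection,
# ignoring competing risks; all hazards are toy values.
import numpy as np

ages = np.arange(40, 90)
baseline_high = 1e-3 * np.exp(0.08 * (ages - 40))   # toy hazard, high-rate population
baseline_low = 0.3 * baseline_high                  # low-rate population

excess_abs = 2e-4          # AR: constant absolute excess hazard (per year)
err = 0.4                  # MR: constant excess relative risk

def lifetime_excess(base, add=0.0, rel=0.0):
    total = base + add + rel * base
    # crude lifetime excess risk as a sum of annual hazards
    return float(np.sum(total) - np.sum(base))

for name, base in (("high-rate", baseline_high), ("low-rate", baseline_low)):
    print(name, "AR:", round(lifetime_excess(base, add=excess_abs), 4),
          "MR:", round(lifetime_excess(base, rel=err), 4))
```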

  15. CalTOX, a multimedia total exposure model for hazardous-waste sites

    International Nuclear Information System (INIS)

    CalTOX has been developed as a spreadsheet model to assist in health-risk assessments that address contaminated soils and the contamination of adjacent air, surface water, sediments, and ground water. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify and reduce uncertainty in multimedia, multiple-pathway exposure models. This report provides an overview of the CalTOX model components, lists the objectives of the model, describes the philosophy under which the model was developed, identifies the chemical classes for which the model can be used, and describes critical sensitivities and uncertainties. The multimedia transport and transformation model is a dynamic model that can be used to assess time-varying concentrations of contaminants introduced initially to soil layers or for contaminants released continuously to air or water. This model assists the user in examining how chemical and landscape properties impact both the ultimate route and quantity of human contact. Multimedia, multiple pathway exposure models are used in the CalTOX model to estimate average daily potential doses within a human population in the vicinity of a hazardous substances release site. The exposure models encompass twenty-three exposure pathways. The exposure assessment process consists of relating contaminant concentrations in the multimedia model compartments to contaminant concentrations in the media with which a human population has contact (personal air, tap water, foods, household dusts soils, etc.). The average daily dose is the product of the exposure concentrations in these contact media and an intake or uptake factor that relates the concentrations to the distributions of potential dose within the population

  16. "Developing a multi hazard air quality forecasting model for Santiago, Chile"

    Science.gov (United States)

    Mena, M. A.; Delgado, R.; Hernandez, R.; Saide, P. E.; Cienfuegos, R.; Pinochet, J. I.; Molina, L. T.; Carmichael, G. R.

    2013-05-01

    Santiago, Chile has reduced annual particulate matter from 69 µg/m³ (in 1989) to 25 µg/m³ (in 2012), mostly by forcing industry, the transport sector, and the residential heating sector to adopt stringent emission standards to be able to operate on bad air days. Statistical forecasting has been used to predict bad air days and pollution control measures in Santiago, Chile, for almost two decades. Recently an operational PM2.5 deterministic model has been implemented using WRF-Chem. The model was developed by the University of Iowa and is run at the Chilean Meteorological Office. Model configuration includes high-resolution emissions gridding (2 km) and updated population distribution using 2008 data from LANDSCAN. The model is run using a 2-day spin-up with a 5-day forecast. This model has allowed a preventive approach to pollution control measures, as episodes are the result of multiple days of poor dispersion. Decreeing air pollution control measures in advance of bad air days resulted in a reduction of 40% in alert days (80 µg/m³ mean 24-h PM2.5) and 66% in "preemergency" days (110 µg/m³ mean 24-h PM2.5) from 2011 to 2012, despite similar meteorological conditions. This model will be deployed under a recently funded Center for Natural Disaster Management, and will include other meteorological hazards such as flooding, high temperature, storm waves, landslides and UV radiation, among other parameters. This paper will present the results of operational air quality forecasting, and the methodology that will be used to transform WRF-Chem into a multi-hazard forecasting system.

  17. A "mental models" approach to the communication of subsurface hydrology and hazards

    Science.gov (United States)

    Gibson, Hazel; Stewart, Iain S.; Pahl, Sabine; Stokes, Alison

    2016-05-01

    Communicating information about geological and hydrological hazards relies on appropriately worded communications targeted at the needs of the audience. But what are these needs, and how does the geoscientist discern them? This paper adopts a psychological "mental models" approach to assess the public perception of the geological subsurface, presenting the results of attitudinal studies and surveys in three communities in the south-west of England. The findings reveal important preconceptions and misconceptions regarding the impact of hydrological systems and hazards on the geological subsurface, notably in terms of the persistent conceptualisation of underground rivers and the inferred relations between flooding and human activity. The study demonstrates how such mental models can provide geoscientists with empirical, detailed and generalised data on perceptions surrounding an issue, as well as reveal unexpected outliers in perception that they may not have considered relevant, but which nevertheless may locally influence communication. Using this approach, geoscientists can develop information messages that more directly engage local concerns and create open engagement pathways based on dialogue, which in turn allow both geoscience "experts" and local "non-experts" to come together and understand each other more effectively.

  18. Application of a Data Mining Model and It's Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. Then the landslide hazard indices were calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using the artificial neural network model derived not only from the data for that area but also using the weights for each parameter calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard maps and the existing data on landslide areas.
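    A minimal sketch of the calibrate-here, apply-there step follows, assuming scikit-learn: train a small back-propagation network on terrain factors from one area and score its hazard indices on another, in the spirit of the paper's nine-map cross-check. Features and labels are simulated stand-ins.

```python
# Minimal sketch: cross-applying a back-propagation network between areas.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)

def make_area(n=1000):
    X = rng.normal(size=(n, 4))        # stand-ins for slope, aspect, curvature, landuse
    y = (X @ np.array([1.2, 0.4, 0.8, -0.3]) + rng.normal(0, 1, n)) > 0.5
    return X, y.astype(int)

X_a, y_a = make_area()                 # calibration area
X_b, y_b = make_area()                 # cross-application area

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
net.fit(X_a, y_a)                      # back-propagation training
hazard_index = net.predict_proba(X_b)[:, 1]   # landslide hazard index per cell
print(round(roc_auc_score(y_b, hazard_index), 3))
```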

  19. Two statistical models for long term seismic hazard assessment in Vrancea, Romania

    International Nuclear Information System (INIS)

    Intermediate-depth earthquakes have occurred frequently in Vrancea, Romania, and have caused severe damage. To understand the regularity of earthquake occurrence and to predict future earthquakes, we analyzed M ≥ 7.0 earthquakes during the period 1500-2000 using the earthquake catalogue ROMPLUS. Firstly, we attempted to assess the long-term seismic hazard in Vrancea using a stress-release (SR) model, which casts the elastic rebound theory as a stochastic process. Renewal models were also applied to the same data set, but these did not perform as well as the SR model. The SR model indicates that the probability of an M ≥ 7.0 earthquake occurring in Vrancea in a 5-year period exceeds 40% by the end of this decade. Secondly, we proposed the periodic upward migration model, in which 1) the first M7 earthquake occurs in a deeper segment of the seismic region at the beginning of each century, 2) the second occurs in a middle segment in the middle of each century, 3) the third occurs in a shallower segment at the end of each century, and 4) this activity repeats every century. We demonstrated using AIC that this model is better than a uniform Poisson model in time and space. (authors)

  20. ADVANCES IN RENEWAL DECISION-MAKING UTILISING THE PROPORTIONAL HAZARDS MODEL WITH VIBRATION COVARIATES

    Directory of Open Access Journals (Sweden)

    Pieter-Jan Vlok

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Increased competitiveness in the production world necessitates improved maintenance strategies to increase availability and drive down cost. The maintenance engineer is thus faced with the need to make more intelligent preventive renewal decisions. Two of the main techniques to achieve this are Condition Monitoring (such as vibration monitoring and oil analysis) and Statistical Failure Analysis (typically using probabilistic techniques). The present paper discusses these techniques, their uses and weaknesses, and then presents the Proportional Hazards Model as a solution to most of these weaknesses. It then goes on to compare the results of the different techniques in monetary terms, using a South African case study. This comparison shows clearly that the Proportional Hazards Model is superior to the present techniques and should be the preferred model for many actual maintenance situations.

    AFRIKAANSE OPSOMMING: Increased levels of competition in the production environment necessitate improved maintenance strategies to increase equipment availability and minimise cost. Maintenance engineers consequently have to make more intelligent preventive renewal decisions. Two prominent techniques for achieving this goal are Condition Monitoring (such as vibration monitoring or oil analysis) and Statistical Failure Analysis (usually by means of probabilistic methods). In this article we consider both of these techniques, their uses and shortcomings, and then propose the Proportional Hazards Model as a solution to most of the shortcomings. The article also compares the different techniques in monetary terms by making use of a South African case study. This comparison shows clearly that the Proportional Hazards Model holds greater promise than the current techniques and that it should be the preferred solution in many real maintenance situations.
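
    Neither abstract shows the model itself. For orientation, a minimal sketch of a proportional hazards fit with a vibration covariate, using the Python lifelines library; the data, column names and covariate values are invented, and the paper's case study used its own failure records:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical renewal data: lifetime (hours), failure indicator, and a
# vibration condition-monitoring covariate (e.g. RMS velocity, mm/s).
df = pd.DataFrame({
    "hours":   [410, 520, 160, 300, 655, 210, 480, 390],
    "failed":  [1,   0,   1,   1,   0,   1,   1,   0],
    "vib_rms": [4.1, 2.3, 6.8, 5.0, 1.9, 7.2, 3.6, 2.8],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="hours", event_col="failed")
cph.print_summary()   # hazard ratio associated with the vibration covariate

# A renewal decision could then compare expected cost under "renew now"
# versus "run on", using cph.predict_survival_function(df) at the
# current vibration level.
```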

  1. Transferability of regional permafrost disturbance susceptibility modelling using generalized linear and generalized additive models

    Science.gov (United States)

    Rudy, Ashley C. A.; Lamoureux, Scott F.; Treitz, Paul; van Ewijk, Karin Y.

    2016-07-01

    To effectively assess and mitigate the risk of permafrost disturbance, disturbance-prone areas can be predicted through the application of susceptibility models. In this study we developed regional susceptibility models for permafrost disturbances using a field disturbance inventory to test the transferability of the model to a broader region in the Canadian High Arctic. The resulting susceptibility maps were then used to explore the effect of terrain variables on the occurrence of disturbances within this region. To account for a large range of landscape characteristics, the model was calibrated using two locations: Sabine Peninsula, Melville Island, NU, and Fosheim Peninsula, Ellesmere Island, NU. Spatial patterns of disturbance were predicted with a generalized linear model (GLM) and a generalized additive model (GAM), each calibrated using disturbed and randomized undisturbed locations from both sites and GIS-derived terrain predictor variables including slope, potential incoming solar radiation, wetness index, topographic position index, elevation, and distance to water. Each model was validated for the Sabine and Fosheim Peninsulas using independent data sets, while the transferability of the model to an independent site was assessed at Cape Bounty, Melville Island, NU. The regional GLM and GAM validated well for both calibration sites (Sabine and Fosheim), with areas under the receiver operating characteristic curve (AUROC) > 0.79. Both models were applied directly to Cape Bounty without recalibration and validated equally well, with AUROCs of 0.76; however, each model predicted disturbed and undisturbed samples differently. Additionally, the sensitivity of the transferred model was assessed using data sets with different sample sizes. Results indicated that models based on larger sample sizes transferred more consistently and captured the variability within the terrain attributes in the respective study areas. Terrain attributes associated with the initiation of disturbances were
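
    As a sketch of the calibrate-then-transfer workflow described above (not the authors' code), one can fit a GLM on one site and score an independent site without recalibration; all data here are synthetic and the predictor effects are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical terrain predictors per cell: slope, solar radiation, wetness
# index, TPI, elevation, distance to water; y = 1 for disturbed locations.
rng = np.random.default_rng(1)
X_cal = rng.normal(size=(200, 6))
y_cal = (X_cal[:, 0] + 0.5 * X_cal[:, 2] + rng.normal(size=200) > 0).astype(int)

glm = LogisticRegression().fit(X_cal, y_cal)   # the GLM step, in miniature

# "Transferability": validate at an independent site without recalibrating.
X_new = rng.normal(size=(100, 6))
y_new = (X_new[:, 0] + 0.5 * X_new[:, 2] + rng.normal(size=100) > 0).astype(int)
print("AUROC at transfer site:",
      roc_auc_score(y_new, glm.predict_proba(X_new)[:, 1]))
```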

  2. Quantification of Inter-Tsunami Model Variability for Hazard Assessment Studies

    Science.gov (United States)

    Catalan, P. A.; Alcantar, A.; Cortés, P. I.

    2014-12-01

    There is a wide range of numerical models capable of modeling tsunamis, most of which have been properly validated and verified against standard benchmark cases and particular field or laboratory case studies. Consequently, these models are regularly used by scientists and consulting companies as essential tools in estimating the tsunami hazard to coastal communities, treating model results in a deterministic way. Most of these models are derived from the same set of equations, typically the Non-linear Shallow Water Equations, to which ad hoc terms are added to include physical effects such as friction, the Coriolis force, and others. However, these models are not often used in unison to address the variability in the results. Therefore, in this contribution, we perform a large number of simulations using a set of numerical models and quantify the variability in the results. In order to reduce the influence of input data on the results, a single tsunami scenario is used over a common bathymetry. Next, we perform model comparisons to assess sensitivity to changes in grid resolution and Manning roughness coefficients. Results are presented both as intra-model comparisons (sensitivity to changes using the same model) and inter-model comparisons (sensitivity to changing models). For the case tested, most models reproduced the arrival and periodicity of the tsunami waves fairly consistently. However, variations in amplitude, characterized by the standard deviation between model runs, could be as large as the mean signal. This level of variability is considered too large for deterministic assessment, reinforcing the idea that uncertainty needs to be included in such studies.
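
    The variability measure described, the standard deviation across model runs compared against the ensemble mean, can be illustrated in a few lines. The four "models" below are synthetic sine waves of differing amplitude standing in for actual tsunami model outputs at a gauge:

```python
import numpy as np

# Water-surface elevation time series at one coastal gauge, one row per
# tsunami model (synthetic stand-ins for real model outputs).
t = np.linspace(0, 3600, 361)                        # seconds
runs = np.array([a * np.sin(2 * np.pi * t / 1200.0)  # same period, varied
                 for a in (0.8, 1.0, 1.3, 0.6)])     # amplitude per model

mean_signal = runs.mean(axis=0)
spread      = runs.std(axis=0)                       # inter-model std-dev

# The reported concern: spread comparable to the mean signal itself.
print("max |mean|:", np.abs(mean_signal).max(), " max spread:", spread.max())
```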

  3. Socio-economic vulnerability to natural hazards - proposal for an indicator-based model

    Science.gov (United States)

    Eidsvig, U.; McLean, A.; Vangelsten, B. V.; Kalsnes, B.; Ciurean, R. L.; Argyroudis, S.; Winter, M.; Corominas, J.; Mavrouli, O. C.; Fotopoulou, S.; Pitilakis, K.; Baills, A.; Malet, J. P.

    2012-04-01

    Vulnerability assessment, with respect to natural hazards, is a complex process that must consider multiple dimensions of vulnerability, including both physical and social factors. Physical vulnerability refers to conditions of physical assets, and may be modeled by the intensity and magnitude of the hazard, the degree of physical protection provided by the natural and built environment, and the physical robustness of the exposed elements. Social vulnerability refers to the underlying factors leading to the inability of people, organizations, and societies to withstand impacts from natural hazards. Social vulnerability models can be used in combination with physical vulnerability models to estimate both direct losses, i.e. losses that occur during and immediately after the impact, as well as indirect losses, i.e. long-term effects of the event. Direct impacts of a landslide typically include casualties and damage to buildings and infrastructure, while indirect losses may include, for example, business closures or limitations in public services. The direct losses are often assessed using physical vulnerability indicators (e.g. construction material, height of buildings), while indirect losses are mainly assessed using social indicators (e.g. economic resources, demographic conditions). Within the EC-FP7 SafeLand research project, an indicator-based method was proposed to assess relative socio-economic vulnerability to landslides. The indicators represent the underlying factors which influence a community's ability to prepare for, deal with, and recover from the damage associated with landslides. The proposed model includes indicators representing demographic, economic and social characteristics as well as indicators representing the degree of preparedness and recovery capacity. Although the model focuses primarily on the indirect losses, it could easily be extended to include more physical indicators which account for the direct losses. Each indicator is individually
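
    The record stops before detailing how the indicators are combined, but indicator-based vulnerability indices are typically weighted aggregations of rescaled indicators. A minimal sketch under that assumption; the indicator names, values and equal weights are all illustrative, not SafeLand's:

```python
# Hypothetical indicator values for one community, each already rescaled
# to [0, 1] (1 = most vulnerable); inverted indicators are marked.
indicators = {
    "population_density": 0.7,
    "economic_resources": 0.4,   # inverted: low resources -> high value
    "preparedness":       0.5,   # inverted: low preparedness -> high value
    "recovery_capacity":  0.6,   # inverted
}
weights = {k: 0.25 for k in indicators}   # equal weighting as a default

# Weighted linear aggregation into a relative vulnerability index in [0, 1].
V = sum(weights[k] * indicators[k] for k in indicators) / sum(weights.values())
print(f"relative socio-economic vulnerability: {V:.2f}")
```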

  4. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate that the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  5. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate that the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  6. Additive gamma frailty models with applications to competing risks in related individuals

    DEFF Research Database (Denmark)

    Eriksson, Frank; Scheike, Thomas

    2015-01-01

    Epidemiological studies of related individuals are often complicated by the fact that follow-up on the event type of interest is incomplete due to the occurrence of other events. We suggest a class of frailty models with cause-specific hazards for correlated competing events in related individuals, accommodating left-truncation as occurring in the Nordic twin registers. The performance in finite samples is investigated by simulations, and an example on prostate cancer in twins is provided for illustration.

  7. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models.

    Science.gov (United States)

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; Gutiérrez-Gutiérrez, O. Q.; Larreynaga, J.; González, M.; Castro, M.; Gavidia, F.; Aguirre-Ayerbe, I.; González-Riancho, P.; Carreño, E.

    2013-11-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has a length of approximately 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and resulting in hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study, as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand the characterization of the threat over the entire coast of El Salvador, and on the other the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near shore we estimated the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite-differences-finite-volumes numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. Our results show that on the western Salvadorian coast run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas most exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results
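
    The specific empirical run-up relation used is not stated in the record. As an example of the kind of relation that maps near-shore wave conditions to run-up, the sketch below uses the classical solitary-wave run-up law of Synolakis (1987), which may well differ from the relation the authors applied:

```python
import numpy as np

def runup_synolakis(H, d, beach_slope):
    """Run-up R (m) of a non-breaking solitary wave of offshore height H (m)
    in depth d (m) on a plane beach with slope angle beach_slope (rad).
    Synolakis (1987): R/d = 2.831 * sqrt(cot(beta)) * (H/d)**1.25"""
    cot_beta = 1.0 / np.tan(beach_slope)
    return d * 2.831 * np.sqrt(cot_beta) * (H / d) ** 1.25

# Example: a 1 m offshore wave height in 10 m depth on a 1:5 nearshore slope.
print(f"estimated run-up: {runup_synolakis(1.0, 10.0, np.arctan(1/5)):.1f} m")
```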

  8. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    Directory of Open Access Journals (Sweden)

    J. A. Álvarez-Gómez

    2013-05-01

    Full Text Available El Salvador is the smallest and most densely populated country in Central America; its coast has a length of approximately 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there have been 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand the characterization of the threat over the entire coast of El Salvador, and on the other the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near-shore we estimated the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite-differences-finite-volumes numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. On the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas most exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results obtained

  9. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    Science.gov (United States)

    Kirschbaum, Dalia B.; Adler, Robert; Hone, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation found that landslide forecasting may be more feasible at a regional scale. This study draws upon the prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship and the results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited non-landslide event data for more comprehensive evaluation. Additional factors that may improve
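
    The regional rainfall intensity-duration triggering relationship is not spelled out in the record. A sketch of how such a threshold is evaluated, using the widely cited global threshold of Caine (1980) as a stand-in for the regional relationship the study derived:

```python
def exceeds_caine_threshold(intensity_mm_h, duration_h):
    """Global rainfall intensity-duration threshold for shallow landslides
    (Caine, 1980): I = 14.82 * D**-0.39, with I in mm/h and D in hours."""
    return intensity_mm_h > 14.82 * duration_h ** -0.39

# A 24 h storm at 5 mm/h exceeds the threshold; a 2 h shower at 8 mm/h
# does not.
print(exceeds_caine_threshold(5.0, 24.0), exceeds_caine_threshold(8.0, 2.0))
```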

  10. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    Science.gov (United States)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggers of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data on past events and causal factors, and because of the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and numbers of people, physical vulnerability assessment, generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
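
    The final step described, turning a risk curve into an annual risk figure, amounts to integrating loss over annual exceedance probability. A minimal sketch with invented curve points; the study's minimum, average and maximum curves would each be integrated the same way:

```python
import numpy as np

# A risk curve pairs annual exceedance probability with loss; the points
# below are invented for illustration, not the Fella River results.
annual_prob = np.array([0.1, 0.02, 0.005, 0.001])   # frequent -> rare
loss_eur    = np.array([1e5, 1.5e6, 8e6,   3e7])

# Average annual loss = area under the loss-exceedance curve, here by the
# trapezoidal rule over decreasing probability.
seg = 0.5 * (loss_eur[:-1] + loss_eur[1:]) * -np.diff(annual_prob)
print(f"average annual loss: {seg.sum():,.0f} EUR")
```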

  11. Development of models to inform a national Daily Landslide Hazard Assessment for Great Britain

    Science.gov (United States)

    Dijkstra, Tom A.; Reeves, Helen J.; Dashwood, Claire; Pennington, Catherine; Freeborough, Katy; Mackay, Jonathan D.; Uhlemann, Sebastian S.; Chambers, Jonathan E.; Wilkinson, Paul B.

    2015-04-01

    were combined with records of observed landslide events to establish which antecedent effective precipitation (AEP) signatures of different durations could be used as a pragmatic proxy for the occurrence of landslides. It was established that 1-, 7- and 90-day AEP provided the most significant correlations, and these were used to calculate the probability of at least one landslide occurring. The method was then extended over the period 2006 to 2014 and the results evaluated against observed occurrences. It is recognised that AEP is a relatively poor proxy for simulating effective stress conditions along potential slip surfaces. However, the temporal pattern of landslide probability compares well with the observed occurrences and provides a potential benefit to assist with the DLHA. Further work is continuing to fine-tune the model for landslide type, better spatial resolution of effective precipitation input and cross-reference to models that capture changes in water balance and conditions along slip surfaces. The latter is facilitated by intensive research at several field laboratories, such as the Hollin Hill site in Yorkshire, England. At this site, a decade of activity has generated a broad range of research and a wealth of data. This paper reports on one example of recent work: the characterisation of near-surface hydrology using infiltration experiments where hydrological pathways are captured, among others, by electrical resistivity tomography. This research, which has further developed our understanding of soil moisture movement in a heterogeneous landslide complex, has highlighted the importance of establishing detailed ground models to enable determination of landslide potential at high resolution. In turn, the knowledge gained through this research is used to enhance the expertise for the daily landslide hazard assessments at a national scale.
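
    The AEP signatures mentioned above are rolling sums of daily effective precipitation over 1-, 7- and 90-day windows. A minimal pandas sketch with synthetic rainfall; the logistic step mapping AEP to landslide probability is only indicated in a comment:

```python
import pandas as pd

# Daily effective precipitation (mm) for a grid cell; synthetic values.
rain = pd.Series(
    [0, 12, 3, 0, 25, 8, 0, 0, 40, 5] * 12,
    index=pd.date_range("2006-01-01", periods=120, freq="D"),
)

# Antecedent effective precipitation over the three significant windows.
aep = pd.DataFrame({
    "AEP_1":  rain.rolling(1).sum(),
    "AEP_7":  rain.rolling(7).sum(),
    "AEP_90": rain.rolling(90, min_periods=30).sum(),
})
print(aep.tail())
# Each AEP column can then feed a logistic model for the probability of
# at least one landslide occurring on the day in question.
```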

  12. Modeling the cardiovascular system using a nonlinear additive autoregressive model with exogenous input

    Science.gov (United States)

    Riedl, M.; Suhrbier, A.; Malberg, H.; Penzel, T.; Bretthauer, G.; Kurths, J.; Wessel, N.

    2008-07-01

    The parameters of heart rate variability and blood pressure variability have proved to be useful analytical tools in cardiovascular physics and medicine. Model-based analysis of these variabilities additionally yields new prognostic information about the mechanisms behind regulation in the cardiovascular system. In this paper, we analyze the complex interaction between heart rate, systolic blood pressure, and respiration using nonparametrically fitted nonlinear additive autoregressive models with external inputs. We consider measurements of healthy persons and of patients suffering from obstructive sleep apnea syndrome (OSAS), with and without hypertension. It is shown that the proposed nonlinear models are capable of describing short-term fluctuations in heart rate as well as systolic blood pressure significantly better than similar linear ones, which confirms the assumption of nonlinearly controlled heart rate and blood pressure. Furthermore, the comparison of the nonlinear and linear approaches reveals that the heart rate and blood pressure variability in healthy subjects is caused by a higher level of noise as well as nonlinearity than in patients suffering from OSAS. The residue analysis points to a further source of heart rate and blood pressure variability in healthy subjects, in addition to heart rate, systolic blood pressure, and respiration. Comparison of the nonlinear models within and among the different groups of subjects suggests an ability to discriminate between the cohorts that could lead to a stratification of hypertension risk in OSAS patients.
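
    The paper's nonparametric NAARX fitting procedure is not reproduced in the record. A rough sketch of the model class, with each lagged input entering through its own smooth function, can be written with the pygam library's additive spline terms; the simulated series below merely stands in for real beat-to-beat recordings:

```python
import numpy as np
from pygam import LinearGAM, s   # pip install pygam

# Synthetic series standing in for heart rate (HR), systolic blood
# pressure (SBP) and respiration (RESP); not clinical data.
rng = np.random.default_rng(2)
n = 500
sbp  = rng.normal(120, 8, n)
resp = np.sin(np.linspace(0, 60, n)) + rng.normal(0, 0.1, n)
hr   = np.empty(n); hr[0] = 60.0
for i in range(1, n):   # a nonlinear regulation loop, for illustration
    hr[i] = (60 + 0.5 * (hr[i-1] - 60) - 0.5 * np.tanh(sbp[i-1] - 120)
             + 2 * resp[i-1] + rng.normal(0, 0.5))

# Nonlinear additive autoregressive model with exogenous inputs (NAARX):
# HR(t) ~ f1(HR(t-1)) + f2(SBP(t-1)) + f3(RESP(t-1)), each f a smooth spline.
X = np.column_stack([hr[:-1], sbp[:-1], resp[:-1]])
y = hr[1:]
gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, y)
gam.summary()
```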

  13. Uncertainty quantification in satellite-driven modeling to forecast lava flow hazards

    Science.gov (United States)

    Ganci, Gaetana; Bilotta, Giuseppe; Cappello, Annalisa; Herault, Alexis; Zago, Vito; Del Negro, Ciro

    2016-04-01

    Over the last decades, satellite-based remote sensing and data processing techniques have proved well suited to complement field observations by providing timely event detection for volcanic effusive events, as well as extraction of parameters allowing lava flow tracking. In parallel, physics-based models for lava flow simulation have improved enormously and are now capable of fast, accurate simulations, which are increasingly driven by, or validated using, satellite-derived parameters such as lava flow discharge rates. Together, these capabilities represent a prompt strategy with immediate applications to the real-time monitoring and hazard assessment of effusive eruptions, but two key issues still need to be addressed to improve its effectiveness: (i) the provision of source-term parameters and their uncertainties, and (ii) how uncertainties in source terms propagate into the model outputs. Here we address these topics by considering uncertainties in satellite-derived products obtained by the HOTSAT thermal monitoring system (e.g. hotspot pixels, radiant heat flux, effusion rate) and evaluating how these uncertainties affect lava flow hazard scenarios by inputting them into the MAGFLOW physics-based model for lava flow simulations. Particular attention is given to topography and cloud effects on satellite-derived products, as well as to the frequency of their acquisition (GEO vs. LEO). We also investigate how DEM resolution impacts the final scenarios from both the numerical and physical points of view. To evaluate these effects, three well-documented eruptions of Mt Etna are taken into account: a short-lived paroxysmal event (the 11-13 January 2011 lava fountain), a long-lasting eruption (2008-2009), and a short effusive event (14-24 July 2006).
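
    One standard way to propagate satellite-derived effusion-rate uncertainty into hazard scenarios is Monte Carlo sampling of the input before each flow simulation. The sketch below uses a toy power-law emplacement model as a stand-in for a full MAGFLOW run; the discharge-rate figures are illustrative, not HOTSAT output:

```python
import numpy as np

rng = np.random.default_rng(3)

# Satellite-derived time-averaged discharge rate (m^3/s) with a 1-sigma
# uncertainty; values illustrative of thermal-monitoring-style output.
tadr_mean, tadr_sigma = 8.0, 2.5

def flow_length_km(tadr):
    """Placeholder emplacement model: a power-law stand-in for a full
    physics-based run, used only to show how input uncertainty propagates."""
    return 2.0 * tadr ** 0.5

samples = rng.normal(tadr_mean, tadr_sigma, 10_000).clip(min=0.1)
lengths = flow_length_km(samples)
print(f"flow length: {lengths.mean():.1f} km "
      f"(5-95%: {np.percentile(lengths, 5):.1f}-"
      f"{np.percentile(lengths, 95):.1f} km)")
```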

  14. Weather modeling for hazard and consequence assessment operations during the 2006 Winter Olympic Games

    Science.gov (United States)

    Hayes, P.; Trigg, J. L.; Stauffer, D.; Hunter, G.; McQueen, J.

    2006-05-01

    Consequence assessment (CA) operations are those processes that attempt to mitigate the negative impacts of incidents involving hazardous materials such as chemical, biological, radiological, nuclear, and high explosive (CBRNE) agents, facilities, weapons, or transportation. Incident types range from the accidental spillage of chemicals at, or en route to or from, a manufacturing plant, to the deliberate use of radiological or chemical material as a weapon in a crowded city. The impacts of these incidents are highly variable, from little or no impact to catastrophic loss of life and property. Local- and regional-scale atmospheric conditions strongly influence atmospheric transport and dispersion processes in the boundary layer, and hence the extent and scope of the spread of dangerous materials in the lower levels of the atmosphere. Therefore, CA personnel charged with managing the consequences of CBRNE incidents must have detailed knowledge of current and future weather conditions to accurately model potential effects. A meteorology team was established at the U.S. Defense Threat Reduction Agency (DTRA) to provide weather support to CA personnel operating DTRA's CA tools, such as the Hazard Prediction and Assessment Capability (HPAC) tool. The meteorology team performs three main functions: 1) regular provision of meteorological data for use by personnel using HPAC, 2) determination of the best-performing medium-range model forecast for the 12-48 hour timeframe, and 3) provision of real-time help-desk support to users regarding the acquisition and use of weather data in HPAC CA applications. Normal meteorology team operations were expanded during a recent modeling project which took place during the 2006 Winter Olympic Games. The meteorology team took advantage of the special weather observation datasets available in the domain of the Winter Olympic venues and undertook a project to improve weather modeling at high resolution. The varied and complex terrain provided a special challenge to the

  15. Developing a functional model for cities impacted by a natural hazard: application to a city affected by flooding

    OpenAIRE

    Bambara, G.; Peyras, L.; Felix, H.; Serre, D.

    2015-01-01

    Experience feedback on a crisis that hit a city is frequently used as a "recollection" tool. To capitalise on the information in experience feedback from cities that have been affected by a natural hazard, the authors propose in this study a functional model for modelling scenarios of city crises. In this model, the city, considered as a complex system, was modelled using a functional analysis method. Based on such modelling, two risk analysis methods (Failure Mode and Eff...

  16. Probabilistic forecasts of debris-flow hazard at the regional scale with a combination of models.

    Science.gov (United States)

    Malet, Jean-Philippe; Remaître, Alexandre

    2015-04-01

    Debris flows are one of the many active slope-forming processes in the French Alps, where rugged and steep slopes mantled by various slope deposits offer a great potential for triggering hazardous events. A quantitative assessment of debris-flow hazard requires the estimation, in a probabilistic framework, of the spatial probability of occurrence of source areas, the spatial probability of runout areas, the temporal frequency of events, and their intensity. The main objective of this research is to propose a pipeline for the estimation of these quantities at the regional scale using a chain of debris-flow models. The work uses the experimental site of the Barcelonnette Basin (South French Alps), where 26 active torrents have produced more than 150 debris-flow events since 1850, to develop and validate the methodology. First, a susceptibility assessment is performed to identify the debris-flow-prone source areas. The most frequently used approach is the combination of environmental factors with GIS procedures and statistical techniques, integrating or not detailed event inventories. Based on a 5 m DEM and its derivatives, and on information on slope lithology, engineering soils and land cover, the possible source areas are identified with a statistical logistic regression model. The performance of the statistical model is evaluated against the observed distribution of debris-flow events recorded after 1850 in the study area. The source areas in the three most active torrents (Riou-Bourdoux, Faucon, Sanières) are well identified by the model. Results are less convincing for three other active torrents (Bourget, La Valette and Riou-Chanal); this could be related to the type of debris-flow triggering mechanism, as the model seems to better spot the open-slope debris-flow source areas (e.g. scree slopes) but appears to be less efficient for the identification of landslide-induced debris flows. Second, a susceptibility assessment is performed to estimate the possible runout distance
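
    The record truncates before describing the runout step. A common empirical ingredient for first-pass runout assessment (not necessarily the one used in this study) is the angle of reach (Fahrböschung), which converts drop height into a runout length:

```python
import numpy as np

def runout_length(drop_height_m, reach_angle_deg):
    """Empirical mobility estimate: the runout length L satisfies
    tan(alpha) = H / L, with alpha the angle of reach (Fahrboeschung)."""
    return drop_height_m / np.tan(np.radians(reach_angle_deg))

# Debris flows commonly show reach angles of roughly 5-15 degrees; for a
# source area 600 m above the fan apex:
for alpha in (5, 10, 15):
    print(alpha, "deg ->", round(runout_length(600.0, alpha)), "m")
```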

  17. Doubly stochastic models for volcanic hazard assessment at Campi Flegrei caldera

    CERN Document Server

    Bevilacqua, Andrea

    2016-01-01

    This study provides innovative mathematical models for assessing the eruption probability and associated volcanic hazards, and applies them to the Campi Flegrei caldera in Italy. Throughout the book, significant attention is devoted to quantifying the sources of uncertainty affecting the forecast estimates. The Campi Flegrei caldera is certainly one of the world’s highest-risk volcanoes, with more than 70 eruptions over the last 15,000 years, prevalently explosive ones of varying magnitude, intensity and vent location. In the second half of the twentieth century the volcano apparently once again entered a phase of unrest that continues to the present. Hundreds of thousands of people live inside the caldera and over a million more in the nearby city of Naples, making a future eruption of Campi Flegrei an event with potentially catastrophic consequences at the national and European levels.

  18. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    Science.gov (United States)

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is gener- ally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the at- mospheric and hydrologic processes involved in flash-flooding and the changing soci- etal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scat- tered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost contin- uously in time and space. Recent research results indicate that coupling radar infor- mation to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current under- standing in this area, two

  19. Stiffness Model of a 3-DOF Parallel Manipulator with Two Additional Legs

    OpenAIRE

    Yu, Guang; Wu, Jun; Wang, Liping

    2014-01-01

    This paper investigates the stiffness modelling of a 3-DOF parallel manipulator with two additional legs. The stiffness model in six directions of the 3-DOF parallel manipulator with two additional legs is derived by performing condensation of DOFs for the joint connection and treatment of the fixed-end connections. Moreover, this modelling method is used to derive the stiffness model of the manipulator with zero/one additional legs. Two performance indices are given to compare the stiffness ...

  20. A general additive-multiplicative rates model for recurrent event data

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In this article, we propose a general additive-multiplicative rates model for recurrent event data. The proposed model includes the additive rates and multiplicative rates models as special cases. For inference on the model parameters, estimating equation approaches are developed, and asymptotic properties of the proposed estimators are established through modern empirical process theory. In addition, an illustration with multiple-infection data from a clinical study on chronic granulomatous disease is provided.
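
    The record names the model class but does not display it. One common way such a general additive-multiplicative rates model is written in this literature (the article's exact specification may differ) is:

```latex
\[
  E\{\mathrm{d}N(t)\mid Z(t)\}
    = \Big\{ \lambda_{0}(t)\, g\big(\beta^{\top} Z_{1}(t)\big)
      + h\big(\gamma^{\top} Z_{2}(t)\big) \Big\}\,\mathrm{d}t ,
\]
% N(t): recurrent-event counting process; lambda_0: unspecified baseline
% rate; g, h: known link functions; beta, gamma: regression parameters.
% Taking g = 1 recovers the additive rates model; taking h = 0 recovers
% the multiplicative (proportional) rates model.
```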

  1. Development of a Probabilistic Tornado Wind Hazard Model for the Continental United States Volume I: Main Report

    International Nuclear Information System (INIS)

    Since the mid-1980's, assessment of the wind and tornado risks at Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model developed by McDonald for severe winds at sub-tornado wind speeds and a separate model developed by Fujita for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels at a given location based on the methodology in UCRL-53526 are different from those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.

  2. Lava flow hazard modeling during the 2014-2015 Fogo eruption, Cape Verde

    Science.gov (United States)

    Cappello, Annalisa; Ganci, Gaetana; Calvari, Sonia; Pérez, Nemesio M.; Hernández, Pedro A.; Silva, Sónia V.; Cabral, Jeremias; Del Negro, Ciro

    2016-04-01

    Satellite remote sensing techniques and lava flow forecasting models have been combined to enable a rapid response during effusive crises at poorly monitored volcanoes. Here we used the HOTSAT satellite thermal monitoring system and the MAGFLOW lava flow emplacement model to forecast lava flow hazards during the 2014-2015 Fogo eruption. In many ways this was one of the major effusive eruption crises of recent years, since the lava flows actually invaded populated areas. Combining satellite data and modeling allowed us to map the probable evolution of the lava flow field while the eruption was ongoing and to rapidly gain as much relevant information as possible. HOTSAT was used to promptly analyze MODIS and SEVIRI data to output hot spot location, lava thermal flux, and effusion rate estimates. This output was used to drive MAGFLOW simulations of lava flow paths and to continuously update the flow simulations. We also show how Landsat 8 OLI and EO-1 ALI images complement the field observations for tracking the flow front position through time, adding considerable data on lava flow advancement to validate the results of the numerical simulations. The integration of satellite data and modeling offers great promise in providing a unified and efficient system for global assessment and real-time response to effusive eruptions, including (i) the current state of the effusive activity, (ii) the probable evolution of the lava flow field, and (iii) the potential impact of lava flows.

  3. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Variables

  4. AschFlow - A dynamic landslide run-out model for medium scale hazard analysis.

    Science.gov (United States)

    Luna, Byron Quan; Blahut, Jan; van Asch, Theo; van Westen, Cees; Kappes, Melanie

    2015-04-01

    Landslide and debris-flow hazard assessments require a scale-dependent analysis in order to mitigate damage and other negative consequences at the respective scales of occurrence. Medium- or large-scale landslide run-out modelling for many possible landslide initiation areas has been a cumbersome task in the past. This arises from the difficulty of precisely defining the location and volume of the released mass, and from the inability of run-out models to compute the displacement for a large number of individual initiation areas (computationally exhaustive). Most of the existing physically based run-out models have complications in handling such situations, and therefore empirical methods have been used as a practical means to predict landslide mobility at a medium scale (1:10,000 to 1:50,000). In this context, a simple medium-scale numerical model for rapid mass movements in urban and mountainous areas was developed. The deterministic nature of the approach makes it possible to calculate the velocity, height and increase in mass by erosion, resulting in the estimation of various forms of impact exerted by debris flows at the medium scale. The established and implemented model ("AschFlow") is a 2-D one-phase continuum model that simulates the entrainment, spreading and deposition processes of a landslide or debris flow at a medium scale. The flow is thus treated as a single-phase material, whose behavior is controlled by rheology (e.g. Voellmy or Bingham). The developed regional model "AschFlow" was applied and evaluated in well-documented areas with known past debris-flow events.
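
    The Voellmy rheology named above combines Coulomb friction with a velocity-squared turbulent drag. A minimal sketch of the basal resistance it implies; the parameter values are generic calibration figures, not AschFlow's:

```python
def voellmy_resistance(rho, g, h, v, mu=0.1, xi=500.0):
    """Basal flow resistance (Pa) under the Voellmy rheology often used in
    run-out models: a Coulomb friction term plus a turbulent drag term.
    mu (-) and xi (m/s^2) are calibration parameters; values illustrative."""
    normal_stress = rho * g * h          # depth-averaged basal normal stress
    return mu * normal_stress + rho * g * v ** 2 / xi

# A 2 m deep flow (bulk density 1800 kg/m^3) moving at 8 m/s:
print(f"{voellmy_resistance(1800.0, 9.81, 2.0, 8.0):.0f} Pa")
```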

  5. Methodologies, models and parameters for environmental, impact assessment of hazardous and radioactive contaminants; Metodologias, modelos y parametros para evaluacion del impacto ambiental de contaminantes peligrosos y radiactivos

    Energy Technology Data Exchange (ETDEWEB)

    Aguero, A.; Cancio, D.; Garcia-Olivares, A.; Romero, L.; Pinedo, P.; Robles, B.; Rodriguez, J.; Simon, I.; Suanez, A.

    2003-07-01

    An environmental impact assessment methodology to assess the impact arising from contaminants present in hazardous and radioactive wastes has been developed. Taking into account background information on legislation, waste categories and contaminant inventories, and on disposal, recycling and waste treatment options, an Environmental Impact Assessment Methodology (MEIA) is proposed. It is applicable to (i) several types of solid wastes (hazardous, radioactive and mixed wastes); (ii) several management options (recycling, and temporary and final storage (in shallow and deep disposal)); and (iii) several levels of data availability. Conceptual and mathematical models and software tools needed for the application of the MEIA have been developed. Bearing in mind that this is a complex process, both the models and the tools have to be developed following an iterative approach, involving refinement of the models so as to better correspond to the described system. The selection of suitable parameters for the models is based on information derived from field and laboratory measurements and experiments, applying a data elicitation protocol. An application is shown for a hypothetical shallow radioactive waste disposal facility (test case), with all the steps of the MEIA applied sequentially. In addition, the methodology is applied to an actual waste management case for hazardous wastes from the coal fuel cycle, demonstrating several possibilities for applying the MEIA from a practical perspective. The experience obtained in the development of the work shows that the use of the MEIA for the assessment of management options for hazardous and radioactive wastes offers important advantages, simplifying the execution of the assessment, its traceability and the dissemination of assessment results to other interested parties. (Author)

  6. Quantitative hazard assessment at Vulcano (Aeolian islands): integration of geology, event statistics and physical modelling

    Science.gov (United States)

    Dellino, Pierfrancesco; de Astis, Gianfilippo; La Volpe, Luigi; Mele, Daniela; Sulpizio, Roberto

    2010-05-01

    The analysis of the stratigraphy and particle features of pyroclastic deposits allowed the reconstruction of the volcanic history of La Fossa di Vulcano. An eruptive scenario driven by shallow phreatomagmatic explosions emerged. A statistical analysis of the pyroclastic successions led to the definition of a repetitive sequence of dilute pyroclastic density currents as the most probable events in the short term, followed by fallout of dense ballistic blocks. The scale of such events is related to the amount of magma involved in each explosion. Events involving a million cubic meters of magma are probable in view of what happened in the most recent eruptions. They led to the formation of hundreds-of-meters-thick dilute pyroclastic density currents, moving down the volcano slope at velocities exceeding 50 m/s. The dispersion of the density currents affected the whole Vulcano Porto area and the Vulcanello area, and also overrode the rim of the Fossa Caldera, spreading over the Piano area. Similarly, older pyroclastic deposits were erupted at different times (Piano Grotte dei Rossi formation, ~20-7.7 ka) from vents within La Fossa Caldera, before the formation of La Fossa Cone. They were also phreatomagmatic in origin and fed dilute pyroclastic density currents (PDCs). They represent the eruptions with the highest magnitude on the island. Therefore, for the aim of hazard assessment, these deposits from La Fossa Cone and La Fossa Caldera were used to depict eruptive scenarios in the short and long term. On the basis of physical models that make use of pyroclastic deposit particle features, the impact parameters for each scenario have been calculated: the dynamic pressure and particle volumetric concentration of the density currents, and the impact energy of the ballistic blocks. On this basis, a quantitative hazard map is presented, which could be of direct use for territorial planning and for the calculation of the expected damage.
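
    Of the impact parameters listed, the dynamic pressure of a dilute current follows directly from its velocity and mixture density. A sketch under generic assumptions for the particle and gas densities; the study's own values are not reproduced in the record:

```python
def pdc_dynamic_pressure(velocity, particle_conc, rho_particle=2500.0,
                         rho_gas=0.6):
    """Dynamic pressure P = 0.5 * rho_mix * v**2 (Pa) of a dilute
    pyroclastic density current; the mixture density follows from the
    particle volumetric concentration (densities here are generic)."""
    rho_mix = particle_conc * rho_particle + (1 - particle_conc) * rho_gas
    return 0.5 * rho_mix * velocity ** 2

# A dilute current at 50 m/s with 0.1% particles by volume:
print(f"{pdc_dynamic_pressure(50.0, 0.001) / 1000.0:.1f} kPa")
```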

  7. Local models for rainstorm-induced hazard analysis on Mediterranean river-torrential geomorphological systems

    Directory of Open Access Journals (Sweden)

    N. Diodato

    2004-01-01

    Full Text Available Damaging hydrogeomorphological events are defined as one or more simultaneous phenomena (e.g. accelerated erosion, landslides, flash floods and river floods), occurring in a spatially and temporally random way and triggered by rainfall of different intensity and extent. Storm rainfall values are highly dependent on weather conditions and relief. However, the impact of rainstorms in Mediterranean mountain environments depends mainly on climatic fluctuations in the short and long term, especially in rainfall quantity. An algorithm for the characterisation of this impact, called the Rainfall Hazard Index (RHI), is developed with an inexpensive methodology. In RHI modelling, we assume that the river-torrential system has adapted to the natural hydrological regime, and that a sudden fluctuation in this regime, especially one exceeding the thresholds of an acceptable range of flexibility, may have disastrous consequences for the mountain environment. The RHI integrates two rainfall variables based upon current and historical storm depth data, both of a fixed duration, and one dimensionless parameter representative of the degree of ecosystem flexibility. The approach was applied to a test site in the Benevento river-torrential landscape, Campania (Southern Italy). A database including data from 27 events which occurred during a 77-year period (1926-2002) was compared with the Benevento-station RHI(24h) for a qualitative validation. Trends in RHIx for annual maximum storms of duration 1, 3 and 24 h were also examined. Little change is observed for storms of 3- and 24-h duration, but a significant increase results in the hazard of short and intense storms (RHIx(1h)), in agreement with a reduction in the return period for extreme rainfall events.

  8. Identifying model pollutants to investigate biodegradation of hazardous XOCs in WWTPs

    Energy Technology Data Exchange (ETDEWEB)

    Press-Kristensen, Kaare; Ledin, Anna; Schmidt, Jens Ejbye; Henze, Mogens [Department of Environment and Resources, Technical University of Denmark Building 115, 2800 Lyngby (Denmark)

    2007-02-01

    Xenobiotic organic compounds (XOCs) in wastewater treatment plant (WWTP) effluents might cause toxic effects in ecosystems. Several investigations have emphasized biodegradation as an important removal mechanism to reduce pollution with XOCs from WWTP effluents. The aim of the study was to design a screening tool to identify and select hazardous model pollutants for the further investigation of biodegradation in WWTPs. The screening tool consists of three criteria: the XOC is present in WWTP effluents; the XOC constitutes an intolerable risk in drinking water or the environment; and the XOC is expected to be biodegradable in WWTPs. The screening tool was tested on bisphenol A (BPA), carbamazepine (CBZ), di(2-ethylhexyl) phthalate (DEHP), 17β-estradiol (E2), estrone (E1), 17α-ethinylestradiol (EE2), ibuprofen, naproxen, nonylphenol (NP), and octylphenol (OP). BPA, DEHP, E2, E1, EE2, and NP passed all criteria in the screening tool and were selected as model pollutants. OP did not pass the screening and was rejected as a model pollutant. CBZ, ibuprofen, and naproxen were not finally evaluated due to insufficient data. (author)

  9. Mediation Analysis with Survival Outcomes: Accelerated Failure Time Versus Proportional Hazards Models

    Directory of Open Access Journals (Sweden)

    Lois A Gelfand

    2016-03-01

    Full Text Available Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare the statistical properties of mediation analyses incorporating PH and AFT approaches (employing the SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, the effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
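
    A Python analogue of the PHREG/LIFEREG comparison can be set up with the lifelines library: a semi-parametric Cox PH fit next to a fully parametric Weibull AFT fit on the same censored data. Everything below (data, column names) is invented for illustration:

```python
import pandas as pd
from lifelines import CoxPHFitter, WeibullAFTFitter

# Toy survival data with treatment X and mediator M; censored rows have E = 0.
df = pd.DataFrame({
    "T": [5, 8, 12, 3, 9, 15, 7, 11, 4, 14],
    "E": [1, 1, 0, 1, 1, 0, 1, 1, 1, 0],
    "X": [0, 1, 1, 0, 1, 1, 0, 0, 1, 0],
    "M": [2.1, 3.4, 4.0, 1.8, 3.9, 4.5, 2.5, 2.2, 3.7, 2.0],
})

cox = CoxPHFitter().fit(df, duration_col="T", event_col="E")       # PH
aft = WeibullAFTFitter().fit(df, duration_col="T", event_col="E")  # AFT

# AFT coefficients act on log(T) and can be combined across mediator and
# outcome models in the usual product-of-coefficients fashion.
print(cox.params_, aft.params_, sep="\n")
```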

  10. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    International Nuclear Information System (INIS)

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships

  11. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Harper, Bryan [Oregon State University (United States); Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan [Pacific Northwest National Laboratory (United States); Tang, Kaizhi [Intelligent Automation, Inc. (United States); Heredia-Langner, Alejandro [Pacific Northwest National Laboratory (United States); Lins, Roberto [CPqAM, Oswaldo Cruz Foundation, FIOCRUZ-PE (Brazil); Harper, Stacey, E-mail: stacey.harper@oregonstate.edu [Oregon State University (United States)

    2015-06-15

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships.
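
    The clustering step described, grouping nanomaterials by their response profiles, can be sketched with standard agglomerative clustering; the synthetic "EZ-metric-style" profiles below stand in for the study's 68-material data set:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Rows: nanomaterials; columns: endpoint scores (e.g. mortality, morbidity
# measures) at a given exposure level. Synthetic, for illustration only.
rng = np.random.default_rng(4)
profiles = np.vstack([rng.normal(loc, 0.3, size=(10, 4))
                      for loc in (0.2, 1.0, 2.5)])   # three toxicity tiers

Z = linkage(profiles, method="ward")          # agglomerative clustering
clusters = fcluster(Z, t=3, criterion="maxclust")
print(clusters)   # cluster labels, one possible basis for a hazard ranking
```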

  12. A class of additive-accelerated means regression models for recurrent event data

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In this article, we propose a class of additive-accelerated means regression models for analyzing recurrent event data. The class includes the proportional means model, the additive rates model, the accelerated failure time model, the accelerated rates model and the additive-accelerated rate model as special cases. The new model offers great flexibility in formulating the effects of covariates on the mean functions of counting processes while leaving the stochastic structure completely unspecified. For inference on the model parameters, estimating equation approaches are derived and asymptotic properties of the proposed estimators are established. In addition, a technique is provided for model checking. The finite-sample behavior of the proposed methods is examined through Monte Carlo simulation studies, and the methods are illustrated with an application to a bladder cancer study.
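
    For orientation, one simple member of such a family (a sketch on our part, not the paper's full specification, which also nests the proportional means model) combines an accelerated component in covariates X with an additive component in covariates Z, writing the mean function of the recurrent-event counting process N(t) as

      \mu(t \mid X, Z) = \mu_0\left( t \, e^{\beta^{\top} X} \right) + \gamma^{\top} Z \, t

    so that \beta = 0 recovers an additive rates model, \gamma = 0 an accelerated means model, and \mu_0 is an unspecified baseline mean function.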

  13. A forecasting and forewarning model for methane hazard in working face of coal mine based on LS-SVM

    Energy Technology Data Exchange (ETDEWEB)

    Shu-gang Cao; Yan-bao Liu; Yan-ping Wang [Chongqing University, Chongqing (China). Key Laboratory for the Exploitation of Southwest Resources and the Environmental Disaster Control Engineering, Ministry of Education

    2008-06-15

    To improve the precision and reliability of methane hazard prediction in a coal mine working face, a forecasting and forewarning model for methane hazard is proposed based on the least squares support vector machine (LS-SVM) multi-classifier and regression machine. For the forecasting model, the methane concentration can be treated as a nonlinear time series, and time series analysis is used to predict changes in methane concentration via LS-SVM regression. For the forewarning model, which builds on the forecasting results, the LS-SVM multi-classification method assigns the methane hazard to one of four grades: normal, attention, warning and danger. Corresponding measures are then taken according to the forewarning results. The model was used to forecast and forewarn the K9 working face. The results show that the LS-SVM regression forecasts have high precision and that the forewarning results based on the LS-SVM multi-classifier are credible. Therefore, it is an effective model-building method for continuous prediction of methane concentration and hazard forewarning in the working face. 20 refs., 2 figs., 3 tabs.
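
    A minimal sketch of the two ingredients named above, assuming an RBF kernel and toy data (the K9 measurements and the paper's tuning are not reproduced here): LS-SVM regression reduces to a single linear solve of the dual system, and the forecast is then binned into the four alert grades.

      import numpy as np

      def rbf(A, B, sigma=1.0):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2 * sigma ** 2))

      def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
          # solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
          n = len(y)
          A = np.zeros((n + 1, n + 1))
          A[0, 1:] = A[1:, 0] = 1.0
          A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
          sol = np.linalg.solve(A, np.r_[0.0, y])
          return sol[0], sol[1:]                   # bias b, dual weights alpha

      series = 1.0 + 0.3 * np.sin(np.linspace(0, 20, 200))   # toy CH4 concentration (%)
      lags = 5
      X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
      y = series[lags:]
      b, alpha = lssvm_fit(X[:-20], y[:-20])
      pred = rbf(X[-20:], X[:-20]) @ alpha + b     # one-step-ahead forecasts
      grades = np.digitize(pred, [0.8, 1.0, 1.2])  # hypothetical normal/attention/warning/danger cut-offs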

  14. A forecasting and forewarning model for methane hazard in working face of coal mine based on LS-SVM

    Institute of Scientific and Technical Information of China (English)

    CAO Shu-gang; LIU Yan-bao; WANG Yan-ping

    2008-01-01

    To improve the precision and reliability of methane hazard prediction in a coal mine working face, we have proposed a forecasting and forewarning model for methane hazard based on the least squares support vector machine (LS-SVM) multi-classifier and regression machine. For the forecasting model, the methane concentration can be treated as a nonlinear time series, and time series analysis is used to predict changes in methane concentration via LS-SVM regression. For the forewarning model, which builds on the forecasting results, the LS-SVM multi-classification method assigns the methane hazard to one of four grades: normal, attention, warning and danger. Corresponding measures are then taken according to the forewarning results. The model was used to forecast and forewarn the K9 working face. The results show that the LS-SVM regression forecasts have high precision and that the forewarning results based on the LS-SVM multi-classifier are credible. Therefore, it is an effective model-building method for continuous prediction of methane concentration and hazard forewarning in the working face.

  15. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  16. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    Science.gov (United States)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunamis, because of the proximity between lakes, rivers, sea shores and potential instabilities. The concentration of population and infrastructure on water body shores and in downstream valleys could lead to catastrophic consequences. In order to comprehensively assess this phenomenon together with the induced risks, we have developed a tool which allows the construction of the landslide geometry and is able to simulate its propagation, the generation and propagation of the wave, and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab© environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is simulated using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is handled by combining the two latter sets of equations. The intensity map is based on the flooding criterion used in Switzerland, provided by the OFEG, and results from the product of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for a realistic construction of the geometrical model. In less well-known cases, various failure plane geometries can be built automatically within a given range, and thus a multi-scenario approach is used. In any case, less well-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi
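
    A minimal sketch of the wave-propagation step, assuming a 1-D shallow-water system with an invented initial hump and parameter values (the actual tool is 2-D and Matlab-based): the Lax-Friedrichs update averages neighbouring cells and subtracts a centred flux difference, and the intensity is taken as depth times velocity, echoing the flooding criterion mentioned above.

      import numpy as np

      g, dx, dt = 9.81, 10.0, 0.25
      x = np.arange(0, 5000, dx)
      h = 10.0 + np.exp(-((x - 2500) ** 2) / 2e4)   # initial hump = landslide-generated wave
      hu = np.zeros_like(h)

      def flux(h, hu):
          u = hu / np.maximum(h, 1e-6)
          return np.array([hu, hu * u + 0.5 * g * h ** 2])

      U = np.array([h, hu])
      for _ in range(400):
          F = flux(*U)
          # Lax-Friedrichs: average neighbours, subtract centred flux difference
          U[:, 1:-1] = 0.5 * (U[:, 2:] + U[:, :-2]) - dt / (2 * dx) * (F[:, 2:] - F[:, :-2])
          U[:, 0], U[:, -1] = U[:, 1], U[:, -2]     # crude open boundaries
      h, hu = U
      intensity = np.abs(hu / np.maximum(h, 1e-6)) * h   # velocity x depth criterion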

  17. Modeling and hazard mapping of complex cascading mass movement processes: the case of glacier lake 513, Carhuaz, Peru

    Science.gov (United States)

    Schneider, Demian; Huggel, Christian; García, Javier; Ludeña, Sebastian; Cochachin, Alejo

    2013-04-01

    The Cordilleras in Peru are especially vulnerable to, and affected by, impacts from climate change. Local communities and cities often lie directly within the reach of major hazard potentials such as lake outburst floods (aluviones), mud-/debris flows (huaycos) or large rock-/ice avalanches. These events have repeatedly and strongly affected the region over the last century, and thousands of people have been killed. One of the most recent events in the Cordillera Blanca occurred on 11 April 2010, when a rock/ice avalanche from the top of Hualcán mountain, NE of the town of Carhuaz, impacted the glacier lake 513 (Laguna 513), caused displacement waves and triggered an outburst flood wave. The flow repeatedly transformed from debris flow to hyperconcentrated flow and eventually caused significant damage in Carhuaz. This event motivated early warning and prevention efforts to reduce risks related to ice/rock avalanches and glacier lake outburst floods (GLOF). One of the basic components of an early warning system is the assessment, understanding and communication of the relevant hazards and risks. Here we report on the methodology and results of generating GLOF-related hazard maps for Carhuaz based on numerical modeling and field work. This exercise required an advanced concept and implementation of different mass movement models. Specifically, numerical models were applied for simulating avalanche flow, the avalanche impact on the lake, displacement wave generation and lake overtopping, and eventually the flow propagation of the outburst flood with changing rheology between debris flow and hyperconcentrated flow. We adopted a hazard mapping procedure slightly adjusted from guidelines developed in Switzerland and in the Andes region. A methodology has thereby been developed to translate results from numerical mass movement modeling into hazard maps. The resulting hazard map was verified and adjusted during field work. This study shows

  18. Reliability estimation and remaining useful lifetime prediction for bearing based on proportional hazard model

    Institute of Scientific and Technical Information of China (English)

    王鹭; 张利; 王学芝

    2015-01-01

    As a central component of rotating machines, the performance reliability assessment and remaining useful lifetime prediction of bearings are of crucial importance in condition-based maintenance to reduce maintenance cost and improve reliability. A prognostic algorithm to assess the reliability and forecast the remaining useful lifetime (RUL) of bearings was proposed, consisting of three phases. Online vibration and temperature signals of bearings in normal state were measured during the manufacturing process, and the most useful time-dependent features of the vibration signals were extracted based on correlation analysis (feature selection step). Time series analysis based on a neural network, as an identification model, was used to predict the features of bearing vibration signals at any horizon (feature prediction step). Furthermore, a degradation factor was defined from the features. A proportional hazards model was generated to estimate the survival function and forecast the RUL of the bearing (RUL prediction step). The positive results demonstrate the plausibility and effectiveness of the proposed approach, which can facilitate bearing reliability estimation and RUL prediction.
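
    A compressed sketch of the RUL-prediction step, assuming the lifelines package and invented run-to-failure data (the paper's features and implementation are not shown): fit a proportional hazards model, predict the survival function for a current unit, and integrate it to get an expected lifetime.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      df = pd.DataFrame({
          "duration": [120, 150, 98, 170, 160, 140, 110],   # hours to failure (toy data)
          "event":    [1, 1, 1, 0, 1, 1, 1],                # 1 = failure observed
          "rms_vib":  [0.9, 0.8, 1.2, 0.5, 0.6, 1.0, 0.7],  # degradation features
          "temp":     [62, 59, 70, 55, 58, 57, 65],
      })
      # small ridge penalty keeps the toy fit stable
      cph = CoxPHFitter(penalizer=0.1).fit(df, duration_col="duration", event_col="event")

      unit = df.drop(columns=["duration", "event"]).iloc[[0]]
      sf = cph.predict_survival_function(unit)          # S(t) on the fitted time grid
      t, s = sf.index.values, sf.values[:, 0]
      rul = ((s[1:] + s[:-1]) / 2 * np.diff(t)).sum()   # E[T] ~ integral of S(t) dt
      print(rul)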

  19. Extended FRAM by Integrating with Model Checking to Effectively Explore Hazard Evolution

    Directory of Open Access Journals (Sweden)

    Guihuan Duan

    2015-01-01

    Full Text Available The Functional Resonance Analysis Method (FRAM), which defines a systemic framework to model complex systems from the perspective of function and views accidents as emergent phenomena of functional variability, is playing an increasingly significant role in the development of systemic accident theory. However, as FRAM is typically taken as a theoretical method, there is a lack of specific approaches or supportive tools to bridge theory and practice. To fill the gap and contribute to the development of FRAM, (1) function variability was described further, with the rules of interaction among the variability of different functions being determined, and (2) the technology of model checking (MC) was used for the analysis of function variability, to automatically search for the potential paths that could lead to hazards. By means of MC, the system's behaviours (normal or abnormal) are simulated and the counterexample(s) that violate the safety constraints and requirements can be provided, if there are any, to improve the system design. The extended FRAM approach was applied to a typical air accident analysis, yielding more detailed findings than the conclusions in the accident report issued officially by the Agenzia Nazionale per la Sicurezza del Volo (ANSV).
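
    To make the MC step concrete, here is a toy explicit-state safety check (our own illustration, not the tool used in the paper): breadth-first search over a state space of function variability, returning the counterexample path when a state violating the safety constraint is reachable.

      from collections import deque

      def check_safety(init, successors, is_hazard):
          """Return None if safe, else the path init -> ... -> hazard state."""
          parent, queue, seen = {init: None}, deque([init]), {init}
          while queue:
              s = queue.popleft()
              if is_hazard(s):
                  path = []
                  while s is not None:
                      path.append(s)
                      s = parent[s]
                  return path[::-1]
              for nxt in successors(s):
                  if nxt not in seen:
                      seen.add(nxt)
                      parent[nxt] = s
                      queue.append(nxt)
          return None

      # toy states: (timing, precision) variability levels of one function
      succ = lambda s: [(min(s[0] + 1, 2), s[1]), (s[0], min(s[1] + 1, 2))]
      trace = check_safety((0, 0), succ, lambda s: s == (2, 2))
      print(trace)   # counterexample: escalation of variability to a hazardous state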

  20. Preparing a seismic hazard model for Switzerland: the view from PEGASOS Expert Group 3 (EG1c)

    Energy Technology Data Exchange (ETDEWEB)

    Musson, R. M. W. [British Geological Survey, West Mains Road, Edinburgh, EH9 3LA (United Kingdom); Sellami, S. [Swiss Seismological Service, ETH-Hoenggerberg, Zuerich (Switzerland); Bruestle, W. [Regierungspraesidium Freiburg, Abt. 9: Landesamt fuer Geologie, Rohstoffe und Bergbau, Ref. 98: Landeserdbebendienst, Freiburg im Breisgau (Germany)

    2009-05-15

    The seismic hazard model used in the PEGASOS project for assessing earthquake hazard at four NPP sites was a composite of four sub-models, each produced by a team of three experts. In this paper, one of these models is described in detail by the authors. A criticism sometimes levelled at probabilistic seismic hazard studies is that the process by which seismic source zones are arrived at is obscure, subjective and inconsistent. Here, we attempt to recount the stages by which the model evolved, and the decisions made along the way. In particular, a macro-to-micro approach was used, in which three main stages can be described. The first was the characterisation of the overall kinematic model, the 'big picture' of regional seismogenesis. Secondly, this was refined to a more detailed seismotectonic model. Lastly, this was used as the basis of individual sources, for which parameters could be assessed. Some basic questions also had to be answered about aspects of the modelling approach: for instance, is spatial smoothing an appropriate tool to apply? Should individual fault sources be modelled in an intra-plate environment? Also, the extent to which alternative modelling decisions should be expressed in a logic tree structure had to be considered. (author)
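
    The logic-tree point can be made concrete with a small weighted-branch sketch (curve shapes and weights invented for illustration): each alternative modelling decision becomes a branch, and the mean hazard curve is the weight-averaged annual exceedance probability.

      import numpy as np

      pga = np.linspace(0.05, 1.0, 20)              # ground-motion levels (g)
      def haz(a, k):                                # toy hazard-curve family
          return np.exp(-k * pga / a)

      branches = [(0.5, haz(0.3, 4.0)),             # (weight, annual P[exceedance])
                  (0.3, haz(0.4, 4.0)),
                  (0.2, haz(0.3, 6.0))]
      mean_curve = sum(w * c for w, c in branches)
      assert abs(sum(w for w, _ in branches) - 1.0) < 1e-12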

  1. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    Science.gov (United States)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

  2. Stiffness Model of a 3-DOF Parallel Manipulator with Two Additional Legs

    Directory of Open Access Journals (Sweden)

    Guang Yu

    2014-10-01

    Full Text Available This paper investigates the stiffness modelling of a 3-DOF parallel manipulator with two additional legs. The stiffness model in six directions of the 3-DOF parallel manipulator with two additional legs is derived by performing condensation of DOFs for the joint connections and treatment of the fixed-end connections. Moreover, this modelling method is used to derive the stiffness models of the manipulator with zero/one additional legs. Two performance indices are given to compare the stiffness of the parallel manipulator with two additional legs with that of the manipulators with zero/one additional legs. The method can be used not only to derive the stiffness model of a redundant parallel manipulator, but also to model the stiffness of non-redundant parallel manipulators.
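
    A minimal sketch of the DOF-condensation idea, assuming static (Guyan) condensation and a random symmetric positive-definite stand-in for a leg's stiffness matrix (the paper's joint and fixed-end treatment is more elaborate):

      import numpy as np

      def condense(K, keep):
          """Static (Guyan) condensation: K_red = K_rr - K_rc K_cc^{-1} K_cr."""
          idx = np.arange(K.shape[0])
          r = np.asarray(keep)
          c = np.setdiff1d(idx, r)
          Krr, Krc = K[np.ix_(r, r)], K[np.ix_(r, c)]
          Kcr, Kcc = K[np.ix_(c, r)], K[np.ix_(c, c)]
          return Krr - Krc @ np.linalg.solve(Kcc, Kcr)

      rng = np.random.default_rng(1)
      A = rng.random((12, 12))
      K_leg = A + A.T + 12 * np.eye(12)        # symmetric positive-definite stand-in
      K_plat = condense(K_leg, keep=range(6))  # 6-direction stiffness at the platform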

  3. Bankruptcy prediction: static logit and discrete hazard models incorporating macroeconomic dependencies and industry effects

    OpenAIRE

    Sheikh, Suleman; Yahya, Muhammad

    2015-01-01

    In this thesis, we present firm default prediction models based on firm financial statements and macroeconomic variables. We seek to develop reliable models to forecast out-of-sample default probability, and we are particularly interested in exploring the impact of incorporating macroeconomic variables and industry effects. To the best of our knowledge, this is the first study to account for both macroeconomic dependencies and industry effects in one analysis. Additionally, we ...

  4. Tsunami hazard assessment along the French Mediterranean coast : detailed modeling of tsunami impacts for the ALDES project

    Science.gov (United States)

    Quentel, E.; Loevenbruck, A.; Hébert, H.

    2012-04-01

    The catastrophic 2004 tsunami drew the international community's attention to tsunami risk in all basins where tsunamis have occurred but no warning system exists. Consequently, under the coordination of UNESCO, France decided to create a regional centre, called CENALT, for the north-east Atlantic and the western Mediterranean. This warning system, which should be operational by 2012, is set up by the CEA in collaboration with the SHOM and the CNRS. The French authorities are in charge of the top-down alert system, including local alert dissemination. In order to prepare the appropriate means and measures, they initiated the ALDES (Alerte Descendante) project, to which the CEA also contributes. It aims at examining the tsunami risk related to earthquakes and landslides along the French Mediterranean coast. In addition to the evaluation at regional scale, it includes detailed studies of 3 selected sites; the local alert system will be designed for one of them: the French Riviera. In this project, our main task at CEA consists in assessing the tsunami hazard related to seismic sources using numerical modeling. Past tsunamis have affected the western Mediterranean coast but are too few and too poorly documented to provide a suitable database. Thus, a synthesis of earthquakes representative of the tsunamigenic seismic activity and prone to induce the largest impact on the French coast is performed, based on historical data, seismotectonics and first-order models. The North Africa Margin, the Ligurian Sea and the South Tyrrhenian Sea are considered as the main tsunamigenic zones. In order to forecast the most important plausible effects, the magnitudes are estimated by enhancing to some extent the largest known values. Our hazard estimation is based on the simulation of the induced tsunami scenarios performed with the CEA code. The 3 sites have been chosen according to the regional hazard studies, coastal typology elements and the appropriate DTMs (Digital Terrain Models). The

  5. Multiprocessing and Correction Algorithm of 3D-models for Additive Manufacturing

    Science.gov (United States)

    Anamova, R. R.; Zelenov, S. V.; Kuprikov, M. U.; Ripetskiy, A. V.

    2016-07-01

    This article addresses matters related to additive manufacturing preparation. A layer-by-layer model representation was developed on the basis of a routing method, and methods for correcting errors in the layer-by-layer representation were developed. A multiprocessing algorithm for forming an additive manufacturing batch file was implemented.

  6. Tsunami Hazard Assessment: Source regions of concern to U.S. interests derived from NOAA Tsunami Forecast Model Development

    Science.gov (United States)

    Eble, M. C.; uslu, B. U.; Wright, L.

    2013-12-01

    Synthetic tsunamis generated from source regions around the Pacific Basin are analyzed in terms of their relative impact on United States coastal locations. The region of tsunami origin is as important as the expected magnitude and the predicted inundation for understanding tsunami hazard. The NOAA Center for Tsunami Research has developed high-resolution tsunami models capable of predicting tsunami arrival time and wave amplitude at each location. These models have been used to conduct tsunami hazard assessments of maximum impact and tsunami inundation for use by local communities in education and evacuation map development. Hazard assessment studies conducted for Los Angeles, San Francisco, Crescent City, Hilo, and Apra Harbor are combined with results of tsunami forecast model development at each of seventy-five locations. A complete hazard assessment identifies every possible tsunami variation from a pre-computed propagation database. Study results indicate that the Eastern Aleutian Islands and Alaska are the most likely regions to produce the largest impact on the West Coast of the United States, while the East Philippines and Mariana trench regions impact Apra Harbor, Guam. Hawaii appears to be impacted equally by South America, Alaska and the Kuril Islands.

  7. Slope instability induced by volcano-tectonics as an additional source of hazard in active volcanic areas: the case of Ischia island (Italy)

    Science.gov (United States)

    Della Seta, Marta; Marotta, Enrica; Orsi, Giovanni; de Vita, Sandro; Sansivero, Fabio; Fredi, Paola

    2012-01-01

    Ischia is an active volcanic island in the Gulf of Naples whose history has been dominated by a caldera-forming eruption (ca. 55 ka) and resurgence phenomena that have affected the caldera floor and generated a net uplift of about 900 m since 33 ka. The results of new geomorphological, stratigraphical and textural investigations of the products of gravitational movements triggered by volcano-tectonic events have been combined with information arising from a reinterpretation of historical chronicles on natural phenomena such as earthquakes, ground deformation, gravitational movements and volcanic eruptions. The combined interpretation of all these data shows that gravitational movements, coeval with volcanic activity and with uplift events related to the long-lasting resurgence, have affected the highly fractured marginal portions of the most uplifted Mt. Epomeo blocks. Such movements, mostly occurring since 3 ka, include debris avalanches; large debris flows (lahars); smaller mass movements (rock falls, slumps, debris and rock slides, and small debris flows); and deep-seated gravitational slope deformation. The occurrence of submarine deposits linked with the subaerial deposits of the most voluminous mass movements clearly shows that the debris avalanches reached the sea. The results corroborate the hypothesis that the behaviour of the Ischia volcano reflects an intimate interplay among magmatism, resurgence dynamics, fault generation, seismicity, slope oversteepening and instability, and eruptions. They also highlight that volcano-tectonically triggered mass movements are potentially hazardous phenomena that have to be taken into account in any attempt to assess volcanic and related hazards at Ischia. Furthermore, the largest mass movements could also flow into the sea, generating tsunami waves that could impact the island's coast as well as the neighbouring and densely inhabited coast of the Neapolitan area.

  8. Examining school-based bullying interventions using multilevel discrete time hazard modeling.

    Science.gov (United States)

    Ayers, Stephanie L; Wagaman, M Alex; Geiger, Jennifer Mullins; Bermudez-Parsai, Monica; Hedberg, E C

    2012-10-01

    Although schools have been trying to address bullying by utilizing different approaches that stop or reduce the incidence of bullying, little remains known about what specific intervention strategies are most successful in reducing bullying in the school setting. Using the social-ecological framework, this paper examines school-based disciplinary interventions often used to deliver consequences to deter the reoccurrence of bullying and aggressive behaviors among school-aged children. Data for this study are drawn from the School-Wide Information System (SWIS), with the final analytic sample consisting of 1,221 students in grades K - 12 who received an office disciplinary referral for bullying during the first semester. Using Kaplan-Meier failure functions and multilevel discrete-time hazard models, determinants of the probability of a student receiving a second referral over time were examined. Of the seven interventions tested, only Parent-Teacher Conference (AOR = 0.65, p < .05) significantly reduced the probability of a second referral. Findings underscore the importance of strengthening the connection between the students' mesosystems as well as utilizing disciplinary strategies that take into consideration students' microsystem roles.
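
    A compact sketch of the discrete-time hazard setup in its single-level form (synthetic data; the SWIS variables and the paper's multilevel random effects are omitted): expand each student into person-period rows, then model the per-period probability of a second referral by logistic regression, whose exponentiated coefficients are adjusted odds ratios like the AOR quoted above.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      students = pd.DataFrame({
          "sid": range(200),
          "time": rng.integers(1, 6, 200),     # period of 2nd referral / censoring
          "event": rng.integers(0, 2, 200),    # 1 = second referral observed
          "ptc": rng.integers(0, 2, 200),      # parent-teacher conference flag
      })
      rows = [{"sid": s.sid, "period": t, "ptc": s.ptc,
               "y": int(s.event == 1 and t == s.time)}
              for s in students.itertuples() for t in range(1, s.time + 1)]
      pp = pd.DataFrame(rows)                  # person-period data
      fit = smf.logit("y ~ C(period) + ptc", data=pp).fit(disp=0)
      print(np.exp(fit.params["ptc"]))         # adjusted odds ratio for the intervention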

  9. Modeling of Natural Coastal Hazards in Puerto Rico in Support of Emergency Management and Coastal Planning

    Science.gov (United States)

    Mercado, A., Jr.

    2015-12-01

    The island of Puerto Rico is not only located in the so-called Caribbean hurricane alley, but also in a tsunami-prone region, and both phenomena have affected the island. For the past few years we have undertaken the task of upgrading the available coastal flood maps for storm surges and tsunamis. This has been done taking advantage of new Lidar-derived, high-resolution topography and bathymetry and state-of-the-art models (MOST for tsunamis and ADCIRC/SWAN for storm surges). The tsunami inundation maps have been converted into evacuation maps. For tsunamis, we are also preparing hazard maps for tsunami currents inside ports, bays, and marinas. The storm surge maps include two scenarios of sea level rise: 0.5 and 1.0 m above Mean High Water. All maps have been adopted by the Puerto Rico State Emergency Management Agency, and are publicly available through the Internet. It is the purpose of this presentation to summarize how this has been done, the spin-off applications it has generated, and how we plan to improve coastal flooding predictions.

  10. On the predictive information criteria for model determination in seismic hazard analysis

    Science.gov (United States)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given the parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed the Bayesian analysis of Italian data sets by four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate the results and evaluate model performance by the Bayes factor, predictive information criteria and retrospective predictive analysis.
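
    As a concrete instance of the criteria discussed above, the sketch below computes WAIC (Watanabe, 2010) from a matrix of pointwise log-likelihoods over posterior draws; the draws here are placeholders, not the stress release model posteriors.

      import numpy as np
      from scipy.special import logsumexp

      def waic(loglik):                    # loglik[s, i] = log p(y_i | theta_s)
          S = loglik.shape[0]
          lppd = logsumexp(loglik, axis=0) - np.log(S)   # log pointwise pred. density
          p_waic = loglik.var(axis=0, ddof=1)            # effective no. of parameters
          return -2 * (lppd.sum() - p_waic.sum())

      loglik = -0.5 * np.random.default_rng(3).random((1000, 50))  # placeholder draws
      print(waic(loglik))                  # lower values indicate better prediction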

  11. Simulating floods : On the application of a 2D-hydraulic model for flood hazard and risk assessment

    OpenAIRE

    Alkema, D.

    2007-01-01

    Over the last decades, river floods in Europe seem to occur more frequently and are causing more and more economic and emotional damage. Understanding the processes causing flooding and the development of simulation models to evaluate countermeasures to control that damage are important issues. This study deals with the application of a 2D hydraulic flood propagation model for flood hazard and risk assessment. It focuses on two components: 1) how well it predicts the spatial-dynamic chara...

  12. A framework for modeling clustering in natural hazard catastrophe risk management and the implications for re/insurance loss perspectives

    OpenAIRE

    Khare, S; Bonazzi, A.; C. Mitas; S. Jewson

    2014-01-01

    In this paper, we present a novel framework for modelling clustering in natural hazard risk models. The framework we present is founded on physical principles, where large-scale oscillations in the physical system are the source of non-Poissonian (clustered) frequency behaviour. We focus on a particular mathematical implementation of the "Super-Cluster" methodology that we introduce. This mathematical framework has a number of advantages including tunability...

  13. Numerical Stress Field Modelling: from geophysical observations toward volcano hazard assessment

    Science.gov (United States)

    Currenti, Gilda; Coco, Armando; Privitera, Emanuela

    2015-04-01

    Numerical results show the contribution of groundwater head gradients associated with topographically induced flow and pore-pressure changes, providing a quantitative estimate of deformation and failure of the volcano edifice. The comparison between the predictions of the model and the observations can provide valuable insights into the stress state of the volcano and, hence, into the likelihood of an impending eruption. This innovative approach opens up new perspectives in geodetic inverse modelling and lays the basis for future developments in volcano hazard assessment based on a critical combination of geophysical observations and numerical modelling.

  14. Modelling poverty by not modelling poverty: an application of a simultaneous hazards approach to the UK

    OpenAIRE

    Aassve, Arnstein; Burgess, Simon; Dickson, Matt; Propper, Carol

    2006-01-01

    We pursue an economic approach to analysing poverty. This requires a focus on the variables that individuals can influence, such as forming or dissolving a union or having children. We argue that this indirect approach to modelling poverty is the right way to bring economic tools to bear on the issue. In our implementation of this approach, we focus on endogenous demographic and employment transitions as the driving forces behind changes in poverty. We construct a dataset covering event histo...

  15. Time Series Forecasting by using Seasonal Autoregressive Integrated Moving Average: Subset, Multiplicative or Additive Model

    Directory of Open Access Journals (Sweden)

    Suhartono

    2011-01-01

    Full Text Available Problem statement: Most of the Seasonal Autoregressive Integrated Moving Average (SARIMA) models used for forecasting seasonal time series are multiplicative SARIMA models. These models assume that there is a significant parameter resulting from the multiplication between non-seasonal and seasonal parameters, without testing this by an appropriate statistical test. Moreover, the most popular statistical software, such as MINITAB and SPSS, only has facilities to fit a multiplicative model. The aim of this research is to propose a new procedure for identifying the most appropriate order of SARIMA model, whether it involves subset, multiplicative or additive order. In particular, the study examined whether a multiplicative parameter existed in the SARIMA model. Approach: Theoretical derivations of the Autocorrelation (ACF) and Partial Autocorrelation (PACF) functions of subset, multiplicative and additive SARIMA models were first discussed, and the R program was then used to create graphics of these theoretical ACF and PACF. Then, two monthly datasets were used as case studies, i.e. the international airline passenger data and a series on the number of tourist arrivals to Bali, Indonesia. The model identification step to determine the order of the ARIMA model was done using MINITAB, and the model estimation step used SAS to test whether the model consisted of subset, multiplicative or additive order. Results: The theoretical ACF and PACF showed that subset, multiplicative and additive SARIMA models have different patterns, especially at the lag resulting from the multiplication between non-seasonal and seasonal lags. Modeling of the airline data yielded a subset SARIMA model as the best model, whereas an additive SARIMA model was the best model for forecasting the number of tourist arrivals to Bali. Conclusion: Both case studies showed that a multiplicative SARIMA model was not the best model for forecasting these data. The comparison evaluation showed that subset
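
    The multiplicative-versus-subset contrast can be reproduced in outline with statsmodels (an assumption of ours; the paper itself used MINITAB, SAS and R): the multiplicative SARIMA(0,1,1)(0,1,1)_12 forces the lag-13 MA coefficient to equal the product of the lag-1 and lag-12 coefficients, while the subset model estimates lags 1, 12 and 13 freely.

      import numpy as np
      import statsmodels.api as sm

      data = sm.datasets.get_rdataset("AirPassengers").data   # downloads the airline series
      y = np.log(data["value"])

      mult = sm.tsa.SARIMAX(y, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=0)
      subset = sm.tsa.SARIMAX(y, order=(0, 1, [1, 12, 13]),
                              seasonal_order=(0, 1, 0, 12)).fit(disp=0)
      print(mult.aic, subset.aic)   # lower AIC -> preferred specification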

  16. Flow-R, a model for susceptibility mapping of debris flows and other gravitational hazards at a regional scale

    Directory of Open Access Journals (Sweden)

    P. Horton

    2013-04-01

    Full Text Available The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org) and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and that avoids over-channelization, and thus produces more realistic extents. The choices of the datasets and the algorithms are left open to the user, which makes the model suitable for various applications and dataset availabilities. Amongst the possible datasets, the DEM is the only one that is really needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution as a good compromise between processing time
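
    The base Holmgren (1994) spreading step reads, in sketch form (the modified Flow-R version adds persistence and inertia terms not shown here): each cell distributes susceptibility to its downslope neighbours in proportion to tan(beta)^x.

      import numpy as np

      def holmgren_weights(dem, i, j, x=4.0, cell=10.0):
          """Fraction of flow passed to each downslope neighbour: tan(beta)^x."""
          nbrs, tans = [], []
          for di in (-1, 0, 1):
              for dj in (-1, 0, 1):
                  if di == dj == 0:
                      continue
                  ii, jj = i + di, j + dj
                  if 0 <= ii < dem.shape[0] and 0 <= jj < dem.shape[1]:
                      drop = dem[i, j] - dem[ii, jj]
                      if drop > 0:   # downslope only
                          nbrs.append((ii, jj))
                          tans.append((drop / (cell * np.hypot(di, dj))) ** x)
          w = np.asarray(tans)
          return nbrs, w / w.sum() if w.size else w

      dem = 100.0 - 2.0 * np.add.outer(np.arange(5.0), np.arange(5.0))  # tilted plane
      print(holmgren_weights(dem, 2, 2))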

  17. Numerical modeling of debris avalanches at Nevado de Toluca (Mexico): implications for hazard evaluation and mapping

    Science.gov (United States)

    Grieco, F.; Capra, L.; Groppelli, G.; Norini, G.

    2007-05-01

    The present study concerns the numerical modeling of debris avalanches at Nevado de Toluca volcano (Mexico) using the TITAN2D simulation software, and its application to create hazard maps. Nevado de Toluca is an andesitic to dacitic stratovolcano of Late Pliocene-Holocene age, located in central México near the cities of Toluca and México City; its past activity has endangered an area that today has more than 25 million inhabitants. The present work is based upon data collected during extensive field work aimed at producing the geological map of Nevado de Toluca at 1:25,000 scale. The activity of the volcano developed from 2.6 Ma until 10.5 ka, with both effusive and explosive events; Nevado de Toluca has experienced long phases of inactivity characterized by erosion and the emplacement of debris flow and debris avalanche deposits on its flanks. The largest epiclastic events in the history of the volcano are large debris flows and debris avalanches, which occurred between 1 Ma and 50 ka during a prolonged hiatus in eruptive activity. Other minor events happened mainly during the most recent volcanic activity (less than 50 ka), characterized by magmatic and tectonically induced instability of the summit dome complex. According to the most recent tectonic analysis, the active transtensive kinematics of the E-W Tenango Fault System had a strong influence on the preferential directions of the last three documented lateral collapses, which generated the Arroyo Grande and Zaguàn debris avalanche deposits towards the E and the Nopal debris avalanche deposit towards the W. The analysis of the data collected during the field work made it possible to create a detailed GIS database of the spatial and temporal distribution of debris avalanche deposits on the volcano. Flow models, which were performed with the TITAN2D software developed by GMFG at Buffalo, were entirely based upon the information stored in the geological database. The modeling software is built upon equations

  18. Modelling of Hazards Effect on Safety Integrity of Open Transmission Systems

    OpenAIRE

    Karol Rástočný; Mária Franeková; Peter Holečko; Iveta Zolotová

    2016-01-01

    The paper is concerned with the safety appraisal of safety-related communication systems (SRComSs) with an open transmission system, where, in addition to message transmission integrity, confidentiality is also recommended to be provided. The authors focus on the safety analysis of safety-related message transmission secured using cryptographic and safety-code mechanisms, and on the possibilities of modelling a safety-related industrial communication system where a high safety integrity level SIL3...

  19. Considering the Epistemic Uncertainties of the Variogram Model in Locating Additional Exploratory Drillholes

    Directory of Open Access Journals (Sweden)

    Saeed Soltani

    2015-06-01

    Full Text Available To enhance the certainty of the grade block model, it is necessary to increase the number of exploratory drillholes and collect more data from the deposit. The inputs to the process of locating these additional drillholes include the variogram model parameters, the locations of the samples taken from the initial drillholes, and the geological block model. The uncertainties of these inputs will lead to uncertainties in the optimal locations of additional drillholes. While the locations of the initial data are crisp, the variogram model parameters and the geological model carry uncertainties due to the limited number of initial data. In this paper, an effort has been made to consider the effects of variogram uncertainties on the optimal location of additional drillholes using fuzzy kriging, and to solve the locating problem with the genetic algorithm (GA) optimization method. A bauxite deposit case study has shown the efficiency of the proposed model.

  20. Digital elevation models in the marine domain: investigating the offshore tsunami hazard from submarine landslides

    Science.gov (United States)

    Tappin, David R.

    2015-04-01

    the resolution necessary to identify the hazard from landslides, particularly along convergent margins where this hazard is greatest. Multibeam mapping of the deep seabed requires low-frequency sound sources that, because of their correspondingly low resolution, cannot produce the detail required to identify the finest-scale features. In addition, outside of most countries there are no repeat surveys that allow seabed changes to be identified; perhaps only Japan has such data. In the near future, as research budgets shrink and ship time becomes ever more expensive, new strategies will have to be used to make the best use of the vessels available. Remote AUV technology is almost certainly the answer, and should be increasingly utilised to map the seabed while the mother ship is better used to carry out other duties, such as sampling or seismic data acquisition. In the deep ocean this will have the advantage of acquiring higher-resolution data from high-frequency multibeams. This talk presents a number of projects that show the evolution of the use of MBES in mapping submarine landslides since the PNG tsunami. Data from PNG are presented, together with data from Japan, Hawaii and the NE Atlantic. New multibeam acquisition methodologies are also discussed.

  1. Subduction zone and crustal dynamics of western Washington; a tectonic model for earthquake hazards evaluation

    Science.gov (United States)

    Stanley, Dal; Villaseñor, Antonio; Benz, Harley

    1999-01-01

    The Cascadia subduction zone is extremely complex in the western Washington region, involving local deformation of the subducting Juan de Fuca plate and complicated block structures in the crust. It has been postulated that the Cascadia subduction zone could be the source for a large thrust earthquake, possibly as large as M9.0. Large intraplate earthquakes from within the subducting Juan de Fuca plate beneath the Puget Sound region have accounted for most of the energy release in this century and future such large earthquakes are expected. Added to these possible hazards is clear evidence for strong crustal deformation events in the Puget Sound region near faults such as the Seattle fault, which passes through the southern Seattle metropolitan area. In order to understand the nature of these individual earthquake sources and their possible interrelationship, we have conducted an extensive seismotectonic study of the region. We have employed P-wave velocity models developed using local earthquake tomography as a key tool in this research. Other information utilized includes geological, paleoseismic, gravity, magnetic, magnetotelluric, deformation, seismicity, focal mechanism and geodetic data. Neotectonic concepts were tested and augmented through use of anelastic (creep) deformation models based on thin-plate, finite-element techniques developed by Peter Bird, UCLA. These programs model anelastic strain rate, stress, and velocity fields for given rheological parameters, variable crust and lithosphere thicknesses, heat flow, and elevation. Known faults in western Washington and the main Cascadia subduction thrust were incorporated in the modeling process. Significant results from the velocity models include delineation of a previously studied arch in the subducting Juan de Fuca plate. The axis of the arch is oriented in the direction of current subduction and asymmetrically deformed due to the effects of a northern buttress mapped in the velocity models. This

  2. System Dynamics Model to develop resilience management strategies for lifelines exposed to natural hazards

    Science.gov (United States)

    Pagano, Alessandro; Pluchinotta, Irene; Giordano, Raffaele; Vurro, Michele

    2016-04-01

    Resilience has recently become a key concept and a crucial paradigm in the analysis of the impacts of natural disasters, mainly concerning Lifeline Systems (LS). Indeed, traditional risk management approaches require a precise knowledge of all potential hazards and a full understanding of the interconnections among different infrastructures, based on past events and trend analysis. Nevertheless, due to the inner complexity of LS, their interconnectedness and the dynamic context in which they operate (i.e. technology, economy and society), it is difficult to gain a complete comprehension of the processes influencing vulnerabilities and threats. Resilience thinking therefore addresses the complexities of large integrated systems and the uncertainty of future threats, emphasizing the absorbing, adapting and responsive behaviour of the system. Resilience thinking approaches focus on the capability of the system to deal with the unforeseeable. The increasing awareness of the role played by LS has led governmental agencies and institutions to develop resilience management strategies. Risk-prone areas, such as cities, are highly dependent on infrastructures providing essential services that support societal functions, safety, economic prosperity and quality of life. Among the LS, drinking water supply is critical for supporting citizens during emergency and recovery, since a disruption could have a range of serious societal impacts. A well-known method to assess LS resilience is the TOSE approach. The most interesting feature of this approach is the integration of four dimensions: Technical, Organizational, Social and Economic. These issues all contribute to the resilience level of an infrastructural system, and should therefore be quantitatively assessed. Several studies have underlined that the lack of integration among the different dimensions composing the resilience concept may contribute to a mismanagement of LS in case of natural disasters

  3. Assessment of erosion hazard after recurrence fires with the RUSLE 3D MODEL

    Science.gov (United States)

    Vecín-Arias, Daniel; Palencia, Covadonga; Fernández Raga, María

    2016-04-01

    The objective of this work is to determine whether there is more soil erosion after the recurrence of several forest fires in an area. To that end, an area of 22 130 ha was studied because of its high frequency of fires. This area is located in the northwest of the Iberian Peninsula. The assessment of erosion hazard was calculated at several times using Geographic Information Systems (GIS). The area was divided into several plots according to the number of times they have been burnt in the past 15 years. Given the complexity of making a detailed study of such a large area, and since information is not available annually, it was necessary to select the most relevant moments. In August 2012 the most aggressive and extensive fire of the area occurred. The study therefore focused on the erosion hazard for 2011 and 2014, the dates before and after the 2012 fire for which orthophotos are available. The RUSLE3D model (Revised Universal Soil Loss Equation) was used to calculate erosion loss maps. This model improves the traditional USLE (Wischmeier and Smith, 1965) because it accounts for the influence of concavity/convexity (Renard et al., 1997) and improves the estimation of the slope factor LS (Renard et al., 1991). It is also one of the most commonly used models in the literature (Mitasova et al., 1996; Terranova et al., 2009). The tools used are free and accessible: the GIS "gvSIG" (http://www.gvsig.com/es), with metadata taken from the Spatial Data Infrastructure of Spain webpage (IDEE, 2016). However, the RUSLE model has many critics; some authors suggest that it only serves to carry out comparisons between areas, and not to calculate absolute soil loss. These authors argue that in field measurements the actual recovered eroded soil can amount to about one-third of the values obtained with the model (Šúri et al., 2002). The study of the area shows that the error detected by the critics could come from
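
    An outline of the raster computation, A = R·K·LS·C·P, with the three-dimensional LS factor of Mitasova et al. (1996) that RUSLE3D adopts; all parameter values below are placeholders, not calibrated for the study area.

      import numpy as np

      def ls_3d(upslope_area, slope_rad, m=0.4, n=1.3):
          """LS = (m+1) (A / 22.13)^m (sin(slope) / 0.0896)^n."""
          return (m + 1) * (upslope_area / 22.13) ** m * (np.sin(slope_rad) / 0.0896) ** n

      rng = np.random.default_rng(4)
      area = rng.uniform(10, 500, (100, 100))           # contributing area per unit width (m)
      slope = np.deg2rad(rng.uniform(2, 30, (100, 100)))
      R, K, P = 900.0, 0.03, 1.0                        # rainfall, erodibility, practice
      C_2011, C_2014 = 0.05, 0.35                       # cover factor before/after the 2012 fire
      A_2011 = R * K * ls_3d(area, slope) * C_2011 * P
      A_2014 = R * K * ls_3d(area, slope) * C_2014 * P  # fire raises C, hence soil loss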

  5. The influence of dispersing additive on the paraffin crystallization in model systems

    Science.gov (United States)

    Gorshkov, A. M.; Tien Thang, Pham; Shishmina, L. V.; Chekantseva, L. V.

    2015-11-01

    This work investigates the influence of a dispersing additive on paraffin crystallization in model systems. A new method to determine the paraffin saturation point of transparent solutions, based on the phenomenon of light scattering, has been proposed. A linear relationship between the critical micelle concentration of the additive and the quantity of paraffin in solution has been obtained. The influence of the model system composition on paraffin crystallization has also been studied.

  6. Regularization for Generalized Additive Mixed Models by Likelihood-Based Boosting

    OpenAIRE

    Groll, Andreas; Tutz, Gerhard

    2012-01-01

    With the emergence of semi- and nonparametric regression, the generalized linear mixed model has been expanded to account for additive predictors. In the present paper an approach to variable selection is proposed that works for generalized additive mixed models. In contrast to common procedures, it can be used in high-dimensional settings where many covariates are available and the form of the influence is unknown. It is constructed as a componentwise boosting method and hence is able to pe...
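
    A stripped-down sketch of componentwise L2-boosting with linear base learners (the paper's method is likelihood-based and also handles smooth terms and random effects, none of which is reproduced here): at each step, only the covariate that best fits the current residuals is updated, which yields implicit variable selection.

      import numpy as np

      rng = np.random.default_rng(5)
      n, p = 200, 50
      X = rng.standard_normal((n, p))
      y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

      nu, steps = 0.1, 200                       # learning rate, boosting iterations
      coef, f = np.zeros(p), np.zeros(n)
      for _ in range(steps):
          r = y - f                              # current residuals
          b = X.T @ r / (X ** 2).sum(axis=0)     # per-covariate OLS slopes
          sse = ((r[:, None] - X * b) ** 2).sum(axis=0)
          j = int(np.argmin(sse))                # best-fitting component
          coef[j] += nu * b[j]
          f += nu * b[j] * X[:, j]
      print(np.nonzero(coef)[0][:10])            # selected covariates (should include 0, 1)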

  7. Degree of multicollinearity and variables involved in linear dependence in additive-dominant models

    Directory of Open Access Journals (Sweden)

    Juliana Petrini

    2012-12-01

    Full Text Available The objective of this work was to assess the degree of multicollinearity and to identify the variables involved in linear dependence relations in additive-dominant models. Data of birth weight (n=141,567), yearling weight (n=58,124), and scrotal circumference (n=20,371) of Montana Tropical composite cattle were used. Diagnosis of multicollinearity was based on the variance inflation factor (VIF) and on the evaluation of the condition indexes and eigenvalues from the correlation matrix among explanatory variables. The first model studied (RM) included the fixed effect of dam age class at calving and the covariates associated to the direct and maternal additive and non-additive effects. The second model (R) included all the effects of the RM model except the maternal additive effects. Multicollinearity was detected in both models for all traits considered, with VIF values of 1.03 - 70.20 for RM and 1.03 - 60.70 for R. Collinearity increased with the increase of variables in the model and the decrease in the number of observations, and it was classified as weak, with condition index values between 10.00 and 26.77. In general, the variables associated with additive and non-additive effects were involved in multicollinearity, partially due to the natural connection between these covariables as fractions of the biological types in breed composition.
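
    The two diagnostics named above can be sketched as follows (synthetic design matrix; the paper's covariates are fractions of breed composition): variance inflation factors per column, and condition indexes from the eigenvalues of the correlation matrix.

      import numpy as np
      from statsmodels.stats.outliers_influence import variance_inflation_factor

      rng = np.random.default_rng(6)
      Z = rng.standard_normal((500, 4))
      # third column nearly duplicates the first -> induced collinearity
      X = np.column_stack([Z[:, 0], Z[:, 1], 0.95 * Z[:, 0] + 0.05 * Z[:, 2], Z[:, 3]])

      vifs = [variance_inflation_factor(X, i) for i in range(X.shape[1])]
      eig = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))
      cond_index = np.sqrt(eig.max() / eig)   # ~10-30 often read as weak-to-moderate
      print(np.round(vifs, 1), np.round(cond_index, 1))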

  8. A framework for modeling clustering in natural hazard catastrophe risk management and the implications for re/insurance loss perspectives

    Science.gov (United States)

    Khare, S.; Bonazzi, A.; Mitas, C.; Jewson, S.

    2014-08-01

    In this paper, we present a novel framework for modelling clustering in natural hazard risk models. The framework we present is founded on physical principles, where large-scale oscillations in the physical system are the source of non-Poissonian (clustered) frequency behaviour. We focus on a particular mathematical implementation of the "Super-Cluster" methodology that we introduce. This mathematical framework has a number of advantages including tunability to the problem at hand, as well as the ability to model cross-event correlation. Using European windstorm data as an example, we provide evidence that historical data show strong evidence of clustering. We then develop Poisson and clustered simulation models for the data, demonstrating clearly the superiority of the clustered model, which we have implemented using the Poisson-mixtures approach. We then discuss the implications of including clustering in models of prices of catXL contracts, one of the most commonly used mechanisms for transferring risk between primary insurers and reinsurers. This paper provides a number of new insights into the impact clustering has on modelled catXL contract prices. The simple model presented in this paper provides an insightful starting point for practitioners of natural hazard risk modelling.
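
    A minimal illustration of the Poisson-mixtures idea (invented rates, not the fitted windstorm model): letting the annual rate switch with a latent large-scale state inflates the variance-to-mean ratio above the Poisson value of one, which is the clustering signature discussed above.

      import numpy as np

      rng = np.random.default_rng(7)
      years = 100_000
      state = rng.random(years) < 0.3        # 30% of years in the "active" state
      lam = np.where(state, 9.0, 3.0)        # state-dependent annual storm rate
      counts = rng.poisson(lam)

      mean, var = counts.mean(), counts.var()
      print(mean, var, var / mean)           # variance/mean > 1 => clustered counts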

  9. A framework for modeling clustering in natural hazard catastrophe risk management and the implications for re/insurance loss perspectives

    Directory of Open Access Journals (Sweden)

    S. Khare

    2014-08-01

    Full Text Available In this paper, we present a novel framework for modelling clustering in natural hazard risk models. The framework we present is founded on physical principles, where large-scale oscillations in the physical system are the source of non-Poissonian (clustered) frequency behaviour. We focus on a particular mathematical implementation of the "Super-Cluster" methodology that we introduce. This mathematical framework has a number of advantages including tunability to the problem at hand, as well as the ability to model cross-event correlation. Using European windstorm data as an example, we provide evidence that historical data show strong evidence of clustering. We then develop Poisson and clustered simulation models for the data, demonstrating clearly the superiority of the clustered model, which we have implemented using the Poisson-mixtures approach. We then discuss the implications of including clustering in models of prices of catXL contracts, one of the most commonly used mechanisms for transferring risk between primary insurers and reinsurers. This paper provides a number of new insights into the impact clustering has on modelled catXL contract prices. The simple model presented in this paper provides an insightful starting point for practitioners of natural hazard risk modelling.

  10. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    OpenAIRE

    Álvarez-Gómez, José Antonio; Aniel-Quiroga Zorrilla, Íñigo; Gutiérrez Gutiérrez, Omar Quetzalcóatl; Larreynaga Murcia, Jeniffer; González Rodríguez, Ernesto Mauricio; M. Castro; Gavidia Medina, Francisco; Aguirre Ayerbe, Ignacio; González-Riancho Calzada, Pino; Carreño Herrero, Emilio

    2013-01-01

    ABSTRACT. El Salvador is the smallest and most densely populated country in Central America; its coast has an approximate length of 320 km, 29 municipalities and more than 700,000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and resulting in hundreds of victims. Hazard assessment is commonly based on numerical propagation models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic m...

  11. Using a fire propagation model to assess the efficiency of prescribed burning in reducing the fire hazard

    OpenAIRE

    Cassagne, Nathalie; Pimont, François; Dupuy, Jean-Luc; Linn, Rodman R.; Marell, Anders; Oliveri, Chloe; Rigolot, Eric

    2011-01-01

    We examined how fire hazard was affected by prescribed burning and fuel recovery over the first six years following treatment. Eight common Mediterranean fuel complexes managed by means of prescribed burning in limestone Provence (South-Eastern France) were studied, illustrating forest and woodland, garrigue and grassland situations. The coupled atmosphere-wildfire behaviour model FIRETEC was used to simulate fire behaviour (ROS, intensity) in these complex vegetations. The temporal threshold...

  12. The chaos and control of a food chain model supplying additional food to top-predator

    International Nuclear Information System (INIS)

    Highlights: • We propose a chaotic food chain model supplying additional food to the top predator. • Local and global stability conditions are derived in the presence of additional food. • Chaos is controlled only by increasing the quantity of additional food. • The system enters a periodic region and exhibits Hopf bifurcations when additional food is supplied. • This is an application of non-chemical methods for controlling chaos. -- Abstract: The control and management of chaotic populations is one of the main objectives in constructing mathematical models in ecology today. In this paper, we apply a technique for controlling chaotic predator–prey population dynamics by supplying additional food to the top predator. We formulate a three-species predator–prey model supplying additional food to the top predator. Existence conditions and local stability criteria of the equilibrium points are determined analytically. Persistence conditions for the system are derived. Global stability conditions for the interior equilibrium point are calculated. Theoretical results are verified through numerical simulations. Phase diagrams are presented for various qualities and quantities of additional food. One-parameter bifurcation analysis is done with respect to the quality and quantity of additional food separately, keeping the other fixed. Using the MATCONT package, we derive the bifurcation scenarios when the quality and quantity of additional food vary together. We predict the existence of a Hopf point (H), a limit point (LP) and a branch point (BP) in the model for suitable supplies of additional food. We have computed the regions of different dynamical behaviour in the quantity–quality parametric plane. From our study we conclude that the chaotic population dynamics of a predator–prey system can be controlled to obtain regular population dynamics only by supplying additional food to the top predator. This study is aimed at introducing a new non-chemical chaos control mechanism in a predator–prey system with the
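
    For a runnable flavour of such dynamics, the sketch below integrates a Hastings-Powell-type three-species chain in which additional food of quantity A (with a quality-like weight eta) enters the top predator's Holling type II response; this is our generic stand-in with textbook parameters, not the paper's exact formulation.

      import numpy as np
      from scipy.integrate import solve_ivp

      def chain(t, u, A=0.0, eta=1.0):
          x, y, z = u
          a1, b1, a2, b2, d1, d2 = 5.0, 3.0, 0.1, 2.0, 0.4, 0.01
          hold = a1 * x * y / (1 + b1 * x)                       # prey -> predator
          denom = 1 + b2 * y + b2 * eta * A
          dx = x * (1 - x) - hold
          dy = hold - a2 * y * z / denom - d1 * y                # predation on y diluted by A
          dz = a2 * (y + eta * A) * z / denom - d2 * z           # extra food feeds z
          return [dx, dy, dz]

      base = solve_ivp(chain, (0, 5000), [0.8, 0.2, 8.0], args=(0.0,), max_step=0.5)
      fed = solve_ivp(chain, (0, 5000), [0.8, 0.2, 8.0], args=(0.05,), max_step=0.5)
      # compare late-time trajectories: chaotic attractor vs. regularized dynamics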

  13. Model of quasi-ideal cascade with an additional feed flow and losses of working substances

    International Nuclear Information System (INIS)

    A mathematical model for the quasi-ideal cascade with an additional feed flow and losses of working substances was established. Analytical relations to calculate the relative product and waste flows, the component concentrations in the product and waste flows, and the total substance flow in this cascade model were obtained by solving the cascade equations. Cascade calculations were performed for separation of recycled uranium. The effects of the loss factor and of the ratio between the base and additional flows on the product concentration were analyzed for a cascade in which natural uranium was fed as the base feed flow and recycled uranium as the additional one. (authors)
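
    As a reading aid, the overall material balance underlying any such two-feed cascade can be sketched in a few lines. Losses are neglected here and all flows and assays are illustrative; the paper's model additionally tracks losses of working substance stage by stage.

    # Minimal sketch of a two-feed cascade's overall balance (losses neglected).
    def cascade_balance(F1, x1, F2, x2, xP, xW):
        """Return product and waste flows P, W for a two-feed cascade.

        F1, F2 : base and additional feed flows
        x1, x2 : their assays (mole fraction of the desired isotope)
        xP, xW : target product and waste assays
        """
        F, Fx = F1 + F2, F1 * x1 + F2 * x2
        P = (Fx - F * xW) / (xP - xW)   # component balance solved for P
        W = F - P                        # total balance
        return P, W

    # Natural-uranium base feed plus a recycled-uranium additional feed.
    P, W = cascade_balance(F1=1.0, x1=0.00711, F2=0.2, x2=0.0090,
                           xP=0.045, xW=0.0025)
    print(f"product {P:.4f}, waste {W:.4f} (per unit base feed flow)")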

  14. ADDITIVE-MULTIPLICATIVE MODEL FOR RISK ESTIMATION IN THE PRODUCTION OF ROCKET AND SPACE TECHNICS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2014-10-01

    For the first time we have developed a general additive-multiplicative model of risk estimation (to estimate the probabilities of risk events). In the two-level system, the risk estimates are combined additively at the lower level and multiplicatively at the top. The additive-multiplicative model was used for risk estimation for (1) the implementation of innovative projects at universities (with external partners), (2) the production of new innovative products, and (3) projects for the creation of rocket and space equipment.
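
    One plausible reading of the two-level scheme, as a hedged sketch (not Orlov's exact formulas): elementary risk estimates are summed within each group at the lower level, and the group results are combined multiplicatively at the top.

    # Sketch of a two-level additive-multiplicative risk aggregation.
    def combined_risk(groups):
        """groups: list of lists of elementary risk probabilities in [0, 1]."""
        total = 1.0
        for g in groups:
            p_group = min(sum(g), 1.0)   # additive aggregation, capped at 1
            total *= (1.0 - p_group)     # multiplicative aggregation on top
        return 1.0 - total               # probability that some risk event occurs

    # Three risk groups of an innovation project (illustrative numbers).
    print(combined_risk([[0.02, 0.05], [0.10], [0.01, 0.03, 0.04]]))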

  15. Characterizing the danger of in-channel river hazards using LIDAR and a 2D hydrodynamic model

    Science.gov (United States)

    Strom, M. A.; Pasternack, G. B.

    2014-12-01

    Despite many injuries and deaths each year worldwide, no analytically rigorous attempt exists to characterize and quantify the dangers to boaters, swimmers, fishermen, and other river enthusiasts. While designed by expert boaters, the International Scale of River Difficulty provides a whitewater classification that uses qualitative descriptions and subjective scoring. The purpose of this study was to develop an objective characterization of in-channel hazard dangers across spatial scales from a single boulder to an entire river segment for application over a wide range of discharges and use in natural hazard assessment and mitigation, recreational boating safety, and river science. A process-based conceptualization of river hazards was developed, and algorithms were programmed in R to quantify the associated dangers. Danger indicators included the passage proximity and reaction time posed to boats and swimmers in a river by three hazards: emergent rocks, submerged rocks, and hydraulic jumps or holes. The testbed river was a 12.2 km mixed bedrock-alluvial section of the upper South Yuba River between Lake Spaulding and Washington, CA in the Sierra Mountains. The segment has a mean slope of 1.63%, with 8 reaches varying from 1.07% to 3.30% slope and several waterfalls. Data inputs to the hazard analysis included sub-decimeter aerial color imagery, airborne LIDAR of the river corridor, bathymetric data, flow inputs, and a stage-discharge relation for the end of the river segment. A key derived data product was the location and configuration of boulders and boulder clusters as these were potential hazards. Two-dimensional hydrodynamic modeling was used to obtain the meter-scale spatial pattern of depth and velocity at discharges ranging from baseflow to modest flood stages. Results were produced for four discharges and included the meter-scale spatial pattern of the passage proximity and reaction time dangers for each of the three hazards investigated. These results
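
    A toy sketch of the two danger indicators named above. The formulas are our plain reading (reaction time as approach distance over velocity; passage proximity as the clear width past an obstacle), not the authors' R code.

    # Illustrative danger indicators for an in-channel hazard.
    def reaction_time(dist_upstream_m, velocity_ms):
        """Seconds available to maneuver before reaching the hazard."""
        return dist_upstream_m / max(velocity_ms, 0.01)

    def passage_proximity(channel_width_m, obstacle_width_m):
        """Clear width (m) available to pass an emergent rock."""
        return max(channel_width_m - obstacle_width_m, 0.0)

    # Faster water means less time to react at the same sight distance.
    for v in (0.5, 1.5, 3.0):
        print(f"v = {v:.1f} m/s -> reaction time {reaction_time(20.0, v):5.1f} s, "
              f"clear passage {passage_proximity(8.0, 2.5):.1f} m")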

  16. Introducing Geoscience Students to Numerical Modeling of Volcanic Hazards: The example of Tephra2 on VHub.org

    Directory of Open Access Journals (Sweden)

    Leah M. Courtland

    2012-07-01

    The Tephra2 numerical model for tephra fallout from explosive volcanic eruptions is specifically designed to enable students to probe ideas in model literacy, including code validation and verification, the role of simplifying assumptions, and the concepts of uncertainty and forecasting. This numerical model is implemented on the VHub.org website, a venture in cyberinfrastructure that brings together volcanological models and educational materials. The VHub.org resource provides students with the ability to explore and execute sophisticated numerical models like Tephra2. We present a strategy for using this model to introduce university students to key concepts in the use and evaluation of Tephra2 for probabilistic forecasting of volcanic hazards. Through this critical examination students are encouraged to develop a deeper understanding of the applicability and limitations of hazard models. Although the model and applications are intended for use in both introductory and advanced geoscience courses, they could easily be adapted to work in other disciplines, such as astronomy, physics, computational methods, data analysis, or computer science.

  17. A hydro-sedimentary modelling system for flash flood propagation and hazard estimation under different agricultural practices

    Directory of Open Access Journals (Sweden)

    N. N. Kourgialas

    2013-10-01

    A modelling system for the estimation of flash flood flow characteristics and sediment transport is developed in this study. The system comprises three components: (a) a modelling framework based on the hydrological model HSPF, (b) the hydrodynamic module of the hydraulic model MIKE 11 (quasi-2-D), and (c) the advection-dispersion module of MIKE 11 as a sediment transport model. An important parameter in hydraulic modelling is Manning's coefficient, an indicator of channel resistance which is directly dependent on riparian vegetation changes. The effect of riparian vegetation on flood propagation parameters such as water depth (inundation), discharge, flow velocity, and sediment transport load is investigated in this study. Based on the obtained results, when the weed cutting percentage is increased, the flood wave depth decreases while flow discharge, velocity and sediment transport load increase. The proposed modelling system is used to evaluate and illustrate the flood hazard for different riparian vegetation cutting scenarios. For the estimation of flood hazard, a combination of the flood propagation characteristics of water depth, flow velocity and sediment load was used. Next, an optimal selection of the most appropriate agricultural cutting practices for riparian vegetation was performed. Ultimately, the model results obtained for the different agricultural cutting practice scenarios can be employed to create flood protection measures for flood-prone areas. The proposed methodology was applied to the downstream part of a small Mediterranean river basin in Crete, Greece.
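
    The role Manning's coefficient plays here can be illustrated with the standard Manning equation; the roughness values for uncut versus cut vegetation below are invented for illustration, not taken from the study.

    # Manning's equation V = (1/n) * R**(2/3) * S**(1/2): lower roughness n
    # (more weed cutting) yields higher velocity and discharge.
    def manning_velocity(n, R, S):
        """Mean velocity (m/s) for roughness n, hydraulic radius R (m), slope S."""
        return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

    R, S, area = 1.2, 0.002, 15.0   # hydraulic radius, slope, wetted area (m^2)
    for label, n in (("uncut riparian vegetation", 0.10),
                     ("50% weed cutting", 0.06),
                     ("full weed cutting", 0.035)):
        V = manning_velocity(n, R, S)
        print(f"{label:28s} n={n:.3f}  V={V:.2f} m/s  Q={V*area:.1f} m^3/s")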

  18. Robust Estimation of Mean and Dispersion Functions in Extended Generalized Additive Models

    NARCIS (Netherlands)

    Croux, C.; Gijbels, I.; Prosdocimi, I.

    2010-01-01

    Generalized Linear Models are a widely used method to obtain parametric estimates for the mean function. They have been further extended to allow the relationship between the mean function and the covariates to be more flexible via Generalized Additive Models. However the fixed variance structure…

  19. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    Science.gov (United States)

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  20. Modelling risk in high hazard operations: integrating technical, organisational and cultural factors

    NARCIS (Netherlands)

    Ale, B.J.M.; Hanea, D.M.; Sillem, S.; Lin, P.H.; Van Gulijk, C.; Hudson, P.T.W.

    2012-01-01

    Recent disasters in high hazard industries such as Oil and Gas Exploration (The Deepwater Horizon) and Petrochemical production (Texas City) have been found to have causes that range from direct technical failures through organizational shortcomings right up to weak regulation and inappropriate comp

  1. Assessing End-Of-Supply Risk of Spare Parts Using the Proportional Hazard Model

    NARCIS (Netherlands)

    X. Li (Xishu); R. Dekker (Rommert); C. Heij (Christiaan); M. Hekimoğlu (Mustafa)

    2016-01-01

    Operators of long field-life systems like airplanes are faced with hazards in the supply of spare parts. If the original manufacturers or suppliers of parts end their supply, this may have large impacts on the operating costs of firms needing these parts. Existing end-of-supply evaluation methods…

  2. Conveying Flood Hazard Risk Through Spatial Modeling: A Case Study for Hurricane Sandy-Affected Communities in Northern New Jersey

    Science.gov (United States)

    Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko

    2016-10-01

    The accurate forecast of the Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exists at the community level to translate these forecasts into flood hazard levels on the ground at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be effectively used to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers (elevation and slope derived from LiDAR data, and distances from streams and catch basins derived from aerial photography and field reconnaissance) was used to create a spatial model that explained 55 % of the extent and depth of the flood during Hurricane Sandy. When a ponding layer was added to the previous model to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flood. The study concludes that fairly accurate maps can be created with readily available information and that it is possible to infer a great deal about the risk of inundation at the property level from flood hazard maps. The study goes on to conclude that local communities are encouraged to prepare for disasters, but in reality, because of the existing Federal emergency management framework, there is very little incentive to do so.
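
    A minimal sketch of the four-layer Boolean overlay described above, with invented thresholds and random stand-in rasters rather than the study's LiDAR and field data:

    import numpy as np

    rng = np.random.default_rng(0)
    elev = rng.uniform(0, 10, (100, 100))       # m above sea level (from LiDAR)
    slope = rng.uniform(0, 15, (100, 100))      # degrees (from LiDAR)
    d_stream = rng.uniform(0, 500, (100, 100))  # m to nearest stream
    d_basin = rng.uniform(0, 300, (100, 100))   # m to nearest catch basin

    flood_prone = ((elev < 2.0) &        # low-lying ground
                   (slope < 3.0) &       # flat enough for water to pond
                   ((d_stream < 100.0) | (d_basin < 50.0)))  # near drainage

    print(f"{flood_prone.mean():.1%} of cells flagged as flood hazard")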

  3. Between and beyond additivity and non-additivity; the statistical modelling of genotype by environment interaction in plant breeding.

    NARCIS (Netherlands)

    Eeuwijk, van F.A.

    1996-01-01

    In plant breeding it is a common observation to see genotypes react differently to environmental changes. This phenomenon is called genotype by environment interaction. Many statistical approaches for analysing genotype by environment interaction rely heavily on the analysis of variance model. Genotype…

  4. Incorporating descriptive metadata into seismic source zone models for seismic-hazard assessment : a case study of the Azores-West Iberian Region

    OpenAIRE

    Vilanova, S. P.; Nemser, E.S.; Besana-Ostman, G.M.; Bezzeghoud, M.; Borges, J. F.; Brum da Silveira, A.; Cabral, J.; Carvalho, J.; Cunha, P. P.; R.P. Dias; J. Madeira; Lopes, F.C.; C. S. Oliveira; Perea, H.; García‐Mayordomo, J.

    2014-01-01

    In probabilistic seismic-hazard analysis (PSHA), seismic source zone (SSZ) models are widely used to account for the contribution to the hazard from earthquakes not directly correlated with geological structures. Notwithstanding the impact of SSZ models in PSHA, the theoretical framework underlying SSZ models and the criteria used to delineate the SSZs are seldom explicitly stated and suitably documented. In this paper, we propose a methodological framework to develop and document SSZ m...

  5. [Proportional hazards model of birth intervals among marriage cohorts since the 1960s].

    Science.gov (United States)

    Otani, K

    1987-01-01

    With a view to investigating the possibility of an attitudinal change towards the timing of 1st and 2nd births, proportional hazards model analysis of the 1st and 2nd birth intervals and univariate life table analysis were both carried out. Results showed that love matches and conjugal families immediately after marriage are accompanied by a longer 1st birth interval than others, even after controlling for other independent variables. Marriage cohort analysis also shows a net effect on the relative risk of having a 1st birth. Marriage cohorts since the mid-1960s demonstrate a shorter 1st birth interval than the 1961-63 cohort. With regard to the 2nd birth interval, longer 1st birth intervals, arranged marriages, conjugal families immediately following marriage, and higher ages at 1st marriage of women tended to provoke a longer 2nd birth interval. There is no interaction between the 1st birth interval and marriage cohort. Once other independent variables were controlled, with the exception of the marriage cohorts of the early 1970s, the authors found no effect of marriage cohort on the relative risk of having a 2nd birth. This suggests that an attitudinal change towards the timing of births in this period was mainly restricted to that of a 1st birth. Fluctuations in the 2nd birth interval during the 1970-72 marriage cohort were scrutinized in detail. As a result, the authors found that conjugal families after marriage, wives with low educational status, women with husbands in white collar professions, women with white collar fathers, and wives with high age at 1st marriage who married during 1970-72 and had a 1st birth interval during 1972-74 suffered most from the pronounced rise in the 2nd birth interval. This might be due to the relatively high sensitivity to a change in socioeconomic status; the oil crisis occurring around the time of marriage and 1st birth induced a delay in the 2nd birth. The unanimous decrease in the 2nd birth interval among the 1973
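
    For readers who want to reproduce this style of analysis, a minimal proportional hazards fit can be set up with the Python lifelines package (assumed available); the column names and the eight records below are invented, not the study's data.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "interval_months": [14, 22, 9, 30, 18, 40, 12, 26],  # marriage -> 1st birth
        "birth_observed":  [1, 1, 1, 0, 1, 0, 1, 1],         # 0 = censored
        "love_match":      [1, 0, 1, 0, 1, 0, 1, 0],
        "conjugal_family": [1, 1, 0, 0, 1, 1, 0, 0],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="interval_months", event_col="birth_observed")
    cph.print_summary()   # hazard ratios: HR < 1 means a longer birth interval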

  6. A Fault-based Crustal Deformation Model for UCERF3 and Its Implication to Seismic Hazard Analysis

    Science.gov (United States)

    Zeng, Y.; Shen, Z.

    2012-12-01

    …shear zone and northern Walker Lane. This implies a significant increase in seismic hazard in eastern California and the northern Walker Lane region, but decreased seismic hazard in the southern San Andreas area, relative to the current model used in the USGS 2008 seismic hazard map evaluation. Overall the geodetic model suggests an increase in total regional moment rate of 24% compared with the UCERF2 model and the 150-yr California earthquake catalog. However, not all of the increase is seismic, so the seismic/aseismic slip rate ratios are critical for future seismic hazard assessment.

  7. GIS-Based Spatial Analysis and Modeling for Landslide Hazard Assessment: A Case Study in Upper Minjiang River Basin

    Institute of Scientific and Technical Information of China (English)

    FENG Wenlan; ZHOU Qigang; ZHANG Baolei; ZHOU Wancun; LI Ainong; ZHANG Haizhen; XIAN Wei

    2006-01-01

    By analyzing the topographic features of past landslides since the 1980s and the main land-cover types (including change information) in the landslide-prone area, the spatial distribution of landslide hazard in the upper Minjiang River Basin was modeled using GIS-based spatial analysis in this paper. Results of the GIS analysis showed that landslide occurrence in this region is closely related to topographic features. Most areas with high hazard probability were deeply sheared gorges. Most of the investigated landslides occurred in areas with elevations lower than 3,000 m, due to fragile topographic conditions and intensive human disturbance. Land-cover type, including its change information, was likely an important environmental factor in triggering landslides. Destruction of vegetation, driven by population growth and its demands, increased the probability of landslides on steep slopes.

  8. Debris flow hazard assessment by integrated modeling of landslide triggering and propagation: application to the Messina Province, Italy

    Science.gov (United States)

    Stancanelli, L. M.; Peres, D. J.; Cavallaro, L.; Cancelliere, A.; Foti, E.

    2014-12-01

    In recent decades an increase in catastrophic debris flow events has been recorded across the Italian territory, mainly due to the increment of settlements and human activities in mountain areas. Considering the large extent of debris flow prone areas, non-structural protection strategies should preferably be implemented because of the economic constraints associated with structural mitigation measures. In such a framework, hazard assessment methodologies play a key role, representing useful tools for the development of emergency management policies. The aim of the present study is to apply an integrated debris flow hazard assessment methodology in which rainfall probabilistic analysis and physically based landslide triggering and propagation models are combined. In particular, the probabilistic rainfall analysis provides the forcing scenarios of different return periods, which are then used as input to a model based on a combination of the USGS TRIGRS and the FLO-2D codes. The TRIGRS model (Baum et al., 2008; 2010), developed for analyzing shallow landslide triggering, is based on an analytical solution of linearized forms of the Richards' infiltration equation and an infinite-slope stability calculation to estimate the timing and locations of slope failures, while FLO-2D (O'Brien 1986) is a two-dimensional finite difference model that simulates debris flow propagation following a mono-phase approach, based on the empirical quadratic rheological relation developed by O'Brien and Julien (1985). Various aspects of the combination of the models are analyzed, with a particular focus on the possible variations of triggered amounts compatible with a given return period. The methodology is applied to the case study area of the Messina Province in Italy, which has recently been struck by severe events, such as that of 1 October 2009, which hit the village of Giampilieri causing 37 fatalities. Results are analyzed to assess the potential hazard that may affect the densely
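
    The infinite-slope stability calculation mentioned above has a standard textbook form; the following sketch evaluates it for illustrative soil parameters, not the Messina case study values.

    # Infinite-slope factor of safety (standard form used by TRIGRS-like models):
    # FS = [c' + (gamma*z*cos^2(b) - u) * tan(phi')] / (gamma*z*sin(b)*cos(b))
    import numpy as np

    def factor_of_safety(c, phi_deg, gamma, z, beta_deg, u):
        """c: cohesion (Pa), phi: friction angle, gamma: unit weight (N/m^3),
        z: failure depth (m), beta: slope angle, u: pore pressure (Pa)."""
        beta, phi = np.radians(beta_deg), np.radians(phi_deg)
        resisting = c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
        driving = gamma * z * np.sin(beta) * np.cos(beta)
        return resisting / driving

    # Rising pore pressure during a rainfall scenario drives FS below 1.
    for u in (0.0, 5e3, 10e3):
        fs = factor_of_safety(c=2e3, phi_deg=32, gamma=19e3, z=1.5,
                              beta_deg=35, u=u)
        print(f"pore pressure {u/1e3:4.0f} kPa -> FS = {fs:.2f}")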

  9. Modeling Flood Hazard Zones at the Sub-District Level with the Rational Model Integrated with GIS and Remote Sensing Approaches

    Directory of Open Access Journals (Sweden)

    Daniel Asare-Kyei

    2015-07-01

    Robust risk assessment requires accurate flood intensity area mapping to allow for the identification of populations and elements at risk. However, available flood maps in West Africa lack spatial variability, while global datasets have resolutions too coarse to be relevant for local scale risk assessment. Consequently, local disaster managers are forced to use traditional methods such as watermarks on buildings and media reports to identify flood hazard areas. In this study, remote sensing and Geographic Information System (GIS) techniques were combined with hydrological and statistical models to delineate the spatial limits of flood hazard zones in selected communities in Ghana, Burkina Faso and Benin. The approach involves estimating peak runoff concentrations at different elevations and then applying statistical methods to develop a Flood Hazard Index (FHI). Results show that about half of the study areas fall into high intensity flood zones. Empirical validation using a statistical confusion matrix and the principles of Participatory GIS shows that flood hazard areas could be mapped at an accuracy ranging from 77% to 81%. This was supported by local expert knowledge, which accurately classified 79% of communities deemed to be highly susceptible to flood hazard. The results will assist disaster managers to reduce the risk of flood disasters at the community level, where risk outcomes first materialize.
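
    A compact sketch of the chain from rational-method peak flow to a normalised Flood Hazard Index; the runoff coefficients, rainfall intensity and sub-area sizes are invented for illustration.

    # Rational method: Q (m^3/s) = 0.278 * C * i (mm/h) * A (km^2),
    # then a min-max normalisation across sub-areas as a simple FHI.
    def peak_flow(C, i_mm_per_h, A_km2):
        return 0.278 * C * i_mm_per_h * A_km2

    subareas = {"A": (0.6, 80, 2.1), "B": (0.3, 80, 5.4), "C": (0.8, 80, 1.2)}
    Q = {k: peak_flow(*v) for k, v in subareas.items()}
    qmin, qmax = min(Q.values()), max(Q.values())
    FHI = {k: (q - qmin) / (qmax - qmin) for k, q in Q.items()}  # 0 = low, 1 = high
    for k in subareas:
        print(f"zone {k}: Q = {Q[k]:6.1f} m^3/s, FHI = {FHI[k]:.2f}")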

  10. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    Science.gov (United States)

    Ross, G.

    2015-12-01

    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process where each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, which captures the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions, and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, which is a modern class of Bayesian prior distribution which allows for efficient inference over an infinite dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as being either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
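
    For reference, the standard parametric ETAS conditional intensity that the nonparametric version generalises can be written in a few lines; the parameter values below are illustrative, not fitted.

    # lambda(t) = mu + sum_{t_i < t} k * exp(alpha*(m_i - M0)) * (t - t_i + c)**(-p)
    import numpy as np

    def etas_intensity(t, times, mags, mu=0.2, k=0.05, alpha=1.0,
                       c=0.01, p=1.1, M0=3.0):
        times, mags = np.asarray(times), np.asarray(mags)
        past = times < t
        trig = k * np.exp(alpha * (mags[past] - M0)) * (t - times[past] + c) ** (-p)
        return mu + trig.sum()   # events per day

    # A magnitude 6 mainshock at t = 10 days raises the intensity sharply.
    times, mags = [10.0], [6.0]
    for t in (9.9, 10.1, 11.0, 30.0):
        print(f"t = {t:5.1f} d: lambda = {etas_intensity(t, times, mags):.3f}")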

  11. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Antoun, T; Harris, D; Lay, T; Myers, S C; Pasyanos, M E; Richards, P; Rodgers, A J; Walter, W R; Zucca, J J

    2008-02-11

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes a path by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas.

  12. Structured Additive Regression Models: An R Interface to BayesX

    Directory of Open Access Journals (Sweden)

    Nikolaus Umlauf

    2015-02-01

    Structured additive regression (STAR) models provide a flexible framework for modeling possible nonlinear effects of covariates: they contain the well-established frameworks of generalized linear models and generalized additive models as special cases but also allow a wider class of effects, e.g., for geographical or spatio-temporal data, allowing for the specification of complex and realistic models. BayesX is a standalone software package for fitting a general class of STAR models. Based on a comprehensive open-source regression toolbox written in C++, BayesX uses Bayesian inference for estimating STAR models based on Markov chain Monte Carlo simulation techniques, a mixed model representation of STAR models, or stepwise regression techniques combining penalized least squares estimation with model selection. BayesX not only covers models for responses from univariate exponential families, but also models from less-standard regression situations such as models for multi-categorical responses with either ordered or unordered categories, continuous-time survival data, or continuous-time multi-state models. This paper presents a new fully interactive R interface to BayesX: the R package R2BayesX. With the new package, STAR models can be conveniently specified using R's formula language (with some extended terms), fitted using the BayesX binary, represented in R with objects of suitable classes, and finally printed/summarized/plotted. This makes BayesX much more accessible to users familiar with R and adds extensive graphics capabilities for visualizing fitted STAR models. Furthermore, R2BayesX complements the already impressive capabilities for semiparametric regression in R with a comprehensive toolbox comprising, in particular, more complex response types and alternative inferential procedures such as simulation-based Bayesian inference.

  13. A fast global tsunami modeling suite as a trans-oceanic tsunami hazard prediction and mitigation tool

    Science.gov (United States)

    Mohammed, F.; Li, S.; Jalali Farahani, R.; Williams, C. R.; Astill, S.; Wilson, P. S.; B, S.; Lee, R.

    2014-12-01

    The past decade has been witness to two mega-tsunami events, the 2004 Indian Ocean tsunami and the 2011 Japan tsunami, and multiple major tsunami events: 2006 Java and Kuril Islands, 2007 Solomon Islands, 2009 Samoa and 2010 Chile, to name a few. These events generated both local and far-field tsunami inundations, with runup ranging from a few meters to around 40 m in the coastal impact regions. With a majority of the coastal population at risk, there is a need for a sophisticated outlook towards catastrophe risk estimation and a quick mitigation response. At the same time, tools and information are needed to aid advanced tsunami hazard prediction. There is an increased need for insurers, reinsurers and Federal hazard management agencies to quantify coastal inundations and the vulnerability of coastal habitat to tsunami inundations. A novel tool is developed to model local and far-field tsunami generation, propagation and inundation to estimate tsunami hazards. The tool is a combination of the NOAA MOST propagation database and an efficient and fast GPU (Graphical Processing Unit)-based non-linear shallow water wave model solver. The tsunamigenic seismic sources are mapped onto the NOAA unit source distribution along subduction zones in the ocean basin. Slip models are defined for tsunamigenic seismic sources through a slip distribution on the unit sources while maintaining the limits of fault areas. A GPU-based finite volume solver is used to simulate non-linear shallow water wave propagation, inundation and runup. Deformation on the unit sources provides initial conditions for modeling local impacts, while the wave history from the propagation database provides boundary conditions for far-field impacts. The modeling suite shows good agreement across ocean basins for basin-wide tsunami propagation, validating both local and far-field tsunami inundations.

  14. Multi-Spatial Criteria Modelling of Fire Risk and Hazard in the West Gonja Area of Ghana

    Directory of Open Access Journals (Sweden)

    I. Yakubu

    2013-05-01

    About 30% of the West Gonja Area (WGA) of Ghana is occupied by three major forest reserves, which have a rich array of plants and animals. The ecosystem in the WGA has been experiencing changes as a result of activities such as lumbering, farming, poaching and ritual bush burning, as well as wildfire. Of particular concern is wildfire, which has a devastating effect on the ecological system and the rural livelihood in the WGA. Therefore, prevention and control of wildfire in the WGA is important to the sustainability of the natural resources. This paper uses a multi-spatial criteria technique to model fire risk and hazard in order to enhance the WGA's ability to prevent and control wildfires in the fragile ecosystem. The input data included: topography (slope, elevation, aspect); vegetation (fuel quality, fuel size and shape); weather (rainfall, temperature, humidity, wind); a land cover/use map; landform; accessibility data; fire history; culture; and population density of the WGA. Fuel risk, detection risk and response risk were modeled and used as inputs to model the final fire risk and hazard for the WGA. From the model, forest, agricultural land and shrub cover types were identified as the major fuel-contributing loads, whereas water bodies, roads and settlements were considered minor fuel-contributing loads. Steeply sloping areas, areas facing the sun, low-lying areas and forests at long distances from the fire service stations were found to be more susceptible to fire. The fire risk and hazard model will assist decision makers and inhabitants of the area to know where there is the highest possibility of fire outbreak and to adopt prudent ways of preventing, and managing incidences of, wildfires in the WGA.

  15. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
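
    For the generalized Pareto case discussed above, the magnitude hazard function has a simple closed form; the following is the standard textbook derivation (the paper's own derivation concerns the failure-time series T built on this model):

    \[
      S(x) = \Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1/\xi}, \qquad
      f(x) = -S'(x) = \frac{1}{\sigma}\Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1/\xi - 1},
    \]
    \[
      h(x) = \frac{f(x)}{S(x)}
           = \frac{1}{\sigma}\Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1}
           = \frac{1}{\sigma + \xi x},
    \]

    so the hazard decreases in x for ξ > 0 (heavy tails) and increases for ξ < 0; nonstationarity enters by letting σ and ξ vary with time.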

  16. Artificial geochemical barriers for additional recovery of non-ferrous metals and reduction of ecological hazard from the mining industry waste.

    Science.gov (United States)

    Chanturiya, Valentine; Masloboev, Vladimir; Makarov, Dmitriy; Mazukhina, Svetlana; Nesterov, Dmitriy; Men'shikov, Yuriy

    2011-01-01

    Laboratory tests and physical-chemical modeling have determined that mixtures of activated silica and carbonatite, and of serpophite and carbonatite, show considerable promise for developing artificial geochemical barriers. The obtained average contents of nickel and copper deposited on the geochemical barriers in the formed mining-induced ores are acceptable for subsequent cost-efficient processing using either pyro- or hydrometallurgical methods. Some tests of the geochemical barriers have been carried out involving the use of polluted water in the impact zone of the "Kol'skaya GMK" JSC. The possibility of purifying water of heavy metals down to the MAC level for fishery water bodies has been demonstrated.

  17. Learning from Nature - Mapping of Complex Hydrological and Geomorphological Process Systems for More Realistic Modelling of Hazard-related Maps

    Science.gov (United States)

    Chifflard, Peter; Tilch, Nils

    2010-05-01

    Introduction Hydrological or geomorphological processes in nature are often very diverse and complex. This is partly due to regional characteristics which vary over time and space, as well as changeable process-initiating and -controlling factors. Despite awareness of this complexity, such aspects are usually neglected in the modelling of hazard-related maps for several reasons. But particularly when it comes to creating more realistic maps, this would be an essential component to consider. The first important step towards solving this problem would be to collect data relating to regional conditions which vary over time and geographical location, along with indicators of complex processes. Data should be acquired promptly during and after events, and subsequently digitally combined and analysed. Study area In June 2009, considerable damage occurred in the residential area of Klingfurth (Lower Austria) as a result of great pre-event wetness and repeatedly heavy rainfall, leading to flooding, debris flow deposits and gravitational mass movements. One of the causes is the fact that the meso-scale watershed (16 km²) of the Klingfurth stream is characterised by adverse geological and hydrological conditions. Additionally, the river system network, with its discharge concentration within the residential zone, contributes considerably to flooding, particularly during excessive rainfall across the entire region, as the flood peaks from different parts of the catchment area are superposed. First results of mapping Hydro(geo)logical surveys across the entire catchment area have shown that over 600 gravitational mass movements of various types and stages have occurred. 516 of those have acted as a bed load source, while 325 mass movements had not reached the final stage yet and could thus supply bed load in the future. It should be noted that large mass movements in the initial or intermediate stage were predominantly found in clayey-silty areas and weathered material

  18. A guide to generalized additive models in crop science using SAS and R

    Directory of Open Access Journals (Sweden)

    Josefine Liew

    2015-06-01

    Linear models and generalized linear models are well known and are used extensively in crop science. Generalized additive models (GAMs) are less well known. GAMs extend generalized linear models through the inclusion of smoothing functions of explanatory variables, e.g., spline functions, allowing the curves to bend to better describe the observed data. This article provides an introduction to GAMs in the context of crop science experiments. This is exemplified using a dataset consisting of four populations of perennial sow-thistle (Sonchus arvensis L.), originating from two regions, for which the emergence of shoots over time was compared.
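
    The article works in SAS and R; as a language-neutral illustration, a comparable spline-based GAM can be fitted in Python with the pygam package (assumed installed). The emergence data below are simulated, not the sow-thistle dataset.

    import numpy as np
    from pygam import LinearGAM, s

    rng = np.random.default_rng(1)
    days = rng.uniform(0, 60, 200)
    # Synthetic cumulative shoot emergence (%), a noisy logistic curve.
    emergence = 100 / (1 + np.exp(-(days - 30) / 6)) + rng.normal(0, 4, 200)

    gam = LinearGAM(s(0, n_splines=10)).fit(days.reshape(-1, 1), emergence)
    grid = np.linspace(0, 60, 7).reshape(-1, 1)
    for d, yhat in zip(grid.ravel(), gam.predict(grid)):
        print(f"day {d:4.0f}: predicted emergence {yhat:5.1f}%")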

  19. Landslide Hazard Assessment and Mapping in the Guil Catchment (Queyras, Southern French Alps): From Landslide Inventory to Susceptibility Modelling

    Science.gov (United States)

    Roulleau, Louise; Bétard, François; Carlier, Benoît; Lissak, Candide; Fort, Monique

    2016-04-01

    Landslides are common natural hazards in the Southern French Alps, where they may affect human lives and cause severe damage to infrastructure. As part of the SAMCO research project dedicated to risk evaluation in mountain areas, this study focuses on the Guil river catchment (317 km2), Queyras, to assess landslide hazard, which has been poorly studied until now. In that area, landslides are mainly occasional, low-amplitude phenomena, with limited direct impacts when compared to other hazards such as floods or snow avalanches. However, when interacting with floods during extreme rainfall events, landslides may have indirect consequences of greater importance because of strong hillslope-channel connectivity along the Guil River and its tributaries (i.e. positive feedbacks). This specific morphodynamic functioning reinforces the need for a better understanding of landslide hazards and their spatial distribution at the catchment scale to protect the local population from disasters of multi-hazard origin. The aim of this study is to produce landslide susceptibility mapping at 1:50 000 scale as a first step towards a global estimation of landslide hazard and risk. The three main methodologies used for assessing landslide susceptibility are qualitative (i.e. expert opinion), deterministic (i.e. physics-based models) and statistical (i.e. probabilistic models). Due to the rapid development of geographical information systems (GIS) during the last two decades, statistical methods are today widely used because they offer greater objectivity and reproducibility at large scales. Among them, multivariate analyses are considered the most robust techniques, especially the logistic regression method commonly used in landslide susceptibility mapping. However, this method, like others, is strongly dependent on the accuracy of the input data to avoid significant errors in the final results. In particular, a complete and accurate landslide inventory is required before the modelling
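
    A minimal sketch of the logistic-regression susceptibility step described above, using scikit-learn on synthetic terrain predictors (all values invented; the study's predictor set and inventory are not reproduced here):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 500
    slope = rng.uniform(0, 45, n)          # degrees
    wetness = rng.uniform(0, 15, n)        # topographic wetness index
    dist_fault = rng.uniform(0, 2000, n)   # m
    X = np.column_stack([slope, wetness, dist_fault])
    # Synthetic inventory: steeper, wetter, fault-proximal cells fail more often.
    logit = 0.08 * slope + 0.2 * wetness - 0.002 * dist_fault - 3.0
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    susceptibility = model.predict_proba(X)[:, 1]   # per-cell probability map
    print("mean susceptibility:", susceptibility.mean().round(3))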

  20. Vector generalized additive models for extreme rainfall data analysis (study case rainfall data in Indramayu)

    Science.gov (United States)

    Utami, Eka Putri Nur; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall patterns are good indicators of potential disasters. A Global Circulation Model (GCM) contains global-scale information that can be used to predict rainfall data. Statistical downscaling (SD) utilizes the global-scale information to make inferences at the local scale. Essentially, SD can be used to predict local-scale variables based on global-scale variables. SD requires a method to accommodate nonlinear effects and extreme values. Extreme Value Theory (EVT) can be used to analyze the extreme values. One method to identify extreme events is peaks over threshold, whose exceedances follow the Generalized Pareto Distribution (GPD). The vector generalized additive model (VGAM) is an extension of the generalized additive model. It is able to accommodate linear or nonlinear effects by involving more than one additive predictor. The advantage of VGAM is its ability to handle multi-response models. The key ideas of VGAM are iteratively reweighted least squares for maximum likelihood estimation, penalized smoothing, Fisher scoring and additive models. This work aims to analyze extreme rainfall data in Indramayu using VGAM. The results show that VGAM with the GPD is able to predict extreme rainfall data accurately. The prediction for February is very close to the actual value at the 75th quantile.
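
    The peaks-over-threshold step can be sketched with scipy's generalized Pareto fit; the rainfall series, threshold choice and return-level computation below are illustrative assumptions, not the Indramayu analysis.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    rain = rng.gamma(shape=0.8, scale=12.0, size=5000)  # synthetic daily rainfall, mm

    u = np.quantile(rain, 0.95)        # high threshold (95th percentile)
    excess = rain[rain > u] - u        # exceedances, modelled as GPD
    xi, _, sigma = stats.genpareto.fit(excess, floc=0.0)

    zeta = (rain > u).mean()           # exceedance rate
    m = 1000                           # return period in observations
    x_m = u + (sigma / xi) * ((m * zeta) ** xi - 1.0)  # standard POT return level
    print(f"u = {u:.1f} mm, xi = {xi:.2f}, sigma = {sigma:.2f}")
    print(f"{m}-observation return level ~ {x_m:.1f} mm")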

  1. A hazard-based duration model for analyzing crossing behavior of cyclists and electric bike riders at signalized intersections.

    Science.gov (United States)

    Yang, Xiaobao; Huan, Mei; Abdel-Aty, Mohamed; Peng, Yichuan; Gao, Ziyou

    2015-01-01

    This paper presents a hazard-based duration approach to investigate riders' waiting times, violation hazards, associated risk factors, and their differences between cyclists and electric bike riders at signalized intersections. A total of 2322 two-wheeled riders approaching the intersections during red light periods were observed in Beijing, China. The data were classified into censored and uncensored observations to distinguish between safe crossing and red-light running behavior. The results indicated that the red-light crossing behavior of most riders was dependent on waiting time: they were inclined to terminate waiting and run against the traffic light as the waiting duration increased. Over half of the observed riders could not endure waiting 49 s or longer, while 25% of the riders could endure 97 s or longer. Rider type, gender, waiting position, conformity tendency and crossing traffic volume were identified as having significant effects on riders' waiting times and violation hazards. Electric bike riders were found to be more sensitive than cyclists to external risk factors such as other riders' crossing behavior and crossing traffic volume. Moreover, unobserved heterogeneity was examined in the proposed models. The findings of this paper can explain when and why cyclists and electric bike riders run against the red light at intersections. The results are useful for traffic design and management agencies in implementing strategies to enhance the safety of riders.
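
    The censored/uncensored split described above maps directly onto a survival-analysis fit; here is a minimal Kaplan-Meier sketch with the Python lifelines package (assumed available) and invented waiting times. Riders who waited out the red light are censored, since no violation was observed.

    import pandas as pd
    from lifelines import KaplanMeierFitter

    df = pd.DataFrame({
        "wait_s":   [12, 49, 97, 60, 25, 110, 40, 75],
        "violated": [1,  1,  1,  0,  1,  0,   1,  0],   # 0 = waited out the red
    })

    kmf = KaplanMeierFitter()
    kmf.fit(df["wait_s"], event_observed=df["violated"])
    print(kmf.survival_function_)   # P(still waiting beyond t seconds)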

  2. Preface to the Special Issue on “Taiwan Earthquake Model: Seismic Hazard Assessment and Earthquake Scenario”

    Directory of Open Access Journals (Sweden)

    Ruey-Juin Rau

    2016-06-01

    With its high strain rate (10⁻⁶ to 10⁻⁷ per year) acting on the oblique subduction-collision transition zone, seismicity in Taiwan is characterized by frequent small and moderate-sized earthquakes, and occasionally some M ~ 7 events. During the last century and in recent years, Taiwan experienced a significant number of earthquakes, for example the 1906 M 7.1 Meishan earthquake, the 1935 M 7.1 Hsinchu-Taichung earthquake, and the 1999 M 7.6 Chi-Chi earthquake. The enormous damage caused by these events progressively compelled us to provide reliable and detailed seismic hazard and risk assessments for the country. With support from the former National Science Council (now the Ministry of Science and Technology) and the Taiwan earthquake science communities, the Taiwan Earthquake Model (TEM) organization was established in 2012 under the supervision of the Taiwan Earthquake Research Center (TEC). The main purpose of TEM is to study the probabilities in seismic hazard and risk analysis for Taiwan by integrating the earthquake science, earthquake engineering, and social science communities of Taiwan. With the help of TEM-related research, we wish to improve our understanding of Taiwan earthquake mechanisms and thereby provide new insight into seismic hazard and risk assessments for Taiwan.

  3. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    Energy Technology Data Exchange (ETDEWEB)

    Doligez, B.; Eschard, R. [Institut Francais du Petrole, Rueil Malmaison (France); Geffroy, F. [Centre de Geostatistique, Fontainebleau (France)] [and others]

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model which is informed with petrophysical properties. Scaling-up techniques then allow one to obtain a reservoir model which is compatible with the fluid flow simulators. Geostatistical modelling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data, and whose variability is the same as the variability computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with excellent areal coverage, but with poor vertical resolution. New advances in modelling techniques now allow this type of additional external information to be integrated in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps coming from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.

  4. New results in RR Lyrae modeling: convective cycles, additional modes and more

    CERN Document Server

    Molnár, L; Szabó, R; Plachy, E

    2012-01-01

    Recent theoretical and observational findings have breathed new life into the field of RR Lyrae stars. The ever more precise and complete measurements of the space asteroseismology missions have revealed new details, such as period doubling and the presence of additional modes in the stars. Theoretical work has also flourished: period doubling has been explained and an additional mode has been detected in hydrodynamic models as well. Although the most intriguing mystery, the Blazhko effect, has remained unsolved, new findings indicate that the convective cycle model can be effectively ruled out for short- and medium-period modulations. On the other hand, the plausibility of the radial resonance model is increasing, as more and more resonances are detected both in models and in stars.

  5. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models

    CERN Document Server

    Fan, Jianqing; Song, Rui

    2011-01-01

    A variable screening procedure via correlation learning was proposed by Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend correlation learning to marginal nonparametric learning. Our nonparametric independence screening, called NIS, is a specific member of the sure independence screening family. Several closely related variable screening procedures are proposed. Under nonparametric additive models, it is shown that, under some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, an iterative nonparametric independence screening (INIS) is also proposed to enhance the finite sample performance for fitting sparse additive models. The simulation results and a real data a...
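
    A crude sketch of the marginal nonparametric screening idea: rank each feature by the fit of a univariate smoother (a cubic polynomial here, as a simple stand-in for the spline estimators used in NIS) and keep the top-ranked features.

    import numpy as np

    rng = np.random.default_rng(4)
    n, p = 200, 1000
    X = rng.normal(size=(n, p))
    # Sparse additive truth: only features 0 and 1 matter, nonlinearly.
    y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=n)

    def marginal_r2(x, y):
        yhat = np.polyval(np.polyfit(x, y, 3), x)   # marginal cubic fit
        return 1 - np.var(y - yhat) / np.var(y)

    scores = np.array([marginal_r2(X[:, j], y) for j in range(p)])
    top = np.argsort(scores)[::-1][:10]             # screened feature set
    print("top-ranked features:", top)              # should include 0 and 1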

  6. Results of development and field tests of a radar-tracer system providing meteorological support to modeling hazardous technological releases

    International Nuclear Information System (INIS)

    …radiation monitoring laboratory and the LDU was on the bank of the cooling pond. The main characteristics of a cloud of substances are the standard deviations of the spatial concentration distributions in the cloud along mutually perpendicular directions, which are representative spatial scales of the cloud in these directions. The squares of these values are the relative variances of the cloud. These characteristics were the first to be determined and are among the most important input parameters in different models assessing the transport and dispersion of hazardous substances. Dispersion of a cloud of substances is influenced by turbulence and vertical shifts of the mean wind speed, with the patterns of dispersion differing depending on diffusion time (or distance to the source). The results of the conducted experiments have revealed an important advantage of the radar complex used in terms of support to predictions of possible contamination in case of a release of hazardous materials. As the chaff cloud moves with the wind and is dispersed, it echoes the meteorological situation at a given moment (vertical wind distribution and air temperature, turbulence level, boundary layer thickness, type of underlying surface) and any changes occurring in the conditions on its way. Based on the obtained radar data, the radar complex is capable of assessing the dispersion parameters, which can then be used in mathematical transport and dispersion models. What's more, determination of these parameters does not require any meteorological measurements. In addition, using data about the movement of the cloud centroid, the wind speed and direction are estimated at the release height by the radar complex itself. The conducted experiments have shown that the developed technology makes it possible to measure key parameters of a cloud of substances at the height of dispersion and to take into account local features of the distribution of meteorological quantities occurring in the vicinity of a source. This

  7. Estimating the phenology of elk brucellosis transmission with hierarchical models of cause-specific and baseline hazards

    Science.gov (United States)

    Cross, Paul C.; Maichak, Eric J.; Rogerson, Jared D.; Irvine, Kathryn M.; Jones, Jennifer D; Heisey, Dennis M.; Edwards, William H.; Scurlock, Brandon M.

    2015-01-01

    Understanding the seasonal timing of disease transmission can lead to more effective control strategies, but the seasonality of transmission is often unknown for pathogens transmitted directly. We inserted vaginal implant transmitters (VITs) in 575 elk (Cervus elaphus canadensis) from 2006 to 2014 to assess when reproductive failures (i.e., abortions or stillbirths) occur, which is the primary transmission route of Brucella abortus, the causative agent of brucellosis in the Greater Yellowstone Ecosystem. Using a survival analysis framework, we developed a Bayesian hierarchical model that simultaneously estimated the total baseline hazard of a reproductive event as well as its 2 mutually exclusive parts (abortions or live births). Approximately 16% (95% CI = 0.10, 0.23) of the pregnant seropositive elk had reproductive failures, whereas 2% (95% CI = 0.01, 0.04) of the seronegative elk had probable abortions. Reproductive failures could have occurred as early as 13 February and as late as 10 July, peaking from March through May. Model results suggest that less than 5% of likely abortions occurred after 6 June each year and abortions were approximately 5 times more likely in March, April, or May compared to February or June. In western Wyoming, supplemental feeding of elk begins in December and ends during the peak of elk abortions and brucellosis transmission (i.e., March and April). Years with more snow may enhance elk-to-elk transmission on supplemental feeding areas because elk are artificially aggregated for the majority of the transmission season. Elk-to-cattle transmission will depend on the transmission period relative to the end of the supplemental feeding season, elk seroprevalence, population size, and the amount of commingling. Our statistical approach allowed us to estimate the probability density function of different event types over time, which may be applicable to other cause-specific survival analyses. It is often challenging to assess the

  8. Business models for additive manufacturing: exploring digital technologies, consumer roles, and supply chains

    OpenAIRE

    Hadar, Ronen; Bilberg, Arne; Bogers, Marcel

    2015-01-01

    Digital fabrication — including additive manufacturing (AM), rapid prototyping and 3D printing — has the potential to revolutionize the way in which products are produced and delivered to the customer. Therefore, it challenges companies to reinvent their business model — describing the logic of creating and capturing value. In this paper, we explore the implications that AM technologies have for manufacturing systems in the new business models that they enable. In particular, we consider how ...

  9. Modeling ancient and modern arithmetic practices: addition and multiplication with Arabic and Roman numerals

    OpenAIRE

    Schlimm, Dirk; Neth, Hansjörg

    2008-01-01

    To analyze the task of mental arithmetic with external representations in different number systems we model algorithms for addition and multiplication with Arabic and Roman numerals. This demonstrates that Roman numerals are not only informationally equivalent to Arabic ones but also computationally similar - a claim that is widely disputed. An analysis of our models' elementary processing steps reveals intricate trade-offs between problem representation, algorithm, and interactive resources....
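
    In the spirit of the algorithms modelled above, Roman-numeral addition can be carried out without any detour through Arabic place value: expand subtractive pairs, concatenate and sort, merge carries, and re-canonicalise. A toy sketch (our own illustration, not the authors' model):

    ORDER = "MDCLXVI"
    EXPAND = {"CM": "DCCCC", "CD": "CCCC", "XC": "LXXXX",
              "XL": "XXXX", "IX": "VIIII", "IV": "IIII"}
    MERGE = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"), ("LL", "C"),
             ("CCCCC", "D"), ("DD", "M")]
    COMPRESS = {"DCCCC": "CM", "CCCC": "CD", "LXXXX": "XC",
                "XXXX": "XL", "VIIII": "IX", "IIII": "IV"}

    def roman_add(a, b):
        for sub, exp in EXPAND.items():           # 1. remove subtractive pairs
            a, b = a.replace(sub, exp), b.replace(sub, exp)
        s = "".join(sorted(a + b, key=ORDER.index))   # 2. concatenate and sort
        for group, sym in MERGE:                  # 3. carry: merge repeated runs
            s = s.replace(group, sym)
            s = "".join(sorted(s, key=ORDER.index))
        for exp, sub in COMPRESS.items():         # 4. restore subtractive forms
            s = s.replace(exp, sub)
        return s

    print(roman_add("XIX", "XXIV"))   # XIX (19) + XXIV (24) -> XLIII (43)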

  10. Efficiency modeling of solidification/stabilization of multi-metal contaminated industrial soil using cement and additives

    Energy Technology Data Exchange (ETDEWEB)

    Voglar, Grega E. [RDA - Regional Development Agency Celje, Kidriceva ulica 25, 3000 Celje (Slovenia); Lestan, Domen, E-mail: domen.lestan@bf.uni-lj.si [Agronomy Department, Centre for Soil and Environmental Science, Biotechnical Faculty, University of Ljubljana, Jamnikarjeva 101, 1000 Ljubljana (Slovenia)

    2011-08-30

    Highlights: • We assess the feasibility of using soil S/S for industrial land reclamation. • Retarders, accelerators and plasticizers were used in the S/S cementitious formulations. • We propose a novel S/S efficiency model for multi-metal contaminated soils. - Abstract: In a laboratory study, formulations of 15% (w/w) of ordinary Portland cement (OPC), calcium aluminate cement (CAC) and pozzolanic cement (PC) and additives: plasticizers cementol delta ekstra (PCDE) and cementol antikorodin (PCA), polypropylene fibers (PPF), polyoxyethylene-sorbitan monooleate (Tween 80) and aqueous acrylic polymer dispersion (Akrimal), were used for solidification/stabilization (S/S) of soils from an industrial brownfield contaminated with up to 157, 32,175, 44,074, 7614, 253 and 7085 mg kg⁻¹ of Cd, Pb, Zn, Cu, Ni and As, respectively. Soils formed solid monoliths with all cementitious formulations tested, with a maximum mechanical strength of 12 N mm⁻² achieved after S/S with CAC + PCA. To assess the S/S efficiency of the used formulations for multi-element contaminated soils, we propose an empirical model in which data on equilibrium leaching of toxic elements into deionized water and TCLP (toxicity characteristic leaching procedure) solution and the mass transfer of elements from soil monoliths were weighed against the relative potential hazard of the particular toxic element. Based on the model calculation, the most efficient S/S formulation was CAC + Akrimal, which reduced soil leachability of Cd, Pb, Zn, Cu, Ni and As into deionized water below the limit of quantification and into TCLP solution by up to 55, 185, 8750, 214, 4.7 and 1.2 times, respectively, and the mass transfer of elements from soil monoliths by up to 740, 746, 104,000, 4.7, 343 and 181 times, respectively.

  12. Antimicrobial combinations: Bliss independence and Loewe additivity derived from mechanistic multi-hit models.

    Science.gov (United States)

    Baeder, Desiree Y; Yu, Guozhi; Hozé, Nathanaël; Rolff, Jens; Regoes, Roland R

    2016-05-26

    Antimicrobial peptides (AMPs) and antibiotics reduce the net growth rate of bacterial populations they target. It is relevant to understand if effects of multiple antimicrobials are synergistic or antagonistic, in particular for AMP responses, because naturally occurring responses involve multiple AMPs. There are several competing proposals describing how multiple types of antimicrobials add up when applied in combination, such as Loewe additivity or Bliss independence. These additivity terms are defined ad hoc from abstract principles explaining the supposed interaction between the antimicrobials. Here, we link these ad hoc combination terms to a mathematical model that represents the dynamics of antimicrobial molecules hitting targets on bacterial cells. In this multi-hit model, bacteria are killed when a certain number of targets are hit by antimicrobials. Using this bottom-up approach reveals that Bliss independence should be the model of choice if no interaction between antimicrobial molecules is expected. Loewe additivity, on the other hand, describes scenarios in which antimicrobials affect the same components of the cell, i.e. are not acting independently. While our approach idealizes the dynamics of antimicrobials, it provides a conceptual underpinning of the additivity terms. The choice of the additivity term is essential to determine synergy or antagonism of antimicrobials. This article is part of the themed issue 'Evolutionary ecology of arthropod antimicrobial peptides'. PMID:27160596
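
    The two additivity baselines contrasted above are easy to compute numerically. The sketch below assumes hypothetical Hill dose-response curves for two antimicrobials (all EC50s, Hill slopes and doses are invented, and this is not the authors' multi-hit code): Bliss independence combines single-drug inhibitions as independent probabilities, while Loewe additivity solves for the effect level at which the scaled doses sum to one.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def effect(dose, ec50, hill):
        """Fraction of bacterial net growth inhibited by one drug (Hill curve)."""
        return dose**hill / (ec50**hill + dose**hill)

    def bliss(da, db, pa, pb):
        """Bliss independence: inhibitions combine as independent probabilities."""
        ea, eb = effect(da, *pa), effect(db, *pb)
        return ea + eb - ea * eb

    def loewe(da, db, pa, pb):
        """Loewe additivity: the effect E at which the dose fractions sum to 1."""
        def dose_for(e, ec50, hill):  # inverse of the Hill curve
            return ec50 * (e / (1.0 - e))**(1.0 / hill)
        return brentq(lambda e: da / dose_for(e, *pa) + db / dose_for(e, *pb) - 1.0,
                      1e-9, 1.0 - 1e-9)

    pa, pb = (1.0, 2.0), (2.0, 1.0)  # hypothetical (EC50, Hill slope) per drug
    print(bliss(0.5, 1.0, pa, pb), loewe(0.5, 1.0, pa, pb))
    ```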

  13. A spatial hazard model for cluster detection on continuous indicators of disease: application to somatic cell score.

    Science.gov (United States)

    Gay, Emilie; Senoussi, Rachid; Barnouin, Jacques

    2007-01-01

    Methods for spatial cluster detection dealing with diseases quantified by continuous variables are few, whereas several diseases are better approached by continuous indicators. For example, subclinical mastitis of the dairy cow is evaluated using a continuous marker of udder inflammation, the somatic cell score (SCS). Consequently, this study proposed to analyze spatialized risk and cluster components of herd SCS through a new method based on a spatial hazard model. The dataset included annual SCS for 34 142 French dairy herds for the year 2000, and important SCS risk factors: mean parity, percentage of winter and spring calvings, and herd size. The model allowed the simultaneous estimation of the effects of known risk factors and of potential spatial clusters on SCS, and the mapping of the estimated clusters and their range. Mean parity and winter and spring calvings were significantly associated with subclinical mastitis risk. The model with the presence of 3 clusters was highly significant, and the 3 clusters were attractive, i.e. closeness to cluster center increased the occurrence of high SCS. The three localizations were the following: close to the city of Troyes in the northeast of France; around the city of Limoges in the center-west; and in the southwest close to the city of Tarbes. The semi-parametric method based on spatial hazard modeling applies to continuous variables, and takes account of both risk factors and potential heterogeneity of the background population. This tool allows a quantitative detection but assumes a spatially specified form for clusters.

  14. Vector generalized linear and additive models with an implementation in R

    CERN Document Server

    Yee, Thomas W

    2015-01-01

    This book presents a statistical framework that expands generalized linear models (GLMs) for regression modelling. The framework shared in this book allows analyses based on many semi-traditional applied statistics models to be performed as a coherent whole. This is possible through the approximately half-a-dozen major classes of statistical models included in the book and the software infrastructure component, which makes the models easily operable.    The book’s methodology and accompanying software (the extensive VGAM R package) are directed at these limitations, and this is the first time the methodology and software are covered comprehensively in one volume. Since their advent in 1972, GLMs have unified important distributions under a single umbrella with enormous implications. The demands of practical data analysis, however, require a flexibility that GLMs do not have. Data-driven GLMs, in the form of generalized additive models (GAMs), are also largely confined to the exponential family. This book ...

  15. Estimating interaction on an additive scale between continuous determinants in a logistic regression model

    NARCIS (Netherlands)

    Knol, Mirjam J.; van der Tweel, Ingeborg; Grobbee, Diederick E.; Numans, Mattijs E.; Geerlings, Mirjam I.

    2007-01-01

    Background To determine the presence of interaction in epidemiologic research, typically a product term is added to the regression model. In linear regression, the regression coefficient of the product term reflects interaction as departure from additivity. However, in logistic regression it refers to interaction as departure from multiplicativity.
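
    For readers wanting to compute an additive-scale interaction from a fitted logistic model, a common summary is the relative excess risk due to interaction (RERI). The sketch below uses simulated data with two binary determinants for brevity (the paper itself treats continuous determinants); odds ratios stand in for risk ratios, which is reasonable only for a rare outcome.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 2000
    x1 = rng.binomial(1, 0.4, n).astype(float)   # two binary determinants, simulated
    x2 = rng.binomial(1, 0.3, n).astype(float)
    logit = -2.0 + 0.7 * x1 + 0.5 * x2 + 0.4 * x1 * x2
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([x1, x2, x1 * x2]))
    fit = sm.Logit(y, X).fit(disp=0)
    b1, b2, b3 = fit.params[1:4]

    # RERI > 0 suggests super-additive (synergistic) interaction on the additive
    # scale, even when the multiplicative product term is near zero.
    reri = np.exp(b1 + b2 + b3) - np.exp(b1) - np.exp(b2) + 1
    print(f"RERI = {reri:.3f}")
    ```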

  16. Midrapidity inclusive densities in high energy pp collisions in additive quark model

    Science.gov (United States)

    Shabelski, Yu. M.; Shuvaev, A. G.

    2016-08-01

    High energy (CERN SPS and LHC) inelastic pp (p̄p) scattering is treated in the framework of the additive quark model together with Pomeron exchange theory. We extract the midrapidity inclusive density of the charged secondaries produced in a single quark-quark collision and investigate its energy dependence. Predictions for the πp collisions are presented.

  17. Logit model utilization in additional and alternative income estimation of farms

    Directory of Open Access Journals (Sweden)

    Piotr Bórawski

    2009-01-01

    Full Text Available Particular attention was paid to the profitability of additional and alternative income sources, mainly ostriches and fallow deer. The analysis was performed using the Logit model. The survey proved that the profitability of ostrich production was determined by the basic herd, the farmer's age and own savings. In fallow deer production the most important factors were the basic herd and own savings.
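
    As an illustration of the kind of Logit model used here, the sketch below fits a binary profitability outcome to made-up farm covariates (herd size, farmer's age, own savings); all names and data are invented, not the paper's survey data.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 300
    herd = rng.normal(50, 15, n)        # basic herd size (hypothetical)
    age = rng.normal(45, 10, n)         # farmer's age (hypothetical)
    savings = rng.binomial(1, 0.5, n)   # has own savings, 0/1 (hypothetical)
    eta = -3 + 0.04 * herd - 0.02 * age + 0.8 * savings
    profitable = rng.binomial(1, 1 / (1 + np.exp(-eta)))

    # Standard logit fit: coefficients give the direction and strength of each
    # determinant's effect on the odds of profitability.
    X = sm.add_constant(np.column_stack([herd, age, savings]))
    print(sm.Logit(profitable, X).fit(disp=0).summary())
    ```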

  18. Modeling the Use of Sulfate Additives for Potassium Chloride Destruction in Biomass Combustion

    DEFF Research Database (Denmark)

    Wu, Hao; Pedersen, Morten Nedergaard; Jespersen, Jacob Boll;

    2014-01-01

    Potassium chloride, KCl, formed from biomass combustion may lead to ash deposition and corrosion problems in boilers. Sulfates are effective additives for converting KCl to the less harmful K2SO4 and HCl. In the present study, the rate constants for decomposition of ammonium sulfate and aluminum sulfate ... -dependent distribution of SO2 and SO3 from ammonium sulfate decomposition. On the basis of these data as well as earlier results, a detailed chemical kinetic model for sulfation of KCl by a range of sulfate additives was established. Modeling results were compared to biomass combustion experiments in a bubbling ... results, implying a need for further investigations. Predictions for the effectiveness of the sulfur-based additives indicate that ferric sulfate and ammonium sulfate have similar effectiveness at temperatures ranging from approximately 850 to 900 °C, whereas ferric sulfate is more efficient at higher...

  19. On the development of a seismic source zonation model for seismic hazard assessment in western Saudi Arabia

    Science.gov (United States)

    Zahran, Hani M.; Sokolov, Vladimir; Roobol, M. John; Stewart, Ian C. F.; El-Hadidy Youssef, Salah; El-Hadidy, Mahmoud

    2016-07-01

    A new seismic source model has been developed for the western part of the Arabian Peninsula, which has experienced considerable earthquake activity in the historical past and in recent times. The data used for the model include an up-to-date seismic catalog, results of recent studies of Cenozoic faulting in the area, aeromagnetic anomaly and gravity maps, geological maps, and miscellaneous information on volcanic activity. The model includes 18 zones ranging along the Red Sea and the Arabian Peninsula from the Gulf of Aqaba and the Dead Sea in the north to the Gulf of Aden in the south. The seismic source model developed in this study may be considered as one of the basic branches in a logic tree approach for seismic hazard assessment in Saudi Arabia and adjacent territories.

  20. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    Science.gov (United States)

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  1. Formation and reduction of carcinogenic furan in various model systems containing food additives.

    Science.gov (United States)

    Kim, Jin-Sil; Her, Jae-Young; Lee, Kwang-Geun

    2015-12-15

    The aim of this study was to analyse and reduce furan in various model systems. Furan model systems consisting of monosaccharides (0.5M glucose and ribose), amino acids (0.5M alanine and serine) and/or 1.0M ascorbic acid were heated at 121°C for 25 min. The effects of food additives (each 0.1M) such as metal ions (iron sulphate, magnesium sulphate, zinc sulphate and calcium sulphate), antioxidants (BHT and BHA), and sodium sulphite on the formation of furan were measured. The level of furan formed in the model systems was 6.8-527.3 ng/ml. The level of furan in the model systems of glucose/serine and glucose/alanine increased 7-674% when food additives were added. In contrast, the level of furan decreased by 18-51% in the Maillard reaction model systems that included ribose and alanine/serine with food additives except zinc sulphate. PMID:26190608

  2. Analysis of the camouflage effect in time of segregation in texturized regions using the Cox proportional hazard model

    Directory of Open Access Journals (Sweden)

    Eduardo Yoshio Nakano

    2012-11-01

    Full Text Available Humans have trichromatic vision. However, variations in genes can cause deficiencies in color vision, resulting in dichromatism. The aim of this work was to verify the real efficiency of dichromats in breaking colour camouflage. A total of nine colour-blind individuals participated in this study, and the variable considered was the time to segregation of camouflaged targets. The interest was to compare the response time under several conditions of camouflage, and the analysis was performed using the Cox proportional hazard model.
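
    A minimal sketch of this kind of analysis with the lifelines package (assumed available): times to segregation, with targets that were never segregated treated as censored, are regressed on a camouflage-condition indicator. All numbers below are invented.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "time": [3.2, 5.1, 2.4, 8.0, 6.3, 4.7, 9.9, 1.8],  # seconds to segregation (made up)
        "event": [1, 1, 1, 0, 1, 1, 0, 1],                 # 0 = never segregated (censored)
        "camouflage": [0, 0, 1, 1, 0, 1, 1, 0],            # camouflage-condition indicator
    })

    # Cox proportional hazards fit: the coefficient on "camouflage" estimates how
    # the hazard (instantaneous segregation rate) changes across conditions.
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()
    ```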

  3. NB-PLC channel modelling with cyclostationary noise addition & OFDM implementation for smart grid

    Science.gov (United States)

    Thomas, Togis; Gupta, K. K.

    2016-03-01

    Power line communication (PLC) technology can be a viable solution for future ubiquitous networks because it provides a cheaper alternative to other wired technologies currently being used for communication. In smart grids, Power Line Communication (PLC) is used to support low-rate communication on the low voltage (LV) distribution network. In this paper, we propose the channel modelling of narrowband (NB) PLC in the frequency range 5 kHz to 500 kHz by using ABCD parameters with cyclostationary noise addition. The behaviour of the channel was studied by the addition of an 11 kV/230 V transformer and by varying the load and its location. Bit error rate (BER) versus signal-to-noise ratio (SNR) was plotted for the proposed model by employing OFDM. Our simulation results based on the proposed channel model show an acceptable performance in terms of bit error rate versus signal-to-noise ratio, which enables the communication required for smart grid applications.
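
    The ABCD (transmission-parameter) formalism referred to above chains two-port matrices by multiplication. The sketch below cascades two line sections around a shunt load and reads off the end-to-end voltage transfer; the propagation constant and impedances are illustrative placeholders, not a calibrated NB-PLC channel.

    ```python
    import numpy as np

    def line_abcd(gamma, z0, length):
        """ABCD matrix of a uniform transmission-line section."""
        gl = gamma * length
        return np.array([[np.cosh(gl), z0 * np.sinh(gl)],
                         [np.sinh(gl) / z0, np.cosh(gl)]])

    def shunt_load(z):
        """ABCD matrix of a shunt impedance (e.g. a branch load or transformer tap)."""
        return np.array([[1, 0], [1 / z, 1]])

    f = 100e3                                  # 100 kHz, inside the 5-500 kHz NB-PLC band
    gamma = 1e-5 + 1j * 2 * np.pi * f / 2e8    # illustrative propagation constant
    chain = line_abcd(gamma, 50, 200) @ shunt_load(30 + 10j) @ line_abcd(gamma, 50, 100)

    A, B = chain[0]
    zl = 50.0                                  # receiver input impedance (assumed)
    H = zl / (A * zl + B)                      # voltage transfer into the load
    print(20 * np.log10(abs(H)), "dB")
    ```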

  4. Estimation for an additive growth curve model with orthogonal design matrices

    CERN Document Server

    Hu, Jianhua; You, Jinhong; 10.3150/10-BEJ315

    2012-01-01

    An additive growth curve model with orthogonal design matrices is proposed in which observations may have different profile forms. The proposed model allows us to fit data and then estimate parameters in a more parsimonious way than the traditional growth curve model. Two-stage generalized least-squares estimators for the regression coefficients are derived where a quadratic estimator for the covariance of observations is taken as the first-stage estimator. Consistency, asymptotic normality and asymptotic independence of these estimators are investigated. Simulation studies and a numerical example are given to illustrate the efficiency and parsimony of the proposed model for model specifications in the sense of minimizing Akaike's information criterion (AIC).

  5. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    Energy Technology Data Exchange (ETDEWEB)

    Plesko, Catherine S [Los Alamos National Laboratory; Clement, R Ryan [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  6. EFFECT OF NANOPOWDER ADDITION ON THE FLEXURAL STRENGTH OF ALUMINA CERAMIC - A WEIBULL MODEL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Daidong Guo

    2016-05-01

    Full Text Available Alumina ceramics were prepared either with micrometer-sized alumina powder (MAP) or with the addition of nanometer-sized alumina powder (NAP). The density, crystalline phase, flexural strength and the fracture surface of the two ceramics were measured and compared. Emphasis has been put on the influence of nanopowder addition on the flexural strength of Al₂O₃ ceramic. The analysis based on the Weibull distribution model suggests that the distribution of the flexural strength of the NAP ceramic is more concentrated than that of the MAP ceramic. Therefore, the NAP ceramics will be more stable and reliable in real applications.
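
    The Weibull analysis referred to above is commonly done by median-rank regression on a linearized Weibull plot; a higher fitted modulus corresponds to a more concentrated strength distribution. The strength values below are invented for illustration.

    ```python
    import numpy as np

    strengths = np.sort(np.array([310, 325, 332, 340, 348, 355, 361, 370, 382, 395.0]))  # MPa
    n = len(strengths)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank failure probabilities

    x = np.log(strengths)
    y = np.log(-np.log(1.0 - F))                  # Weibull plot linearization
    m, c = np.polyfit(x, y, 1)                    # slope = Weibull modulus
    sigma0 = np.exp(-c / m)                       # characteristic strength (63.2% point)

    print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")
    ```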

  7. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    Science.gov (United States)

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the pumped quality upon transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations to an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics that can vary between stakeholders? In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept, mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.
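
    A toy Monte Carlo in the spirit of this framework (not the authors' code): stochastic spill events from several hazards are aggregated to a concentration at the well, and one possible risk metric is read off as an exceedance probability. All probabilities, masses and the limit are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_sim = 10_000
    hazards = [  # (annual spill probability, mean mass discharge at the well), hypothetical
        (0.10, 2.0),
        (0.05, 5.0),
        (0.20, 1.0),
    ]
    limit = 4.0  # regulatory concentration limit, arbitrary units

    conc = np.zeros(n_sim)
    for p_spill, mean_mass in hazards:
        occurs = rng.random(n_sim) < p_spill                 # does this hazard spill this year?
        mass = rng.lognormal(np.log(mean_mass), 0.5, n_sim)  # uncertain transport/dilution
        conc += occurs * mass                                # mass-discharge-based aggregation

    print("P(exceed limit) =", (conc > limit).mean())
    ```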

  8. Efficient semiparametric estimation in generalized partially linear additive models for longitudinal/clustered data

    KAUST Repository

    Cheng, Guang

    2014-02-01

    We consider efficient estimation of the Euclidean parameters in a generalized partially linear additive models for longitudinal/clustered data when multiple covariates need to be modeled nonparametrically, and propose an estimation procedure based on a spline approximation of the nonparametric part of the model and the generalized estimating equations (GEE). Although the model in consideration is natural and useful in many practical applications, the literature on this model is very limited because of challenges in dealing with dependent data for nonparametric additive models. We show that the proposed estimators are consistent and asymptotically normal even if the covariance structure is misspecified. An explicit consistent estimate of the asymptotic variance is also provided. Moreover, we derive the semiparametric efficiency score and information bound under general moment conditions. By showing that our estimators achieve the semiparametric information bound, we effectively establish their efficiency in a stronger sense than what is typically considered for GEE. The derivation of our asymptotic results relies heavily on the empirical processes tools that we develop for the longitudinal/clustered data. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2014 ISI/BS.

  9. Generalized neurofuzzy network modeling algorithms using Bézier-Bernstein polynomial functions and additive decomposition.

    Science.gov (United States)

    Hong, X; Harris, C J

    2000-01-01

    This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems based upon basis functions that are Bézier-Bernstein polynomial functions. The approach is generalized in that it copes with n-dimensional inputs by utilising an additive decomposition construction to overcome the curse of dimensionality associated with high n. This new construction algorithm also introduces univariate Bézier-Bernstein polynomial functions for the completeness of the generalized procedure. Like B-spline expansion based neurofuzzy systems, Bézier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as nonnegativity of the basis functions, unity of support, and interpretability of the basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and a Delaunay input space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. This new modeling network is based on an additive decomposition approach together with two separate basis function formation approaches for the univariate and bivariate Bézier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data based modeling approach.
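
    The basic building block is easy to reproduce: univariate Bernstein basis functions are nonnegative and sum to one on [0, 1], which is what allows them to be read as fuzzy membership functions. The sketch below forms the basis matrix and fits its weights by conventional least squares on synthetic data (it is not the paper's full additive-decomposition algorithm).

    ```python
    import numpy as np
    from math import comb

    def bernstein_basis(x, degree):
        """Matrix of Bernstein basis functions B_{i,n}(x) evaluated at points x."""
        x = np.asarray(x)
        return np.column_stack([
            comb(degree, i) * x**i * (1 - x)**(degree - i) for i in range(degree + 1)
        ])

    x = np.linspace(0, 1, 200)
    y = np.sin(2 * np.pi * x) + 0.1 * np.random.default_rng(0).normal(size=x.size)

    Phi = bernstein_basis(x, degree=8)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # conventional least-squares weights
    print("fitted weights:", np.round(w, 2))
    ```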

  10. CalTOX, a multimedia total exposure model for hazardous-waste sites; Part 1, Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    McKone, T.E.

    1993-06-01

    CalTOX has been developed as a spreadsheet model to assist in health-risk assessments that address contaminated soils and the contamination of adjacent air, surface water, sediments, and ground water. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify and reduce uncertainty in multimedia, multiple-pathway exposure models. This report provides an overview of the CalTOX model components, lists the objectives of the model, describes the philosophy under which the model was developed, identifies the chemical classes for which the model can be used, and describes critical sensitivities and uncertainties. The multimedia transport and transformation model is a dynamic model that can be used to assess time-varying concentrations of contaminants introduced initially to soil layers or for contaminants released continuously to air or water. This model assists the user in examining how chemical and landscape properties impact both the ultimate route and quantity of human contact. Multimedia, multiple pathway exposure models are used in the CalTOX model to estimate average daily potential doses within a human population in the vicinity of a hazardous substances release site. The exposure models encompass twenty-three exposure pathways. The exposure assessment process consists of relating contaminant concentrations in the multimedia model compartments to contaminant concentrations in the media with which a human population has contact (personal air, tap water, foods, household dusts, soils, etc.). The average daily dose is the product of the exposure concentrations in these contact media and an intake or uptake factor that relates the concentrations to the distributions of potential dose within the population.
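
    The exposure bookkeeping described above reduces to a sum of concentration-times-intake products over pathways. The toy numbers below are invented and are not CalTOX parameter values.

    ```python
    # Average daily potential dose as a sum over exposure pathways.
    # All concentrations and intake factors are invented placeholders.
    pathways = {
        # pathway: (contact-medium concentration, intake factor)
        "tap water (mg/L)": (0.002, 0.03),      # intake in L/kg-day
        "personal air (mg/m3)": (0.0005, 0.3),  # intake in m3/kg-day
        "soil ingestion (mg/kg)": (1.5, 1e-6),  # intake in kg/kg-day
    }
    daily_dose = sum(conc * intake for conc, intake in pathways.values())
    print(f"average daily potential dose = {daily_dose:.2e} mg/kg-day")
    ```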

  11. Integrating multidisciplinary science, modelling and impact data into evolving, syn-event volcanic hazard mapping and communication: A case study from the 2012 Tongariro eruption crisis, New Zealand

    Science.gov (United States)

    Leonard, Graham S.; Stewart, Carol; Wilson, Thomas M.; Procter, Jonathan N.; Scott, Bradley J.; Keys, Harry J.; Jolly, Gill E.; Wardman, Johnny B.; Cronin, Shane J.; McBride, Sara K.

    2014-10-01

    New Zealand's Tongariro National Park volcanoes produce hazardous eruptions every few years to decades. On 6 August 2012 the Te Maari vent of Tongariro Volcano erupted, producing a series of explosions and a fine ash of minor volume which was dispersed rapidly to the east. This manuscript presents a summary of the eruption impacts and the way these supported science communication during the crisis, particularly in terms of hazard map development. The most significant proximal impact was damage from pyroclastic surges and ballistics to the popular and economically-important Tongariro Alpine Crossing track. The only hazard to affect the medial impact zone was a few millimetres of ashfall with minor impacts. Field testing indicated that the Te Maari ash had extremely low resistivity when wetted, implying a very high potential to cause disruption to nationally-important power transmission networks via the mechanism of insulator flashover. This was not observed, presumably due to insufficient ash accumulation on insulators. Virtually no impacts from distal ashfall were reported. Post-event analysis of PM10 data demonstrates the additional value of regional air quality monitoring networks in quantifying population exposure to airborne respirable ash. While the eruption was minor, it generated a high level of public interest and a demand for information on volcanic hazards and impacts from emergency managers, the public, critical infrastructure managers, health officials, and the agriculture sector. Meeting this demand fully taxed available resources. We present here aspects of the New Zealand experience which may have wider applicability in moving towards improved integration of hazard impact information, mapping, and communication. These include wide use of a wiki technical clearinghouse and email listservs, a focus on multi-agency consistent messages, and a recently developed environment of collaboration and alignment of both research funding and technical science advice

  12. Local Tsunami Hazard In The Marquesas Islands (french Polynesia) : Numerical Modeling of The 1999 Fatu Hiva Landslide and Tsunami

    Science.gov (United States)

    Hébert, H.; Schindelé, F.; Heinrich, P.; Piatanesi, A.; Okal, E. A.

    In French Polynesia, the Marquesas Islands are particularly prone to amplification of tsunamis generated at the Pacific Rim, due to relatively mild submarine slopes and to large open bays not protected by any coral reef. These islands are also threatened by local tsunamis, as shown by the recent 1999 event on Fatu Hiva. On September 13, 1999, Omoa Bay was struck by 2 to 5 m high water waves: several buildings, among them the school, were flooded and destroyed but no lives were lost. Observations gathered during a post-event survey revealed the recent collapse into the sea of a 300x300 m, at least 20-m thick, cliff located 5 km southeast of Omoa. This cliff failure most certainly triggered the tsunami waves since the cliff was reported intact 45 min earlier. We simulate the tsunami generation due to a subaerial landslide, using a finite-difference model assimilating the landslide to a flow of granular material. Numerical modeling shows that a 0.0024-km3 landslide located in the presumed source area accounts well for the tsunami waves reported in Omoa Bay. We show that the striking amplification observed in Omoa Bay is related to the trapping of waves due to the shallow submarine shelf surrounding the island. These results stress the local tsunami hazard that should be taken into account in the natural hazard assessment and mitigation of the area, where historical cliff collapses can be observed and should happen again.

  13. Predicting mastitis in dairy cows using neural networks and generalized additive models: a comparison

    DEFF Research Database (Denmark)

    Anantharama Ankinakatte, Smitha; Norberg, Elise; Løvendahl, Peter;

    2013-01-01

    The aim of this paper is to develop and compare methods for early detection of oncoming mastitis with automated recorded data. The data were collected at the Danish Cattle Research Center (Tjele, Denmark). As indicators of mastitis, electrical conductivity (EC), somatic cell scores (SCS), lactate... that combines residual components into a score to improve the model. To develop and verify the model, the data are randomly divided into training and validation data sets. To predict the occurrence of mastitis, neural network models (NNs) and generalized additive models (GAMs) are developed using the training... classification with all indicators, using individual residuals rather than factor scores. When SCS is excluded, GAMs show better classification results when milk yield is also excluded. In conclusion, the study shows that NNs and GAMs are similar in their ability to detect mastitis, a sensitivity of almost 75...

  14. Use of additive technologies for practical working with complex models for foundry technologies

    Science.gov (United States)

    Olkhovik, E.; Butsanets, A. A.; Ageeva, A. A.

    2016-07-01

    The article presents the results of research on applying additive technology (3D printing) to develop geometrically complex models of cast parts. Investment casting is a well-known and widely used technology for the production of complex parts. The work proposes the use of 3D printing technology for manufacturing model parts, which are removed by thermal destruction. Traditional methods of tooling production for investment casting rely on manual labor, which has problems with dimensional accuracy, and on CNC technology, which is less used. Such a scheme has low productivity and demands considerable time. We offer an alternative method, which consists of printing the main components using a 3D printer (in PLA and ABS) and subsequently producing casting models from them. In this article, the main technological methods are considered and their problems are discussed. The dimensional accuracy of the models in comparison with investment casting technology is considered as the main aspect.

  15. Generalized Additive Mixed-Models for Pharmacology Using Integrated Discrete Multiple Organ Co-Culture

    Science.gov (United States)

    Ingersoll, Thomas; Cole, Stephanie; Madren-Whalley, Janna; Booker, Lamont; Dorsey, Russell; Li, Albert; Salem, Harry

    2016-01-01

    Integrated Discrete Multiple Organ Co-culture (IDMOC) is emerging as an in-vitro alternative to in-vivo animal models for pharmacology studies. IDMOC allows dose-response relationships to be investigated at the tissue and organoid levels, yet, these relationships often exhibit responses that are far more complex than the binary responses often measured in whole animals. To accommodate departure from binary endpoints, IDMOC requires an expansion of analytic techniques beyond simple linear probit and logistic models familiar in toxicology. IDMOC dose-responses may be measured at continuous scales, exhibit significant non-linearity such as local maxima or minima, and may include non-independent measures. Generalized additive mixed-modeling (GAMM) provides an alternative description of dose-response that relaxes assumptions of independence and linearity. We compared GAMMs to traditional linear models for describing dose-response in IDMOC pharmacology studies. PMID:27110941

  16. Modeling of sulfation of potassium chloride by ferric sulfate addition during grate-firing of biomass

    DEFF Research Database (Denmark)

    Wu, Hao; Jespersen, Jacob Boll; Aho, Martti;

    2013-01-01

    ... the less harmful K2SO4. In the present study the decomposition of ferric sulfate is studied in a fast-heating rate thermogravimetric analyzer (TGA), and a kinetic model is proposed to describe the decomposition process. The yields of SO2 and SO3 from ferric sulfate decomposition are investigated in a laboratory-scale tube reactor. It is revealed that approximately 40% of the sulfur is released as SO3, the remaining fraction being released as SO2. The proposed decomposition model of ferric sulfate is combined with a detailed gas phase kinetic model of KCl sulfation, and a simplified model of K2SO4 condensation, in order to simulate the sulfation of KCl by ferric sulfate addition during grate-firing of biomass. The simulation results show good agreement with the experimental data obtained in a pilot-scale biomass grate-firing reactor, where different amounts of ferric sulfate were injected on the grate...

  17. Choosing components in the additive main effect and multiplicative interaction (AMMI) models

    Directory of Open Access Journals (Sweden)

    Dias Carlos Tadeu dos Santos

    2006-01-01

    Full Text Available The additive main effect and multiplicative interaction (AMMI) models allow analysts to detect interactions between rows and columns in a two-way table. However, many methods have been proposed in the literature to determine the number of multiplicative components to include in the AMMI model. These methods typically give different results for any particular data set, so the user needs some guidance as to which method to use. In this paper we compare four commonly used methods using simulated data based on real experiments, and provide some general recommendations.
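
    The decomposition these methods operate on can be sketched briefly: main effects are swept out of the two-way table and the doubly centered interaction residual is factored by SVD, after which the competing methods disagree about how many singular components to keep. The table below is simulated, not an experimental data set.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    table = rng.normal(5.0, 1.0, size=(8, 6))        # genotypes x environments (simulated)

    grand = table.mean()
    row_eff = table.mean(axis=1, keepdims=True) - grand
    col_eff = table.mean(axis=0, keepdims=True) - grand
    interaction = table - grand - row_eff - col_eff  # doubly centered residual

    # Multiplicative terms of the AMMI model are the singular components of the
    # interaction matrix; their squared singular values give explained shares.
    u, s, vt = np.linalg.svd(interaction, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print("share of interaction per component:", np.round(explained, 2))
    ```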

  18. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  19. Model for Assembly Line Re-Balancing Considering Additional Capacity and Outsourcing to Face Demand Fluctuations

    Science.gov (United States)

    Samadhi, TMAA; Sumihartati, Atin

    2016-02-01

    The most critical stage in a garment industry is the sewing process because, generally, it consists of a number of operations and a large number of sewing machines for each operation. Therefore, it requires a balancing method that can assign tasks to workstations with balanced workloads. Many studies on assembly line balancing assume a new assembly line, but in reality, due to demand fluctuations and increases, re-balancing is needed. To cope with these fluctuating demand changes, additional capacity can be provided by investing in spare sewing machines and by paying for sewing services through outsourcing. This study develops an assembly line balancing (ALB) model for an existing line to cope with fluctuating demand. Capacity redesign is decided if the fluctuating demand exceeds the available capacity, through a combination of investing in new machines and outsourcing, while seeking to minimize the cost of future idle capacity. The objective of the model is to minimize the total cost of the assembly line, which consists of operating costs, machine costs, capacity addition costs, losses due to idle capacity and outsourcing costs. The model developed is an integer programming model. The model is tested on a set of data covering one year of demand, with an existing fleet of 41 sewing machines. The result shows that a maximum additional capacity of up to 76 machines is required when demand increases by 60% over the average, with cost parameters held equal.

  20. Modeling the use of sulfate additives for potassium chloride destruction in biomass combustion

    OpenAIRE

    Wu, Hao; Grell, Morten Nedergaard; Jespersen, Jacob Boll; Aho, Martti; Jappe Frandsen, Flemming; Glarborg, Peter

    2013-01-01

    Potassium chloride, KCl, formed from biomass combustion may lead to ash deposition and corrosion problems in boilers. Sulfates are effective additives for converting KCl to the less harmful K2SO4. In the present study, the decomposition of ammonium sulfate, aluminum sulfate and ferric sulfate was studied respectively in a fast-heating rate thermogravimetric analyzer (TGA) for deriving a kinetic model. The yields of SO2 and SO3 from the decomposition were studied in a tube reactor, revealing t...

  1. Internal structure and volcanic hazard potential of Mt Tongariro, New Zealand, from 3D gravity and magnetic models

    Science.gov (United States)

    Miller, Craig A.; Williams-Jones, Glyn

    2016-06-01

    A new 3D geophysical model of the Mt Tongariro Volcanic Massif (TgVM), New Zealand, provides a high resolution view of the volcano's internal structure and hydrothermal system, from which we derive implications for volcanic hazards. Geologically constrained 3D inversions of potential field data provide a greater level of insight into the volcanic structure than is possible from unconstrained models. A complex region of gravity highs and lows (± 6 mGal) is set within a broader, ~ 20 mGal gravity low. A magnetic high (1300 nT) is associated with Mt Ngauruhoe, while a substantial, thick, demagnetised area occurs to the north, coincident with a gravity low and interpreted as representing the hydrothermal system. The hydrothermal system is constrained to the west by major faults, interpreted as an impermeable barrier to fluid migration and extends to basement depth. These faults are considered low probability areas for future eruption sites, as there is little to indicate they have acted as magmatic pathways. Where the hydrothermal system coincides with steep topographic slopes, an increased likelihood of landslides is present and the newly delineated hydrothermal system maps the area most likely to have phreatic eruptions. Such eruptions, while small on a global scale, are important hazards at the TgVM as it is a popular hiking area with hundreds of visitors per day in close proximity to eruption sites. The model shows that the volume of volcanic material erupted over the lifespan of the TgVM is five to six times greater than previous estimates, suggesting a higher rate of magma supply, in line with global rates of andesite production. We suggest that our model of physical property distribution can be used to provide constraints for other models of dynamic geophysical processes occurring at the TgVM.

  2. Bi-Objective Modelling for Hazardous Materials Road-Rail Multimodal Routing Problem with Railway Schedule-Based Space-Time Constraints.

    Science.gov (United States)

    Sun, Yan; Lang, Maoxiang; Wang, Danzhu

    2016-07-28

    The transportation of hazardous materials is always accompanied by considerable risk that will impact public and environment security. As an efficient and reliable transportation organization, a multimodal service should participate in the transportation of hazardous materials. In this study, we focus on transporting hazardous materials through the multimodal service network and explore the hazardous materials multimodal routing problem from the operational level of network planning. To formulate this problem more practicably, minimizing the total generalized costs of transporting the hazardous materials and the social risk along the planned routes are set as the optimization objectives. Meanwhile, the following formulation characteristics will be comprehensively modelled: (1) specific customer demands; (2) multiple hazardous material flows; (3) capacitated schedule-based rail service and uncapacitated time-flexible road service; and (4) environmental risk constraint. A bi-objective mixed integer nonlinear programming model is first built to formulate the routing problem that combines the formulation characteristics above. Then linear reformations are developed to linearize and improve the initial model so that it can be effectively solved by exact solution algorithms on standard mathematical programming software. By utilizing the normalized weighted sum method, we can generate the Pareto solutions to the bi-objective optimization problem for a specific case. Finally, a large-scale empirical case study from the Beijing-Tianjin-Hebei Region in China is presented to demonstrate the feasibility of the proposed methods in dealing with the practical problem. Various scenarios are also discussed in the case study.
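
    The normalized weighted sum method mentioned above can be illustrated on a toy route set (the candidate routes and their cost/risk values below are invented, not the case-study data): both objectives are scaled to [0, 1], and sweeping the weight traces out Pareto-efficient choices.

    ```python
    import numpy as np

    routes = {                    # hypothetical candidate routes: (generalized cost, social risk)
        "road only": (120.0, 0.80),
        "rail + road": (95.0, 0.55),
        "rail heavy": (140.0, 0.30),
    }
    names = list(routes)
    vals = np.array([routes[n] for n in names])
    # Normalize each objective to [0, 1] so the weights are comparable.
    norm = (vals - vals.min(axis=0)) / (vals.max(axis=0) - vals.min(axis=0))

    for w in np.linspace(0, 1, 5):           # weight on cost vs. risk
        scores = w * norm[:, 0] + (1 - w) * norm[:, 1]
        print(f"w={w:.2f}: best route = {names[int(scores.argmin())]}")
    ```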

  5. Bayesian structured additive regression modeling of epidemic data: application to cholera

    Directory of Open Access Journals (Sweden)

    Osei Frank B

    2012-08-01

    Full Text Available Abstract Background A significant interest in spatial epidemiology lies in identifying associated risk factors which enhance the risk of infection. Most studies, however, make no, or limited use of the spatial structure of the data, as well as possible nonlinear effects of the risk factors. Methods We develop a Bayesian Structured Additive Regression model for cholera epidemic data. Model estimation and inference is based on a fully Bayesian approach via Markov Chain Monte Carlo (MCMC) simulations. The model is applied to cholera epidemic data in the Kumasi Metropolis, Ghana. Proximity to refuse dumps, density of refuse dumps, and proximity to potential cholera reservoirs were modeled as continuous functions; presence of slum settlers and population density were modeled as fixed effects, whereas spatial references to the communities were modeled as structured and unstructured spatial effects. Results We observe that the risk of cholera is associated with slum settlements and high population density. The risk of cholera is equal and lower for communities with fewer refuse dumps, but variable and higher for communities with more refuse dumps. The risk is also lower for communities distant from refuse dumps and potential cholera reservoirs. The results also indicate distinct spatial variation in the risk of cholera infection. Conclusion The study highlights the usefulness of Bayesian semi-parametric regression models in analyzing public health data. These findings could serve as novel information to help health planners and policy makers in making effective decisions to control or prevent cholera epidemics.

  6. Automated Standard Hazard Tool

    Science.gov (United States)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and possibly, different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  7. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models

    Directory of Open Access Journals (Sweden)

    Monica Musio

    2012-11-01

    Full Text Available Our aim is to develop a method to help re-allocate healthcare resources linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources because aged people require more specialised medical care, notably due to cancer. We propose a method to monitor changes in cancer incidence in space and time, taking into account two age categories according to the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, according to the methodology presented in Wood, Generalised additive models: an introduction with R. Chapman and Hall/CRC, 2006. Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase in the incidence of cancer over time and an obvious spatial pattern for people over 70 years of age, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for old people cancer care in this sub-region.

  8. Updated Colombian Seismic Hazard Map

    Science.gov (United States)

    Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.

    2013-05-01

    The Colombian seismic hazard map used by the National Building Code (NSR-98), in effect until 2009, was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics, which, in addition to seismic activity in Colombia with destructive effects, has motivated the interest and the need to develop a new seismic hazard assessment for this country. Taking advantage of new instrumental information sources such as new broadband stations of the National Seismological Network, new historical seismicity data, the availability of standardized global databases, and, in general, advances in models and techniques, a new Colombian seismic hazard map was developed. A PSHA model was applied because it incorporates the effects of all seismic sources that may affect a particular site, addressing the uncertainties caused by the parameters and assumptions defined in this kind of study. First, the seismic source geometry and a complete and homogeneous seismic catalog were defined, and the occurrence-rate parameters of each seismic source were calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México (UNAM, National Autonomous University of Mexico). A uniformly spaced grid of 0.1° was used to calculate the peak ground acceleration (PGA) and response spectral values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated. With the results, it is
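
    As a stripped-down illustration of one PSHA ingredient (not the CRISIS2007 implementation), the sketch below combines truncated Gutenberg-Richter recurrence for a single source with a lognormal ground-motion model to get the annual exceedance rate of a PGA level; all coefficients are invented.

    ```python
    import numpy as np
    from scipy.stats import norm

    def exceedance_rate(pga, rate_m4, b=1.0, mmax=7.5, dist_km=30.0, sigma=0.6):
        mags = np.arange(4.0, mmax, 0.1)
        # Truncated Gutenberg-Richter: distribute the annual event rate over bins.
        weights = 10**(-b * (mags - 4.0))
        weights = rate_m4 * weights / weights.sum()
        # Toy attenuation: median ln(PGA in g) as a function of magnitude, distance.
        ln_med = -3.5 + 0.8 * mags - 1.1 * np.log(dist_km)
        p_exceed = norm.sf((np.log(pga) - ln_med) / sigma)
        return np.sum(weights * p_exceed)

    rate = exceedance_rate(pga=0.2, rate_m4=0.5)
    print(f"annual exceedance rate: {rate:.4f}, return period: {1/rate:.0f} years")
    ```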

  9. Evaporation model for beam based additive manufacturing using free surface lattice Boltzmann methods

    International Nuclear Information System (INIS)

    Evaporation plays an important role in many technical applications including beam-based additive manufacturing processes, such as selective electron beam or selective laser melting (SEBM/SLM). In this paper, we describe an evaporation model which we employ within the framework of a two-dimensional free surface lattice Boltzmann method. With this method, we solve the hydrodynamics as well as thermodynamics of the molten material taking into account the mass and energy losses due to evaporation and the recoil pressure acting on the melt pool. Validation of the numerical model is performed by measuring maximum melt depths and evaporative losses in samples of pure titanium and Ti–6Al–4V molten by an electron beam. Finally, the model is applied to create processing maps for an SEBM process. The results predict that the penetration depth of the electron beam, which is a function of the acceleration voltage, has a significant influence on evaporation effects. (paper)

  10. A model for the additional dissipation in transients: a volume viscosity coefficient

    International Nuclear Information System (INIS)

    During transients with water column separation, the formation of a two-phase mixture in more or less large conduit lengths where pressures are close to vapour pressure, does produce, as is well known, additional dissipations besides the dissipations which are normally met with during transients without cavitation. The effects, certainly related to the presence of a gaseous phase, have been represented in several models, among which those of Streeter (1970), Kalkwijk and de Vries (1971), Safwat (1972), Kalkwijk (1974). The model developed in this paper, already presented at the Vallombrosa meeting (1974), introduces a global volume viscosity coefficient. This model is compared with others and with some experimental results: this formulation does complete previous contributions and by means of a specific analytic representation of the dissipative term, gives a correct damping o

  11. Evaporation model for beam based additive manufacturing using free surface lattice Boltzmann methods

    Science.gov (United States)

    Klassen, Alexander; Scharowsky, Thorsten; Körner, Carolin

    2014-07-01

    Evaporation plays an important role in many technical applications including beam-based additive manufacturing processes, such as selective electron beam or selective laser melting (SEBM/SLM). In this paper, we describe an evaporation model which we employ within the framework of a two-dimensional free surface lattice Boltzmann method. With this method, we solve the hydrodynamics as well as thermodynamics of the molten material taking into account the mass and energy losses due to evaporation and the recoil pressure acting on the melt pool. Validation of the numerical model is performed by measuring maximum melt depths and evaporative losses in samples of pure titanium and Ti-6Al-4V molten by an electron beam. Finally, the model is applied to create processing maps for an SEBM process. The results predict that the penetration depth of the electron beam, which is a function of the acceleration voltage, has a significant influence on evaporation effects.

  12. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    Science.gov (United States)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
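
    A hedged numerical illustration of how the four relationship types change joint-occurrence probabilities for two hazards A and B (all probabilities are invented, and the paper's own quantification may differ):

    ```python
    p_a, p_b = 0.10, 0.05                    # annual occurrence probabilities (invented)

    p_independent = p_a * p_b                # separate triggers: probabilities multiply
    p_mutex = 0.0                            # mutually exclusive hazards never co-occur

    p_trigger, p_a_given_t, p_b_given_t = 0.15, 0.7, 0.4
    p_parallel = p_trigger * p_a_given_t * p_b_given_t  # one trigger can set off both

    p_b_given_a = 0.3
    p_series = p_a * p_b_given_a             # A occurs first and raises the chance of B

    for name, p in [("independent", p_independent), ("mutex", p_mutex),
                    ("parallel", p_parallel), ("series", p_series)]:
        print(f"{name:>12}: P(A and B) = {p:.4f}")
    ```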

  13. Numerical modeling of microstructure evolution during laser additive manufacturing of a nickel-based superalloy

    International Nuclear Information System (INIS)

    A multi-scale model that combines the finite element method and stochastic analysis is developed to simulate the evolution of the microstructure of an Nb-bearing nickel-based superalloy during laser additive manufacturing solidification. Through the use of this model, the nucleation and growth of dendrites, the segregation of niobium (Nb) and the formation of Laves phase particles during the solidification are investigated to provide the relationship between the solidification conditions and the resultant microstructure, especially in the morphology of Laves phase particles. The study shows that small equiaxed dendrite arm spacing under a high cooling rate and low temperature gradient to growth rate (G/R) ratio is beneficial for forming discrete Laves phase particles. In contrast, large columnar dendrite arm spacing under a low cooling rate and high G/R ratio tends to produce continuously distributed coarse Laves phase particles, which are known to be detrimental to mechanical properties. In addition, the improvement of hot cracking resistance by controlling the morphology of Laves phase particles is discussed by analyzing the cracking pattern and microstructure in the laser deposited material. This work provides valuable understanding of solidification microstructure development in Nb-bearing nickel-based superalloys, like IN 718, during laser additive manufacturing and constitutes a fundamental basis for controlling the microstructure to minimize the formation of deleterious Laves phase particles

  14. "Bunched Black Swans" in Complex Geosystems: Cross-Disciplinary Approaches to the Additive and Multiplicative Modelling of Correlated Extreme Bursts

    Science.gov (United States)

    Watkins, N. W.; Rypdal, M.; Lovsletten, O.

    2012-12-01

    For all natural hazards, the question of when the next "extreme event" (cf. Taleb's "black swans") is expected is of obvious importance. In the environmental sciences users often frame such questions in terms of average "return periods", e.g. "is an X meter rise in the Thames water level a 1-in-Y year event?". Frequently, however, we also care about the emergence of correlation, and whether the probability of several big events occurring in close succession is truly independent, i.e. are the black swans "bunched". A "big event", or a "burst", defined by its integrated signal above a threshold, might be a single, very large, event, or, instead, could in fact be a correlated series of "smaller" (i.e. less wildly fluctuating) events. Several available stochastic approaches provide quantitative information about such bursts, including Extreme Value Theory (EVT); the theory of records; level sets; sojourn times; and models of space-time "avalanches" of activity in non-equilibrium systems. Some focus more on the probability of single large events. Others are more concerned with extended dwell times above a given spatiotemporal threshold. However, the state of the art is not yet fully integrated, and the above-mentioned approaches differ in fundamental aspects. EVT is perhaps the best known in the geosciences. It is concerned with the distribution obeyed by the extremes of datasets, e.g. the 100 values obtained by considering the largest daily temperature recorded in each of the years of a century. However, the pioneering work from the 1920s on which EVT originally built was based on independent identically distributed samples, and took no account of memory and correlation that characterise many natural hazard time series. Ignoring this would fundamentally limit our ability to forecast; so much subsequent activity has been devoted to extending EVT to encompass dependence. A second group of approaches, by contrast, has notions of time and thus possible non
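
    For the EVT strand mentioned above, a brief block-maxima sketch using SciPy: fit a generalized extreme value (GEV) distribution to synthetic "annual maxima" and read off a return level. Note that the i.i.d. assumption baked into this fit is exactly what correlated, "bunched" event series violate.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Stand-in data: 100 annual maxima (e.g., largest daily water level
    # per year); real hazard series would replace this.
    annual_maxima = rng.gumbel(loc=3.0, scale=0.5, size=100)

    # Classical block-maxima EVT: fit a GEV to the maxima.
    shape, loc, scale = stats.genextreme.fit(annual_maxima)

    # Return level: the level exceeded on average once every 100 years.
    return_period = 100.0
    level_100yr = stats.genextreme.ppf(1.0 - 1.0 / return_period,
                                       shape, loc=loc, scale=scale)
    print(f"100-year return level: {level_100yr:.2f} m")
    ```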

  15. Hydrological risks in anthropized watersheds: modeling of hazard, vulnerability and impacts on population from south-west of Madagascar

    Science.gov (United States)

    Mamy Rakotoarisoa, Mahefa; Fleurant, Cyril; Taibi, Nuscia; Razakamanana, Théodore

    2016-04-01

    Hydrological risks, especially floods, are recurrent in the Fiherenana watershed in southwest Madagascar. The city of Toliara, located at the outlet of the river basin, is subjected each year to hurricane hazards and floods. The stakes are of major importance in this part of the island. This study begins with an analysis of the hazard, collecting all existing hydro-climatic data on the catchment. It then seeks to determine trends, despite the significant lack of data, using simple statistical models (decomposition of time series). Two approaches are then used to assess the vulnerability of the city of Toliara and the surrounding villages. The first is a static approach based on land surveys and GIS. The second is a multi-agent-based simulation model. The first step is the mapping of a vulnerability index, which combines several static criteria. This is a microscale indicator (the scale used is the individual house). For each house, there are several vulnerability criteria, such as the potential water depth, the flow rate, or the architectural typology of the building. For the second part, simulations involving agents are used to evaluate the degree of vulnerability of homes to flooding. Agents are individual entities to which behaviours can be assigned in order to simulate a given phenomenon. The aim is not to assign a criterion to the house as a physical building, such as its architectural typology or its strength; rather, the model estimates the chances of the occupants of the house escaping a catastrophic flood. For this purpose, we compare various settings and scenarios. Some scenarios take into account the effect of certain decisions made by the responsible entities (for example, informing and raising awareness among villagers). The simulation consists of two essential parts taking place simultaneously in time: simulation of the rise of water and the flow using

  16. Exploration of land-use scenarios for flood hazard modeling – the case of Santiago de Chile

    Directory of Open Access Journals (Sweden)

    A. Müller

    2011-04-01

    Full Text Available Urban expansion leads to modifications in land use and land cover and to the loss of vegetated areas. In some regions of the world, these developments are accelerated by a changing regional climate. As a consequence, major changes in the amount of green space can be observed in many urban regions. Amongst other factors, the amount of green space determines the availability of retention areas in a watershed. The goal of this research is to develop possible land-use and land-cover scenarios for a watershed and to explore the influence of land-use and land-cover changes on its runoff behavior using the distributed hydrological model HEC-HMS. The study area for this research is a small peri-urban watershed in the eastern area of Santiago de Chile.

    Three spatially explicit exploratory land-use/land-cover scenario alternatives were developed based on the analysis of previous land-use developments using high-resolution satellite data, on the analysis of urban planning laws, on the analysis of climate change predictions, and on expert interviews. Modeling the resulting changes in runoff allows predictions to be made about the changes in flood hazard which the adjacent urban areas will face after heavy winter precipitation events. The paper shows how HEC-HMS was used, applying a distributed event-modeling approach. The derived runoff values are combined with existing flood hazard maps and can be regarded as an important source of information for the adaptation to changing conditions in the study area. The most significant finding is that the land-use changes that have to be expected after long drought periods pose the highest risk with respect to floods.

  17. Topsoil organic carbon content of Europe, a new map based on a generalised additive model

    Science.gov (United States)

    de Brogniez, Delphine; Ballabio, Cristiano; Stevens, Antoine; Jones, Robert J. A.; Montanarella, Luca; van Wesemael, Bas

    2014-05-01

    There is an increasing demand for up-to-date spatially continuous organic carbon (OC) data for global environmental and climate modeling. Whilst the current map of topsoil organic carbon content for Europe (Jones et al., 2005) was produced by applying expert-knowledge based pedo-transfer rules on large soil mapping units, the aim of this study was to replace it by applying digital soil mapping techniques on the first European harmonised geo-referenced topsoil (0-20 cm) database, which arises from the LUCAS (land use/cover area frame statistical survey) survey. A generalized additive model (GAM) was calibrated on 85% of the dataset (ca. 17 000 soil samples) and a backward stepwise approach selected slope, land cover, temperature, net primary productivity, latitude and longitude as environmental covariates (500 m resolution). The validation of the model (applied on 15% of the dataset) gave an R2 of 0.27. We observed that most organic soils were under-predicted by the model and that soils of Scandinavia were also poorly predicted. The model showed an RMSE of 42 g kg-1 for mineral soils and of 287 g kg-1 for organic soils. The map of predicted OC content showed the lowest values in Mediterranean countries and in croplands across Europe, whereas the highest OC contents were predicted in wetlands, woodlands and in mountainous areas. The map of standard error of the OC model predictions showed high values in northern latitudes, wetlands, moors and heathlands, whereas low uncertainty was mostly found in croplands. A comparison of our results with the map of Jones et al. (2005) showed a general agreement on the prediction of mineral soils' OC content, most probably because the models use some common covariates, namely land cover and temperature. Our model however failed to predict values of OC content greater than 200 g kg-1, which we explain by the imposed unimodal distribution of our model, whose mean is tilted towards the majority of soils, which are mineral. Finally, average

  18. Obtaining 3D PLY Part from DEM Surface Data for Terrain Modeling by Additive Fabrication

    Directory of Open Access Journals (Sweden)

    YASHWANT KUMAR MODI

    2014-04-01

    Full Text Available Physical modeling of the earth's terrain has been gaining popularity among architects and land-use planners in the last few years. This is partly because of the limitations of cartographic maps and virtual-reality techniques, and partly because of the availability of rapid manufacturing processes that can produce physical models of terrain with accurate freeform surfaces. Recently many researchers have employed Additive Manufacturing (AM) processes to fabricate physical scale models of terrains. However, they obtained a physical model only after several steps: they used more than one software package to translate surface DEM data into faceted models, leading to loss of data in intermediate file-format conversions. This paper presents a methodology which can convert surface DEM data directly into the PLY format in a single step. This work also eliminates the data loss associated with translation of data into intermediate file formats. In this paper, two data formats, DEM ASCII XYZ and Surfer Grid, have been directly converted into the PLY format. The results of the program are verified and validated with the help of sample data files as well as real-world DEM data.
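
    A simplified sketch of such a single-step conversion, assuming row-major gridded "x y z" input; the paper's actual program also handles the Surfer Grid format and other details not shown here, and the file names in the usage comment are hypothetical.

    ```python
    import numpy as np

    def xyz_grid_to_ply(xyz_path, ply_path, nx, ny):
        """Convert gridded DEM ASCII XYZ data (one 'x y z' triple per line,
        row-major over an nx-by-ny grid) directly to an ASCII PLY mesh,
        triangulating each grid cell into two faces."""
        pts = np.loadtxt(xyz_path)               # shape (nx*ny, 3)
        faces = []
        for j in range(ny - 1):
            for i in range(nx - 1):
                v0 = j * nx + i                  # corners of one grid cell
                v1, v2, v3 = v0 + 1, v0 + nx, v0 + nx + 1
                faces.append((v0, v1, v3))       # split the quad into
                faces.append((v0, v3, v2))       # two triangles
        with open(ply_path, "w") as out:
            out.write("ply\nformat ascii 1.0\n")
            out.write(f"element vertex {len(pts)}\n")
            out.write("property float x\nproperty float y\nproperty float z\n")
            out.write(f"element face {len(faces)}\n")
            out.write("property list uchar int vertex_indices\n")
            out.write("end_header\n")
            for x, y, z in pts:
                out.write(f"{x} {y} {z}\n")
            for a, b, c in faces:
                out.write(f"3 {a} {b} {c}\n")

    # Hypothetical usage: xyz_grid_to_ply("dem.xyz", "terrain.ply", nx=120, ny=80)
    ```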

  19. Mixed butanols addition to gasoline surrogates: Shock tube ignition delay time measurements and chemical kinetic modeling

    KAUST Repository

    AlRamadan, Abdullah S.

    2015-10-01

    The demand for fuels with high anti-knock quality has historically been rising, and will continue to increase with the development of downsized and turbocharged spark-ignition engines. Butanol isomers, such as 2-butanol and tert-butanol, have high octane ratings (RON of 105 and 107, respectively), and thus mixed butanols (68.8% by volume of 2-butanol and 31.2% by volume of tert-butanol) can be added to the conventional petroleum-derived gasoline fuels to improve octane performance. In the present work, the effect of mixed butanols addition to gasoline surrogates has been investigated in a high-pressure shock tube facility. The ignition delay times of mixed butanols stoichiometric mixtures were measured at 20 and 40 bar over a temperature range of 800-1200 K. Next, 10 vol% and 20 vol% of mixed butanols (MB) were blended with two different toluene/n-heptane/iso-octane (TPRF) fuel blends having octane ratings of RON 90/MON 81.7 and RON 84.6/MON 79.3. These MB/TPRF mixtures were investigated in the shock tube conditions similar to those mentioned above. A chemical kinetic model was developed to simulate the low- and high-temperature oxidation of mixed butanols and MB/TPRF blends. The proposed model is in good agreement with the experimental data with some deviations at low temperatures. The effect of mixed butanols addition to TPRFs is marginal when examining the ignition delay times at high temperatures. However, when extended to lower temperatures (T < 850 K), the model shows that the mixed butanols addition to TPRFs causes the ignition delay times to increase and hence behaves like an octane booster at engine-like conditions. © 2015 The Combustion Institute.
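
    By way of illustration, high-temperature shock-tube ignition delay times are often correlated with an Arrhenius expression; the sketch below fits one to hypothetical data (the paper's measured values are not reproduced here).

    ```python
    import numpy as np

    # Hypothetical shock-tube points (T [K], ignition delay tau [us]).
    T = np.array([1200.0, 1100.0, 1000.0, 900.0])
    tau = np.array([80.0, 220.0, 750.0, 3100.0])

    # High-temperature ignition delays usually follow an Arrhenius form
    #   tau = A * exp(Ea / (R*T)),  i.e.  ln(tau) is linear in 1/T,
    # so a straight-line fit of ln(tau) against 1/T yields A and Ea.
    R = 8.314462                      # gas constant [J/(mol K)]
    slope, intercept = np.polyfit(1.0 / T, np.log(tau), 1)
    A, Ea = np.exp(intercept), slope * R
    print(f"A = {A:.3g} us, Ea = {Ea / 1000:.1f} kJ/mol")
    ```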

  20. An Empirical Research on the Model of the Right in Additional Allocation of Stocks

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    How to define the value of the Right in Additional Allocation of Stocks (RAAS) plays an important role in stock markets, whether or not the shareholders execute the right. Moreover, the valuation of the RAAS and the exercise price (K) are mutual cause and effect. Based on some literature on this subject, this paper presents a model valuing the RAAS per-share. With the opening information in ShenZheng Stock Markets, we make a simulation on the RAAS's value of shenwuye, which is a shareholding corp...

  1. Generalized additive models and Lucilia sericata growth: assessing confidence intervals and error rates in forensic entomology.

    Science.gov (United States)

    Tarone, Aaron M; Foran, David R

    2008-07-01

    Forensic entomologists use blow fly development to estimate a postmortem interval. Although accurate, fly age estimates can be imprecise for older developmental stages and no standard means of assigning confidence intervals exists. Presented here is a method for modeling growth of the forensically important blow fly Lucilia sericata, using generalized additive models (GAMs). Eighteen GAMs were created to predict the extent of juvenile fly development, encompassing developmental stage, length, weight, strain, and temperature data, collected from 2559 individuals. All measures were informative, explaining up to 92.6% of the deviance in the data, though strain and temperature exerted negligible influences. Predictions made with an independent data set allowed for a subsequent examination of error. Estimates using length and developmental stage were within 5% of true development percent during the feeding portion of the larval life cycle, while predictions for postfeeding third instars were less precise, but within expected error.
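
    A minimal sketch of the idea using the pyGAM package and synthetic stand-in data: smooth terms for length and temperature predict percent of development, and the model's prediction intervals supply the kind of confidence statement the abstract calls for. The variable names and ranges are illustrative assumptions, not the paper's measurements.

    ```python
    import numpy as np
    from pygam import LinearGAM, s

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for the fly-growth data: percent of development
    # as a function of larval length [mm] and rearing temperature [C].
    n = 500
    length = rng.uniform(2.0, 20.0, n)
    temp = rng.uniform(15.0, 35.0, n)
    pct_dev = 4.5 * length + 0.3 * temp + rng.normal(0.0, 5.0, n)

    # One smooth term per covariate, a typical GAM specification.
    X = np.column_stack([length, temp])
    gam = LinearGAM(s(0) + s(1)).fit(X, pct_dev)

    # Prediction intervals give a confidence statement on the age estimate.
    X_new = np.array([[12.0, 25.0]])
    print(gam.predict(X_new))
    print(gam.prediction_intervals(X_new, width=0.95))
    ```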

  2. Hybrid 2D-3D modelling of GTA welding with filler wire addition

    KAUST Repository

    Traidia, Abderrazak

    2012-07-01

    A hybrid 2D-3D model for the numerical simulation of Gas Tungsten Arc welding is proposed in this paper. It offers the possibility to predict the temperature field as well as the shape of the solidified weld joint for different operating parameters, with relatively good accuracy and reasonable computational cost. Also, an original approach to simulate the effect of immersing a cold filler wire in the weld pool is presented. The simulation results reveal two important observations. First, the weld pool depth is locally decreased in the presence of filler metal, which is due to the energy absorption by the cold feeding wire from the hot molten pool. In addition, the weld shape, maximum temperature and thermal cycles in the workpiece are relatively well predicted even when a 2D model for the arc plasma region is used. © 2012 Elsevier Ltd. All rights reserved.

  3. Horizontal deployment of the search for potentially hazardous facilities on digital plant model. Application of Heinrich's law

    International Nuclear Information System (INIS)

    This paper proposes an innovative method of searching for potentially hazardous facilities having conditions similar to those of facilities where accidents have occurred. By providing a digital plant model equipped with knowledge from the drawings, specification sheets, databases, and standards for design, maintenance and safety, a logical search for risky facilities is conducted and the results are visualized by color on the knowledge-based drawings. Heinrich's law holds that every severe accident is accompanied by roughly 30 middle-level troubles and 300 minor ones. To prevent critical accidents it is necessary to prevent minor troubles. The proposed digital search helps prevent critical accidents by finding similar causes among minor troubles. Sharing trouble information among plant owners, plant makers, and third parties, including municipal authorities, is also important. (author)

  4. Guarana provides additional stimulation over caffeine alone in the planarian model.

    Science.gov (United States)

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R; Constable, Mic Andre; Mulligan, Margaret E; Voura, Evelyn B

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians with substance exposure has been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose. PMID:25880065

  5. Guarana provides additional stimulation over caffeine alone in the planarian model.

    Directory of Open Access Journals (Sweden)

    Dimitrios Moustakas

    Full Text Available The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians with substance exposure has been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose.

  6. Comparison of the Results of Cox Proportional Hazards Model and Parametric Models in the Study of Length of Stay in A Tertiary Teaching Hospital in Tehran, Iran

    Directory of Open Access Journals (Sweden)

    Ali Zare

    2011-10-01

    Full Text Available Survival analysis is a set of methods used to analyse time-to-event data. This study aimed to compare the results of the semi-parametric Cox model with those of parametric models in determining the factors influencing the length of stay of patients in the inpatient units of Women Hospital in Tehran, Iran. In this historical cohort study, all 3421 charts of the patients admitted to the Obstetrics, Surgery and Oncology units in 2008 were reviewed, and the required patient data, such as medical insurance coverage types, admission months, days and times, inpatient units, final diagnoses, the number of diagnostic tests, and admission types, were collected. The patient length of stay in hospital leading to recovery was considered as the survival variable. To compare the semi-parametric Cox model with the parametric models (exponential, Weibull, Gompertz, log-normal, log-logistic and gamma) and find the model best fitted to the studied data, Akaike's Information Criterion (AIC) and Cox-Snell residuals were used. P<0.05 was considered statistically significant. The AIC and the Cox-Snell residual graph showed that the gamma model had the lowest AIC (4288.598) and the graph closest to the bisector. The results of the gamma model showed that the factors affecting the patient length of stay were admission day, inpatient unit, related physician specialty, emergent admission, final diagnosis, and the number of laboratory tests, radiographies and sonographies (P<0.05). The results showed that the gamma model provided a better fit to the studied data than the Cox proportional hazards model. Therefore, researchers in the healthcare field should consider this model in research on patient length of stay (LOS) when the assumption of proportional hazards is not fulfilled.
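
    A sketch of the AIC-based model comparison using the lifelines package and synthetic stand-in data; the study's gamma and Gompertz fits and its Cox-Snell diagnostics are omitted for brevity, and the data below are fabricated for illustration.

    ```python
    import numpy as np
    from lifelines import (ExponentialFitter, WeibullFitter,
                           LogNormalFitter, LogLogisticFitter)

    rng = np.random.default_rng(1)

    # Stand-in length-of-stay data [days]; discharge observed for most
    # patients (event=True), a few records censored.
    los = rng.gamma(shape=2.0, scale=2.5, size=300)
    observed = rng.uniform(size=300) < 0.95

    # Fit candidate parametric survival models and rank them by AIC.
    fitters = [ExponentialFitter(), WeibullFitter(),
               LogNormalFitter(), LogLogisticFitter()]
    for f in fitters:
        f.fit(los, event_observed=observed)
        print(f"{f.__class__.__name__:20s} AIC = {f.AIC_:.1f}")
    ```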

  7. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models.

    Science.gov (United States)

    Fan, Jianqing; Feng, Yang; Song, Rui

    2011-06-01

    A variable screening procedure via correlation learning was proposed in Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening is called NIS, a specific member of the sure independence screening. Several closely related variable screening procedures are proposed. Under general nonparametric models, it is shown that under some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, a data-driven thresholding and an iterative nonparametric independence screening (INIS) are also proposed to enhance the finite sample performance for fitting sparse additive models. The simulation results and a real data analysis demonstrate that the proposed procedure works well with moderate sample size and large dimension and performs better than competing methods. PMID:22279246
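
    A toy version of the screening idea: rank each feature by the fit of a univariate spline regression, a simplified stand-in for the paper's nonparametric marginal utility (the synthetic data and the in-sample R^2 ranking criterion here are illustrative assumptions).

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import SplineTransformer
    from sklearn.linear_model import LinearRegression

    def nis_rank(X, y, n_knots=5):
        """Marginal nonparametric screening in the spirit of NIS: fit a
        univariate B-spline regression of y on each feature and rank the
        features by the R^2 of that marginal fit."""
        scores = []
        for j in range(X.shape[1]):
            model = make_pipeline(
                SplineTransformer(n_knots=n_knots, degree=3),
                LinearRegression())
            model.fit(X[:, [j]], y)
            scores.append(model.score(X[:, [j]], y))   # in-sample R^2
        order = np.argsort(scores)[::-1]
        return order, np.asarray(scores)[order]

    rng = np.random.default_rng(0)
    n, p = 200, 1000                    # ultra-high dimension: p >> n
    X = rng.normal(size=(n, p))
    y = np.sin(2.0 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.3, n)
    order, scores = nis_rank(X, y)
    print("top-5 features:", order[:5])  # features 0 and 1 should surface
    ```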

  8. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models.

    Science.gov (United States)

    Fan, Jianqing; Feng, Yang; Song, Rui

    2011-06-01

    A variable screening procedure via correlation learning was proposed in Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening is called NIS, a specific member of the sure independence screening. Several closely related variable screening procedures are proposed. Under general nonparametric models, it is shown that under some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, a data-driven thresholding and an iterative nonparametric independence screening (INIS) are also proposed to enhance the finite sample performance for fitting sparse additive models. The simulation results and a real data analysis demonstrate that the proposed procedure works well with moderate sample size and large dimension and performs better than competing methods.

  9. Quantifying spatial disparities in neonatal mortality using a structured additive regression model.

    Directory of Open Access Journals (Sweden)

    Lawrence N Kazembe

    Full Text Available BACKGROUND: Neonatal mortality accounts for a large proportion of early childhood mortality in developing countries, with considerable geographical variation across small areas within countries. METHODS: A geo-additive logistic regression model is proposed for quantifying small-scale geographical variation in neonatal mortality and for estimating its risk factors. Random effects are introduced to capture spatial correlation and heterogeneity. The spatial correlation can be modelled using Markov random fields (MRF) when data are aggregated, or by two-dimensional P-splines when exact locations are available; unstructured spatial effects are assigned an independent Gaussian prior. Socio-economic and bio-demographic factors which may affect the risk of neonatal mortality are simultaneously estimated, as fixed effects and as nonlinear effects for continuous covariates. The smooth effects of continuous covariates are modelled by second-order random walk priors. Modelling and inference use the empirical Bayesian approach via a penalized likelihood technique. The methodology is applied to analyse the likelihood of neonatal deaths, using data from the 2000 Malawi demographic and health survey. The spatial effects are quantified through MRF and two-dimensional P-spline priors. RESULTS: Findings indicate that both fixed and spatial effects are associated with neonatal mortality. CONCLUSIONS: Our study therefore suggests that the challenge of reducing neonatal mortality goes beyond addressing individual factors, and also requires an understanding of unmeasured covariates for potentially effective interventions.

  10. Climate change impact assessment on Veneto and Friuli plain groundwater. Part I: An integrated modeling approach for hazard scenario construction

    Energy Technology Data Exchange (ETDEWEB)

    Baruffi, F. [Autorita di Bacino dei Fiumi dell' Alto Adriatico, Cannaregio 4314, 30121 Venice (Italy); Cisotto, A., E-mail: segreteria@adbve.it [Autorita di Bacino dei Fiumi dell' Alto Adriatico, Cannaregio 4314, 30121 Venice (Italy); Cimolino, A.; Ferri, M.; Monego, M.; Norbiato, D.; Cappelletto, M.; Bisaglia, M. [Autorita di Bacino dei Fiumi dell' Alto Adriatico, Cannaregio 4314, 30121 Venice (Italy); Pretner, A.; Galli, A. [SGI Studio Galli Ingegneria, via della Provvidenza 13, 35030 Sarmeola di Rubano (PD) (Italy); Scarinci, A., E-mail: andrea.scarinci@sgi-spa.it [SGI Studio Galli Ingegneria, via della Provvidenza 13, 35030 Sarmeola di Rubano (PD) (Italy); Marsala, V.; Panelli, C. [SGI Studio Galli Ingegneria, via della Provvidenza 13, 35030 Sarmeola di Rubano (PD) (Italy); Gualdi, S., E-mail: silvio.gualdi@bo.ingv.it [Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), via Augusto Imperatore 16, 73100 Lecce (Italy); Bucchignani, E., E-mail: e.bucchignani@cira.it [Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), via Augusto Imperatore 16, 73100 Lecce (Italy); Torresan, S., E-mail: torresan@cmcc.it [Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), via Augusto Imperatore 16, 73100 Lecce (Italy); Pasini, S., E-mail: sara.pasini@stud.unive.it [Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), via Augusto Imperatore 16, 73100 Lecce (Italy); Department of Environmental Sciences, Informatics and Statistics, University Ca' Foscari Venice, Calle Larga S. Marta 2137, 30123 Venice (Italy); Critto, A., E-mail: critto@unive.it [Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), via Augusto Imperatore 16, 73100 Lecce (Italy); Department of Environmental Sciences, Informatics and Statistics, University Ca' Foscari Venice, Calle Larga S. Marta 2137, 30123 Venice (Italy); and others

    2012-12-01

    The impact of climate change on water resources, particularly groundwater, is a highly debated topic worldwide, attracting international attention and interest from both researchers and policy makers due to its relevant link with European water policy directives (e.g. 2000/60/EC and 2007/118/EC) and related environmental objectives. The understanding of long-term impacts of climate variability and change is therefore a key challenge in order to address effective protection measures and to implement sustainable management of water resources. This paper presents the modeling approach adopted within the Life + project TRUST (Tool for Regional-scale assessment of groUndwater Storage improvement in adaptation to climaTe change) in order to provide climate change hazard scenarios for the shallow groundwater of high Veneto and Friuli Plain, Northern Italy. Given the aim of evaluating potential impacts on water quantity and quality (e.g. groundwater level variation, decrease of water availability for irrigation, variations of nitrate infiltration processes), the modeling approach integrated an ensemble of climate, hydrologic and hydrogeologic models running from the global to the regional scale. Global and regional climate models and downscaling techniques were used to make climate simulations for the reference period 1961-1990 and the projection period 2010-2100. The simulation of the recent climate was performed using observed radiative forcings, whereas the projections were made by prescribing the radiative forcings according to the IPCC A1B emission scenario. The climate simulations and downscaling then provided the precipitation, temperature and evapo-transpiration fields used for the impact analysis. Based on downscaled climate projections, 3 reference scenarios for the period 2071-2100 (i.e. the driest, the wettest and the mild year) were selected and used to run a regional geomorphoclimatic and hydrogeological model. The final output of the model ensemble produced

  11. Climate change impact assessment on Veneto and Friuli plain groundwater. Part I: An integrated modeling approach for hazard scenario construction

    International Nuclear Information System (INIS)

    The impact of climate change on water resources, particularly groundwater, is a highly debated topic worldwide, attracting international attention and interest from both researchers and policy makers due to its relevant link with European water policy directives (e.g. 2000/60/EC and 2007/118/EC) and related environmental objectives. The understanding of long-term impacts of climate variability and change is therefore a key challenge in order to address effective protection measures and to implement sustainable management of water resources. This paper presents the modeling approach adopted within the Life + project TRUST (Tool for Regional-scale assessment of groUndwater Storage improvement in adaptation to climaTe change) in order to provide climate change hazard scenarios for the shallow groundwater of high Veneto and Friuli Plain, Northern Italy. Given the aim of evaluating potential impacts on water quantity and quality (e.g. groundwater level variation, decrease of water availability for irrigation, variations of nitrate infiltration processes), the modeling approach integrated an ensemble of climate, hydrologic and hydrogeologic models running from the global to the regional scale. Global and regional climate models and downscaling techniques were used to make climate simulations for the reference period 1961–1990 and the projection period 2010–2100. The simulation of the recent climate was performed using observed radiative forcings, whereas the projections were made by prescribing the radiative forcings according to the IPCC A1B emission scenario. The climate simulations and downscaling then provided the precipitation, temperature and evapo-transpiration fields used for the impact analysis. Based on downscaled climate projections, 3 reference scenarios for the period 2071–2100 (i.e. the driest, the wettest and the mild year) were selected and used to run a regional geomorphoclimatic and hydrogeological model. The final output of the model ensemble

  12. Modeling protein density of states: additive hydrophobic effects are insufficient for calorimetric two-state cooperativity.

    Science.gov (United States)

    Chan, H S

    2000-09-01

    A well-established experimental criterion for two-state thermodynamic cooperativity in protein folding is that the van't Hoff enthalpy ΔHvH around the transition midpoint is equal, or very nearly so, to the calorimetric enthalpy ΔHcal of the entire transition. This condition is satisfied by many small proteins. We use simple lattice models to provide a statistical mechanical framework to elucidate how this calorimetric two-state picture may be reconciled with the hierarchical multistate scenario emerging from recent hydrogen exchange experiments. We investigate the feasibility of using inverse Laplace transforms to recover the underlying density of states (i.e., enthalpy distribution) from calorimetric data. We find that the constraint imposed by ΔHvH/ΔHcal ≈ 1 on densities of states of proteins is often more stringent than other "two-state" criteria proposed in recent theoretical studies. In conjunction with reasonable assumptions, the calorimetric two-state condition implies a narrow distribution of denatured-state enthalpies relative to the overall enthalpy difference between the native and the denatured conformations. This requirement does not always correlate with simple definitions of "sharpness" of a transition and has important ramifications for theoretical modeling. We find that protein models that assume capillarity cooperativity can exhibit overall calorimetric two-state-like behaviors. However, common heteropolymer models based on additive hydrophobic-like interactions, including highly specific two-dimensional Gō models, fail to produce proteinlike ΔHvH/ΔHcal ≈ 1. A simple model is constructed to illustrate a proposed scenario in which physically plausible local and nonlocal cooperative terms, which mimic helical cooperativity and environment-dependent hydrogen bonding strength, can lead to thermodynamic behaviors closer to experiment. Our results suggest that proteinlike thermodynamic

  13. Determinants of Low Birth Weight in Malawi: Bayesian Geo-Additive Modelling.

    Directory of Open Access Journals (Sweden)

    Alfred Ngwira

    Full Text Available Studies on factors of low birth weight in Malawi have neglected the flexible approach of using smooth functions for some covariates in models. Such a flexible approach reveals the detailed relationship of covariates with the response. The study aimed at investigating risk factors of low birth weight in Malawi by assuming a flexible approach for continuous covariates and a geographical random effect. A Bayesian geo-additive model for birth weight in kilograms and size of the child at birth (less than average, or average and higher), with district as a spatial effect, was adopted using the 2010 Malawi demographic and health survey data. A Gaussian model for birth weight in kilograms and a binary logistic model for the binary outcome (size of the child at birth) were fitted. Continuous covariates were modelled by penalized (P-)splines and spatial effects were smoothed by the two-dimensional P-spline. The study found that child birth order, mother weight and height are significant predictors of birth weight. Secondary education for the mother, birth order categories 2-3 and 4-5, wealth index of richer family and mother height were significant predictors of child size at birth. The area associated with low birth weight was Chitipa, and the areas with increased risk of less-than-average size at birth were Chitipa and Mchinji. The study found support for the flexible modelling of some covariates that clearly have nonlinear influences. Nevertheless, there is no strong support for the inclusion of geographical spatial analysis. The spatial patterns, though, point to the influence of omitted variables with some spatial structure, or possibly epidemiological processes that account for this spatial structure, and the maps generated could be used for targeting development efforts at a glance.

  14. A Model of Incentive Compatibility under Moral Hazard in Livestock Disease Outbreak Response

    OpenAIRE

    Gramig, Benjamin M.; Horan, Richard D.; Wolf, Christopher A.

    2005-01-01

    This paper uses a principal-agent model to examine incentive compatibility in the presence of information asymmetry between the government and individual producers. Prior models of livestock disease have not incorporated information asymmetry between livestock managers and social planners. By incorporating the asymmetry, we investigate the role of incentives in producer behavior that influences the duration and magnitude of a disease epidemic.

  15. Hazardous Chemicals

    Centers for Disease Control (CDC) Podcasts

    2007-04-10

    Chemicals are a part of our daily lives, providing many products and modern conveniences. With more than three decades of experience, the Centers for Disease Control and Prevention (CDC) has been at the forefront of efforts to assess people's exposure to environmental and hazardous chemicals and to protect against it. This report provides information about hazardous chemicals and useful tips on how to protect you and your family from harmful exposure.  Created: 4/10/2007 by CDC National Center for Environmental Health.   Date Released: 4/13/2007.

  16. Applications of the seismic hazard model of Italy: from a new building code to the L'Aquila trial against seismologists

    Science.gov (United States)

    Meletti, C.

    2013-05-01

    In 2003, a large national project for updating the seismic hazard map and the seismic zoning in Italy started, according to the rules fixed by an Ordinance of the Italian Prime Minister. New input elements for probabilistic seismic hazard assessment were compiled: the earthquake catalogue, the seismogenic zonation, the catalogue completeness, and a set of new attenuation relationships. The map of expected PGA on rock soil conditions with 10% probability of exceedance is the new reference seismic hazard map for Italy (http://zonesismiche.mi.ingv.it). Subsequently, nine further probabilities of exceedance, the uniform hazard spectra up to 2 s, and the disaggregation of the PGA were also released. A comprehensive seismic hazard model that fully describes the seismic hazard in Italy was then available, accessible via a webGis application (http://esse1-gis.mi.ingv.it/en.php). The detailed information made it possible to change the approach for evaluating the proper seismic action for design: from a zone-dependent approach (in Italy there were 4 seismic zones, each one with a single design spectrum) to a site-dependent approach, in which the design spectrum is defined at each site of a grid of about 11000 points covering the whole national territory. The new building code became mandatory only after the 6 April 2009 L'Aquila earthquake, the first strong event in Italy after the release of the seismic hazard map. The large number of recordings and the values of the experienced accelerations suggested comparisons between the recorded spectra and the spectra defined in the seismic codes. Even if such comparisons could be robust only after several consecutive 50-year periods of observation, and in a probabilistic approach it is not a single observation that can validate or invalidate the hazard estimate, some of the comparisons that can be undertaken between the observed ground motions and the hazard model used for the seismic code have been performed and have shown that the

  17. COMPUTERS HAZARDS

    Directory of Open Access Journals (Sweden)

    Andrzej Augustynek

    2007-01-01

    Full Text Available In June 2006, over 12.6 million Polish users of the Web were registered. On average, each of them spent 21 hours and 37 minutes monthly browsing the Web. That is why the problems of the psychological aspects of computer utilization have become an urgent research subject. The results of research into the development of the Polish information society, carried out at AGH University of Science and Technology under the leadership of Leslaw H. Haber from 2000 until the present time, indicate the emergence of dynamic changes in the ways computers are used and in their circumstances. One of the interesting regularities has been the inversely proportional relation between the level of computer skills and the frequency of Web utilization. It has been found that in 2005, compared to 2000, the following changes occurred:
    - a significant drop in the number of students who never used computers and the Web;
    - a remarkable increase in computer knowledge and skills (particularly pronounced in the case of first-year students);
    - a decreasing gap in computer skills between students of the first and the third year, and between male and female students;
    - the declining popularity of computer games.
    It has also been demonstrated that the hazard of computer screen addiction was highest in the case of unemployed youth outside the school system. As much as 12% of this group of young people were addicted to the computer. The large amount of leisure time that these youths enjoyed induced them to excessive use of the Web. Polish housewives are another population group at risk of addiction to the Web. The duration of long Web chats carried out by younger and younger youths has been another matter of concern. Since the phenomenon of computer addiction is relatively new, no specific therapy methods have been developed. In general, the therapy applied to computer addiction syndrome is similar to the techniques applied in cases of alcohol or gambling addiction. Individual and group

  18. Five Years Survival of Patients After Liver Transplantation and Its Effective Factors by Neural Network and Cox Proportional Hazard Regression Models

    Directory of Open Access Journals (Sweden)

    Khosravi

    2015-09-01

    Full Text Available Background Transplantation is the only treatment for patients with liver failure. Since the therapy imposes high expenses on the patients and the community, identification of the factors affecting the survival of such patients after transplantation is valuable. Objectives The current study attempted to model the survival of patients (two years old and above) after liver transplantation using neural network and Cox Proportional Hazards (Cox PH) regression models. The event is defined as death due to complications of liver transplantation. Patients and Methods In a historical cohort study, the clinical findings of 1168 patients who underwent liver transplant surgery (from March 2008 to March 2013) at Shiraz Namazee Hospital Organ Transplantation Center, Shiraz, Southern Iran, were used. To model the one- to five-year survival of such patients, a Cox PH regression model and a three-layer feed-forward artificial neural network (ANN) were applied to the data separately, and their prediction accuracy was compared using the area under the receiver operating characteristic curve (ROC). Furthermore, the Kaplan-Meier method was used to estimate the survival probabilities in different years. Results The estimated one- to five-year survival probabilities for the patients were 91%, 89%, 85%, 84%, and 83%, respectively. The areas under the ROC were 86.4% and 80.7% for the ANN and Cox PH models, respectively. In addition, the prediction accuracy of the ANN and Cox PH methods was equal, at 92.73%. Conclusions The present study found more accurate results for the ANN method compared to the Cox PH model in analyzing the survival of patients with liver transplantation. Furthermore, the order of effective factors in patients’ survival after transplantation was clinically more acceptable. A large dataset with few missing data was an advantage of this study, making the results more reliable.
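
    A rough sketch of the ANN-versus-Cox comparison on synthetic stand-in data, scoring both by the area under the ROC curve with lifelines and scikit-learn; the clinical covariates, the dichotomized one-year endpoint and all data-handling details below are assumptions, not the study's protocol.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)

    # Hypothetical post-transplant records: two stand-in covariates, a
    # survival time [years] and a one-year mortality flag.
    n = 1000
    age = rng.uniform(2.0, 70.0, n)
    meld = rng.uniform(6.0, 40.0, n)
    risk = 0.03 * age + 0.08 * meld
    time = rng.exponential(20.0 / np.exp(risk - risk.mean()))
    dead_1yr = (time < 1.0).astype(int)

    df = pd.DataFrame({"age": age, "meld": meld,
                       "time": np.minimum(time, 5.0),
                       "event": (time < 5.0).astype(int)})

    # Cox PH risk score vs. a small feed-forward network, both scored by
    # the AUC for one-year mortality.
    cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    cox_score = cox.predict_partial_hazard(df)

    ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0)
    ann.fit(df[["age", "meld"]], dead_1yr)
    ann_score = ann.predict_proba(df[["age", "meld"]])[:, 1]

    print("Cox AUC:", roc_auc_score(dead_1yr, cox_score))
    print("ANN AUC:", roc_auc_score(dead_1yr, ann_score))
    ```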

  19. Generalized Additive Models Used to Predict Species Abundance in the Gulf of Mexico: An Ecosystem Modeling Tool

    Science.gov (United States)

    Drexler, Michael; Ainsworth, Cameron H.

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries independent data set (SEAMAP) and climate scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duroarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data was inputted into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223

  20. Generalized additive models used to predict species abundance in the Gulf of Mexico: an ecosystem modeling tool.

    Directory of Open Access Journals (Sweden)

    Michael Drexler

    Full Text Available Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries independent data set (SEAMAP) and climate scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duroarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data was inputted into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist.
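
    A minimal sketch of the modelling setup using pyGAM and fabricated survey-like data; note that pyGAM offers a Poisson family, so the negative binomial distribution used in the paper would require, e.g., statsmodels or R's mgcv instead. All variable names and coefficients below are illustrative assumptions.

    ```python
    import numpy as np
    from pygam import PoissonGAM, s, f

    rng = np.random.default_rng(7)

    # Hypothetical trawl-survey rows: chlorophyll-a, depth [m],
    # temperature [C], a sediment class (0-2), and a zero-heavy count.
    n = 1000
    chla = rng.lognormal(0.0, 0.5, n)
    depth = rng.uniform(5.0, 200.0, n)
    temp = rng.uniform(12.0, 30.0, n)
    sediment = rng.integers(0, 3, n)
    mu = np.exp(0.5 * np.log(chla) - 0.01 * depth + 0.05 * temp - 1.0)
    counts = rng.poisson(mu)

    # Smooth terms for the continuous covariates, a factor term for
    # sediment type.
    X = np.column_stack([chla, depth, temp, sediment])
    gam = PoissonGAM(s(0) + s(1) + s(2) + f(3)).fit(X, counts)
    gam.summary()
    ```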

  1. A habitat suitability model for Chinese sturgeon determined using the generalized additive method

    Science.gov (United States)

    Yi, Yujun; Sun, Jie; Zhang, Shanghong

    2016-03-01

    The Chinese sturgeon is a type of large anadromous fish that migrates between the ocean and rivers. Because of the construction of dams, this sturgeon's migration path has been cut off, and the species is currently on the verge of extinction. Simulating suitable environmental conditions for spawning followed by repairing or rebuilding its spawning grounds are effective ways to protect this species. Various habitat suitability models based on expert knowledge have been used to evaluate the suitability of spawning habitat. In this study, a two-dimensional hydraulic simulation is used to inform a habitat suitability model based on the generalized additive method (GAM). In contrast to the expert-knowledge models, the GAM is fitted to observed data. The values of water depth and velocity are calculated first via the hydrodynamic model and later applied in the GAM. The final habitat suitability model is validated using the catch per unit effort (CPUEd) data of 1999 and 2003. The model results show that a velocity of 1.06-1.56 m/s and a depth of 13.33-20.33 m are highly suitable ranges for the Chinese sturgeon to spawn. The hydraulic habitat suitability indexes (HHSI) for seven discharges (4000; 9000; 12,000; 16,000; 20,000; 30,000; and 40,000 m3/s) are calculated to evaluate integrated habitat suitability. The results show that the integrated habitat suitability reaches its highest value at a discharge of 16,000 m3/s. This study is the first to apply a GAM to evaluate the suitability of spawning grounds for the Chinese sturgeon. The study provides a reference for the identification of potential spawning grounds in the entire basin.

  2. Earthquake forecasting and seismic hazard analysis: some insights on the testing phase and the modeling

    OpenAIRE

    Taroni, Matteo

    2014-01-01

    This thesis is divided in three chapters. In the first chapter we analyse the results of the world forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure to evaluate earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests that...

  3. Nonlinear feedback in a six-dimensional Lorenz model: impact of an additional heating term

    Science.gov (United States)

    Shen, B.-W.

    2015-12-01

    In this study, a six-dimensional Lorenz model (6DLM) is derived, based on a recent study using a five-dimensional (5-D) Lorenz model (LM), in order to examine the impact of an additional mode and its accompanying heating term on solution stability. The new mode added to improve the representation of the streamfunction is referred to as a secondary streamfunction mode, while the two additional modes, which appear in both the 6DLM and 5DLM but not in the original LM, are referred to as secondary temperature modes. Two energy conservation relationships of the 6DLM are first derived in the dissipationless limit. The impact of three additional modes on solution stability is examined by comparing numerical solutions and ensemble Lyapunov exponents of the 6DLM and 5DLM as well as the original LM. For the onset of chaos, the critical value of the normalized Rayleigh number (rc) is determined to be 41.1. The critical value is larger than that in the 3DLM (rc ~ 24.74), but slightly smaller than the one in the 5DLM (rc ~ 42.9). A stability analysis and numerical experiments obtained using generalized LMs, with or without simplifications, suggest the following: (1) negative nonlinear feedback in association with the secondary temperature modes, as first identified using the 5DLM, plays a dominant role in providing feedback for improving the solution's stability of the 6DLM, (2) the additional heating term in association with the secondary streamfunction mode may destabilize the solution, and (3) overall feedback due to the secondary streamfunction mode is much smaller than the feedback due to the secondary temperature modes; therefore, the critical Rayleigh number of the 6DLM is comparable to that of the 5DLM. The 5DLM and 6DLM collectively suggest different roles for small-scale processes (i.e., stabilization vs. destabilization), consistent with the following statement by Lorenz (1972): "If the flap of a butterfly's wings can be instrumental in generating a tornado, it can
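
    For orientation, the classic three-dimensional Lorenz model that the 5DLM and 6DLM extend can be integrated in a few lines with SciPy; the secondary temperature and streamfunction modes of the higher-dimensional models are not reproduced here.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz3d(t, state, sigma=10.0, r=28.0, b=8.0 / 3.0):
        """Right-hand side of the classic three-dimensional Lorenz model."""
        x, y, z = state
        return [sigma * (y - x), x * (r - z) - y, x * y - b * z]

    # r = 28 exceeds the 3DLM's critical normalized Rayleigh number
    # (rc ~ 24.74), so this trajectory is chaotic; the abstract reports
    # larger thresholds, rc ~ 42.9 for the 5DLM and rc ~ 41.1 for the 6DLM.
    sol = solve_ivp(lorenz3d, (0.0, 40.0), [1.0, 1.0, 1.0],
                    rtol=1e-9, atol=1e-12)
    print(sol.y[:, -1])
    ```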

  4. Nonlinear feedback in a six-dimensional Lorenz Model: impact of an additional heating term

    Directory of Open Access Journals (Sweden)

    B.-W. Shen

    2015-03-01

    Full Text Available In this study, a six-dimensional Lorenz model (6DLM) is derived, based on a recent study using a five-dimensional (5-D) Lorenz model (LM), in order to examine the impact of an additional mode and its accompanying heating term on solution stability. The new mode added to improve the representation of the streamfunction is referred to as a secondary streamfunction mode, while the two additional modes, which appear in both the 6DLM and 5DLM but not in the original LM, are referred to as secondary temperature modes. Two energy conservation relationships of the 6DLM are first derived in the dissipationless limit. The impact of three additional modes on solution stability is examined by comparing numerical solutions and ensemble Lyapunov exponents of the 6DLM and 5DLM as well as the original LM. For the onset of chaos, the critical value of the normalized Rayleigh number (rc) is determined to be 41.1. The critical value is larger than that in the 3DLM (rc ~ 24.74), but slightly smaller than the one in the 5DLM (rc ~ 42.9). A stability analysis and numerical experiments obtained using generalized LMs, with or without simplifications, suggest the following: (1) negative nonlinear feedback in association with the secondary temperature modes, as first identified using the 5DLM, plays a dominant role in providing feedback for improving the solution's stability of the 6DLM, (2) the additional heating term in association with the secondary streamfunction mode may destabilize the solution, and (3) overall feedback due to the secondary streamfunction mode is much smaller than the feedback due to the secondary temperature modes; therefore, the critical Rayleigh number of the 6DLM is comparable to that of the 5DLM. The 5DLM and 6DLM collectively suggest different roles for small-scale processes (i.e., stabilization vs. destabilization), consistent with the following statement by Lorenz (1972): "If the flap of a butterfly's wings can be instrumental in generating a tornado, it

  5. Applying the Land Use Portfolio Model to Estimate Natural-Hazard Loss and Risk - A Hypothetical Demonstration for Ventura County, California

    Science.gov (United States)

    Dinitz, Laura B.

    2008-01-01

    With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS

  6. Maximum magnitude (Mmax) in the central and eastern United States for the 2014 U.S. Geological Survey Hazard Model

    Science.gov (United States)

    Wheeler, Russell L.

    2016-01-01

    Probabilistic seismic‐hazard assessment (PSHA) requires an estimate of Mmax, the moment magnitude M of the largest earthquake that could occur within a specified area. Sparse seismicity hinders Mmax estimation in the central and eastern United States (CEUS) and tectonically similar regions worldwide (stable continental regions [SCRs]). A new global catalog of moderate‐to‐large SCR earthquakes is analyzed with minimal assumptions about enigmatic geologic controls on SCR Mmax. An earlier observation that SCR earthquakes of M 7.0 and larger occur in young (250–23 Ma) passive continental margins and associated rifts but not in cratons is not strongly supported by the new catalog. SCR earthquakes of M 7.5 and larger are slightly more numerous and reach slightly higher M in young passive margins and rifts than in cratons. However, overall histograms of M from young margins and rifts and from cratons are statistically indistinguishable. This conclusion is robust under uncertainties in M, the locations of SCR boundaries, and which of two available global SCR catalogs is used. The conclusion stems largely from recent findings that (1) large southeast Asian earthquakes once thought to be SCR were in actively deforming crust and (2) long escarpments in cratonic Australia were formed by prehistoric faulting. The 2014 seismic‐hazard model of the U.S. Geological Survey represents CEUS Mmax as four‐point probability distributions. The distributions have weighted averages of M 7.0 in cratons and M 7.4 in passive margins and rifts. These weighted averages are consistent with Mmax estimates of other SCR PSHAs of the CEUS, southeastern Canada, Australia, and India.

  7. Modeling of the Tohoku-oki 2011 tsunami coastal hazard: effects of a mixed co-seismic and seabed failure source

    Science.gov (United States)

    Grilli, S. T.; Harris, J. C.; Tajali Bakhsh, T. S.; Tappin, D. R.; Masterlark, T.; Kirby, J. T.; Shi, F.; Ma, G.

    2012-12-01

The devastating coastal impact of the 2011 Tohoku-oki tsunami cannot at present be fully explained from a pure co-seismic source. Indeed, no numerical simulation solely based on a source resulting from seismic or geodetic data inversion has been able to reproduce the 40+ m tsunami runup heights measured along the (Sanriku) coast of northern Honshu, nor the higher frequency wave periods (3-4 min.) recorded at offshore buoys (both GPS and DART). Understanding the origin of such extreme coastal impact is key to proper tsunami hazard assessment for future events in this and other similar areas around the world. Here, we perform a detailed analysis of geological, seismic, geodetic and tsunami data and use the best available 3D hydrodynamic and long wave Boussinesq models to simulate the tsunami generated from the combination of: (i) a new co-seismic source based on a detailed three-dimensional (3D) Finite Element Modeling (FEM) of the heterogeneous subduction zone, with geodetic data assimilation; and (ii) an additional tsunami source from a large seabed failure, seismically triggered to the north of the main rupture, with a 2-minute time delay. We show that the multi-source tsunami agrees well with all the available field observations, both offshore and onshore.

  8. A dynamic approach for the impact of a toxic gas dispersion hazard considering human behaviour and dispersion modelling.

    Science.gov (United States)

    Lovreglio, Ruggiero; Ronchi, Enrico; Maragkos, Georgios; Beji, Tarek; Merci, Bart

    2016-11-15

The release of toxic gases due to natural/industrial accidents or terrorist attacks in populated areas can have tragic consequences. To prevent and evaluate the effects of these disasters, different approaches and modelling tools have been introduced in the literature. These instruments are valuable tools for risk managers performing risk assessments of threatened areas. Despite the significant improvements in hazard assessment in the case of toxic gas dispersion, these analyses do not generally include the impact of human behaviour and people movement during emergencies. This work aims at providing an approach which considers both modelling of gas dispersion and evacuation movement in order to improve the accuracy of risk assessment for disasters involving toxic gases. The approach is applied to a hypothetical scenario including a ship releasing nitrogen dioxide (NO2) on a crowd attending a music festival. The difference between the results obtained with existing static methods (people do not move) and a dynamic approach (people move away from the danger) which considers people movement with different degrees of sophistication (either a simple linear path or more complex behavioural modelling) is discussed. PMID:27343142
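
    The static-versus-dynamic distinction can be illustrated with a toy exposure calculation; in this hedged Python sketch an arbitrary concentration field and walking speed stand in for the paper's dispersion and evacuation models.

        import numpy as np

        # Hypothetical concentration field decaying with distance r (m)
        # from the release point; arbitrary units.
        def concentration(r):
            return 100.0 * np.exp(-r / 50.0)

        t = np.linspace(0.0, 600.0, 601)          # 10 minutes, 1 s steps

        # Static receptor: person stays 20 m from the source.
        dose_static = np.trapz(concentration(np.full_like(t, 20.0)), t)

        # Dynamic receptor: person starts evacuating at t = 60 s at 1.5 m/s.
        r_dynamic = 20.0 + np.clip(t - 60.0, 0.0, None) * 1.5
        dose_dynamic = np.trapz(concentration(r_dynamic), t)

        print(f"static dose:  {dose_static:.0f}")
        print(f"dynamic dose: {dose_dynamic:.0f}")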

  9. Independent screening for single-index hazard rate models with ultrahigh dimensional features

    DEFF Research Database (Denmark)

    Gorst-Rasmussen, Anders; Scheike, Thomas

    2013-01-01

In data sets with many more features than observations, independent screening based on all univariate regression models leads to a computationally convenient variable selection method. Recent efforts have shown that, in the case of generalized linear models, independent screening may suffice to capture all relevant features with high probability, even in ultrahigh dimension. It is unclear whether this formal sure screening property is attainable when the response is a right-censored survival time. We propose a computationally very efficient independent screening method for survival data which...
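
    A generic version of such marginal screening can be sketched as follows; this stand-in ranks features by their univariate concordance with the censored times (using the lifelines library) rather than the authors' additive-hazards-based statistic.

        import numpy as np
        from lifelines.utils import concordance_index

        rng = np.random.default_rng(0)

        # Synthetic ultrahigh-dimensional survival data: only feature 0 carries signal.
        n, p = 200, 1000
        X = rng.standard_normal((n, p))
        t_event = rng.exponential(scale=np.exp(-X[:, 0]))   # shorter life for large X0
        t_cens = rng.exponential(scale=2.0, size=n)
        T = np.minimum(t_event, t_cens)
        E = t_event <= t_cens                               # event indicator

        # Marginal screen: score each feature by how far its univariate
        # concordance with the censored times departs from 0.5.
        scores = np.array([abs(concordance_index(T, -X[:, j], E) - 0.5)
                           for j in range(p)])
        top10 = np.argsort(scores)[::-1][:10]
        print("top-ranked features:", top10)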

  10. Modeling the use of sulfate additives for potassium chloride destruction in biomass combustion

    DEFF Research Database (Denmark)

    Wu, Hao; Grell, Morten Nedergaard; Jespersen, Jacob Boll;

    2013-01-01

Potassium chloride, KCl, formed from biomass combustion may lead to ash deposition and corrosion problems in boilers. Sulfates are effective additives for converting KCl to the less harmful K2SO4. In the present study, the decomposition of ammonium sulfate, aluminum sulfate and ferric sulfate was studied respectively in a fast-heating rate thermogravimetric analyzer (TGA) for deriving a kinetic model. The yields of SO2 and SO3 from the decomposition were studied in a tube reactor, revealing that the ratio of SO3/SO2 released varied for the different sulfates, and for ammonium sulfate the ratio ... rate of aluminum. Under the boiler conditions of the present work, the simulation results suggested that the desirable temperature for ferric sulfate injection was around 950-900°C, whereas for ammonium sulfate the preferable injection temperature was below 800°C.
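
    A minimal sketch of the kind of kinetic model derived from fast-heating-rate TGA data is given below; the Arrhenius parameters and heating rate are illustrative assumptions, not the fitted values from the study.

        import numpy as np
        from scipy.integrate import solve_ivp

        # First-order Arrhenius decomposition under a constant heating rate
        # (all parameter values hypothetical).
        A, Ea, R = 1.0e8, 1.5e5, 8.314        # 1/s, J/mol, J/(mol K)
        beta, T0 = 0.5, 400.0                 # heating rate K/s, start K

        def dalpha_dt(t, y):
            T = T0 + beta * t                 # fast-heating-rate TGA programme
            k = A * np.exp(-Ea / (R * T))
            return [k * (1.0 - y[0])]         # first-order conversion model

        sol = solve_ivp(dalpha_dt, (0.0, 2000.0), [0.0], dense_output=True)
        for t in (500.0, 1000.0, 1500.0):
            print(f"t={t:6.0f} s, T={T0 + beta*t:5.0f} K, conversion={sol.sol(t)[0]:.3f}")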

  11. The Additive Risk Model for Estimation of Effect of Haplotype Match in BMT Studies

    DEFF Research Database (Denmark)

    Scheike, Thomas; Martinussen, T; Zhang, MJ

    2011-01-01

In this article we consider a problem from bone marrow transplant (BMT) studies where there is interest in assessing the effect of haplotype match for donor and patient on the overall survival. The BMT study we consider is based on donors and patients that are genotype matched, and this therefore leads to a missing data problem. We show how Aalen's additive risk model can be applied in this setting, with the benefit that the time-varying haplomatch effect can be easily studied. This problem has not been considered before, and the standard approach where one would use the expectation-maximization (EM) ... be developed using product-integration theory. Small sample properties are investigated using simulations in a setting that mimics the motivating haplomatch problem.
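
    For readers who want to experiment with Aalen's additive risk model itself, the sketch below fits it with the lifelines library on a bundled toy dataset; this is a generic illustration and does not implement the authors' estimator for missing haplotype data.

        import pandas as pd
        from lifelines import AalenAdditiveFitter
        from lifelines.datasets import load_regression_dataset

        df = load_regression_dataset()            # columns: var1..var3, T, E

        aaf = AalenAdditiveFitter(coef_penalizer=0.1)
        aaf.fit(df, duration_col="T", event_col="E")

        # Cumulative regression coefficients B_j(t); their slopes are the
        # time-varying additive effects beta_j(t).
        print(aaf.cumulative_hazards_.head())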

  12. Estimation of the lag time in a subsequent monomer addition model for fibril elongation.

    Science.gov (United States)

    Shoffner, Suzanne K; Schnell, Santiago

    2016-08-01

    Fibrillogenesis, the production or development of protein fibers, has been linked to protein folding diseases. The progress curve of fibrils or aggregates typically takes on a sigmoidal shape with a lag phase, a rapid growth phase, and a final plateau regime. The study of the lag phase and the estimation of its critical timescale provide insight into the factors regulating the fibrillation process. However, methods to estimate a quantitative expression for the lag time rely on empirical expressions, which cannot connect the lag time to kinetic parameters associated with the reaction mechanisms of protein fibrillation. Here we introduce an approach for the estimation of the lag time using the governing rate equations of the elementary reactions of a subsequent monomer addition model for protein fibrillation as a case study. We show that the lag time is given by the sum of the critical timescales for each fibril intermediate in the subsequent monomer addition mechanism and therefore reveals causal connectivity between intermediate species. Furthermore, we find that single-molecule assays of protein fibrillation can exhibit a lag phase without a nucleation process, while dyes and extrinsic fluorescent probe bulk assays of protein fibrillation do not exhibit an observable lag phase during template-dependent elongation. Our approach could be valuable for investigating the effects of intrinsic and extrinsic factors to the protein fibrillation reaction mechanism and provides physicochemical insights into parameters regulating the lag phase. PMID:27250246
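
    The headline result, lag time as a sum of per-intermediate critical timescales, can be illustrated in a few lines of Python; the rate constants and monomer concentration below are hypothetical.

        import numpy as np

        # Sequential monomer-addition sketch: each intermediate contributes a
        # pseudo-first-order timescale 1/(k_i * m), and the lag time is their sum.
        m = 10.0                                   # free monomer concentration, uM
        k = np.array([0.05, 0.02, 0.01, 0.04])     # addition rate constants, 1/(uM s)

        timescales = 1.0 / (k * m)                 # per-intermediate timescales, s
        print(f"per-step timescales (s): {np.round(timescales, 1)}")
        print(f"estimated lag time: {timescales.sum():.1f} s")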

  13. A New On-Land Seismogenic Structure Source Database from the Taiwan Earthquake Model (TEM) Project for Seismic Hazard Analysis of Taiwan

    Directory of Open Access Journals (Sweden)

    J. Bruce H. Shyu

    2016-09-01

Full Text Available Taiwan is located at an active plate boundary and is prone to earthquake hazards. To evaluate the island’s seismic risk, the Taiwan Earthquake Model (TEM) project, supported by the Ministry of Science and Technology, evaluates earthquake hazard, risk, and related social and economic impact models for Taiwan through multidisciplinary collaboration. One of the major tasks of TEM is to construct a complete and updated seismogenic structure database for Taiwan to assess future seismic hazards. Toward this end, we have combined information from pre-existing databases and data obtained from new analyses to build an updated and digitized three-dimensional seismogenic structure map for Taiwan. Thirty-eight on-land active seismogenic structures are identified. For detailed information on individual structures, such as their long-term slip rates and potential recurrence intervals, we collected data from existing publications and calculated values from the results of our own field surveys and investigations. We hope this updated database will become a significant constraint for seismic hazard assessment calculations in Taiwan and will provide important information for engineers and hazard mitigation agencies.

  14. Supra-additive effects of tramadol and acetaminophen in a human pain model.

    Science.gov (United States)

    Filitz, Jörg; Ihmsen, Harald; Günther, Werner; Tröster, Andreas; Schwilden, Helmut; Schüttler, Jürgen; Koppert, Wolfgang

    2008-06-01

The combination of analgesic drugs with different pharmacological properties may show better efficacy with fewer side effects. The aim of this study was to examine the analgesic and antihyperalgesic properties of the weak opioid tramadol and the non-opioid acetaminophen, alone as well as in combination, in an experimental pain model in humans. After approval of the local Ethics Committee, 17 healthy volunteers were enrolled in this double-blind, placebo-controlled, cross-over study. Transcutaneous electrical stimulation at high current densities (29.6+/-16.2 mA) induced spontaneous acute pain (NRS=6 of 10) and distinct areas of hyperalgesia for painful mechanical stimuli (pinprick hyperalgesia). Pain intensities as well as the extent of the areas of hyperalgesia were assessed before, during and 150 min after a 15-min intravenous infusion of acetaminophen (650 mg), tramadol (75 mg), a combination of both (325 mg acetaminophen and 37.5 mg tramadol), or saline 0.9%. Tramadol led to a maximum pain reduction of 11.7+/-4.2% with negligible antihyperalgesic properties. In contrast, acetaminophen led to a similar pain reduction (9.8+/-4.4%) but a sustained antihyperalgesic effect (34.5+/-14.0% reduction of the hyperalgesic area). The combination of both analgesics at half doses led to a supra-additive pain reduction of 15.2+/-5.7% and an enhanced antihyperalgesic effect (41.1+/-14.3% reduction of the hyperalgesic areas) as compared to single administration of acetaminophen. Our study provides first results on interactions of tramadol and acetaminophen on experimental pain and hyperalgesia in humans. Pharmacodynamic modeling combined with the isobolographic technique showed supra-additive effects of the combination of acetaminophen and tramadol concerning both analgesia and antihyperalgesia. The results might act as a rationale for combining both analgesics. PMID:17709207
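
    The isobolographic idea can be sketched numerically: for a combination (d1, d2) and single-drug equi-effective doses (D1, D2), the interaction index d1/D1 + d2/D2 equals 1 under additivity and falls below 1 when the combination is supra-additive. A hedged sketch using the study's nominal doses (the actual analysis used pharmacodynamic modelling, not this bare calculation):

        # Nominal doses from the study; equi-effective doses are assumed,
        # not estimated here.
        D_acet = 650.0    # acetaminophen dose alone reaching the effect, mg
        D_tram = 75.0     # tramadol dose alone reaching the same effect, mg
        d_acet = 325.0    # acetaminophen dose in the combination, mg
        d_tram = 37.5     # tramadol dose in the combination, mg

        interaction_index = d_acet / D_acet + d_tram / D_tram
        # index == 1: additive; < 1: supra-additive; > 1: infra-additive.
        # Here the half-dose combination produced a *larger* effect than the
        # single drugs, so the true equi-effective combination doses are
        # smaller than assumed above and the true index falls below 1.
        print(f"interaction index: {interaction_index:.2f}")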

  15. Measures to assess the prognostic ability of the stratified Cox proportional hazards model

    DEFF Research Database (Denmark)

    (Tybjaerg-Hansen, A.) The Fibrinogen Studies Collaboration.The Copenhagen City Heart Study; Tybjærg-Hansen, Anne

    2009-01-01

(Schemper and Henderson's V, Harrell's C-index and Royston and Sauerbrei's D), adapted them for use with the stratified CPH model and demonstrated how their values can be represented over time. Although each of these measures is promising in principle, we found the measure of explained variation V very...
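
    As a pointer for readers, Harrell's C-index, one of the measures discussed, can be computed directly with the lifelines library; the numbers below are toy values, not the study's data.

        from lifelines.utils import concordance_index

        event_times = [5, 10, 14, 20, 30]          # observed times
        risk_scores = [0.9, 0.7, 0.8, 0.3, 0.1]    # higher score = higher risk
        events = [1, 1, 0, 1, 1]                   # 1 = event, 0 = censored

        # concordance_index expects scores where larger values mean longer
        # survival, hence the sign flip on the risk scores.
        c = concordance_index(event_times, [-r for r in risk_scores], events)
        print(f"Harrell's C-index: {c:.2f}")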

  16. Citizens' Perceptions of Flood Hazard Adjustments: An Application of the Protective Action Decision Model

    Science.gov (United States)

    Terpstra, Teun; Lindell, Michael K.

    2013-01-01

Although research indicates that adoption of flood preparations among Europeans is low, only a few studies have attempted to explain citizens' preparedness behavior. This article applies the Protective Action Decision Model (PADM) to explain flood preparedness intentions in the Netherlands. Survey data (N = 1,115) showed that…

  17. 75 FR 29587 - Notice of Availability of Revised Model Proposed No Significant Hazards Consideration...

    Science.gov (United States)

    2010-05-26

    ... of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, Washington, DC, 20555-0001... Processes Branch, Division of Policy and Rulemaking, Office of Nuclear Reactor Regulation. Revised Model... with the confidence in the ability of the fission product barriers (i.e., fuel cladding,...

  18. Life-Stage Physiologically-Based Pharmacokinetic (PBPK) Model Applications to Screen Environmental Hazards.

    Science.gov (United States)

    This presentation discusses methods used to extrapolate from in vitro high-throughput screening (HTS) toxicity data for an endocrine pathway to in vivo for early life stages in humans, and the use of a life stage PBPK model to address rapidly changing physiological parameters. A...

  19. Models for recurrent gas release event behavior in hazardous waste tanks

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, D.N. [Pacific Northwest Lab., Richland, WA (United States); Arnold, B.C. [California Univ., Riverside, CA (United States). Dept. of Statistics

    1994-08-01

Certain radioactive waste storage tanks at the United States Department of Energy Hanford facilities continuously generate gases as a result of radiolysis and chemical reactions. The congealed sludge in these tanks traps the gases and causes the level of the waste within the tanks to rise. The waste level continues to rise until the sludge becomes buoyant and 'rolls over', changing places with heavier fluid on top. During a rollover, the trapped gases are released, resulting in a sudden drop in the waste level. This is known as a gas release event (GRE). After a GRE, the waste re-congeals and gas again accumulates, leading to another GRE. We present nonlinear time series models that produce simulated sample paths that closely resemble the temporal history of waste levels in these tanks. The models also imitate the random GRE behavior observed in the temporal waste level history of a storage tank. We are interested in using the structure of these models to understand the probabilistic behavior of the random variable 'time between consecutive GREs'. Understanding the stochastic nature of this random variable is important because the hydrogen and nitrous oxide gases released in a GRE are flammable and the ammonia that is released is a health risk. From a safety perspective, activity around such waste tanks should be halted when a GRE is imminent. With credible GRE models, we can establish time windows in which waste tank research and maintenance activities can be safely performed.
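
    A toy sawtooth version of this level-rise-and-collapse process, with a random rollover threshold, can be simulated in a few lines; it is only a caricature of the nonlinear time series models the authors fit, with all parameter values hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        rise_rate = 0.02                         # waste-level rise per day
        gre_times, t = [], 0.0
        for _ in range(6):
            threshold = rng.normal(1.0, 0.15)    # random buoyancy threshold
            t += threshold / rise_rate           # days until the next rollover
            gre_times.append(t)

        inter_gre = np.diff([0.0] + gre_times)
        print("times between consecutive GREs (days):", np.round(inter_gre, 1))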

  20. Applying Distributed, Coupled Hydrological Slope-Stability Models for Landslide Hazard Assessments

    Science.gov (United States)

    Godt, J. W.; Baum, R. L.; Lu, N.; Savage, W. Z.; McKenna, J. P.

    2006-12-01

Application of distributed, coupled hydrological slope-stability models requires knowledge of hydraulic and material-strength properties at the scale of landslide processes. We describe results from a suite of laboratory and field tests that were used to define the soil-water characteristics of landslide-prone colluvium on the steep coastal bluffs in the Seattle, Washington area and then use these results in a coupled model. Many commonly used tests to determine soil-water characteristics are performed for the drying process. Because most soils display a pronounced hysteresis in the relation between moisture content and matric suction, results from such tests may not accurately describe the soil-water characteristics for the wetting process during rainfall infiltration. Open-tube capillary-rise and constant-flow permeameter tests on bluff colluvium were performed in the laboratory to determine the soil-water characteristic curves (SWCC) and unsaturated hydraulic conductivity functions (HCF) for the wetting process. Field tests using a borehole permeameter were used to determine the saturated hydraulic conductivity of colluvial materials. Measurements of pore-water response to rainfall were used in an inverse numerical modeling procedure to determine the in-situ hydraulic parameters of hillside colluvium at the scale of the instrument installation. Comparison of laboratory and field results shows that although both techniques generally produce SWCCs and HCFs with similar shapes, differences in bulk density among field and lab tests yield differences in saturated moisture content and saturated hydraulic conductivity. We use these material properties in an application of a new version of a distributed transient slope stability model (TRIGRS) that accounts for the effects of the unsaturated zone on the infiltration process. Applied over a LiDAR-based digital landscape of part of the Seattle area for an hourly rainfall history known to trigger shallow landslides, the
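
    As a pointer to the kind of parametric curve involved, the sketch below evaluates a van Genuchten soil-water characteristic curve, a common SWCC form; the paper measured wetting-branch curves directly, and the parameter values here are illustrative.

        import numpy as np

        # van Genuchten SWCC sketch (illustrative parameters).
        theta_r, theta_s = 0.05, 0.40      # residual and saturated water content
        alpha, n_vg = 0.5, 2.0             # 1/kPa, shape parameter
        m = 1.0 - 1.0 / n_vg

        def theta(psi):
            """Volumetric water content at matric suction psi (kPa)."""
            return theta_r + (theta_s - theta_r) / (1 + (alpha * psi) ** n_vg) ** m

        for psi in (0.1, 1.0, 10.0, 100.0):
            print(f"suction {psi:6.1f} kPa -> theta = {theta(psi):.3f}")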

  1. 77 FR 59675 - Compliance With Information Request, Flooding Hazard Reevaluation

    Science.gov (United States)

    2012-09-28

    ... COMMISSION Compliance With Information Request, Flooding Hazard Reevaluation AGENCY: Nuclear Regulatory... was needed in the areas of seismic and flooding design, and emergency preparedness. In addition to... licensees reevaluate flooding hazards at nuclear power plant sites using updated flooding hazard...

  2. Modelling of C2 addition route to the formation of C60

    CERN Document Server

    Khan, Sabih D

    2016-01-01

    To understand the phenomenon of fullerene growth during its synthesis, an attempt is made to model a minimum energy growth route using a semi-empirical quantum mechanics code. C2 addition leading to C60 was modelled and three main routes, i.e. cyclic ring growth, pentagon and fullerene road, were studied. The growth starts with linear chains and, at n = 10, ring structures begins to dominate. The rings continue to grow and, at some point n > 30, they transform into close-cage fullerenes and the growth is shown to progress by the fullerene road until C60 is formed. The computer simulations predict a transition from a C38 ring to fullerene. Other growth mechanisms could also occur in the energetic environment commonly encountered in fullerene synthesis, but our purpose was to identify a minimal energy route which is the most probable structure. Our results also indicate that, at n = 20, the corannulene structure is energetically more stable than the corresponding fullerene and graphene sheet, however a ring str...

  3. Uniting Mandelbrot’s Noah and Joseph Effects in Toy Models of Natural Hazard Time Series

    Science.gov (United States)

    Credgington, D.; Watkins, N. W.; Chapman, S. C.; Rosenberg, S. J.; Sanchez, R.

    2009-12-01

The forecasting of extreme events is a highly topical, cross-disciplinary problem. One aspect which is potentially tractable even when the events themselves are stochastic is the probability of a “burst” of a given size and duration, defined as the area between a time series and a constant threshold. Many natural time series depart from the simplest, Brownian, case and in the 1960s Mandelbrot developed the use of fractals to describe these departures. In particular he proposed two kinds of fractal model to capture the way in which natural data is often persistent in time (his “Joseph effect”, common in hydrology and exemplified by fractional Brownian motion) and/or prone to heavy tailed jumps (the “Noah effect”, typical of economic index time series, for which he gave Levy flights as an exemplar). Much of the earlier modelling, however, has emphasised one of the Noah and Joseph parameters (the tail exponent mu, or one derived from the temporal behaviour such as the power spectral exponent beta) at the other's expense. I will describe work [1] in which we applied a simple self-affine stable model, linear fractional stable motion (LFSM), which unifies both effects to better describe natural data, in this case from space physics. I will show how we have resolved some contradictions seen in earlier work, where purely Joseph or Noah descriptions had been sought. I will also show recent work [2] using numerical simulations of LFSM and simple analytic scaling arguments to study the problem of the area between a fractional Levy model time series and a threshold. [1] Watkins et al, Space Science Reviews [2005]. [2] Watkins et al, Physical Review E [2009].
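
    The burst definition is easy to make concrete: in the sketch below, a plain Brownian walk (not the LFSM of the talk) is scanned for excursions above a threshold, and the area and duration of each burst are collected.

        import numpy as np

        rng = np.random.default_rng(2)

        # A "burst" is the area between the series and a constant threshold
        # over an excursion during which the series stays above it.
        x = np.cumsum(rng.standard_normal(10_000))   # plain Brownian walk
        threshold = 5.0

        idx = np.flatnonzero(x > threshold)
        if idx.size:
            runs = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
            areas = [np.sum(x[r] - threshold) for r in runs]
            durations = [r.size for r in runs]
            print(f"{len(runs)} bursts; largest area {max(areas):.0f}, "
                  f"longest duration {max(durations)} steps")
        else:
            print("series never exceeded the threshold")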

  4. Probabilistic seismic hazard in the San Francisco Bay area based on a simplified viscoelastic cycle model of fault interactions

    Science.gov (United States)

    Pollitz, F.F.; Schwartz, D.P.

    2008-01-01

We construct a viscoelastic cycle model of plate boundary deformation that includes the effect of time-dependent interseismic strain accumulation, coseismic strain release, and viscoelastic relaxation of the substrate beneath the seismogenic crust. For a given fault system, time-averaged stress changes at any point (not on a fault) are constrained to zero; that is, kinematic consistency is enforced for the fault system. The dates of last rupture, mean recurrence times, and the slip distributions of the (assumed) repeating ruptures are key inputs into the viscoelastic cycle model. This simple formulation allows construction of the stress evolution at all points in the plate boundary zone for purposes of probabilistic seismic hazard analysis (PSHA). The stress evolution is combined with a Coulomb failure stress threshold at representative points on the fault segments to estimate the times of their respective future ruptures. In our PSHA we consider uncertainties in a four-dimensional parameter space: the rupture periodicities, slip distributions, times of last earthquake (for prehistoric ruptures) and Coulomb failure stress thresholds. We apply this methodology to the San Francisco Bay region using a recently determined fault chronology of area faults. Assuming single-segment rupture scenarios, we find that future rupture probabilities of area faults in the coming decades are highest for the southern Hayward, Rodgers Creek, and northern Calaveras faults. This conclusion is qualitatively similar to that of the Working Group on California Earthquake Probabilities, but the probabilities derived here are significantly higher. Given that fault rupture probabilities are highly model-dependent, no single model should be used to assess time-dependent rupture probabilities. We suggest that several models, including the present one, be used in a comprehensive PSHA methodology, as was done by the Working Group on California Earthquake Probabilities.
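
    The core stress-renewal logic can be caricatured in a few lines: linear interseismic stress accumulation, stress steps from neighbouring events, and rupture when a Coulomb threshold is crossed. All parameter values below are hypothetical, and the paper's model additionally includes viscoelastic relaxation.

        import numpy as np

        stressing_rate = 0.02        # MPa/yr on the fault segment
        t_last = 1868.0              # year of last rupture
        threshold = 3.0              # Coulomb failure stress threshold, MPa
        coseismic_steps = {1906: +0.2, 1989: -0.1}   # stress transferred by
                                                     # nearby earthquakes, MPa

        def stress(year):
            s = stressing_rate * (year - t_last)
            s += sum(ds for yr, ds in coseismic_steps.items() if yr <= year)
            return s

        years = np.arange(1869, 2101)
        failure_year = next(y for y in years if stress(y) >= threshold)
        print(f"predicted rupture year: {failure_year}")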

  5. Modeling of hazardous air pollutant removal in the pulsed corona discharge

    Energy Technology Data Exchange (ETDEWEB)

    Derakhshesh, Marzie [Department of Chemical and Petroleum Engineering, University of Calgary, Schulich School of Engineering, 2500 University Drive, N.W., Calgary, AB, T2N 1N4 (Canada); Abedi, Jalal [Department of Chemical and Petroleum Engineering, University of Calgary, Schulich School of Engineering, 2500 University Drive, N.W., Calgary, AB, T2N 1N4 (Canada)], E-mail: jabedi@ucalgary.ca; Omidyeganeh, Mohammad [Department of Chemical and Petroleum Engineering, University of Calgary, Schulich School of Engineering, 2500 University Drive, N.W., Calgary, AB, T2N 1N4 (Canada)

    2009-03-09

This study investigated the effects of two parts of the performance equation of the pulsed corona reactor, one of the atmospheric-pressure non-thermal plasma processing tools for eliminating pollutants from gas streams. First, the effect of axial dispersion in the diffusion term and then the effect of different orders of the reaction in the decomposition rate term were considered. The mathematical model was primarily developed to predict the effluent concentration of the pulsed corona reactor using a mass balance that considers axial dispersion, linear velocity and the decomposition rate of the pollutant. The steady-state form of this equation was subsequently solved assuming different reaction orders. For the derivation of the performance equation of the reactor, it was assumed that the decomposition rate of the pollutant was directly proportional to the discharge power and the concentration of the pollutant. The results were validated and compared with another published model using its experimental data. The model developed in this study was also validated with two other experimental data sets in the literature for N2O.
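
    The steady-state performance equation described here, axial dispersion plus convection plus an n-th order decomposition term, can be solved numerically as a boundary value problem. The sketch below uses Danckwerts boundary conditions and illustrative parameter values; it is a generic stand-in, not the paper's fitted model.

        import numpy as np
        from scipy.integrate import solve_bvp

        # D C'' - u C' - k C^n = 0 on [0, L], Danckwerts (closed-closed) BCs.
        D, u, k, n_ord, L, C_in = 5e-3, 0.1, 0.5, 1.0, 0.5, 1.0

        def odes(x, y):
            C, dC = y
            return np.vstack([dC, (u * dC + k * np.abs(C) ** n_ord) / D])

        def bcs(ya, yb):
            return np.array([u * C_in - (u * ya[0] - D * ya[1]),  # inlet flux
                             yb[1]])                              # zero exit gradient

        x = np.linspace(0.0, L, 101)
        y_guess = np.vstack([np.full_like(x, C_in), np.zeros_like(x)])
        sol = solve_bvp(odes, bcs, x, y_guess)
        print(f"effluent concentration C(L) = {sol.y[0, -1]:.4f} (relative)")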

  6. A combined M5P tree and hazard-based duration model for predicting urban freeway traffic accident durations.

    Science.gov (United States)

    Lin, Lei; Wang, Qian; Sadek, Adel W

    2016-06-01

The duration of freeway traffic accidents is an important factor, which affects traffic congestion, environmental pollution, and secondary accidents. Among previous studies, the M5P algorithm has been shown to be an effective tool for predicting incident duration. M5P builds a tree-based model, like the traditional classification and regression tree (CART) method, but with multiple linear regression models as its leaves. The problem with M5P for accident duration prediction, however, is that whereas linear regression assumes that the conditional distribution of accident durations is normally distributed, the distribution for a "time-to-an-event" is almost certainly nonsymmetrical. A hazard-based duration model (HBDM) is a better choice for this kind of "time-to-event" modeling scenario, and given this, HBDMs have previously been applied to analyze and predict traffic accident durations. Previous research, however, has not yet applied HBDMs for accident duration prediction in association with clustering or classification of the dataset to minimize data heterogeneity. The current paper proposes a novel approach for accident duration prediction, which improves on the original M5P tree algorithm through the construction of a M5P-HBDM model, in which the leaves of the M5P tree model are HBDMs instead of linear regression models. Such a model offers the advantage of minimizing data heterogeneity through dataset classification, and avoids the need for the incorrect assumption of normality for traffic accident durations. The proposed model was then tested on two freeway accident datasets. For each dataset, the first 500 records were used to train the following three models: (1) an M5P tree; (2) a HBDM; and (3) the proposed M5P-HBDM, and the remainder of the data were used for testing. The results show that the proposed M5P-HBDM managed to identify more significant and meaningful variables than either M5P or HBDMs. Moreover, the M5P-HBDM had the lowest overall mean
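
    The M5P-HBDM construction can be approximated with off-the-shelf tools: partition the data with a tree, then fit a parametric duration model in each leaf. In the sketch below a CART tree and a Weibull AFT model (via scikit-learn and lifelines) stand in for M5P and the paper's hazard-based duration model, on synthetic data.

        import numpy as np
        import pandas as pd
        from sklearn.tree import DecisionTreeRegressor
        from lifelines import WeibullAFTFitter

        rng = np.random.default_rng(3)

        # Synthetic accident records: duration depends on lanes blocked.
        n = 400
        X = pd.DataFrame({"lanes_blocked": rng.integers(0, 4, n),
                          "injury": rng.integers(0, 2, n)})
        duration = rng.weibull(1.5, n) * (20 + 15 * X["lanes_blocked"])
        df = X.assign(duration=duration, observed=1)

        # Step 1: a CART tree partitions the data (stand-in for M5P).
        tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=50).fit(X, df["duration"])
        df["leaf"] = tree.apply(X)

        # Step 2: a Weibull AFT duration model is fitted within each leaf
        # (stand-in for the paper's hazard-based duration model).
        for leaf, group in df.groupby("leaf"):
            aft = WeibullAFTFitter()
            aft.fit(group[["duration", "observed", "injury"]],
                    duration_col="duration", event_col="observed")
            med = aft.predict_median(group[["injury"]]).mean()
            print(f"leaf {leaf}: n={len(group)}, mean predicted median = {med:.1f} min")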

  7. Seismic Hazard of the Uttarakhand Himalaya, India, from Deterministic Modeling of Possible Rupture Planes in the Area

    Directory of Open Access Journals (Sweden)

    Anand Joshi

    2013-01-01

Full Text Available This paper presents the use of a semiempirical method for seismic hazard zonation. The seismotectonically important region of the Uttarakhand Himalaya has been considered in this work. Ruptures along the lineaments in the area identified from the tectonic map are modeled deterministically using the semiempirical approach given by Midorikawa (1993). This approach makes use of an attenuation relation of peak ground acceleration for simulating strong ground motion at any site. Strong motion data collected over a span of three years in this region have been used to develop an attenuation relation of peak ground acceleration of limited magnitude and distance applicability. The developed attenuation relation is used in the semiempirical method to predict peak ground acceleration from the modeled rupture planes in the area. A set of values of peak ground acceleration from possible ruptures in the area at the point of investigation is further used to compute the probability of exceedance of peak ground acceleration of values 100 and 200 gals. The prepared map shows that regions like Tehri, Chamoli, Almora, Srinagar, Devprayag, Bageshwar, and Pauri fall in a zone of 10% probability of exceedance of peak ground acceleration of value 200 gals.
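
    The exceedance-probability step can be sketched as follows: each modelled rupture contributes a lognormally scattered PGA at the site, and the site probability aggregates over ruptures. The median PGAs, the sigma, and the aggregation rule below are simplifying assumptions, not the paper's calculation.

        import numpy as np
        from scipy.stats import norm

        median_pga = np.array([60.0, 120.0, 150.0, 260.0])  # gal, per rupture
        sigma_ln = 0.5                                      # log-std of the relation

        def p_exceed(level):
            # P(PGA > level) for each rupture, assuming lognormal scatter.
            p_each = 1.0 - norm.cdf(np.log(level / median_pga) / sigma_ln)
            # Probability that at least one rupture exceeds the level.
            return 1.0 - np.prod(1.0 - p_each)

        for level in (100.0, 200.0):
            print(f"P(PGA > {level:.0f} gal) = {p_exceed(level):.2f}")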

  8. A random field model for the estimation of seismic hazard. Final report for the period 1 January 1990 - 31 December 1990

    International Nuclear Information System (INIS)

    The general theory of stationary random functions is utilized to assess the seismic hazard associated with a linearly extending seismic source. The past earthquake occurrence data associated with a portion of the North Anatolian fault are used to demonstrate the implementation of the proposed model. 18 refs, figs and tabs

  9. EFSA Panel on Biological Hazards (BIOHAZ); Scientific Opinion on Reflecting on the experiences and lessons learnt from modelling on biological hazards

    DEFF Research Database (Denmark)

    Hald, Tine

    , preferably before accepting the mandate, a scoping exercise is recommended. The scoping exercise could include an assessment of the mandate, possible interpretations of the terms of reference, deadlines, the modelling approaches possible and the data requirements. To support this process, a model catalogue...

  10. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zucca, J J; Walter, W R; Rodgers, A J; Richards, P; Pasyanos, M E; Myers, S C; Lay, T; Harris, D; Antoun, T

    2008-11-19

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of Earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D Earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes two specific paths by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas. Seismic monitoring agencies are tasked with detection, location, and characterization of seismic activity in near real time. In the case of nuclear explosion monitoring or seismic hazard, decisions to further investigate a suspect event or to launch disaster relief efforts may rely heavily on real-time analysis and results. Because these are weighty decisions, monitoring agencies are regularly called upon to meticulously document and justify every aspect of their monitoring system. In order to meet this level of scrutiny and maintain operational robustness requirements, only mature technologies are considered for operational monitoring systems, and operational technology necessarily lags

  11. Potential hazards to embryo implantation: A human endometrial in vitro model to identify unwanted antigestagenic actions of chemicals

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, L.; Deppert, W.R. [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany); Pfeifer, D. [Department of Hematology and Oncology, University Hospital Freiburg (Germany); Stanzel, S.; Weimer, M. [Department of Biostatistics, German Cancer Research Center, Heidelberg (Germany); Hanjalic-Beck, A.; Stein, A.; Straßer, M.; Zahradnik, H.P. [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany); Schaefer, W.R., E-mail: wolfgang.schaefer@uniklinik-freiburg.de [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany)

    2012-05-01

Embryo implantation is a crucial step in human reproduction and depends on the timely development of a receptive endometrium. The human endometrium is unique among adult tissues due to its dynamic alterations during each menstrual cycle. It hosts the implantation process which is governed by progesterone, whereas 17β-estradiol regulates the preceding proliferation of the endometrium. The receptors for both steroids are targets for drugs and endocrine disrupting chemicals. Chemicals with unwanted antigestagenic actions are potentially hazardous to embryo implantation since many pharmaceutical antiprogestins adversely affect endometrial receptivity. This risk can be addressed by human tissue-specific in vitro assays. As a working basis, we compiled data on chemicals interacting with the PR. In our experimental work, we developed a flexible in vitro model based on human endometrial Ishikawa cells. Effects of antiprogestin compounds on pre-selected target genes were characterized by sigmoidal concentration–response curves obtained by RT-qPCR. The estrogen sulfotransferase (SULT1E1) was identified as the most responsive target gene by microarray analysis. The agonistic effect of progesterone on SULT1E1 mRNA was concentration-dependently antagonized by RU486 (mifepristone) and ZK137316 and, with lower potency, by 4-nonylphenol, bisphenol A and apigenin. The negative control methyl acetoacetate showed no effect. The effects of progesterone and RU486 were confirmed on the protein level by Western blotting. We demonstrated proof of principle that our Ishikawa model is suitable for quantitatively studying the effects of antiprogestin-like chemicals on endometrial target genes in comparison to pharmaceutical reference compounds. This test is useful for hazard identification and may contribute to reducing animal studies. -- Highlights: ► We compare progesterone receptor-mediated endometrial effects of chemicals and drugs. ► 4-Nonylphenol, bisphenol A and apigenin exert weak

  12. Coastal Digital Elevation Models (DEMs) for tsunami hazard assessment on the French coasts

    Science.gov (United States)

    Maspataud, Aurélie; Biscara, Laurie; Hébert, Hélène; Schmitt, Thierry; Créach, Ronan

    2015-04-01

Building precise and up-to-date coastal DEMs is a prerequisite for accurate modeling and forecasting of hydrodynamic processes at the local scale. Marine flooding, originating from tsunamis, storm surges or waves, is one of them. Several high-resolution DEMs are being generated for multiple coast configurations (gulf, embayment, strait, estuary, harbor approaches, low-lying areas…) along the French Atlantic and Channel coasts. This work is undertaken within the framework of the TANDEM project (Tsunamis in the Atlantic and the English ChaNnel: Definition of the Effects through numerical Modeling) (2014-2017). DEM boundaries were defined considering the vicinity of French civil nuclear facilities, site effects considerations and potential tsunamigenic sources. Those were identified from available historical observations. Seamless integrated topographic and bathymetric coastal DEMs will be used by the institutions taking part in the study to simulate expected wave heights at regional and local scale on the French coasts, for a set of defined scenarios. The main tasks were (1) the development of a new DEM production capability, (2) the release of high-resolution, high-precision digital field models referred to vertical reference frames, which requires (3) horizontal and vertical datum conversions (all source elevation data need to be transformed to a common datum), on the basis of (4) the building of (national and/or local) datum conversion grids based on known measurements. Challenges in coastal DEM development deal with good practices throughout model development that can help minimize uncertainties. This is particularly true as scattered elevation data with variable density, from multiple sources (national hydrographic services, state and local government agencies, research organizations and private engineering companies) and of many different types (paper fieldsheets to be digitized, single beam echo sounder, multibeam sonar, airborne laser

  13. Individual-level space-time analyses of emergency department data using generalized additive modeling

    Directory of Open Access Journals (Sweden)

    Vieira Verónica M

    2012-08-01

Full Text Available Abstract. Background: Although daily emergency department (ED) data is a source of information that often includes residence, its potential for space-time analyses at the individual level has not been fully explored. We propose that ED data collected for surveillance purposes can also be used to inform spatial and temporal patterns of disease using generalized additive models (GAMs). This paper describes the methods for adapting GAMs so they can be applied to ED data. Methods: GAMs are an effective approach for modeling spatial and temporal distributions of point-wise data, producing smoothed surfaces of continuous risk while adjusting for confounders. In addition to disease mapping, the method allows for global and pointwise hypothesis testing and selection of the statistically optimum degree of smoothing using standard statistical software. We applied a two-dimensional GAM for location to ED data of overlapping calendar time using a locally-weighted regression smoother. To illustrate our methods, we investigated the association between participants’ address and the risk of gastrointestinal illness in Cape Cod, Massachusetts over time. Results: The GAM space-time analyses simultaneously smooth in units of distance and time by using the optimum degree of smoothing to create data frames of overlapping time periods and then spatially analyzing each data frame. When the resulting maps are viewed in series, each data frame contributes a movie frame, allowing us to visualize changes in magnitude, geographic size, and location of elevated risk smoothed over space and time. In our example data, we observed an underlying geographic pattern of gastrointestinal illness with risks consistently higher in the eastern part of our study area over time and intermittent variations of increased risk during brief periods. Conclusions: Spatial-temporal analysis of emergency department data with GAMs can be used to map underlying disease risk at the individual level and view
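
    A small sketch of a two-dimensional spatial GAM on synthetic case/control points is given below, assuming the pyGAM library; a tensor-product spline replaces the locally weighted regression smoother used in the paper.

        import numpy as np
        from pygam import LogisticGAM, te

        rng = np.random.default_rng(4)

        # Synthetic case/control points with risk elevated in the east.
        n = 2000
        lon, lat = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
        p_case = 1.0 / (1.0 + np.exp(-(-1.0 + 2.0 * lon)))
        y = (rng.random(n) < p_case).astype(int)
        X = np.column_stack([lon, lat])

        # Tensor-product smooth over (lon, lat); predictions over a grid
        # give the smoothed spatial risk surface.
        gam = LogisticGAM(te(0, 1)).fit(X, y)
        grid = np.array([[0.1, 0.5], [0.9, 0.5]])    # a western and an eastern point
        print("predicted risk (west, east):", gam.predict_mu(grid))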

  14. A Comparison between Accelerated Failure-time and Cox Proportional Hazard Models in Analyzing the Survival of Gastric Cancer Patients

    Directory of Open Access Journals (Sweden)

    Ali ZARE

    2015-10-01

Full Text Available Background: Gastric cancer is one of the most prevalent causes of cancer-related death in the world. Survival of patients after surgery involves identifying risk factors. There are various models to detect the effect of risk factors on patients' survival. The present study aims at evaluating these models. Methods: Data from 330 gastric cancer patients diagnosed at the Iran Cancer Institute during 1995-99 and followed up to the end of 2011 were analyzed. The survival status of these patients in 2011 was determined by reopening the files as well as phone calls, and the effect of various factors such as demographic, clinical, treatment, and post-surgical variables on patients' survival was studied. To compare various models of survival, the Akaike Information Criterion and Cox-Snell residuals were used. STATA 11 was used for data analyses. Results: Based on Cox-Snell residuals and the Akaike Information Criterion, the exponential (AIC=969.14) and Gompertz (AIC=970.70) models were more efficient than other accelerated failure-time models. Results of the Cox proportional hazards model as well as the analysis of accelerated failure-time models showed that variables such as age (at diagnosis), marital status, relapse, number of supplementary treatments, disease stage, and type of surgery were among factors affecting survival (P<0.05). Conclusion: Although most cancer researchers tend to use the proportional hazards model, accelerated failure-time models in analogous conditions, as they do not require the proportional hazards assumption and consider a parametric statistical distribution for survival time, will be credible alternatives to the proportional hazards model.
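
    The AIC-based comparison generalizes directly; the sketch below fits Cox and two AFT models with the lifelines library on a public dataset (lifelines' Rossi recidivism data, not the gastric cancer cohort) and prints their information criteria.

        from lifelines import CoxPHFitter, WeibullAFTFitter, LogNormalAFTFitter
        from lifelines.datasets import load_rossi

        df = load_rossi()

        cox = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")
        weib = WeibullAFTFitter().fit(df, duration_col="week", event_col="arrest")
        lnorm = LogNormalAFTFitter().fit(df, duration_col="week", event_col="arrest")

        # Parametric AFT fitters expose AIC_; the semiparametric Cox model
        # only has a partial-likelihood analogue.
        print(f"Weibull AFT AIC:   {weib.AIC_:.1f}")
        print(f"LogNormal AFT AIC: {lnorm.AIC_:.1f}")
        print(f"Cox partial AIC:   {cox.AIC_partial_:.1f}")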

  15. Improvement of ash plume monitoring, modeling and hazard assessment in the MED-SUV project

    Science.gov (United States)

    Coltelli, Mauro; Andronico, Daniele; Boselli, Antonella; Corradini, Stefano; Costa, Antonio; Donnadieu, Franck; Leto, Giuseppe; Macedonio, Giovanni; Merucci, Luca; Neri, Augusto; Pecora, Emilio; Prestifilippo, Michele; Scarlato, Piergiorgio; Scollo, Simona; Spinelli, Nicola; Spata, Gaetano; Taddeucci, Jacopo; Wang, Xuan; Zanmar Sanchez, Ricardo

    2014-05-01

Volcanic ash clouds produced by explosive eruptions represent a serious problem for civil aviation, road transportation and other human activities. Etna volcano has produced more than 200 small- and medium-size explosive eruptions in the last 35 years. INGV, responsible for monitoring the volcano, has since 2006 developed a specific system for forecasting and monitoring Etna's volcanic ash plumes in collaboration with several national and international institutions. Between 12 January 2011 and 31 December 2013 Etna produced forty-six basaltic lava fountains. Every paroxysm produced an eruption column ranging from a few up to eleven kilometers in height above sea level. The ash clouds contaminated the controlled airspace (CTR) of the Catania and Reggio Calabria airports and caused tephra fallout on eastern Sicily, sometimes disrupting the operations of these airports. In order to give prompt and detailed warnings to the aviation and civil protection authorities, ash plume monitoring at Osservatorio Etneo, the INGV department in Catania, is carried out using multispectral (from visible to infrared) satellite and ground-based video-surveillance images; seismic and infrasound signals processed in real time; a Doppler RADAR (Voldorad IIB) able to detect the eruption column in all weather conditions; and a LIDAR (AMPLE) for retrieving backscattering and depolarization values of the ash clouds. Forecasting is performed by running tephra dispersal models using weather forecast data and plotting the results on maps published on a dedicated website. The 24/7 control room operators were thus able to inform aviation and civil protection operators in time for effective aviation safety management. A variety of multidisciplinary activities are planned in the MED-SUV project with reference to volcanic ash observations and studies. These include: 1) physical and analogue laboratory experiments on ash dispersal and aggregation; 2) integration of satellite data (e.g. METEOSAT, MODIS) and ground

  16. GeoClaw-STRICHE: A coupled model for Sediment TRansport In Coastal Hazard Events

    CERN Document Server

    Tang, Hui

    2016-01-01

GeoClaw-STRICHE is designed for simulating the physical impacts of tsunamis as they relate to erosion, transport and deposition. GeoClaw-STRICHE comprises three components: (1) nonlinear shallow water equations; (2) an advection-diffusion equation; (3) an equation for morphology updating. Multiple grain sizes and sediment layers are added into GeoClaw-STRICHE to simulate grain-size distribution and add the capability to develop grain-size trends from the bottom to the top of a simulated deposit as well as along the inundation. Unlike previous models based on empirical equations or the sediment concentration gradient, the standard Van Leer method is applied to calculate sediment flux. We tested and verified GeoClaw-STRICHE with the flume experiment by Johnson et al. (2016) and data from the 2004 Indian Ocean tsunami in Kuala Meurisi as published in the Journal of Geophysical Research: Earth Surface. The comparison with experimental data shows GeoClaw-STRICHE's capability to simulate sediment thickness and grain-size distribution in experimental...
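
    One standard way to write the three components, in depth-averaged 1D form and under the usual notation assumptions (h depth, u velocity, C sediment concentration, z_b bed elevation, lambda bed porosity, E/D erosion and deposition rates), is the following; the model's exact formulation may differ in detail.

        \partial_t h + \partial_x(hu) = 0, \qquad
        \partial_t(hu) + \partial_x\big(hu^2 + \tfrac{1}{2}gh^2\big) = -gh\,\partial_x z_b
            \quad \text{(nonlinear shallow water)}

        \partial_t(hC) + \partial_x(huC) = \partial_x\big(\kappa h\,\partial_x C\big) + E - D
            \quad \text{(sediment advection-diffusion)}

        (1-\lambda)\,\partial_t z_b = D - E
            \quad \text{(morphology update)}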

  17. The 2007 Bengkulu earthquake, its rupture model and implications for seismic hazard

    Indian Academy of Sciences (India)

    A Ambikapathy; J K Catherine; V K Gahalaut; M Narsaiah; A Bansal; P Mahesh

    2010-08-01

The 12 September 2007 great Bengkulu earthquake (Mw 8.4) occurred on the west coast of Sumatra about 130 km SW of Bengkulu. The earthquake was followed by two strong aftershocks of Mw 7.9 and Mw 7.0. We estimate coseismic offsets due to the mainshock, derived from near-field Global Positioning System (GPS) measurements from nine continuous SuGAr sites operated by the California Institute of Technology (Caltech) group. Using a forward modelling approach, we estimated the slip distribution on the causative rupture of the 2007 Bengkulu earthquake and found two patches of large slip, one located north of the mainshock epicenter and the other under the Pagai Islands. Both patches of large slip on the rupture occurred under the island belt and shallow water. Thus, despite its great magnitude, this earthquake did not generate a major tsunami. Further, we suggest that the occurrence of great earthquakes in the subduction zone on either side of the Siberut Island region might have led to an increase in static stress in the region, where the last great earthquake occurred in 1797 and where there is evidence of strain accumulation.

  18. Enhancement of colour stability of anthocyanins in model beverages by gum arabic addition.

    Science.gov (United States)

    Chung, Cheryl; Rojanasasithara, Thananunt; Mutilangi, William; McClements, David Julian

    2016-06-15

    This study investigated the potential of gum arabic to improve the stability of anthocyanins that are used in commercial beverages as natural colourants. The degradation of purple carrot anthocyanin in model beverage systems (pH 3.0) containing L-ascorbic acid proceeded with a first-order reaction rate during storage (40 °C for 5 days in light). The addition of gum arabic (0.05-5.0%) significantly enhanced the colour stability of anthocyanin, with the most stable systems observed at intermediate levels (1.5%). A further increase in concentration (>1.5%) reduced its efficacy due to a change in the conformation of the gum arabic molecules that hindered their exposure to the anthocyanins. Fluorescence quenching measurements showed that the anthocyanin could have interacted with the glycoprotein fractions of the gum arabic through hydrogen bonding, resulting in enhanced stability. Overall, this study provides valuable information about enhancing the stability of anthocyanins in beverage systems using natural ingredients. PMID:26868542
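
    The reported first-order kinetics imply ln(C/C0) = -kt, so the rate constant is the negative slope of log concentration versus time; a short sketch with synthetic readings (not the study's measurements):

        import numpy as np

        t_days = np.array([0, 1, 2, 3, 4, 5])
        c = np.array([1.00, 0.82, 0.68, 0.55, 0.46, 0.37])   # relative anthocyanin

        k = -np.polyfit(t_days, np.log(c), 1)[0]
        half_life = np.log(2) / k
        print(f"rate constant k = {k:.3f} 1/day, half-life = {half_life:.1f} days")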

  19. Cost-Sensitive Boosting: Fitting an Additive Asymmetric Logistic Regression Model

    Science.gov (United States)

    Li, Qiu-Jie; Mao, Yao-Bin; Wang, Zhi-Quan; Xiang, Wen-Bo

Conventional machine learning algorithms like boosting tend to treat all misclassification errors equally, which is not adequate for certain cost-sensitive classification problems such as object detection. Although many cost-sensitive extensions of boosting that directly modify the weighting strategy of the corresponding original algorithms have been proposed and reported, they are heuristic in nature, only proved effective by empirical results, and lack sound theoretical analysis. This paper develops a framework from a statistical insight that can embody almost all existing cost-sensitive boosting algorithms: fitting an additive asymmetric logistic regression model by stage-wise optimization of certain criteria. Four cost-sensitive versions of boosting algorithms are derived, namely CSDA, CSRA, CSGA and CSLB, which respectively correspond to Discrete AdaBoost, Real AdaBoost, Gentle AdaBoost and LogitBoost. Experimental results on the application of face detection have shown the effectiveness of the proposed learning framework in the reduction of the cumulative misclassification cost.
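
    A minimal cost-sensitive boosting sketch in this spirit is given below; it uses an AdaC2-style asymmetric weight update with scikit-learn stumps, and is an illustration rather than an implementation of the paper's CSDA/CSRA/CSGA/CSLB derivations.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.tree import DecisionTreeClassifier

        X, y = make_classification(n_samples=500, weights=[0.8], random_state=0)
        y = 2 * y - 1                               # labels in {-1, +1}
        cost = np.where(y == 1, 3.0, 1.0)           # positives cost 3x to miss

        n_rounds, w = 25, np.ones(len(y)) / len(y)
        F = np.zeros(len(y))
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            h = stump.predict(X)
            ok = h == y
            alpha = 0.5 * np.log(max(np.sum(w[ok] * cost[ok]), 1e-12)
                                 / max(np.sum(w[~ok] * cost[~ok]), 1e-12))
            F += alpha * h
            w *= cost * np.exp(-alpha * y * h)      # AdaC2-style asymmetric update
            w /= w.sum()

        pred = np.sign(F)
        print(f"training recall on the costly class: {np.mean(pred[y == 1] == 1):.2f}")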

  1. Drivers' hazard perception modeling and experimental analysis

    Institute of Scientific and Technical Information of China (English)

    杨京帅; 王文亮; 苏薇; 杨得婷; 孙正一

    2014-01-01

    为了量化驾驶人群体危险感知的差异性并指出差异性的原因所在,对驾驶人危险感知进行建模与实验测试分析.借鉴污染环境下生物种群生存这一自然现象,构建了驾驶人危险感知模型,通过实验测试了驾驶人在不同交通场景的危险识别时间与反应时间.模型分析与实验结果表明,驾驶人危险感知阈限值与交通情境危险输入的速率负相关,与驾驶人正确反应率和危险识别率正相关.不同危险程度的交通场景对驾驶人危险感知的总反应时间和识别时间具有显著影响(p<0.001);驾驶经验对驾驶人危险识别时间没有显著影响(p=0.080),但是对危险的反应时间具有显著影响(p=0.003).熟练驾驶人相比非熟练驾驶人具有较高的危险感知水平,这种差异性主要体现在熟练驾驶人能够更快速准确地预测评估交通情境中的危险并进行合理判断.%To quantify the difference of drivers' hazard perception and point out the causes of the difference,drivers' hazard perception was modeled and tested by experiments.The hazard percep-tion model was built based on the natural phenomena of biological population surviving in pollution environments.The hazard detection time and hazard reaction time were tested in different traffic situ-ations.The model analysis and experimental results show that drivers' hazard perception threshold limit value is negatively affected by the input rate of traffic situation dangers and positively affected by drivers' correct reaction ratios and hazards detection ratios.The results of drivers'hazard percep-tion test reveal a main effect of traffic situations with different risk level on drivers' overall reaction time and hazard detection time (p<0.001).Driving experiences have no significant effect on haz-ard detection time (p=0.080)but have significant effect on drivers' hazard reaction time (p=0.003).Experienced

  2. Precarious Rock Methodology for Seismic Hazard: Physical Testing, Numerical Modeling and Coherence Studies

    International Nuclear Information System (INIS)

This report covers the following projects: Shake table tests of precarious rock methodology, field tests of precarious rocks at Yucca Mountain and comparison of the results with PSHA predictions, study of the coherence of the wave field in the ESF, and a limited survey of precarious rocks south of the proposed repository footprint. A series of shake table experiments has been carried out at the University of Nevada, Reno Large Scale Structures Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1g to the point where the objects on the shake table overturned a specified number of times. The results of these experiments have been compared with numerical overturning predictions. Numerical predictions for toppling of large objects with simple contact conditions (e.g., I-beams with sharp basal edges) agree well with shake-table results. The numerical model slightly underpredicts the overturning of small rectangular blocks. It overpredicts the overturning PGA for asymmetric granite boulders with complex basal contact conditions. In general the results confirm the approximate predictions of previous studies. Field testing of several rocks at Yucca Mountain has approximately confirmed the preliminary results from previous studies, suggesting that the PSHA predictions are too high, possibly because of uncertainty in the mean of the attenuation relations. Study of the coherence of wavefields in the ESF has provided results which will be very important in the design of the canister distribution, in particular a preliminary estimate of the wavelengths at which the wavefields become incoherent. No evidence was found for extreme focusing by lens-like inhomogeneities. A limited survey for precarious rocks confirmed that they extend south of the repository, and one of these has been field tested.

  3. Nuclear subsurface explosion modeling and hydrodynamic fragmentation simulation of hazardous asteroids

    Science.gov (United States)

    Premaratne, Pavithra Dhanuka

Disruption and fragmentation of an asteroid using nuclear explosive devices (NEDs) is a highly complex yet practical solution to mitigating the impact threat of asteroids with short warning times. A Hypervelocity Asteroid Intercept Vehicle (HAIV) concept, developed at the Asteroid Deflection Research Center (ADRC), consists of a primary vehicle that acts as a kinetic impactor and a secondary vehicle that houses the NEDs. The kinetic impactor (lead vehicle) strikes the asteroid, creating a crater. The secondary vehicle then immediately enters the crater and detonates its nuclear payload, creating a blast wave powerful enough to fragment the asteroid. Modeling and hydrodynamic simulation of the nuclear subsurface explosion has been a challenging research goal, and it yields an array of mission-critical information. A mesh-free hydrodynamic simulation method, Smoothed Particle Hydrodynamics (SPH), was utilized to obtain both qualitative and quantitative solutions for explosion efficiency. Commercial fluid dynamics packages such as AUTODYN, along with in-house GPU-accelerated SPH algorithms, were used to validate and optimize high-energy explosion dynamics for a variety of test cases. Energy coupling from the NED to the target body was also examined to determine the effectiveness of nuclear subsurface explosions. The success of a disruption mission also depends on the survivability of the nuclear payload as the secondary vehicle approaches the newly formed crater at a velocity of 10 km/s or higher. The vehicle may come into contact with debris ejected from the crater, which required the conceptual development of a Whipple shield. As the vehicle closes on the crater, its skin may also experience extreme temperatures due to heat radiated from the crater bottom. To address this thermal problem, a simple metallic thermal shield design was implemented, utilizing a radiative heat transfer algorithm and nodal solutions obtained from hydrodynamic simulations.

  4. Leaching of hazardous substances from a composite construction product – An experimental and modelling approach for fibre-cement sheets

    Energy Technology Data Exchange (ETDEWEB)

    Lupsea, Maria [University of Toulouse, INSA, UPS, INP, LISBP, 135 Avenue de Rangueil, F-31077 Toulouse (France); INRA, UMR 792, F-31400 Toulouse (France); CNRS, UMR 5504, F-31400 Toulouse (France); Paris–Est University, CSTB–Scientific and Technical Centre for the Building Industry, DEE/Environmentand Life Cycle Engineering Team, 24 rue Joseph Fourier, F–38400 Saint Martin d’Hères (France); Tiruta-Barna, Ligia, E-mail: ligia.barna@insa-toulouse.fr [University of Toulouse, INSA, UPS, INP, LISBP, 135 Avenue de Rangueil, F-31077 Toulouse (France); INRA, UMR 792, F-31400 Toulouse (France); CNRS, UMR 5504, F-31400 Toulouse (France); Schiopu, Nicoleta [Paris–Est University, CSTB–Scientific and Technical Centre for the Building Industry, DEE/Environmentand Life Cycle Engineering Team, 24 rue Joseph Fourier, F–38400 Saint Martin d’Hères (France)

    2014-01-15

    Highlights: • Biocide and heavy metals leaching from fibre-cement sheet was investigated. • Equilibrium and dynamic leaching tests were used as modelling support. • The chemical-transport model identifies the main fixation/solubilisation mechanisms. • Biocides such as terbutryn and boron were released by the commercial product. • FCS exhibits a cement-like leaching behaviour with high organic carbon release. -- Abstract: The leaching behaviour of a commercial fibre-cement sheet (FCS) product has been investigated. A static pH-dependency test and a dynamic surface leaching test have been performed at lab scale. These tests allowed the development of a chemical-transport model capable of predicting the release of major and trace elements over the entire pH range, as a function of time. FCS exhibits a cement-type leaching behaviour with respect to the mineral species. Potentially hazardous species are released in significant quantities when compared to their total content. These are mainly heavy metals commonly encountered in cement matrices, and boron (probably added as a biocide). Organic compounds, quantified as global dissolved carbon, are released in significant concentrations, probably originating from the partial degradation of the organic fibres. The pesticide terbutryn (probably added during the preservative treatment of the organic fibres) was systematically identified in the leachates. The simulation of an upscaled runoff scenario allowed the evaluation of the cumulative release over long periods and its distribution in time, as a function of the local exposure conditions. After 10 years of exposure, the release reaches significant fractions of the species’ total content, ranging from 4% for Cu to near 100% for B.

  5. Leaching of hazardous substances from a composite construction product – An experimental and modelling approach for fibre-cement sheets

    International Nuclear Information System (INIS)

    Highlights: • Biocide and heavy metals leaching from fibre-cement sheet was investigated. • Equilibrium and dynamic leaching tests were used as modelling support. • The chemical-transport model identifies the main fixation/solubilisation mechanisms. • Biocides such as terbutryn and boron were released by the commercial product. • FCS exhibits a cement-like leaching behaviour with high organic carbon release. -- Abstract: The leaching behaviour of a commercial fibre-cement sheet (FCS) product has been investigated. A static pH-dependency test and a dynamic surface leaching test have been performed at lab scale. These tests allowed the development of a chemical-transport model capable of predicting the release of major and trace elements over the entire pH range, as a function of time. FCS exhibits a cement-type leaching behaviour with respect to the mineral species. Potentially hazardous species are released in significant quantities when compared to their total content. These are mainly heavy metals commonly encountered in cement matrices, and boron (probably added as a biocide). Organic compounds, quantified as global dissolved carbon, are released in significant concentrations, probably originating from the partial degradation of the organic fibres. The pesticide terbutryn (probably added during the preservative treatment of the organic fibres) was systematically identified in the leachates. The simulation of an upscaled runoff scenario allowed the evaluation of the cumulative release over long periods and its distribution in time, as a function of the local exposure conditions. After 10 years of exposure, the release reaches significant fractions of the species’ total content, ranging from 4% for Cu to near 100% for B.

  6. Leaching of hazardous substances from a composite construction product--an experimental and modelling approach for fibre-cement sheets.

    Science.gov (United States)

    Lupsea, Maria; Tiruta-Barna, Ligia; Schiopu, Nicoleta

    2014-01-15

    The leaching behaviour of a commercial fibre-cement sheet (FCS) product has been investigated. A static pH-dependency test and a dynamic surface leaching test have been performed at lab scale. These tests allowed the development of a chemical-transport model capable of predicting the release of major and trace elements over the entire pH range, as a function of time. FCS exhibits a cement-type leaching behaviour with respect to the mineral species. Potentially hazardous species are released in significant quantities when compared to their total content. These are mainly heavy metals commonly encountered in cement matrices, and boron (probably added as a biocide). Organic compounds, quantified as global dissolved carbon, are released in significant concentrations, probably originating from the partial degradation of the organic fibres. The pesticide terbutryn (probably added during the preservative treatment of the organic fibres) was systematically identified in the leachates. The simulation of an upscaled runoff scenario allowed the evaluation of the cumulative release over long periods and its distribution in time, as a function of the local exposure conditions. After 10 years of exposure, the release reaches significant fractions of the species' total content, ranging from 4% for Cu to near 100% for B.
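
    The full chemical-transport model is not reproduced here, but a much simpler diffusion-type surrogate illustrates how a cumulative release fraction can grow roughly with the square root of exposure time and saturate at the total leachable content. The rate constant below is purely illustrative.

```python
import numpy as np

def cumulative_release_fraction(t_years, k_sqrt, total=1.0):
    """Illustrative diffusion-controlled leaching: the released fraction
    grows with sqrt(time) until the leachable inventory is exhausted."""
    return np.minimum(k_sqrt * np.sqrt(np.asarray(t_years, dtype=float)), total)

# Hypothetical rate constant chosen so release approaches 100% near 10 years
print(cumulative_release_fraction([1, 5, 10], k_sqrt=0.32))
```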

  7. Hazardous Air Pollutants

    Science.gov (United States)

    Hazardous air pollutants are those known to cause serious health effects and adverse environmental effects.

  8. Comment on "Polynomial cointegration tests of anthropogenic impact on global warming" by Beenstock et al. (2012) – some hazards in econometric modelling of climate change

    OpenAIRE

    F. Pretis; Hendry, D. F.

    2013-01-01

    We outline six important hazards that can be encountered in econometric modelling of time-series data, and apply that analysis to demonstrate errors in the empirical modelling of climate data in Beenstock et al. (2012). We show that the claim made in Beenstock et al. (2012) as to the different degrees of integrability of CO2 and temperature is incorrect. In particular, the level of integration is not constant and not intrinsic to the process. Further, we illustrate that the ...

  9. Mathematical modeling and experimental validation of Phaeodactylum tricornutum microalgae growth rate with glycerol addition

    Energy Technology Data Exchange (ETDEWEB)

    Morais, Keli Cristiane Correia; Ribeiro, Robert Luis Lara; Santos, Kassiana Ribeiro dos; Mariano, Andre Bellin [Mariano Center for Research and Development of Sustainable Energy (NPDEAS), Curitiba, PR (Brazil); Vargas, Jose Viriato Coelho [Departament of Mechanical Engineering, Federal University of Parana (UFPR) Curitiba, PR (Brazil)

    2010-07-01

    The Brazilian National Program for Biofuel Production has been encouraging diversification of feedstocks for biofuel production. One of the most promising alternatives is the use of microalgae biomass. The cultivation of microalgae is conducted in aquatic systems, so microalgae oil production does not compete with agricultural land. Microalgae have greater photosynthetic efficiency than higher plants and are efficient at fixing CO{sub 2}. The challenge is to reduce production costs, which can be achieved by increasing biomass productivity and oil content. Aiming to increase the production of microalgae biomass, mixotrophic cultivation with the addition of glycerol has been shown to be very promising. During the production of biodiesel from microalgae, glycerol is available as a side product of the transesterification reaction and could be used as an organic carbon source for mixotrophic growth, resulting in increased biomass productivity. In this paper, to study the effect of glycerol under experimental conditions, a batch culture of the diatom Phaeodactylum tricornutum was performed in a 2-liter flask in a temperature- and light-intensity-controlled room. During 16 days of cultivation, the number of cells per ml was counted periodically in a Neubauer chamber. Dry biomass in the control experiment (without glycerol) was measured every two days by vacuum filtration. In the mixotrophic experiment with a glycerol concentration of 1.5 M, dry biomass was assessed similarly on the 10{sup th} and 14{sup th} days of cultivation. Through a volume element methodology, a mathematical model was developed to calculate the microalgae growth rate, using an equation that describes the influence of irradiation and nutrient concentration on microalgae growth. A simulation time of 16 days was used in the computations, with an initial concentration of 0.1 g l{sup -1}. In order to compare
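
    A minimal stand-in for the kind of growth model described above couples a Monod nutrient term with a saturating light term and integrates biomass by forward Euler. All parameter values below are assumptions for illustration, not those of the paper.

```python
def growth_rate(mu_max, S, Ks, I, KI):
    """Specific growth rate limited by a nutrient (Monod term)
    and by irradiance (saturating light term)."""
    return mu_max * (S / (Ks + S)) * (I / (KI + I))

dt, days = 0.1, 16.0
X, S = 0.1, 5.0                   # biomass and substrate, g/L (assumed)
mu_max, Ks, Y = 0.8, 0.5, 0.5     # 1/day, g/L, g biomass per g substrate
I, KI = 150.0, 100.0              # irradiance and half-saturation (assumed)

for _ in range(int(days / dt)):
    dX = growth_rate(mu_max, S, Ks, I, KI) * X * dt
    X, S = X + dX, max(S - dX / Y, 0.0)  # substrate consumed per unit growth

print(f"simulated biomass after 16 days ~ {X:.2f} g/L")
```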

  10. Improving the all-hazards homeland security enterprise through the use of an emergency management intelligence model

    OpenAIRE

    Schulz, William N.

    2013-01-01

    CHDS State/Local As the all-hazards approach takes hold in our national Emergency Management and Homeland Security efforts and continues to seek greater collaboration between these two fields, an area that has yet to be explored to its fullest extent is the utilization of an intelligence process to enhance EM operations. Despite the existence of multiple Federal-level policies that outline the importance of intelligence and information sharing across the all-hazards community, EM is still ...

  11. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Hu, Qinhong

    2016-07-01

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from laboratory to field. The additivity model assumed that reaction properties in a sediment, including surface area, reactive site concentration, reaction rate, and extent, can be predicted from the field-scale grain size distribution by linearly adding the reaction properties of individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants, using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of multi-rate parameters for individual grain size fractions. The statistical properties of the rate constants for the individual grain size fractions were then used to analyze the statistical properties of the additivity model in predicting rate-limited U(VI) desorption in the composite sediment, and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The result indicated that the additivity model provided a good prediction of the U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, and U(VI) desorption in individual grain size fractions has to be simulated in order to apply the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The result found that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel size fraction (2–8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
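
    The additivity idea itself reduces to a mass-fraction-weighted sum of a property over grain-size fractions, as in this sketch (numbers hypothetical):

```python
import numpy as np

def additive_property(mass_fractions, fraction_properties):
    """Grain-size additivity: the field-scale value of a reaction property
    is the mass-fraction-weighted sum over grain-size fractions."""
    f = np.asarray(mass_fractions, dtype=float)
    p = np.asarray(fraction_properties, dtype=float)
    assert np.isclose(f.sum(), 1.0), "mass fractions must sum to 1"
    return float(f @ p)

# Hypothetical reactive-site concentrations (mol/g) for three fractions
print(additive_property([0.2, 0.5, 0.3], [1.8e-6, 0.9e-6, 0.2e-6]))
```

    As the abstract notes, this direct weighting works for site concentrations and reaction extents but not for rate constants, which is what motivates the approximate model.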

  12. Efficient Semiparametric Marginal Estimation for the Partially Linear Additive Model for Longitudinal/Clustered Data

    KAUST Repository

    Carroll, Raymond

    2009-04-23

    We consider the efficient estimation of a regression parameter in a partially linear additive nonparametric regression model from repeated measures data when the covariates are multivariate. To date, while there is some literature in the scalar covariate case, the problem has not been addressed in the multivariate additive model case. Ours represents a first contribution in this direction. As part of this work, we first describe the behavior of nonparametric estimators for additive models with repeated measures when the underlying model is not additive. These results are critical when one considers variants of the basic additive model. We apply them to the partially linear additive repeated-measures model, deriving an explicit consistent estimator of the parametric component; if the errors are in addition Gaussian, the estimator is semiparametric efficient. We also apply our basic methods to a unique testing problem that arises in genetic epidemiology; in combination with a projection argument we develop an efficient and easily computed testing scheme. Simulations and an empirical example from nutritional epidemiology illustrate our methods.

  13. Evolutionary model of coal mine water hazards based on multi-agent simulation

    Institute of Scientific and Technical Information of China (English)

    龚承柱; 李兰兰; 柯晓玲; 诸克军

    2012-01-01

    Based on coal mine hydrogeological conditions, the causes and evolution mechanism of coal mine water hazards are presented. Using complex systems theory and a multi-agent modeling method, an evolutionary model of coal mine water hazards was developed. The evolution process was then simulated under different system conditions on the NetLogo platform, dynamically demonstrating the progression of coal mine water hazards and the vulnerability relationships among the influencing factors. The research shows that coal mine water hazard, influenced by multiple factors, is a complex adaptive phenomenon with nonlinear dynamic characteristics; only by understanding mine hydrogeological conditions and bringing complex systems theory and multi-agent modeling into water hazard prevention research can the mechanism of coal mine water hazards be described in essence.

  14. Tracking hazardous air pollutants from a refinery fire by applying on-line and off-line air monitoring and back trajectory modeling

    International Nuclear Information System (INIS)

    Highlights: • An industrial fire can emit hazardous air pollutants into the surrounding areas. • Both on- and off-line monitoring are needed to study air pollution from fires. • Back trajectory and dispersion modeling can trace emission sources of fire-related pollution. -- Abstract: The air monitors used by most regulatory authorities are designed to track the daily emissions of conventional pollutants and are not well suited for measuring hazardous air pollutants that are released from accidents such as refinery fires. By applying a wide variety of air-monitoring systems, including on-line Fourier transform infrared spectroscopy, gas chromatography with a flame ionization detector, and off-line gas chromatography–mass spectrometry for measuring hazardous air pollutants during and after a fire at a petrochemical complex in central Taiwan on May 12, 2011, we were able to detect significantly higher levels of combustion-related gaseous and particulate pollutants, refinery-related hydrocarbons, and chlorinated hydrocarbons, such as 1,2-dichloroethane, vinyl chloride monomer, and dichloromethane, inside the complex and 10 km downwind from the fire than those measured during normal operation periods. Both back trajectories and dispersion models further confirmed that high levels of hazardous air pollutants in the neighboring communities were carried by air masses flowing from the 22 plants that were shut down by the fire. This study demonstrates that hazardous air pollutants from industrial accidents can successfully be identified and traced back to their emission sources by applying a timely and comprehensive air-monitoring campaign and back trajectory air flow models.

  15. Modelling Short-Term Maximum Individual Exposure from Airborne Hazardous Releases in Urban Environments. Part II: Validation of a Deterministic Model with Wind Tunnel Experimental Data

    Directory of Open Access Journals (Sweden)

    George C. Efthimiou

    2015-06-01

    Full Text Available The capability to predict short-term maximum individual exposure is very important for several applications including, for example, deliberate or accidental releases of hazardous substances, odour fluctuations or exceedance of material flammability levels. Recently, the authors proposed a simple approach relating maximum individual exposure to parameters such as the fluctuation intensity and the concentration integral time scale. In the first part of this study (Part I), the methodology was validated against field measurements, which are governed by the natural variability of atmospheric boundary conditions. In Part II of this study, an in-depth validation of the approach is performed using reference data recorded under truly stationary and well-documented flow conditions. For this reason, a boundary-layer wind-tunnel experiment was used. The experimental dataset includes 196 time-resolved concentration measurements which capture the dispersion from a continuous point source within an urban model of semi-idealized complexity. The data analysis allowed the improvement of an important model parameter. The model performed very well in predicting the maximum individual exposure, with 95% of predictions within a factor of two of the observations. For large time intervals, an exponential correction term has been introduced into the model based on the experimental observations. The new model is capable of predicting all time intervals, with a factor-of-two agreement of 100%.

  16. Utilization of sulfate additives in biomass combustion: fundamental and modeling aspects

    DEFF Research Database (Denmark)

    Wu, Hao; Jespersen, Jacob Boll; Grell, Morten Nedergaard;

    2013-01-01

    Sulfates, such as ammonium sulfate, aluminum sulfate and ferric sulfate, are effective additives for converting the alkali chlorides released from biomass combustion to the less harmful alkali sulfates. Optimization of the use of these additives requires knowledge on their decomposition rate and ...

  17. Earthquake Hazard and Risk in Alaska

    Science.gov (United States)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increased the number of crustal faults from ten in 2007 to 91 in the 2015 model, including the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth; in this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), while large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as much to the overall risk. We will review these recurrence rates, and will present the results and impact for Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact model changes have on Alaska earthquake risk. Risk metrics considered include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the
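
    The recurrence rates referred to here follow from the Gutenberg-Richter magnitude-frequency law, log10 N(>=m) = a - b*m. A sketch with hypothetical a and b values:

```python
def gutenberg_richter_rate(a: float, b: float, m: float) -> float:
    """Annual rate of earthquakes with magnitude >= m."""
    return 10.0 ** (a - b * m)

a_val, b_val = 4.2, 0.9  # hypothetical values for a declustered catalog
for m in (6.0, 7.0, 8.0):
    rate = gutenberg_richter_rate(a_val, b_val, m)
    print(f"M>={m}: {rate:.4f}/yr, return period ~ {1.0/rate:,.0f} yr")
```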

  18. Seismic hazard: UK continental shelf

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    An evaluation of seismic hazard for offshore UK waters has been completed by a joint UK/Norwegian team. For the first time, consistency of hazard mapping has been achieved for the northern North Sea, and peak acceleration hazard contour maps have been drawn for return periods of 100, 200, 475, 1000 and 10,000 years. The report describes (a) the spatial pattern of seismicity; (b) the seismic hazard source model; (c) seismic ground motion and (d) seismic hazard computation. Diagrams show (1) a map of the structural framework of the UK and Norwegian North Sea; (2) epicentres; (3) zonation models; (4) peak ground acceleration contours and (5) generic offshore spectral shapes.

  19. Statistical analysis of Caterpillar 793D haul truck engine data and through-life diagnostic information using the proportional hazards model

    Directory of Open Access Journals (Sweden)

    Carstens, W. A.

    2013-08-01

    Full Text Available Physical asset management (PAM) is of increasing concern for companies in industry today. A key performance area of PAM is asset care plans (ACPs), which consist of maintenance strategies such as usage-based maintenance (UBM) and condition-based maintenance (CBM). Data obtained from the South African mining industry were modelled using a CBM prognostic model called the proportional hazards model (PHM). Results indicated that the developed model produced estimates that were reasonable representations of reality. These findings provide an exciting basis for the development of future Weibull PHMs that could result in large maintenance cost savings and reduced failure occurrences.
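
    Readers who want to reproduce a PHM-style analysis can start from a Cox proportional hazards fit on through-life data. The sketch below uses the Python lifelines package on invented data (the paper used a Weibull PHM on Caterpillar 793D engine data, which is not reproduced here):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented stand-in: running hours to failure or suspension, plus one
# hypothetical oil-analysis covariate (iron content, ppm)
df = pd.DataFrame({
    "hours":    [1200, 1500, 900, 2000, 1700, 800, 2200, 1300],
    "failed":   [1, 0, 1, 0, 1, 1, 0, 1],   # 0 = suspension (censored)
    "iron_ppm": [40, 22, 55, 30, 20, 60, 15, 42],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="hours", event_col="failed")
cph.print_summary()  # hazard ratio for the condition covariate
```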

  20. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 1.0 m: wave-hazard projections

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  1. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 2.0 m: wave-hazard projections

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  2. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 1.5 m: wave-hazard projections

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  3. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 0.5 m: wave-hazard projections

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  4. CoSMoS (Coastal Storm Modeling System) Southern California v3.0 Phase 1 (100-year storm) sea-level rise 0.0 m: wave-hazard projections

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Projected Hazard: Model-derived significant wave height (in meters) for the given storm condition and sea-level rise (SLR) scenario. Model Summary: The Coastal...

  5. A review of successful aging models: proposing proactive coping as an important additional strategy.

    OpenAIRE

    Ouwehand, C.; Ridder, D.T.D. de; BENSING, J

    2007-01-01

    Successful aging is an important concept, and one that has been the subject of much research. During the last 15 years, the emphasis of this research has shifted from formulating criteria for successful aging to describing the processes involved in successful aging. The main purpose of the present article is to review psychological models of successful aging. The model of Selective Optimization with Compensation (SOC-model) proves to be one of the leading models in this field. Although eviden...

  6. Quaternary eruptive history and hazard-zone model at Nevado del Tolima and Cerro Machin volcanoes, Colombia

    Science.gov (United States)

    Thouret, J. C.; Cantagrel, J.-M.; Robin, C.; Murcia, A.; Salinas, R.; Cepeda, H.

    1995-07-01

    areas potentially affected by future eruptions both at Nevado del Tolima and at the active Cerro Machin 12 km southward. The extent of areas likely to be affected by tephra falls, debris flows, pyroclastic flows or surges, debris avalanches and lava flows is shown. Subplinian and plinian eruptions of Nevado del Tolima were used to represent the moderate and large events to be expected. 300,000 people live within a 35-km distance of these volcanoes, which have exhibited behaviour more explosive than Nevado del Ruiz. Despite the small-sized ice cap, debris flows are the most probable hazard for even a minor eruption, because of the very steep slope gradient and because of probable interactions of hot eruptive products with ice and snow. Additionally, scoria flows and debris avalanches could be directed toward the southeast and be transformed into debris flows that would devastate the Combeima valley and the suburbs of Ibagué city, where about 50,000 people live.

  7. Hydrodynamic Modeling of Flash Floods in an Andean Stream: Challenges for Assessing Flood Hazards in Mountain Rivers

    Science.gov (United States)

    Contreras, M. T.; Escauriaza, C. R.

    2015-12-01

    Rain-induced flash floods are common events in regions close to the southern Andes, in north and central Chile. Rapid urban development combined with a changing climate and ENSO effects has resulted in an alarming proximity of flood-prone streams to densely populated areas in the Andean foothills, increasing the risk for cities and infrastructure. Simulations of rapid floods in these complex watersheds are particularly challenging, especially where geomorphological and hydrometeorological data are insufficient. In the Quebrada de Ramón, an Andean stream that passes through a highly populated area in the eastern part of Santiago, Chile, previous events have demonstrated that sediment concentration, flow resistance, and the characteristic temporal and spatial scales of the hydrograph are important variables for predicting the arrival time of the peak discharge, flow velocities and the extent of inundated areas. The objective of this investigation is to improve our understanding of the dynamics of flash floods in the Quebrada de Ramón by quantifying the effects of these factors on flood propagation. We implement a two-dimensional model based on the shallow water equations (Guerra et al. 2014), modified to account for hyperconcentrated flows over natural topography. We evaluate events of specific return periods and sediment concentrations, using different methodologies to quantify the flow resistance in the channel and floodplains. Through this work we provide a framework for future studies aimed at improving hazard assessment, urban planning, and early warning systems in urban areas near mountain streams that have limited data and are affected by rapid flood events. Work supported by Fondecyt grant 1130940 and CONICYT/FONDAP grant 15110017.
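
    For a bare-bones flavour of such shallow-water computations, the sketch below solves a one-dimensional dam break with the diffusive but robust Lax-Friedrichs scheme on a flat bed. This is a teaching sketch, far simpler than the cited two-dimensional hyperconcentrated-flow model.

```python
import numpy as np

g = 9.81
nx, dx = 200, 1.0
h = np.where(np.arange(nx) < nx // 2, 2.0, 1.0)  # dam-break initial depth (m)
hu = np.zeros(nx)                                # discharge, initially at rest

def flux(h, hu):
    """Fluxes of the 1-D shallow water equations in conservative form."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h**2])

t, t_end = 0.0, 10.0
while t < t_end:
    dt = 0.4 * dx / np.max(np.abs(hu / h) + np.sqrt(g * h))  # CFL limit
    F = flux(h, hu)
    hn, hun = h.copy(), hu.copy()
    hn[1:-1] = 0.5*(h[:-2] + h[2:]) - dt/(2*dx)*(F[0][2:] - F[0][:-2])
    hun[1:-1] = 0.5*(hu[:-2] + hu[2:]) - dt/(2*dx)*(F[1][2:] - F[1][:-2])
    h, hu = hn, hun
    h[0], hu[0], h[-1], hu[-1] = h[1], hu[1], h[-2], hu[-2]  # open ends
    t += dt

print(f"depth range after {t_end:.0f} s: {h.min():.2f}-{h.max():.2f} m")
```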

  8. Utilization of sulfate additives in biomass combustion: fundamental and modeling aspects

    OpenAIRE

    Wu, Hao; Jespersen, Jacob Boll; Grell, Morten Nedergaard; Aho, Martti; Jappe Frandsen, Flemming; Glarborg, Peter

    2013-01-01

    Sulfates, such as ammonium sulfate, aluminum sulfate and ferric sulfate, are effective additives for converting the alkali chlorides released from biomass combustion to the less harmful alkali sulfates. Optimization of the use of these additives requires knowledge on their decomposition rate and product distribution under high temperature conditions. In the present work, the decomposition of ammonium sulfate, aluminum sulfate and ferric sulfate was studied respectively in a fast-heating rate t...

  9. Sustainable manufacturing: evaluation and modeling of environmental impacts in additive manufacturing

    OpenAIRE

    Le Bourhis, Florent; Kerbrat, Olivier; Hascoët, Jean-Yves; MOGNOL, Pascal

    2013-01-01

    International audience Cleaner production and sustainability are of crucial importance in the field of manufacturing processes where great amounts of energy and materials are being consumed. Nowadays, additive manufacturing technologies such as direct additive laser manufacturing allow us to manufacture functional products with high added value. Insofar as environmental considerations become an important issue in our society, as well as legislation regarding environment become prominent (N...

  10. Can an energy balance model provide additional constraints on how to close the energy imbalance?

    OpenAIRE

    Wohlfahrt, Georg; Widmoser, Peter

    2013-01-01

    Elucidating the causes for the energy imbalance, i.e. the phenomenon that eddy covariance latent and sensible heat fluxes fall short of available energy, is an outstanding problem in micrometeorology. This paper tests the hypothesis that the full energy balance, through incorporation of additional independent measurements which determine the driving forces of and resistances to energy transfer, provides further insights into the causes of the energy imbalance and additional constraints on ene...

  11. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Hu, Qinhong

    2016-07-31

    The additivity model assumed that field-scale reaction properties in a sediment including surface area, reactive site concentration, and reaction rate can be predicted from field-scale grain-size distribution by linearly adding reaction properties estimated in laboratory for individual grain-size fractions. This study evaluated the additivity model in scaling mass transfer-limited, multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of the rate constants for individual grain-size fractions, which were then used to predict rate-limited U(VI) desorption in the composite sediment. The result indicated that the additivity model with respect to the rate of U(VI) desorption provided a good prediction of U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The result found that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel-size fraction (2 to 8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.

  12. Bank Bailouts and Moral Hazard : Evidence from Germany

    NARCIS (Netherlands)

    Dam, L.; Koetter, M.

    2012-01-01

    We use a structural econometric model to provide empirical evidence that safety nets in the banking industry lead to additional risk taking. To identify the moral hazard effect of bailout expectations on bank risk, we exploit the fact that regional political factors explain bank bailouts but not ban

  13. Mechanics model of additional longitudinal force transmission between bridges and continuously welded rails with small resistance fasteners

    Institute of Scientific and Technical Information of China (English)

    徐庆元; 周小林; 曾志平; 杨小礼

    2004-01-01

    A new mechanics model, which reveals the additional longitudinal force transmission between continuously welded rails and bridges, is established by taking into account the mutual relative displacements among the rail, the sleeper and the beam. An example is presented and numerical results are compared. The results show that the additional longitudinal forces calculated with the new model are smaller than those of the previous model, especially in the case of flexible-pier bridges. The new model is also suitable for the analysis of additional longitudinal force transmission between rails and bridges for ballastless track with small-resistance fasteners, without taking the sleeper displacement into account; compared with ballasted bridges, ballastless bridges show much stronger additional longitudinal force transmission between the continuously welded rails and the bridges.

  14. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  15. Remote sensing and GIS-based landslide hazard analysis and cross-validation using multivariate logistic regression model on three test areas in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet

    2010-05-01

    This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis in the Penang, Cameron, and Selangor areas in Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map land cover and vegetation index, respectively. Maps of topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten factors which influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, land cover, rainfall precipitation, and normalized difference vegetation index (NDVI), were extracted from the spatial database, and the logistic regression coefficient of each factor was computed. The landslide hazard was then analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also using the logistic regression coefficients calculated from each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification of the model, the results of the analyses were compared with the field-verified landslide locations. Among the three cases of application of the logistic regression coefficients in the same study area, the case of Selangor based on the Selangor logistic regression coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest accuracy (86%). Similarly, among the six cases of cross-application of logistic regression coefficients in the other two areas, the case of Selangor based on the logistic coefficients of Cameron showed the highest (90%) prediction accuracy, whereas the case of Penang based on the Selangor logistic regression coefficients showed the lowest accuracy (79%). Qualitatively, the cross
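
    The core of such an analysis, fitting a logistic regression in one area and scoring another area with the same coefficients, can be sketched with scikit-learn on simulated per-pixel conditioning factors:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def synthetic_area(n=2000):
    """Simulated stand-in for per-pixel conditioning factors (slope, NDVI, ...)."""
    X = rng.normal(size=(n, 4))
    logits = 1.5*X[:, 0] - 1.0*X[:, 1] + 0.5*X[:, 2]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))  # landslide presence
    return X, y.astype(int)

X_a, y_a = synthetic_area()  # plays the role of, e.g., Selangor
X_b, y_b = synthetic_area()  # plays the role of, e.g., Penang

model = LogisticRegression().fit(X_a, y_a)
print("same-area AUC :", roc_auc_score(y_a, model.predict_proba(X_a)[:, 1]))
print("cross-area AUC:", roc_auc_score(y_b, model.predict_proba(X_b)[:, 1]))
```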

  16. A framework of integrated hydrological and hydrodynamic models using synthetic rainfall for flash flood hazard mapping of ungauged catchments in tropical zones

    OpenAIRE

    Lohpaisankrit, Worapong; Meon, Günter; Tingsanchali, Tawatchai

    2016-01-01

    Flash flood hazard maps provide a scientific support to mitigate flash flood risk. The present study develops a practical framework with the help of integrated hydrological and hydrodynamic modelling in order to estimate the potential flash floods. We selected a small pilot catchment which has already suffered from flash floods in the past. This catchment is located in the Nan River basin, northern Thailand. Reliable meteorological and hydrometric data are missing in the cat...

  17. Inclusion of Additional Plant Species and Trait Information in Dynamic Vegetation Modeling of Arctic Tundra and Boreal Forest Ecosystem

    Science.gov (United States)

    Euskirchen, E. S.; Patil, V.; Roach, J.; Griffith, B.; McGuire, A. D.

    2015-12-01

    Dynamic vegetation models (DVMs) have been developed to model the ecophysiological characteristics of plant functional types in terrestrial ecosystems. They have frequently been used to answer questions pertaining to processes such as disturbance, plant succession, and community composition under historical and future climate scenarios. While DVMs have proved useful in these types of applications, it has often been questioned if additional detail, such as including plant dynamics at the species-level and/or including species-specific traits would make these models more accurate and/or broadly applicable. A sub-question associated with this issue is, 'How many species, or what degree of functional diversity, should we incorporate to sustain ecosystem function in modeled ecosystems?' Here, we focus on how the inclusion of additional plant species and trait information may strengthen dynamic vegetation modeling in applications pertaining to: (1) forage for caribou in northern Alaska, (2) above- and belowground carbon storage in the boreal forest and lake margin wetlands of interior Alaska, and (3) arctic tundra and boreal forest leaf phenology. While the inclusion of additional information generally proved valuable in these three applications, this additional detail depends on field data that may not always be available and may also result in increased computational complexity. Therefore, it is important to assess these possible limitations against the perceived need for additional plant species and trait information in the development and application of dynamic vegetation models.

  18. TURBHO - Higher order turbulence modeling for industrial applications. Design document: Module Test Phase (MTP). Software engineering module: Additional physical models

    Energy Technology Data Exchange (ETDEWEB)

    Grotjans, H.

    1998-04-01

    In the current Software Engineering Module (SEM2), three additional test cases have been investigated, as listed in Chapter 2. For all test cases it has been shown that the computed results are grid independent; this has been done by systematic grid refinement studies. The main objective of the current SEM2 was the verification and validation of the new wall function implementation for the k-{epsilon} model and the SMC model. Analytical relations and experimental data have been used for comparison with the computational results, and the agreement is good. The correct implementation of the new wall function has therefore been demonstrated. As the results in this report have shown, a consistent grid refinement can be done for any test case. This is an important improvement for industrial applications, as no model-specific requirements must be considered during grid generation. (orig.)

  19. Seismic hazard assessment of Iran

    Directory of Open Access Journals (Sweden)

    M. Ghafory-Ashtiany

    1999-06-01

    Full Text Available The development of the new seismic hazard map of Iran is based on probabilistic seismic hazard computation using historical earthquake data, geology, tectonics, fault activity and seismic source models in Iran. These maps have been prepared to indicate the earthquake hazard of Iran in the form of iso-acceleration contour lines and seismic hazard zoning, using current probabilistic procedures. They display the probabilistic estimates of Peak Ground Acceleration (PGA) for return periods of 75 and 475 years. The maps have been divided into intervals of 0.25 degrees in both latitudinal and longitudinal directions to calculate the peak ground acceleration values at each grid point and draw the seismic hazard curves. The results presented in this study will provide the basis for the preparation of seismic risk maps, the estimation of earthquake insurance premiums, and the preliminary site evaluation of critical facilities.
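
    The quoted return periods translate into exceedance probabilities through the usual Poisson assumption, P = 1 - exp(-t/T) for an exposure time t and a mean return period T:

```python
import math

def exceedance_probability(return_period_yr: float, exposure_yr: float) -> float:
    """Poisson model: probability of at least one exceedance in t years
    of a ground-motion level with mean return period T."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

# The two return periods used for the Iran maps, over a 50-year exposure
print(f"75-yr level : {exceedance_probability(75, 50):.1%}")
print(f"475-yr level: {exceedance_probability(475, 50):.1%}")
```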

  20. Assessment of Chinese sturgeon habitat suitability in the Yangtze River (China): Comparison of generalized additive model, data-driven fuzzy logic model, and preference curve model

    Science.gov (United States)

    Yi, Yujun; Sun, Jie; Zhang, Shanghong; Yang, Zhifeng

    2016-05-01

    To date, a wide range of models have been applied to evaluate aquatic habitat suitability. In this study, three models, including the expert knowledge-based preference curve model (PCM), data-driven fuzzy logic model (DDFL), and generalized additive model (GAM), are used on a common data set to compare their effectiveness and accuracy. The true skill statistic (TSS) and the area under the receiver operating characteristics curve (AUC) are used to evaluate the accuracy of the three models. The results indicate that the two data-based methods (DDFL and GAM) yield better accuracy than the expert knowledge-based PCM, and the GAM yields the best accuracy. There are minor differences in the suitable ranges of the physical habitat variables obtained from the three models. The hydraulic habitat suitability index (HHSI) calculated by the PCM is the largest, followed by the DDFL and then the GAM. The results illustrate that data-based models can describe habitat suitability more objectively and accurately when there are sufficient data. When field data are lacking, combining expertise with data-based models is recommended. When field data are difficult to obtain, an expert knowledge-based model can be used as a replacement for the data-based methods.
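
    Both accuracy measures used in this comparison are straightforward to compute. A sketch with hypothetical presence/absence observations and model suitability scores:

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

def true_skill_statistic(y_true, y_pred):
    """TSS = sensitivity + specificity - 1 for binary presence/absence."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return tp / (tp + fn) + tn / (tn + fp) - 1.0

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])  # observed presence/absence
scores = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.35, 0.55, 0.3])

print("AUC:", roc_auc_score(y_true, scores))
print("TSS:", true_skill_statistic(y_true, (scores >= 0.5).astype(int)))
```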

  1. Generalized Additive Models for Location Scale and Shape (GAMLSS) in R

    Directory of Open Access Journals (Sweden)

    D. Mikis Stasinopoulos

    2007-11-01

    Full Text Available GAMLSS is a general framework for fitting regression-type models where the distribution of the response variable does not have to belong to the exponential family, and includes highly skew and kurtotic continuous and discrete distributions. GAMLSS allows all the parameters of the distribution of the response variable to be modelled as linear/non-linear or smooth functions of the explanatory variables. This paper starts by defining the statistical framework of GAMLSS, then describes the current implementation of GAMLSS in R, and finally gives four different data examples to demonstrate how GAMLSS can be used for statistical modelling.
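
    The defining feature of GAMLSS, modelling scale (and, in general, shape) alongside location, can be shown in miniature by maximising a Gaussian likelihood with linear predictors for the mean and the log standard deviation. This Python sketch is a conceptual illustration, not the R gamlss package:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 300)
y = 2.0 + 3.0*x + rng.normal(scale=np.exp(-1.0 + 1.5*x))  # heteroscedastic data

def nll(theta):
    """Gaussian negative log-likelihood with mu = a0 + a1*x and
    log(sigma) = b0 + b1*x (the log link keeps sigma positive)."""
    a0, a1, b0, b1 = theta
    mu, log_sigma = a0 + a1*x, b0 + b1*x
    return np.sum(log_sigma + 0.5 * ((y - mu) / np.exp(log_sigma))**2)

fit = minimize(nll, x0=np.zeros(4), method="BFGS")
print("location coefficients :", fit.x[:2])  # should recover ~(2, 3)
print("log-scale coefficients:", fit.x[2:])  # should recover ~(-1, 1.5)
```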

  2. Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)

    Science.gov (United States)

    Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián

    2015-04-01

    The Chiapas State, in southeastern Mexico, is a very seismically active region due to the interaction of three tectonic plates: North America, Cocos, and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the Chiapas State. The PSHA was based on a composite seismic catalogue homogenized to Mw and used a logic-tree procedure to consider different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Center and the Servicio Sismológico Nacional de México. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps. The SSZs were finally defined by the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were reviewed. For each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000-, and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain, and the Motagua and Polochic Fault Zone; intermediate hazard values occur in the Chiapas Batholith Zone and in the Strike-Slip Faults Province. The hazard decreases

  3. A multilevel excess hazard model to estimate net survival on hierarchical data allowing for non-linear and non-proportional effects of covariates.

    Science.gov (United States)

    Charvat, Hadrien; Remontet, Laurent; Bossard, Nadine; Roche, Laurent; Dejardin, Olivier; Rachet, Bernard; Launoy, Guy; Belot, Aurélien

    2016-08-15

    The excess hazard regression model is an approach developed for the analysis of cancer registry data to estimate net survival, that is, the survival of cancer patients that would be observed if cancer was the only cause of death. Cancer registry data typically possess a hierarchical structure: individuals from the same geographical unit share common characteristics such as proximity to a large hospital that may influence access to and quality of health care, so that their survival times might be correlated. As a consequence, correct statistical inference regarding the estimation of net survival and the effect of covariates should take this hierarchical structure into account. It becomes particularly important as many studies in cancer epidemiology aim at studying the effect on the excess mortality hazard of variables, such as deprivation indexes, often available only at the ecological level rather than at the individual level. We developed here an approach to fit a flexible excess hazard model including a random effect to describe the unobserved heterogeneity existing between different clusters of individuals, and with the possibility to estimate non-linear and time-dependent effects of covariates. We demonstrated the overall good performance of the proposed approach in a simulation study that assessed the impact on parameter estimates of the number of clusters, their size and their level of unbalance. We then used this multilevel model to describe the effect of a deprivation index defined at the geographical level on the excess mortality hazard of patients diagnosed with cancer of the oral cavity. Copyright © 2016 John Wiley & Sons, Ltd.
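
    The quantity being modelled, the excess hazard, is the observed mortality rate minus the rate expected from population life tables. A piecewise-constant sketch with invented registry numbers:

```python
import math

def excess_rate(deaths: float, person_years: float, expected_deaths: float) -> float:
    """Piecewise-constant excess mortality rate per person-year:
    observed deaths minus life-table-expected deaths, over person-time."""
    return (deaths - expected_deaths) / person_years

# Invented interval: 120 observed deaths over 950 person-years,
# of which 38 are expected from background (life-table) mortality
lam = excess_rate(120, 950.0, 38.0)
print(f"excess hazard ~ {lam:.3f} /person-year")
print(f"1-year net survival ~ {math.exp(-lam):.3f}")  # constant-hazard interval
```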

  5. Additive effects of dietary glycotoxins and androgen excess on the kidney of a female rat model

    Directory of Open Access Journals (Sweden)

    Sotiria Palimeri

    2016-06-01

    Conclusions: The above-mentioned data suggest that dietary glycotoxins, in combination with increased androgen exposure, exert a more profound negative impact on the kidney of an androgenized female rat model that mimics the metabolic characteristics of polycystic ovary syndrome.

  6. Semi-Parametric, Generalized Additive Vector Autoregressive Models of Spatial Price Dynamics

    OpenAIRE

    Guney, Selin; Barry K. Goodwin

    2013-01-01

    An extensive empirical literature addressing the behavior of prices over time and across spatially distinct markets has grown substantially over time. A fundamental axiom of economics--the "Law of One Price"--underlies the arbitrage behavior thought to characterize such relationships. This literature has progressed from a simple consideration of correlation coefficients and linear regression models to classes of models that address particular time series properties of price data and consider nonl...

  7. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models

    OpenAIRE

    Fan, Jianqing; Feng, Yang; Song, Rui

    2009-01-01

    A variable screening procedure via correlation learning was proposed in Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening is called NIS, a specific member of the sure independence screening. Several closely related variable screening p...
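
    The advantage of marginal nonparametric screening over plain correlation screening appears whenever a relevant feature acts nonlinearly. In the simulation below, a crude per-feature cubic-fit R² stands in for the spline smoothers used in NIS; feature 1 has essentially zero linear correlation with the response yet is ranked highly:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 1000
X = rng.normal(size=(n, p))
y = np.sin(2*X[:, 0]) + X[:, 1]**2 + 0.1*rng.normal(size=n)  # sparse, nonlinear

def marginal_r2(xj, y, degree=3):
    """Crude marginal nonparametric learner: R^2 of a low-degree
    polynomial fit of y on a single feature."""
    resid = y - np.polyval(np.polyfit(xj, y, degree), xj)
    return 1.0 - resid.var() / y.var()

scores = np.array([marginal_r2(X[:, j], y) for j in range(p)])
print("top-ranked features:", np.argsort(scores)[::-1][:10])  # expect 0 and 1
```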

  8. Possibilities of Preoperative Medical Models Made by 3D Printing or Additive Manufacturing

    Science.gov (United States)

    2016-01-01

    Most of the 3D printing applications of preoperative models have focused on the dental and craniomaxillofacial area. The purpose of this paper is to demonstrate the possibilities in other application areas and give examples of the current possibilities. The approach was to communicate with surgeons from different fields about their needs related to preoperative models and to try to produce preoperative models that satisfy those needs. Ten different examples of possibilities were selected for this paper, and aspects related to imaging, 3D model reconstruction, 3D modeling, and 3D printing were presented. Examples were heart, ankle, backbone, knee, and pelvis, with different processes and materials. Software types required were Osirix, 3Data Expert, and Rhinoceros. The 3D printing processes used were binder jetting and material extrusion. This paper presents a wide range of possibilities related to 3D printing of preoperative models. Surgeons should be aware of the new possibilities, and in most cases help from the mechanical engineering side is needed.

  9. Possibilities of Preoperative Medical Models Made by 3D Printing or Additive Manufacturing

    Directory of Open Access Journals (Sweden)

    Mika Salmi

    2016-01-01

    Full Text Available Most of the 3D printing applications of preoperative models have focused on the dental and craniomaxillofacial area. The purpose of this paper is to demonstrate the possibilities in other application areas and give examples of the current possibilities. The approach was to communicate with surgeons from different fields about their needs related to preoperative models and to try to produce preoperative models that satisfy those needs. Ten different examples of possibilities were selected for this paper, and aspects related to imaging, 3D model reconstruction, 3D modeling, and 3D printing were presented. Examples were heart, ankle, backbone, knee, and pelvis, with different processes and materials. Software types required were Osirix, 3Data Expert, and Rhinoceros. The 3D printing processes used were binder jetting and material extrusion. This paper presents a wide range of possibilities related to 3D printing of preoperative models. Surgeons should be aware of the new possibilities, and in most cases help from the mechanical engineering side is needed.

  13. Guarana Provides Additional Stimulation over Caffeine Alone in the Planarian Model

    OpenAIRE

    Dimitrios Moustakas; Michael Mezzio; Branden R Rodriguez; Mic Andre Constable; Mulligan, Margaret E.; Voura, Evelyn B.

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of g...

  14. Ten-year-old children strategies in mental addition: A counting model account.

    Science.gov (United States)

    Thevenot, Catherine; Barrouillet, Pierre; Castel, Caroline; Uittenhove, Kim

    2016-01-01

    For more than 30 years, it has been accepted that individuals from the age of 10 mainly retrieve the answers to simple additions from long-term memory, at least when the sum does not exceed 10. Nevertheless, recent studies challenge this assumption and suggest that expert adults use fast, compacted and unconscious procedures to solve very simple problems such as 3+2. If this is true, automated procedures should be rooted in earlier strategies and therefore observable in their non-compacted form in children. Thus, contrary to the dominant theoretical position, children's behaviors should not reflect retrieval. This is precisely what we observed in analyzing the response times of a sample of 42 10-year-old children who solved additions with operands from 1 to 9. Our results converge on the conclusion that 10-year-old children still use counting procedures to solve non-tie problems involving operands from 2 to 4. Moreover, these counting procedures appear regardless of the children's expertise; children differ only in their speed of execution. Therefore, and contrary to the dominant position in the literature according to which children's strategies evolve from counting to retrieval, the key change in the development of mental addition is a shift from slow to quick counting procedures.
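
    A minimal sketch of the counting signature described above: under a counting account, solution time for a non-tie problem grows roughly linearly with the smaller operand, whereas pure retrieval predicts a flat profile. The data, intercept, and per-count cost below are synthetic assumptions, not the study's measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical solution times for non-tie problems with operands 1-9
    problems = [(a, b) for a in range(1, 10) for b in range(1, 10) if a != b]
    min_op = np.array([min(a, b) for a, b in problems], dtype=float)

    base_rt, step = 900.0, 120.0      # assumed intercept (ms) and per-count cost (ms)
    rt = base_rt + step * min_op + rng.normal(0.0, 60.0, size=min_op.size)

    # Regress solution time on the smaller operand
    slope, intercept = np.polyfit(min_op, rt, deg=1)
    print(f"slope = {slope:.1f} ms per unit of min operand")
    print(f"intercept = {intercept:.1f} ms")
    # A slope reliably above zero is the classic signature of a counting
    # procedure; faster children would show a smaller slope (quicker counting).
    ```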

  15. Strengthen forensic entomology in court--the need for data exploration and the validation of a generalised additive mixed model.

    Science.gov (United States)

    Baqué, Michèle; Amendt, Jens

    2013-01-01

    Developmental data on juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models therefore do not yield sufficiently reliable age estimates and may even lead to an erroneous determination of the PMI(min). In line with the Daubert standard and the need for improvements in forensic science, newer statistical tools such as smoothing methods and mixed models allow non-linear relationships to be modelled and expand the range of available analyses. The present study introduces the background and application of these statistical techniques by analysing a model that describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data, to assure their quality and to show the importance of checking them carefully before conducting statistical tests, and on the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets, using generalised additive mixed models for the first time.
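
    As a simplified stand-in for the generalised additive model component, the sketch below contrasts a straight-line fit with a cubic smoothing spline on synthetic growth data; the mixed-model part (random effects for, e.g., replicates) would need dedicated tooling such as R's mgcv. The growth curve and noise level are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    rng = np.random.default_rng(1)

    # Hypothetical larval length (mm) as a saturating function of age (hours)
    age = np.linspace(10, 200, 80)
    length = 17.0 / (1.0 + np.exp(-(age - 80.0) / 25.0)) + rng.normal(0, 0.5, age.size)

    # Straight-line fit versus a cubic smoothing spline
    lin = np.polyval(np.polyfit(age, length, deg=1), age)
    spline = UnivariateSpline(age, length, k=3, s=age.size * 0.25)(age)

    rss_lin = np.sum((length - lin) ** 2)
    rss_spl = np.sum((length - spline) ** 2)
    print(f"RSS, linear fit: {rss_lin:.1f}")
    print(f"RSS, spline fit: {rss_spl:.1f}")  # lower: the growth curve is non-linear
    ```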

  16. Experiments to Populate and Validate a Processing Model for Polyurethane Foam: Additional Data for Structural Foams.

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R.; Celina, Mathias C.; Giron, Nicholas Henry; Long, Kevin Nicholas; Russick, Edward M.

    2015-01-01

    We are developing computational models to help understand the manufacturing processes, final properties, and aging of the structural foam polyurethane PMDI. The resulting model predictions of density and cure gradients from the manufacturing process will be used as input to foam heat transfer and mechanical models. BKC 44306 PMDI-10 and BKC 44307 PMDI-18 are the most prevalent foams used in structural parts. Experiments needed to parameterize models of the reaction kinetics and the equations of motion during the foam blowing stages were described for BKC 44306 PMDI-10 in the first report of this series (Mondy et al. 2014). BKC 44307 PMDI-18 is a new foam that will be used to make relatively dense structural supports via overpacking. It uses a different catalyst than those in the BKC 44306 family of foams; hence, we expect that the reaction kinetics models must be modified. Here we detail the experiments needed to characterize the reaction kinetics of BKC 44307 PMDI-18 and suggest parameters for the model based on these experiments. In addition, the second part of this report describes data taken to provide input to the preliminary nonlinear viscoelastic structural response model developed for BKC 44306 PMDI-10 foam. We show that the standard cure schedule used by KCP does not fully cure the material and that, upon temperature elevation above 150 °C, oxidation or decomposition reactions occur that alter the composition of the foam. These findings suggest that achieving a fully cured foam part with this formulation may not be possible through thermal curing. As such, viscoelastic characterization procedures developed for curing thermosets can provide only approximate material properties, since the state of the material continuously evolves during tests.
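
    For readers unfamiliar with cure-kinetics models of the kind the report parameterizes, the sketch below integrates a generic n-th order Arrhenius cure law; it is not the report's PMDI model, and all parameter values are hypothetical. It illustrates how a cure schedule that stays too cool leaves the degree of cure well below 1.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical kinetic parameters: pre-exponential (1/s), activation
    # energy (J/mol), gas constant (J/mol/K), reaction order
    A, Ea, R, n = 1.0e7, 7.0e4, 8.314, 1.5

    def cure_rate(t, alpha, T_kelvin):
        """n-th order Arrhenius cure law: d(alpha)/dt = k(T) * (1 - alpha)^n."""
        k = A * np.exp(-Ea / (R * T_kelvin))
        return [k * max(1.0 - alpha[0], 0.0) ** n]

    for T_c in (40.0, 120.0):         # two assumed isothermal cure temperatures
        sol = solve_ivp(cure_rate, (0.0, 4 * 3600.0), [0.0],
                        args=(T_c + 273.15,), max_step=60.0)
        print(f"T = {T_c:5.1f} C -> degree of cure after 4 h: {sol.y[0, -1]:.3f}")
    # The cooler schedule leaves alpha far below 1, echoing the report's finding
    # that the standard schedule does not fully cure the material.
    ```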

  17. 77 FR 37961 - Hazardous Materials: Incorporating Rail Special Permits Into the Hazardous Materials Regulations

    Science.gov (United States)

    2012-06-25

    ... transport hazardous materials by railcar. We also proposed requirements concerning the generation of residue... commenters note that the UN Model Regulations, ICAO Technical Instructions, and International Maritime... authorizes a rail freight carrier to accept hazardous materials shipping paper information by...

  18. Using Swiss Webster mice to model Fetal Alcohol Spectrum Disorders (FASD): An analysis of multilevel time-to-event data through mixed-effects Cox proportional hazards models.

    Science.gov (United States)

    Chi, Peter; Aras, Radha; Martin, Katie; Favero, Carlita

    2016-05-15

    Fetal Alcohol Spectrum Disorders (FASD) collectively describes the constellation of effects resulting from human alcohol consumption during pregnancy. Even with public awareness, the incidence of FASD is estimated at upwards of 5% in the general population, and it is becoming a global health problem. The physical, cognitive, and behavioral impairments of FASD are recapitulated in animal models. Recently, rodent models utilizing voluntary drinking paradigms have been developed that accurately reflect moderate consumption, which makes up the majority of FASD cases. The range in severity of FASD characteristics reflects the frequency, dose, developmental timing, and individual susceptibility to alcohol exposure. As most rodent models of FASD use C57BL/6 mice, there is a need to expand the stocks of mice studied in order to more fully understand the complex neurobiology of this disorder. To that end, we allowed pregnant Swiss Webster mice to drink ethanol voluntarily via the drinking-in-the-dark (DID) paradigm throughout their gestation period. Ethanol exposure did not alter gestational outcomes, as evidenced by the absence of significant differences in maternal weight gain, maternal liquid consumption, litter size, or pup weight at birth or weaning. Despite seemingly normal gestation, ethanol-exposed offspring exhibited significantly altered timing to achieve developmental milestones (surface righting, cliff aversion, and open-field traversal), as analyzed through mixed-effects Cox proportional hazards models. These results confirm Swiss Webster mice as a viable option for studying the incidence and causes of ethanol-induced neurobehavioral alterations during development. Future studies in our laboratory will investigate the brain regions and molecules responsible for these behavioral changes. PMID:26765502
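
    The study's mixed-effects Cox model (a random litter effect) is typically fit with dedicated tools such as R's coxme. As a rough approximation in Python, lifelines can fit a standard Cox model with litter-clustered standard errors; the sketch below does that on synthetic data, and the column names and effect sizes are assumptions, not the study's.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n_litters, pups = 12, 8

    litter = np.repeat(np.arange(n_litters), pups)
    exposed = np.repeat(rng.integers(0, 2, n_litters), pups)    # exposure assigned per litter
    frailty = np.repeat(rng.normal(0.0, 0.3, n_litters), pups)  # latent shared litter effect

    # Days to achieve a milestone (e.g. surface righting); exposure delays it
    hazard = np.exp(-0.7 * exposed + frailty)
    days = rng.exponential(1.0 / hazard) * 4.0 + 2.0

    df = pd.DataFrame({"days": days, "achieved": 1,
                       "exposed": exposed, "litter": litter})

    # cluster_col requests sandwich (litter-robust) standard errors
    cph = CoxPHFitter()
    cph.fit(df, duration_col="days", event_col="achieved", cluster_col="litter")
    cph.print_summary()  # hazard ratio < 1: exposed pups reach the milestone later
    ```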

  19. Efectivity of Additive Spline for Partial Least Square Method in Regression Model Estimation

    Directory of Open Access Journals (Sweden)

    Ahmad Bilfarsah

    2005-04-01

    Full Text Available The Additive Spline Partial Least Squares (ASPLS) method is a generalization of the Partial Least Squares (PLS) method. The ASPLS method can accommodate non-linearity and multicollinearity among the predictor variables. In principle, the ASPLS approach is characterized by two ideas: the first is to apply parametric transformations to the predictors via spline functions; the second is to make the ASPLS components mutually uncorrelated, preserving the properties of the linear PLS components. The performance of ASPLS compared with other PLS methods is illustrated with an application to fishery economics, specifically tuna production.
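
    The two ideas above can be approximated, though not reproduced, in scikit-learn by spline-transforming the predictors and then extracting PLS components from the expanded design. The sketch below uses synthetic data and mimics the spirit of the method, not the paper's exact algorithm.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import SplineTransformer

    rng = np.random.default_rng(3)

    # Synthetic correlated predictors with a non-linear effect on the response
    n = 200
    x1 = rng.uniform(0, 3, n)
    x2 = 0.8 * x1 + rng.normal(0, 0.2, n)          # multicollinear with x1
    X = np.column_stack([x1, x2])
    y = np.sin(2.0 * x1) + 0.3 * x2 + rng.normal(0, 0.1, n)

    # Linear PLS versus spline-expanded PLS on the same data
    linear_pls = PLSRegression(n_components=2).fit(X, y)
    spline_pls = make_pipeline(SplineTransformer(degree=3, n_knots=6),
                               PLSRegression(n_components=2)).fit(X, y)

    print(f"R^2, linear PLS: {linear_pls.score(X, y):.3f}")
    print(f"R^2, spline+PLS: {spline_pls.score(X, y):.3f}")  # higher: captures non-linearity
    ```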

  20. Additional disinfection with a modified salt solution in a root canal model

    NARCIS (Netherlands)

    S.V. van der Waal; C.A.M. Oonk; S.H. Nieman; P.R. Wesselink; J.J. de Soet; W. Crielaard

    2015-01-01

    Objectives: The aim of this study is to investigate the disinfecting properties of a modified salt solution (MSS) and of calcium hydroxide (Ca(OH)2) in a non-direct-contact ex vivo model. Methods: Seventy-four single-canal roots infected with Enterococcus faecalis were treated with 1% sodium hypochlorite