WorldWideScience

Sample records for hazard analysis process

  1. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  2. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  3. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to performance of the Final Safety Analysis Report performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  4. Integrating human factors into process hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kariuki, S.G. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany); Loewe, K. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany)]. E-mail: katharina.loewe@tu-berlin.de

    2007-12-15

A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. The approach is deductive in nature and therefore considers human error as a top event; the combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has the advantage that it does not regard operator error as the sole contributor to human failure within a system, but rather as the result of a combination of underlying factors.
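The deductive, top-event structure described above resembles a small fault-tree calculation. A minimal sketch follows; the gate structure and every probability value are illustrative assumptions, not figures from the paper:

```python
def gate_and(probs):
    """AND gate: all contributing factors must occur (independence assumed)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def gate_or(probs):
    """OR gate: at least one contributing factor occurs (independence assumed)."""
    none_occur = 1.0
    for p in probs:
        none_occur *= (1.0 - p)
    return 1.0 - none_occur

# Top event "human error in process design" as an OR of two AND-ed
# factor combinations (hypothetical factors and probabilities):
p_top = gate_or([
    gate_and([0.10, 0.30]),  # e.g. time pressure AND ambiguous procedure
    gate_and([0.05, 0.20]),  # e.g. poor interface AND inadequate training
])
```

Evaluating the tree bottom-up this way mirrors the qualitative analysis of factor combinations; in an actual PHA the factors would come from structured identification, not assumed values.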

  5. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    Energy Technology Data Exchange (ETDEWEB)

    SHULTZ MV

    2008-05-15

CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment or require emergency response, in addition to those with significant impact to the facility worker, the offsite receptor, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety, but also for worker-safety management programs, emergency management, and environmental programs. This paper also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  6. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  7. Choosing Appropriate Hazards Analysis Techniques For Your Process

    Science.gov (United States)

    1996-08-21

29 CFR 1910.119 permits the following process hazard analysis methodologies: (i) Checklist; (ii) What-If; (iii) What-If/Checklist; (iv) Hazard and Operability Study (HAZOP); (v) Failure Mode and Effects Analysis (FMEA); (vi) Fault Tree Analysis; or (vii) an appropriate equivalent methodology. The simpler methods require less effort than the other methods and are more appropriate for a simple process. The HAZOP has found much use in the petroleum and chemical industries.

  8. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, Robert D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
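The Monte Carlo relationship between a design safety factor and process reliability can be sketched as follows; the distributions, parameter values, and the 70% nominal removal efficiency are illustrative assumptions, not values from the study:

```python
import random

def treatment_reliability(safety_factor, n_trials=100_000, seed=1):
    """Estimate the probability that a generic treatment unit meets a
    required removal efficiency, given a design safety factor.
    Distributions and parameters are illustrative only."""
    rng = random.Random(seed)
    met = 0
    for _ in range(n_trials):
        # Required removal varies with the (random) influent load.
        required = min(0.99, rng.lognormvariate(-0.36, 0.25))
        # Achieved removal: nominal design capacity scaled by the safety
        # factor, degraded by random operational variability.
        achieved = min(0.99, 0.70 * safety_factor * rng.gauss(1.0, 0.10))
        met += achieved >= required
    return met / n_trials

low = treatment_reliability(1.0)    # no safety margin
high = treatment_reliability(1.5)   # 50% safety margin
```

The point of the sketch is the shape of the relationship: simulating many operating conditions lets reliability be read off as a frequency, so the effect of raising the safety factor can be quantified rather than asserted.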

  9. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential for resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are associated with notable tool wear. On the other hand, thermal processing methods are critical, as the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damage of the composite structure in terms of pores or delamination. An emerging innovative method for processing CFRP materials is laser technology. As a principally thermal method, laser processing is connected with the release of potentially hazardous gaseous and particulate substances. Detailed knowledge of these process emissions is the basis for ensuring the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) under the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  10. Example process hazard analysis of a Department of Energy water chlorination process

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of 29 CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in the DOE, the PSM Rule is mandatory in the DOE complex. A major element of the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18-21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  11. Hazard analysis and critical control point (HACCP) for an ultrasound food processing operation.

    Science.gov (United States)

    Chemat, Farid; Hoarau, Nicolas

    2004-05-01

Emerging technologies, such as ultrasound (US), used for food and drink production often introduce hazards for product safety. Classical quality control methods are inadequate to control these hazards. Hazard analysis and critical control point (HACCP) is the most secure and cost-effective method for controlling possible product contamination or cross-contamination due to physical or chemical hazards during production. The following case study on the application of HACCP to an US food-processing operation demonstrates how the hazards at the critical control points of the process are effectively controlled through the implementation of HACCP.

  12. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

Software in PLCs and FPGAs used to develop I and C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and indicates that software hazard analysis should be performed across the phases of the software life cycle, such as requirements analysis, design, detailed design, and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430, and it is a useful technique for applying guide phrases; HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC development in Korean nuclear power plant software. In that work, appropriate guide phrases and an analysis process were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches: NUREG/CR-6430, and HAZOP with general guide words. We also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with HAZOP using general guide words, and it is applicable to analyzing the software requirements specification of an FPGA.

  13. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GRAMS, W.H.

    2000-12-28

The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: data from the results of the hazard evaluations; and (2) Hazard Topography Database: data from the system familiarization and hazard identification.

  14. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database: data from the results of the hazard evaluations; and (2) Hazard Topography Database: data from the system familiarization and hazard identification.

  15. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    Science.gov (United States)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

incorporated with additional GIS and statistical data into a comprehensive property-by-property geodatabase of the existing elements and values. This stock-of-elements-and-values geodatabase is furthermore the consistent basis for all natural hazard analyses and enables comparison of the results. The study follows the generally accepted modules of (i) hazard analysis, (ii) exposition analysis, and (iii) consequence analysis, whereby the exposition analysis estimates the elements at risk with their corresponding damage potentials and the consequence analysis estimates the PMLs. This multi-hazard analysis focuses on process types with a high to extreme potential for negative consequences on a regional scale. In this context, (i) floods, (ii) rockslides with the potential for corresponding consequence effects (backwater ponding and outburst floods), (iii) earthquakes, (iv) hail events, and (v) winter storms were considered as hazard processes. Based on general hazard analyses (hazard maps), concrete scenarios and their spatial extents were determined. For the different hazard processes, different vulnerability approaches were considered to demonstrate their sensitivity and implications for the results. Thus, no absolute loss values but probable loss ranges were estimated. It can be shown that the most serious losses would arise from extreme earthquake events, with loss burdens of up to more than € 7 bn. on buildings and inventory alone. Possible extreme flood events could lead to losses between € 2 and 2.5 bn., whereas a severe hail swath affecting the central Inn valley could result in losses of ca. € 455 mill. (thereof € 285 mill. on vehicles). The potentially most serious rockslide with additional consequence effects would result in losses of up to ca. € 185 mill., and extreme winter storms can induce losses between € 100 mill. and € 150 mill.

  16. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

In the last decades, critical systems have increasingly been developed using computers and software, even in the space area, where the project approach is usually very conservative. In projects for rockets, satellites and their facilities, such as ground support systems and simulators, among other critical operations for the space mission, a hazard analysis must be applied. The ELICERE process was created to perform hazard analysis mainly on critical computer systems, in order to define or evaluate their safety and dependability requirements, and is strongly based on the Hazard and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or to understand the potential hazards of existing systems, improving their functional and non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. At the beginning, the process was operated manually in a gradual way. A software tool called PRO-ELICERE has now been developed to facilitate the analysis process and store the results for reuse in other system analyses. To illustrate how ELICERE and its tool work, a small space study case was applied, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  17. Analysis of aerosol emission and hazard evaluation of electrical discharge machining (EDM) process.

    Science.gov (United States)

    Jose, Mathew; Sivapirakasam, S P; Surianarayanan, M

    2010-01-01

The safety and environmental aspects of a manufacturing process are important due to increased environmental regulation and life quality. In this paper, the concentration of aerosols in the breathing zone of the operator of Electrical Discharge Machining (EDM), a commonly used non-traditional manufacturing process, is presented. The pattern of aerosol emissions from this process was evaluated with varying process parameters such as peak current, pulse duration, dielectric flushing pressure and the level of dielectric. Further, the HAZOP technique was employed to identify the inherent safety aspects and fire risk of the EDM process under different working conditions. The analysis of aerosol exposure showed that the concentration of aerosols increased with increases in peak current, pulse duration and dielectric level, and decreased with increasing flushing pressure. It was also found that at higher values of peak current (7 A) and pulse duration (520 µs), the concentration of aerosols in the breathing zone of the operator was above the permissible exposure limit for respirable particulates (5 mg/m³). The HAZOP study of the EDM process showed that this process is vulnerable to fire and explosion hazards. A detailed discussion on preventing the fire and explosion hazard is presented in this paper. The emissions and fire risk of the EDM process can be minimized by selecting proper process parameters and employing an appropriate control strategy.

  18. Job Hazard Analysis

    National Research Council Canada - National Science Library

    1998-01-01

Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...

  19. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, H; Johnson, G

    2001-07-20

The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in Table 4-2 of the HEAF SAR (Ref. 1). Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for the HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree; bold downward branches indicate paths leading to the accident. The basic causes include conditions, failures of administrative controls (procedural or human error events) or failures of engineered controls (hardware, software or equipment failures) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events needed to cause an accident. Event trees can address statistical dependency of events, such as a sequence of human error events conducted by the same operator; in this case, dependent probabilities are used. Probabilities/frequencies are assigned to each branch. Another example of dependency would be when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
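The final step described above, multiplying the branch probabilities along a path to obtain the end-state frequency, can be sketched as follows; the operation frequency and branch probabilities are hypothetical, not taken from the HEAF analysis:

```python
def end_state_frequency(initiator_per_year, branch_probs):
    """Annual frequency of an event-tree end state: the initiating-event
    frequency multiplied by the conditional probability of each downward
    (failure) branch along the path to the accident."""
    freq = initiator_per_year
    for p in branch_probs:
        freq *= p
    return freq

# Hypothetical sequence: an operation performed 50 times per year, a
# procedural error (1e-2), then failure of an engineered control (1e-3).
f = end_state_frequency(50, [1e-2, 1e-3])  # 5e-4 per year
```

For dependent events (e.g. successive errors by the same operator), the entries in `branch_probs` would be conditional probabilities given the preceding branches, as the abstract notes.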

  20. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  1. Hazardous waste characterization among various thermal processes in South Korea: a comparative analysis.

    Science.gov (United States)

    Shin, Sun Kyoung; Kim, Woo-Il; Jeon, Tae-Wan; Kang, Young-Yeul; Jeong, Seong-Kyeong; Yeon, Jin-Mo; Somasundaram, Swarnalatha

    2013-09-15

The Ministry of Environment, Republic of Korea (South Korea) is in the process of converting its current hazardous waste classification system to harmonize it with the international standard and to set up regulatory standards for toxic substances present in hazardous waste. In the present work, the concentrations and trends of 13 heavy metals, F−, CN− and 19 PAHs present in the hazardous waste generated by various thermal processes (11 processes) in South Korea were analyzed, along with their leaching characteristics. In all thermal processes, the median concentrations of Cu (3.58-209,000 mg/kg), Ni (BDL-1,560 mg/kg), Pb (7.22-5,132.25 mg/kg) and Zn (83.02-31,419 mg/kg) were comparatively higher than those of the other heavy metals. The iron and steel thermal process showed the highest median values of the heavy metals Cd (14.76 mg/kg), Cr (166.15 mg/kg) and Hg (2.38 mg/kg). Low-molecular-weight PAHs (BDL-37.59 mg/kg) were predominant in the sludge and filter cake samples from most of the thermal processes. Comparatively, flue gas dust from most of the thermal processing units resulted in higher leaching of the heavy metals.

  2. Flood hazards analysis based on changes of hydrodynamic processes in fluvial systems of Sao Paulo, Brazil.

    Science.gov (United States)

    Simas, Iury; Rodrigues, Cleide

    2016-04-01

The metropolis of Sao Paulo, with its 7,940 km² and over 20 million inhabitants, is increasingly being consolidated with disregard for the dynamics of its fluvial systems and the natural limitations imposed by fluvial terraces, floodplains and slopes. Events such as floods and flash floods have become particularly persistent, mainly in socially and environmentally vulnerable areas. The Aricanduva River basin was selected as the ideal area for the development of the flood hazard analysis since it presents the main geological and geomorphological features found in the urban site. According to studies carried out under the Anthropic Geomorphology approach in São Paulo, studying this phenomenon requires taking into account the original hydromorphological systems and their functional conditions, as well as the dimensions in which the anthropic factor changes the balance between the main variables of surface processes. Considering those principles, an alternative model of geographical data was proposed that made it possible to identify the role of different driving forces in the spatial conditioning of certain flood events. Spatial relationships between different variables, such as anthropogenic and original morphology, were analyzed for that purpose, in addition to climate data. The surface hydrodynamic tendency spatial model conceived for this study takes as key variables: 1- the land use present at the observed date combined with the predominant lithological group, represented by a value ranging 0-100, based on indexes of the National Soil Conservation Service (NSCS-USA) and the Hydraulic Technology Center Foundation (FCTH-Brazil), to determine the resulting balance of runoff/infiltration; 2- the original slope, applying thresholds from which it is possible to determine a greater tendency for runoff (in percent); 3- the minimal features of relief, combining the curvature of the surface in plan and profile. These three key variables were combined in a Geographic Information System in a series of
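A per-cell combination of the three key variables listed above might look like the following sketch; the weights, the slope threshold, and the scoring scheme are illustrative assumptions, not the authors' model:

```python
def runoff_tendency(cn_index, slope_pct, curvature_class, slope_threshold=8.0):
    """Combine the three key variables into a simple per-cell
    runoff-tendency score in [0, 1].

    cn_index        -- land use x lithology index, 0-100 (higher = more runoff)
    slope_pct       -- original slope in percent
    curvature_class -- qualitative relief feature, e.g. "convergent" or "planar"

    All weights and the 8% slope threshold are hypothetical.
    """
    score = cn_index / 100.0
    if slope_pct >= slope_threshold:       # steep slope favours runoff
        score += 0.2
    if curvature_class == "convergent":    # converging relief concentrates flow
        score += 0.1
    return min(score, 1.0)

# A steep, convergent cell on a high runoff/infiltration index:
cell = runoff_tendency(cn_index=85, slope_pct=12.0, curvature_class="convergent")
```

In a GIS workflow, this function would be applied raster-cell by raster-cell, producing the tendency surface that the flood scenarios are then overlaid on.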

  3. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    Science.gov (United States)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions. Hence, relations between process intensities and the extent of losses, gathered through the analysis of historic hazard events and information on object-specific restoration values, are used. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are gathered quantitatively and in spatially distributed form by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, also including openings for material intrusion. 
Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  4. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena or external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  5. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    Energy Technology Data Exchange (ETDEWEB)

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees. (3) Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  6. An overview of process hazard evaluation techniques.

    Science.gov (United States)

    Gressel, M G; Gideon, J A

    1991-04-01

    Since the 1984 release of methyl isocyanate in Bhopal, India, which killed thousands, the chemical industry has begun to use process hazard analysis techniques more widely to protect the public from catastrophic chemical releases. These techniques can provide a systematic method for evaluating a system design to ensure that it operates as intended, help identify process areas that may result in the release of a hazardous chemical, and help suggest modifications to improve process safety. Eight different techniques are discussed, with some simple examples of how they might be applied. These techniques include checklists, "what if" analysis, safety audits and reviews, preliminary hazard analysis (PHA), failure modes and effects analysis (FMEA), fault tree analysis (FTA), event tree analysis (ETA), and hazard and operability studies (HAZOP). The techniques vary in sophistication and scope, and no single one will always be the best. These techniques can also provide the industrial hygienist with the tools needed to protect both workers and the community from both major and small-scale chemical releases. A typical industrial hygiene evaluation of a facility would normally include air sampling. If the air sampling does detect a specific hazardous substance, the source will probably be a routine or continuous emission. However, air sampling will not be able to reliably identify or predict the location of a nonroutine emission. By incorporating these techniques into typical evaluations, industrial hygienists can proactively help reduce the hazards to the workers they serve.
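    Of the techniques listed, fault tree analysis is the most directly quantifiable: assuming independent basic events, probabilities combine as a product through AND gates and as the complement of the product of complements through OR gates. A minimal sketch with hypothetical event probabilities:

```python
from functools import reduce

def and_gate(probs):
    """All inputs must fail (independent events): product of probabilities."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """Any input failing suffices: 1 minus the product of survival probabilities."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical top event: "toxic release" =
#   (relief valve fails OR operator error) AND alarm system fails
release = and_gate([or_gate([1e-3, 5e-3]), 1e-2])
print(f"top event probability: {release:.3g}")
```

    The same two gate functions can be nested to arbitrary depth to evaluate any coherent fault tree, as long as the independence assumption holds.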

  7. The impact of overlapping processes on rockfall hazard analysis - the Bolonia Bay study (southern Spain)

    Science.gov (United States)

    Fernandez-Steeger, T.; Grützner, C.; Reicherter, K.; Braun, A.; Höbig, N.

    2009-04-01

    For rockfall simulations, comparable case studies and data sets are important to develop and evaluate the models or software. Especially for empirical or data-driven stochastic modelling, the quality of the reference data sets has a major impact on model skill and knowledge discovery. Therefore, rockfalls in Bolonia Bay close to Tarifa (Spain) were mapped. Here, the siliciclastic Miocene rocks (megaturbidites) are intensively jointed and disaggregated by a perpendicular joint system. Although bedding supports stability, as the dip is not directed towards the rock face, the deposits indicate a continuous process of material loss from the 80 m high cliff of the San Bartolome mountain front by single large rockfalls. For more than 300 blocks, data on size, shape, type of rock, and location were collected. The work concentrated on rockfall blocks with a volume of more than 2 m³ and up to 350 m³. Occasionally, very long runout distances of up to 2 km have been observed. For all major source areas and deposits, runout analysis using empirical models and a numerical trajectory model has been performed. Most empirical models are principally based on the relation between fall height and travel distance. Besides the "Fahrböschung" of Heim (1932), the "shadow angle" introduced by Evans and Hungr (1993) is most common today. However, studies from different sites show a wide variance in the angle relations (Dorren 2003, Corominas 1996). The reasons for this might be different environments and trigger mechanisms, or varying secondary effects such as post-depositional movement. Today, "semi-numerical" approaches based on trajectory models are quite common to evaluate the rockfall energy and the runout distance for protection measures and risk evaluations. The results of the models highly depend on the quality of the input parameters. One problem here might be that some of the parameters, especially the dynamic ones, are not easy to determine and the quality of the

  8. Counterfactual Volcano Hazard Analysis

    Science.gov (United States)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. Whereas disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally meant 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large element of luck involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and to see regularity in event patterns that is illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using a simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. 
However
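    The 11% figure in the dice paradigm above can be checked both analytically and by simulation: the chance of no six in 12 independent monthly rolls is (5/6)¹² ≈ 0.112. A minimal sketch:

```python
import random

def p_no_event_analytic(n_rolls=12, p_event=1/6):
    """Probability that no event (a six) occurs in n independent rolls."""
    return (1 - p_event) ** n_rolls

def p_no_event_simulated(n_rolls=12, trials=50_000, seed=1):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    quiet_years = sum(
        all(rng.randint(1, 6) != 6 for _ in range(n_rolls))
        for _ in range(trials)
    )
    return quiet_years / trials

print(round(p_no_event_analytic(), 3))   # -> 0.112, i.e. about 11%
```

    The simulation converges on the analytic value, illustrating the paper's point: a hazard can be "live" every month and still leave an entirely event-free record for a year.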

  9. [Optimization of ethanol reflux extraction process of red ginseng using a design space approach based on hazard and operability analysis].

    Science.gov (United States)

    Zhao, Fang; Gong, Xing-Chu; Qu, Hai-Bin

    2017-03-01

    The quality-by-design principle was used as a guideline in this study to optimize the ethanol reflux extraction of red ginseng. Firstly, hazard and operability analysis (HAZOP) was used as a risk assessment tool to evaluate the hazard degree of the process parameters. Ethanol concentration, the ratio of alcohol to herbal material (A/M ratio), and extraction time were identified as the critical process parameters (CPPs) according to the HAZOP method. Secondly, a Box-Behnken experimental design was applied to establish the regression models between the CPPs and the process indices. Finally, the design space was calculated. The recommended operating space of the parameters is as follows: ethanol concentration of 90.3%-90.7%, A/M ratio of 2.5-3.1 mL·g⁻¹, and extraction time of 124-130 min. The study shows that the design space approach combined with risk assessment using HAZOP has the potential to reduce the risk of the red ginseng extraction process, which might ultimately improve process control. Copyright© by the Chinese Pharmaceutical Association.
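    The design-space calculation described above can be sketched as a grid scan: candidate CPP settings are kept when the fitted regression model predicts that the quality index meets its acceptance criterion. The quadratic coefficients and the acceptance limit below are entirely hypothetical, standing in for the study's fitted Box-Behnken models:

```python
import itertools

def predicted_yield(conc, ratio, time):
    """Hypothetical fitted quadratic model (illustrative coefficients only)."""
    return (-500 + 11.0 * conc - 0.06 * conc**2
            + 8.0 * ratio - 1.2 * ratio**2 + 0.05 * time)

# Scan a grid over plausible CPP ranges
concs  = [88 + 0.5 * i for i in range(9)]    # 88-92 % ethanol
ratios = [2.0 + 0.25 * i for i in range(9)]  # 2-4 mL/g A/M ratio
times  = [100 + 5 * i for i in range(9)]     # 100-140 min

# Design space: all grid points whose predicted index meets a hypothetical limit
design_space = [
    (c, r, t)
    for c, r, t in itertools.product(concs, ratios, times)
    if predicted_yield(c, r, t) >= 21.0
]
print(len(design_space), "acceptable parameter combinations")
```

    In practice every response model (yield, marker content, etc.) contributes its own constraint, and the design space is the intersection of all of them.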

  10. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Provisions § 123.6 Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard... fish or fishery product being processed in the absence of those controls. (b) The HACCP plan. Every...

  11. Application of hazard analysis and critical control points (HACCP) to the processing of compost used in the cultivation of button mushroom

    National Research Council Canada - National Science Library

    José Emilio Pardo; Diego Cunha Zied; Manuel Alvarez-Ortí; Jesús Ángel Peñaranda; Carmen Gómez-Cantó; Arturo Pardo-Giménez

    2017-01-01

    .... Methods In this paper, the Hazard Analysis and Critical Control Points system is applied to the processing line of compost used in the cultivation of mushrooms and other edible cultivated fungi...

  12. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-12

    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following hazardous scenarios received increased attention: • For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release. • For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of “critical controls” was identified for these scenarios (see Section 4) that prevent the occurrence or mitigate the effects of a release with significant consequences.

  13. Rockfall Hazard Process Assessment : Final Project Report

    Science.gov (United States)

    2017-10-01

    After a decade of using the Rockfall Hazard Rating System (RHRS), the Montana Department of Transportation (MDT) sought a reassessment of their rockfall hazard evaluation process. Their prior system was a slightly modified version of the RHRS and was...

  14. MGR External Events Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design; Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  15. Risk Assessment Analysis Using Process Hazard Analysis (PHA) and Safety Objective Analysis (SOA) at Central Gathering Stations (CGS) in Onshore Facilities

    Directory of Open Access Journals (Sweden)

    Dimas Jouhari

    2014-03-01

    Full Text Available Process safety has been a major factor discussed by the chemical industries in recent years. One semi-quantitative method that can be used to identify, analyze, and rank hazard risk levels is Process Hazard Analysis (PHA) combined with Safety Objective Analysis (SOA). Hazard and Operability Studies (HAZOP) and What-If Analysis are qualitative hazard identification methods that are often applied simultaneously for PHA-SOA. Process Hazard Analysis (PHA) is a sequence of activities that identifies hazards, estimates consequences, estimates the likelihood of a process scenario together with its safeguards, and produces a risk ranking, which can be read off a 6x6 PHA matrix. Safety Objective Analysis (SOA), in turn, is a sequence of activities that depends on the scenario causes and consequences from the PHA and yields the required IPLs (Independent Protective Layers) using a 6x6 SOA matrix. A risk ranking of 6 in the PHA assessment is categorized as safe if the existing safeguards are always ready to reduce the risk arising from that scenario. However, not every safeguard can always be ready to reduce that risk, so additional analysis is needed to ensure that the scenario risk can be reduced. Safety analysis of a scenario with SOA yields IPL requirements, which can be closed by confirming suitable safeguards as IPLs. The PHA-SOA assessment results for CGS 1, CGS 3, CGS 4, and CGS 5 show that severity and likelihood were rated differently at each CGS even though the processes at these CGSs are identical, so a consistency analysis is needed. The results of this consistency analysis can serve as a guideline for the safety reviews in future risk assessment workshops, which industry typically holds every three to five years.

  16. Hygienic-sanitary working practices and implementation of a Hazard Analysis and Critical Control Point (HACCP plan in lobster processing industries

    Directory of Open Access Journals (Sweden)

    Cristina Farias da Fonseca

    2013-03-01

    Full Text Available This study aimed to verify the hygienic-sanitary working practices and to create and implement a Hazard Analysis and Critical Control Point (HACCP) plan in two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for exportation. The application of the hygienic-sanitary checklist in the industries analyzed achieved conformity rates over 96% for the aspects evaluated. The HACCP plan resulted in the detection of two critical control points (CCPs), the receiving and classification steps, in the processing of frozen lobster and frozen lobster tails, and an additional CCP was detected during the cooking step of the processing of the whole frozen cooked lobster. The proper implementation of the HACCP plan in the lobster processing industries studied proved to be the safest and most cost-effective method to monitor the hazards at each CCP.

  17. 14 CFR 437.29 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  18. Multicriteria analysis in hazards assessment in Libya

    Science.gov (United States)

    Zeleňáková, Martina; Gargar, Ibrahim; Purcz, Pavol

    2012-11-01

    Environmental hazards (natural and man-made) have always constituted a problem in many developing and developed countries. Many applications have proved that these problems can be solved through planning studies and detailed information about the prone areas. Determining the time, location, and size of the problem is important for decision makers in planning and management activities. It is important to know the risk represented by those hazards and to take actions to protect against them. Multicriteria analysis methods - the Analytic hierarchy process, Pairwise comparison, and the Ranking method - are used to analyse which is the most dangerous hazard facing Libya. The multicriteria analysis ends with a more or less stable ranking of the given alternatives and hence a recommendation as to which alternative(s) should be preferred. Regarding our problem of environmental risk assessment, the result will be a ranking or categorisation of hazards with regard to their risk level.
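    The Analytic hierarchy process step can be sketched: given a reciprocal pairwise comparison matrix (judgments on Saaty's 1-9 scale), the priority vector is commonly approximated by normalized row geometric means. The comparison values for the three hazards below are hypothetical:

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority vector via row geometric means, normalized to sum 1."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical comparison of three hazards: flood vs drought vs sandstorm
pairwise = [
    [1.0, 3.0, 5.0],   # flood judged 3x as critical as drought, 5x as sandstorm
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_priorities(pairwise)
print([round(w, 3) for w in weights])
```

    The hazard with the largest weight heads the ranking; a full AHP study would also check the consistency ratio of the judgment matrix before accepting the result.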

  19. 21 CFR 120.7 - Hazard analysis.

    Science.gov (United States)

    2010-04-01

    ... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.7 Hazard... to occur and thus, constitutes a food hazard that must be addressed in the HACCP plan. A food hazard... intended consumer. (e) HACCP plans for juice need not address the food hazards associated with...

  20. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital US. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  1. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  2. The Integrated Hazard Analysis Integrator

    Science.gov (United States)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is a tendency to overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to the fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and

  3. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of hazard modeling distributions as they approach different distributions.

  4. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    Table 2. HAZOP Process; Table 3. HAZOP Guide Words for Software or System Interface Analysis; Table 4. Example System of Systems Architecture Table. ... Hazards and Operability (HAZOP) Analysis applies a systematic exploration of system

  5. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    Energy Technology Data Exchange (ETDEWEB)

    Tang, A; Samost, A [Massachusetts Institute of Technology, Cambridge, Massachusetts (United States); Viswanathan, A; Cormack, R; Damato, A [Dana-Farber Cancer Institute - Brigham and Women’s Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology are demonstrated. Methods: We analyzed the tandem-and-ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiation of treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: Ten controllers were identified and included in the final model. From these controllers, 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. 
These results suggest that STPA can be successfully used to analyze safety in
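    The STPA step of generating candidate unsafe control actions from the control model can be sketched mechanically: each control action is crossed with the four standard STPA guide phrases, and each resulting candidate is then screened by analysts. The controllers and actions below are hypothetical placeholders, not the study's actual model:

```python
from itertools import product

# The four standard STPA guide phrases for unsafe control actions
GUIDE_PHRASES = [
    "not provided when needed",
    "provided when unsafe",
    "provided too early, too late, or out of order",
    "stopped too soon or applied too long",
]

# Hypothetical control actions from a simplified brachytherapy control model
control_actions = [
    ("physicist", "approve treatment plan"),
    ("physician", "insert applicator"),
    ("operator", "initiate treatment delivery"),
]

# Candidate UCAs: every control action crossed with every guide phrase
uca_candidates = [
    f"{controller}: '{action}' {phrase}"
    for (controller, action), phrase in product(control_actions, GUIDE_PHRASES)
]
print(len(uca_candidates))  # -> 12 candidates to screen
```

    The cross product only enumerates candidates; deciding which are genuinely hazardous, and deriving scenarios and constraints from them, remains the analysts' judgment, as in the study.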

  6. 14 CFR 437.55 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  7. [Design of a Hazard Analysis and Critical Control Points (HACCP) plan to assure the safety of a bologna product produced by a meat processing plant].

    Science.gov (United States)

    Bou Rached, Lizet; Ascanio, Norelis; Hernández, Pilar

    2004-03-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a systematic, integral program used to identify and estimate the hazards (microbiological, chemical and physical) and the risks generated during the primary production, processing, storage, distribution, sale and consumption of foods. Establishing an HACCP program has several advantages, among them: it emphasizes prevention over detection, lowers costs, minimizes the risk of manufacturing faulty products, gives management greater confidence, and strengthens national and international competitiveness. The present work is a proposal based on the design of an HACCP program to guarantee the safety of the special-type bologna elaborated by a meat products industry, through the determination of hazards (microbiological, chemical or physical), the identification of critical control points (CCPs), the establishment of critical limits, the planning of corrective actions, and the establishment of documentation and verification procedures. The methodology used was based on the application of the seven basic principles laid down by the Codex Alimentarius, yielding the design of this program. Given that meat products have recently been linked to pathogens such as E. coli O157:H7 and Listeria monocytogenes, these were considered as microbiological hazards in establishing the HACCP plan, whose application will guarantee a safe product.

  8. Global Positioning System data collection, processing, and analysis conducted by the U.S. Geological Survey Earthquake Hazards Program

    Science.gov (United States)

    Murray, Jessica R.; Svarc, Jerry L.

    2017-01-01

    The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
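    The time-series modelling described above (a constant interseismic velocity plus coseismic offsets) reduces, for one component and a single earthquake, to linear least squares. A minimal sketch on synthetic data, using white noise rather than the temporally correlated noise model the authors apply; all rates and offsets are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily east-component positions: 10 mm/yr velocity, 5 mm coseismic step
t = np.arange(0.0, 6.0, 1.0 / 365.0)           # time in years
t_eq = 3.0                                      # earthquake time (years)
true_vel, true_step = 10.0, 5.0                 # mm/yr, mm
pos = true_vel * t + true_step * (t >= t_eq) + rng.normal(0.0, 1.0, t.size)

# Design matrix: intercept, constant velocity, Heaviside step at t_eq
G = np.column_stack([np.ones_like(t), t, (t >= t_eq).astype(float)])
m, *_ = np.linalg.lstsq(G, pos, rcond=None)

print(f"velocity ~ {m[1]:.2f} mm/yr, coseismic offset ~ {m[2]:.2f} mm")
```

    Real processing adds further columns (e.g., postseismic decay terms) and replaces the identity weighting with a covariance built from the estimated correlated-noise model.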

  9. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    Science.gov (United States)

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. Hazard Analysis Critical Control Point (HACCP), recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished-product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values and the greatest impact on the process are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with the standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
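    The RPN ranking described above can be sketched in a few lines of code: each failure mode is scored for severity, occurrence, and detectability, and their product prioritizes corrective actions. The failure-mode names, scores, and the 1-10 scales below are illustrative assumptions for this sketch, not data from the study.

```python
# Sketch of failure-mode ranking by Risk Priority Number (RPN), as used
# in FMECA-style risk analysis. Scores and failure modes are hypothetical.

def rpn(severity, occurrence, detectability):
    """RPN = severity * occurrence * detectability, each scored 1-10
    (higher = worse outcome, more frequent, harder to detect)."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 10:
            raise ValueError("each factor must be scored 1-10")
    return severity * occurrence * detectability

# Hypothetical failure modes: (name, severity, occurrence, detectability).
failure_modes = [
    ("loss of dose",               9, 4, 6),
    ("loss of tracking",           8, 3, 7),
    ("manual transcription error", 6, 6, 5),
]

# Rank from highest to lowest RPN to prioritize corrective actions.
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {rpn(s, o, d)}")
```

Hazards whose RPN exceeds a chosen acceptability threshold would then be targeted first, e.g. by training programs or continuous control systems as the study describes.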

  10. Advanced Manufacturing Processes Laboratory Building 878 hazards assessment document

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.; Thornton, W.; Swihart, A.; Gilman, T.

    1994-07-01

    The purpose of the hazards assessment process is to document the impact of the release of hazards at the Advanced Manufacturing Processes Laboratory (AMPL) that are significant enough to warrant consideration in Sandia National Laboratories' operational emergency management program. This hazards assessment is prepared in accordance with the Department of Energy Order 5500.3A requirement that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment provides an analysis of the potential airborne release of chemicals associated with the operations and processes at the AMPL. This research and development laboratory develops advanced manufacturing technologies, practices, and unique equipment and provides the fabrication of prototype hardware to meet the needs of Sandia National Laboratories, Albuquerque, New Mexico (SNL/NM). The focus of the hazards assessment is the airborne release of materials because this requires the most rapid, coordinated emergency response on the part of the AMPL, SNL/NM, collocated facilities, and the surrounding jurisdiction to protect workers, the public, and the environment.

  11. Submarine landslides: processes, triggers and hazard prediction.

    Science.gov (United States)

    Masson, D G; Harbitz, C B; Wynn, R B; Pedersen, G; Løvholt, F

    2006-08-15

    Huge landslides, mobilizing hundreds to thousands of km³ of sediment and rock, are ubiquitous in submarine settings ranging from the steepest volcanic island slopes to the gentlest muddy slopes of submarine deltas. Here, we summarize current knowledge of such landslides and the problems of assessing their hazard potential. The major hazards related to submarine landslides include destruction of seabed infrastructure, collapse of coastal areas into the sea, and landslide-generated tsunamis. Most submarine slopes are inherently stable. Elevated pore pressures (leading to decreased frictional resistance to sliding) and specific weak layers within stratified sequences appear to be the key factors influencing landslide occurrence. Elevated pore pressures can result from normal depositional processes or from transient processes such as earthquake shaking; historical evidence suggests that the majority of large submarine landslides are triggered by earthquakes. Because of their tsunamigenic potential, ocean-island flank collapses and rockslides in fjords have been identified as the most dangerous of all landslide-related hazards. Published models of ocean-island landslides mainly examine 'worst-case scenarios' that have a low probability of occurrence. Areas prone to submarine landsliding are relatively easy to identify, but we are still some way from being able to forecast individual events with precision. Monitoring of critical areas where landslides might be imminent, and modelling landslide consequences so that appropriate mitigation strategies can be developed, would appear to be areas where advances on current practice are possible.

  12. Microbiological hazard analysis of ready-to-eat meats processed at a food plant in Trinidad, West Indies

    Directory of Open Access Journals (Sweden)

    Stacey-Marie Syne

    2013-07-01

    Background: A bacteriological assessment of the environment and food products at different stages of processing was conducted during the manufacture of ready-to-eat (RTE) chicken franks, chicken bologna and bacon at a large meat processing plant in Trinidad, West Indies. Methods: Samples of air, surfaces (swabs), raw materials, and in-process and finished food products were collected during two separate visits for each product type and subjected to qualitative or quantitative analysis for bacterial zoonotic pathogens and fecal indicator organisms. Results: Staphylococcus aureus was the most common pathogen detected in pre-cooked products (mean counts = 0.66, 1.98, and 1.95 log10 CFU/g for franks, bologna, and bacon, respectively). This pathogen was also found at unacceptable levels in 4 (16.7%) of 24 post-cooked samples. Fifty percent (10 of 20) of pre-cooked mixtures of bacon and bologna were contaminated with Listeria spp., including four with L. monocytogenes. Pre-cooked mixtures of franks and bologna also contained E. coli (35 and 0.72 log10 CFU/g, respectively), while 5 (12.5%) of 40 pre-cooked mixtures of chicken franks had Salmonella spp. Aerobic bacteria exceeded acceptable international standards in 46 (82.1%) of 56 pre-cooked and 6 (16.7%) of 36 post-cooked samples. Both pre- and post-cooking air and surfaces had relatively high levels of aerobic bacteria, Staphylococcus aureus and coliforms, including equipment and gloves of employees. A drastic decrease in aerobic counts and Staphylococcus aureus levels following heat treatment, and a subsequent increase in counts of these bacteria, are suggestive of post-cooking contamination. Conclusion: A relatively high level of risk exists for microbial contamination of RTE meats at the food plant investigated, and there is a need to enhance the quality assurance programs to ensure the safety of consumers of products manufactured at this plant.

  13. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-15

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  14. Hazard and operability (HAZOP) analysis. A literature review.

    Science.gov (United States)

    Dunjó, Jordi; Fthenakis, Vasilis; Vílchez, Juan A; Arnaldos, Josep

    2010-01-15

    Hazard and operability (HAZOP) methodology is a Process Hazard Analysis (PHA) technique used worldwide for studying not only the hazards of a system, but also its operability problems, by exploring the effects of any deviations from design conditions. Our paper is the first HAZOP review intended to gather HAZOP-related literature from books, guidelines, standards, major journals, and conference proceedings, with the purpose of classifying the research conducted over the years and defining the HAZOP state of the art.

  15. Canister storage building hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  16. Cold Vacuum Drying Facility hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  17. A public health hazard mitigation planning process.

    Science.gov (United States)

    Griffith, Jennifer M; Kay Carpender, S; Crouch, Jill Artzberger; Quiram, Barbara J

    2014-01-01

    The Texas A&M Health Science Center School of Rural Public Health, a member of the Training and Education Collaborative System Preparedness and Emergency Response Learning Center (TECS-PERLC), has long-standing partnerships with 2 Health Service Regions (Regions) in Texas. TECS-PERLC was contracted by these Regions to address 2 challenges identified in meeting requirements outlined by the Risk-Based Funding Project. First, within Metropolitan Statistical Areas, there is not a formal authoritative structure. Second, preexisting tools and processes did not adequately satisfy requirements to assess public health, medical, and mental health needs and link mitigation strategies to the Public Health Preparedness Capabilities, which provide guidance to prepare for, respond to, and recover from public health incidents. TECS-PERLC, with its partners, developed a framework to interpret and apply results from the Texas Public Health Risk Assessment Tool (TxPHRAT). The 3-phase, community engagement-based TxPHRAT Mitigation Planning Process (Mitigation Planning Process) and associated tools facilitated the development of mitigation plans. Tools included (1) profiles interpreting TxPHRAT results and identifying, ranking, and prioritizing hazards and capability gaps; (2) a catalog of intervention strategies and activities linked to hazards and capabilities; and (3) a template to plan, evaluate, and report mitigation planning efforts. The Mitigation Planning Process provided a framework for Regions to successfully address all funding requirements. TECS-PERLC developed more than 60 profiles, cataloged and linked 195 intervention strategies, and developed a template resulting in 20 submitted mitigation plans. A public health-focused, community engagement-based mitigation planning process was developed by TECS-PERLC and successfully implemented by the Regions. The outcomes met all requirements and reinforce the effectiveness of academic-practice partnerships and the importance of community engagement.

  18. Decision analysis for INEL hazardous waste storage

    Energy Technology Data Exchange (ETDEWEB)

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft² of isolated HW storage area. Negotiations with the State to discuss aisle space requirements, along with modifications to WWSB operating procedures, are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  19. 49 CFR 659.31 - Hazard management process.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Hazard management process. 659.31 Section 659.31... Agency § 659.31 Hazard management process. (a) The oversight agency must require the rail transit agency..., operational changes, or other changes within the rail transit environment. (b) The hazard management process...

  20. Idaho Chemical Processing Plant safety document ICPP hazardous chemical evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Harwood, B.J.

    1993-01-01

    This report presents the results of a hazardous chemical evaluation performed for the Idaho Chemical Processing Plant (ICPP). ICPP tracks chemicals on a computerized database, Haz Track, that contains roughly 2000 individual chemicals. The database contains information about each chemical, such as its form (solid, liquid, or gas); quantity, either in weight or volume; and its location. The Haz Track database was used as the primary starting point for the chemical evaluation presented in this report. The chemical data and results presented here are not intended to provide limits, but to provide a starting point for nonradiological hazards analysis.

  1. Safety analysis and hazard control during food processing and storage in the BIO-Plex Interconnecting Transfer Tunnel.

    Science.gov (United States)

    Hentges, D L

    2000-01-01

    The food system, being designed for the BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex), will be a plant-based diet that requires most of the food to be grown, processed, and prepared in the BIO-Plex. Conversion of crops to edible foods will require extensive food processing within the closed environment of this habitat. Because all consumables in the BIO-Plex will be recycled and reused, food safety is a primary concern. Multifunctional equipment necessary for food processing of the baseline crops (wheat, soybeans, rice, peanuts, dried beans, potatoes, sweet potatoes, lettuce, chard, tomatoes, green onions, carrots, and radishes) was identified. Recommendations for placement of the food processing equipment in the Interconnecting Transfer Tunnel (ITT) of the BIO-Plex were made to facilitate the processing flow diagrams, increase work efficiency, and prevent cross-contamination of pathogens and antinutrients. Sanitation equipment and procedures necessary during food processing in the ITT are described.

  2. Rockfall Hazard Process Assessment : [Project Summary

    Science.gov (United States)

    2017-10-01

    The Montana Department of Transportation (MDT) implemented its Rockfall Hazard Rating System (RHRS) between 2003 and 2005, obtaining information on the state's rock slopes and their associated hazards. The RHRS data facilitated decision-making in an ...

  3. Risk analysis based on hazards interactions

    Science.gov (United States)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  4. In silico analysis sheds light on the structural basis underlying the ribotoxicity of trichothecenes-A tool for supporting the hazard identification process.

    Science.gov (United States)

    Dellafiora, Luca; Galaverna, Gianni; Dall'Asta, Chiara

    2017-03-15

    Deoxynivalenol is a foodborne mycotoxin belonging to the trichothecenes family that may cause severe injuries in humans and animals. The inhibition of protein synthesis via interaction with the ribosome has been identified as a crucial mechanism underlying its toxic action. However, it is still not fully understood how and to what extent compounds belonging to the trichothecenes family affect human and animal health. In turn, this scenario causes delays in managing the related health risk. Aimed at supporting the hazard identification process, in silico analysis may be a straightforward tool to investigate the structure-activity relationship of trichothecenes, identifying molecules of possible concern to carry forward in the risk assessment process. In this framework, this work used a molecular modeling approach to investigate the structural basis underlying the interaction with the ribosome from a structure-activity relationship perspective. To identify further forms possibly involved in the total trichothecenes-dependent ribotoxic load, the model was challenged with a set of 16 trichothecene modified forms found in plants, fungi and animals, including compounds never tested before for the capability to bind and inhibit the ribosome. Among them, only regiospecific glycosylation at position 3 of the sesquiterpenoid scaffold (i.e. T-2 toxin-3-glucuronide, the α and β isomers of T-2 toxin-3-glucoside, and deoxynivalenol-3-glucuronide) was found to impair the interaction with the ribosome, while the other compounds tested (i.e. neosolaniol, nivalenol, fusarenon-X, diacetoxyscirpenol, NT-1 toxin, HT-2 toxin, 19- and 20-hydroxy-T-2 toxin, T-2 toxin triol and tetraol, and 15-deacetyl-T-2 toxin) were found potentially able to inhibit the ribosome. Accordingly, they should be included with high priority in further risk assessment studies in order to better characterize the trichothecenes-related hazard. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Food safety and nutritional quality for the prevention of non communicable diseases: the Nutrient, hazard Analysis and Critical Control Point process (NACCP).

    Science.gov (United States)

    Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino

    2015-04-23

    The important role of food and nutrition in public health is being increasingly recognized as crucial for its potential impact on health-related quality of life and the economy, both at the societal and individual levels. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with Directive 43/93/CEE, later replaced by Regulation CE 178/2002 and Regulation CE 852/2004, is the internationally agreed approach to food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP is based on four general principles: i) guarantee of health maintenance; ii) evaluation and assurance of the nutritional quality of food and TQM; iii) correct information to consumers; iv) assurance of an ethical profit. There are three stages for the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate levels of nutrients; 4) establishment and implementation of effective monitoring procedures for critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health Claims Regulation EU 1924/2006; 10) starting a training program. We calculate the risk assessment as follows

  6. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    Science.gov (United States)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
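    The empirical hazard curve described above can be sketched as follows: from a set of simulated events, estimate the mean annual rate at which an intensity measure (here, inundation depth) exceeds a threshold. The event depths and the annual event rate below are invented for illustration; they are not results from the paper.

```python
# Minimal sketch of an empirical tsunami hazard curve: mean annual rate
# of exceedance of an inundation-depth threshold, from simulated events.

# Hypothetical simulated maximum inundation depths (m) at one coastal
# site, one per stochastic event.
depths = [0.2, 0.4, 0.4, 0.7, 1.1, 1.5, 2.3, 3.0, 4.8, 6.2]
events_per_year = 0.05  # assumed mean rate of tsunamigenic earthquakes

def exceedance_rate(threshold, samples, rate):
    """Mean annual rate of events whose intensity exceeds `threshold`."""
    frac = sum(1 for d in samples if d > threshold) / len(samples)
    return rate * frac

# Evaluate the hazard curve at a few depth thresholds.
for h in (0.5, 1.0, 2.0, 5.0):
    print(f"depth > {h} m: {exceedance_rate(h, depths, events_per_year):.4f} /yr")
```

The robust (Bayesian-fitted) curve in the paper would replace this raw empirical estimate with a fitted parametric form, allowing fewer simulations and yielding a confidence interval around the central estimate.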

  7. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  8. Microbiological performance of Hazard Analysis Critical Control Point (HACCP)-based food safety management systems: A case of Nile perch processing company

    NARCIS (Netherlands)

    Kussaga, J.B.; Luning, P.A.; Tiisekwa, B.P.M.; Jacxsens, L.

    2017-01-01

    This study aimed at giving insight into the microbiological safety output of a Hazard Analysis Critical Control Point (HACCP)-based Food Safety Management System (FSMS) of a Nile perch exporting company by using a combined assessment.

  9. Hazard screening application guide. Safety Analysis Report Update Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates all identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.
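    The screening step described above can be illustrated with a minimal sketch: processes are binned by the magnitude of a bounding (unmitigated) hazard inventory to decide the extent of follow-on analysis. The category names, quantities, and thresholds below are hypothetical and not taken from the guide.

```python
# Sketch of a hazard screening step: bin facilities/processes by the
# magnitude of an identified hazard to decide follow-on analysis depth.
# Thresholds and inventory data are invented for illustration.

def screen(quantity_kg, threshold_low=10.0, threshold_high=100.0):
    """Return a screening category from a bounding hazard inventory,
    with no credit taken for mitigation."""
    if quantity_kg >= threshold_high:
        return "full safety analysis"
    if quantity_kg >= threshold_low:
        return "limited follow-on analysis"
    return "screened out"

# Hypothetical bounding inventories of a hazardous material, in kg.
inventory = {
    "solvent storage":   250.0,
    "plating bath":       40.0,
    "lab-scale etching":   2.5,
}

for process, qty in inventory.items():
    print(f"{process}: {screen(qty)}")
```

In practice the screening criteria would be material-specific (e.g. tied to exposure limits for on-site and off-site receptors) rather than a single mass threshold.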

  10. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Appendices; Volume 2

    Science.gov (United States)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g. missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  11. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Volume 1; Appendices

    Science.gov (United States)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  12. Rockfall Hazard Process Assessment : Implementation Report

    Science.gov (United States)

    2017-10-01

    The Montana Department of Transportation (MDT) commissioned a new research program to improve assessment and management of its rock slope assets. The Department implemented a Rockfall Hazard Rating System (RHRS) program in 2005 and wished to add valu...

  13. Potential biological hazard of importance for HACCP plans in fresh fish processing

    Directory of Open Access Journals (Sweden)

    Baltić Milan Ž.

    2009-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is scientifically based and focused on problem prevention in order to assure that the food products produced are safe to consume. Prerequisite programs such as GMP (Good Manufacturing Practices) and GHP (Good Hygienic Practices) are an essential foundation for the development and implementation of successful HACCP plans. One of the preliminary tasks in the development of a HACCP plan is to conduct a hazard analysis. The process of conducting a hazard analysis involves two stages. The first is hazard identification, and the second is the HACCP team decision on which potential hazards must be addressed in the HACCP plan. By definition, the HACCP concept covers all types of potential food safety hazards: biological, chemical, and physical, whether they are naturally occurring in the food, contributed by the environment, or generated by a mistake in the manufacturing process. In raw fish processing, potentially significant biological hazards which are reasonably likely to cause illness in humans are parasites (Trematoda, Nematoda, Cestoda), bacteria (Salmonella, E. coli, Vibrio parahaemolyticus, Vibrio vulnificus, Listeria monocytogenes, Clostridium botulinum, Staphylococcus aureus), viruses (Norwalk virus, enteroviruses, hepatitis A virus, rotavirus), and biotoxins. Upon completion of the hazard analysis, any measure(s) used to control the hazard(s) should be described.

  14. Repository Subsurface Preliminary Fire Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2001-07-30

This fire hazard analysis identifies preliminary design and operations features and fire and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during the development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c), which was prepared in accordance with AP-2.21Q, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities''; Attachment 4 of AP-ESH-008, ''Hazards Analysis System''; and AP-3.11Q, ''Technical Reports''. The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  15. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia lie on active continental plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazards accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in its use as one of the methods for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  16. Hazardous Waste Site Analysis (Small Site Technology)

    Science.gov (United States)

    1990-08-01

information. "RCRA required all treaters, storers, and/or disposers to either have permits by November 1980, or qualify for interim status, by notifying... carbon dioxide or compressed liquid-state propane) is used as a solvent to extract organic hazardous constituents from waste. Additional processing

  17. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard...) Physical hazards. (b) The HACCP plan. (1) Every establishment shall develop and implement a written HACCP...

  18. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, W.S.

    1994-08-23

This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Upon further analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision incorporates the recommendations provided from the original hazards analysis as well. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for the safety classification of thermal stabilization equipment.

  19. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    Science.gov (United States)

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) epidemics of infectious diseases, 2) drought/famine, and 3) conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) conflicts, 2) epidemics, 3) drought/famine and 4) environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.
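The simple ranking method mentioned above can be illustrated with a toy score table: each hazard receives expert scores for likelihood and potential impact, and hazards are sorted by the combined score. The scores below are invented for illustration and are not the study's data.

```python
# Toy expert scores on a 1-5 scale (invented, not the Uganda study's data).
hazards = {
    "Epidemics":                 {"likelihood": 5, "impact": 5},
    "Drought/famine":            {"likelihood": 4, "impact": 5},
    "Conflict":                  {"likelihood": 4, "impact": 4},
    "Environmental degradation": {"likelihood": 4, "impact": 3},
}

# Rank by combined likelihood + impact score, highest first.
ranked = sorted(hazards,
                key=lambda h: hazards[h]["likelihood"] + hazards[h]["impact"],
                reverse=True)
print(ranked)
# -> ['Epidemics', 'Drought/famine', 'Conflict', 'Environmental degradation']
```

Real hazard-ranking exercises typically also score vulnerability and capacity separately, as the study did, rather than collapsing everything into one sum.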

  20. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-08-01

Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to yield many of the advantages previously identified for microbiological HACCP procedures: they are more effective, efficient and economical than conventional end-point testing methods. However, the high costs of analytical monitoring of chemical contaminants, and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods, are likely to prevent chemical HACCP from becoming as effective as microbiological HACCP.

  1. Criteria and Processes for the Certification of Non-Radioactive Hazardous and Non-Hazardous Wastes

    Energy Technology Data Exchange (ETDEWEB)

    Dominick, J

    2008-12-18

    This document details Lawrence Livermore National Laboratory's (LLNL) criteria and processes for determining if potentially volumetrically contaminated or potentially surface contaminated wastes are to be managed as material containing residual radioactivity or as non-radioactive. This document updates and replaces UCRL-AR-109662, Criteria and Procedures for the Certification of Nonradioactive Hazardous Waste (Reference 1), also known as 'The Moratorium', and follows the guidance found in the U.S. Department of Energy (DOE) document, Performance Objective for Certification of Non-Radioactive Hazardous Waste (Reference 2). The 1992 Moratorium document (UCRL-AR-109662) is three volumes and 703 pages. The first volume provides an overview of the certification process and lists the key radioanalytical methods and their associated Limits of Sensitivities. Volumes Two and Three contain supporting documents and include over 30 operating procedures, QA plans, training documents and organizational charts that describe the hazardous and radioactive waste management system in place in 1992. This current document is intended to update the previous Moratorium documents and to serve as the top-tier LLNL institutional Moratorium document. The 1992 Moratorium document was restricted to certification of Resource Conservation and Recovery Act (RCRA), State and Toxic Substances Control Act (TSCA) hazardous waste from Radioactive Material Management Areas (RMMA). This still remains the primary focus of the Moratorium; however, this document increases the scope to allow use of this methodology to certify other LLNL wastes and materials destined for off-site disposal, transfer, and re-use including non-hazardous wastes and wastes generated outside of RMMAs with the potential for DOE added radioactivity. The LLNL organization that authorizes off-site transfer/disposal of a material or waste stream is responsible for implementing the requirements of this document. The LLNL

  2. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

A stochastic-event probabilistic seismic hazard model, which can be further used for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. The distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events, considering epistemic uncertainty in ground-motion modeling by using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen's seismic hazard are the events from the West Arabian Shield seismic zone.
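The "10% probability of exceedance in 50 years" quoted above corresponds to the stated 475-year return period under the usual Poisson assumption. A quick check of the arithmetic:

```python
import math

# Under a Poisson occurrence model:
#   P(exceedance in t years) = 1 - exp(-t / T)
# so the return period is
#   T = -t / ln(1 - P)

def return_period(p_exceed: float, t_years: float) -> float:
    """Return period (years) for exceedance probability p_exceed in t_years."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))  # -> 475 (the 10%-in-50-years map)
print(round(return_period(0.50, 50)))  # -> 72  (the 50%-in-50-years map)
```

The same conversion underlies the familiar 2,475-year return period for the 2%-in-50-years hazard level used in many building codes.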

  3. Fire hazard analysis for fusion energy experiments

    Energy Technology Data Exchange (ETDEWEB)

    Alvares, N.J.; Hasegawa, H.K.

    1979-01-01

    The 2XIIB mirror fusion facility at Lawrence Livermore Laboratory (LLL) was used to evaluate the fire safety of state-of-the-art fusion energy experiments. The primary objective of this evaluation was to ensure the parallel development of fire safety and fusion energy technology. Through fault-tree analysis, we obtained a detailed engineering description of the 2XIIB fire protection system. This information helped us establish an optimum level of fire protection for experimental fusion energy facilities as well as evaluate the level of protection provided by various systems. Concurrently, we analyzed the fire hazard inherent to the facility using techniques that relate the probability of ignition to the flame spread and heat-release potential of construction materials, electrical and thermal insulations, and dielectric fluids. A comparison of the results of both analyses revealed that the existing fire protection system should be modified to accommodate the range of fire hazards inherent to the 2XIIB facility.
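The fault-tree technique mentioned above can be sketched in miniature; the gates and basic-event probabilities below are invented for illustration and are not the actual 2XIIB fire protection tree. Assuming independent basic events, probabilities combine through AND gates (all inputs must fail) and OR gates (any input may fail):

```python
def and_gate(*probs):
    """All inputs must occur: product of probabilities (independence assumed)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """Any input may occur: complement of none occurring (independence assumed)."""
    none = 1.0
    for p in probs:
        none *= (1.0 - p)
    return 1.0 - none

# Toy top event: a damaging fire requires an ignition source AND
# (suppression failure OR detection failure).
p_ignition, p_suppression_fails, p_detection_fails = 0.01, 0.05, 0.02
p_top = and_gate(p_ignition, or_gate(p_suppression_fails, p_detection_fails))
print(round(p_top, 6))  # -> 0.00069
```

Real fault-tree tools also handle repeated events via minimal cut sets, which the naive gate composition above cannot do.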

  4. Historical analysis of US pipeline accidents triggered by natural hazards

    Science.gov (United States)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  5. Tree-ring analysis in natural hazards research - an overview

    Science.gov (United States)

    Stoffel, M.; Bollschweiler, M.

    2008-03-01

The understanding of geomorphic processes and knowledge of past events are important tasks for the assessment of natural hazards. Tree rings have on varied occasions proved to be a reliable tool for the acquisition of data on past events. In this review paper, we provide an overview of the use of tree rings in natural hazards research, starting with a description of the different types of disturbance by geomorphic processes and the resulting growth reactions. Thereafter, a summary is presented of the different methods commonly used for the analysis and interpretation of reactions in affected trees. We illustrate selected results from dendrogeomorphological investigations of geomorphic processes, with an emphasis on fluvial (e.g., flooding, debris flows) and mass-movement processes (e.g., landslides, snow avalanches), for which abundant data have been generated over the past few decades. We also present results from rockfall and permafrost studies, where data are much scarcer, albeit data from tree-ring studies have proved to be of great value in these fields as well. Most studies using tree rings have focused on alpine environments in Europe and North America, whereas other parts of the world have been widely neglected by dendrogeomorphologists so far. We therefore challenge researchers to focus on other regions with distinct climates, to examine less frequently studied processes, and to broaden and improve the approaches and methods commonly used in tree-ring research, so as to allow a better understanding of geomorphic processes, natural hazards and risk.

  6. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    Science.gov (United States)

    Klügel, J.

    2005-12-01

Against the background of the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of Swiss nuclear power plants - the PEGASOS project (2000-2004) - challenges to seismic hazard analysis of critical infrastructures are discussed from the perspective of a professional safety analyst. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessments) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies are, to a large extent, driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the USA (SSHAC, 1997) for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project showed an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * the ambiguous solution of PSHA logic trees; * the inadequate mathematical treatment of the results of expert elicitations, based on the assumption of bias-free expert estimates; * the problems associated with the "think model" of the separation of epistemic and aleatory uncertainties; * the consequences of the ergodic assumption used to justify the transfer of attenuation equations from other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants, as required by U.S. NRC RG 1.165, will be evaluated. As a principal alternative for the development of a

  7. Processing LiDAR Data to Predict Natural Hazards

    Science.gov (United States)

    Fairweather, Ian; Crabtree, Robert; Hager, Stacey

    2008-01-01

    ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.

  8. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  9. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    Science.gov (United States)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  10. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    Science.gov (United States)

    Woo, Gordon

    2017-04-01

For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones within which seismicity is uniform. Building on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated and where there is a dynamic stress interaction between events, so that the events cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the subsequent months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
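A minimal sketch of the kernel-smoothing idea described above, assuming a fixed-bandwidth Gaussian kernel (the author's method uses more refined kernels and bandwidth choices not reproduced here): each epicentre contributes a smooth bump of seismicity, and the rate at a site is the sum of those contributions, with no polygonal source zones required.

```python
import math

def smoothed_rate(site, epicentres, bandwidth_km):
    """Sum of 2-D Gaussian kernel contributions from each epicentre at `site`."""
    x0, y0 = site
    norm = 2.0 * math.pi * bandwidth_km ** 2   # kernel normalisation
    total = 0.0
    for (x, y) in epicentres:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        total += math.exp(-d2 / (2.0 * bandwidth_km ** 2)) / norm
    return total

# Invented epicentre coordinates in km (illustration only).
epicentres = [(0.0, 0.0), (10.0, 0.0), (12.0, 3.0)]
near = smoothed_rate((11.0, 1.0), epicentres, bandwidth_km=5.0)
far = smoothed_rate((60.0, 60.0), epicentres, bandwidth_km=5.0)
print(near > far)  # -> True: a site near the cluster sees a higher smoothed rate
```

In practice the kernel would be weighted by magnitude-dependent activity rates, and the bandwidth is a key modelling choice rather than a fixed constant.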

  11. Guidance Index for Shallow Landslide Hazard Analysis

    Directory of Open Access Journals (Sweden)

    Cheila Avalon Cullen

    2016-10-01

Rainfall-induced shallow landslides are one of the most frequent hazards on sloping terrain. Intense storms with high-intensity, long-duration rainfall have a high potential to trigger rapidly moving soil masses due to changes in pore water pressure and seepage forces. Nevertheless, regardless of the intensity and/or duration of the rainfall, shallow landslides are influenced by antecedent soil moisture conditions. To date, no system exists that dynamically interrelates these two factors on large scales. This work introduces a Shallow Landslide Index (SLI) as the first implementation of antecedent soil moisture conditions for the hazard analysis of shallow rainfall-induced landslides. The proposed mathematical algorithm is built using a logistic regression method that systematically learns from a comprehensive landslide inventory. Initially, root-soil moisture and rainfall measurements modeled from AMSR-E and TRMM, respectively, are used as proxies to develop the index. The input dataset is randomly divided into training and verification sets using the hold-out method. Validation results indicate that the best-fit model predicts the highest number of cases correctly, at 93.2% accuracy. Subsequently, as AMSR-E and TRMM stopped working in October 2011 and April 2015, respectively, root-soil moisture and rainfall measurements modeled by SMAP and GPM are used to develop models that calculate the SLI for 10, 7, and 3 days. The resulting models indicate a strong relationship (78.7%, 79.6%, and 76.8%, respectively) between the predictors and the predicted value. The results also highlight important remaining challenges, such as adequate information for algorithm functionality and satellite-based data reliability. Nevertheless, the experimental system can potentially be used as a dynamic indicator of the total amount of antecedent moisture and rainfall (for a given duration of time) needed to trigger a shallow landslide in a susceptible area. It is
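The logistic-regression construction of the SLI can be sketched as follows; the coefficients below are invented placeholders, whereas the paper fits its own coefficients to a landslide inventory and validates them on a hold-out set.

```python
import math

def shallow_landslide_index(soil_moisture, rainfall_mm,
                            b0=-6.0, b_sm=5.0, b_rain=0.04):
    """Logistic map from antecedent soil moisture (0-1) and accumulated
    rainfall (mm) to a 0-1 index. Coefficients are illustrative only."""
    z = b0 + b_sm * soil_moisture + b_rain * rainfall_mm
    return 1.0 / (1.0 + math.exp(-z))

# Dry antecedent conditions with light rain vs. wet conditions with a storm.
dry = shallow_landslide_index(soil_moisture=0.15, rainfall_mm=10)
wet = shallow_landslide_index(soil_moisture=0.45, rainfall_mm=120)
print(dry < 0.5 < wet)  # -> True: wetter antecedent conditions push the index up
```

Fitting real coefficients would follow the paper's recipe: split the inventory into training and verification sets (hold-out), maximise the logistic likelihood on the training set, and report accuracy on the held-out cases.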

  12. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-03-01

    When training for a job in which human error has the potential of producing catastrophic results, an understanding of the hazards that may be encountered is of paramount importance. In high consequence activities, it is important that the training program be conducted in a safe environment and yet emphasize the potential hazards. Because of the high consequence of a human error the use of a high-fidelity simulation is of great importance to provide the safe environment the worker needs to learn and hone required skills. A hazards analysis identifies the operation hazards, potential human error, and associated positive measures that aid in the mitigation or prevention of the hazard. The information gained from the hazards analysis should be used in the development of training. This paper will discuss the integration of information from the hazards analysis into the development of simulation components of a training program.

  13. Fire hazards analysis of transuranic waste storage and assay facility

    Energy Technology Data Exchange (ETDEWEB)

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  14. Preliminary Tsunami Hazard Analysis for Uljin NPP Site using Tsunami Propagation Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

Rhee, Hyunme; Kim, Minkyu; Choi, Inkil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Donghoon [Chonnam National Univ., Gwangju (Korea, Republic of)]

    2014-05-15

Tsunami hazard analysis is based on seismic hazard analysis methods. Seismic hazard analysis has been performed using either deterministic or probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the probabilistic approach can better account for the uncertainties of hazard analysis. Therefore, studies on probabilistic tsunami hazard analysis (PTHA) were performed in this work. This study focused on the wave propagation analysis, which is the element that differs most between seismic hazard analysis and tsunami hazard analysis.

  15. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-12-01

A hazards analysis identifies the operation hazards and the positive measures that aid in the mitigation or prevention of the hazard. If the tasks are human-intensive, the hazard analysis often credits the personnel training as contributing to the mitigation of an accident's consequences or the prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase (analysis, design, development, implementation, and evaluation) model for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground, and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and of the institutional training program, can provide valuable information to the hazards analysis effort. This paper will discuss the integration of the information from the hazards analysis into a training program. The paper will use the installation and removal of a piece of tooling that is used in a high-explosive operation. This example will be used to follow the systematic development of a training program and demonstrate the interaction and cooperation between the hazards analysis and training program.

  16. Defaultable Game Options in a Hazard Process Model

    Directory of Open Access Journals (Sweden)

    Tomasz R. Bielecki

    2009-01-01

    The valuation and hedging of defaultable game options is studied in a hazard process model of credit risk. A convenient pricing formula with respect to a reference filtration is derived. A connection between arbitrage prices and a suitable notion of hedging is obtained. The main result shows that the arbitrage prices are the minimal superhedging prices with sigma-martingale cost under a risk-neutral measure.

  17. Fire Hazard Analysis for Turbine Building of NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Seung Jun [KMENT, Seoul (Korea, Republic of); Park, Jun Hyun [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    In order to demonstrate the fire safety of operating nuclear power plants, a plant-specific fire hazard analysis should be performed, and the effect of design changes on fire safety should be reviewed periodically. At the fire-vulnerability estimation stage, the factors that influence fire vulnerability are investigated, including ignition sources, combustibles, fire barriers, and fire protection features such as detection, alarm, suppression, and evacuation. At the fire hazard assessment stage, ignition and propagation hazards, passive and active fire protection features, and the fire protection program, such as the pre-fire plan and related procedures, are investigated. Based on the results of the fire hazard analysis, a reasonable improvement plan for fire protection can be established. This paper describes the results of a fire hazard analysis, classified by fire area, for the turbine building, where fire hazards and fire frequencies are relatively high in an operating nuclear power plant.

  18. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the many tools that support the hazard analysis process are: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  19. 14 CFR 417.227 - Toxic release hazard analysis.

    Science.gov (United States)

    2010-01-01

    ... members of the public on land and on any waterborne vessels, populated offshore structures, and aircraft... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  20. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2010-07-14

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA's regulations in part 120 (21 CFR part 120) mandate the application of HACCP procedures to fruit and vegetable juice processing. HACCP is a preventative system of hazard control that can...

  1. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2013-11-20

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA regulations in part 120 (21 CFR part 120) mandate the application of HACCP principles to the processing of fruit and vegetable juices. HACCP is a preventive system of hazard control...

  2. Safety analysis of contained low-hazard biotechnology applications.

    Science.gov (United States)

    Pettauer, D; Käppeli, O; van den Eede, G

    1998-06-01

    A technical safety analysis has been performed on a containment-level-2 pilot plant in order to assess an upgrading of the existing facility, which should comply with good manufacturing practices. The results were obtained by employing the hazard and operability (HAZOP) assessment method and are discussed in the light of the appropriateness of this procedural tool for low-hazard biotechnology applications. The potential release of micro-organisms accounts only for a minor part of the hazardous consequences. However, in certain cases the release of a large or moderate amount of micro-organisms would not be immediately identified. Most of the actions required to avoid these consequences fall into the realm of operational procedures. As a major part of potential failures result from human errors, standard operating procedures play a prominent role when establishing the concept of safety management. The HAZOP assessment method was found to be adequate for the type of process under investigation. The results also may be used for the generation of checklists which, in most cases, are sufficient for routine safety assurance.

  3. Natural hazard modeling and uncertainty analysis [Chapter 2

    Science.gov (United States)

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  4. User’s Guide - Seismic Hazard Analysis

    Science.gov (United States)

    1993-02-01

    Earthquake Magnitude Cutoff 8.5 (example: 8.8); Enter Site Longitude (Degrees) 117 (example: 115.0); Enter Site Latitude (Degrees) 38 (example: 38.5); Any Changes? Y/N ... the art for assessing earthquake hazards in the United States; catalogue of strong-motion earthquake records, Waterways Experiment Station, Vicksburg

  5. Seismic Hazard analysis of Adjaria Region in Georgia

    Science.gov (United States)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The calculation of probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes distant sources from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of magnitude-frequency distributions (MFDs): a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
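    The classical PSHA recipe sketched in this abstract (Poisson temporal occurrence plus a truncated exponential Gutenberg-Richter magnitude distribution) can be illustrated numerically. The following is a toy single-source sketch, not CRISIS2007: the activity rate `nu`, the ground-motion coefficients in `p_exceed`, and the source distance are all hypothetical.

    ```python
    import math

    # Truncated exponential Gutenberg-Richter magnitude density on [m_min, m_max].
    def gr_pdf(m, b=1.0, m_min=4.5, m_max=7.5):
        beta = b * math.log(10)
        return beta * math.exp(-beta * (m - m_min)) / (1 - math.exp(-beta * (m_max - m_min)))

    # Toy ground-motion model: median ln(PGA) grows with magnitude, decays with distance.
    def p_exceed(a_g, m, r_km, sigma=0.6):
        ln_med = -3.0 + 0.9 * m - 1.3 * math.log(r_km)
        z = (math.log(a_g) - ln_med) / sigma
        return 0.5 * math.erfc(z / math.sqrt(2))  # P(ln PGA > ln a) for lognormal scatter

    def annual_exceedance_rate(a_g, nu=0.05, r_km=30.0, n=200):
        # Integrate the exceedance probability over magnitude (midpoint rule),
        # scaled by the source's annual activity rate nu.
        m_min, m_max = 4.5, 7.5
        dm = (m_max - m_min) / n
        total = 0.0
        for i in range(n):
            m = m_min + (i + 0.5) * dm
            total += gr_pdf(m) * p_exceed(a_g, m, r_km) * dm
        return nu * total

    # Hazard-curve points and the Poisson probability of exceedance in 50 years.
    for a in (0.05, 0.1, 0.2, 0.4):
        lam = annual_exceedance_rate(a)
        p50 = 1 - math.exp(-lam * 50)
        print(f"PGA>{a:.2f} g: rate={lam:.2e}/yr, P(50 yr)={p50:.3f}")
    ```

    The hazard curve is the set of (PGA, annual rate) pairs; real analyses sum such integrals over all modelled sources and distances.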

  6. A Situational Analysis of Priority Disaster Hazards in Uganda ...

    African Journals Online (AJOL)

    Background: Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazards and vulnerability analyses provide vital information that can be used for development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for ...

  7. Seismic Hazard Analysis of the Bandung Triga 2000 Reactor Site

    OpenAIRE

    Parithusta, Rizkita; P, Sindur; Mangkoesoebroto

    2004-01-01

    A seismic hazard analysis of the West Java region is carried out to estimate the peak ground acceleration at the Bandung TRIGA 2000 nuclear reactor site. Both probabilistic and deterministic approaches are employed to better capture the uncertainties, considering the enclosing fault systems. A comprehensive analysis is performed based on the newly revised catalog of seismic data, the most recent results of the construction of se...

  8. Arc flash hazard analysis and mitigation

    CERN Document Server

    Das, J C

    2012-01-01

    "All the aspects of arc flash hazard calculations and their mitigation have been covered. Knowledge of electrical power systems up to undergraduate level is assumed. The calculations of short-circuits, protective relaying, and varied electrical system configurations in industrial power systems are addressed. Protection systems address differential relays, arc flash sensing relays, protective relaying coordination, current transformer operation and saturation, and applications to major electrical equipment from arc flash considerations. Current technologies and strategies for arc flash mitigation have been covered. A new algorithm for the calculation of arc flash hazard, accounting for the decaying nature of short-circuit currents, is included. There are many practical examples and study cases. Review questions and references follow each chapter"--

  9. The contribution of bedload transport processes to natural hazard damage costs in Switzerland

    Science.gov (United States)

    Andres, Norina; Badoux, Alexandre; Turowski, Jens

    2013-04-01

    In Alpine regions, floods are often associated with erosion along the stream channels and with bedload transport in mountain rivers. These bedload transport processes pose a hazard in addition to the elevated water discharge. However, it is unclear how much bedload transport processes contribute to the total damage caused by natural hazards, information that may be vital for flood mitigation measures and for the design of protective infrastructure. Using the Swiss flood and landslide database, which has collected direct financial damage data on naturally triggered floods, debris flows, and landslides since 1972, we estimated the contribution of bedload transport processes to total natural hazard damage costs in Switzerland. For each database entry, an upper and a lower limit of the financial damage caused by or related to fluvial bedload transport processes was estimated, and the quality of the estimate was judged. When compared to total damage, the fraction of bedload transport damage lies between 32 and 37% (lower and upper estimates). Over the 40-year study period, bedload transport processes have induced a cumulative financial damage between 4.3 and 5.1 billion CHF. Spatial analysis revealed the highest damage in mountainous regions. The analysis of the seasonal distribution of bedload erosion and deposition shows that more than 75% of the costs occur in summer (June through August), and ~23% in autumn (September through November). With roughly 56%, by far the most damage has been registered in the month of August. In winter and spring, damage due to bedload processes is very low. Despite more than a hundred years of research, bedload transport processes are inadequately understood, and the predictive quality of common bedload equations is still poor.
The importance of bedload transport processes as a natural hazard and financial source of risk, and thus the need for future structured research programmes on transport processes in steep streams has been demonstrated in our

  10. Design characteristics for facilities which process hazardous particulate

    Energy Technology Data Exchange (ETDEWEB)

    Abeln, S.P.; Creek, K.; Salisbury, S.

    1998-12-01

    Los Alamos National Laboratory is establishing a research and processing capability for beryllium. The unique properties of beryllium, including light weight, rigidity, thermal conductivity, heat capacity, and nuclear properties, make it critical to a number of US defense and aerospace programs. Concomitant with the unique engineering properties are the health hazards associated with processing beryllium in particulate form and the potential for worker inhalation of aerosolized beryllium. Beryllium has the lowest airborne standard for worker protection of all nonradioactive metals, by more than an order of magnitude. This paper describes the design characteristics of the new beryllium facility at Los Alamos as they relate to protection of the workforce. The design characteristics reviewed include: facility layout, support systems to minimize aerosol exposure and spread, and a detailed review of the ventilation system design for general room air cleanliness and extraction of particulate at the source.

  11. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  12. Seismic hazard analysis for Jayapura city, Papua

    Energy Technology Data Exchange (ETDEWEB)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A. [Geological Agency, Diponegoro Road No.57, Bandung, 40122 (Indonesia)

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source model are used: a subduction model from the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors obtained with a geomorphological approach are corrected by measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.
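    Under the Poisson occurrence assumption that underlies such hazard maps, a quoted probability of exceedance p over an exposure time t corresponds to a return period T = -t / ln(1 - p); a minimal sketch of the conversion:

    ```python
    import math

    # Convert "probability p of exceedance in t years" to a Poisson return period.
    def return_period(p, t_years):
        return -t_years / math.log(1 - p)

    # The familiar design convention of 10% in 50 years:
    print(round(return_period(0.10, 50)))   # 475 years
    # 10% in 500 years, as quoted in the abstract:
    print(round(return_period(0.10, 500)))  # 4746 years
    ```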

  13. Occupational exposures to uranium: processes, hazards, and regulations

    Energy Technology Data Exchange (ETDEWEB)

    Stoetzel, G.A.; Fisher, D.R.; McCormack, W.D.; Hoenes, G.R.; Marks, S.; Moore, R.H.; Quilici, D.G.; Breitenstein, B.D.

    1981-04-01

    The United States Uranium Registry (USUR) was formed in 1978 to investigate potential hazards from occupational exposure to uranium and to assess the need for special health-related studies of uranium workers. This report provides a summary of Registry work done to date. The history of the uranium industry is outlined first, and the current commercial uranium industry (mining, milling, conversion, enrichment, and fuel fabrication) is described. This description includes information on basic processes and areas of greatest potential radiological exposure. In addition, inactive commercial facilities and other uranium operations are discussed. Regulation of the commercial production industry for uranium fuel is reported, including the historic development of regulations and the current regulatory agencies and procedures for each phase of the industry. A review of radiological health practices in the industry - facility monitoring, exposure control, exposure evaluation, and record-keeping - is presented. A discussion of the nonradiological hazards of the industry is provided, and the final section describes the tissue program developed as part of the Registry.

  14. Application of the Hazard Analysis Critical Control Point (HACCP) System in the Production Process of Tempe Chips

    Directory of Open Access Journals (Sweden)

    Rahmi Yuniarti

    2015-06-01

    Malang is one of the industrial centers for tempe chips. To maintain quality and food safety, an analysis is required to identify hazards during the production process. This study was conducted to identify the hazards during the production process of tempe chips and to provide recommendations for developing a HACCP system. The production process of tempe chips starts with slicing the tempe, moving it to the kitchen, coating it with flour dough, frying it in the pan, draining it, packaging it, and then storing it. There are three types of potential hazard, biological, physical, and chemical, during the production process. CCP identification found three processes that have a Critical Control Point: slicing the tempe, immersing the tempe in the flour mixture, and draining. Recommendations for the development of the HACCP system include recommendations related to employee hygiene, supporting equipment, 5-S analysis, and the production layout.

  15. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    Science.gov (United States)

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
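    The mediation-formula approach described above can be illustrated with a minimal sketch on the survival-probability scale. A simple parametric baseline cumulative hazard stands in for the paper's fractional-polynomial/spline approximation, the mediator is binary with a logistic model, and every coefficient below is hypothetical, not taken from the study.

    ```python
    import math

    # Hypothetical fitted coefficients (in practice estimated from data).
    BETA_X, BETA_M = 0.4, 0.7      # Cox log-hazard ratios for exposure X and mediator M
    GAMMA0, GAMMA_X = -0.5, 0.8    # logistic model for P(M = 1 | X)

    def h0_cum(t):
        # Simple parametric baseline cumulative hazard (Weibull-like shape).
        return 0.05 * t ** 1.2

    def surv(t, x, m):
        # Cox model survival: S(t | x, m) = exp(-H0(t) * exp(b1*x + b2*m)).
        return math.exp(-h0_cum(t) * math.exp(BETA_X * x + BETA_M * m))

    def p_mediator(x):
        return 1 / (1 + math.exp(-(GAMMA0 + GAMMA_X * x)))

    def s_counterfactual(t, x, x_med):
        # Mediation formula: survival under exposure x with the mediator
        # distributed as if exposure were x_med.
        p1 = p_mediator(x_med)
        return surv(t, x, 1) * p1 + surv(t, x, 0) * (1 - p1)

    t = 5.0
    te  = s_counterfactual(t, 1, 1) - s_counterfactual(t, 0, 0)  # total effect
    nie = s_counterfactual(t, 1, 1) - s_counterfactual(t, 1, 0)  # natural indirect effect
    nde = s_counterfactual(t, 1, 0) - s_counterfactual(t, 0, 0)  # natural direct effect
    print(f"TE={te:.4f}  NIE={nie:.4f}  NDE={nde:.4f}")
    ```

    By construction the total effect decomposes exactly as TE = NIE + NDE on this scale; the paper's estimators additionally propagate sampling uncertainty, which this sketch omits.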

  16. Pedestrian Evacuation Analysis for Tsunami Hazards

    Science.gov (United States)

    Jones, J. M.; Ng, P.; Wood, N. J.

    2014-12-01

    Recent catastrophic tsunamis in the last decade, as well as the 50th anniversary of the 1964 Alaskan event, have heightened awareness of the threats these natural hazards present to large and increasing coastal populations. For communities located close to the earthquake epicenter that generated the tsunami, strong shaking may also cause significant infrastructure damage, impacting the road network and hampering evacuation. There may also be insufficient time between the earthquake and first wave arrival to rely on a coordinated evacuation, leaving at-risk populations to self-evacuate on foot and across the landscape. Emergency managers evaluating these coastal risks need tools to assess the evacuation potential of low-lying areas in order to discuss mitigation options, which may include vertical evacuation structures to provide local safe havens in vulnerable communities. The U.S. Geological Survey has developed the Pedestrian Evacuation Analyst software tool for use by researchers and emergency managers to assist in the assessment of a community's evacuation potential by modeling travel times across the landscape and producing both maps of travel times and charts of population counts with corresponding times. The tool uses an anisotropic (directionally dependent) least cost distance model to estimate evacuation potential and allows for the variation of travel speed to measure its effect on travel time. The effectiveness of vertical evacuation structures on evacuation time can also be evaluated and compared with metrics such as travel time maps showing each structure in place and graphs displaying the percentage change in population exposure for each structure against the baseline. Using the tool, travel time maps and at-risk population counts have been generated for some coastal communities of the U.S. Pacific Northwest and Alaska. The tool can also be used to provide valuable decision support for tsunami vertical evacuation siting.
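    The least-cost travel-time idea behind such evacuation modeling can be sketched with a plain Dijkstra search over a grid of walking speeds. This isotropic simplification omits the directional (anisotropic) slope handling of the USGS tool, and the grid values and cell size are made up for illustration.

    ```python
    import heapq

    CELL = 10.0  # grid resolution in metres (hypothetical)

    # Walking speed in m/s per cell; low values model slow terrain.
    speed = [
        [1.5, 1.5, 1.0, 1.0],
        [1.5, 0.5, 0.5, 1.0],
        [1.5, 1.5, 1.5, 1.0],
    ]
    safe = [(0, 0)]  # safe-haven cell(s): travel time zero

    def travel_times(speed, safe):
        rows, cols = len(speed), len(speed[0])
        t = [[float("inf")] * cols for _ in range(rows)]
        pq = []
        for r, c in safe:
            t[r][c] = 0.0
            heapq.heappush(pq, (0.0, r, c))
        while pq:
            d, r, c = heapq.heappop(pq)
            if d > t[r][c]:
                continue  # stale queue entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # Edge cost: cell crossing at the average of the two cell speeds.
                    step = CELL / ((speed[r][c] + speed[nr][nc]) / 2)
                    if d + step < t[nr][nc]:
                        t[nr][nc] = d + step
                        heapq.heappush(pq, (t[nr][nc], nr, nc))
        return t

    tt = travel_times(speed, safe)
    ```

    Rerunning with a candidate vertical evacuation structure added to `safe` and comparing the resulting time surfaces mirrors the tool's structure-siting comparison.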

  17. Seismic Hazard Analysis and Uniform Hazard Spectra for Different Regions of Kerman

    Directory of Open Access Journals (Sweden)

    Gholamreza Ghodrati Amiri

    2015-09-01

    This paper presents a seismic hazard analysis and uniform hazard spectra for different regions of Kerman city. A catalogue containing both historical and instrumental events, covering the period from the 8th century AD until now within an area of 200 km radius, was compiled, and the seismic sources were modeled. The Kijko method was applied to estimate the seismic parameters, considering the lack of suitable seismic data, the inaccuracy of the available information, and the uncertainty of magnitude in different periods. To determine the peak ground acceleration, the calculations were performed using the logic tree method with two weighted attenuation relations: Ghodrati et al. (weight 0.6) and Zare et al. (weight 0.4). The analysis was conducted for 13×8 grid points over the Kerman region and adjacent areas with the SEISRISK III software, and the spectral attenuation relationships of Ghodrati et al. were used to determine the seismic spectra.
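    The logic-tree step described above, weighting two attenuation relations 0.6/0.4, amounts to a weighted sum of the hazard curves each branch produces. A sketch with placeholder power-law curve shapes (the actual Ghodrati et al. and Zare et al. relations are not reproduced here):

    ```python
    # Logic-tree branch weights as given in the abstract.
    WEIGHTS = {"ghodrati": 0.6, "zare": 0.4}

    # Placeholder annual exceedance-rate curves per branch; purely illustrative.
    def rate_ghodrati(a_g):
        return 0.01 * a_g ** -1.8

    def rate_zare(a_g):
        return 0.012 * a_g ** -1.6

    def combined_rate(a_g):
        # Weighted mean of the branch hazard curves.
        return WEIGHTS["ghodrati"] * rate_ghodrati(a_g) + WEIGHTS["zare"] * rate_zare(a_g)

    # Read the design PGA for a 475-year return period off the combined curve
    # by bisection (the curve is monotonically decreasing in PGA).
    def pga_for_return_period(T=475.0, lo=0.01, hi=5.0):
        target = 1.0 / T
        for _ in range(60):
            mid = (lo + hi) / 2
            if combined_rate(mid) > target:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    print(f"475-yr design PGA (toy curves): {pga_for_return_period():.3f} g")
    ```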

  18. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    Science.gov (United States)

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes; the identification of all hazards that are likely to occur in the production establishment; the identification of the critical points in the process at which these hazards may be introduced into the product and therefore should be controlled; the establishment of critical limits for control at those points; the verification of these prescribed steps; and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  19. 78 FR 3646 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-01-16

    ... Critical Control Points (HACCP) Systems D. Food Safety Problems Associated With Manufacturing, Processing..., explains the principles and history of the use of Hazard Analysis and Critical Control Point (HACCP... as the Hazard Analysis and Critical Control Points (HACCP) approach to food safety. HACCP was...

  20. Application of systems and control theory-based hazard analysis to radiation oncology.

    Science.gov (United States)

    Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G

    2016-03-01

    Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called the systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards, as well as causal scenarios that can lead to the identified unsafe control. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, and a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, covering the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations in system safety and for generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. 
The method of STPA produces results that can be used to improve

  1. Microbiological quality of seasoned roasted laver and potential hazard control in a real processing line.

    Science.gov (United States)

    Choi, Eun Sook; Kim, Nam Hee; Kim, Hye Won; Kim, Sun Ae; Jo, Jun Il; Kim, Soon Han; Lee, Soon Ho; Ha, Sang Do; Rhee, Min Suk

    2014-12-01

    Microbiological quality of laver, one of the edible seaweeds, has not been reported in a real processing line. Laver or supplements were collected from six manufacturers (A to F) to assess potential microbiological hazards and the critical control points in commercial processing lines. Aerobic plate counts (APC), coliform counts, Bacillus cereus, Staphylococcus aureus, and Vibrio parahaemolyticus were enumerated, and the presence of B. cereus, Listeria monocytogenes, Salmonella, S. aureus, and V. parahaemolyticus were confirmed during processing. The raw material, i.e., dried laver, had a high initial APC level (4.4 to 7.8 log CFU/g), which decreased gradually during processing (final products, 1.3 to 5.9 log CFU/g). Coliforms and B. cereus were not detected in any of the final products, but they were present in some raw materials and semiprocessed products in quantitative analysis. After enrichment for recovery of stress-injured cells, E. coli and foodborne pathogens were not detected in any samples, with the exception of B. cereus. Heat-injured and spore-forming B. cereus isolates were occasionally obtained from some of the raw materials and products after enrichment, thus B. cereus may be a potential microbiological hazard that should be controlled using strategic intervention measures. Secondary roasting (260 to 400°C, 2 to 10 s) significantly reduced the APC (maximum log reduction, 4.7 log CFU/g), and this could be a key intervention step for controlling microbiological hazards during processing (critical control point). When this step was performed appropriately, according to the processing guide for each plant, the microorganisms were inactivated more successfully in the products. This study provides scientific evidence that may facilitate the development of strategies for microbiological hazard control and hygienic management guidelines for real manufacturing plants.

  2. Analysis of temporal and spatial overlapping of hazards interactions at different scales

    Science.gov (United States)

    De Angeli, Silvia; Trasforini, Eva; Taylor, Faith; Rudari, Roberto; Rossi, Lauro

    2017-04-01

    The aim of this work is to develop a methodological framework to analyse the impact of multiple hazards on complex territorial systems, not only focusing on multi-hazard interactions but also evaluating multi-risk, i.e. considering the impact of multiple hazards in terms of exposure and vulnerability as well. The impacts generated by natural hazards have grown in recent years, partly because many regions of the world have become subject to multiple hazards and cascading effects. Modelling the multi-hazard dimension is a new challenge that allows stakeholders to address chain effects between hazards and to model risk in a truly holistic way. Despite the recognised importance of a multi-hazard approach in risk assessment, only a few multi-risk approaches have been developed up to now. The examination of multiple hazards, in contrast to single-hazard cases, poses a series of challenges at each step of the risk analysis, starting from the assessment of the hazard level, passing through the vulnerability evaluation, and arriving finally at the resultant risk level. Hazard interactions and hazard contemporaneity arising from their spatial and temporal overlap may influence not only the overall hazard level but also the vulnerability of the elements at risk. In the proposed approach, a series of possible interactions between hazards are identified and classified. These interactions are then analysed with respect to the temporal and spatial evolution of the hazards and the consequent impacts, and are represented through an explicative graphical framework. Different temporal dimensions are identified. The time of the impact differs from the time of the damage because, even after the end of the impact, damage remains until recovery and restoration processes are completed. The discrepancy between the time of the impact and the time of the damage is very important for the modelling of multi-hazard damage. Whenever a certain interval of time occurs between two impacts

  3. Hazardous-waste analysis plan for LLNL operations

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from the waste produced by industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes, and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan addresses all hazardous waste.

  4. Landslide hazards and systems analysis: A Central European perspective

    Science.gov (United States)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments with the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that were transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures and rising government debt. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits is starting to erode historical resilience gains, which brings especially small communities to a tipping point in their risk-reduction efforts. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  5. Seismic hazard analysis of Sinop province, Turkey using ...

    Indian Academy of Sciences (India)

    Using earthquakes of magnitude 4.0 and greater that occurred between 1 January 1900 and 31 December 2008 in the Sinop province of Turkey, this study presents a seismic hazard analysis based on probabilistic and statistical methods. According to the earthquake zonation map, Sinop is divided into first, second, third ...

  6. Environmental Impact and Hazards Analysis Critical Control Point ...

    African Journals Online (AJOL)

    Tsire is a local meat delicacy (kebab) in northern Nigeria, which has become popular and widely accepted throughout the country and even beyond. Three production sites of tsire were evaluated for environmental impact and hazard analysis critical control point (HACCP) assessment of the microbiological and chemical qualities ...

  7. Development of Hazard Analysis Critical Control Points (HACCP ...

    African Journals Online (AJOL)

    Development of Hazard Analysis Critical Control Points (HACCP) and Enhancement of Microbial Safety Quality during Production of Fermented Legume Based ... Nigerian Food Journal ... Critical control points during production of iru and okpehe, two fermented condiments, were identified in four processors in Nigeria.

  8. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    Energy Technology Data Exchange (ETDEWEB)

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas of the Cold Vacuum Drying Facility at the Hanford Site, in relation to existing or proposed fire protection features, to ascertain whether the objectives of DOE Order 5480.7A, Fire Protection, are met.

  9. 14 CFR 417.223 - Flight hazard area analysis.

    Science.gov (United States)

    2010-01-01

    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff to the planned safe flight state of § 417.219(c), including each planned impact, for an orbital... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard areas...

  10. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    We use the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a completely safe system may not be achievable. The standard definition of a system from MIL-STD-882E is: “The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results.” From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, methods for identifying software failures and determining their effects remain a research problem to date. Since the success of software development rests on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing, also in terms of efficiency. Lessons learned and experience from similar systems are important for hazard analysis work. No major hazard has been issued for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  11. HADES: Microprocessor Hazard Analysis via Formal Verification of Parameterized Systems

    Directory of Open Access Journals (Sweden)

    Lukáš Charvát

    2016-12-01

    Full Text Available HADES is a fully automated verification tool for pipeline-based microprocessors that aims at flaws caused by improperly handled data hazards. It focuses on single-pipeline microprocessors designed at the register transfer level (RTL and deals with read-after-write, write-after-write, and write-after-read hazards. HADES combines several techniques, including data-flow analysis, error pattern matching, SMT solving, and abstract regular model checking. It has been successfully tested on several microprocessors for embedded applications.
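    The three hazard classes named above can be illustrated with a toy check that compares the registers each instruction reads and writes; this is a hedged sketch in Python, not the HADES tool itself, which analyses RTL designs with formal methods.

```python
# Illustrative only: classify data hazards between instruction pairs by
# intersecting their read/write register sets. A real pipeline analysis
# would also account for issue order, stalls, and forwarding logic.

def classify_hazards(instrs):
    """instrs: list of (name, writes, reads) with register-name sets."""
    hazards = []
    for i, (ni, wi, ri) in enumerate(instrs):
        for nj, wj, rj in instrs[i + 1:]:
            if wi & rj:
                hazards.append(("RAW", ni, nj))  # read-after-write
            if wi & wj:
                hazards.append(("WAW", ni, nj))  # write-after-write
            if ri & wj:
                hazards.append(("WAR", ni, nj))  # write-after-read
    return hazards

program = [
    ("add r1,r2,r3", {"r1"}, {"r2", "r3"}),
    ("sub r4,r1,r5", {"r4"}, {"r1", "r5"}),   # reads r1 -> RAW
    ("mul r1,r6,r7", {"r1"}, {"r6", "r7"}),   # rewrites r1 -> WAW, WAR
]
found = classify_hazards(program)  # one hazard of each kind
```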

  12. Hazard analysis of Clostridium perfringens in the Skylab Food System

    Science.gov (United States)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
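    The generation-time determination described above follows the standard doubling-time relation; the counts and times in this sketch are invented for illustration and are not Skylab data.

```python
import math

# Generation (doubling) time from an initial count n0 growing to n over
# a known incubation time: g = t / log2(n / n0). Numbers are illustrative.
def generation_time(n0, n, hours):
    doublings = math.log2(n / n0)
    return hours / doublings

# 100 CFU/g growing to 1600 CFU/g during a 2 h incubation: 4 doublings.
g = generation_time(n0=100.0, n=1600.0, hours=2.0)  # 0.5 h per generation
```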

  13. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    Science.gov (United States)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  14. Frequency Analysis of Aircraft hazards for License Application

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These and other aircraft operations are identified and described in "Identification of Aircraft Hazards" (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for the aircraft hazards identified for detailed analysis in "Identification of Aircraft Hazards" (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which is addressed in this analysis. This analysis addresses only the repository, not the transportation routes to the site. The analysis is intended to provide the basis for: (1) categorizing event sequences related to aircraft hazards; and (2) identifying design or operational requirements related to aircraft hazards.
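    A crash-frequency estimate of the kind described above is commonly built as a product of factors, in the spirit of DOE-STD-3014; every number below is an illustrative assumption, not a value from this analysis.

```python
# Hypothetical four-factor sketch: expected crashes per year on a target
# = flights/yr x crash probability per flight-mile x probability density
# of a crash landing near the site (per mi^2) x effective target area.
def crash_frequency(flights_per_year, crash_rate_per_mile,
                    location_pdf_per_mi2, effective_area_mi2):
    return (flights_per_year * crash_rate_per_mile *
            location_pdf_per_mi2 * effective_area_mi2)

f = crash_frequency(flights_per_year=5.0e4,      # corridor traffic (assumed)
                    crash_rate_per_mile=4.0e-8,  # assumed crash rate
                    location_pdf_per_mi2=0.1,    # assumed density near site
                    effective_area_mi2=0.02)     # assumed facility footprint
```

Event sequences would then be categorized by comparing the resulting frequency against screening thresholds.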

  15. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    Science.gov (United States)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for the majority of National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably, resulting in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  16. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    OpenAIRE

    Kristian Beckers; Jürgen Dürrwang; Dominik Holling

    2016-01-01

    The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied to...

  17. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Full Text Available Tourism is a dynamic sector which is continuously evolving worldwide. Compared with the food industry, catering has specific features that must be respected: the food-serving procedure, numerous complex recipes and production technologies, staff fluctuation, and old equipment. For effective and permanent implementation, the HACCP concept requires a solid base, and in this case that base is the people handling the food. This paper presents international ISO standards, the HACCP concept, and the importance of its application in the tourism and hospitality industry. HACCP is a food safety management system based on the analysis and control of biological, chemical, and physical hazards in the entire process, from raw material production, procurement, and handling to manufacturing, distribution, and consumption of the finished product. The aim of this paper is to present the importance of applying the HACCP concept in tourism and hotel management as a recognizable international standard.

  18. ANALYSIS OF INTERNAL SOURCES OF HAZARDS IN CIVIL AIR OPERATIONS

    Directory of Open Access Journals (Sweden)

    Katarzyna CHRUZIK

    2017-03-01

    Full Text Available International air law imposes an obligation on the part of transport operators to operationalize risk management, and hence develop records of hazards and estimate the level of risk in the respective organization. Air transport is a complex system combining advanced technical systems, operators and procedures. Sources of hazards occur in all of these closely related and mutually interacting areas, which operate in highly dispersed spaces with a short time horizon. A highly important element of risk management is therefore to identify sources of danger, not only in terms of their own internal risks (the source of threats and activation of threats within the same transport organization, but also in the area of common risk (sources of threats beyond the transport system to which the activation of the hazard is related and external risks (sources of threats outside the transport system. The overall risk management of a transport organization should consider all three risk areas. The paper presents an analysis of internal sources of threats to civil air operations and the resulting main risk areas. The article complements a previous paper by the same authors entitled “Analysis of external sources of hazards in civil air operations”.

  19. Source processes for the probabilistic assessment of tsunami hazards

    Science.gov (United States)

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources into PTHA is considerably more difficult because of a general lack of information on relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems are described that will further advance PTHA methodologies leading to a more accurate understanding of tsunami hazard.

  20. Uncertainty analysis for seismic hazard in Northern and Central Italy

    Science.gov (United States)

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three branch points representing alternative values for b-value, maximum magnitude (Mmax), and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli region (≈0.10 g) for PGA and in the Friuli and Central Apennines regions (≈0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (≈0.15 g) and PGA (≈0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10%-in-50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennines regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
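    The Monte Carlo logic-tree sampling described above can be sketched in a few lines; the branch values, weights, and toy ground-motion functions here are invented for illustration and stand in for real attenuation relationships.

```python
import random
import statistics

# Toy logic tree: three branch points (b-value, Mmax, attenuation model),
# each sampled according to its weight; the spread of the resulting PGA
# sample yields a 95% confidence band and a coefficient of variation.
random.seed(0)

b_values = [(0.9, 0.3), (1.0, 0.4), (1.1, 0.3)]        # (value, weight)
mmax_values = [(6.5, 0.5), (7.0, 0.5)]
atten_models = [(lambda m, b: 0.05 * m / b, 0.5),      # invented "laws"
                (lambda m, b: 0.06 * m / b, 0.5)]

def draw(branches):
    """Weighted random choice over (value, weight) pairs."""
    r, acc = random.random(), 0.0
    for value, weight in branches:
        acc += weight
        if r <= acc:
            return value
    return branches[-1][0]

pgas = sorted(draw(atten_models)(draw(mmax_values), draw(b_values))
              for _ in range(5000))
band95 = (pgas[int(0.025 * len(pgas))], pgas[int(0.975 * len(pgas))])
cov = statistics.pstdev(pgas) / statistics.mean(pgas)
```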

  1. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    Science.gov (United States)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region's seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). The Portuguese Civil Protection Agency (ANPC) therefore sponsored a collaborative research project on the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, reflecting the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008), and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ being the deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels for three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations for all return periods. Moreover, the seismic hazard of most of the Algarve's parishes is dominated by the seismicity located
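    The modal-scenario selection described above reduces, per parish, to taking the highest-contribution bin of the disaggregation; the bins and contribution values in this sketch are invented for illustration.

```python
# Hypothetical 4-D disaggregation result for one parish: each key is a
# (magnitude, latitude, longitude, epsilon) bin, each value its relative
# contribution to the hazard at the chosen ground-motion level.
contributions = {
    (6.0, 36.9, -7.8, 1.0): 0.12,   # moderate local event
    (7.5, 36.5, -9.9, 2.0): 0.31,   # large offshore event
    (5.0, 37.1, -8.0, 0.0): 0.07,
}

# The controlling (modal) scenario is the bin with the largest contribution.
modal_scenario = max(contributions, key=contributions.get)
```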

  2. Hazardous Materials Routing Study Phase II: Analysis of Hazardous Materials Truck Routes in Proximity to the Dallas Central Business District

    Science.gov (United States)

    1985-10-01

    This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...

  3. Environmental risk analysis of hazardous material rail transportation.

    Science.gov (United States)

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials. Published by Elsevier B.V.
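    The nationwide risk aggregation described above multiplies, per route segment, exposure by accident likelihood, release probability, and consequence cost; the segment data below are invented for illustration and are not HMTECM outputs.

```python
# Hypothetical per-segment risk: annual risk (expected $/yr) =
# car-miles/yr x accidents per car-mile x P(release | accident) x
# expected clean-up cost for the local soil / groundwater setting.
segments = [
    # (car_miles_per_yr, acc_rate_per_car_mile, p_release, cleanup_cost)
    (2.0e6, 1.0e-7, 0.05, 1.5e6),
    (5.0e5, 3.0e-7, 0.05, 4.0e6),
]

annual_risk = sum(m * a * p * c for m, a, p, c in segments)
risk_per_car_mile = annual_risk / sum(m for m, _, _, _ in segments)
```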

  4. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    Science.gov (United States)

    This guide supplements NRT-1 by providing technical assistance to LEPCs in assessing the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs), as designated under Section 302 of Title III of SARA.

  5. Survey of knowledge of hazards of chemicals potentially associated with the advanced isotope separation processes

    Energy Technology Data Exchange (ETDEWEB)

    Chester, R.O.; Kirkscey, K.A.; Randolph, M.L.

    1979-09-01

    Hazards of chemicals potentially associated with the advanced isotope separation processes are estimated based on open-literature references. The tentative quantity of each chemical associated with the processes and its toxicity are used to estimate the hazard. The chemicals thus estimated to be the most potentially hazardous to health are fluorine, nitric acid, uranium metal, uranium hexafluoride, and uranium dust. The next most hazardous chemicals are estimated to be bromine, hydrobromic acid, hydrochloric acid, and hydrofluoric acid. For each of these chemicals, and for a number of other process-associated chemicals, the following information is presented: (1) any applicable standards, recommended standards, and their basis; (2) a brief discussion of toxic effects, including short-exposure tolerance, atmospheric concentrations immediately hazardous to life, evaluation of exposures, recommended control procedures, chemical properties, and a list of any toxicology reviews; and (3) recommendations for future research.

  6. Hazard Identification of the Offshore Three-phase Separation Process Based on Multilevel Flow Modeling and HAZOP

    DEFF Research Database (Denmark)

    Wu, Jing; Zhang, Laibin; Lind, Morten

    2013-01-01

    HAZOP studies are widely accepted in the chemical and petroleum industries as the method for conducting process hazard analysis related to the design, maintenance and operation of systems. Different tools have been developed to automate HAZOP studies. In this paper, a HAZOP reasoning method based...

  7. Flood Hazard and Risk Analysis in Urban Area

    Science.gov (United States)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about quality of life but also about the safety of life and property, so the impact of disasters on life and property is residents' most serious concern. For the mitigation of disaster impacts, flood hazard and risk analysis play an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung city was evaluated using statistics on social development factors. The hazard factors of Kaohsiung city were calculated from simulated flood depths for six different return periods and for four typhoon events that caused serious flooding in Kaohsiung city. The flood risk is obtained by combining the flood hazard and social vulnerability. The analysis results provide the authorities with a basis to strengthen disaster preparedness and to allocate more resources to high-risk areas.
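    The hazard-vulnerability combination described above can be sketched as a cell-by-cell product over the study grid; the grid values and the 0.3 cutoff are invented for illustration.

```python
# Hypothetical normalized grids for a 2x2 study area: hazard from
# simulated flood depth, vulnerability from social-development statistics.
hazard = [[0.2, 0.8],
          [0.5, 0.9]]
vulnerability = [[0.4, 0.6],
                 [0.7, 0.3]]

# Risk = hazard x vulnerability, cell by cell.
risk = [[h * v for h, v in zip(hrow, vrow)]
        for hrow, vrow in zip(hazard, vulnerability)]

# Cells above an (assumed) planning threshold get priority resources.
high_risk_cells = [(i, j) for i, row in enumerate(risk)
                   for j, r in enumerate(row) if r > 0.3]
```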

  8. Approaches and practices related to hazardous waste management, processing and final disposal in germany and Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Passos, J.A.L.; Pereira, F.A.; Tomich, S. [CETREL S.A., Camacari, BA (Brazil)

    1993-12-31

    A general overview of existing hazardous waste management and processing technologies in Germany and Brazil is presented in this work. Emphasis has been given to the new technologies and practices adopted in both countries, including a comparison of legislation, standards, and natural trends. Two case studies of large industrial hazardous waste sites are described. 9 refs., 2 figs., 9 tabs.

  9. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Sevougian, S. David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  10. The importance of source area mapping for rockfall hazard analysis

    Science.gov (United States)

    Valagussa, Andrea; Frattini, Paolo; Crosta, Giovanni B.

    2013-04-01

    A problem in characterizing the area affected by rockfall is the correct definition of the source areas. Different positions or sizes of the source areas along a cliff result in different propagation possibilities and different interactions with the passive countermeasures present in the area. Using Hy-Stone (Crosta et al., 2004), a code able to perform 3D numerical modeling of rockfall processes, different types of source areas were tested on a case study slope along the western flank of Mt. de La Saxe (Courmayeur, AO), extending between 1200 and 2055 m a.s.l. The first set of source areas consists of unstable rock masses identified on the basis of field survey and Terrestrial Laser Scanning (IMAGEO, 2011). A second set of source areas was identified by using different slope gradient thresholds, tested between 50° and 75° at 5° intervals. The third source area dataset was generated by performing a kinematic stability analysis. For this analysis, we mapped the joint sets along the rocky cliff by means of the software COLTOP 3D (Jaboyedoff, 2004), and then identified the portions of the rocky cliff where planar/wedge and toppling failures are possible, assuming an average friction angle of 35°. From the outputs of the Hy-Stone models we extracted and analyzed the kinetic energy, flight height, and velocity of the blocks falling along the rocky cliff in order to compare the controls exerted by the different source areas. We observed strong variations of kinetic energy and flight height among the different models, especially when using unstable masses identified through Terrestrial Laser Scanning. This is mainly related to the size of the blocks identified as susceptible to failure. On the contrary, the slope gradient thresholds do not have a strong impact on rockfall propagation. This contribution highlights the importance of a careful and appropriate mapping of rockfall source areas for rockfall hazard analysis and the
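    The slope-gradient criterion used for the second set of source areas reduces to thresholding a slope grid; the grid values here are invented for illustration.

```python
# Hypothetical slope grid (degrees). Cells at or above the threshold are
# flagged as potential rockfall source cells; the study tests thresholds
# from 50 to 75 degrees at 5 degree intervals.
slope_deg = [[35.0, 52.0, 61.0],
             [48.0, 66.0, 73.0],
             [30.0, 55.0, 58.0]]

def source_cells(grid, threshold_deg):
    return [(i, j) for i, row in enumerate(grid)
            for j, s in enumerate(row) if s >= threshold_deg]

sources_60 = source_cells(slope_deg, 60.0)   # strict: only cliff cells
sources_50 = source_cells(slope_deg, 50.0)   # permissive: supersets the above
```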

  11. Real-time Microseismic Processing for Induced Seismicity Hazard Detection

    Energy Technology Data Exchange (ETDEWEB)

    Matzel, Eric M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-31

    Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  12. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each...

  13. Hazardous materials transportation: a risk-analysis-based routing methodology.

    Science.gov (United States)

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology, based on risk analysis, for selecting the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; arc costs are then defined to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed and further research developments are proposed.
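The 'minimum cost flow problem' formulation can be illustrated with a small successive-shortest-path solver. The four-node network, capacities, and unit costs below are invented for illustration; this is not the OPTIPATH code.

```python
def min_cost_flow(nodes, arcs, source, sink, supply):
    """Ship `supply` units from source to sink at minimum total cost,
    honouring arc capacities (successive shortest paths via Bellman-Ford).
    arcs: dict (u, v) -> (capacity, unit_cost)."""
    cap, cost = {}, {}
    for (u, v), (c, w) in arcs.items():
        cap[(u, v)], cap[(v, u)] = c, 0        # residual reverse arc
        cost[(u, v)], cost[(v, u)] = w, -w     # cancelling flow refunds cost
    flow = {a: 0 for a in arcs}
    total, remaining = 0, supply
    while remaining > 0:
        dist = {n: float("inf") for n in nodes}
        prev, dist[source] = {}, 0
        for _ in range(len(nodes) - 1):        # Bellman-Ford relaxation
            for (u, v), c in cap.items():
                if c > 0 and dist[u] + cost[(u, v)] < dist[v]:
                    dist[v], prev[v] = dist[u] + cost[(u, v)], u
        if dist[sink] == float("inf"):
            raise ValueError("capacities cannot carry the required supply")
        path, n = [], sink                     # walk back along the path
        while n != source:
            path.append((prev[n], n))
            n = prev[n]
        push = min([remaining] + [cap[a] for a in path])
        for a in path:                         # augment along the path
            cap[a] -= push
            cap[(a[1], a[0])] += push
            total += push * cost[a]
            if a in flow:
                flow[a] += push
            else:
                flow[(a[1], a[0])] -= push
        remaining -= push
    return flow, total

# Hypothetical network: ship 10 vehicle-loads from origin O to destination D;
# capacities reflect risk criteria, unit costs combine expenses and risk.
arcs = {("O", "A"): (6, 4), ("O", "B"): (10, 7),
        ("A", "D"): (6, 3), ("B", "D"): (10, 2)}
flow, total = min_cost_flow(["O", "A", "B", "D"], arcs, "O", "D", 10)
```

The cheap route via A is filled to its capacity of 6 and the remaining 4 units spill onto the costlier route via B, exactly the behaviour the arc-capacity constraints are meant to produce.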

  14. Evaluation of an active learning module to teach hazard and risk in Hazard Analysis and Critical Control Points (HACCP) classes.

    Science.gov (United States)

    Oyarzabal, Omar A; Rowe, Ellen

    2017-04-01

    The terms hazard and risk are significant building blocks for the organization of risk-based food safety plans. Unfortunately, these terms are not clear to some personnel working in food manufacturing facilities. In addition, there are few examples of active learning modules for teaching adult participants the principles of hazard analysis and critical control points (HACCP). In this study, we evaluated the effectiveness of an active learning module to teach hazard and risk to participants of HACCP classes provided by the University of Vermont Extension in 2015 and 2016. This interactive module comprises a questionnaire; group play of a dice game that we have previously introduced in the teaching of HACCP; a discussion of the terms hazard and risk; and a self-assessment questionnaire to evaluate the teaching of hazard and risk. Of the 71 adult participants who completed this module, 40 (56%) provided the most appropriate definition of hazard, 19 (27%) provided the most appropriate definition of risk, 14 (20%) provided the most appropriate definitions of both hazard and risk, and 23 (32%) did not provide an appropriate definition of either hazard or risk. Self-assessment data showed an improvement in the understanding of these terms (P < 0.05). Thirty participants (42%) stated that the most valuable thing they learned with this interactive module was the difference between hazard and risk, and 40 participants (65%) responded that they had not attended similar presentations in the past. The fact that fewer than one third of the participants correctly defined both hazard and risk at baseline is not surprising. However, these results highlight the need to incorporate modules discussing these important food safety terms and to include more active learning modules in food safety classes. This study suggests that active learning helps food personnel better understand important food safety

  16. Can the Hazard Assessment and Critical Control Points (HACCP) system be used to design process-based hygiene concepts?

    Science.gov (United States)

    Hübner, N-O; Fleßa, S; Haak, J; Wilke, F; Hübner, C; Dahms, C; Hoffmann, W; Kramer, A

    2011-01-01

    Recently, the HACCP (Hazard Analysis and Critical Control Points) concept was proposed as a possible way to implement process-based hygiene concepts in clinical practice, but the extent to which this food safety concept can be transferred into the health care setting is unclear. We therefore discuss possible ways to translate the principles of HACCP to health care settings. While a direct implementation of food processing concepts into health care is not very likely to be feasible and will probably not readily yield the intended results, the underlying principles of process orientation, in-process safety control and hazard-analysis-based countermeasures are transferable to clinical settings. In model projects, the proposed concepts should be implemented, monitored, and evaluated under real-world conditions.

  17. Hazard analysis of a computer based medical diagnostic system.

    Science.gov (United States)

    Chudleigh, M F

    1994-07-01

    Medical screening of sectors of the population is now a routine and vital part of health care: an example is cervical smear testing. There is currently significant interest in the possible introduction of semi-automated microscopy systems for cervical cytology and one such experimental system is now undergoing laboratory trials. A collaborative project has been set up to demonstrate the benefits and constraints that arise from applying safety-critical methods developed in other domains to such a diagnostic system. We have carried out a system hazard analysis, successfully using the HAZOP technique adapted from the petrochemical industry.

  18. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
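A process simulation of the kind described can be sketched, under simplifying assumptions, as a single FIFO queue served by several badge windows; the arrival and service times below are hypothetical, not Badge Office data.

```python
import heapq

def simulate_badge_office(arrivals, service_times, n_windows):
    """FIFO single-queue, multi-window service: each customer is served by
    whichever window frees up first. Returns per-customer time in queue."""
    free_at = [0.0] * n_windows        # min-heap of window free times
    heapq.heapify(free_at)
    waits = []
    for arrive, service in zip(arrivals, service_times):
        window_free = heapq.heappop(free_at)
        start = max(arrive, window_free)
        waits.append(start - arrive)
        heapq.heappush(free_at, start + service)
    return waits

# Four customers arriving a minute apart, each needing 4 minutes of service
arrivals = [0.0, 1.0, 2.0, 3.0]
services = [4.0, 4.0, 4.0, 4.0]
one_window = simulate_badge_office(arrivals, services, 1)
two_windows = simulate_badge_office(arrivals, services, 2)
```

With one window the four customers wait 0, 3, 6 and 9 minutes; opening a second window cuts the waits to 0, 0, 2 and 2 — the throughput/wait trade-off this kind of simulation model is built to assess.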

  19. Identifying nursing hazards in the emergency department: a new approach to nursing job hazard analysis.

    Science.gov (United States)

    Ramsay, Jim; Denny, Frank; Szirotnyak, Kara; Thomas, Jonathan; Corneliuson, Elizabeth; Paxton, Kim L

    2006-01-01

    It is widely acknowledged that nurses are crucial components of the healthcare system. In their roles, nurses are regularly confronted with a variety of biological, physical, and chemical hazards during the course of performing their duties. The safety of nurses themselves, and subsequently that of their patients, depends directly upon the degree to which nurses have knowledge of occupational hazards specific to their jobs and managerial mechanisms for mitigating those hazards. The level of occupational safety and health training resources available to nurses, as well as management support, are critical factors in preventing adverse outcomes from routine job-related hazards. This study identifies gaps in self-protective safety education for registered nurses working in emergency departments as well as for nursing students. Furthermore, it reviews the nature and scope of occupational nursing hazards and the degree to which current nursing education and position descriptions (or functional statements) equip nurses to recognize and address the hazards inherent in their jobs. The study has three parts. First, a literature review was performed to summarize the nature and scope of occupational nursing hazards. Second, the safety components of position descriptions from 29 Veterans Affairs (VA) hospitals across the United States were obtained and evaluated by an expert panel of occupational health nurses. Finally, an expert panel of occupational health nurses evaluated the degree to which nursing accreditation standards are integrated with OSHA's list of known emergency department hazards, and a separate expert panel evaluated the degree to which current VA emergency department nursing position descriptions incorporated hazard recognition and control strategies. Ultimately, prevention of job-related injuries for nurses, and subsequently their patients, will depend directly on the degree to which nurses can identify and control the

  20. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not address threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  1. Occupational hazards control of hazardous substances in clean room of semiconductor manufacturing plant using CFD analysis.

    Science.gov (United States)

    Li, Jianfeng; Zhou, Ya-Fei

    2015-02-01

    The manufacturing processes in chip industries are complex, and many kinds of raw materials and solvents of differing natures are used, most of which are highly toxic and dangerous. During machine preventive maintenance, these toxic and harmful substances can escape from the sealed reaction chamber into the clean workshop environment and endanger the health of on-site workers, resulting in occupational diseases. From the perspective of prevention, the spread of hydrochloric acid (HCl) escaping from the metal-etching chamber during maintenance, and its prediction, were studied in this article. Computational fluid dynamics technology was used for a three-dimensional numerical simulation of the indoor air velocity field and the HCl concentration field, and the simulation results were then compared with on-site monitoring data to verify their correctness and feasibility. The occupational hazards and control measures were analyzed based on the numerical simulation, and the optimal control measure was obtained. Using ambient air analysis to assess occupational exposure offers a new approach for occupational health research in the integrated circuit industry and has both theoretical and practical significance. © The Author(s) 2012.
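The study's three-dimensional CFD model is far beyond a snippet, but the transport equation underlying a concentration field can be illustrated with a toy one-dimensional explicit scheme (upwind advection plus central diffusion on a periodic domain). All parameter values here are illustrative, not taken from the article.

```python
import numpy as np

def advect_diffuse(c, u, D, dx, dt, steps):
    """March a 1D concentration profile forward in time with upwind
    advection (valid for u > 0) and central-difference diffusion on a
    periodic domain. Stability needs u*dt/dx <= 1 and D*dt/dx**2 <= 0.5."""
    c = c.astype(float).copy()
    for _ in range(steps):
        adv = -u * (c - np.roll(c, 1)) / dx
        diff = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx ** 2
        c = c + dt * (adv + diff)
    return c

c0 = np.zeros(100)
c0[10] = 1.0                       # unit puff of contaminant at cell 10
c = advect_diffuse(c0, u=0.5, D=0.01, dx=0.1, dt=0.05, steps=40)
```

The puff drifts downwind while spreading out, the qualitative behaviour a full 3D simulation resolves for the whole workshop.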

  2. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis.

    Science.gov (United States)

    Karlsson, Caroline S J; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W

    2017-08-18

    Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially by natural hazards, which are difficult to control. Thus, determining natural-hazard-susceptible areas and incorporating them in the initial planning process may reduce infrastructural damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning, despite its limitations in the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
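The analytic hierarchy process step can be sketched as follows; the three factors, the pairwise judgments, and the cell scores are hypothetical, not the study's actual criteria or data.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix using the row
    geometric mean, a standard approximation to the principal eigenvector."""
    n = pairwise.shape[0]
    gm = np.prod(pairwise, axis=1) ** (1.0 / n)
    return gm / gm.sum()

def susceptibility(factor_scores, weights):
    """Weighted linear combination of per-cell factor scores."""
    return factor_scores @ weights

# Hypothetical expert judgments over three factors, e.g. inundation
# proneness, soil type, land use (1 = equal, 3 = moderate, 5 = strong)
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1 / 3.0, 1.0, 3.0],
                     [1 / 5.0, 1 / 3.0, 1.0]])
w_expert = ahp_weights(pairwise)
w_equal = np.full(3, 1 / 3.0)

cells = np.array([[0.9, 0.2, 0.4],    # two hypothetical map cells
                  [0.1, 0.8, 0.6]])
s_expert = susceptibility(cells, w_expert)
s_equal = susceptibility(cells, w_equal)
```

Comparing `s_expert` with `s_equal` mirrors the study's comparison of expert-derived weights against equal weighting.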

  4. An (even) broader perspective: Combining environmental processes and natural hazards education in a MSc programme

    Science.gov (United States)

    Heckmann, Tobias; Haas, Florian; Trappe, Martin; Cyffka, Bernd; Becht, Michael

    2010-05-01

    Natural hazards are processes occurring in the natural environment that negatively affect human society. In most instances, the definition of natural hazards implies sudden events as different as earthquakes, floods or landslides. In addition, there are other phenomena that occur more subtly or slowly, and nevertheless may have serious adverse effects on the human environment. Hence, a comprehensive study programme in natural hazards has to include not only the conspicuous causes and effects of natural catastrophes but also environmental processes in general. Geography as a discipline is located at the interface of natural, social and economic sciences; the physical geography programme described here is designed to include the social and economic dimensions as well as management issues. Modules strengthening the theoretical background of geomorphic, geological, hydrological and meteorological processes and hazards are complemented by practical work in the field and the laboratory, dealing with measuring and monitoring environmental processes. On this basis, modelling and management skills are developed. Another thread in the transdisciplinary programme deals with sustainability and environmental policy issues, and environmental psychology (e.g. perception of and reaction to hazards). This will improve the communication and team-working skills of students wherever they are part of an interdisciplinary working group. Through involvement in research programmes, students are confronted 'hands on' with the different aspects of environmental processes and their consequences; thus, they will be excellently, but not exclusively, qualified for positions in the natural hazards sector.

  5. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Science.gov (United States)

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne disease for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbial hazards posed by the pathogens previously found in Nham, sodium nitrite and metal were identified as the chemical and physical hazards in this product, respectively. Four steps in the Nham process were identified as critical control points: the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step; the critical limit for nitrite levels in the Nham mixture has been set at 100-200 ppm, a level high enough to control Clostridium botulinum without causing a chemical hazard to the consumer. The physical hazard from metal clips can be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham can be reduced in the fermentation process; the critical limit for the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing label information such as "safe if cooked before consumption" could be an alternative way to prevent its microbiological hazards.
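The critical limits identified in the model lend themselves to a simple batch-record check. The function below is our sketch of that logic, not part of the published generic model.

```python
def check_nham_ccps(nitrite_ppm, ph_after_fermentation, metal_found):
    """Return the list of critical-limit deviations for one Nham batch,
    following the limits in the generic HACCP model: nitrite weighed to
    100-200 ppm, pH below 4.6 after fermentation, no metal at stuffing."""
    deviations = []
    if not (100 <= nitrite_ppm <= 200):
        deviations.append("nitrite outside 100-200 ppm at weighing")
    if ph_after_fermentation >= 4.6:
        deviations.append("pH not below 4.6 after fermentation")
    if metal_found:
        deviations.append("metal fragment detected at stuffing")
    return deviations

ok_batch = check_nham_ccps(150, 4.4, False)    # within all critical limits
bad_batch = check_nham_ccps(250, 4.8, True)    # deviates at all three CCPs
```

A batch whose deviation list is non-empty would trigger the corrective actions of the HACCP plan.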

  6. Fire hazard analysis for Plutonium Finishing Plant complex

    Energy Technology Data Exchange (ETDEWEB)

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, ''Fire Protection'' [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. The objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  7. Global process industry initiatives to reduce major accident hazards

    Energy Technology Data Exchange (ETDEWEB)

    Pitblado, Robin [DNV Energy Houston, TX (United States). SHE Risk Management; Pontes, Jose [DNV Energy Rio de Janeiro, RJ (Brazil). Americas Region; Oliveira, Luiz [DNV Energy Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Since 2000, disasters at Texas City, Toulouse, Antwerp, Buncefield and P-36, and several near-total-loss events offshore in Norway, have highlighted that major accident process safety is still a serious issue. Hopes that Process Safety Management or Safety Case regulations would solve these issues have not proven true. The Baker Panel recommended several actions to BP, mainly around leadership, incentives, metrics, safety culture and more effective implementation of PSM systems. In Europe, an approach built around mechanical integrity and safety barriers, especially relating to technical safety systems, is being widely adopted. DNV has carried out a global survey of process industry initiatives, by interview and by literature review, for both upstream and downstream activities, to identify what the industry itself is planning to implement to enhance process safety in the next 5 - 10 years. This shows that an approach combining the Baker Panel and EU barrier approaches with some nuclear industry real-time risk management approaches might be the best means to achieve a factor of 3-4 improvement in process safety. (author)

  8. REPLACEMENT OF HAZARDOUS MATERIAL IN WIDE WEB FLEXOGRAPHIC PRINTING PROCESS

    Science.gov (United States)

    This study examined, on a technical and economic basis, the effect of substituting water-based inks in a flexographic printing process. To reduce volatile organic compound (VOC) emissions by switching from the use of solvent-based inks to water-based inks, several equipment modifi...

  9. Job Hazards Analysis Among A Group Of Surgeons At Zagazig ...

    African Journals Online (AJOL)

    Methods: A cross-sectional study was done on a random sample of surgeons working at Zagazig University teaching hospitals, who were evaluated for their job hazards using a quantitative hazard assessment questionnaire, with job-step total hazard scores calculated by a standardized risk assessment score, followed by an expert panel ...

  10. Application of hazard analysis critical control points (HACCP) to organic chemical contaminants in food.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-03-01

    Hazard Analysis Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards that was developed as an effective alternative to conventional end-point analysis to control food safety. It has been described as the most effective means of controlling foodborne diseases, and its application to the control of microbiological hazards has been accepted internationally. By contrast, relatively little has been reported relating to the potential use of HACCP, or HACCP-like procedures, to control chemical contaminants of food. This article presents an overview of the implementation of HACCP and discusses its application to the control of organic chemical contaminants in the food chain. Although this is likely to result in many of the advantages previously identified for microbiological HACCP, that is, more effective, efficient, and economical hazard management, a number of areas are identified that require further research and development. These include: (1) a need to refine the methods of chemical contaminant identification and risk assessment employed, (2) develop more cost-effective monitoring and control methods for routine chemical contaminant surveillance of food, and (3) improve the effectiveness of process optimization for the control of chemical contaminants in food.

  11. Benzene as a Chemical Hazard in Processed Foods

    Directory of Open Access Journals (Sweden)

    Vânia Paula Salviano dos Santos

    2015-01-01

    This paper presents a literature review on benzene in foods, including toxicological aspects, occurrence, formation mechanisms, and mitigation measures, and analyzes data reporting benzene levels in foods. Benzene is recognized by the IARC (International Agency for Research on Cancer) as carcinogenic to humans, and its presence in foods has been attributed to various potential sources: packaging, storage environment, contaminated drinking water, cooking processes, irradiation processes, and degradation of food preservatives such as benzoates. Since there are no specific limits for benzene levels in beverages and food in general, studies have adopted drinking-water references in the range of 1–10 ppb. The presence of benzene has been reported in various foods and beverages, with soft drinks the most often reported in the literature. Although the analyses reported low levels of benzene in most of the samples studied, some exceeded permissible limits. The available data on dietary exposure to benzene are minimal from the viewpoint of public health. Benzene levels were often low enough to be considered negligible and not a consumer health risk, but more studies are still needed for a better understanding of its effects on human health through the ingestion of contaminated food.

  12. Research on the spatial analysis method of seismic hazard for island

    Science.gov (United States)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering. At the micro scale, its results provide parameters for seismic design; at the macro scale, it is requisite work for the earthquake and comprehensive disaster prevention planning within island conservation planning, during the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's Model Builder platform.
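The fuzzy comprehensive evaluation at the core of SAMSHI can be sketched as the weighted synthesis B = W · R; the three indices, grades, weights, and memberships below are illustrative stand-ins for the model's 11 indices, not values from the paper.

```python
import numpy as np

def fuzzy_comprehensive_evaluation(membership, weights):
    """B = W . R: membership[i][j] is the degree to which index i supports
    hazard grade j; returns normalised grade memberships."""
    b = np.asarray(weights) @ np.asarray(membership)
    return b / b.sum()

# Rows: three illustrative indices (e.g. fault density, historical
# seismicity, gravity anomaly); columns: hazard grades (low, moderate, high)
R = np.array([[0.1, 0.3, 0.6],
              [0.2, 0.5, 0.3],
              [0.7, 0.2, 0.1]])
W = np.array([0.5, 0.3, 0.2])
B = fuzzy_comprehensive_evaluation(R, W)
grade = ["low", "moderate", "high"][int(np.argmax(B))]
```

The grade with the largest synthesized membership is taken as the cell's hazard grade, following the maximum-membership principle.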

  13. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis.

    Science.gov (United States)

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors, and it is therefore necessary to assess their hazards. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, however, the calculation of the connection degree (CD) requires the classification of hazard grades and their corresponding thresholds. For hazard assessment using ESPA, a novel algorithm for calculating the CD is worked out for the case in which hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked based on the value of the ED. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA in hazard assessment for biomass gasification stations. The reasonability of ESPA is also supported by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA.
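The abstract does not give the CD formula itself, but the final step (converting per-criterion scores to a Euclidean distance and ranking stations by it) can be sketched. The station names and scores below are invented, and reading a larger distance from a hazard-free ideal point as a higher hazard is our assumption.

```python
import math

def euclidean_distance(scores, ideal):
    """Straight-line distance between a station's criterion scores and an
    ideal (hazard-free) reference point."""
    return math.sqrt(sum((s - i) ** 2 for s, i in zip(scores, ideal)))

def rank_stations(stations, ideal):
    """Order stations from most to least hazardous, assuming that a larger
    distance from the ideal point means a higher hazard."""
    dist = {name: euclidean_distance(s, ideal) for name, s in stations.items()}
    return sorted(dist, key=dist.get, reverse=True), dist

# Hypothetical per-station criterion scores (0 = no hazard contribution)
stations = {"station_A": (0.9, 0.8), "station_B": (0.2, 0.1),
            "station_C": (0.5, 0.5)}
ranking, dist = rank_stations(stations, ideal=(0.0, 0.0))
```

The resulting order lists the station farthest from the ideal point first, giving the kind of hazard ranking the ED step produces.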

  14. Causes of some hazardous engineering-geological processes on urban territories

    Science.gov (United States)

    Kril, Tetiana

    2017-11-01

    Population growth in cities and the need to expand living space require rational use of territories within the existing boundaries of the city. The necessity of compliance with the city's functional zones, taking into account the engineering-geological features of the territory, is shown on the example of a representative part of Kiev. Compliance with the underlying zones of the underground space is necessary to ensure the bearing capacity of the soil mass. Changes in soil bases are identified as a result of changes in the stress-strain state under construction, development of underground space, and changes in soil water content resulting from soaking from the surface, formation of "perched water", and a rising groundwater level. A vibration analysis of a high-rise building (the main library building) is performed for the dynamic loads that arise from vehicle traffic, taking into account the behaviour of the pile foundation as a rigid body about the longitudinal axis passing through the center of the building at the level of the pile cap.

  15. Analysis of hazardous material releases due to natural hazards in the United States.

    Science.gov (United States)

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Cruz, Ana Maria

    2012-10-01

    Natural hazards were the cause of approximately 16,600 hazardous material (hazmat) releases reported to the National Response Center (NRC) between 1990 and 2008: three per cent of all reported hazmat releases. Rain-induced releases were most numerous (26 per cent of the total), followed by those associated with hurricanes (20 per cent), many of which resulted from major episodes in 2005 and 2008. Winds, storms or other weather-related phenomena were responsible for another 25 per cent of hazmat releases. Large releases were most frequently due to major natural disasters. For instance, hurricane-induced releases of petroleum from storage tanks account for a large fraction of the total volume of petroleum released during 'natechs' (understood here as a natural hazard and the hazardous materials release that results). Among the most commonly released chemicals were nitrogen oxides, benzene, and polychlorinated biphenyls. Three deaths, 52 injuries, and the evacuation of at least 5,000 persons were recorded as a consequence of natech events. Overall, results suggest that the number of natechs increased over the study period (1990-2008), with potential for serious human and environmental impacts. © 2012 The Author(s). Journal compilation © Overseas Development Institute, 2012.

  16. Radiation doses and hazards from processing of crude oil at the Tema oil refinery in Ghana.

    Science.gov (United States)

    Darko, E O; Kpeglo, D O; Akaho, E H K; Schandorf, C; Adu, P A S; Faanu, A; Abankwah, E; Lawluvi, H; Awudu, A R

    2012-02-01

    Processing of crude oil has been carried out in Ghana for more than four decades without measures to assess the hazards associated with the naturally occurring radionuclides in the raw and processed materials. This study investigates the exposure of the public to (226)Ra, (232)Th and (40)K in crude oil, petroleum products and wastes at the Tema oil refinery in Ghana using gamma-ray spectrometry. The study shows higher activity concentrations of the natural radionuclides in the wastes than in the crude oil and the products, with estimated hazard indices less than unity. The values obtained in the study are within recommended limits for public exposure, indicating that radiation exposure from processing of the crude oil at the refinery does not pose any significant radiological hazard but may require monitoring to establish long-term effects on both the public and workers.

  17. Public acceptability of the use of gamma rays from spent nuclear fuel as a hazardous waste treatment process

    Energy Technology Data Exchange (ETDEWEB)

    Mincher, B.J.; Wells, R.P.; Reilly, H.J.

    1992-01-01

    Three methods were used to estimate public reaction to the use of gamma irradiation of hazardous wastes as a hazardous waste treatment process. The gamma source of interest is spent nuclear fuel. The first method is Benefit-Risk Decision Making, where the benefits of the proposed technology are compared to its risks. The second analysis compares the proposed technology to the other, currently used nuclear technologies and estimates public reaction based on that comparison. The third analysis is called Analysis of Public Consent, and is based on the professional methods of the Institute for Participatory Management and Planning. The conclusion of all three methods is that the proposed technology should not result in negative public reaction sufficient to prevent implementation.

  18. Hygienic-sanitary working practices and implementation of a Hazard Analysis and Critical Control Point (HACCP) plan in lobster processing industries / Condições higiênico-sanitárias e implementação do plano de Análise de Perigos e Pontos Críticos de Controle (APPCC) em indústrias processadoras de lagosta

    Directory of Open Access Journals (Sweden)

    Cristina Farias da Fonseca

    2013-03-01

    This study aimed to verify hygienic-sanitary working practices and to create and implement a Hazard Analysis and Critical Control Point (HACCP) plan in two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for exportation. Application of the hygienic-sanitary checklist in the industries analyzed achieved conformity rates over 96% for the aspects evaluated. Use of the HACCP plan resulted in the detection of two critical control points (CCPs), the receiving and classification steps, in the processing of frozen lobster and frozen lobster tails, and an additional CCP was detected during the cooking step in the processing of the whole frozen cooked lobster. Proper implementation of the HACCP plan in the lobster processing industries studied proved to be the safest and most cost-effective way to monitor the hazards at each CCP.

  19. CHARACTERIZATION AND USES OF THE "QUALITATIVE TECHNIQUES" FOR HAZARD IDENTIFICATION AND ASSESSMENT OF CHEMICAL PROCESS INDUSTRIES

    Directory of Open Access Journals (Sweden)

    Eusebio V. Ibarra-Hernández

    2015-01-01

    This paper identifies, analyzes, and classifies the main qualitative techniques for hazard identification and assessment in chemical industrial processes. It specifies that these techniques base their effectiveness both on analytical estimation processes and on the ability of safety managers and engineers. It also enumerates the most frequently used techniques, the hazards they identify, and the results they provide. Their use is linked, as a function of the complexity of the analysis technique, to the different stages of the life of industrial projects and processes.

  20. What can we learn from sediment connectivity indices regarding natural hazard processes in torrent catchments?

    Science.gov (United States)

    Schmutz, Daria; Zimmermann, Markus; Keiler, Margreth

    2017-04-01

    Sediment connectivity is defined as the degree of coupling between sediment sources and sinks in a system and describes the effectiveness of the transfer of sediment from hillslopes into channels and within channels (Bracken et al. 2015). Borselli et al. (2008) developed a connectivity index (IC) based on digital terrain models (DTMs). Cavalli et al. (2013) adapted this index for mountainous catchments. These measures of connectivity provide overall information about connectivity patterns in the catchment; thus an understanding of sediment connectivity can help to improve hazard analysis in these areas. Considering the location of settlements in alpine regions, high sediment transfer can pose a threat to villages located near torrents or on debris cones. However, there is still a lack of studies on the linkage between the IC and hazardous events with high sediment yield in alpine catchments. In this study, the expressiveness and applicability of the IC is tested in relation to hazardous events in several catchments of the Bernese and Pennine Alps (Switzerland). The IC is modelled based on DTMs (resolution 2 m or, if available, 0.5 m) representing the surface before and after a documented hazardous event and analysed with respect to changes in connectivity caused by the event. The spatial pattern of connectivity is compared with the sediment dynamics observed during the event using event documentation. In order to validate the IC, a semi-quantitative field connectivity index (FIC) is developed addressing characteristics of the channel, banks and slopes, and applied in a selection of the case studies. A first analysis shows that the IC is highly sensitive to the resolution and quality of the DTM. Connectivity calculated by the IC is highest along the channel. The general pattern of connectivity is comparable applying the IC to the DTM before and after the event. The range of the connectivity values gained from IC modelling is highly specific for each
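A rough sketch of the terrain-based connectivity index the abstract refers to, following the general form IC = log10(D_up / D_dn) used by Borselli et al. and Cavalli et al.; all numeric inputs are illustrative, and a real computation would run per cell over a DTM:

```python
import math

# Sketch of the connectivity index for a single cell:
#   D_up = Wbar * Sbar * sqrt(A)   (upslope component: mean weighting factor,
#                                   mean slope, contributing area)
#   D_dn = sum(d_i / (W_i * S_i))  (downslope flow path to the target channel)
#   IC   = log10(D_up / D_dn)
# The values below are invented for illustration.

def connectivity_index(w_mean, s_mean, area_m2, downslope_path):
    """downslope_path: iterable of (length_m, weight, slope) per cell."""
    d_up = w_mean * s_mean * math.sqrt(area_m2)
    d_dn = sum(d / (w * s) for d, w, s in downslope_path)
    return math.log10(d_up / d_dn)

ic = connectivity_index(
    w_mean=0.6, s_mean=0.25, area_m2=4.0e4,
    downslope_path=[(2.0, 0.6, 0.25), (2.0, 0.5, 0.30), (2.0, 0.7, 0.20)],
)
```

Higher (less negative) IC values indicate cells better coupled to the channel, which is why the abstract reports the highest connectivity along the channel itself.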

  1. Spatio-Temporal Risk Assessment Process Modeling for Urban Hazard Events in Sensor Web Environment

    Directory of Open Access Journals (Sweden)

    Wei Wang

    2016-11-01

    Immediate risk assessment and analysis are crucial in managing urban hazard events (UHEs). However, it is a challenge to develop an immediate risk assessment process (RAP) that can integrate distributed sensors and data to determine the uncertain model parameters of facilities, environments, and populations. To solve this problem, this paper proposes a RAP modeling method within a unified spatio-temporal framework and forms a 10-tuple process information description structure based on a Meta-Object Facility (MOF). A RAP is designed as an abstract RAP chain that collects urban information resources and performs immediate risk assessments. In addition, we propose a prototype system known as Risk Assessment Process Management (RAPM) to achieve the functions of RAP modeling, management, execution and visualization. An urban gas leakage event is simulated as an example in which individual risk and social risk are used to illustrate the applicability of the RAP modeling method based on the 10-tuple metadata framework. The experimental results show that the proposed RAP immediately assesses risk by the aggregation of urban sensors, data, and model resources. Moreover, an extension mechanism is introduced in the spatio-temporal RAP modeling method to assess risk and to provide decision-making support for different UHEs.

  2. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette Jackson [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide the confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.
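The core computation behind a PSHA can be illustrated with a toy single-source hazard curve: the annual rate of exceeding a ground motion level is the source's activity rate times the probability that a lognormally distributed ground motion (a stand-in for a real ground motion prediction equation with aleatory sigma) exceeds that level. The rate, median, and sigma below are invented placeholders, not values from the INL study:

```python
import math

def normal_sf(z):
    # Survival function of the standard normal distribution.
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(gm, rate, median_gm, sigma_ln):
    # rate: annual rate of the causative earthquakes (hypothetical)
    # median_gm, sigma_ln: lognormal ground-motion model (hypothetical)
    z = (math.log(gm) - math.log(median_gm)) / sigma_ln
    return rate * normal_sf(z)

levels = [0.05, 0.1, 0.2, 0.4, 0.8]  # PGA in g
curve = {g: annual_exceedance_rate(g, rate=0.01, median_gm=0.1, sigma_ln=0.6)
         for g in levels}
# The hazard curve decreases monotonically with ground motion level.
```

A full SSHAC-style study sums such contributions over many sources, magnitudes, and distances, and carries epistemic uncertainty via logic trees; this sketch shows only the innermost step.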

  3. Concept of a spatial data infrastructure for web-mapping, processing and service provision for geo-hazards

    Science.gov (United States)

    Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara

    2017-04-01

    Geo-hazards and their effects are distributed geographically over wide regions. The effective mapping and monitoring is essential for hazard assessment and mitigation. It is often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damages, etc.). At the moment, several local/national databases and platforms provide and publish data of different types of geo-hazards as well as web-based risk maps and decision support systems. Also, the European commission implemented the Copernicus Emergency Management Service (EMS) in 2015 that publishes information about natural and man-made disasters and risks. Currently, no platform for landslides or geo-hazards as such exists that enables the integration of the user in the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web-processing and service provision of landslide information with the focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Program ASAP. The spatial data infrastructure and its services for the mapping, processing and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure includes four main areas of technical components. The data tier consists of a file storage system and the spatial data catalogue for the management of EO-data, other geospatial data on geo-hazards, as well as descriptions and protocols for the data processing and analysis. An interface to extend the data integration from external sources (e.g. 
Sentinel-2 data) is planned

  4. Spatial prediction of landslide hazard using discriminant analysis and GIS

    Science.gov (United States)

    Peter V. Gorsevski; Paul Gessler; Randy B. Foltz

    2000-01-01

    Environmental attributes relevant for spatial prediction of landslides triggered by rain and snowmelt events were derived from digital elevation model (DEM). Those data in conjunction with statistics and geographic information system (GIS) provided a detailed basis for spatial prediction of landslide hazard. The spatial prediction of landslide hazard in this paper is...

  5. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Science.gov (United States)

    2010-01-01

    ... either the methodology provided in the Risk Management Plan (RMP) Offsite Consequence Analysis Guidance..., App. I Appendix I to Part 417—Methodologies for Toxic Release Hazard Analysis and Operational Procedures I417.1General This appendix provides methodologies for performing toxic release hazard analysis...

  6. Mapping mass movement processes using terrestrial LIDAR: a swift mechanism for hazard and disaster risk assessment

    Science.gov (United States)

    Garnica-Peña, Ricardo; Murillo-García, Franny; Alcántara-Ayala, Irasema

    2014-05-01

    The impact of disasters associated with mass movement processes has increased in the past decades. Whether triggered by earthquakes, volcanic activity or rainfall, mass movement processes have affected people, infrastructure, economic activities and the environment in different parts of the world. Extensive damage is particularly linked to rainfall-induced landslides due to the occurrence of tropical storms, hurricanes, and the combination of different meteorological phenomena over exposed vulnerable communities. Therefore, landslide susceptibility analysis and hazard and risk assessments are considered significant mechanisms to lessen the impact of disasters. Ideally, these procedures ought to be carried out before disasters take place. However, under intense or persistent periods of rainfall, the evaluation of potentially unstable slopes becomes a critical issue. Such evaluations are constrained by the availability of resources, capabilities and scientific and technological tools. Among them, remote sensing has proved to be a valuable tool to evaluate areas affected by mass movement processes during the post-disaster stage. Nonetheless, the high cost of imagery acquisition inhibits its wide use. High resolution topography field surveys consequently turn out to be an essential approach to address landslide evaluation needs. In this work, we present the evaluation and mapping of a series of mass movement processes induced by hurricane Ingrid in September 2013 in Teziutlán, Puebla, México, a municipality situated 265 km northeast of Mexico City. Geologically, Teziutlán is characterised by the presence, in the North, of siltstones and conglomerates of the Middle Jurassic, whereas the central and Southern sectors consist of volcanic deposits of various types: andesitic tuffs of Tertiary age, and basalts, rhyolitic tuffs and ignimbrites from the Quaternary. Major relief structures are formed by the accumulation of volcanic material; lava domes, partially buried

  7. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis

    Science.gov (United States)

    Schaefer, K.; Prakash, A.; Witte, W.

    2011-12-01

    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area dominated by a semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wildland fires, earthquakes, extreme winter storms and other natural and man-made hazards. With a large area of 19,065 km2 and a population of approximately 97,000 residents, providing emergency services in a timely manner is a challenge. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network analysis based decision support tool that we have created for planning and emergency response. This tool will be used by Emergency Services (Fire/EMS), Emergency Management, the Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare for and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis determined response times and classified the borough by response time to facilitate allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) a major wildfire threatening Fairbanks, (ii) a major earthquake, and (iii) loss of power during flooding in a flood-prone area.
We also combined the network analysis results with high resolution imagery and elevation data to determine
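The network-analysis step described above can be sketched as a plain shortest-path computation over a road graph followed by response-time banding; the tiny graph, travel times, and band thresholds below are hypothetical, not FNSB data:

```python
import heapq

# Dijkstra over a road graph (edge weights = travel minutes) from a station,
# then classify reachable nodes into response-time bands.

def dijkstra(graph, source):
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

roads = {
    "station": [("A", 4.0), ("B", 9.0)],
    "A": [("B", 3.0), ("C", 10.0)],
    "B": [("C", 2.0)],
}
times = dijkstra(roads, "station")
bands = {n: ("<5 min" if t < 5 else "5-10 min" if t < 10 else ">10 min")
         for n, t in times.items()}
```

In a production GIS the same classification would be done with a network dataset and service-area analysis rather than a hand-built graph.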

  8. Analysis of hazardous biological material by MALDI mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community in this potential field of application is evident by the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and selection of optimal experimental parameters such as solvent and matrix.

  9. The hazard analysis and critical control point system in food safety.

    Science.gov (United States)

    Herrera, Anavella Gaitan

    2004-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series.

  10. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  11. 77 FR 66638 - The Standard on Process Safety Management of Highly Hazardous Chemicals; Extension of the Office...

    Science.gov (United States)

    2012-11-06

    ... Occupational Safety and Health Administration The Standard on Process Safety Management of Highly Hazardous... the Standard on Process Safety Management of Highly Hazardous Chemicals. DATES: Comments must be... elements of the standard; completing a compilation of written process safety information; performing a...

  12. Open Source Seismic Hazard Analysis Software Framework (OpenSHA)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — OpenSHA is an effort to develop object-oriented, web- & GUI-enabled, open-source, and freely available code for conducting Seismic Hazard Analyses (SHA). Our...

  13. Integrating multi-criteria decision analysis for GIS-based hazardous waste landfill siting in Kurdistan Province, western Iran.

    Science.gov (United States)

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-01

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information, for which GIS is an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. The purpose of the GIS stage was to perform an initial screening process to eliminate unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. In order to evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed, representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and based on the numerical evaluation provided by the MCDA the most suitable sites were determined.
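The two-stage screening-plus-MCDA workflow can be sketched as a binary exclusionary mask followed by a weighted linear combination of criterion scores; the criterion names, weights, and scores below are illustrative, not the study's 21 exclusionary and 14 non-exclusionary criteria:

```python
# Sketch of a two-stage landfill siting evaluation: sites failing any
# exclusionary criterion get suitability 0; the rest get a weighted sum
# of normalized criterion scores. All names and numbers are hypothetical.

def suitability_index(site, weights):
    if not site["passes_exclusionary"]:   # e.g. inside a protected buffer
        return 0.0
    return sum(weights[c] * site["scores"][c] for c in weights)

weights = {"dist_to_groundwater": 0.4, "slope": 0.25,
           "dist_to_settlements": 0.35}                 # weights sum to 1
sites = [
    {"name": "site1", "passes_exclusionary": True,
     "scores": {"dist_to_groundwater": 0.8, "slope": 0.6,
                "dist_to_settlements": 0.9}},
    {"name": "site2", "passes_exclusionary": False,
     "scores": {"dist_to_groundwater": 0.9, "slope": 0.9,
                "dist_to_settlements": 0.9}},
]
ranked = sorted(sites, key=lambda s: suitability_index(s, weights),
                reverse=True)
```

In the GIS workflow the mask and the weighted sum are raster overlay operations; the per-site version above just makes the arithmetic explicit.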

  14. ANALYSIS OF INTERNAL SOURCES OF HAZARDS IN CIVIL AIR OPERATIONS

    OpenAIRE

    Katarzyna CHRUZIK; Karolina WIŚNIEWSKA; Radosław FELLNER

    2017-01-01

    International air law imposes an obligation on the part of transport operators to operationalize risk management, and hence develop records of hazards and estimate the level of risk in the respective organization. Air transport is a complex system combining advanced technical systems, operators and procedures. Sources of hazards occur in all of these closely related and mutually interacting areas, which operate in highly dispersed spaces with a short time horizon. A highly important element o...

  15. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    Science.gov (United States)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
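The probabilistic statement at the heart of a PTHA, the probability of exceeding an intensity threshold within an exposure time, aggregated over independent sources, can be sketched under a simple Poisson assumption; the per-source rates below are invented:

```python
import math

# Given independent tsunami sources with annual rates of producing, at the
# target site, a run-up above the chosen threshold, the Poisson probability
# of at least one exceedance in an exposure time T is
#   P = 1 - exp(-T * sum(rates)).
# Rates are hypothetical placeholders.

def exceedance_probability(annual_rates, exposure_years):
    return 1.0 - math.exp(-exposure_years * sum(annual_rates))

# e.g. earthquake, landslide, and volcanic sources exceeding 2 m run-up
rates = [1e-3, 2e-4, 5e-5]
p50 = exceedance_probability(rates, 50.0)
```

Evaluating this at many intensity thresholds yields the hazard curve for the target site; real PTHAs additionally propagate epistemic uncertainty in the rates themselves.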

  16. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    Directory of Open Access Journals (Sweden)

    Yu-Ting Hung

    2015-09-01

    To ensure the safety of peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management.
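
    Monitoring a critical control point such as pasteurization reduces, in code terms, to checking each batch record against critical limits and flagging nonconforming batches for corrective action. A minimal sketch (the temperature and hold-time limits below are illustrative placeholders, not the plant's validated values):

```python
def check_pasteurization_ccp(temperature_c, hold_time_s,
                             min_temperature_c=79.4, min_hold_time_s=25):
    """Return True when the critical limits are met; a False result
    triggers corrective action under the HACCP plan."""
    return temperature_c >= min_temperature_c and hold_time_s >= min_hold_time_s

# Batch monitoring records: (measured temperature in C, hold time in seconds)
records = [(80.1, 30), (78.9, 30), (80.5, 20)]

# Batches requiring corrective action and a documented record entry
nonconforming = [r for r in records if not check_pasteurization_ccp(*r)]
```

    The same pattern (limit check, flag, documented corrective action) applies to the freezing CCP with different measured variables.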

  17. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    Science.gov (United States)

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  18. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow, and earthquakes, show evidence of nonstationary behavior, such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory relating the natural hazard event series (x) to its failure time series (t), enabling computation of the corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA for characterizing nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model, demonstrate how metrics such as reliability and average return period are affected by nonstationarity, and discuss the implications for planning and design. Our theoretical analysis linking the hazard event series (x) with the corresponding failure time series (t) should have application to a wide class of natural hazards.
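
    The 2-parameter Generalized Pareto (GP) model mentioned above has a convenient closed form: with survival function S(x) = (1 + xi*x/sigma)^(-1/xi), the hazard function h(x) = f(x)/S(x) reduces to 1/(sigma + xi*x). A sketch of these standard GP formulas (the parameter names `sigma` and `xi` are the conventional scale and shape symbols, not notation taken from the paper):

```python
import math

def gp_survival(x, sigma, xi):
    """Exceedance probability S(x) = P(X > x) for a 2-parameter
    Generalized Pareto distribution (exponential when xi == 0)."""
    if xi == 0:
        return math.exp(-x / sigma)
    return (1.0 + xi * x / sigma) ** (-1.0 / xi)

def gp_hazard(x, sigma, xi):
    """Hazard function h(x) = f(x) / S(x); for the GP distribution this
    simplifies to 1 / (sigma + xi * x) on the support sigma + xi*x > 0."""
    denom = sigma + xi * x
    if denom <= 0:
        raise ValueError("x is outside the distribution's support")
    return 1.0 / denom

# Under nonstationarity, sigma (and/or xi) can be made a function of time,
# so the same formulas yield a time-varying exceedance probability.
```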

  19. Geophysics in Mejillones Basin, Chile: Dynamic analysis and associated seismic hazard

    Science.gov (United States)

    Maringue, J. I.; Yanez, G. A.; Lira, E.; Podestá, L., Sr.; Figueroa, R.; Estay, N. P.; Saez, E.

    2016-12-01

    The active margin of South America has a high seismogenic potential. In particular, the Mejillones Peninsula, located in northern Chile, is a site of interest for seismic hazard due to a 100-year seismic gap, potentially large site effects, and the presence of the most important port in the region. We performed a dynamic analysis of the zone from a spatial and petrophysical model of the Mejillones Basin to understand its behavior under realistic seismic scenarios. The geometry and petrophysics of the basin were obtained from integrated modeling of geophysical observations (gravity, seismic, and electromagnetic data) distributed mainly in Pampa Mejillones, whose western edge is bounded by the north-south-oriented Mejillones Fault. This regional-scale normal fault shows a half-graben geometry that controls the development of the Mejillones Basin to the east. The gravimetric and magnetotelluric methods define the geometry of the basin through a cover/basement density contrast and the transition from very low to moderate electrical resistivities, respectively; the seismic method complements the petrophysics with a shear-wave velocity depth profile. The results show soil thicknesses of up to 700 meters in the deepest zone, with steeper slopes to the west and gentler slopes to the east, in agreement with the normal-fault half-graben basin geometry. Along the N-S direction there are no great differences in basin depth, so the problem is almost two-dimensional. In terms of petrophysics, the sedimentary stratum is characterized by shear velocities between 300-700 m/s, extremely low electrical resistivities (below 1 ohm-m), and densities from 1.4 to 1.8 gr/cc. Numerical simulation of seismic wave amplification gives values on the order of 0.8 g, implying large surface damage. The results demonstrate a potential risk in Mejillones Bay from future events, so it is very important to generate mitigation policies for infrastructure and human settlements.

  20. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    CERN Document Server

    Singh, G

    2000-01-01

    The CVDF is a nonreactor nuclear facility that will process the Spent Nuclear Fuel (SNF) presently stored in the 105-KE and 105-KW SNF storage basins. Multi-canister overpacks (MCOs) will be loaded (filled) with K Basin fuel and transported to the CVDF. The MCOs will be processed at the CVDF to remove free water from the fuel cells (packages). Following processing at the CVDF, the MCOs will be transported to the CSB for interim storage until a long-term storage solution can be implemented. This operation is expected to start in November 2000. A Fire Hazard Analysis (FHA) is required for all new facilities and all nonreactor nuclear facilities, in accordance with U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection. This FHA has been prepared in accordance with DOE 5480.7A and HNF-PRO-350, Fire Hazard Analysis Requirements. Additionally, requirements or criteria contained in DOE, Richland Operations Office (RL) Implementing Directive (RLID) 5480.7, Fire Protection, or other DOE documentation are cite...

  1. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    Energy Technology Data Exchange (ETDEWEB)

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What/If' hazards analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operation sequence steps for the mechanical handling operations: preparing the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  2. Pasteurised milk and implementation of HACCP (Hazard Analysis Critical Control Point)

    Directory of Open Access Journals (Sweden)

    T.B Murdiati

    2004-10-01

    The purpose of pasteurisation is to destroy pathogenic bacteria without affecting the taste, flavor, and nutritional value of milk. A study on the implementation of HACCP (Hazard Analysis Critical Control Point) in producing pasteurised milk was carried out in four processing units of pasteurised milk, one in Jakarta, two in Bandung, and one in Bogor. The critical control points in the production line were identified. Milk samples were collected from the critical points and analysed for the total number of microbes; antibiotic residues were detected in the raw milk. The study indicated that one unit in Bandung and one unit in Jakarta produced pasteurised milk with a lower number of microbes than the other units, due to better management and control applied along the chain of production. Penicillin residues were detected in the raw milk used by the unit in Bogor. Six critical control points, and the hazards that might arise at those points, were identified, as well as how to prevent those hazards. A quality assurance system such as HACCP would be able to produce high-quality, safe pasteurised milk, and should be implemented gradually.

  3. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability, and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, stakeholder management, and life cycle assessment. From a practical point of view, this requires changes in the approach to maintenance represented by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and upkeep of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  4. Investigating Coastal Processes and Hazards Along the Coastline of Ghana, West Africa (Invited)

    Science.gov (United States)

    Hapke, C. J.; Ashton, A. D.; Wiafe, G.; Addo, K. A.; Ababio, S.; Agyekum, K. A.; Lippmann, T. C.; Roelvink, J.

    2010-12-01

    coast and responding to erosion issues. Funding for program development and equipment has been provided via the Coastal Geosciences Program of the U.S. Office of Naval Research through the Navy’s Africa Partnership Station. Data collection and analysis to date include the first regional shoreline change assessment of the Ghana coast, utilizing aerial photography spanning 31 years and RTK-GPS field surveys and reconnaissance mapping. Initial results from the shoreline change analysis indicate highly variable alongshore rates of change, although the trend is predominantly erosional. The highest erosion rates are found in the east, on the downdrift flank of the low-lying, sandy Volta Delta complex. The rapid erosion rates are likely due to the disruption of sediment supplied to the coast by the damming of the Volta River in the 1960s, as well as alongshore transport gradients generated by the progradation and morphologic evolution of the delta. Continuing investigations of coastal processes in Ghana will allow for a better understanding of erosion hazards and will aid in the development of appropriate, systematic, and sustainable responses to future increased hazards associated with rising sea-levels.

  5. Morphometric and landuse analysis: implications on flood hazards ...

    African Journals Online (AJOL)

    This study assessed the morphometric, landuse and lithological attributes of five basins (Iwaraja, Ilesa, Olupona, Osogbo I and Osogbo II) with particular reference to flood hazards in Ilesa and Osogbo metropolis, Osun State Nigeria. Ilesa town is situated within Iwaraja and Ilesa basins while Osogbo metropolis spread ...

  6. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at the DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site, and in the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  7. Chemical hazards database and detection system for Microgravity and Materials Processing Facility (MMPF)

    Science.gov (United States)

    Steele, Jimmy; Smith, Robert E.

    1991-01-01

    The ability to identify contaminants associated with experiments and facilities is directly related to the safety of the Space Station. A means of identifying these contaminants has been developed through this contracting effort. The delivered system provides a listing of the materials and/or chemicals associated with each facility, information as to the contaminant's physical state, a list of the quantity and/or volume of each suspected contaminant, a database of the toxicological hazards associated with each contaminant, a recommended means of rapid identification of the contaminants under operational conditions, a method of identifying possible failure modes and effects analysis associated with each facility, and a fault tree-type analysis that will provide a means of identifying potential hazardous conditions related to future planned missions.

  8. The Impact of Physical and Ergonomic Hazards on Poultry Abattoir Processing Workers: A Review.

    Science.gov (United States)

    Harmse, Johannes L; Engelbrecht, Jacobus C; Bekker, Johan L

    2016-02-06

    The poultry abattoir industry continues to grow and contributes significantly to the gross domestic product in many countries. The industry expects working shifts of eight to eleven hours, during which workers are exposed to occupational hazards, including physical hazards ranging from noise, vibration, and exposure to cold, to ergonomic stress from manual, repetitive tasks that require force. A PubMed, Medline, and Science Direct online database search using specific keywords was conducted, and the results confirmed that physical and ergonomic hazards affect abattoir processing workers, harming not only the workers' health but also imposing an economic burden through the loss of their livelihoods and the need for treatment and compensation in the industry. This review endeavours to highlight the role poultry processing plays in the development of occupational diseases related to physical agents and ergonomic stress in poultry abattoir processing workers. The impacts include noise-induced hearing loss, increased blood pressure, and menstrual and work-related upper limb disorders. These are summarised as a quick reference guide for poultry abattoir owners, abattoir workers, poultry associations, occupational hygienists, and medical practitioners to assist in the safer management of occupational health in poultry abattoirs.

  9. New continuous-mix process for gelling anhydrous methanol minimizes hazards

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, J.E. (BJ Services Co., Tomball, TX (US)); McBain, C. (Union Pacific Resources Inc. (US)); Gregory, G.; Gerbrandt, D. (BJ Services Canada Inc., Calgary (CA))

    1992-07-01

    This paper discusses a novel approach to well stimulation with an anhydrous methanol-based fracturing fluid that significantly reduces hazards to personnel and equipment during the fracturing process. Research is presented on the chemical and engineering process technologies used to develop and evaluate continuously mixed anhydrous methanol fracturing-fluid performance. Field case histories are also discussed. Since the development of hydraulic fracturing as a well completion technique, hazardous fluids have frequently been used. These fluids include refined oils, formation condensates, crude oils, acids, alcohols, and a variety of other chemicals and additives. Batch mixing is a common technique used to formulate fluids for fracturing applications. The fluid viscosity is adjusted to the desired level by circulating the base fluid through blending equipment while adding the polymeric gelling agents, buffers, and other chemicals; the fluid must circulate through the equipment many times to develop the desired viscosity. When hazardous materials are handled in such a multiple-circulation process, the potential for accidents is high.

  10. Hazard Analysis of Arid and Semi-Arid (ASAL) Regions of Kenya ...

    African Journals Online (AJOL)

    There is a need to undertake comprehensive hazard and vulnerability analyses at the regional and country level to inform interventions and other developmental activities. Women should be targeted at the community and leadership levels, and efforts to empower them should be stepped up. Keywords: hazard, natural disaster, ...

  11. Preliminary fire hazard analysis for the PUTDR and TRU trenches in the Solid Waste Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    Gaschott, L.J.

    1995-06-16

    This document represents the Preliminary Fire Hazards Analysis for the Pilot Unvented TRU Drum Retrieval effort and for the Transuranic drum trenches in the low level burial grounds. The FHA was developed in accordance with DOE Order 5480.7A to address major hazards inherent in the facility.

  12. Job safety analysis and hazard identification for work accident prevention in para rubber wood sawmills in southern Thailand.

    Science.gov (United States)

    Thepaksorn, Phayong; Thongjerm, Supawan; Incharoen, Salee; Siriwong, Wattasit; Harada, Kouji; Koizumi, Akio

    2017-11-25

    We utilized job safety analysis (JSA) and hazard identification for work accident prevention in Para rubber wood sawmills, which aimed to investigate occupational health risk exposures and assess the health hazards at sawmills in the Trang Province, located in southern Thailand. We conducted a cross-sectional study which included a walk-through survey, JSA, occupational risk assessment, and environmental samplings from March through September 2015 at four Para rubber wood sawmills. We identified potential occupational safety and health hazards associated with six main processes, including: 1) logging and cutting, 2) sawing the lumber into sheets, 3) planing and re-arranging, 4) vacuuming and wood preservation, 5) drying and planks re-arranging, and 6) grading, packing, and storing. Working in sawmills was associated with high risk of wood dust and noise exposure, occupational accidents injuring hands and feet, chemicals and fungicide exposure, and injury due to poor ergonomics or repetitive work. Several high-risk areas were identified from JSA and hazard identification of the working processes, especially high wood dust and noise exposure when sawing lumber into sheets and risk of occupational accidents of the hands and feet when struck by lumber. All workers were strongly recommended to use personal protective equipment in any working processes. Exposures should be controlled using local ventilation systems and reducing noise transmission. We recommend that the results from the risk assessment performed in this study be used to create an action plan for reducing occupational health hazards in Para rubber sawmills.
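
    A JSA worksheet of the kind described above typically ranks tasks by a semi-quantitative risk score (likelihood multiplied by severity) so that controls can be prioritized. A minimal sketch; the task ratings below are illustrative placeholders, not values from the study:

```python
def risk_score(likelihood, severity):
    """Semi-quantitative risk score used in many JSA worksheets:
    likelihood and severity each rated 1 (low) to 5 (high)."""
    return likelihood * severity

# Hypothetical (likelihood, severity) ratings for some sawmill processes
tasks = {
    "sawing lumber into sheets (wood dust, noise)": (5, 4),
    "logging and cutting (struck-by injuries)": (4, 4),
    "vacuuming and wood preservation (chemicals)": (3, 4),
    "planing and re-arranging (repetitive work)": (4, 2),
}

# Rank tasks from highest to lowest risk to prioritize controls
ranked = sorted(tasks.items(), key=lambda kv: risk_score(*kv[1]), reverse=True)
```

    The ranking, not the absolute scores, is what feeds the action plan: the highest-scoring tasks get engineering controls (ventilation, noise reduction) before administrative measures.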

  13. METHODOLOGY OF SITE-SPECIFIC SEISMIC HAZARD ANALYSIS FOR IMPORTANT CIVIL STRUCTURE

    Directory of Open Access Journals (Sweden)

    Donny T. Dangkua

    2007-01-01

    Note from the Editor: The Indonesian archipelago is one of the most active tectonic zones in the world. Therefore, to design an important (and potentially dangerous) structure such as a nuclear power plant, knowledge of the seismicity of the site is very important. This can be achieved by performing a site-specific seismic hazard analysis, which is required at the design stage in order to determine the recommended seismic design criteria of the structure. A complete and thorough explanation of the methodology of a site-specific seismic hazard analysis is presented in this Technical Note.

  14. Hazard Analysis and Critical Control Point Program for Foodservice Establishments.

    Science.gov (United States)

    ...Hazard Analysis and Critical Control Point (HACCP) inspections in foodservice operations throughout the state. The HACCP system, which first emerged in the late 1960s, is a rational... has been adopted for use in the foodservice industry. The HACCP system consists of three main components, which are the: (1) assessment of the hazards... operations. This manual was developed to assist local sanitarians in conducting HACCP inspections and in educating foodservice operators and employees

  15. Chemical hazards analysis of resilient flooring for healthcare.

    Science.gov (United States)

    Lent, Tom; Silas, Julie; Vallette, Jim

    2010-01-01

    This article addresses resilient flooring, evaluating the potential health effects of vinyl flooring and the leading alternatives (synthetic rubber, polyolefin, and linoleum) currently used in the healthcare marketplace. The study inventories chemicals incorporated as components of each of the four material types or involved in their life cycle as feedstocks, intermediary chemicals, or emissions. It then characterizes those chemicals using a chemical hazard-based framework that addresses persistence and bioaccumulation, human toxicity, and human exposures.

  16. [Investigation and analysis on occupational hazards in a carbon enterprise].

    Science.gov (United States)

    Lu, C D; Ding, Q F; Wang, Z X; Shao, H; Sun, X C; Zhang, F

    2017-04-20

    Objective: To investigate the occupational disease hazards in a carbon enterprise workplace and the workers' occupational health examinations, providing a basis for occupational disease prevention and control in the industry. Methods: A field occupational health survey and workplace inspections were used to study the situation and degree of occupational disease hazards in the carbon enterprise from 2013 to 2015. Occupational health monitoring was carried out for workers; physical examinations, detection of occupational hazard factors, and physical examination results were analyzed comprehensively. Results: Dust, coal tar pitch volatiles, and noise in the carbon enterprise were more serious than other hazards. Among them, the rate at which coal tar pitch volatiles exceeded the standard was 76.67%; the maximum point detection was 1.06 mg/m(3), and the maximum individual detection was 0.67 mg/m(3). There was no statistical difference among the 3 years (P>0.05). There were no significant differences in the abnormality rates of the occupational health examination items (chest X-ray, skin, audiometry, blood routine, blood pressure, electrocardiogram) among the 3 years (P>0.05), although the skin and audiometry abnormality rates were higher than 10% each year. Conclusion: Dust, coal tar pitch volatiles, and noise are the main occupational hazard factors in the carbon enterprise; the corresponding protection should be strengthened.

  17. High-Precision and Low Latency RT-GNSS Processed Data for Diverse Geophysical and Natural Hazard Communities.

    Science.gov (United States)

    Mencin, David; Hodgkinson, Kathleen; Sievers, Charlie; David, Phillips; Charles, Meertens; Glen, Mattioli

    2017-04-01

    UNAVCO has been providing infrastructure and support for solid-earth science and earthquake natural hazards for the past two decades. Recent advances in GNSS technology and data processing now provide position solutions with centimeter-level precision at high rate (>1 Hz) and low latency (i.e., the time required for data to arrive for analysis, in this case less than 1 second). These data have the potential to improve our understanding in diverse areas of geophysics, including the properties of seismic, volcanic, magmatic, and tsunami sources, and thus profoundly transform rapid event characterization and warning. Scientific and operational applications also include glacier and ice sheet motions, tropospheric modeling, and space weather. These areas of geophysics represent a spectrum of research fields, including geodesy, seismology, tropospheric weather, space weather, and natural hazards. Processed real-time GNSS (RT-GNSS) data will require formats and standards that allow this broad and diverse community to use the data and associated metadata in existing research infrastructure. These advances have critically highlighted the difficulties associated with merging data and metadata between scientific disciplines; even seemingly closely related fields such as geodesy and seismology, which both have rich histories of handling large volumes of data and metadata, do not mesh in any automated way. Community analysis strategies, or the lack thereof, such as the treatment of error, prove difficult to address and are reflected in the data and metadata. In addition, these communities have differing security, accessibility, and reliability requirements. We propose some solutions to the particular problem of making RT-GNSS processed solution data and metadata accessible to multiple scientific and natural hazard communities. Importantly, we discuss the roadblocks encountered and solved, and those that remain to be addressed.

  18. Image processing for hazard recognition in on-board weather radar

    Science.gov (United States)

    Kelly, Wallace E. (Inventor); Rand, Timothy W. (Inventor); Uckun, Serdar (Inventor); Ruokangas, Corinne C. (Inventor)

    2003-01-01

    A method of providing weather radar images to a user includes obtaining radar image data corresponding to a weather radar image to be displayed. The radar image data is image processed to identify a feature of the weather radar image which is potentially indicative of a hazardous weather condition. The weather radar image is displayed to the user along with a notification of the existence of the feature which is potentially indicative of the hazardous weather condition. Notification can take the form of textual information regarding the feature, including feature type and proximity information. Notification can also take the form of visually highlighting the feature, for example by forming a visual border around the feature. Other forms of notification can also be used.
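
    The "visual border around the feature" step described above can be sketched as a threshold-and-bounding-box pass over a 2D reflectivity grid. This is a crude stand-in for the patent's feature detection, with a hypothetical grid and threshold:

```python
def bounding_box(grid, threshold):
    """Return (row_min, col_min, row_max, col_max) of the cells whose
    value meets the threshold, or None if no cell qualifies."""
    cells = [(r, c) for r, row in enumerate(grid)
             for c, v in enumerate(row) if v >= threshold]
    if not cells:
        return None
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    return (min(rows), min(cols), max(rows), max(cols))

# Hypothetical reflectivity values; the central cluster is the "feature"
grid = [
    [10, 12, 11, 10],
    [11, 45, 50, 12],
    [10, 48, 52, 11],
    [10, 11, 12, 10],
]

box = bounding_box(grid, 40)  # region to visually highlight for the user
```

    A display layer would then draw this rectangle over the radar image and attach the textual notification (feature type and proximity) described in the abstract.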

  19. [Re-analysis of occupational hazards in foundry].

    Science.gov (United States)

    Zhang, Min; Qi, Cheng; Chen, Wei-Hong; Lu, Yang; Du, Xie-Yi; Li, Wen-Jie; Meng, Chuan-San

    2010-04-01

    To analyze systematically the characteristics of occupational hazards in the foundry and provide precise data for epidemiological studies and the control of occupational hazards in the foundry. Data on airborne dust, chemical occupational hazards, and physical occupational agents in the foundry environment from 1978 to 2008 were dynamically collected. Mean concentration and intensity (geometric mean) of occupational hazards were calculated by job in different years. The main occupational hazards in the foundry were silica, metal fume, noise, and heat stress. Silica existed in all of the main jobs. The mean concentration of silica before 1986 was an extremely high 8.6 mg/m(3), and it dropped remarkably after 1986, to 2.4 mg/m(3) from 1986 to 1989, 2.7 mg/m(3) from 1990 to 2002, and 2.7 mg/m(3) from 2003 to 2008. The trend of silica concentrations by job was consistent with the general trend. Silica concentrations differed significantly among jobs, with the highest level in melting (4.4 mg/m(3)), followed by cast shakeout and finishing (3.4 mg/m(3)), pouring (3.4 mg/m(3)), sand preparation (2.4 mg/m(3)), moulding (2.1 mg/m(3)), and core-making (1.7 mg/m(3)). The concentration of respirable dust was highest in pouring (2.76 mg/m(3)), followed by cast shakeout and finishing (1.14 mg/m(3)). The mean concentration of asbestos dust in melting was a relatively high 2.0 mg/m(3). In core-making and sand preparation, emission products of adhesives were present, with mean concentrations as follows: ammonia (5.84 mg/m(3)), formaldehyde (0.60 mg/m(3)), phenol (1.73 mg/m(3)), and phenol formaldehyde resin (1.3 mg/m(3)). Benzene and its homologues existed in cast shakeout and finishing, with levels of benzene, toluene, and xylene of 0.2 mg/m(3), 0.1 mg/m(3), and 1.3 mg/m(3), respectively. In pouring and melting, there existed chemical occupational hazards, including benzo(a)pyrene, metal fume (lead, cadmium, manganese, nickel, chromium) and gas
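
    The summary statistic used throughout the survey, the geometric mean, is the conventional choice for airborne concentrations because exposure data tend to be log-normally distributed. A minimal sketch (the sample values are illustrative, not the study's raw measurements):

```python
import math

def geometric_mean(values):
    """Geometric mean: exp of the arithmetic mean of the logs.
    Appropriate for positive, roughly log-normal concentration data."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Illustrative period means of silica concentration, mg/m^3
silica_mg_m3 = [2.4, 2.7, 2.7]
gm = geometric_mean(silica_mg_m3)
```

    Because the log transform damps the influence of occasional very high samples, the geometric mean is always at or below the arithmetic mean of the same data.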

  20. Implementation of hazard analysis and critical control points in the drinking water supply system

    Directory of Open Access Journals (Sweden)

    Asghar Tavasolifar

    2012-01-01

    Aims: This study was aimed at designing comprehensive risk management based on hazard analysis and critical control points (HACCP) in the Isfahan drinking water system. Materials and Methods: Data were obtained from field inspections and through the related organizations of Isfahan, Iran. The most important risks and risky events for water quality in all sources of raw water in the study area, including the Zayanderoud River, the water treatment plant, and the distribution system, were identified and analyzed. Practical measures for the protection, control, and limitation of the risks in different phases, from water supply to the point of consumption, were presented in the form of the seven principles of the HACCP system. Results: It was found that there was a potential for hazards during the water treatment process because of seasonal changes and the discharge of various pollutants. Water contamination could occur at eight identified critical control points (CCPs). River water could be contaminated by rural communities on the banks of the river, by natural and sudden accidents, by subversive incidents, by incomplete operation, by the inadequacy of the current treatment process, and by the age of the Isfahan water distribution system. Conclusions: In order to provide safe drinking water, it is necessary to implement a modern risk management system such as the HACCP approach. The increasing pollution of the Zayandehroud River needs urgent attention. Therefore, the role of the government in developing and mandating the HACCP system in the water industry is essential.

  1. Seismic hazard analysis of Tianjin area based on strong ground motion prediction

    Science.gov (United States)

    Zhao, Boming

    2010-08-01

    Taking Tianjin as an example, this paper proposes a methodology and process for evaluating near-fault strong ground motions from future earthquakes, in order to mitigate earthquake damage to the metropolitan area and to important engineering structures. Strong ground motion was predicted for Tianjin's main faults by a hybrid method consisting mainly of a 3D finite-difference method and stochastic Green's functions. The simulation is performed for the 3D structure of the Tianjin region with characterized asperity models. The characterized asperity model, which describes source heterogeneity, is introduced following the fault information from the project Tianjin Active Faults and Seismic Hazard Assessment. We simulated the worst case, in which the two earthquakes occur separately. The results indicate that the fault position, the rupture process and the sedimentary deposits of the basin significantly affect the amplification of the simulated ground motion. Our results also demonstrate the feasibility of practical simulation of wave propagation, including basin-induced surface waves, over a broad frequency band for seismic hazard analysis near the faults of future earthquakes in urbanized areas.

  2. Seafood safety: economics of hazard analysis and Critical Control Point (HACCP) programmes

    National Research Council Canada - National Science Library

    Cato, James C

    1998-01-01

    .... This document on economic issues associated with seafood safety was prepared to complement the work of the Service in seafood technology, plant sanitation and Hazard Analysis Critical Control Point (HACCP) implementation...

  3. Development of hazard analysis by critical control points (HACCP) procedures to control organic chemical hazards in the agricultural production of raw food commodities.

    Science.gov (United States)

    Ropkins, Karl; Ferguson, Andrew; Beck, Angus J

    2003-01-01

    Hazard Analysis by Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards in the food chain. Effective HACCP requires the consideration of all chemical, microbiological, and physical hazards. However, current procedures focus primarily on microbiological and physical hazards, while the chemical aspects of HACCP have received relatively little attention. In this article we discuss the application of HACCP to organic chemical contaminants and the problems that are likely to be encountered in agriculture. We also present generic templates for the development of organic chemical contaminant HACCP procedures for selected raw food commodities, that is, cereal crops, raw meats, and milk.

  4. A Quantitative Risk Analysis Method for the High Hazard Mechanical System in Petroleum and Petrochemical Industry

    Directory of Open Access Journals (Sweden)

    Yang Tang

    2017-12-01

    Full Text Available The high hazard mechanical system (HHMS) has three characteristics in the petroleum and petrochemical industry (PPI): high risk, high cost, and high technology requirements. In an HHMS, part, component, and subsystem failures result in risk consequences of varying degrees and types, including unexpected downtime, production losses, economic costs, safety accidents, and environmental pollution. Obtaining the quantitative risk level and distribution in an HHMS, in order to control major risk accidents and ensure safe production, is therefore of vital importance. However, the structure of an HHMS is more complex than that of many other systems, making the quantitative risk analysis process more difficult, and a variety of uncertain risk data hinder its realization. Few quantitative risk analysis techniques and studies exist for HHMS, especially in the PPI. Therefore, a study of a quantitative risk analysis method for HHMS was completed to obtain the risk level and distribution of high-risk objects. Firstly, Fuzzy Set Theory (FST) was applied to address the uncertain risk data for the occurrence probability (OP) and consequence severity (CS) in the risk analysis process. Secondly, a fuzzy fault tree analysis (FFTA) and a fuzzy event tree analysis (FETA) were used to achieve quantitative risk analysis and calculation. Thirdly, a fuzzy bow-tie model (FBTM) was established to obtain a quantitative risk assessment result from the analysis results of the FFTA and FETA. Finally, the feasibility and practicability of the method were verified with a case study on the quantitative risk analysis of a reciprocating pump system (RPS). The quantitative risk analysis method for HHMS can provide more accurate and scientific data support for the development of Asset Integrity Management (AIM) systems in the PPI.
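
    The fuzzy fault tree step described above can be sketched with triangular fuzzy numbers and the standard AND/OR gate algebra. This is a minimal illustration under our own assumptions, not the article's implementation; the pump events and probability values are hypothetical.

```python
# Sketch of fuzzy fault tree analysis (FFTA) with triangular fuzzy
# occurrence probabilities (a, m, b). Gate formulas are the standard
# ones; the basic events and values below are purely illustrative.

def and_gate(events):
    """AND gate: component-wise product of triangular fuzzy numbers."""
    a = m = b = 1.0
    for ea, em, eb in events:
        a, m, b = a * ea, m * em, b * eb
    return (a, m, b)

def or_gate(events):
    """OR gate: 1 - product(1 - p), applied component-wise."""
    qa = qm = qb = 1.0
    for ea, em, eb in events:
        qa, qm, qb = qa * (1 - ea), qm * (1 - em), qb * (1 - eb)
    return (1 - qa, 1 - qm, 1 - qb)

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    a, m, b = tfn
    return (a + m + b) / 3.0

# Hypothetical basic events for a reciprocating pump subsystem:
seal_leak   = (0.010, 0.020, 0.040)
valve_stuck = (0.005, 0.010, 0.020)
sensor_fail = (0.020, 0.030, 0.050)

# Top event: seal leak AND (valve stuck OR sensor failure)
top = and_gate([seal_leak, or_gate([valve_stuck, sensor_fail])])
print(round(defuzzify(top), 5))
```

    The crisp value returned by the centroid defuzzification is what would then feed the bow-tie combination of fault-tree and event-tree results.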

  5. Climatic hazards warning process in Bangladesh: Experience of, and lessons from, the 1991 April cyclone

    Science.gov (United States)

    Haque, C. Emdad

    1995-09-01

    Science and technology cannot entirely control the causes of natural hazards. However, by using multifaceted programs to modify the physical and human-use systems, the potential losses from disasters can be effectively minimized. Predicting, identifying, monitoring, and forecasting extreme meteorological events are the preliminary actions towards mitigating the cyclone-loss potential of coastal inhabitants; but without the successful dissemination of forecasts and relevant information, and without appropriate responses by the potential victims, the loss potential would probably remain the same. This study examines the process through which warning of the impending disastrous cyclone of April 1991 was received by the local communities and disseminated throughout the coastal regions of Bangladesh. It is found that identification of the threatening atmospheric disturbance, monitoring of the hazard event, and dissemination of the cyclone warning were each very successful. However, due to a number of socioeconomic and cognitive factors, the reactions and responses of coastal inhabitants to the warning were in general passive, resulting in colossal losses at both the individual and national levels. The study recommends that hazard mitigation policies be integrated with national economic development plans and programs. Specifically, it is suggested that, in order to attain its goals, the cyclone warning system should regard human response to warnings as a constituent part and accommodate human dimensions in its operational design.

  6. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they are put to use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the seismic activity actually documented is carried out, showing how available observations of past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for the consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake.
In addition

  7. A historical analysis of hazardous events in the Alps – the case of Hindelang (Bavaria, Germany)

    Directory of Open Access Journals (Sweden)

    F. Barnikel

    2003-01-01

    Full Text Available A historical analysis of natural hazards for the Hindelang area in the Bavarian Alps is carried out by researching and assessing data from different archives. The focus is on the evaluation of historical hazards at a local scale, working with written documents only. Data are compiled from the archives of governmental departments, local authorities, private collections and state archives. The bandwidth of the assessed hazards includes floods, mass movements and snow avalanches. So far we have collected more than 400 references to events in the Hindelang area, some of which occurred at times or in places where natural hazards had been thought unlikely or were unknown. Our aim was to collect all written data for this area and to deduce as much information as possible on the hazardous effects on the environment, thereby enhancing our knowledge of past climatic and geomorphic dynamics in the Alps.

  8. Novel two stage bio-oxidation and chlorination process for high strength hazardous coal carbonization effluent.

    Science.gov (United States)

    Manekar, Pravin; Biswas, Rima; Karthik, Manikavasagam; Nandy, Tapas

    2011-05-15

    The effluent generated in the carbonization of coal to coke is characterized by high organic content, phenols, ammonium nitrogen, and cyanides. A full-scale effluent treatment plant (ETP) working on the principle of a single stage carbon-nitrogen bio-oxidation process (SSCNBP) revealed competition between heterotrophic and autotrophic bacteria in the bio-degradation and nitrification process. The effluent was pretreated in a stripper and then combined with other streams for treatment in the SSCNBP. Laboratory studies were carried out on process and stripped effluents in a bench-scale model of an ammonia stripper and a two stage bio-oxidation process. The free ammonia removal efficiency of the stripper was in the range of 70-89%. Bench-scale studies of the two stage bio-oxidation process achieved carbon-nitrogen reduction at a 6-day hydraulic retention time (HRT) operating in extended aeration mode. This paper addresses the selection of a treatment process for the removal of organic matter, phenols, cyanide and ammonia nitrogen. The treatment scheme, comprising ammonia stripping (pretreatment) followed by the two stage bio-oxidation and chlorination process, met the Indian Standards for discharge into Inland Surface Waters. This treatment package offers a techno-economically viable scheme for neutralizing the hazardous effluent generated by the coal carbonization process. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. In silico analysis of nanomaterials hazard and risk.

    Science.gov (United States)

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    false positive relative to false negative predictions and the reliability of toxicity data. To establish the environmental impact of ENMs identified as toxic, researchers will need to estimate the potential environmental exposure concentrations of ENMs in various media such as air, water, soil, and vegetation. When environmental monitoring data are not available, models of ENM fate and transport (at various levels of complexity) serve as alternative approaches for estimating exposure concentrations. Risk management decisions regarding the manufacturing, use, and environmental regulation of ENMs would clearly benefit from both the assessment of potential ENM exposure concentrations and suitable toxicity metrics. The decision process should consider the totality of available information: quantitative and qualitative data, the analysis of nanomaterial toxicity, and fate and transport behavior in the environment. Effective decision-making to address the potential impacts of nanomaterials will require consideration of the relevant environmental, ecological, technological, economic, and sociopolitical factors affecting the complete lifecycle of nanomaterials, while accounting for data and modeling uncertainties. Accordingly, researchers will need to establish standardized data management and analysis tools through nanoinformatics as a basis for the development of rational decision tools.

  10. Sneak analysis applied to process systems

    Science.gov (United States)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be a flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid-1970s as a means of identifying such conditions in electric circuits, in which context it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The SA method attempts to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement techniques such as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems that is not well covered by any other method.

  11. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations.

    Science.gov (United States)

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on cloud model-set pair analysis (CM-SPA). In this method, the cloud weight is proposed as the index weight. In contrast to the index weights of other methods, a cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of the cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm for the CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds, which are in turn described by cloud descriptors; the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. A comparison of the assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific.
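
    The normal-cloud representation behind CM-SPA can be illustrated with the standard forward cloud generator, which turns the three cloud descriptors (expectation Ex, entropy En, hyper-entropy He) into cloud drops. This is a generic sketch with illustrative descriptor values, not the assessment model of the article.

```python
import random

def forward_cloud(ex, en, he, n, rng):
    """Forward normal cloud generator: for each drop, draw a standard
    deviation sigma ~ N(En, He^2), then a drop x ~ N(Ex, sigma^2)."""
    drops = []
    for _ in range(n):
        sigma = abs(rng.gauss(en, he))
        drops.append(rng.gauss(ex, sigma))
    return drops

# Illustrative descriptors for a hazard index scored on [0, 1]:
rng = random.Random(42)
drops = forward_cloud(ex=0.70, en=0.10, he=0.01, n=10_000, rng=rng)
mean = sum(drops) / len(drops)
print(round(mean, 2))   # drops cluster around Ex
```

    The hyper-entropy He controls how much the drop dispersion itself varies, which is what lets a cloud weight carry both randomness and fuzziness at once.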

  12. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    Science.gov (United States)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Particia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes responses, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis for natural seismicity assume the seismic marked point process to be a stationary Poisson process whose marks - the magnitudes - are governed by an exponential distribution derived from the Gutenberg-Richter law. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely applied to IIS hazard problems. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity - IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution model derived from the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready
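
    Fitting the exponential (Gutenberg-Richter) magnitude model that the abstract questions is usually done with the Aki-Utsu maximum-likelihood b-value estimator. The sketch below recovers b from a synthetic catalogue; the catalogue is illustrative and not SHEER project data.

```python
import math
import random

def b_value_mle(magnitudes, m_min, dm=0.0):
    """Aki-Utsu maximum-likelihood b-value for magnitudes >= m_min;
    dm/2 is the usual correction when magnitudes are binned to width dm."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))

# Synthetic catalogue drawn from an exponential (G-R) model with b = 1:
rng = random.Random(1)
beta = math.log(10.0)                      # b = 1  <=>  beta = b * ln(10)
catalogue = [2.0 + rng.expovariate(beta) for _ in range(5000)]

b = b_value_mle(catalogue, m_min=2.0)
print(round(b, 2))   # recovers a value close to b = 1
```

    For real IIS catalogues, a goodness-of-fit test of the exponential model against the data (rather than just a point estimate of b) is what reveals the multimodal departures the abstract reports.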

  13. A regional analysis of elements at risk exposed to mountain hazards in the Eastern European Alps

    Science.gov (United States)

    Fuchs, Sven; Zischg, Andreas

    2014-05-01

    We present a method to quantify the number and value of buildings exposed to torrents and snow avalanches in the Austrian Alps, as well as the number of exposed people. Based on a unique population and building register dataset, a relational SQL database was developed that, in combination with GIS data, allows a rule-based, nationwide automated analysis. Furthermore, possibilities and challenges are discussed with respect to the use of such data in vulnerability assessment and with respect to resilience measures. We comprehensively address the challenges of data accuracy, scale and uncertainty. Of the total of approximately 2.4 million buildings with a clearly attributable geographical location, around 120,000 are exposed to torrent processes (5 %) and snow avalanches (0.4 %); exposure was defined here as location within the digitally available hazard maps of the Austrian Torrent and Avalanche Control Service. Around 5 % of the population (360,000 out of 8.5 million inhabitants), based on those compulsorily listed in the population register, are located in these areas. The analysis by building category resulted in 2.05 million residential buildings in Austria (85 %), 93,000 of which (4.5 %) are exposed to these hazards. In contrast, 37,300 buildings (1.6 %) throughout the country belong to the category of accommodation facilities, 5,600 of which are exposed (15 %). Of the 140,500 commercial buildings, 8,000 (5 %) are exposed. Considerable spatial variation was detectable within communities and Federal States. In general, an above-average exposure of buildings to torrent processes and snow avalanches was detectable in communities located in the Federal States of Salzburg, Styria and Vorarlberg (torrents), and Tyrol and Vorarlberg (snow avalanches). 
In the alpine part of Austria, the share of exposed accommodation buildings was two times (Salzburg) and three times (Vorarlberg) higher than the regional average of exposed buildings

  14. Flood hazard zoning in Yasooj region, Iran, using GIS and multi-criteria decision analysis

    OpenAIRE

    Omid Rahmati; Hossein Zeinivand; Mosa Besharat

    2016-01-01

    Flood is considered the most common natural disaster worldwide in recent decades. Flood hazard potential mapping is required for the management and mitigation of floods. The present research aimed to assess the efficiency of the analytical hierarchy process (AHP) in identifying potential flood hazard zones by comparison with the results of a hydraulic model. Initially, four parameters, namely distance to river, land use, elevation and land slope, were used in some part of the Yasooj River, I...
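
    The AHP weighting behind such a flood-hazard map can be sketched as a pairwise comparison matrix whose principal eigenvector gives the factor weights, with a consistency ratio below 0.1 conventionally required. The comparison judgments below are illustrative, not those of the study.

```python
# Sketch of AHP weighting for four flood-conditioning factors
# (distance to river, land use, elevation, land slope).
# The pairwise judgments are illustrative only.

def ahp_weights(matrix, iters=200):
    """Principal-eigenvector weights via power iteration, plus Saaty's
    consistency ratio (random index RI = 0.90 for a 4x4 matrix)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]
    # Principal eigenvalue estimate -> consistency index -> ratio
    lam = sum(sum(matrix[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    cr = ((lam - n) / (n - 1)) / 0.90
    return w, cr

pairwise = [
    [1.0,   3.0,   5.0,   7.0],
    [1/3.0, 1.0,   3.0,   5.0],
    [1/5.0, 1/3.0, 1.0,   3.0],
    [1/7.0, 1/5.0, 1/3.0, 1.0],
]
weights, cr = ahp_weights(pairwise)
print([round(x, 3) for x in weights], round(cr, 3))
```

    In a GIS workflow these weights would then multiply the reclassified raster layers of the four factors to produce the composite flood-hazard score.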

  15. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Paces, James B. [U.S. Geological Survey

    2014-08-31

    This product is a USGS Administrative Report that describes the samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in the surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. The ages of the innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages, from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  16. Analysis and evaluation of "noise" of occupational hazards in pumped storage power station

    Science.gov (United States)

    Zhao, Xin; Yang, Hongjian; Zhang, Huafei; Chen, Tao

    2017-05-01

    To assess the influence of noise, as an occupational hazard, on the physical health of workers, the noise intensity in a working area of a hydropower station in China was evaluated comprehensively. Under power generation conditions, noise was measured in the main areas patrolled by operators, and the noise samples from different areas were analyzed by single-factor (one-way) analysis of variance. The results show that noise intensity differs significantly among working areas: the overall noise level of the turbine layer is the highest and exceeds the national standard, so protection measures need to be strengthened; the noise intensity in the remaining areas is normal.
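
    The single-factor analysis of variance used in such a study can be sketched directly: compare between-area and within-area variability of the noise samples via the F statistic. The dB(A) readings below are illustrative values, not the station's measurements.

```python
# One-way (single-factor) ANOVA on noise samples from three plant areas.
# Readings are hypothetical dB(A) values for illustration only.

def one_way_anova(groups):
    """Return the F statistic for a one-way ANOVA across the given groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w)

turbine  = [92.1, 93.4, 91.8, 94.0, 92.7]   # turbine layer: loudest
control  = [68.3, 67.9, 69.1, 68.5, 68.0]   # control room
corridor = [75.6, 76.2, 74.9, 75.8, 76.0]   # patrol corridor

f = one_way_anova([turbine, control, corridor])
print(f > 3.89)   # compare against the 5% critical value F(2, 12) ≈ 3.89
```

    An F statistic above the critical value is what justifies the conclusion that noise intensity differs significantly among working areas.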

  17. Pathogen Reduction and Hazard Analysis and Critical Control Point (HACCP) systems for meat and poultry. USDA.

    Science.gov (United States)

    Hogue, A T; White, P L; Heminover, J A

    1998-03-01

    The United States Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS) adopted Hazard Analysis and Critical Control Point systems and established finished product standards for Salmonella in slaughter plants to improve food safety for meat and poultry. In order to make significant improvements in food safety, measures must be taken at all points in the farm-to-table chain, including production, transportation, slaughter, processing, storage, retail, and food preparation. Since pathogens can be introduced or multiplied anywhere along this continuum, success depends on the consideration and comparison of intervention measures throughout the continuum. Food animal and public health veterinarians can create the necessary preventive environment that mitigates the risk of foodborne pathogen contamination.

  18. Safety Analysis of Soybean Processing for Advanced Life Support

    Science.gov (United States)

    Hentges, Dawn L.

    1999-01-01

    Soybean (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Because of the closed environmental system and the importance of maintaining crew health, food safety is a primary concern on long duration space missions. Identification of the food safety hazards and critical control points associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect the proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, critical control points were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans, biomass management, and the use of multifunctional equipment were made in consideration of the limitations and constraints of the closed ALSSIT.

  19. Recycling of hazardous solid waste material using high-temperature solar process heat

    Energy Technology Data Exchange (ETDEWEB)

    Schaffner, B.; Meier, A.; Wuillemin, D.; Hoffelner, W.; Steinfeld, A.

    2003-03-01

    A novel high-temperature solar chemical reactor is proposed for the thermal recycling of hazardous solid waste material using concentrated solar power. A 10 kW solar reactor prototype was designed and tested for the carbothermic reduction of electric arc furnace dust (EAFD). The reactor was subjected to mean solar flux intensities of 2000 kW/m2 and operated in both batch and continuous mode within the temperature range 1120-1400 K. Extraction from the residue of up to 99% and 90% of the Zn originally contained in the EAFD was achieved for the batch and continuous solar experiments, respectively. The condensed off-gas products consisted mainly of Zn, Pb, and Cl. No ZnO was detected when the O2 concentration remained below 2 vol.-%. The use of concentrated solar energy as the source of process heat offers the possibility of converting hazardous solid waste material into valuable commodities in closed and sustainable material cycles. (author)

  20. Mapping landslide processes in the North Tanganyika - Lake Kivu rift zones: towards a regional hazard assessment

    Science.gov (United States)

    Dewitte, Olivier; Monsieurs, Elise; Jacobs, Liesbet; Basimike, Joseph; Delvaux, Damien; Draida, Salah; Hamenyimana, Jean-Baptiste; Havenith, Hans-Balder; Kubwimana, Désiré; Maki Mateso, Jean-Claude; Michellier, Caroline; Nahimana, Louis; Ndayisenga, Aloys; Ngenzebuhoro, Pierre-Claver; Nkurunziza, Pascal; Nshokano, Jean-Robert; Sindayihebura, Bernard; Philippe, Trefois; Turimumahoro, Denis; Kervyn, François

    2015-04-01

    The mountainous environments of the North Tanganyika - Lake Kivu rift zones are part of the western branch of the East African Rift. In this area, natural triggering and environmental factors such as heavy rainfall, earthquake occurrence and steep topography favour the concentration of mass movement processes. In addition, anthropogenic factors such as rapid land use change and urban expansion increase susceptibility to slope instability. Until very recently, little landslide data was available for the area. Now, through the initiation of several research projects and the setting-up of a methodology for data collection adapted to this data-poor environment, it is becoming possible to draw a first regional picture of the landslide hazard. Landslides include a wide range of ground movements such as rock falls, deep-seated slope failures and shallow debris flows. Landslides are possibly the most important geohazard in the region in terms of their recurring impact on the population, causing fatalities every year. Many landslides are observed each year across the whole region, and their occurrence is clearly linked to complex topographic, lithological and vegetation signatures coupled with heavy rainfall events, the main triggering factor. Here we present the current knowledge of the various slope processes in these equatorial environments. Particular attention is given to urban areas such as Bukavu and Bujumbura, where the landslide threat is particularly acute. Results and research perspectives on landslide inventorying and monitoring, and on susceptibility and hazard assessment, are presented.

  1. Network meta-analysis on the log-hazard scale, combining count and hazard ratio statistics accounting for multi-arm trials: A tutorial

    Directory of Open Access Journals (Sweden)

    Hawkins Neil

    2010-06-01

    Full Text Available Abstract Background Data on survival endpoints are usually summarised using hazard ratio, cumulative number of events, or median survival statistics. Network meta-analysis, an extension of traditional pairwise meta-analysis, is typically based on a single statistic; studies that do not report the chosen statistic are then excluded from the analysis, which may introduce bias. Methods In this paper we present a tutorial illustrating how network meta-analyses of survival endpoints can combine count and hazard ratio statistics in a single analysis on the hazard ratio scale. We also describe methods for accounting for the correlations in relative treatment effects (such as hazard ratios) that arise in trials with more than two arms. Combination of count and hazard ratio data in a single analysis is achieved by estimating the cumulative hazard for each trial arm reporting count data. Correlation in relative treatment effects in multi-arm trials is preserved by converting the relative treatment effect estimates (the hazard ratios) to arm-specific outcomes (hazards). Results A worked example of an analysis of mortality data in chronic obstructive pulmonary disease (COPD) is used to illustrate the methods. The data set and WinBUGS code for fixed and random effects models are provided. Conclusions By incorporating all data presentations in a single analysis, we avoid the potential selection bias associated with conducting an analysis for a single statistic, and the potential difficulties of interpretation, misleading results and loss of available treatment comparisons associated with conducting separate analyses for different summary statistics.
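
    The conversion of count data to the hazard scale described in the abstract can be sketched as follows, under the simplifying assumption (ours, for illustration) that all patients in an arm share the same follow-up time, so the cumulative hazard is H = -ln(1 - r/n). The trial counts are hypothetical.

```python
import math

def log_cumulative_hazard(events, total):
    """Log cumulative hazard from count data: log(-ln(1 - r/n)),
    valid when all patients in the arm share the same follow-up."""
    return math.log(-math.log(1.0 - events / total))

# Hypothetical two-arm trial reporting only counts of deaths:
log_h_treatment = log_cumulative_hazard(40, 200)   # 40/200 events
log_h_control   = log_cumulative_hazard(60, 200)   # 60/200 events

# Under proportional hazards, the difference of log cumulative hazards
# equals the log hazard ratio, putting count data on the same scale as
# trials that report hazard ratios directly.
log_hr = log_h_treatment - log_h_control
print(round(math.exp(log_hr), 3))
```

    This is the complementary log-log transformation; in the tutorial's Bayesian setting the arm-level log hazards would then enter the WinBUGS model alongside reported log hazard ratios.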

  2. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

    Science.gov (United States)

    Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

    2016-04-01

    measures in the area. After designing measures, the users can re-calculate risk by updating hazard intensity and object layers. This is achieved by interactive manual editing of shape (vector) layers in the web-GIS interface. Within the application, a cost-benefit analysis tool is also integrated to support the decision-making process for the selection of different protection measures. Finally, the resulting risk information (vector layers and data) can be exported in the form of shapefiles and Excel sheets. A prototype application was realized using open-source geospatial software and technologies. The Boundless framework, with its client-side SDK environment, was applied for rapid prototyping. Free and open-source components such as the PostGIS spatial database, GeoServer and GeoWebCache, and GeoExt and OpenLayers were used for the development of the platform. The developed prototype is demonstrated for a case study area located in Les Diablerets, Switzerland. This research work was carried out within a project funded by the Canton of Vaud, Switzerland. References: Bründl, M., Romang, H. E., Bischof, N., and Rheinberger, C. M.: The risk concept and its application in natural hazard risk management in Switzerland, Nat. Hazards Earth Syst. Sci., 9, 801-813, 2009. DGE: Valdorisk - Direction Générale de l'Environnement, www.vd.ch, accessed 9 January 2016, 2016. OFEV: EconoMe - Office fédéral de l'environnement, www.econome.admin.ch, accessed 9 January 2016, 2016.

  3. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity, with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high-intensity rainstorms can drive torrential systems past a tipping point, resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited), and changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region, so there is rising interest in the potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria, precipitation being hypothesized as the main forcing factor of torrential events; (ii) determining how thresholds vary in space and time; (iii) identifying which internal conditions are critical for susceptibility, since the effect of external triggers is strongly mediated by the internal disposition of catchments to respond; and (iv) assessing whether magnitude or frequency has changed in the recent past and what can be expected for the future. The 71 km2 catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on the magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-) conditions can be inferred from a dense station network. Changing bedload transport rates and

  4. Real time explosive hazard information sensing, processing, and communication for autonomous operation

    Energy Technology Data Exchange (ETDEWEB)

    Versteeg, Roelof J.; Few, Douglas A.; Kinoshita, Robert A.; Johnson, Douglas; Linda, Ondrej

    2015-12-15

    Methods, computer-readable media, and apparatuses provide robotic explosive hazard detection. A robot intelligence kernel (RIK) includes a dynamic autonomy structure with two or more autonomy levels between operator intervention and robot initiative. A mine sensor and processing module (ESPM) operating separately from the RIK perceives environmental variables indicative of a mine using subsurface perceptors. The ESPM processes mine information to determine the likelihood of the presence of a mine. A robot can autonomously modify behavior in response to an indication of a detected mine. The behavior is modified between detection of mines, detailed scanning and characterization of the mine, developing mine indication parameters, and resuming detection. Real-time messages are passed between the RIK and the ESPM. A combination of ESPM-bound messages and RIK-bound messages causes the robot platform to switch between modes including a calibration mode, the mine detection mode, and the mine characterization mode.

  5. Toxic hazard and chemical analysis of leachates from furfurylated wood

    NARCIS (Netherlands)

    Pilgard, A.; Treu, A.; Zeeland, van A.N.T.; Gosselink, R.J.A.; Westin, M.

    2010-01-01

    The furfurylation process is an extensively investigated wood modification process. Furfuryl alcohol molecules penetrate into the wood cell wall and polymerize in situ. This results in a permanent swelling of the wood cell walls. It is unclear whether or not chemical bonds exist between the furfuryl

  6. Process chains in high mountain areas and multi-hazards of different scales - the Barsem disaster, Tajikistan

    Science.gov (United States)

    Zimmermann, Markus; Fuchs, Sven; Keiler, Margreth

    2016-04-01

    Changes in high-mountain environments are responsible for new and challenging multi-hazard conditions, as cases such as the Barsem disaster (Pamir, Tajikistan) in July 2015 make clear. At least 14 major debris flows occurred in the Barsem Valley within four days during a period of exceptional meteorological conditions. The flows transported large volumes of debris onto the fan where the village of Barsem, with about 1,500 inhabitants, is located. As a result, 80 homes were completely destroyed and one person was lost. Moreover, the debris dammed the Gunt River, forming a lake two kilometers long and endangering the local power supply. The lake interrupted the Pamir Highway, and a potential lake outburst threatened the downstream communities along the valley as well as Khorog, the capital of the Gorno Badakhshan Autonomous Oblast. The damage was caused directly by the debris flow deposits and by subsequent flooding as a consequence of the dammed Gunt River. This contribution provides a first analysis of the conditions in the debris flow starting zone and the triggering of the event, the sediment connectivity during the event, and further consequences downstream related to the accumulated debris dam at the Gunt River. Furthermore, the analysis is supported by a comparison between different events in the Pamir region and the European Alps, focusing on geomorphological features in the starting zone, process sequences and process-process interactions, as well as the emerging multi-hazard situation in this context. Increasing challenges due to changes in the high-mountain environment are discussed for the Pamir region, as is the comparability between different mountain regions.

  7. Dynamic analysis of process reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas{trademark} gasification process is used to illustrate the utility of this approach. PyGas{trademark} is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas{trademark} gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.
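As a rough illustration of how steady-state gains feed a dynamic model, the sketch below integrates a first-order lag (a common stand-in for a process transfer function) with explicit Euler steps. The gain and time constant are hypothetical placeholders, not values from the PyGas{trademark} process models.

```python
def step_response(gain, tau, dt=0.1, t_end=5.0):
    """Euler integration of a first-order lag dy/dt = (K*u - y)/tau
    for a unit step input u = 1, starting from y(0) = 0."""
    y, t, out = 0.0, 0.0, []
    while t <= t_end:
        out.append(y)
        y += dt * (gain * 1.0 - y) / tau
        t += dt
    return out

# Hypothetical gain/time-constant pair standing in for a process
# sensitivity derived from a steady-state model:
resp = step_response(gain=2.0, tau=1.0)
print(round(resp[-1], 2))  # the response approaches the steady-state gain
```

The steady-state sensitivity (the gain) fixes where the response settles, while the time constant fixes how fast it gets there, which is why such sensitivities are the key input to the dynamic engineering models.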

  8. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, Northwest Italy

    Science.gov (United States)

    Turconi, L.; Tropeano, D.; Savio, G.; De, S. K.; Mason, P. J.

    2015-09-01

    The study area (600 km2), consisting of the Orco and Soana valleys in the Western Italian Alps, has experienced different types of natural hazards typical of the whole Alpine environment. Some of the authors were asked to draw up a civil protection plan for these mountainous regions. This offered the special opportunity (1) to gather a wealth of unpublished historical data, dating back several centuries, mostly concerning natural hazard processes and related damage, (2) to develop original detailed geomorphological studies in a region still poorly known, (3) to prepare detailed thematic maps illustrating landscape components related to natural conditions and hazards, (4) to thoroughly check present-day situations in the area against the effects of past events and (5) to find adequate natural hazard scenarios for all sites exposed to risk. The method of work was essentially to compare archival findings with field evidence in order to assess natural hazard processes, their occurrence and magnitude, and to arrange all such elements in a database for GIS-supported thematic maps. Several types of natural hazards, such as landslides, rockfalls, debris flows, stream floods and snow avalanches, cause huge damage to lives and property (housing, roads, tourist sites). We aim both to gain new knowledge of this large, still poorly understood area and to develop easy-to-interpret products such as natural risk maps.

  9. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
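For the generalized Pareto model named in the abstract, the magnitude-domain hazard takes a simple closed form under the standard parameterization. This is a textbook identity, not necessarily the paper's full time-domain derivation:

```latex
% GPD survival and density for POT exceedances x \ge 0
% (scale \sigma > 0, shape \xi):
\bar{F}(x) = \Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1/\xi},
\qquad
f(x) = \frac{1}{\sigma}\Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1/\xi - 1}.
% Their ratio is the magnitude-domain hazard function:
h(x) = \frac{f(x)}{\bar{F}(x)} = \frac{1}{\sigma + \xi x}.
```

For heavy-tailed hazards (\xi > 0) the hazard decreases with magnitude, and as \xi \to 0 it reduces to the constant hazard 1/\sigma of the exponential case.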

  10. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    Science.gov (United States)

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively little attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights into measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias-correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite-sample performance of our methods.
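For readers comparing the two model classes discussed above, the standard forms are as follows (with W a classically error-contaminated version of the true covariate Z; these are the conventional textbook formulations, not the paper's specific estimators):

```latex
% Cox proportional hazards: covariates act multiplicatively on the baseline
\lambda(t \mid Z) = \lambda_0(t)\,\exp(\beta^{\top} Z)
% Additive hazards: covariates shift the baseline hazard
\lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z
% Classical measurement error: the observed covariate is
W = Z + e, \qquad e \perp Z,\ \mathbb{E}[e] = 0,
% and fitting either model with W in place of Z generally biases \hat{\beta}.
```

Because the covariate enters additively rather than through an exponential link, the form of the bias under measurement error differs between the two models, which is the contrast the paper investigates.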

  11. Hazard Analysis for the Pretreatment Engineering Platform (PEP)

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, Robin S.; Geeting, John GH; Lawrence, Wesley E.; Young, Jonathan

    2008-07-10

    The Pretreatment Engineering Platform (PEP) is designed to perform a demonstration on an engineering scale to confirm the Hanford Waste Treatment Plant Pretreatment Facility (PTF) leaching and filtration process equipment design and sludge treatment process. The system will use scaled prototypic equipment to demonstrate sludge water wash, caustic leaching, oxidative leaching, and filtration. Unit operations to be tested include pumping, solids washing, chemical reagent addition and blending, heating, cooling, leaching, filtration, and filter cleaning. In addition, the PEP will evaluate potential design changes to the ultrafiltration process system equipment to potentially enhance leaching and filtration performance as well as overall pretreatment throughput. The skid-mounted system will be installed and operated in the Processing Development Laboratory-West at Pacific Northwest National Laboratory (PNNL) in Richland, Washington.

  12. High-resolution marine flood modelling coupling overflow and overtopping processes: framing the hazard based on historical and statistical approaches

    Science.gov (United States)

    Nicolae Lerma, Alexandre; Bulteau, Thomas; Elineau, Sylvain; Paris, François; Durand, Paul; Anselme, Brice; Pedreros, Rodrigo

    2018-01-01

    A modelling chain was implemented in order to propose a realistic appraisal of the risk in coastal areas affected by overflowing as well as overtopping processes. Simulations are performed through a nested downscaling strategy from regional to local scale at high spatial resolution, with explicit buildings, urban structures such as sea-front walls, and hydraulic structures liable to affect the propagation of water in urban areas. Validation of the model performance is based on analysis of available hard and soft data and on conversion of qualitative to quantitative information to reconstruct the area affected by flooding and the succession of events during two recent storms. Two joint probability approaches (joint exceedance contour and environmental contour) are used to define 100-year offshore conditions scenarios and to investigate the flood response to each scenario in terms of (1) maximum spatial extent of flooded areas, (2) volumes of water propagating inland and (3) water level in flooded areas. Scenarios of sea level rise are also considered in order to evaluate the potential hazard evolution. Our simulations show that for a maximising 100-year hazard scenario, for the municipality as a whole, 38 % of the affected zones are prone to overflow flooding and 62 % to flooding by propagation of overtopping water volume along the seafront. Results also reveal that for the two kinds of statistical scenarios a difference of about 5 % in the forcing conditions (water level, wave height and period) can produce significant differences in the flooding, such as +13.5 % in the water volume propagating inland or +11.3 % in the affected surface. In some areas, flood response appears to be very sensitive to the chosen scenario, with differences of 0.3 to 0.5 m in water level. The developed approach enables one to frame the 100-year hazard and to characterize spatially the robustness or the uncertainty of the results.
Considering a 100-year scenario with mean sea level rise (0.6 m), hazard

  13. The Transfer of Core-Based Hazardous Production Processes to the Export Processing Zones of the Periphery: The Maquiladora Centers of Northern Mexico

    Directory of Open Access Journals (Sweden)

    R. Scott Frey

    2015-08-01

    Full Text Available Transnational corporations appropriate "carrying capacity" for the core by transferring the core's hazardous products, production processes, and wastes to the peripheral countries of the world-system. An increasingly important form of this reproduction process is the transfer of core-based hazardous industries to export processing zones (EPZs) located in a number of peripheral countries in Africa, Asia, and Latin America and the Caribbean. A specific case is examined in this paper: the transfer of hazardous industries to the maquiladora centers located on the Mexican side of the Mexico-U.S. border. Maquiladoras provide an excellent case for examining what is known about the causes, adverse consequences, and political responses associated with the transfer of core-based hazardous production processes to the EPZs of the periphery.

  14. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant.

    Science.gov (United States)

    Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen

    2015-09-01

    To ensure safety in the manufacture of peanut butter ice cream, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping then followed, completing the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management. Copyright © 2015. Published by Elsevier B.V.

  15. Marine natural hazards in coastal zone: observations, analysis and modelling (Plinius Medal Lecture)

    Science.gov (United States)

    Didenkulova, Ira

    2010-05-01

    Giant surface waves approaching the coast frequently cause extensive coastal flooding, destruction of coastal constructions and loss of lives. Such waves can be generated by various phenomena: strong storms and cyclones, underwater earthquakes, high-speed ferries, and aerial and submarine landslides. The most famous examples of such events are the catastrophic tsunami in the Indian Ocean on 26 December 2004 and hurricane Katrina (28 August 2005) in the Atlantic Ocean. The huge storm in the Baltic Sea on 9 January 2005, which produced unexpectedly long waves in many areas of the Baltic Sea, and the unusually high surge created by long waves from high-speed ferries should also be mentioned as examples of regional marine natural hazards connected with extensive runup of certain types of waves. The processes of wave shoaling and runup for all these different marine natural hazards (tsunami, coastal freak waves, ship waves) are studied based on rigorous solutions of nonlinear shallow-water theory. The key and novel results presented here are: (i) parameterization of basic formulas for extreme runup characteristics for bell-shaped waves, showing that they depend only weakly on the initial wave shape, which is usually unknown in real sea conditions; (ii) runup analysis of periodic asymmetric waves with a steep front, as such waves penetrate inland over larger distances and with larger velocities than symmetric waves; (iii) statistical analysis of irregular wave runup, demonstrating that nearshore wave nonlinearity does not influence the probability distribution of the velocity of the moving shoreline or its moments, but does influence the vertical displacement of the moving shoreline (runup). Wave runup on convex beaches and in narrow bays, which allow abnormal wave amplification, is also discussed. The analytical results described are used to explain observed extreme runup of tsunami, freak (sneaker) waves and ship waves on different coasts

  16. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    Science.gov (United States)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are among the key factors in a country's economic growth. Inadequate infrastructure networks could be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can trigger other natural hazards such as landslides and debris flows. Disruptions caused by landslides are similar to those of floods and increase maintenance costs considerably. The effect of natural disasters on society is likely to increase under a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least-cost-path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historical incidents of flooding and landslides, in order to discuss the usefulness of the model in road planning.

  17. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state of the art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project was carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state of the art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation of the state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.

  18. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    Science.gov (United States)

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  19. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    Science.gov (United States)

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    In this article we consider the quality control and safety system implemented at one of the largest flight catering food production plants serving airline passengers and flight crews. The control system was based on Hazard Analysis and Critical Control Points (HACCP) principles and the hygienic and anti-epidemic measures developed. The identification of hazard factors at each stage of the technical process is considered. Monitoring results for six critical control points over a five-year period are presented. The quality control and safety system reduces the risk of food contamination during the acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated, and further ways of harmonizing and implementing HACCP principles at the plant are determined.

  20. Natural hazards for the Earth's civilization from space, 1. Cosmic ray influence on atmospheric processes

    Directory of Open Access Journals (Sweden)

    L. I. Dorman

    2008-04-01

    Full Text Available In this paper we give a short description of global natural disasters for the Earth's civilization from space: (1) galactic and solar cosmic ray (CR) influence on atmospheric processes; (2) impacts of great space magnetic storms during big Forbush effects in CR; (3) impacts of great radiation hazards from solar CR during flare energetic particle events; (4) great impacts on planetary climate during periods when the Solar system is captured by molecular-dust clouds; (5) catastrophic disasters from nearby supernova explosions; and (6) catastrophic disasters from asteroid impacts on the Earth. Some of these problems have already been studied (see e.g. Dorman, 1957, 1963a, b; Dorman and Miroshnichenko, 1968; Dorman, 1972, 1974, 1975a, b, 1978; Velinov et al., 1974; Miroshnichenko, 2001, 2003; Dorman, 2004, 2006, 2008). We present here a detailed treatment of the first disaster only, leaving the analysis of the other aspects to future papers.

  1. Analysis on the Industrial Design of Food Package and the Component of Hazardous Substance in the Packaging Material

    OpenAIRE

    Wei-Wen Huang

    2015-01-01

    Transferring the hazardous chemicals contained in food packaging materials into food would threaten the health of consumers; therefore, related laws and regulations and detection methods for hazardous substances have been established at home and abroad to ensure that food packaging materials are safe to use. Based on an analysis of the hazardous components in food packaging, a set of detection methods for hazardous substances in food packaging was established in this paper and ...

  2. Hazard Analysis of Pollution Abatement Techniques. Volume I

    Science.gov (United States)

    1974-06-01

    tower (or scrubber) at the end of the nitric acid manufacturing process. This is the point at which the tail-gas is received by the molecular sieve... NH4NO3, or NH3) and a fuel source, such as a bearing leaking oil, or oil left inside the unit after maintenance. NH4NO3 and NH3 are normally not

  3. Flood hazard zoning in Yasooj region, Iran, using GIS and multi-criteria decision analysis

    Directory of Open Access Journals (Sweden)

    Omid Rahmati

    2016-05-01

    Full Text Available Flooding has been the most common natural disaster worldwide during the last decades. Flood hazard potential mapping is required for the management and mitigation of floods. The present research aimed to assess the efficiency of the analytic hierarchy process (AHP) in identifying potential flood hazard zones by comparing its results with those of a hydraulic model. Four parameters, namely distance to river, land use, elevation and land slope, were used for part of the Yasooj River, Iran. In order to determine the weight of each effective factor, questionnaires of comparison ratings on Saaty's scale were prepared and distributed to eight experts. The normalized weights of the criteria were determined based on Saaty's nine-point scale and each criterion's importance in specifying flood hazard potential zones, using the AHP and eigenvector methods. The set of criteria was integrated by the weighted linear combination method using ArcGIS 10.2 software to generate the flood hazard prediction map. The inundation simulation (extent and depth of flood) was conducted using the hydrodynamic program HEC-RAS for 50- and 100-year return period floods. The validation of the flood hazard prediction map was conducted based on flood extent and depth maps. The results showed that the AHP technique is promising for making accurate and reliable predictions of flood extent. Therefore, the AHP and geographic information system (GIS) techniques are suggested for assessment of flood hazard potential, specifically in data-scarce regions.
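The AHP weighting step described above (principal-eigenvector weights from a Saaty pairwise-comparison matrix, plus the conventional consistency check) can be sketched as follows. The comparison matrix here is hypothetical, not the one elicited from the study's eight experts.

```python
import numpy as np

# Hypothetical Saaty pairwise-comparison matrix for the four factors
# (distance to river, land use, elevation, land slope); entries use
# the 1-9 scale and A[j][i] = 1 / A[i][j].
A = np.array([
    [1.0, 3.0, 5.0, 3.0],
    [1 / 3, 1.0, 3.0, 1.0],
    [1 / 5, 1 / 3, 1.0, 1 / 3],
    [1 / 3, 1.0, 3.0, 1.0],
])

# Criterion weights = principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1), with random index
# RI = 0.90 for n = 4; CR below 0.1 is conventionally acceptable.
n = A.shape[0]
lambda_max = eigvals.real[k]
cr = (lambda_max - n) / (n - 1) / 0.90
print(np.round(weights, 3), round(cr, 3))
```

The resulting weights would then feed a weighted linear combination of the reclassified criterion rasters in the GIS to produce the flood hazard prediction map.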

  4. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? 
Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including
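
    The classical PSHA computation that OpenQuake performs for each seismic source — aggregating rupture exceedance probabilities into a hazard curve — can be sketched in plain Python. This is not OpenQuake's actual API; the toy ground-motion model, rupture rates, and all numbers below are hypothetical:

```python
import math

def hazard_curve(sources, gmpe, iml_grid, t=50.0):
    """Exceedance probabilities over t years for a grid of intensity
    measure levels (classical PSHA sketch, Poisson assumption)."""
    probs = []
    for x in iml_grid:
        rate = 0.0
        for src in sources:
            # src: list of (annual_rate, magnitude, distance_km) ruptures
            for nu, mag, dist in src:
                mean_ln, sigma = gmpe(mag, dist)  # ln-PGA mean and sigma
                # P(IM > x | rupture) under a lognormal ground-motion model
                z = (math.log(x) - mean_ln) / sigma
                p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))
                rate += nu * p_exceed
        probs.append(1.0 - math.exp(-rate * t))
    return probs

# Hypothetical single-fault source and toy GMPE (illustrative numbers only)
def toy_gmpe(mag, dist_km):
    mean_ln = -1.0 + 0.5 * mag - 1.0 * math.log(dist_km + 10.0)
    return mean_ln, 0.6

source = [(0.01, 6.5, 15.0), (0.002, 7.2, 15.0)]
curve = hazard_curve([source], toy_gmpe, [0.1, 0.2, 0.4])
```

    The probabilities decrease monotonically with the intensity level, which is the defining shape of a hazard curve.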

  5. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF falls below some specified threshold, failure is considered possible. The objective of the stability analysis is then to estimate the failure probability P, i.e. the probability that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatory uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating the two facets of uncertainty can be seen from a risk management perspective because: - Aleatory uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (laboratory or in situ surveys), improving the measurement methods or evaluating calculation procedures with model tests, and confronting more information sources (expert opinions, data from the literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e
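
    The standard probabilistic treatment of the aleatory facet is a Monte Carlo estimate of P(SF < threshold). The sketch below uses an entirely hypothetical safety-factor model and parameter distributions, purely to illustrate the mechanics:

```python
import random
import math

def failure_probability(n=100_000, threshold=1.0, seed=42):
    """Monte Carlo estimate of P(SF < threshold) for a toy slope model.
    SF = resisting stress / driving stress; distributions are illustrative."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        cohesion = rng.lognormvariate(math.log(25.0), 0.2)  # kPa (aleatory)
        friction = math.radians(rng.gauss(30.0, 3.0))       # friction angle
        driving = 40.0                                      # kPa, fixed load
        sf = (cohesion + driving * math.tan(friction)) / driving
        if sf < threshold:
            failures += 1
    return failures / n

p_fail = failure_probability(n=20_000)
```

    Epistemic uncertainty in the distribution parameters themselves (here, the assumed means and spreads) is exactly what the possibility-based representations discussed in the abstract aim to capture separately.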

  6. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    Science.gov (United States)

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-08

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes-closed and eyes-open conditions. We found that participants in the HAC group presented higher beta AP at the centroparietal region, as well as lower beta MF at the frontal and centroparietal regions, in the eyes-closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanisms behind alcohol dependence and hazardous alcohol consumption. The similarities could be explained by considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition.

  7. Automatic hazard analysis of batch operations with Petri nets

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.-F.; Wu, J.-Y.; Chang, C.-T

    2002-04-01

    A systematic procedure has been proposed in this study to construct Petri nets for modeling the fault propagation behaviors in batch processes. An efficient algorithm has also been developed to enumerate all possible scenarios, which may lead to an undesirable consequence. This approach has been applied to a number of examples. The results show that it is more accurate and more comprehensive when compared with the conventional methods.
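
    The enumeration the authors describe — finding all markings of a Petri net reachable from an initial state, including any that represent an undesirable consequence — can be sketched as a breadth-first search over markings. The three-place net below is hypothetical, not taken from the paper:

```python
from collections import deque

def reachable_markings(initial, transitions, limit=10_000):
    """Enumerate all markings reachable from `initial` (a tuple of token
    counts, one per place). Each transition is a (consume, produce) pair
    of per-place token vectors."""
    seen = {initial}
    queue = deque([initial])
    while queue and len(seen) < limit:
        m = queue.popleft()
        for consume, produce in transitions:
            if all(m[i] >= consume[i] for i in range(len(m))):  # enabled?
                nxt = tuple(m[i] - consume[i] + produce[i]
                            for i in range(len(m)))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

# Hypothetical 3-place net: (valve_open, tank_filling, overflow-hazard)
transitions = [
    ((1, 0, 0), (0, 1, 0)),  # open valve -> tank starts filling
    ((0, 1, 0), (0, 0, 1)),  # filling unchecked -> overflow (hazard)
]
markings = reachable_markings((1, 0, 0), transitions)
hazard_reachable = any(m[2] > 0 for m in markings)
```

    Every path from the initial marking to a marking with a token in the hazard place is one of the fault-propagation scenarios the algorithm enumerates.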

  8. The Total Risk Analysis of Large Dams under Flood Hazards

    Directory of Open Access Journals (Sweden)

    Yu Chen

    2018-02-01

    Full Text Available Dams and reservoirs are useful systems in water conservancy projects; however, they also pose a high-risk potential for large downstream areas. Flood, as the driving force of dam overtopping, is the main cause of dam failure. Dam floods and their risks are of interest to researchers and managers. In hydraulic engineering, there is a growing tendency to evaluate dam flood risk based on statistical and probabilistic methods that are unsuitable for situations with scarce historical data or low flood probability, so a more reasonable dam flood risk analysis method with fewer application restrictions is needed. Therefore, different from previous studies, this study develops a flood risk analysis method for large dams based on the concept of total risk factor (TRF used initially in dam seismic risk analysis. The proposed method is not affected by the adequacy of historical data or the low probability of flood and is capable of analyzing the dam structure influence, the flood vulnerability of the dam site, and downstream risk as well as estimating the TRF of each dam and assigning corresponding risk classes to each dam. Application to large dams in the Dadu River Basin, Southwestern China, demonstrates that the proposed method provides quick risk estimation and comparison, which can help local management officials perform more detailed dam safety evaluations for useful risk management information.
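
    The abstract does not give the TRF formula itself; a generic weighted-factor aggregation, with hypothetical criteria, weights, and class boundaries, illustrates the general shape of such an index:

```python
def total_risk_factor(scores, weights):
    """Weighted aggregate of normalized risk-factor scores (each in 0-1)."""
    assert set(scores) == set(weights)
    return sum(scores[k] * weights[k] for k in scores)

def risk_class(trf):
    """Map a TRF in [0, 1] to one of five classes (hypothetical cut-offs)."""
    for bound, label in [(0.2, "very low"), (0.4, "low"), (0.6, "moderate"),
                         (0.8, "high"), (1.01, "very high")]:
        if trf < bound:
            return label

# Hypothetical factors mirroring the abstract: dam structure, flood
# vulnerability of the site, downstream risk. Weights are illustrative.
weights = {"structure": 0.3, "site_flood": 0.4, "downstream": 0.3}
trf = total_risk_factor(
    {"structure": 0.5, "site_flood": 0.7, "downstream": 0.9}, weights)
```

    With these illustrative inputs the TRF is 0.70, which the hypothetical cut-offs place in the "high" class.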

  9. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 18. Errors in Probabilistic Seismic Hazard Analysis.

    Science.gov (United States)

    1982-01-01

    hazard is conditional on a given t.a. process representation of seismicity, symbolized here by the random process X(t). However, X(t) is not always known...Regionalized Variables and Its Applications, Les Cahiers du Centre de Morphologie Mathematique de Fontainbleau, No. 5. McGuire, R.K. and Shedlock

  10. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  11. Systemic cost-effectiveness analysis of food hazard reduction

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Lawson, Lartey Godwin; Lund, Mogens

    2015-01-01

    An integrated microbiological-economic framework for policy support is developed to determine the cost-effectiveness of alternative intervention methods and strategies to reduce the risk of Campylobacter in broilers. Four interventions at the farm level and four interventions at the processing stage are considered. Cost analyses are conducted for different risk reduction targets and for three alternative scenarios concerning the acceptable range of interventions. Results demonstrate that using a system-wide policy approach to risk reduction can be more cost-effective than a policy focusing purely on farm-level interventions. Allowing for chemical decontamination methods may enhance the cost-effectiveness of intervention strategies further.

  12. Occupational hazards among the abattoir workers associated with noncompliance to the meat processing and waste disposal laws in Malaysia

    Science.gov (United States)

    Abdullahi, Auwalu; Hassan, Azmi; Kadarman, Norizhar; Junaidu, Yakubu Muhammad; Adeyemo, Olanike Kudrat; Lua, Pei Lin

    2016-01-01

    Purpose This study aims to investigate the occupational hazards among the abattoir workers associated with noncompliance to the meat processing and waste disposal laws in Terengganu State, Malaysia. Occupational hazards are the major source of morbidity and mortality among the animal workers due to exposure to many hazardous situations in their daily practices. Occupational infections mostly contracted by abattoir workers could be caused by iatrogenic or transmissible agents, including viruses, bacteria, fungi, and parasites and the toxins produced by these organisms. Materials and methods The methodology was based on a cross-sectional survey using cluster sampling technique in the four districts of Terengganu State, Malaysia. One hundred and twenty-one abattoir workers from five abattoirs were assessed using a validated structured questionnaire and an observation checklist. Results The mean and standard deviation of occupational hazards scores of the workers were 2.32 (2.721). Physical, chemical, biological, psychosocial, musculoskeletal, and ergonomics hazards were the major findings of this study. However, the highest prevalence of occupational hazards identified among the workers was injury by sharp equipment such as a knife (20.0%), noise exposure (17.0%), and due to offensive odor within the abattoir premises (12.0%). Conclusion The major occupational hazards encountered by the workers in the study area were physical, chemical, biological, psychosocial, musculoskeletal, and ergonomics hazards. To ensure proper control of occupational health hazards among the abattoir workers, standard design and good environmental hygiene must be taken into consideration all the time. Exposure control plan, which includes risk identification, risk characterization, assessment of workers at risk, risk control, workers’ education/training, and implementation of safe work procedures, should be implemented by the government, and all the existing laws governing the abattoir operation in the country should be enforced.

  14. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    Science.gov (United States)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) LandScan™ global population distribution; and (2) frequency of cyclone, drought and flood from ~1980-2000 from Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. This analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).
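
    The abstract describes combining population, hazard frequency, technology resilience, and adaptive capacity for each grid cell. The study's actual scoring model is not given; one plausible, purely illustrative combination rule:

```python
def cell_vulnerability(population, hazard_freq, tech_resilience, readiness):
    """Vulnerability of one grid cell: people exposed to hazard events,
    discounted by drinking-water technology resilience (0-1) and adaptive
    capacity / readiness (0-1). Illustrative rule, not the study's model."""
    exposure = population * hazard_freq
    return exposure * (1.0 - tech_resilience) * (1.0 - readiness)

# Hypothetical cells: (population, events per decade,
#                      resilience of water technology, readiness)
cells = [
    (50_000, 1.2, 0.8, 0.6),  # urban, piped supply, higher readiness
    (8_000, 1.2, 0.3, 0.2),   # rural, unprotected wells, low readiness
]
scores = [cell_vulnerability(*c) for c in cells]
```

    Note how the rural cell scores higher despite its much smaller population, because low technology resilience and low adaptive capacity dominate — the qualitative effect the study's technology and readiness layers are designed to capture.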

  15. Algal-bacterial processes for the treatment of hazardous contaminants: a review.

    Science.gov (United States)

    Muñoz, Raul; Guieysse, Benoit

    2006-08-01

    Microalgae enhance the removal of nutrients, organic contaminants, heavy metals, and pathogens from domestic wastewater and furnish an interesting raw material for the production of high-value chemicals (algae metabolites) or biogas. Photosynthetic oxygen production also reduces the need for external aeration, which is especially advantageous for the treatment of hazardous pollutants that must be biodegraded aerobically but might volatilize during mechanical aeration. Recent studies have therefore shown that when proper methods for algal selection and cultivation are used, it is possible to use microalgae to produce the O2 required by acclimatized bacteria to biodegrade hazardous pollutants such as polycyclic aromatic hydrocarbons, phenolics, and organic solvents. Well-mixed photobioreactors with algal biomass recirculation are recommended to protect the microalgae from effluent toxicity and optimize light utilization efficiency. The optimum biomass concentration to maintain in the system depends mainly on the light intensity and the reactor configuration: At low light intensity, the biomass concentration should be optimized to avoid mutual shading and dark respiration, whereas at high light intensity, a high biomass concentration can be useful to protect microalgae from light inhibition and optimize the light/dark cycle frequency. Photobioreactors can be designed as open (stabilization ponds or high rate algal ponds) or enclosed (tubular, flat plate) systems. The latter are generally costly to construct and operate but more efficient than open systems. The best configuration to select will depend on factors such as process safety, land cost, and biomass use. Biomass harvest remains a limitation, but recent progress has been made in the selection of flocculating strains, the application of bioflocculants, and the use of immobilized biomass systems.

  16. Pulsed Electric Processing of the Seismic-Active Fault for Earthquake Hazard Mitigation

    Science.gov (United States)

    Novikov, V. A.; Zeigarnik, V. A.; Konev, Yu. B.; Klyuchkin, V. N.

    2010-03-01

    Previous field and laboratory investigations performed in Russia (1999-2008) showed a possibility of applying high-power electric current pulses generated by a pulsed MHD power system for triggering weak seismicity and releasing tectonic stresses in the Earth's crust for earthquake hazard mitigation. The mechanism of the influence of the man-made electromagnetic field on regional seismicity is not yet clear. One possible cause of the phenomenon may be the formation of cracks in the rocks under fluid pressure increase due to Joule heat generated by the electric current injected into the Earth's crust. A detailed 3D calculation of electric current density in the Earth's crust of the Northern Tien Shan, provided by a pulsed MHD power system connected to a grounded electric dipole, showed that at the depth of earthquake epicenters (> 5 km) the electric current density is lower than 10^-7 A/m^2, which is not sufficient to increase pressure in the fluid-saturated porous geological medium through Joule heat generation, which could provide formation of cracks resulting in fault propagation and release of tectonic stresses in the Earth's crust. Nevertheless, under certain conditions, when electric current is injected into the fault through the casing pipes of deep wells with preliminary injection of conductive fluid into the fault, the current density may be high enough for a significant increase of mechanical pressure in the porous two-phase geological medium. A numerical analysis of crack formation triggered by high-power electric pulses, based on generation of mechanical pressure in the geological medium, was carried out. It was shown that calculation of the mechanical pressure impulse due to high-power electric current in the porous two-phase medium may be performed neglecting thermal conductance, by solving the non-stationary equation of piezo-conductivity with Joule heat generation.
For calculation of heat generation the known solution of the task of current spreading from spherical or

  17. Flow-type failures in fine-grained soils : An important aspect in landslide hazard analysis

    NARCIS (Netherlands)

    Van Asch, T.W.J.; Malet, J.P.

    2009-01-01

    Forecasting the possibility of flow-type failures within a slow-moving landslide mass is rarely taken into account in quantitative hazard assessments. Therefore, this paper focuses on the potential transition of sliding blocks (slumps) into flow-like processes due to the generation of excess pore

  18. Reduction of uncertainties in probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-02-01

    Integrated research on the reduction of conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas as follows: Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the region of Korea for Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of medium-size earthquakes in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  19. Hazard Map for Autonomous Navigation

    DEFF Research Database (Denmark)

    Riis, Troels

    This dissertation describes the work performed in the area of using image analysis in the process of landing a spacecraft autonomously and safely on the surface of the Moon. This is suggested to be done using a Hazard Map. The correspondence problem between several Hazard Maps is investigated...

  20. Spatial temporal analysis of urban heat hazard in Tangerang City

    Science.gov (United States)

    Wibowo, Adi; Kuswantoro; Ardiansyah; Rustanto, Andry; Putut Ash Shidiq, Iqbal

    2016-11-01

    Urban heat is a natural phenomenon which might be caused by human activities. The human activities were represented by various types of land use, such as urban and non-urban areas. The aim of this study is to identify the urban heat behavior in Tangerang City, as it might threaten the urban environment. This study used three types of remote sensing data, namely Landsat TM, Landsat ETM+ and Landsat OLI-TIRS, to capture the urban heat behavior and to analyze the urban heat signature of Tangerang City in 2001, 2012, 2013, 2014, 2015 and 2016. The result showed that the urban heat signature changes dynamically each month based on the sun's radiation. The urban heat island covered only a small part of Tangerang City in 2001, but it increased significantly and reached 50% of the area in 2012. Based on the result on urban heat signature, the threshold for a threatening condition is 30 °C, as recognized from land surface temperature (LST). The effective temperature (ET) index describes that condition as warm and uncomfortable, increasing stress due to sweating and blood flow, and possibly causing cardiovascular disorders.

  1. Signal processing for airborne doppler radar detection of hazardous wind shear as applied to NASA 1991 radar flight experiment data

    Science.gov (United States)

    Baxa, Ernest G., Jr.

    1992-01-01

    Radar data collected during the 1991 NASA flight tests have been selectively analyzed to support research directed at developing both improved and new algorithms for detecting hazardous low-altitude windshear. Analysis of aircraft attitude data from several flights indicated that platform stability bandwidths were small compared to the data rate bandwidths, which supports an assumption that radar returns can be treated as short-time stationary. Various approaches to detection of weather returns in the presence of ground clutter are being investigated. Non-conventional clutter rejection through spectrum mode tracking and classification algorithms is a subject of continuing research. Based upon autoregressive modeling of the radar return time sequence, this approach may offer an alternative to overcome errors in conventional pulse-pair estimates. Adaptive filtering is being evaluated as a means of rejecting clutter, with emphasis on low signal-to-clutter ratio situations, particularly in the presence of discrete clutter interference. An analysis of out-of-range clutter returns is included to illustrate effects of ground clutter interference due to range aliasing for aircraft on final approach. Data are presented to indicate how aircraft groundspeed might be corrected from the radar data, as well as to point out an observed problem of groundspeed estimate bias variation with radar antenna scan angle. A description of how recorded clutter return data are mixed with simulated weather returns is included. This enables the researcher to run controlled experiments to test signal processing algorithms. In summary, research efforts involving improved modelling of radar ground clutter returns and a Bayesian approach to hazard factor estimation are mentioned.
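
    The conventional pulse-pair estimate mentioned in the abstract derives mean radial velocity from the lag-one autocorrelation of the complex I/Q returns, v = -λ/(4πT)·arg R(T). A minimal sketch with a synthetic, clutter-free signal (the radar parameters are hypothetical):

```python
import cmath
import math

def pulse_pair_velocity(iq, prt, wavelength):
    """Mean radial velocity from complex I/Q samples via the pulse-pair
    estimator: v = -lambda / (4*pi*T) * arg(R(T))."""
    acf = sum(iq[n].conjugate() * iq[n + 1] for n in range(len(iq) - 1))
    return -wavelength / (4.0 * math.pi * prt) * cmath.phase(acf)

# Synthetic return from a single scatterer moving at v_true (no clutter,
# no noise); C-band-like numbers, chosen so |v_true| < Nyquist velocity.
wavelength, prt, v_true = 0.0535, 1e-3, -12.0   # m, s, m/s
phase_step = -4.0 * math.pi * prt * v_true / wavelength
iq = [cmath.exp(1j * phase_step * n) for n in range(64)]
v_est = pulse_pair_velocity(iq, prt, wavelength)
```

    Ground clutter adds a near-zero-velocity component to R(T) that biases this estimate toward zero, which is exactly why the clutter-rejection work described above matters.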

  2. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    Science.gov (United States)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  3. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    Science.gov (United States)

    Ayele, Atalay

    2017-10-01

    Seismic hazard calculation is carried out for the Horn of Africa region (0°-20° N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale and declustered to remove the dependent events as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies is used for area source zonation. For assessing the seismic hazard, the study area was divided into small grids of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each of these grid cells by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) corresponding to 10% and 2% probability of exceedance in 50 years was calculated for all the grid points using a generic rock site with Vs = 760 m/s. Obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at rock sites, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd generation building code of Ethiopia.
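
    The 10%-in-50-years and 2%-in-50-years exceedance probabilities map to the 475- and 2475-year return periods under the Poisson assumption the study adopts; the conversion is a one-liner:

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by an exceedance probability over t years,
    assuming Poisson occurrence: P = 1 - exp(-t/T)  =>  T = -t / ln(1 - P)."""
    return -t_years / math.log(1.0 - p_exceed)

t475 = return_period(0.10, 50)   # ~475 years
t2475 = return_period(0.02, 50)  # ~2475 years
```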

  4. Sinkhole Susceptibility Hazard Zones Using GIS and Analytical Hierarchical Process (ahp): a Case Study of Kuala Lumpur and Ampang Jaya

    Science.gov (United States)

    Rosdi, M. A. H. M.; Othman, A. N.; Zubir, M. A. M.; Latif, Z. A.; Yusoff, Z. M.

    2017-10-01

    Sinkholes are not a new phenomenon in this country, especially around the Klang Valley. Since 1968, an increasing number of sinkhole incidents has been reported in Kuala Lumpur and the vicinity areas. As a result, they pose a serious threat to human lives, assets and structures, especially in the capital city of Malaysia. Therefore, a Sinkhole Hazard Model (SHM) was generated within a GIS framework by applying the Analytical Hierarchical Process (AHP) technique in order to produce a sinkhole susceptibility hazard map for the particular area. Five main criteria, each categorized into five subclasses, were selected for this research: Lithology (LT), Groundwater Level Decline (WLD), Soil Type (ST), Land Use (LU) and Proximity to Groundwater Wells (PG). A set of relative weights was assigned to each inducing factor and computed through a pairwise comparison matrix derived from expert judgment. Lithology and Groundwater Level Decline were identified as having the greatest impact on sinkhole development. The sinkhole susceptibility hazard zones were classified into five prone areas, namely very low, low, moderate, high and very high hazard. The results obtained were validated with thirty-three (33) previous sinkhole inventory data points. This evaluation shows that the model indicates 64% and 21% of the sinkhole events fall within the high and very high hazard zones, respectively. Based on this outcome, it is clear that the AHP approach is useful for predicting natural disasters such as sinkhole hazards.
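
    The AHP step — deriving criterion weights from a pairwise comparison matrix — is commonly approximated with the geometric-mean method. A sketch with a hypothetical three-criterion matrix (the paper's own five-criterion matrix is not given in the abstract):

```python
import math

def ahp_weights(matrix):
    """Priority weights from a pairwise comparison matrix using the
    geometric-mean (approximate principal eigenvector) method."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical comparisons on Saaty's 1-9 scale; criteria order:
# lithology, groundwater-level decline, land use.
m = [
    [1.0, 2.0, 5.0],
    [1 / 2, 1.0, 4.0],
    [1 / 5, 1 / 4, 1.0],
]
w = ahp_weights(m)   # lithology receives the largest weight
```

    With these illustrative judgments, lithology dominates, mirroring the paper's finding that lithology and groundwater-level decline carry the greatest weight.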

  5. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including arma series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (arma models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  6. The importance of censoring in competing risks analysis of the subdistribution hazard

    OpenAIRE

    Mark W. Donoghoe; Val Gebski

    2017-01-01

    Background The analysis of time-to-event data can be complicated by competing risks, which are events that alter the probability of, or completely preclude the occurrence of an event of interest. This is distinct from censoring, which merely prevents us from observing the time at which the event of interest occurs. However, the censoring distribution plays a vital role in the proportional subdistribution hazards model, a commonly used method for regression analysis of time-to-event data in th...

  7. Risk analysis procedure for post-wildfire natural hazards in British Columbia

    Science.gov (United States)

    Jordan, Peter

    2010-05-01

    Following a severe wildfire season in 2003, and several subsequent damaging debris flow and flood events, the British Columbia Forest Service developed a procedure for analysing risks to public safety and infrastructure from such events. At the same time, the Forest Service undertook a research program to determine the extent of post-wildfire hazards, and examine the hydrologic and geomorphic processes contributing to the hazards. The risk analysis procedure follows the Canadian Standards Association decision-making framework for risk management (which in turn is based on international standards). This has several steps: identification of risk, risk analysis and estimation, evaluation of risk tolerability, developing control or mitigation strategies, and acting on these strategies. The Forest Service procedure deals only with the first two steps. The results are passed on to authorities such as the Provincial Emergency Program and local government, who are responsible for evaluating risks, warning residents, and applying mitigation strategies if appropriate. The objective of the procedure is to identify and analyse risks to public safety and infrastructure. The procedure is loosely based on the BAER (burned area emergency response) program in the USA, with some important differences. Our procedure focuses on identifying risks and warning affected parties, not on mitigation activities such as broadcast erosion control measures. Partly this is due to limited staff and financial resources. Also, our procedure is not multi-agency, but is limited to wildfires on provincial forest land; in British Columbia about 95% of forest land is in the publicly-owned provincial forest. Each fire season, wildfires are screened by size and proximity to values at risk such as populated areas. For selected fires, when the fire is largely contained, the procedure begins with an aerial reconnaissance of the fire, and photography with a hand-held camera, which can be used to make a

  8. Treatment of hazardous landfill leachate using Fenton process followed by a combined (UASB/DHS) system.

    Science.gov (United States)

    Ismail, Sherif; Tawfik, Ahmed

    2016-01-01

    The Fenton process for pre-treatment of hazardous landfill leachate (HLL) was investigated. Total, particulate and soluble chemical oxygen demand (CODt, CODp and CODs) removal efficiencies amounted to 67%, 47% and 64%, respectively, at a pH value of 3.5, molar ratio (H2O2/Fe(2+)) of 5, H2O2 dosage of 25 ml/L and contact time of 15 min. Various treatment scenarios were attempted, focused on studying the effect of the pre-catalytic oxidation process on the performance of the up-flow anaerobic sludge blanket (UASB), UASB/down-flow hanging sponge (DHS) and DHS systems. The results obtained indicated that the pre-catalytic oxidation process improved the CODt removal efficiency in the UASB reactor by a value of 51.4%. Overall removal efficiencies of CODt, CODs and CODp were 80 ± 6%, 80 ± 7% and 78 ± 16%, respectively, for UASB/DHS treating pre-catalytic oxidation effluent. The removal efficiencies of CODt, CODs and CODp decreased to 54 ± 2%, 49 ± 2% and 71 ± 16%, respectively, for the UASB/DHS system without pre-treatment. However, the results for the combined (UASB/DHS) system were similar to those obtained for the UASB reactor treating pre-catalytic oxidation effluent. The DHS system achieved average removal efficiencies of 52 ± 4% for CODt, 51 ± 4% for CODs and 52 ± 15% for CODp. Higher removal of the COD fractions was obtained when HLL was pre-treated by Fenton reagent. The combined processes provided removal efficiencies of 85 ± 1% for CODt, 85 ± 1% for CODs and 83 ± 8% for CODp. The DHS system is effective not only for organics degradation but also for ammonia oxidation. Almost complete ammonia (NH4-N) removal (92 ± 3.6%) was achieved, and nitrate production amounted to 37 ± 6 mg/L in the treated effluent. This study strongly recommends applying the Fenton process followed by a DHS system for treatment of HLL.

  9. Lessons learned from the EG&G consolidated hazardous waste subcontract and ESH&Q liability assessment process

    Energy Technology Data Exchange (ETDEWEB)

    Fix, N.J.

    1995-03-01

    Hazardous waste transportation, treatment, recycling, and disposal contracts were first consolidated at the Idaho National Engineering Laboratory in 1992 by EG&G Idaho, Inc. At that time, disposition of Resource Conservation and Recovery Act hazardous waste, Toxic Substances Control Act waste, Comprehensive Environmental Response, Compensation, and Liability Act hazardous substances and contaminated media, and recyclable hazardous materials was consolidated under five subcontracts. The wastes were generated by five different INEL M&O contractors, under the direction of three different Department of Energy field offices. The consolidated contract reduced the number of facilities handling INEL waste from 27 to 8 qualified treatment, storage, and disposal facilities, with brokers specifically prohibited. This reduced associated transportation costs, the amount and cost of contractual paperwork, and environmental liability exposure. EG&G reviewed this approach and proposed that a consolidated hazardous waste subcontract be formed for the major EG&G-managed DOE sites: INEL, Mound, Rocky Flats, Nevada Test Site, and 10 satellite facilities. After obtaining concurrence from DOE Headquarters, this effort began in March 1992 and was completed with the award of two master task subcontracts in October and November 1993. In addition, the effort included a team to evaluate the apparent awardee's facilities for environment, safety, health, and quality (ESH&Q) and financial liability status. This report documents the evaluation of the process used to prepare, bid, and award the EG&G consolidated hazardous waste transportation, treatment, recycling, and/or disposal subcontracts and associated ESH&Q and financial liability assessments; documents the strengths and weaknesses of the process; and proposes improvements that would expedite and enhance the process for other DOE installations that use the process and for the re-bid of the consolidated subcontract, scheduled for 1997.

  10. Integration of human reliability analysis into the high consequence process

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.; Morzinski, J.

    1998-12-01

    When performing a hazards analysis (HA) for a high consequence process, human error often plays a significant role. In order to integrate human error into the hazards analysis, a human reliability analysis (HRA) is performed. Human reliability is the probability that a person will correctly perform a system-required activity in a required time period and will perform no extraneous activity that will affect the correct performance. Even though human error is a very complex subject that can only approximately be addressed in risk assessment, an attempt must be made to estimate the effect of human errors. The HRA provides data that can be incorporated in the hazard analysis event. This paper will discuss the integration of HRA into a HA for the disassembly of a high explosive component. The process was designed to use a retaining fixture to hold the high explosive in place during a rotation of the component. This tool was designed as a redundant safety feature to help prevent a drop of the explosive. This paper will use the retaining fixture to demonstrate the following HRA methodology's phases. The first phase is to perform a task analysis. The second phase is the identification of the human functions, both cognitive and psychomotor, performed by the worker. During the last phase the human errors are quantified. In reality, the HRA process is an iterative process in which the stages overlap and information gathered in one stage may be used to refine a previous stage. The rationale for the decision to use or not use the retaining fixture and the role the HRA played in the decision will be discussed.
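    The quantification phase can be illustrated with a simple probability roll-up for the drop scenario. All human error probabilities (HEPs) below are hypothetical placeholders, not values from the paper; the structure simply shows how a redundant safety feature such as the retaining fixture enters the calculation.

```python
# Illustrative HEP roll-up for a task with a redundant safety feature.
# Every probability here is a made-up placeholder, not a value from
# the analysis described in the abstract.
p_skip_fixture = 1e-2   # operator omits installing the retaining fixture
p_grip_error   = 5e-3   # psychomotor error during rotation of the component
p_fixture_fail = 1e-3   # fixture fails to retain, given it was installed

# Protection is absent if the fixture was skipped, OR it was installed
# but fails on demand.
p_no_protection = p_skip_fixture + (1 - p_skip_fixture) * p_fixture_fail

# A drop requires both the handling error and the absence of protection.
p_drop = p_grip_error * p_no_protection
print(f"P(drop) = {p_drop:.3e}")
```

    The redundant fixture reduces the drop probability by roughly two orders of magnitude relative to the bare handling-error rate, which is the quantitative argument such an HRA feeds back into the hazards analysis.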

  11. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-02-19

    ... Preventive Controls for Human Food; Extension of Comment Period for Information Collection Provisions AGENCY... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in the... ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food...

  12. 78 FR 48636 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-08-09

    ... Preventive Controls for Human Food; Extension of Comment Periods AGENCY: Food and Drug Administration, HHS... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food,'' that appeared in... Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food...

  13. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-11-20

    ... Preventive Controls for Human Food; Extension of Comment Periods AGENCY: Food and Drug Administration, HHS... 3646), entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk- Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...

  14. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-04-26

    ... Preventive Controls for Human Food; Extension of Comment Periods AGENCY: Food and Drug Administration, HHS... the proposed rule, ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...

  15. Incorporating Site Amplification into Seismic Hazard Analysis: A Fully Probabilistic Approach

    Science.gov (United States)

    Cramer, C. H.

    2001-12-01

    Developing site-specific amplification factors from geological, geophysical, and geotechnical information has been the state-of-practice for the last couple of decades. Now the state-of-the-art is to develop a distribution of possible site-specific amplification factors for a given input rock ground-motion. These state-of-the-art site-amplification distributions account for the uncertainty in soil properties and Vs structure at the site. Applying these site amplification distributions to a site-specific seismic hazard analysis requires a fully probabilistic approach. One such approach is to modify the generic ground-motion attenuation relations used in a probabilistic seismic hazard analysis to site-specific relations using a site amplification distribution developed for that site. The modification of the ground-motion attenuation relation is done prior to calculating probabilistic seismic hazard at the site. This approach has been implemented using the USGS National Seismic Hazard Mapping codes. Standard hazard models and hard-rock ground-motion attenuation relations are input into the modified codes along with a description of the site-specific amplification in the form of a lognormal probability-density-function (pdf). For each hard-rock ground-motion level, the pdf is specified by the median site-amplification factor and its natural-logarithmic standard deviation. The fully probabilistic ground-motion hazard curves are always above the hazard curve derived from multiplying the hard-rock hazard curve by the site's median site-amplification factors. At Paducah, Kentucky the difference is significant for 2%-in-50-year ground-motion estimates (0.7g vs. 0.5g for PGA and 1.3g vs. 0.9g for 1.0 s Sa). At Memphis, Tennessee the differences are less significant and may only be important at long periods (1.0 s and longer) on Mississippi flood-plain (lowlands) deposits (on the uplands deposits: 0.35g vs. 0.30g for PGA and 0.8g vs. 0.7g for 1.0 s Sa; on the lowlands
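    The fully probabilistic approach described here can be sketched as integrating the hard-rock hazard curve against a lognormal site-amplification distribution. All numbers below are illustrative, not the Paducah or Memphis values; the decreasing median amplification mimics soil nonlinearity at stronger rock motions.

```python
import numpy as np
from math import log, sqrt, erf

def lognorm_sf(x, median, sigma_ln):
    """P(AF > x) for a lognormal amplification factor AF."""
    return 0.5 * (1.0 - erf((log(x) - log(median)) / (sigma_ln * sqrt(2.0))))

# Hypothetical hard-rock hazard curve: annual exceedance probability
# at a set of rock PGA levels (g). Values are illustrative only.
rock_pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
rock_exc = np.array([1e-1, 3e-2, 8e-3, 1.5e-3, 2e-4])

# Site-amplification pdf per rock level: median AF and a common ln-sigma.
med_af = np.array([2.0, 1.8, 1.5, 1.2, 1.0])
sig_ln = 0.3

def soil_hazard(z):
    """Annual P(soil PGA > z): sum over rock-motion bins of the bin
    occurrence probability times P(amplification > z / rock level)."""
    occ = -np.diff(np.append(rock_exc, 0.0))   # per-bin occurrence probability
    return float(sum(p * lognorm_sf(z / x, m, sig_ln)
                     for p, x, m in zip(occ, rock_pga, med_af)))

print(soil_hazard(0.1), soil_hazard(0.5))
```

    Because the upper tail of the amplification pdf contributes at every rock level, a curve computed this way sits above one obtained by simply scaling the rock hazard by the median amplification factor, which is the effect the abstract reports.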

  16. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  17. Current issues and related activities in seismic hazard analysis in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong-Moon [Korea Atomic Energy Research Inst., Taejon (Korea, Republic of); Lee, Jong-Rim; Chang, Chun-Joong

    1997-03-01

    This paper discusses some technical issues identified from the seismic hazard analyses for probabilistic safety assessment of the operating Korean nuclear power plants, and the related activities to resolve the issues. Since there are no strong instrumental earthquake records in Korea, the seismic hazard analysis is mainly dependent on the historical earthquake records. Results of the past seismic hazard analyses show that there are many uncertainties in the attenuation function and intensity levels, and that the statistical methods need improvement. The identification of the activity of the Yangsan Fault, which is close to nuclear power plant sites, has been an important issue. But the issue has not been resolved yet in spite of the extensive research work done. Recently, some capable faults were found in the offshore area of Gulupdo Island in the Yellow Sea. It is anticipated that the results of research on both the Yangsan Fault and the reduction of uncertainty in seismic hazard analysis will have a significant influence on seismic design and safety assessment of nuclear power plants in the future. (author)

  18. Washing and chilling as critical control points in pork slaughter hazard analysis and critical control point (HACCP) systems.

    Science.gov (United States)

    Bolton, D J; Pearce, R A; Sheridan, J J; Blair, I S; McDowell, D A; Harrington, D

    2002-01-01

    The aim of this research was to examine the effects of preslaughter washing, pre-evisceration washing, final carcass washing and chilling on final carcass quality and to evaluate these operations as possible critical control points (CCPs) within a pork slaughter hazard analysis and critical control point (HACCP) system. This study estimated bacterial numbers (total viable counts) and the incidence of Salmonella at three surface locations (ham, belly and neck) on 60 animals/carcasses processed through a small commercial pork abattoir (80 pigs d(-1)). Significant reductions (P HACCP in pork slaughter plants. This research will provide a sound scientific basis on which to develop and implement effective HACCP in pork abattoirs.

  19. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    Science.gov (United States)

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international expert meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by the Codex Alimentarius and emphasises effective verification measures, microbiological controls of the process and the corrective actions to take when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  20. March 2016 Memo: Planning for Removal and Remedial Activities at Hardrock Mining and Mineral Processing Sites with Fluid Hazards

    Science.gov (United States)

    Memo from EPA Assistant Administrator Mathy Stanislaus, regarding planning for removal and remedial activities at hardrock mining and mineral processing sites with fluid hazards, and to share the Agency’s expectations for the work that is done at these sit

  1. 77 FR 36605 - Office of Hazardous Materials Safety Notice of Delays in Processing of Special Permits Applications

    Science.gov (United States)

    2012-06-19

    ... public comment under review. 3. Application is technically complex and is of significant impact or... volume of special permit Applications. Meaning of Application Number Suffixes N--New application M... Delays in Processing of Special Permits Applications AGENCY: Pipeline and Hazardous Materials Safety...

  2. 75 FR 52057 - Office of Hazardous Materials Safety; Notice of Delays in Processing of Special Permits Applications

    Science.gov (United States)

    2010-08-24

    ... public comment under review. 3. Application is technically complex and is of significant impact or... volume of special permit applications. Meaning of Application Number Suffixes N--New application. M... Delays in Processing of Special Permits Applications AGENCY: Pipeline and Hazardous Materials Safety...

  3. Hazard analysis in active landslide areas in the State of Veracruz, Mexico

    Science.gov (United States)

    Wilde, Martina; Morales Barrera, Wendy V.; Rodriguez Elizarrarás, Sergio R.; Solleiro Rebolledo, Elizabeth; Sedov, Sergey; Terhorst, Birgit

    2016-04-01

    mass movements are analyzed in order to reconstruct complex interrelations of the causes and effects of landslide events. One of the major objectives of this research is to evaluate the potential hazard of active landslide areas. Detailed field analyses were performed to investigate the situations and dynamics of the slope movements. Therefore, geomorphological mapping, sediment characterization as well as geophysical methods are applied. On the one hand, a detailed sediment characterization aims to identify the type of material (e.g. geotechnical attributes); on the other hand, sediments can provide information on different activity phases, respectively movement processes in slide masses. Furthermore, the focus is placed on the determination of landslide-relevant parameters and thresholds. Digital elevation models, which were generated before the onset of slope movements, are integrated in the geomorphological analysis. The poster presents the specific study sites in Veracruz and the situation of endangered slopes before and after the landslide events. It is planned to use this knowledge to model susceptibility maps for the region in the future. Moreover, field data will be used as basic information for further monitoring plans. Resulting susceptibility maps will be provided to the responsible authorities in order to support sustainable planning of settlements and infrastructure in hazardous regions.

  4. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations, that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  5. Geological Hazards analysis in Urban Tunneling by EPB Machine (Case study: Tehran subway line 7 tunnel

    Directory of Open Access Journals (Sweden)

    Hassan Bakhshandeh Amnieh

    2016-06-01

    Full Text Available Technological progress in tunneling has led to modern and efficient tunneling methods in vast underground spaces even under inappropriate geological conditions. Identification and access to appropriate and sufficient geological hazard data are key elements to successful construction of underground structures. Choice of the method, excavation machine, and prediction of suitable solutions to overcome undesirable conditions depend on geological studies and hazard analysis. Identifying and investigating the ground hazards in excavating urban tunnels by an EPB machine could augment the strategy for improving soil conditions during excavation operations. In this paper, challenges such as geological hazards, abrasion of the machine cutting tools, clogging around these tools and inside the chamber, diverse work front, severe water level fluctuations, existence of water, and fine-grained particles in the route were recognized in a study of Tehran subway line 7, for which solutions such as low speed boring, regular cutter head checks, application of soil improving agents, and appropriate grouting were presented and discussed. Due to the presence of fine particles in the route, foam employment was suggested as the optimum strategy where no filler is needed.

  6. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    Energy Technology Data Exchange (ETDEWEB)

    Adelman, D.D. [Water Resources Engineer, Lincoln, NE (United States); Stansbury, J. [Univ. of Nebraska-Lincoln, Omaha, NE (United States)

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes on the national level. Key to this waste management is guidance on design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (Leaf Area Index) conditions. The computer model HELP3, which has the capability to simulate double bottom liner systems as called for at hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume was to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to Leaf Area Index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.

  7. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of those natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems are posed to public sector decision makers. This asks for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System coupled to a tool developed to manage risk analysis, it is possible to survey the data in time and space, providing an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a
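    The risk-analysis step in such a framework ultimately multiplies out to an expected annual damage per exposed object. A minimal sketch, with purely illustrative numbers (not values from the BUWAL scheme), is:

```python
# Hedged sketch of an object-risk calculation of the kind used in
# Swiss-style natural hazard risk analyses. The factor names and all
# numeric values below are illustrative assumptions.
def object_risk(p_scenario, p_spatial, vulnerability, value):
    """Expected annual damage to one object:
    scenario frequency x probability the process reaches the object
    x degree of damage given impact x value of the object."""
    return p_scenario * p_spatial * vulnerability * value

# e.g. a 1-in-100-year debris flow with an 80% chance of reaching a
# road section, causing 30% damage to an asset worth 500,000 CHF:
r = object_risk(0.01, 0.8, 0.3, 500_000)
print(f"expected annual damage: {r:.0f} CHF/year")
```

    Summing such terms over all objects and scenarios gives the collective risk that mitigation alternatives are then weighed against.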

  8. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2003-08-01

    A laser safety and hazard analysis was performed for the temperature stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. As a result of temperature stabilization of the BSLT laser the operating parameters of the laser had changed requiring a hazard analysis based on the new operating conditions. The ARES laser system is a Van/Truck based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.

  9. Hazard analysis and critical control point systems in the United States Department of Agriculture regulatory policy.

    Science.gov (United States)

    Billy, T J; Wachsmuth, I K

    1997-08-01

    Recent outbreaks of foodborne illness and studies by expert groups have established the need for fundamental change in the United States meat and poultry inspection programme to reduce the risk of foodborne illness. The Food Safety and Inspection Service (FSIS) of the United States Department of Agriculture (USDA) has embarked on a broad effort to bring about such change, with particular emphasis on the reduction of pathogenic micro-organisms in raw meat and poultry products. The publication on 25 July 1996 of the Final Rule on pathogen reduction and hazard analysis and critical control point (HACCP) systems was a major milestone in the FSIS strategy for change. The Final Rule provides a framework for change and clarifies the respective roles of industry and government in ensuring the safety of meat and poultry products. With the implementation of this Final Rule underway, the FSIS has been exploring ways in which slaughter inspection carried out under an HACCP-based system can be changed so that food safety risks are addressed more adequately and the allocation of inspection resources is improved further. In addition, the FSIS is broadening the focus of food safety activities to extend beyond slaughter and processing plants by working with industry, academia and other government agencies. Such co-operation should lead to the development of measures to improve food safety before animals reach the slaughter plant and after products leave the inspected establishment for distribution to the retail level. For the future, the FSIS believes that quantitative risk assessments will be at the core of food safety activities. Risk assessments provide the most effective means of identifying how specific pathogens and other hazards may be encountered throughout the farm-to-table chain and of measuring the potential impact of various interventions. In addition, these assessments will be used in the development and evaluation of HACCP systems. 
The FSIS is currently conducting a

  10. The influence of Alpine soil properties on shallow movement hazards, investigated through factor analysis

    Directory of Open Access Journals (Sweden)

    S. Stanchi

    2012-06-01

    Full Text Available Mountain watersheds are particularly vulnerable to extreme meteorological events, such as high intensity rainfall, and mountain soils often show pronounced fragility and low resilience due to severe environmental conditions. Alpine soil vulnerability is partly intrinsic but in part related to climate change (mainly precipitation regimes, and is enhanced by the abandonment of rural mountain areas that reduced the land maintenance actions traditionally carried out by farmers and local populations in the past. Soil hazards are related to different processes such as water erosion, loss of consistency, surface runoff and sediment transport, often occurring simultaneously and interacting with each other. Therefore, the overall effects on soil are not easy to quantify as they can be evaluated from different soil chemical and physical properties, referring to specific soil loss phenomena such as soil erosion, soil liquefaction, loss of consistency etc. In this study, we focus our attention on a mountain region in the NW Italian Alps (Valle d'Aosta, which suffered from diffuse soil instability phenomena in recent years, as a consequence of extreme rainfall events and general abandonment of the agricultural activities in marginal areas. The main effects were a large number of shallow landslides involving limited soil depths (less than 1 m, affecting considerable surfaces in the lower and middle part of the slopes. These events caused loss of human lives in the year 2000 and therefore raised attention to land maintenance issues. Surface (topsoil: 0–20 cm and subsurface (subsoil: 20–70 cm samples were characterised chemically and physically (pH, carbon and nitrogen contents, cation exchange capacity, texture, aggregate stability, Atterberg limits etc. and they showed very different soil properties. Topsoils were characterised by better stability, structure, and consistency. The differences between the two depths were potential trigger factors for

  11. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within this process. These models include simulation analysis and probabilistic risk assessment models.

  12. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    Science.gov (United States)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are among the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property, as well as severe damage to natural resources. The local geology with a high degree of slope, coupled with high-intensity rainfall and unplanned human activities, causes many landslides in this region. The present study area attracts tourists throughout the year, so it must be considered for preventive measures. Geospatial multicriteria decision analysis (MCDA) is increasingly used for landslide vulnerability and hazard zonation mapping. It enables the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method was used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road and NDVI. These factor layers were extracted from the various related spatial data. The factors were evaluated, and then the individual factor weights and class weights were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique, based on the weights and the ratings given by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zones I to IV.
3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II and 4.61% in hazard zone I.
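
The weighted overlay at the heart of such an AHP-based zonation can be sketched as follows; the factor rasters, class ratings and weights below are purely illustrative, not those assigned in the study.

```python
# Sketch of the weighted linear combination behind an AHP-based
# Landslide Hazard Zone Index (LHZI). Factor weights and class
# ratings are invented for the example.
import numpy as np

def lhzi(factor_rasters, weights):
    """Combine rated factor rasters into a hazard index.

    factor_rasters: dict name -> 2D array of class ratings (e.g. 1-9)
    weights: dict name -> AHP-derived factor weight (weights sum to 1)
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    index = np.zeros_like(next(iter(factor_rasters.values())), dtype=float)
    for name, raster in factor_rasters.items():
        index += weights[name] * raster
    return index

# Tiny 2x2 example with three hypothetical factors
rasters = {
    "slope":    np.array([[9, 3], [5, 1]], dtype=float),
    "rainfall": np.array([[7, 7], [3, 1]], dtype=float),
    "lulc":     np.array([[5, 1], [5, 1]], dtype=float),
}
weights = {"slope": 0.5, "rainfall": 0.3, "lulc": 0.2}
index = lhzi(rasters, weights)

# Classify into four zones (I-IV) by equal-interval breaks
zones = np.digitize(index, np.linspace(index.min(), index.max(), 5)[1:-1]) + 1
print(index)
print(zones)
```

The same structure scales directly to full rasters read from GIS layers; only the class-rating step (reclassifying each layer to a common ordinal scale) is study-specific.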

  13. Occupational hazards among the abattoir workers associated with noncompliance to the meat processing and waste disposal laws in Malaysia

    Directory of Open Access Journals (Sweden)

    Abdullahi A

    2016-07-01

    Full Text Available Auwalu Abdullahi,1–3 Azmi Hassan,1 Norizhar Kadarman,2 Yakubu Muhammad Junaidu,3 Olanike Kudrat Adeyemo,4,5 Pei Lin Lua6 1Institute for Community Development and Quality of Life (i-CODE), Universiti Sultan Zainal Abidin (UniSZA), Kampus Gong Badak, 2Faculty of Medicine, Department of Community Medicine, Universiti Sultan Zainal Abidin (UniSZA), Kampus Kota, Kuala Terengganu, Terengganu, Malaysia; 3Department of Animal Health and Husbandry, Audu Bako College of Agriculture Dambatta, Kano, Nigeria; 4Center for Human and Environmental Toxicology, Department of Physiological Sciences, University of Florida, Gainesville, FL, USA; 5Department of Veterinary Public Health and Preventive Medicine, University of Ibadan, Ibadan, Nigeria; 6Community Health Research Cluster, Faculty of Health Sciences, Universiti Sultan Zainal Abidin (UniSZA), Kampus Gong Badak, Kuala Terengganu, Terengganu, Malaysia Purpose: This study aims to investigate the occupational hazards among abattoir workers associated with noncompliance with the meat processing and waste disposal laws in Terengganu State, Malaysia. Occupational hazards are a major source of morbidity and mortality among animal workers due to exposure to many hazardous situations in their daily practices. Occupational infections mostly contracted by abattoir workers could be caused by iatrogenic or transmissible agents, including viruses, bacteria, fungi, and parasites, and the toxins produced by these organisms. Materials and methods: The methodology was based on a cross-sectional survey using a cluster sampling technique in the four districts of Terengganu State, Malaysia. One hundred and twenty-one abattoir workers from five abattoirs were assessed using a validated structured questionnaire and an observation checklist. Results: The mean (standard deviation) occupational hazards score of the workers was 2.32 (2.721).
Physical, chemical, biological, psychosocial, musculoskeletal, and ergonomics hazards

  14. Hazardous Waste Management: The Role of Journalists in Decision Making Process

    Energy Technology Data Exchange (ETDEWEB)

    Eerskov-Klika, M.; Lokner, V.; Subasiae, D.; Schaller, A.

    2002-02-28

    Journalists are crucial for informing and educating the general public about facts related to hazardous and radioactive waste management. Radio programs, TV and newspapers report daily on relevant facts and news. In general, the majority of journalists are more interested in so-called daily politics than in educating the general public on certain technical or scientific topics. Therefore, hazardous and radioactive waste management was introduced to the Croatian general public over the last ten years mainly through various news items on the site selection of radioactive waste disposal facilities and some problems related to hazardous waste management. This paper presents APO's experience with journalists over the last ten years, including a program and activities aimed at informing and educating journalists from all media.

  15. Workflow Management of the SCEC Computational Platforms for Physics-Based Seismic Hazard Analysis

    Science.gov (United States)

    Jordan, T. H.; Callaghan, S.; Maechling, P. J.; Juve, G.; Deelman, E.; Rynge, M.; Vahi, K.; Silva, F.

    2012-12-01

    Earthquake simulation has the potential to substantially improve seismic hazard and risk forecasting, but the practicality of using simulation results is limited by the scale and complexity of the computations. Here we will focus on the experience of the Southern California Earthquake Center (SCEC) in applying workflow management tools to facilitate physics-based seismic hazard analysis. This system-level problem can be partitioned into a series of computational pathways according to causal sequences described in terms of conditional probabilities. For example, the exceedance probabilities of shaking intensities at geographically distributed sites conditional on a particular fault rupture (a ground motion prediction model or GMPM) can be combined with the probabilities of different ruptures (an earthquake rupture forecast or ERF) to create a seismic hazard map. Deterministic simulations of ground motions from very large suites (millions) of ruptures, now feasible through high-performance computational facilities such as SCEC's CyberShake Platform, are allowing seismologists to replace empirical GMPMs with physics-based models that more accurately represent wave propagation through heterogeneous geologic structures, such as the sedimentary basins that amplify seismic shaking. One iteration of the current broadband CyberShake hazard model for the Los Angeles region, which calculates ground motions deterministically up to 0.5 Hz and stochastically up to 10 Hz, requires the execution of about 3.3 billion jobs, taking 12.8 million computer hours and producing 10 TB of simulation data. We will show how the scalability and reliability of CyberShake calculations on some of the nation's largest computers has been improved using the Pegasus Workflow Management System. We will also describe the current challenges of scaling these calculations up by an order of magnitude to create a California-wide hazard model, which will be based on the new Uniform California Earthquake
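
The partition into conditional probabilities described above can be illustrated with a toy hazard-curve calculation; the rupture probabilities and the conditional exceedance probabilities below are invented for the example, and independence between ruptures is assumed.

```python
# Minimal sketch of combining an earthquake rupture forecast (ERF)
# with per-rupture exceedance probabilities (from a GMPM, or from
# deterministic simulations as in CyberShake) into a site hazard
# curve. All numbers are illustrative.
import numpy as np

# P(rupture k occurs in the time span of interest)
p_rupture = np.array([0.02, 0.05, 0.01])

# P(ground motion > x | rupture k), one column per threshold x
thresholds = np.array([0.1, 0.2, 0.4])  # e.g. spectral acceleration in g
p_exceed_given_rupture = np.array([
    [0.9, 0.5, 0.1],
    [0.6, 0.2, 0.02],
    [1.0, 0.8, 0.3],
])

# Assuming independent ruptures, probability that at least one
# rupture produces shaking above each threshold:
p_no_exceed = np.prod(1.0 - p_rupture[:, None] * p_exceed_given_rupture, axis=0)
hazard_curve = 1.0 - p_no_exceed
for x, p in zip(thresholds, hazard_curve):
    print(f"P(SA > {x:.1f} g) = {p:.4f}")
```

A map is then just this calculation repeated over a grid of sites, which is what makes workflow management essential at CyberShake's scale.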

  16. Parameter estimation in Probabilistic Seismic Hazard Analysis: current problems and some solutions

    Science.gov (United States)

    Vermeulen, Petrus

    2017-04-01

    A typical Probabilistic Seismic Hazard Analysis (PSHA) comprises identification of seismic source zones, determination of hazard parameters for these zones, selection of an appropriate ground motion prediction equation (GMPE), and integration over probabilities according to the Cornell-McGuire procedure. Determination of hazard parameters often does not receive the attention it deserves, and, therefore, problems therein are often overlooked. Here, many of these problems are identified, and some of them addressed. The parameters that need to be identified are those associated with the frequency-magnitude law, those associated with the earthquake recurrence law in time, and the parameters controlling the GMPE. This study is concerned with the frequency-magnitude law and the temporal distribution of earthquakes, not with GMPEs. The Gutenberg-Richter frequency-magnitude law is usually adopted, together with a Poisson process for earthquake recurrence in time. Accordingly, the parameters that need to be determined are the slope parameter of the Gutenberg-Richter frequency-magnitude law, i.e. the b-value, the maximum magnitude mmax at which the Gutenberg-Richter law applies, and the mean recurrence frequency, λ, of earthquakes. If, instead of the Cornell-McGuire procedure, the "Parametric-Historic procedure" is used, these parameters do not have to be known before the PSHA computations; they are estimated directly during the PSHA computation. The resulting relation for the frequency of ground motion vibration parameters has a functional form analogous to the frequency-magnitude law, described by a parameter γ (analogous to the b-value of the Gutenberg-Richter law) and the maximum possible ground motion amax (analogous to mmax). Originally, the approach could be applied only to simple GMPEs; recently, however, the method was extended to incorporate more complex forms of GMPEs. With regard to the parameter mmax, there are numerous methods of estimation
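
As a concrete illustration of the parameter-estimation step, the b-value can be estimated from a catalogue with the classical Aki maximum-likelihood estimator; the synthetic catalogue, completeness magnitude and catalogue length below are assumptions for the sketch, not data from the study.

```python
# Sketch: maximum-likelihood estimation of the Gutenberg-Richter
# b-value (Aki's estimator) from a synthetic catalogue.
import numpy as np

rng = np.random.default_rng(42)

b_true = 1.0
m_min = 3.0                      # assumed magnitude of completeness
beta = b_true * np.log(10.0)

# The G-R law implies magnitudes above m_min are exponentially
# distributed with rate beta = b * ln(10).
mags = m_min + rng.exponential(1.0 / beta, size=5000)

# Aki ML estimator: b = log10(e) / (mean(m) - m_min)
b_hat = np.log10(np.e) / (mags.mean() - m_min)

# Mean annual rate of m >= m_min events, assuming a 50-year catalogue
lam = len(mags) / 50.0
print(f"b-hat = {b_hat:.3f}, lambda = {lam:.1f} events/yr")
```

In practice the sensitivity of b and λ to the assumed completeness magnitude m_min is exactly the kind of overlooked problem the abstract refers to.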

  17. The importance of censoring in competing risks analysis of the subdistribution hazard.

    Science.gov (United States)

    Donoghoe, Mark W; Gebski, Val

    2017-04-04

    The analysis of time-to-event data can be complicated by competing risks, which are events that alter the probability of, or completely preclude the occurrence of an event of interest. This is distinct from censoring, which merely prevents us from observing the time at which the event of interest occurs. However, the censoring distribution plays a vital role in the proportional subdistribution hazards model, a commonly used method for regression analysis of time-to-event data in the presence of competing risks. We present the equations that underlie the proportional subdistribution hazards model to highlight the way in which the censoring distribution is included in its estimation via risk set weights. By simulating competing risk data under a proportional subdistribution hazards model with different patterns of censoring, we examine the properties of the estimates from such a model when the censoring distribution is misspecified. We use an example from stem cell transplantation in multiple myeloma to illustrate the issue in real data. Models that correctly specified the censoring distribution performed better than those that did not, giving lower bias and variance in the estimate of the subdistribution hazard ratio. In particular, when the covariate of interest does not affect the censoring distribution but is used in calculating risk set weights, estimates from the model based on these weights may not reflect the correct likelihood structure and therefore may have suboptimal performance. The estimation of the censoring distribution can affect the accuracy and conclusions of a competing risks analysis, so it is important that this issue is considered carefully when analysing time-to-event data in the presence of competing risks.
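
The distinction drawn above between competing risks and censoring can be made concrete with a small sketch: a nonparametric cumulative incidence estimate (Aalen-Johansen type) on illustrative data, which treats competing events correctly rather than censoring them. This is a companion illustration, not part of the authors' simulation study; ties are handled sequentially for simplicity.

```python
# Sketch: cumulative incidence of the event of interest in the
# presence of a competing risk. Data are invented for illustration.
import numpy as np

# time, status: 0 = censored, 1 = event of interest, 2 = competing event
times  = np.array([1, 2, 2, 3, 4, 5, 6, 7, 8, 9])
status = np.array([1, 2, 1, 0, 1, 2, 1, 0, 1, 0])

def cumulative_incidence(times, status, cause=1):
    order = np.argsort(times, kind="stable")
    t, s = times[order], status[order]
    n = len(t)
    surv = 1.0          # overall (any-event) survival S(t-)
    cif = 0.0
    at_risk = n
    out = []
    for i in range(n):
        if s[i] == cause:
            cif += surv * (1.0 / at_risk)    # cause-specific hazard step
        if s[i] != 0:
            surv *= 1.0 - 1.0 / at_risk      # any event removes survival mass
        at_risk -= 1                          # censored subjects leave risk set
        out.append((t[i], cif))
    return out

cif = cumulative_incidence(times, status, cause=1)
print(cif[-1])  # (last time, cumulative incidence of cause 1)
```

A naive 1 − Kaplan-Meier that censors the competing events would overestimate this incidence, which is the bias the subdistribution-hazard framework is designed around.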

  18. The importance of censoring in competing risks analysis of the subdistribution hazard

    Directory of Open Access Journals (Sweden)

    Mark W. Donoghoe

    2017-04-01

    Full Text Available Abstract Background The analysis of time-to-event data can be complicated by competing risks, which are events that alter the probability of, or completely preclude the occurrence of an event of interest. This is distinct from censoring, which merely prevents us from observing the time at which the event of interest occurs. However, the censoring distribution plays a vital role in the proportional subdistribution hazards model, a commonly used method for regression analysis of time-to-event data in the presence of competing risks. Methods We present the equations that underlie the proportional subdistribution hazards model to highlight the way in which the censoring distribution is included in its estimation via risk set weights. By simulating competing risk data under a proportional subdistribution hazards model with different patterns of censoring, we examine the properties of the estimates from such a model when the censoring distribution is misspecified. We use an example from stem cell transplantation in multiple myeloma to illustrate the issue in real data. Results Models that correctly specified the censoring distribution performed better than those that did not, giving lower bias and variance in the estimate of the subdistribution hazard ratio. In particular, when the covariate of interest does not affect the censoring distribution but is used in calculating risk set weights, estimates from the model based on these weights may not reflect the correct likelihood structure and therefore may have suboptimal performance. Conclusions The estimation of the censoring distribution can affect the accuracy and conclusions of a competing risks analysis, so it is important that this issue is considered carefully when analysing time-to-event data in the presence of competing risks.

  19. Mapping the hazard of extreme rainfall by peaks-over-threshold extreme value analysis and spatial regression techniques

    NARCIS (Netherlands)

    Beguería, S.; Vicente-Serrano, S.M.

    2006-01-01

    The occurrence of rainfalls of high magnitude constitutes a primary natural hazard in many parts of the world, and the elaboration of maps showing the hazard of extreme rainfalls has great theoretical and practical interest. In this work a procedure based on extreme value analysis and spatial
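
A minimal peaks-over-threshold sketch, assuming synthetic daily-rainfall excesses and an illustrative threshold: fit a Generalized Pareto Distribution (GPD) to the excesses and convert the fit to a T-year return level.

```python
# Sketch of a peaks-over-threshold extreme value analysis.
# Threshold, catalogue length and excess data are synthetic.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

years = 40
threshold = 40.0                     # mm/day, illustrative
# synthetic excesses over the threshold (true shape 0.1, scale 12)
excesses = genpareto.rvs(c=0.1, scale=12.0, size=200, random_state=rng)

# Fit the GPD to the excesses, with location fixed at zero
shape, _, scale = genpareto.fit(excesses, floc=0.0)

lam = len(excesses) / years          # mean number of exceedances per year

def return_level(T):
    """T-year return level: u + (scale/shape) * ((lam*T)**shape - 1)."""
    return threshold + scale / shape * ((lam * T) ** shape - 1.0)

print(f"shape = {shape:.3f}, scale = {scale:.2f}")
print(f"100-year rainfall ~ {return_level(100):.1f} mm/day")
```

Mapping the hazard then amounts to repeating this fit at many stations and interpolating the fitted parameters (or the return levels) spatially, which is where the regression techniques of the title come in.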

  20. Use of the hazard analysis and critical control points (HACCP) risk assessment on a medical device for parenteral application.

    Science.gov (United States)

    Jahnke, Michael; Kühn, Klaus-Dieter

    2003-01-01

    In order to guarantee the consistently high quality of medical products for human use, it is absolutely necessary that flawless hygiene conditions are maintained by the strict observance of hygiene rules. With the growing understanding of the impact of process conditions on the quality of the resulting product, process controls (surveillance) have gained increasing importance, complementing the quality profile traditionally defined by post-process product testing. Today, process controls have become an important GMP requirement for the pharmaceutical industry. However, before quality process controls can be introduced, the manufacturing process has to be analyzed, with a focus on its critical quality-influencing steps. The HACCP (Hazard Analysis and Critical Control Points) method is well recognized as a useful tool in the pharmaceutical industry. This risk analysis, following the guidelines of the HACCP method, together with the monitoring of critical steps during the manufacturing process, was applied to the manufacture of the methyl methacrylate solution used for bone cement; it led to the establishment of a preventive monitoring system and constitutes an effective concept for quality assurance of hygiene and all other parameters influencing the quality of the product.

  1. SCEC Community Modeling Environment (SCEC/CME) - Seismic Hazard Analysis Applications and Infrastructure

    Science.gov (United States)

    Maechling, P. J.; Jordan, T. H.; Kesselman, C.; Moore, R.; Minster, B.; SCEC ITR Collaboration

    2003-12-01

    The Southern California Earthquake Center (SCEC) has formed a Geoscience/IT partnership to develop an advanced information infrastructure for system-level earthquake science in Southern California. This SCEC/ITR partnership comprises SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. This collaboration recently completed the second year in a five-year National Science Foundation (NSF) funded ITR project called the SCEC Community Modeling Environment (SCEC/CME). The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed by project collaborators include a Probabilistic Seismic Hazard Analysis system called OpenSHA [Field et al., this meeting]. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERF's). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. A Rupture Dynamic Model (RDM) has also been developed that couples a rupture dynamics simulation into an anelastic wave model. The collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of SHA programs. To support computationally expensive simulations, we have constructed a grid-based system utilizing Globus software [Kesselman et al., this meeting]. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC, NPACI and TeraGrid High Performance Computing Centers.
We have

  2. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    Science.gov (United States)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy

  3. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    Science.gov (United States)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change, causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (e.g. Emmer and Vilimek 2014; Wang et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain if any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines.
These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics

  4. Implementation of Hazard Analysis Critical Control Point (HACCP) for reducing microbiological hazards in animal-based foods for children at Dr. Soedarso Regional General Hospital, Pontianak

    Directory of Open Access Journals (Sweden)

    Widyana Lakshmi Puspita

    2010-07-01

    Full Text Available Background: One way to improve the quality of food provision in hospitals is by implementing hazard analysis critical control point (HACCP) in food processing. Objective: The study aimed to identify the effect of HACCP implementation on the reduction of microbiological hazards in foods for children, in particular at the Nutrition Installation of Dr. Soedarso Hospital of Pontianak. Methods: The study was a quasi-experiment using a multiple time series design with intervention and cessation of intervention (ABA time series chain). Samples of the study were animal-based foods for children; cooking utensils used in the preparation, processing, and distribution of the food; the food providers; and food processing containers. Samples were taken 3 times before and after the implementation of HACCP, each over a one-week period. Results: The average germ rate in foods and on cooking utensils before implementation of HACCP was relatively high; after the implementation of HACCP there was a decrease. The result of statistical analysis showed that there were effects of HACCP implementation on the reduction of microbiological hazards in foods and on cooking utensils (p<0.05). The average scores of knowledge of food sanitation hygiene and of food sanitation hygiene practice increased after HACCP implementation (p<0.05). The average score of sanitation hygiene of food processing containers also increased after HACCP implementation. Conclusion: The implementation of HACCP could reduce the microbiological hazards (germ rate) of animal-based special foods for children.

  5. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    Science.gov (United States)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The necessity of studying tsunami hazard assessment for Nuclear Power Plant (NPP) sites was raised after the Fukushima event of 2011. It has been emphasized because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of wave heights. The methodology for tsunami hazard analysis is based on seismic hazard analysis. Seismic hazard analysis has been performed using both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the uncertainties of the hazard analysis can be considered through the logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the information on fault sources published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that differs most from seismic hazard analysis; it can be estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulation. Eighty tsunami simulations were performed and the wave parameters were estimated. To reduce the sensitivity induced by the location of the sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of the tsunami height was computed using the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from the probability density function. The tsunami hazards for the sampling groups were calculated. Fractile curves, which show the uncertainties of the input parameters, were estimated from the hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights.
But the minimum wave height should also be considered
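
The step from simulated wave heights to an exceedance-probability curve can be sketched as below; the source rates, simulated wave heights and lognormal spread are illustrative assumptions, with Poisson recurrence as in the abstract.

```python
# Sketch: converting simulated maximum wave heights for a set of
# tsunamigenic sources into an annual exceedance probability curve.
# All source rates, heights and the lognormal spread are invented.
import numpy as np
from scipy.stats import lognorm

# mean annual rate of each fault source and its simulated wave height (m)
rates   = np.array([1/500.0, 1/1000.0, 1/2000.0])
heights = np.array([1.2, 2.5, 4.0])
sigma_ln = 0.5                      # assumed lognormal log-std on height

h_grid = np.linspace(0.1, 8.0, 80)

# annual rate of exceeding each height on the grid, summed over sources
rate_exc = np.zeros_like(h_grid)
for lam, h in zip(rates, heights):
    rate_exc += lam * lognorm.sf(h_grid, s=sigma_ln, scale=h)

# Poisson assumption: annual exceedance probability from the rate
p_exc = 1.0 - np.exp(-rate_exc)
i = np.searchsorted(h_grid, 2.0)
print(f"P(annual max wave > {h_grid[i]:.2f} m) = {p_exc[i]:.2e}")
```

Fractile hazard curves would then come from repeating this calculation over the logic-tree branches and taking quantiles of the resulting family of curves.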

  6. Evaluating the influence of gully erosion on landslide hazard analysis triggered by heavy rainfall

    Science.gov (United States)

    Ruljigaljig, Tjuku; Tsai, Ching-Jun; Peng, Wen-Fei; Yu, Teng-To

    2017-04-01

    During rainstorm periods such as typhoons or heavy rain, the development of gullies can induce large-scale landslides. The purpose of this study is to assess and quantify the existence and development of gullies and their role in triggering landslides through landslide hazard analysis. Firstly, based on multi-scale DEM data, this study uses a wavelet transform to construct an automatic algorithm. The 1-meter DEM is used to evaluate the location and type of gully, and to establish an evaluation model for predicting erosion development. In this study, routes in Chai-Yi were studied to clarify the damage potential of roadways from local gullies. The location of gullies is used as a parameter to reduce the strength parameters. The distribution of the factor of safety (F.S.) is compared with the landslide inventory map. The results of this research could be used to increase the prediction accuracy of landslide hazard analysis due to heavy rainfall.
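
The strength-reduction idea can be illustrated with the standard infinite-slope factor-of-safety formula; the soil parameters and the reduction factors applied at gully locations below are illustrative, not values from the study.

```python
# Sketch: infinite-slope factor of safety, with reduced strength
# parameters at mapped gully locations. All values are illustrative.
import numpy as np

def factor_of_safety(c, phi_deg, gamma=18.0, depth=1.5, slope_deg=30.0, m=0.5):
    """Infinite-slope F.S. with relative saturation m (0..1).

    F.S. = [c + (gamma*z*cos^2(b) - u) * tan(phi)]
           / (gamma*z*sin(b)*cos(b)),  with u = m * gamma_w * z * cos^2(b)
    """
    b = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    gamma_w = 9.81                          # unit weight of water, kN/m^3
    u = m * gamma_w * depth * np.cos(b) ** 2
    resisting = c + (gamma * depth * np.cos(b) ** 2 - u) * np.tan(phi)
    driving = gamma * depth * np.sin(b) * np.cos(b)
    return resisting / driving

fs_intact = factor_of_safety(c=10.0, phi_deg=32.0)
# at a mapped gully, reduce cohesion and friction (illustrative factors)
fs_gully = factor_of_safety(c=10.0 * 0.5, phi_deg=32.0 * 0.8)
print(f"F.S. intact = {fs_intact:.2f}, at gully = {fs_gully:.2f}")
```

Applied cell by cell over a DEM, the same reduction shifts gully-affected cells toward F.S. < 1, which is then compared against the landslide inventory.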

  7. 76 FR 70833 - National Emission Standards for Hazardous Air Pollutant Emissions for Primary Lead Processing

    Science.gov (United States)

    2011-11-15

    ... purposes, all reference to lead emissions in this preamble means ``lead compounds'' (which is a hazardous... editorial corrections in the rule. Responding to the January 2009 petition for rulemaking from the Natural... were changes to our cancer, acute, and PB-HAP multipathway screening analyses for non-lead HAP as a...

  8. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    Science.gov (United States)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate across disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where differences of opinion between response team members contribute to defining the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves.
However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
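
Mechanically, an event tree reduces to multiplying conditional probabilities along a branch; a minimal sketch with invented node probabilities:

```python
# Sketch of an eruption-forecasting event tree: outcome probabilities
# are products of conditional probabilities along each branch.
# Node probabilities below are illustrative, not from any real crisis.

# branch: unrest -> magmatic origin -> eruption -> VEI >= 3
p_unrest   = 1.0     # unrest is currently observed
p_magmatic = 0.6     # P(magmatic origin | unrest)
p_eruption = 0.3     # P(eruption | magmatic unrest)
p_large    = 0.2     # P(VEI >= 3 | eruption)

p_any_eruption = p_unrest * p_magmatic * p_eruption
p_large_eruption = p_any_eruption * p_large
print(f"P(eruption)  = {p_any_eruption:.3f}")
print(f"P(VEI >= 3)  = {p_large_eruption:.3f}")
```

In the Bayesian version, each node probability is a distribution (informed by global databases, local monitoring and expert judgment) rather than a point value, so the branch products carry uncertainty ranges as well.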

  9. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    Science.gov (United States)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions drop out, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) to perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) to apply a filtering procedure which uses cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) to perform high resolution numerical simulations only for these representative scenarios and for a subset of near field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target.
The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
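
    Step ii) above, reducing a huge scenario set to a few representatives via cluster analysis, can be sketched with a plain k-means over per-scenario feature vectors (e.g., coarse offshore wave heights at points near the target). This is an illustrative reconstruction, not the authors' actual filtering code; the function name, feature choice, and k-means details are assumptions.

```python
import numpy as np

def representative_scenarios(features, k, n_iter=50, seed=0):
    """Cluster scenario feature vectors with a plain k-means and
    return, for each cluster, the index of the scenario closest to
    the centroid (the cluster's 'representative')."""
    rng = np.random.default_rng(seed)
    X = np.asarray(features, dtype=float)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each scenario to its nearest centroid, then update
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    # final assignment; representative = member closest to centroid
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    reps = []
    for j in range(k):
        members = np.where(labels == j)[0]
        if members.size:
            reps.append(int(members[d[members, j].argmin()]))
    return sorted(reps)
```

    Only the returned scenarios would then be re-run with high-resolution inundation modeling.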

  10. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada); Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)

    2016-09-15

    Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectrum (UHS) obtained from the PSHA is characterized by high-frequency content that differs from the original plant design-basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high-frequency effects of the UHS can be mitigated through site response analysis with site-specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building, in terms of High Confidence of Low Probability of Failure (HCLPF), can be improved. (author)

  11. Hazard reduction in nanotechnology

    NARCIS (Netherlands)

    Reijnders, L.

    2008-01-01

    The release of hazardous substances is a matter of concern for nanotechnology. This may include some nanoparticles, reactants, by-products, and solvents. The use of low-hazard solvents may reduce the hazards from nanoparticle production and nanomaterial processing. The hazards of inorganic

  12. Probabilistic aftershock hazard analysis, two case studies in West and Northwest Iran

    Science.gov (United States)

    Ommi, S.; Zafarani, H.

    2017-09-01

    Aftershock hazard maps contain essential information for search-and-rescue operations and re-occupation after a mainshock. Accordingly, the main purposes of this article are to study the aftershock decay parameters and to estimate the expected high-frequency ground motions (i.e., Peak Ground Acceleration (PGA)) for recent large earthquakes in the Iranian plateau. For this aim, the Ahar-Varzaghan doublet earthquake (August 11, 2012; M_N = 6.5, M_N = 6.3) and the Ilam (Murmuri) earthquake (August 18, 2014; M_N = 6.2) have been selected. The earthquake catalogue has been compiled using the Gardner and Knopoff (Bull Seismol Soc Am 64(5), 1363-1367, 1974) temporal and spatial windowing technique. The magnitude of completeness, the seismicity parameters (a, b), and the modified Omori law parameters (p, K, c) have been determined for these two earthquakes for the 14, 30, and 60 days after the mainshocks, and the temporal changes of the parameters (a, b, p, K, c) have been studied. The aftershock hazard maps for a 33% probability of exceedance have been computed for the time periods of 14, 30, and 60 days after the Ahar-Varzaghan and Ilam (Murmuri) earthquakes. For calculating the expected PGA of aftershocks, regional and global ground motion prediction equations have been utilized, and an amplification factor based on site classes has been applied. These aftershock hazard maps show agreement between the PGAs of large aftershocks and the forecasted PGAs. The significant role of the b parameter in the Ilam (Murmuri) probabilistic aftershock hazard maps has also been investigated.
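
    The Gutenberg-Richter (a, b) and modified Omori (p, K, c) parameters studied above are commonly combined into expected aftershock counts via a Reasenberg-Jones-type rate, in which the productivity K is folded into the 10^a term. A minimal sketch, with parameter conventions assumed rather than taken from the paper:

```python
import math

def expected_aftershocks(a, b, p, c, m_main, m_min, t1, t2):
    """Reasenberg-Jones-style expected count of aftershocks with
    magnitude >= m_min in the window [t1, t2] days after the
    mainshock: 10**(a + b*(m_main - m_min)) times the integral of
    (t + c)**(-p) over the window."""
    productivity = 10.0 ** (a + b * (m_main - m_min))
    if abs(p - 1.0) < 1e-9:
        time_integral = math.log((t2 + c) / (t1 + c))
    else:
        time_integral = ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return productivity * time_integral

def prob_one_or_more(n_expected):
    """Poisson probability of at least one such aftershock."""
    return 1.0 - math.exp(-n_expected)
```

    A map of exceedance probabilities then follows by evaluating this for each grid cell together with a ground motion prediction equation.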

  13. Laser hazard analysis for airborne AURA (Big Sky variant) Proteus platform.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2004-02-01

    A laser safety and hazard analysis was performed for the airborne AURA (Big Sky Laser Technology) lidar system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, Safe Use of Lasers, and the 2000 version of ANSI Standard Z136.6, Safe Use of Lasers Outdoors. The AURA lidar system is installed in the instrument pod of a Proteus airframe and is used to perform laser interaction experiments and tests at various national test sites. The targets are located at various distances, or ranges, from the airborne platform. In order to protect personnel who may be in the target area and may be subjected to exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength, calculate the Nominal Ocular Hazard Distance (NOHD), and determine the maximum 'eye-safe' dwell times for various operational altitudes and conditions. It was also necessary to calculate the appropriate minimum Optical Density (ODmin) of the laser safety eyewear used by authorized personnel who may receive hazardous exposures during ground-based operations of the airborne AURA laser system (system alignment and calibration).
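
    The NOHD and ODmin quantities mentioned above follow standard ANSI Z136.1 small-source relations; the sketch below is illustrative only (the example units and numbers are ours, not from the report), and any real assessment must follow the standard itself.

```python
import math

def nohd(power_w, divergence_rad, aperture_m, mpe_w_m2):
    """Nominal Ocular Hazard Distance for a CW beam, from the
    standard small-source formula
    NOHD = (sqrt(4*P / (pi*MPE)) - a) / phi,
    with emergent beam diameter a and full-angle divergence phi."""
    return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_m2))
            - aperture_m) / divergence_rad

def od_min(exposure_w_m2, mpe_w_m2):
    """Minimum eyewear optical density so the transmitted irradiance
    is at or below the MPE: OD_min = log10(H / MPE)."""
    return math.log10(exposure_w_m2 / mpe_w_m2)
```

    For example, a 1 W beam with 1 mrad divergence and a 1 cm aperture against an MPE of 25.4 W/m^2 yields an NOHD of roughly a couple of hundred meters.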

  14. A first hazard analysis of the Harrat Ash Shamah volcanic field, Syria-Jordan Borderline

    Science.gov (United States)

    Cagnan, Zehra; Akkar, Sinan; Moghimi, Saed

    2017-04-01

    The northernmost part of the Saudi Cenozoic Volcanic Fields, the 100,000 km2 Harrat Ash Shamah, has hosted some of the most recent volcanic eruptions along the Syria-Jordan borderline. With the rapid growth of the cities in this region, exposure to any potential renewed volcanism has increased considerably. We present here a first-order probabilistic hazard analysis related to new vent formation and subsequent lava flow in Harrat Ash Shamah. The 733 visible eruption vent sites were used to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW-striking zone of high spatial hazard surrounding the cities of Amman and Irbid in Jordan. The temporal eruption recurrence rate is estimated to be approximately one vent per 3,500 years, but the temporal record of the field is so poorly constrained that the lower and upper bounds on the recurrence interval are 17,700 years and 70 years, respectively. A Poisson temporal model is employed within the scope of this study. In order to treat the uncertainties associated with the spatio-temporal models, as well as the size of the area affected by lava flow, a logic tree approach is adopted. For the Syria-Jordan borderline, the spatial variation of volcanic hazard is computed, as well as the uncertainty associated with these estimates.
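
    The two building blocks described above, Gaussian kernel smoothing over known vent locations and a Poisson temporal model, can be sketched as follows. The bandwidth, coordinates, and function names are illustrative assumptions, not the paper's values.

```python
import math

def vent_intensity(x, y, vents, bandwidth):
    """Spatial density of new vent opening at (x, y), from a 2-D
    Gaussian kernel density estimate over known vent locations."""
    h2 = bandwidth ** 2
    total = 0.0
    for vx, vy in vents:
        r2 = (x - vx) ** 2 + (y - vy) ** 2
        total += math.exp(-r2 / (2.0 * h2))
    return total / (len(vents) * 2.0 * math.pi * h2)

def prob_eruption(rate_per_yr, years):
    """Poisson probability of at least one new vent in `years`."""
    return 1.0 - math.exp(-rate_per_yr * years)
```

    Multiplying the normalized spatial density by the temporal probability gives a first-order hazard value per map cell, which is the kind of product a logic tree would then weight across alternative models.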

  15. Inter-Neighborhood Migration, Race, and Environmental Hazards: Modeling Micro-Level Processes of Environmental Inequality

    Science.gov (United States)

    Crowder, Kyle; Downey, Liam

    2009-01-01

    This study combines data from the Panel Study of Income Dynamics with neighborhood-level industrial hazard data from the Environmental Protection Agency to examine the extent and sources of environmental inequality at the individual level. Results indicate that profound racial and ethnic differences in proximity to industrial pollution persist when differences in individual education, household income, and other micro-level characteristics are controlled. Examination of underlying migration patterns further reveals that black and Latino householders move into neighborhoods with significantly higher hazard levels than do comparable whites, and that racial differences in proximity to neighborhood pollution are maintained more by these disparate mobility destinations than by differential effects of pollution on the decision to move. PMID:20503918

  16. Digital detection and processing of laser beacon signals for aircraft collision hazard warning

    Science.gov (United States)

    Sweet, L. M.; Miles, R. B.; Russell, G. F.; Tomeh, M. G.; Webb, S. G.; Wong, E. Y.

    1981-01-01

    A low-cost collision hazard warning system suitable for implementation in both general and commercial aviation is presented. Laser beacon systems are used as sources of accurate relative position information that are not dependent on communication between aircraft or with the ground. The beacon system consists of a rotating low-power laser beacon, detector arrays with special optics for wide-angle acceptance and filtering of solar background light, microprocessors for proximity and relative trajectory computation, and pilot displays of potential hazards. The laser beacon system provides direct measurements of relative aircraft positions; using optimal nonlinear estimation theory, the measurements resulting from the current beacon sweep are combined with previous data to provide the best estimate of aircraft proximity, heading, minimum passing distance, and time to closest approach.

  17. ANÁLISIS DEL RIESGO DE INCENDIOS FORESTALES: UN ENFOQUE BASADO EN PROCESOS PUNTUALES // FOREST WILDFIRE HAZARD ANALYSES: A POINT PROCESSES APPROACH

    Directory of Open Access Journals (Sweden)

    Rafael González de Gouveia

    2017-06-01

    Point stochastic processes are a very useful tool for the analysis of hazard factors in wildfires. In this article, the occurrence of wildfires is studied using a spatio-temporal Poisson process, whose intensity function is taken as a characterization of wildfire hazard and is estimated with both parametric and non-parametric techniques. Finally, a set of real data is considered, provided by the Ministerio del Poder Popular para el Ambiente through the Instituto Nacional de Meteorología e Hidrología (INAMEH) of Venezuela, concerning wildfires recorded on a particular day. Hazard functions are estimated based on the proposed model, and wildfire hazard maps are generated that are consistent with the geographical and climatic characteristics of the country.

  18. Modeling and hazard mapping of complex cascading mass movement processes: the case of glacier lake 513, Carhuaz, Peru

    Science.gov (United States)

    Schneider, Demian; Huggel, Christian; García, Javier; Ludeña, Sebastian; Cochachin, Alejo

    2013-04-01

    that complex cascades of mass movement processes can realistically be modeled using different models and model parameters. The method to semi-automatically produce hazard maps is promising and should be applied in other case studies. Verification of model based results in the field remains an important requirement. Results from this study are important for the GLOF early warning system that is currently in an implementation phase, and for risk reduction efforts in general.

  19. Criticality analysis for hazardous materials transportation; Classificacao da criticidade das rotas do transporte rodoviario de produtos perigosos da BRASKEM

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Katia; Brady, Mariana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Diniz, Americo [BRASKEM S.A., Sao Paulo, SP (Brazil)

    2008-07-01

    The poor condition of Brazilian roads forces companies to be more demanding about the transportation of hazardous materials, in order to avoid accidents and material releases, and to take actions to keep releases from reaching communities and water sources. To address this situation, DNV and BRASKEM developed a risk analysis methodology called Criticality Analysis for Hazardous Materials Transportation. The objective of this methodology is to identify the most critical points along routes so that actions can be taken to avoid accidents. (author)

  20. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Torrential processes like flooding, heavy bedload transport or debris flows in steep mountain channels emerge during intense, highly localized rainfall events. They pose a serious risk to the densely populated Alpine region. Hydrogeomorphic hazards are profoundly nonlinear, threshold-mediated phenomena that frequently cause costly damage to infrastructure and people. Thus, in the context of climate change, there is ever-rising interest in whether the sediment cascades of small alpine catchments react to changing precipitation patterns and how the climate signal is propagated through the fluvial system. We intend to answer the following research questions: (i) What are the critical meteorological characteristics triggering torrential events in the Eastern Alps of Austria? (ii) The effect of external triggers is strongly mediated by the internal disposition of catchments to respond; which factors control this internal susceptibility? (iii) Do torrential processes show an increase in magnitude and frequency, or a shift in seasonality, in the recent past? (iv) Which future changes can be expected under different climate scenarios? Quantifications of bedload transport in small alpine catchments are rare and often associated with high uncertainties. Detailed knowledge, though, exists for the Schöttlbach catchment, a 71 km2 study area in Styria in the Eastern Alps. The torrent has been monitored since a heavy precipitation event resulted in a disastrous flood in July 2011. Sediment mobilisation from slopes, as well as within-channel storage and fluxes, are regularly measured by photogrammetric methods and sediment impact sensors (SIS). The associated hydro-meteorological conditions are known from a dense station network. Changing states of connectivity can thus be related to precipitation and internal dynamics (sediment availability, cut-and-fill cycles). The site-specific insights are then conceptualized for application at a broader scale. Therefore, a Styria-wide database of torrential

  1. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steve [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ma, Zhegang [Idaho National Lab. (INL), Idaho Falls, ID (United States); Spears, Bob [Idaho National Lab. (INL), Idaho Falls, ID (United States); Szilard, Ronaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kosbab, Ben [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-07-26

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC) program, Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  2. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2008-05-01

    This paper aimed to evaluate bankruptcy risk using the score method based on Conan and Holder's model. The data were collected from the balance sheet and the profit and loss account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study highlighted the financial situation of the company and the level of the main financial ratios underlying the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when bankruptcy risk ranged between 70 and 80%. In 2007 the bankruptcy risk was lower, ranging between 50 and 70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as the business environment is very risky in our country.
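
    The score function referred to above is, in its commonly published form, a weighted sum of five financial ratios. The coefficients, example ratio definitions, and thresholds below are those commonly cited for the Conan & Holder model and should be verified against the original before use; they are not taken from this paper.

```python
def conan_holder_z(x1, x2, x3, x4, x5):
    """Conan & Holder discriminant score in the commonly published
    form Z = 24*X1 + 22*X2 + 16*X3 - 87*X4 - 10*X5, where the Xi are
    financial ratios (e.g., X1 = gross operating surplus / total
    debt, X4 = financial expenses / turnover).  A low Z (below about
    4) is usually read as high bankruptcy risk, consistent with the
    abstract's interpretation of Z < 4."""
    return 24 * x1 + 22 * x2 + 16 * x3 - 87 * x4 - 10 * x5
```

    Applied year by year to balance-sheet ratios, the score traces how the firm moves between the published risk bands.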

  3. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    Science.gov (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method within a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the Kramer and Mayfield procedure to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located
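
    The combination described above, disaggregation rate increments multiplied by a conditional probability of liquefaction, reduces to a simple sum over PGA-magnitude bins. A minimal sketch of that bookkeeping, with invented bin values and a toy conditional-probability model standing in for Cetin et al. or Boulanger & Idriss:

```python
import math

def liquefaction_return_period(hazard_bins, p_liq):
    """Performance-based liquefaction rate in the spirit of Kramer &
    Mayfield (2007): sum P(liquefaction | PGA, M) over the joint
    PGA-magnitude annual rate increments from PSHA disaggregation.
    `hazard_bins` is an iterable of (pga, magnitude, annual_rate)
    tuples; `p_liq(pga, m)` is any conditional-probability model."""
    rate = sum(dr * p_liq(pga, m) for pga, m, dr in hazard_bins)
    return (1.0 / rate) if rate > 0.0 else math.inf
```

    Repeating this at each depth in a soil profile produces the liquefaction hazard curves as a function of depth mentioned in the abstract.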

  4. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    Science.gov (United States)

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

    2004-01-01

    -motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.
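
    The two hazard levels quoted above, 2% and 10% probability of exceedance in 50 years, correspond under the usual Poisson assumption to return periods of roughly 2,475 and 475 years. The conversion is a one-liner:

```python
import math

def annual_rate_from_poe(poe, years):
    """Annual exceedance rate implied by a Poisson probability of
    exceedance `poe` in `years` years: lambda = -ln(1 - P) / T.
    The return period is 1 / lambda."""
    return -math.log(1.0 - poe) / years
```

    This is why hazard maps for 2%-in-50-years ground motions are often described as 2,475-year return period maps.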

  5. Analysis of Flood Hazards for the Materials and Fuels Complex at the Idaho National Laboratory Site

    Energy Technology Data Exchange (ETDEWEB)

    Skaggs, Richard; Breithaupt, Stephen A.; Waichler, Scott R.; Kim, Taeyun; Ward, Duane L.

    2010-11-01

    Researchers at Pacific Northwest National Laboratory conducted a flood hazard analysis for the Materials and Fuels Complex (MFC) site located at the Idaho National Laboratory (INL) site in southeastern Idaho. The general approach for the analysis was to determine the maximum water elevation levels associated with the design-basis flood (DBFL) and compare them to the floor elevations at critical building locations. Two DBFLs for the MFC site were developed using different precipitation inputs: probable maximum precipitation (PMP) and 10,000-year recurrence interval precipitation. Both precipitation inputs were used to drive a watershed runoff model for the surrounding upland basins and the MFC site. Outflows modeled with the Hydrologic Engineering Center's Hydrologic Modeling System (HEC-HMS) were input to the Hydrologic Engineering Center's River Analysis System (HEC-RAS) hydrodynamic flood routing model.
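
    The chain above feeds a rainfall-driven runoff hydrograph from a hydrologic model into a hydraulic routing model. The essential routing idea can be illustrated with a single linear reservoir (storage S = k*Q), which is a deliberately simplified toy, not the HEC-HMS/HEC-RAS computations actually used:

```python
def linear_reservoir_outflow(inflows, k, dt=1.0, s0=0.0):
    """Route an inflow hydrograph through one linear reservoir
    (S = k*Q): at each step, release Q = S/k, then update storage
    with S' = S + (I - Q)*dt.  The outflow peak is attenuated and
    delayed relative to the inflow peak."""
    s, out = s0, []
    for i in inflows:
        q = s / k
        s += (i - q) * dt
        out.append(q)
    return out
```

    Comparing the routed peak stage against critical floor elevations is then the design-basis flood check described in the abstract.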

  6. Uncertainty Analysis of the Potential Hazard of MCCI during Severe Accidents for the CANDU6 Plant

    Directory of Open Access Journals (Sweden)

    Sooyong Park

    2015-01-01

    This paper illustrates the application of a severe accident analysis computer program to the uncertainty analysis of molten corium-concrete interaction (MCCI) phenomena during severe accidents in a CANDU 6-type plant. The potential hazard of MCCI is a failure of the reactor building owing to the possibility of a calandria vault floor melt-through, even if the containment filtered vent system is operated. Meanwhile, MCCI still carries large uncertainties in several phenomena, such as the melt spreading area and the extent of water ingression into a continuous debris layer. The purpose of this study is to evaluate MCCI in the calandria vault floor via an uncertainty analysis using the ISAAC program for the CANDU6.

  7. Numerical Simulation of Aerogasdynamics Processes in A Longwall Panel for Estimation of Spontaneous Combustion Hazards

    Science.gov (United States)

    Meshkov, Sergey; Sidorenko, Andrey

    2017-11-01

    The relevance of solving the problem of endogenous fire safety in seams liable to self-ignition is shown, and the possibilities of numerical methods for studying gas-dynamic processes are considered. Methodical approaches for creating models and carrying out numerical studies of aerogasdynamic processes in the longwall panels of gassy mines are analyzed, and parameters of the gob for longwall mining are considered. The significant influence of the geological and mining conditions of mining operations on the distribution of air streams in longwall panels, and on the effective management of gas emission, is shown. An aerogasdynamic model of longwall panels is presented for further research on the influence of ventilation parameters and gob properties. Results of numerical studies are given, including the distribution of air streams and the concentration fields of methane and oxygen under various ventilation schemes, for the conditions of prospective mines of the Pechora basin and Kuzbass. Recommendations are made for increasing the efficiency of mining coal seams liable to self-ignition, and directions for further research are defined.

  8. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    Science.gov (United States)

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  9. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-05-01

    The need for remote-handled low-level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest-ranked alternative for providing continued, uninterrupted remote-handled LLW disposal capability for remote-handled LLW that is generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  10. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Project

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-10-01

    The need for remote-handled low-level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest-ranked alternative for providing continued, uninterrupted remote-handled LLW disposal capability for remote-handled LLW that is generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  11. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-02-01

    The need for remote-handled low-level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest-ranked alternative for providing continued, uninterrupted remote-handled LLW disposal capability for remote-handled LLW that is generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  12. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    Science.gov (United States)

    Grasso, S.; Maugeri, M.

    After the Summit held in Washington on August 20-22, 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area was formed. The Group has so far identified the following hazards: (1) seismic hazard (including the hazard to historical buildings); (2) hazard linked to the quantity and quality of water; (3) landslide hazard; (4) volcanic hazard. The analysis of such hazards implies the creation and management of data banks, which can only be used if the data are properly georeferenced to allow their combined use; the results obtained must therefore be represented on georeferenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). The south-eastern area of Sicily, called the "Iblea" seismic area, is considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of its civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces, and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. The correct evaluation of seismic hazard is thus strongly affected by risk factors arising from the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on the damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either

  13. The Hazard Analysis Critical Control Point's (HACCP) concept as applied to some chemical, physical and microbiological contaminants of milk on dairy farms. A prototype.

    Science.gov (United States)

    Lievaart, J J; Noordhuizen, J P T M; van Beek, E; van der Beek, C; van Risp, A; Schenkel, J; van Veersen, J

    2005-03-01

    Quality management on dairy farms is becoming more and more important in the areas of animal health, animal welfare and food safety. Monitoring animals, farm conditions and farm records can be extended with risk identification and risk management. The hazard analysis critical control points system is useful as an on-farm strategy to control the product as well as the production process in the areas of animal health, animal welfare and food safety. This article deals in detail with the question of how to develop a qualitative method in which risk is defined as an interaction between probability and impact. Two parts of the production process (milk harvest and treatment of cows) were used as examples of how to apply the hazard analysis critical control points system to chemical, physical and microbiological contaminants of milk, not only by summarizing the critical control points for each area but also by assigning each a precise judgement of probability and impact.
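The probability-impact interaction the abstract describes can be sketched as a simple ordinal scoring scheme. This is a hypothetical illustration: the 3-level scales, multiplication rule and action thresholds below are assumptions, not values from the paper.

```python
# Hypothetical sketch of a qualitative risk score as an interaction
# between probability and impact. Scales and thresholds are invented
# for illustration, not taken from the study.

PROBABILITY = {"rare": 1, "occasional": 2, "frequent": 3}
IMPACT = {"minor": 1, "moderate": 2, "severe": 3}

def risk_score(probability: str, impact: str) -> int:
    """Combine qualitative probability and impact into a risk score."""
    return PROBABILITY[probability] * IMPACT[impact]

def classify(score: int) -> str:
    """Map a score onto an action level (thresholds are assumptions)."""
    if score >= 6:
        return "critical control point"
    if score >= 3:
        return "point of attention"
    return "acceptable"

# Example: a chemical residue hazard during milk harvest
score = risk_score("occasional", "severe")
print(score, classify(score))  # 6 critical control point
```

A checkpoint scored this way only becomes a critical control point when both dimensions are high, which is the judgement the authors propose making explicit for each checkpoint.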

  14. Application of Hazard Analysis and Critical Control Points (HACCP) to the Cultivation Line of Mushroom and Other Cultivated Edible Fungi.

    Science.gov (United States)

    Pardo, José E; de Figueirêdo, Vinícius Reis; Alvarez-Ortí, Manuel; Zied, Diego C; Peñaranda, Jesús A; Dias, Eustáquio Souza; Pardo-Giménez, Arturo

    2013-09-01

    Hazard analysis and critical control points (HACCP) is a preventive system which seeks to ensure food safety and security. It allows product protection and the correction of errors, reduces the costs derived from quality defects and reduces final over-control. In this paper, the system is applied to the cultivation line of mushrooms and other cultivated edible fungi. Of all the stages of the process, only the reception of covering materials (stage 1) and compost (stage 3), pre-fruiting and induction (stage 6) and the harvest (stage 7) have been identified as critical control points (CCPs). The main hazards found were the presence of phytosanitary products that are unauthorized or above the permitted dose (stages 6 and 7), and the presence of pathogenic bacteria (stages 1 and 3) and/or heavy metals (stage 3). Implementing this knowledge will allow any plant dedicated to mushroom or other edible fungi cultivation to self-monitor its production based on the HACCP system.

  15. Towards a probabilistic tsunami hazard analysis for the Gulf of Cadiz

    Science.gov (United States)

    Løvholt, Finn; Urgeles, Roger

    2017-04-01

    Landslides and volcanic flank collapses constitute a significant portion of all known tsunami sources, and they are less constrained geographically than earthquakes because they are not tied to large fault zones. While landslides have historically produced mostly local tsunamis, prehistoric evidence shows that landslides can also produce ocean-wide tsunamis. Because the probability of landslide-induced tsunamis is more difficult to quantify than that of earthquake-induced ones, the landslide tsunami hazard is less well understood. To improve our understanding and the methodologies for dealing with this hazard, we here present results and methods for a preliminary landslide probabilistic tsunami hazard assessment (LPTHA) for submerged landslides in the Gulf of Cadiz. The literature on LPTHA is sparse, and studies have so far fallen into two groups: the first based on observed magnitude-frequency distributions (MFDs), the second based on simplified geotechnical slope stability analysis. We argue that the MFD-based approach is best suited when a sufficient amount of data covering a wide range of volumes is available, although uncertainties in the dating of the landslides often represent a potentially large source of bias. To this end, the relatively rich availability of landslide data in the Gulf of Cadiz makes this area suitable for developing and testing LPTHA models. In the presentation, we will first explore the landslide data and statistics, including spatial factors such as slope-versus-volume relationships, faults, etc. Examples of how random realizations can be used to distribute tsunami sources over the study area will be demonstrated. Furthermore, computational strategies for simulating both the landslide and the tsunami generation in a simplified way will be described. To this end, we use a depth-averaged viscoplastic landslide model coupled to a numerical tsunami model to represent a set of idealized tsunami sources, which are in turn
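The MFD-based random realizations the abstract mentions can be illustrated with inverse-transform sampling of landslide volumes from a truncated power law. The exponent, volume bounds and sample size below are assumptions for illustration, not the Gulf of Cadiz statistics.

```python
import numpy as np

# Illustrative sketch (not from the study): draw random landslide volumes
# from a truncated power-law magnitude-frequency distribution (MFD) by
# inverse-transform sampling. Exponent b and the volume bounds are assumed.

def sample_volumes(n, v_min=1e6, v_max=1e10, b=1.4, rng=None):
    """Sample n volumes (m^3) with survival function ~ v**-b on [v_min, v_max]."""
    rng = np.random.default_rng(rng)
    u = rng.random(n)
    # Inverse CDF of the truncated Pareto distribution
    a, c = v_min ** -b, v_max ** -b
    return (a - u * (a - c)) ** (-1.0 / b)

volumes = sample_volumes(10_000, rng=42)
print(volumes.min(), volumes.max())  # both inside [1e6, 1e10]
```

Each sampled volume would then be placed at a random admissible source location and fed to the coupled landslide-tsunami model to build up the hazard curve.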

  16. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    Science.gov (United States)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short
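One downstream step the abstract implies is fitting a frequency-magnitude distribution to the simulated catalog. A common way to do that is the Aki maximum-likelihood b-value estimator; the toy magnitudes below are invented for illustration, and RSQSIM itself is not reproduced here.

```python
import math

# Sketch: Gutenberg-Richter b-value of a (synthetic) induced-earthquake
# catalog via the Aki maximum-likelihood estimator for continuous
# magnitudes above a completeness threshold m_min. Catalog is invented.

def aki_b_value(magnitudes, m_min):
    """b = log10(e) / (mean(M) - m_min) for events with M >= m_min."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

catalog = [2.1, 2.3, 2.0, 2.8, 2.4, 3.1, 2.2, 2.6, 2.0, 2.5]
print(round(aki_b_value(catalog, m_min=2.0), 2))
```

With a simulated catalog in hand, the fitted a- and b-values feed directly into the pre-injection PSHA in place of an observed recurrence model.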

  17. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory`s hazardous waste management facility

    Energy Technology Data Exchange (ETDEWEB)

    Dionne, B.J.; Morris, S.C. III; Baum, J.W. [and others

    1998-01-01

    The Department of Energy`s (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory`s Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an {open_quotes}As Low as Reasonably Achievable{close_quotes} (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.

  18. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory`s hazardous waste management facility

    Energy Technology Data Exchange (ETDEWEB)

    Dionne, B.J.; Morris, S. III; Baum, J.W. [and others

    1998-03-01

    The Department of Energy`s (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory`s Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an {open_quotes}As Low as Reasonably Achievable{close_quotes} (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique.

  19. Flood hazard energy in urban areas: a new integrated method for flood risk analysis in synthesizing interactions with urban boundary layer

    Science.gov (United States)

    Park, S. Y.; Schmidt, A.

    2015-12-01

    Since urban physical characteristics (such as morphology and land use/land cover) differ from those of natural terrain, altered interactions between the surface and the atmosphere (especially the urban boundary layer, UBL) or between the surface and the subsurface can affect hydrologic behavior and hence flood hazards. In this research we focus on three main aspects of urban surface/atmosphere interactions that affect flood hazard: the urban heat island (UHI) effect, increased surface roughness, and accumulated aerosols. These factors, along with the uncertainties in quantifying them, make risk analysis intractable. In order to perform a risk analysis, the impact of these components needs to be mapped to a variable that can be described mathematically in a risk-analysis framework. We propose defining hazard energy as a surrogate for the combined effect of these three components. Perturbations that can change the hazard energy come from diverse sources in urban areas, and these somewhat disconnected factors can be combined through the energy concept to characterize the impacts of urban areas in risk assessment. This approach synthesizes across hydrological and hydraulic processes in the UBL, land surface, subsurface and sewer network by scrutinizing energy exchange across locations. It can extend our understanding not only of the influence of cities on local climate at rural or larger scales, but also of how cities and nature interact and affect each other.

  20. Analytical Problems Associated with the Analysis of Metals in a Simulated Hazardous Waste

    Science.gov (United States)

    Dunnivant, F. M.

    2002-06-01

    Analysis of samples subject to physical and chemical interferences can greatly enhance the learning experience in instrumental analysis and environmental chemistry laboratories. This article describes a project-based experience in which students analyze simulated hazardous waste samples (carbonated beverages) for calcium by six techniques: (i) flame atomic absorption spectroscopy (FAAS) using external standard calibration, (ii) FAAS using external standard calibration with a releasing agent (Sr), (iii) FAAS using standard addition, (iv) FAAS using standard addition with a releasing agent (Sr), (v) ethylenediaminetetraacetic acid (EDTA) titration, and (vi) Ca-ion-specific electrode. Not surprisingly, students find that these different techniques yield conflicting results and their assignment is to explain their data in the format of a peer-reviewed journal article. Students report that this series of lab experiments is challenging and highly rewarding. Laboratory experiences such as this one should significantly improve the student's ability to analyze problematic samples and interpret experimental data.
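The standard-addition techniques the students use ((iii) and (iv)) reduce to a simple calculation: fit the measured signal against the added standard concentration and extrapolate to the x-intercept. The absorbance values below are invented placeholders, not student data.

```python
# Sketch (hypothetical numbers) of the standard-addition calculation:
# least-squares fit of signal vs. added concentration, then the magnitude
# of the x-intercept gives the analyte concentration in the sample.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Added Ca standard (mg/L) and measured absorbance (invented values)
added = [0.0, 2.0, 4.0, 6.0]
signal = [0.120, 0.220, 0.320, 0.420]

slope, intercept = linear_fit(added, signal)
c_sample = intercept / slope  # |x-intercept| = sample concentration
print(round(c_sample, 2))  # 2.4 mg/L
```

Because the standards are measured in the same matrix as the sample, this approach compensates for the chemical interferences (e.g. phosphate suppression of Ca) that make the external-calibration results disagree.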

  1. [Epidemiologic aspects of a new approach to monitoring hygienic food handling using the hazard analysis critical control points (HACCP) system].

    Science.gov (United States)

    Matyás, Z

    1992-10-01

    The traditional control of food hygiene used hitherto focused on assessing whether sanitary and technological practice is consistent with regulatory requirements, sometimes including details of minor importance. To put it briefly, in the course of the production process there are many check-up points, but only some, or possibly only one, is a critical control point. Moreover, through periodic supervision the hygienist can record only the hygienic and technological state at the time of the inspection. Microbiological examination of final products can reveal only the negative sequelae of microbial processes; it provides no information on the conditions of contamination, nor does it ensure protection against it. For these and other reasons the conclusion is reached that the traditional approach to hygiene supervision is not fully effective and must be replaced by a more active approach focused on controlling factors that threaten wholesomeness during the production process itself. This new approach to the supervision of food hygiene is the HACCP (hazard analysis critical control points) system. The system works rationally, as it is based on analysis of systematically assembled data on the causes and conditions under which food products or meals have made consumers ill. HACCP can be described as prompt, since health or quality problems are revealed immediately after they arise during production or processing and are eliminated at once. The system is also comprehensive, as it covers not only the basic technological process, including the processing or modification of ingredients, but also the handling of the food product after production ends, in particular final culinary processing. The system can be applied to all pathogenic agents transmitted by foods to man, from bacteria and their toxins, viruses, parasites, moulds and mycotoxins, and biotoxins, to contaminants and radionuclides. The system

  2. Hazard Analysis and Safety Requirements for Small Drone Operations: To What Extent Do Popular Drones Embed Safety?

    Science.gov (United States)

    Plioutsias, Anastasios; Karanikas, Nektarios; Chatzimihailidou, Maria Mikela

    2017-08-02

    Currently, published risk analyses for drones refer mainly to commercial systems, use data from civil aviation, and are based on probabilistic approaches without suggesting an inclusive list of hazards and respective requirements. Within this context, this article presents: (1) a set of safety requirements generated from the application of the systems theoretic process analysis (STPA) technique on a generic small drone system; (2) a gap analysis between the set of safety requirements and the ones met by 19 popular drone models; (3) the extent of the differences between those models, their manufacturers, and the countries of origin; and (4) the association of drone prices with the extent they meet the requirements derived by STPA. The application of STPA resulted in 70 safety requirements distributed across the authority, manufacturer, end user, or drone automation levels. A gap analysis showed high dissimilarities regarding the extent to which the 19 drones meet the same safety requirements. Statistical results suggested a positive correlation between drone prices and the extent that the 19 drones studied herein met the safety requirements generated by STPA, and significant differences were identified among the manufacturers. This work complements the existing risk assessment frameworks for small drones, and contributes to the establishment of a commonly endorsed international risk analysis framework. Such a framework will support the development of a holistic and methodologically justified standardization scheme for small drone flights. © 2017 Society for Risk Analysis.
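The price-compliance association the abstract reports can be illustrated with a rank (Spearman) correlation. The prices and requirement counts below are invented placeholders, not the 19 drone models from the study, and the simple ranking function assumes no ties.

```python
# Sketch of a Spearman rank correlation between drone price and the
# number of STPA-derived safety requirements met. Data are invented.

def ranks(values):
    """Rank positions 1..n (no tie handling; assumes distinct values)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(xs, ys):
    """Pearson correlation of the two rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

prices = [300, 450, 800, 1200, 1500]    # EUR, hypothetical
reqs_met = [20, 25, 31, 40, 38]         # of 70 requirements, hypothetical
print(round(spearman(prices, reqs_met), 2))  # 0.9
```

A coefficient near +1 would mirror the study's finding that more expensive models tend to satisfy more of the STPA-generated requirements.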

  3. The development of a real-time monitor for metallic off-gases emitted from hazardous mixed waste processing systems

    Energy Technology Data Exchange (ETDEWEB)

    Carney, K.P. [Argonne National Laboratory, Idaho Falls, ID (United States)

    1994-12-31

    A glow discharge based detection system has been developed for the real-time monitoring of As and Pb in off-gas systems for the plasma hearth hazardous mixed waste processing system. The glow discharge sampling system has been calibrated using hydride generation which has been described in the literature. Arsenic has been detected at levels below 500 ug/m{sup 3} in an argon flow stream. Aspects of the design and operation characteristics of the discharge cell will be presented. Results for the quantification of As and Pb in a combustion flow stream will be presented.

  4. A Sensitivity Study for an Evaluation of Input Parameters Effect on a Preliminary Probabilistic Tsunami Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Hyun-Me; Kim, Min Kyu; Choi, In-Kil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Dong-Hoon [Chonnam National University, Gwangju (Korea, Republic of)

    2014-10-15

    Tsunami hazard analysis has been based on seismic hazard analysis, which is performed using either the deterministic method or the probabilistic method. To account for uncertainties in the hazard analysis, the probabilistic method has been regarded as the more attractive approach; in it, the various parameters and their weights are handled using a logic tree. Because many parameters enter the hazard analysis, their uncertainties should be characterized through sensitivity analysis. To apply probabilistic tsunami hazard analysis, a preliminary study for the Ulchin NPP site had been performed, using information on fault sources published by the Atomic Energy Society of Japan (AESJ). Tsunami propagation was simulated using TSUNAMI{sub 1}.0, developed by the Japan Nuclear Energy Safety Organization (JNES), and the wave parameters were estimated from the simulation results. In this study, a sensitivity analysis for the fault sources selected in the previous studies has been performed. To analyze the effect of the parameters, a sensitivity analysis for the E3 fault source published by AESJ was performed, showing the effects of the recurrence interval, the potential maximum magnitude, and the beta value. The level of annual exceedance probability was affected by the recurrence interval, while wave heights were influenced by the potential maximum magnitude and the beta value. In the future, a sensitivity analysis for all fault sources in the western part of Japan published by AESJ will be performed.
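The logic-tree weighting step mentioned above can be sketched as a weighted sum of per-branch hazard results: each branch fixes one combination of recurrence interval, maximum magnitude and beta, and carries a weight. The branch probabilities and weights below are invented, not the study's values.

```python
# Illustrative logic-tree combination: the mean annual exceedance
# probability (AEP) for a target wave height is the weight-averaged AEP
# over the branches. Values are invented for illustration.

branches = [
    # (weight, annual probability that wave height exceeds the target)
    (0.2, 1.0e-4),
    (0.5, 4.0e-5),
    (0.3, 1.0e-5),
]

assert abs(sum(w for w, _ in branches) - 1.0) < 1e-9  # weights sum to 1
mean_aep = sum(w * p for w, p in branches)
print(f"{mean_aep:.2e}")  # 4.30e-05
```

A sensitivity analysis of the kind described then amounts to perturbing one branch parameter at a time and observing how the branch AEPs, and hence the weighted mean, respond.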

  5. Microbiological quality of food in relation to hazard analysis systems and food hygiene training in UK catering and retail premises.

    Science.gov (United States)

    Little, C L; Lock, D; Barnes, J; Mitchell, R T

    2003-09-01

    A meta-analysis of eight UK food studies was carried out to determine the microbiological quality of food and its relationship with the presence in food businesses of hazard analysis systems and food hygiene training. Of the 19,022 premises visited to collect food samples in these studies between 1997 and 2002, two thirds (66%) were catering premises and one third (34%) were retail premises. Comparison with PHLS Microbiological Guidelines revealed that significantly more ready-to-eat food samples from catering premises (20%; 2,511/12,703) were of unsatisfactory or unacceptable microbiological quality than samples from retail premises (12%; 1,039/8,462). Unsatisfactory samples were also significantly more common in premises where the manager had received no food hygiene training than in premises where the manager had received such training (11% retail, 19% catering), and in premises with no hazard analysis system in place than in premises with a documented hazard analysis system (10% retail, 18% catering). The poorer microbiological quality of samples from catering premises compared with retail premises may reflect differences in management food hygiene training and in the presence of a hazard analysis system. The importance of adequate training for food handlers and their managers as a prerequisite for effective hazard analysis and critical control point (HACCP) based controls is therefore emphasised.
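The catering-versus-retail comparison can be checked directly from the counts reported in the abstract (2,511 of 12,703 catering samples and 1,039 of 8,462 retail samples unsatisfactory or unacceptable) with a 2x2 chi-square test. The significance threshold comment is standard statistics, not a figure from the paper.

```python
# Pearson chi-square on the 2x2 table implied by the abstract's counts,
# computed without continuity correction and without external libraries.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

catering_bad, catering_ok = 2511, 12703 - 2511
retail_bad, retail_ok = 1039, 8462 - 1039

chi2 = chi_square_2x2(catering_bad, catering_ok, retail_bad, retail_ok)
print(round(chi2, 1))  # far above 3.84, the 5% critical value for 1 df
```

The statistic comes out in the hundreds, consistent with the abstract's statement that the catering-retail difference is highly significant.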

  6. Evaluation of the Potential of NASA Multi-satellite Precipitation Analysis in Global Landslide Hazard Assessment

    Science.gov (United States)

    Hong, Yang; Adler, Robert F.; Huffman, George J.

    2007-01-01

    Landslides are one of the most widespread natural hazards on Earth, responsible for thousands of deaths and billions of dollars in property damage every year. In the U.S. alone landslides occur in every state, causing an estimated $2 billion in damage and 25-50 deaths each year; the annual average loss of life from landslide hazards in Japan is 170. The situation is much worse in developing countries and remote mountainous regions due to a lack of financial resources and inadequate disaster management capability. Recently, on Feb 17, 2006, a landslide buried an entire village on the Philippine island of Leyte, with at least 1,800 reported deaths and only 3 houses left standing of the original 300. Intense storms with high-intensity, long-duration rainfall have great potential to trigger rapidly moving landslides, resulting in casualties and property damage across the world. In recent years, through the availability of remotely sensed datasets, it has become possible to conduct global-scale landslide hazard assessment. This paper evaluates the potential of the real-time NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA) system to advance our understanding of, and predictive ability for, rainfall-triggered landslides. Early results show that landslide occurrences are closely associated with the spatial patterns and temporal distribution of rainfall characteristics. In particular, the number of landslide occurrences and the relative importance of rainfall in triggering landslides depend on rainfall attributes (e.g. rainfall climatology, antecedent rainfall accumulation, and the intensity-duration of rainstorms). TMPA precipitation data are available in both real-time and post-real-time versions, which are useful for assessing the location and timing of rainfall-triggered landslide hazards by monitoring landslide-prone areas while they receive heavy rainfall. For the purpose of identifying rainfall-triggered landslides, an empirical global rainfall intensity
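Empirical rainfall intensity-duration thresholds of the kind alluded to at the end of the abstract typically take the power-law form I = c * D^(-beta). The coefficients below are illustrative assumptions, not the values derived in the study.

```python
# Sketch of an intensity-duration (I-D) landslide-triggering threshold:
# a storm plotting above the curve I = c * D**(-beta) is flagged.
# Coefficients c and beta are invented for illustration.

def exceeds_threshold(intensity_mm_h, duration_h, c=15.0, beta=0.6):
    """True if a rainstorm plots above the triggering threshold curve."""
    return intensity_mm_h > c * duration_h ** (-beta)

# A short intense storm vs. a long weak drizzle
print(exceeds_threshold(20.0, 6.0))   # intense storm: above threshold
print(exceeds_threshold(0.5, 48.0))   # drizzle: below threshold
```

Combined with satellite precipitation such as TMPA, a check like this can be run in near real time over landslide-prone cells to nowcast triggering conditions.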

  7. Enforcement Alert: Hazardous Waste Management Practices at Mineral Processing Facilities Under Scrutiny by U.S. EPA; EPA Clarifies 'Bevill Exclusion' Wastes and Establishes Disposal Standards

    Science.gov (United States)

    This is the enforcement alert for Hazardous Waste Management Practices at Mineral Processing Facilities Under Scrutiny by U.S. EPA; EPA Clarifies 'Bevill Exclusion' Wastes and Establishes Disposal Standards

  8. List of Potentially Affected Sources for the Asphalt Processing and Roofing Manufacturing National Emission Standards for Hazardous Air Pollutants (NESHAP) November 2001

    Science.gov (United States)

    This is a November 2001 list of sources identified by EPA as potentially affected by the Asphalt Processing and Roofing Manufacturing National Emission Standards for Hazardous Air Pollutants (NESHAP).

  9. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    Science.gov (United States)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with the goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews of the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork in which participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they had collected. This was the first IRIS institute to combine an instructional short course with fieldwork for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multichannel Analysis of Surface Waves (MASW) analysis at
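The HVSR measurement the participants made reduces, in its simplest form, to the ratio of the averaged horizontal amplitude spectra to the vertical spectrum. The sketch below runs on synthetic noise rather than field data, and omits the windowing, smoothing and averaging over many windows that a real analysis needs.

```python
import numpy as np

# Simplified single-window HVSR: quadratic mean of the two horizontal
# amplitude spectra divided by the vertical spectrum. Synthetic data.

def hvsr(north, east, vertical, fs, nfft=1024):
    """Horizontal-to-vertical spectral ratio for one 3-component window."""
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    n_sp = np.abs(np.fft.rfft(north, nfft))
    e_sp = np.abs(np.fft.rfft(east, nfft))
    v_sp = np.abs(np.fft.rfft(vertical, nfft))
    h_sp = np.sqrt((n_sp ** 2 + e_sp ** 2) / 2.0)  # quadratic mean
    return freqs, h_sp / v_sp

rng = np.random.default_rng(0)
n, e, v = (rng.standard_normal(1024) for _ in range(3))
freqs, ratio = hvsr(n, e, v, fs=100.0)
print(ratio.shape)  # one ratio per frequency bin up to Nyquist
```

In site characterization, the frequency of the dominant HVSR peak is commonly read as the fundamental resonance frequency of the soil column at the site.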

  10. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    Science.gov (United States)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address the specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA required at the design and permitting stage, before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity is the nuisance perceived in nearby communities from small, local felt earthquakes, which generally occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high

  11. EPOS Thematic Core Service Anthropogenic Hazards for SHEER project: maintain, process and manage your project research data

    Science.gov (United States)

    Orlecka-Sikora, Beata; Lasocki, Stanislaw; Staszek, Monika; Olszewska, Dorota; Urban, Pawel; Jaroslawski, Janusz; Cielesta, Szymon; Mirek, Janusz; Wiszniowski, Jan; Picozzi, Matteo; Solaro, Giuseppe; Pringle, Jamie; Toon, Sam; Cesca, Simone; Kuehn, Daniela; Ruigrok, Elmer; Gunning, Andrew; Isherwood, Catherine

    2017-04-01

    The main objective of the "Shale gas exploration and exploitation induced risks - SHEER" project (Horizon 2020, call LCE 16-2014) is to develop a probabilistic methodology to assess and mitigate the short- and long-term environmental risks associated with the exploration and exploitation of shale gas. To this end, the SHEER project makes use of a large amount of heterogeneous data of various types. These data, from different scientific disciplines (e.g. geophysical, geochemical, geological, technological), must be homogenized, harmonized and made accessible to all project participants. This requires developing an over-arching structure for high-level multidisciplinary data integration. The bespoke solution is provided by the Thematic Core Service Anthropogenic Hazards (TCS AH) developed in the framework of the European Plate Observing System Program (https://tcs.ah-epos.eu/, infrastructural projects IS-EPOS, POIG.02.03.00-14-090/13-00 and EPOS IP, H2020-INFRADEV-1-2015-1). TCS AH provides virtual access to a comprehensive, wide-scale and high-quality research infrastructure in the field of induced seismicity and other anthropogenic hazards evoked by the exploration and exploitation of geo-resources. TCS AH is designed as a functional e-research environment that gives researchers the maximum possible freedom for experimentation, providing a virtual laboratory flexible enough to let them create their own workspaces for processing streams. A data-management process promotes the use of the research infrastructure in novel ways, providing access to (i) data gathered in so-called "episodes", comprehensively describing a geophysical process, induced or triggered by human technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment, (ii) problem-oriented, specific services, with particular attention devoted to methods analyzing correlations between technology, geophysical response and resulting hazards, (iii) the

  12. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    Science.gov (United States)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams, etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, comparing alternatives for risk reduction, evaluating the consequences of possible future scenarios on risk levels, and evaluating how different risk reduction alternatives perform under different future scenarios. The SDSS is developed with open source software and follows open standards for code as well as for data formats and service interfaces. The architecture of the system is modular: its parts are loosely coupled, extensible, flexible, web-based, and use standards for interoperability. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to
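The simplest analysis level the SDSS supports, exposure as an overlay of a hazard map and an assets map, can be sketched with two raster grids. The 4x4 grids below are toy data invented for illustration.

```python
import numpy as np

# Minimal exposure overlay: sum asset values in cells that fall inside
# any hazard zone. Grids and class codes are toy data.

hazard = np.array([[0, 0, 1, 1],
                   [0, 1, 1, 2],
                   [1, 1, 2, 2],
                   [1, 2, 2, 2]])          # 0 = none, 1 = moderate, 2 = high

assets = np.array([[0, 3, 0, 0],
                   [5, 0, 2, 0],
                   [0, 4, 0, 1],
                   [0, 0, 6, 0]])          # asset value per cell

exposed = np.where(hazard > 0, assets, 0)  # asset value inside hazard zones
print(int(exposed.sum()), "of", int(assets.sum()), "units exposed")
```

More complex analysis levels layer vulnerability curves and event probabilities on top of this overlay, turning exposure into expected loss.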

  13. Analysis of Operational Hazards and Safety Requirements for Traffic Aware Strategic Aircrew Requests (TASAR)

    Science.gov (United States)

    Koczo, Stefan, Jr.

    2013-01-01

    Safety analyses of the Traffic Aware Strategic Aircrew Requests (TASAR) Electronic Flight Bag (EFB) application are provided to establish its Failure Effects Classification which affects certification and operational approval requirements. TASAR was developed by NASA Langley Research Center to offer flight path improvement opportunities to the pilot during flight for operational benefits (e.g., reduced fuel, flight time). TASAR, using own-ship and network-enabled information concerning the flight and its environment, including weather and Air Traffic Control (ATC) system constraints, provides recommended improvements to the flight trajectory that the pilot can choose to request via Change Requests to ATC for revised clearance. This study reviews the Change Request process of requesting updates to the current clearance, examines the intended function of TASAR, and utilizes two safety assessment methods to establish the Failure Effects Classification of TASAR. Considerable attention has been given in this report to the identification of operational hazards potentially associated with TASAR.

  14. Tetrachloroethene recovery and hazard reduction of spent powders from dry cleaning process.

    Science.gov (United States)

    Petrucci, Elisabetta; Scarsella, Marco; De Filippis, Paolo; Di Palma, Luca

    2015-04-01

    Dry cleaning facilities using perchloroethylene produce a solid waste consisting of spent filtering powders with a high content of residual perchloroethylene, together with dyes and non-volatile residues. Untreated spent powders, classified as hazardous waste, cannot be disposed of in landfill, and incineration represents the only viable alternative. In this study, together with a full characterisation of the waste, the removal and recovery of the residual perchloroethylene by means of different heat treatments was investigated. In particular, distillation tests and stripping tests with air and steam were carried out, and the effectiveness of the treatments was evaluated by quantifying the residual perchloroethylene in the treated samples. The results show that the spent filtering powders contained about 25 wt% perchloroethylene and that the maximum perchloroethylene recovery, approximately 98% after only 50 minutes, was obtained by steam stripping. However, this treatment produced a liquid mixture containing perchloroethylene and a solid waste that required further washing with boiling water to decrease the residual organic content below the eligibility criteria for landfill disposal. © The Author(s) 2015.
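If the steam-stripping removal is approximated as a first-order process (an assumption for illustration, not stated in the abstract), the reported ~98% recovery after 50 minutes implies a rate constant that can be back-calculated:

```python
import math

# Hedged sketch: first-order removal kinetics fitted to the single data point
# reported above (98% removed after 50 minutes). The first-order assumption
# is illustrative, not from the study.

def first_order_residual(k, t):
    """Residual fraction remaining after time t (min) for rate constant k (1/min)."""
    return math.exp(-k * t)

k = -math.log(1 - 0.98) / 50.0        # ~0.078 per minute
print(round(first_order_residual(k, 50), 3))  # 0.02, i.e. 98% removed
```

Such a fitted constant would let one estimate, under the same assumption, the stripping time needed to reach a stricter residual target.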

  15. Multifaceted processes controlling the distribution of hazardous compounds in the spontaneous combustion of coal and the effect of these compounds on human health.

    Science.gov (United States)

    Oliveira, Marcos L S; da Boit, Kátia; Pacheco, Fernanda; Teixeira, Elba C; Schneider, Ismael L; Crissien, Tito J; Pinto, Diana C; Oyaga, Rafael M; Silva, Luis F O

    2018-01-01

    Pollution generated by hazardous elements and persistent organic compounds from coal fires is a major environmental concern because of its toxicity, persistence, and potential risk to human health. Coal mining activities are growing in the state of Santa Catarina, Brazil, yet their collateral impacts on health and the economy remain to be analyzed. In addition, the environment also endures collateral damage, as waste materials directly influence the coal by-products applied in civil construction. This study aimed to establish the relationships between the composition, morphology, and structural characteristics of ultrafine particles emitted by coal mine fires. In Brazil, the spontaneous-combustion products, Al-Ca-Fe-Mg-Si coal spheres, are rich in chalcophile elements (As, Cd, Cu, Hg, Pb, Sb, Se, Sn, and Zn), lithophile elements (Ce, Hf, In, La, Th, and U), and siderophile elements (Co, Cr, Mo, Fe, Ni, and V). The relationship between nanomineralogy and the production of hazardous elements, as analyzed by advanced methods for the geochemical analysis of different materials, was also delineated. The information obtained by the mineral substance analysis may improve understanding of coal-fire development and help assess the response of a particular coal in different combustion processes. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Experimental analysis of armouring process

    Science.gov (United States)

    Lamberti, Alberto; Paris, Ennio

    Preliminary results from an experimental investigation of armouring processes are presented. In particular, the development and formation of the armour layer under different steady flow conditions has been analyzed in terms of grain size variations and the sediment transport rate associated with each size fraction.

  17. Analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    To enable the development of automated support for design, a challenge is to model and analyse dynamics of design processes in a formal manner. This paper contributes a declarative, logical approach for specification of dynamic properties of design processes, supported by a formal temporal

  18. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J. [VTT Automation, Espoo (Finland)

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classifies the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant. 62 refs. The thesis also includes five previous publications by the author.
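One operational decision of the kind named above, choosing a surveillance test interval for a standby safety system, can be sketched with the textbook approximation for mean unavailability, q(T) = λT/2 + τ/T (failure rate λ, test downtime τ). This is a generic illustration, not the thesis's marked-point-process formulation, and the numbers are hypothetical.

```python
import math

# Hedged sketch: optimizing a surveillance test interval T for a standby system.
# Mean unavailability q(T) = lambda*T/2 (undetected failures) + tau/T (test
# downtime); setting dq/dT = 0 gives T* = sqrt(2*tau/lambda). Numbers hypothetical.

def mean_unavailability(failure_rate, test_downtime, interval):
    return failure_rate * interval / 2.0 + test_downtime / interval

def optimal_interval(failure_rate, test_downtime):
    return math.sqrt(2.0 * test_downtime / failure_rate)

lam, tau = 1e-4, 2.0                 # failures/hour, hours of downtime per test
t_star = optimal_interval(lam, tau)
print(round(t_star))                 # 200 hours between tests
```

The square-root trade-off captures why testing too often (downtime dominates) and too rarely (undetected failures dominate) both raise unavailability.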

  19. Geocryological hazards and destructive exogenic geological processes on lines of linear constructions of tundra and forest-tundra zones of Western Siberia

    Science.gov (United States)

    Ospennikov, E. N.; Hilimonjuk, V. Z.

    2009-04-01

    construction. The estimation was carried out on the basis of an analysis covering the features of geocryological process development in natural conditions for particular types of geocryological settings; the character of the failures caused by construction and operation of roads; and the hazard severity of destructive processes for particular geotechnical road systems. As a result, three categories of territory were specified on the basis of hazard severity: very complex, complex, and simple. Very complex territories are characterized by mean annual ground temperatures close to 0 °C, the presence of massive pore ice and ice wedges, widespread highly ice-rich ground, and active modern development of thermokarst, thermal erosion, and frost heave processes. Simple territories are distinguished by low mean annual ground temperatures (below -4 °C), the absence of massive ground ice, and weak development of geocryological processes. All other territories, which represent a potential hazard under adverse environmental change, are classified as complex.

  20. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  1. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  2. Multi-hazard response analysis of a 5MW offshore wind turbine

    DEFF Research Database (Denmark)

    Katsanos, Evangelos; Sanz, A. Arrospide; Georgakis, Christos T.

    2017-01-01

    Wind energy already has a dominant role on the scene of clean energy production. Well-promising markets, like China, India, Korea and Latin America, are the fields of expansion for new wind turbines mainly installed in the offshore environment, where wind, wave and earthquake loads threaten the structural integrity and reliability of these energy infrastructures. Along these lines, a multi-hazard environment was considered herein and the structural performance of a 5 MW offshore wind turbine was assessed through time domain analysis. A fully integrated model of the offshore structure consisting...

  3. LIFE CYCLE ASSESSMENT AND HAZARD ANALYSIS AND CRITICAL CONTROL POINTS TO THE PASTA PRODUCT

    Directory of Open Access Journals (Sweden)

    Yulexis Meneses Linares

    2016-10-01

    Full Text Available The objective of this work is to combine the Life Cycle Assessment (LCA) and Hazard Analysis and Critical Control Points (HACCP) methodologies to determine the risks that food production poses to human health and the ecosystem. The environmental performance of pasta production at the “Marta Abreu” Pasta Factory of Cienfuegos is assessed. The critical control points, determined by biological hazards (fungi and pests) and physical hazards (wood, paper, thread and ferromagnetic particles), were the raw materials (flour, semolina and their mixtures) and their handling and extraction. Resources are the most affected damage category due to the consumption of fossil fuels.

  4. Three-dimensional displays for natural hazards analysis, using classified Landsat Thematic Mapper digital data and large-scale digital elevation models

    Science.gov (United States)

    Butler, David R.; Walsh, Stephen J.; Brown, Daniel G.

    1991-01-01

    Methods are described for using Landsat Thematic Mapper digital data and digital elevation models for the display of natural hazard sites in a mountainous region of northwestern Montana, USA. Hazard zones can be easily identified on the three-dimensional images. Proximity of facilities such as highways and building locations to hazard sites can also be easily displayed. A temporal sequence of Landsat TM (or similar) satellite data sets could also be used to display landscape changes associated with dynamic natural hazard processes.

  5. Causal Analysis of the Inadvertent Contact with an Uncontrolled Electrical Hazardous Energy Source (120 Volts AC)

    Energy Technology Data Exchange (ETDEWEB)

    David E. James; Dennis E. Raunig; Sean S. Cunningham

    2014-10-01

    with in HFEF-OI-3165 placed the HPT in proximity to an unmitigated hazard, directly resulting in this event. Contributing Factor A3B3C04/A4B5C04: - Knowledge-Based Error, LTA Review Based on Assumption That Process Will Not Change - Change Management LTA, Risks/consequences associated with change not adequately reviewed/assessed. Prior to the pneumatic system being out of service, the probe and meter were not being source checked together. The source check issue was identified and addressed during the period when the system was out of service. The corrective actions for this issue resulted in the requirement that a meter and probe be source checked together as they are intended to be used. This changed the activity and required an HPT, weekly when in use, to remove and install the probe from above HBV-7 to meet the requirement of LRD 15001 Part 5 Article 551.5. The risks and consequences associated with this change were not adequately reviewed or assessed. Failure to identify the hazard associated with this change directly contributed to this event.

  6. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Science.gov (United States)

    2010-07-01

    ... newly assigned process, shall be trained in an overview of the process and in the operating procedures... operating procedures. (2) Refresher training. Refresher training shall be provided at least every three... shall assure that each contract employee follows the safety rules of the facility including the safe...

  7. 29 CFR 1926.64 - Process safety management of highly hazardous chemicals.

    Science.gov (United States)

    2010-07-01

    ... involved in operating a newly assigned process, shall be trained in an overview of the process and in the... provided at least every three years, and more often if necessary, to each employee involved in operating a... contract employer shall assure that each contract employee follows the safety rules of the facility...

  8. Physicochemically modified peat by thermal and oxidation processes as an active material for purification of wastewaters from certain hazardous pollutants

    Directory of Open Access Journals (Sweden)

    Purenović Jelena M.

    2017-01-01

    Full Text Available The physicochemical modification of peat through thermal and oxidation processes was carried out in order to obtain a new, inexpensive, active material for the purification of different types of waters. During the modification, surface chemical compounds of the Shilov type were formed. Batch adsorption properties and the suitability of physicochemically modified peat (PCMP) for odor removal were tested in aqueous solutions of H2S and colloidal sulphur. Additionally, PCMP was tested in the removal of As(V), which is a hazardous ingredient of contaminated waters. Possible mechanisms of pollutant binding include interactions which lead to the formation of adducts and clathrates. All these processes are elucidated in detail. The results showed that the obtained material can be used for the removal of sulphide, colloidal sulphur and As(V) from different types of waters. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. III 45012]

  9. A study of the impact of moist-heat and dry-heat treatment processes on hazardous trace elements migration in food waste.

    Science.gov (United States)

    Chen, Ting; Jin, Yiying; Qiu, Xiaopeng; Chen, Xin

    2015-03-01

    Using laboratory experiments, the authors investigated the impact of dry-heat and moist-heat treatment processes on hazardous trace elements (As, Hg, Cd, Cr, and Pb) in food waste and explored their distribution patterns across three waste components: oil, aqueous, and solid. The results indicated an insignificant reduction of hazardous trace elements in heat-treated waste (0.61-14.29% after moist-heat treatment and 4.53-12.25% after dry-heat treatment) and a significant reduction (except for Hg without external addition) after centrifugal dehydration (P < 0.05). After heat treatment, over 90% of the hazardous trace elements in the waste were detected in the aqueous and solid components, whereas only a trace amount was detected in the oil component. The heat treatment process did not significantly reduce the concentration of hazardous trace elements in food waste, but a solid-aqueous separation process such as centrifugal dehydration could reduce the risk considerably. Finally, combined with solid-liquid separation technology, dry-heat treatment is superior to moist-heat treatment for the removal of externally added water-soluble ionic hazardous trace elements. In summary, the insignificant reduction of hazardous trace elements in heat-treated waste shows that heat treatment alone does not considerably reduce trace element contamination in food waste, whereas a solid-aqueous separation process such as centrifugal dehydration, combined with dry-heat rather than moist-heat treatment, could reduce the risk significantly.

  10. Hazards and hazard combinations relevant for the safety of nuclear power plants

    Science.gov (United States)

    Decker, Kurt; Brinkman, Hans; Raimond, Emmanuel

    2017-04-01

    exclusive (e.g., extremely high air temperature and surface ice). Our dataset further provides information on hazard combinations which are more likely to occur than by random coincidence. 577 correlations between individual hazards were identified by expert opinion and are shown in a cross-correlation chart. The combinations discriminate between: (1) causally connected hazards (cause-effect relations), where one hazard (e.g., coastal erosion) may be caused by another hazard (e.g., storm surge), or where one hazard (e.g., high wind) is a prerequisite for a correlated hazard (e.g., storm surge); the identified causal links are not commutative. (2) Associated hazards ("contemporary" events), which are likely to occur at the same time due to a common root cause (e.g., a cold front of a meteorological low-pressure area, which leads to a drop in air pressure, high wind, thunderstorm, lightning, heavy rain and hail); the root cause may not necessarily be regarded as a hazard by itself. The hazard list and the hazard correlation chart may serve as a starting point for the hazard analysis process for nuclear installations in Level 1 PSA as outlined by IAEA (2010), for the definition of the design basis for nuclear reactors, and for the assessment of design extension conditions as required by WENRA-RHWG (2014). They may further be helpful for the identification of hazard combinations and hazard cascades which threaten other critical infrastructure. References: Decker, K. & Brinkman, H., 2017. List of external hazards to be considered in extended PSA. Report No. ASAMPSA_E/WP21/D21.2/2017-41 - IRSN/PSN-RES/SAG/2017-00011. IAEA, 2010. Development and Application of Level 1 Probabilistic Safety Assessment for Nuclear Power Plants. Safety Guide No. SSG-3, Vienna. http://www-pub.iaea.org/books/ WENRA-RHWG, 2014. WENRA Safety Reference Levels for Existing Reactors. Update in Relation to Lessons Learned from TEPCO Fukushima Dai-Ichi Accident. http://www.wenra.org/publications/

  11. Sensor data fusion and image processing for object and hazard detection; Sensordatenfusion und Bildverarbeitung zur Objekt- und Gefahrenerkennung

    Energy Technology Data Exchange (ETDEWEB)

    Catala Prat, Alvaro

    2011-03-15

    The present work deals with the automatic detection and tracking of objects in driving situations and the derivation of potential hazards. To this end, the data of a laser scanner and a camera are processed and fused. The work provides new methods in the area of immediate environment detection and modeling, and thus creates a basis for innovative driver assistance and automation systems. The aim of such systems is to improve driving safety, traffic efficiency, and driving comfort. The methods introduced in this work can be classified into different abstraction levels. At sensor data level, the data is prepared and reduced; here the focus is on the detection of driving oscillations from camera images and on the detection of the driving corridor from the data of different sensors, used later as the primary area of interest. At object level, the central data fusion is done. High reliability, availability, and sensor independence are achieved by choosing a competitive object fusion approach. As input to the data fusion, object observations are extracted from camera and laser scanner data. These are then fused for object detection and tracking, addressing aspects such as robustness against manoeuvring objects, measurement outliers, split and merge effects, and partial object observability. At application level, early detection of potential hazards is addressed. A statistical approach was chosen and developed in which hazards are handled as atypical situations. This general and expandable approach is demonstrated on the detected object data. The presented strategies and methods have been developed systematically, implemented in a modular prototype, and tested with simulated and real data. The test results of the data fusion system show a gain in data quality and robustness, with which an improvement of driver assistance and automation systems can be achieved. (orig.)

  12. Classification of working processes to facilitate occupational hazard coding on industrial trawlers

    DEFF Research Database (Denmark)

    Jensen, Olaf C; Stage, Søren; Noer, Preben

    2003-01-01

    BACKGROUND: Commercial fishing is an extremely dangerous economic activity. In order to describe the risks involved more accurately, a specific injury coding based on the working process was developed. METHOD: Observation on six different types of vessels was conducted and allowed a description and classification of the principal working processes on all kinds of vessels, and a detailed classification for industrial trawlers. In industrial trawling, fish are landed for processing purposes, for example for the production of fish oil and fish meal. The classification was subsequently used to code...

  13. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution intends to achieve high-quality generation of event logs in the system. The performed analyses allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application to clinical and administrative decision making for the management of hospital activities.
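The first step of many process-mining algorithms applied to event logs like those above is building a directly-follows relation over the activities of each case. A minimal sketch, with a hypothetical hospital log (the activity names are invented, not from the HIS described):

```python
from collections import Counter

# Hedged sketch: derive the directly-follows relation from an event log,
# the basic input of discovery algorithms such as the alpha miner or
# heuristics miner. The log below is hypothetical.

def directly_follows(event_log):
    """event_log: {case_id: [activity, ...] in time order} -> Counter of (a, b) pairs."""
    df = Counter()
    for trace in event_log.values():
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

log = {
    "patient-1": ["register", "triage", "consult", "discharge"],
    "patient-2": ["register", "triage", "lab", "consult", "discharge"],
}
dfg = directly_follows(log)
print(dfg[("register", "triage")])  # 2
```

The resulting counts can be rendered as a directly-follows graph, which already reveals deviations between the designed and the executed hospital process.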

  14. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  15. 75 FR 28227 - National Emission Standards for Hazardous Air Pollutants: Gold Mine Ore Processing and Production...

    Science.gov (United States)

    2010-05-20

    ... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION... Mine Ore Processing and Production Area Source Category and Addition to Source Category List for Standards AGENCY: Environmental Protection Agency (EPA). ACTION: Extension of public comment period. SUMMARY...

  16. Geomorphological hazard analysis along the Egyptian Red Sea coast between Safaga and Quseir

    Directory of Open Access Journals (Sweden)

    A. M. Youssef

    2009-05-01

    Full Text Available Geomorphological hazard assessment is an important component of natural hazard risk assessment. This paper presents GIS-based geomorphological hazard mapping in the Red Sea area between Safaga and Quseir, Egypt. This includes the integration of published geological, geomorphological, and other data into GIS, and the generation of new map products, combining governmental concerns and legal restrictions. Detailed geomorphological hazard maps for flooding zones and earth movement potential, especially along the roads and railways, have been prepared. Furthermore, the paper illustrates the application of vulnerability maps addressing the effects of hazards on urban areas, tourist villages, industrial facilities, quarries, and road networks. These maps can help to initiate appropriate measures to mitigate the probable hazards in the area.

  17. VEGETATION COVER ANALYSIS OF HAZARDOUS WASTE SITES IN UTAH AND ARIZONA USING HYPERSPECTRAL REMOTE SENSING

    Energy Technology Data Exchange (ETDEWEB)

    Serrato, M.; Jungho, I.; Jensen, J.; Jensen, R.; Gladden, J.; Waugh, J.

    2012-01-17

    Remote sensing technology can provide a cost-effective tool for monitoring hazardous waste sites. This study investigated the usability of HyMap airborne hyperspectral remote sensing data (126 bands at 2.3 x 2.3 m spatial resolution) to characterize the vegetation at U.S. Department of Energy uranium processing sites near Monticello, Utah and Monument Valley, Arizona. Grass and shrub species were mixed on an engineered disposal cell cover at the Monticello site, while shrub species were dominant in the phytoremediation plantings at the Monument Valley site. The specific objectives of this study were to: (1) estimate leaf-area-index (LAI) of the vegetation using three different methods (i.e., vegetation indices, red-edge positioning (REP), and machine learning regression trees), and (2) map the vegetation cover using machine learning decision trees based on either the scaled reflectance data or mixture tuned matched filtering (MTMF)-derived metrics and vegetation indices. Regression trees resulted in the best calibration performance of LAI estimation (R² > 0.80). The use of REPs failed to accurately predict LAI (R² < 0.2). The use of the MTMF-derived metrics (matched filter scores and infeasibility) and a range of vegetation indices in decision trees improved the vegetation mapping when compared to the decision tree classification using just the scaled reflectance. Results suggest that hyperspectral imagery is useful for characterizing biophysical characteristics (LAI) and vegetation cover on capped hazardous waste sites. However, it is believed that the vegetation mapping would benefit from the use of higher spatial resolution hyperspectral data due to the small size of many of the vegetation patches (<1 m) found on the sites.
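The vegetation-index approach mentioned above starts from a band-ratio index such as NDVI, computed per pixel from red and near-infrared reflectance and then regressed against field-measured LAI. A minimal sketch; the reflectance values are hypothetical, not HyMap data:

```python
# Hedged sketch: the Normalized Difference Vegetation Index (NDVI),
# one of the vegetation indices usable for LAI estimation.
# Pixel reflectances (0-1 scale) are hypothetical.

def ndvi(red, nir):
    """NDVI = (NIR - red) / (NIR + red); near 1 for dense vegetation, near 0 for bare soil."""
    return (nir - red) / (nir + red)

pixels = [(0.05, 0.45), (0.10, 0.30), (0.20, 0.25)]
for red, nir in pixels:
    print(round(ndvi(red, nir), 2))   # 0.8, 0.5, 0.11
```

In the study's workflow such index values would feed a regression tree calibrated against ground LAI measurements; hyperspectral data simply offer many narrow bands from which to build such indices.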

  18. Site Specific Probabilistic Seismic Hazard and Risk Analysis for Surrounding Communities of The Geysers Geothermal Development Area

    Science.gov (United States)

    Miah, M.; Hutchings, L. J.; Savy, J. B.

    2014-12-01

    We conduct a probabilistic seismic hazard and risk analysis of induced and tectonic earthquakes for a 50 km radius area centered on The Geysers, California, for the next ten years. We calculate hazard with both a conventional and a physics-based approach, estimate site-specific hazard, convert hazard to risk of nuisance and damage to structures per year, and map the risk. For the conventional PSHA we assume the past ten years are indicative of the hazard for the next ten years. The two approaches produced similar results, which is surprising since they were calculated by completely independent means. The conventional approach used the actual catalog of the past ten years of earthquakes to estimate the hazard for the next ten years, while the physics-based approach used geotechnical modeling to calculate the catalog for the next ten years. Similarly, the conventional PSHA utilized attenuation relations from past earthquakes recorded at The Geysers to translate the ground motion from the source to the site, while the physics-based approach calculated ground motion from simulation of actual earthquake rupture. Finally, the source of the earthquakes was the actual source for the conventional PSHA, while we assumed random fractures for the physics-based approach. From all this, we consider the calculation of the conventional approach, based on actual data, to validate the physics-based approach.
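The conventional-PSHA step described above, using a ten-year catalog as indicative of the next ten years, reduces in its simplest form to a Poisson exceedance calculation. A hedged sketch with hypothetical event counts (not the actual Geysers catalog):

```python
import math

# Hedged sketch: Poisson exceedance probability from an observed catalog.
# Count events above a ground-motion (or magnitude) threshold in the catalog
# window, take the annual rate, and convert it to a probability of at least
# one exceedance in the forecast window. Counts are hypothetical.

def poisson_exceedance_probability(n_events, catalog_years, forecast_years):
    rate = n_events / catalog_years           # events per year above threshold
    return 1.0 - math.exp(-rate * forecast_years)

# e.g. 5 events above the threshold in the past 10 years
p = poisson_exceedance_probability(5, 10.0, 10.0)
print(round(p, 3))  # 0.993
```

A full PSHA integrates this over magnitudes, sources, and attenuation relations, but the Poisson conversion from rate to exceedance probability is the common final step.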

  19. Vulnerability analysis of Landslide hazard area: Case study of South Korea

    Science.gov (United States)

    Oh, Chaeyeon; Jun, Kyewon; Kim, Younghwan

    2017-04-01

    Recently, sedimentation disasters such as landslides and debris flows have occurred frequently in mountainous areas due to climate change. A scientific analysis of landslide risk areas, along with the collection and analysis of a variety of spatial information, is critical for minimizing damage in the event of mountainous disasters such as landslides and debris flows. We carried out a case study of selected areas in Inje, Gangwon province, which suffered serious landslides due to flash floods caused by Typhoon Ewiniar in 2006. Landslide and debris flow locations in the study area were identified from interpretation of airborne imagery and field surveys. We used GIS to construct a spatial information database integrating the data required for a comprehensive analysis of landslide risk areas, including geography, hydrology, pedology, and forestry. Furthermore, this study evaluates the slope stability of the affected areas using SINMAP (Stability Index Mapping), analyzes spatial data that are highly correlated with the mapped landslide areas using the likelihood ratio, and applies the weight-of-evidence technique, in which weight values (W+ and W-) are calculated for each factor. We then analyzed the spatial data significantly correlated with landslide occurrence, predicted the mountainous areas with elevated landslide risk that are vulnerable to disasters, and generated the hazard map using GIS. Acknowledgments: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (No. NRF-2014R1A1A3050495).
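The weight-of-evidence values named above (W+ and W-) are log-ratios of the conditional probabilities of an evidence layer given landslide and non-landslide cells. A hedged sketch for one binary layer (e.g. "steep slope"); the cell counts are hypothetical, not from the Inje study:

```python
import math

# Hedged sketch: weights of evidence for one binary evidence layer against
# mapped landslide cells. W+ applies where the evidence is present, W- where
# it is absent; positive W+ means the layer favors landslides. Counts are
# hypothetical.

def weights_of_evidence(n_evidence_and_slide, n_evidence, n_slide, n_total):
    """Return (W+, W-) for a binary evidence layer over a cell grid."""
    p_e_given_s = n_evidence_and_slide / n_slide                      # P(E | slide)
    p_e_given_ns = (n_evidence - n_evidence_and_slide) / (n_total - n_slide)  # P(E | no slide)
    w_plus = math.log(p_e_given_s / p_e_given_ns)
    w_minus = math.log((1 - p_e_given_s) / (1 - p_e_given_ns))
    return w_plus, w_minus

w_plus, w_minus = weights_of_evidence(80, 2000, 100, 100000)
print(round(w_plus, 2), round(w_minus, 2))  # 3.73 -1.59
```

Summing the appropriate weight from each layer per cell (plus the prior log-odds) yields the posterior landslide susceptibility used to build the hazard map.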

  20. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
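    The ensemble step (4) can be illustrated with a toy example: each alternative model implementation contributes one estimate of the probability of exceeding a tsunami-intensity threshold, and the ensemble summarizes the spread. The member probabilities and credibility weights below are hypothetical:

```python
# Toy ensemble: exceedance probabilities from four alternative model
# implementations, combined with hypothetical credibility weights.
def ensemble_stats(probs, weights):
    """Weighted mean plus the min/max spread across ensemble members."""
    mean = sum(p * w for p, w in zip(probs, weights)) / sum(weights)
    return mean, min(probs), max(probs)

member_probs = [0.012, 0.020, 0.015, 0.031]  # P(exceedance) per model
credibility  = [0.4, 0.3, 0.2, 0.1]
mean_p, lo, hi = ensemble_stats(member_probs, credibility)
```

    The weighted mean represents the aleatory result, while the member-to-member spread (here min to max; percentiles in a real assessment) represents the epistemic uncertainty.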

  1. Group Process: A Systematic Analysis.

    Science.gov (United States)

    Roark, Albert E.; Radl, Myrna C.

    1984-01-01

    Identifies components of group process and describes leader functions. Discusses personal elements, focus of interaction/psychological distance, group development, content, quality of interaction, and self-reflective/meaning attribution, illustrated by a case study of a group of persons (N=5) arrested for drunk driving. (JAC)

  2. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the process of software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  3. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  4. 76 FR 9449 - National Emission Standards for Hazardous Air Pollutants: Gold Mine Ore Processing and Production...

    Science.gov (United States)

    2011-02-17

    ...: Examples of Category NAICS code \\1\\ regulated entities Industry: Gold Ore Mining 212221 Establishments... that EPA does not have the authority to list gold mining processing and production as a source category... emissions, and that gold mining was not included on that list in 1998. In addition, the commenters said that...

  5. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the process of software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  6. Safety and Hazard Analysis for the Coherent/Acculite Laser Based Sandia Remote Sensing System (Trailer B70).

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2005-09-01

    A laser safety and hazard analysis is presented for the Coherent® driven Acculite® laser central to the Sandia Remote Sensing System (SRSS). The analysis is based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, Safe Use of Lasers, and the 2000 version of ANSI Standard Z136.6, Safe Use of Lasers Outdoors. The trailer (B70) based SRSS laser system is a mobile platform used to perform laser interaction experiments and tests at various national test sites. It is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band) and calculate the appropriate minimum Optical Density (ODmin) necessary for the laser safety eyewear used by authorized personnel. Also, the Nominal Ocular Hazard Distance (NOHD) and the Extended Ocular Hazard Distance (EOHD) are calculated in order to protect unauthorized personnel who may have violated the boundaries of the control area and entered the laser's NHZ during testing outside the trailer.
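    The two quantities at the heart of such an analysis can be sketched as follows. The beam parameters below are hypothetical, and the formulas are simplified small-source CW approximations in the spirit of ANSI Z136.1; a real safety analysis must use the standard's full method:

```python
import math

def od_min(worst_case_irradiance, mpe):
    """Minimum eyewear optical density so transmitted irradiance <= MPE."""
    return math.log10(worst_case_irradiance / mpe)

def nohd_cm(power_w, mpe_w_cm2, divergence_rad, aperture_cm=0.0):
    """Nominal Ocular Hazard Distance (cm): distance at which the expanding
    beam's irradiance falls to the MPE (simplified small-source CW formula)."""
    return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2)) - aperture_cm) / divergence_rad

# Hypothetical CW beam: 1 W, MPE of 2.55e-3 W/cm^2, 1 mrad divergence,
# worst-case intrabeam irradiance of 1 W/cm^2.
od = od_min(1.0, 2.55e-3)
distance_m = nohd_cm(1.0, 2.55e-3, 1e-3) / 100.0
```

    Eyewear with optical density at or above `od` brings the worst-case exposure under the MPE; beyond `distance_m`, unaided intrabeam viewing falls below the MPE.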

  7. Hazard, Vulnerability and Capacity Mapping for Landslides Risk Analysis using Geographic Information System (GIS)

    Science.gov (United States)

    Sari, D. A. P.; Innaqa, S.; Safrilah

    2017-06-01

    This research analyzed the levels of disaster risk in the Citeureup sub-District, Bogor Regency, West Java, based on its potential hazard, vulnerability and capacity, using maps to represent the results; Miles and Huberman analytical techniques were then used to analyze the qualitative interviews. The analysis conducted in this study is based on the concept of disaster risk by Wisner. The results show that the Citeureup sub-District has a medium-low risk of landslides. Of the 14 villages, three have a moderate risk level, namely Hambalang, Tajur, and Tangkil, or 49.58% of the total land area. Eleven villages have a low level of risk, namely Pasir Mukti, Sanja, Tarikolot, Gunung Sari, Puspasari, East Karang Asem, Citeureup, Leuwinutug, Sukahati, West Karang Asem and Puspanegara, or 48.68% of the total land area; high-risk areas account for only around 1.74%, all within Hambalang village. The analysis using Geographic Information System (GIS) shows that areas with a high hazard potential do not necessarily have a high level of risk: the capacity of the community plays an important role in minimizing the risk of a region. The disaster risk reduction strategy is carried out by creating safe conditions, which intensifies the disaster risk reduction movement.
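    Wisner's risk concept underlying the analysis is often summarized as R = H × V / C: risk grows with hazard and vulnerability and shrinks with capacity. A minimal sketch, with hypothetical ordinal scores and arbitrary classification cut-offs (the village names are from the abstract, but their scores here are invented):

```python
def disaster_risk(hazard, vulnerability, capacity):
    """Wisner-style score R = H * V / C, all on hypothetical 1-5 ordinal scales."""
    return hazard * vulnerability / capacity

def classify(r):
    """Arbitrary cut-offs chosen for illustration only."""
    return "high" if r >= 6 else "moderate" if r >= 3 else "low"

# Hypothetical (hazard, vulnerability, capacity) scores for three villages.
villages = {"Hambalang": (4, 4, 2), "Citeureup": (2, 3, 3), "Tangkil": (3, 3, 2)}
levels = {name: classify(disaster_risk(*s)) for name, s in villages.items()}
```

    The division by capacity is what lets a high-hazard village come out at moderate or low risk, which is the abstract's central observation.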

  8. Site specific seismic hazard analysis and determination of response spectra of Kolkata for maximum considered earthquake

    Science.gov (United States)

    Shiuly, Amit; Sahu, R. B.; Mandal, Saroj

    2017-06-01

    This paper presents site specific seismic hazard analysis of Kolkata city, former capital of India and present capital of state West Bengal, situated on the world’s largest delta island, Bengal basin. For this purpose, peak ground acceleration (PGA) for a maximum considered earthquake (MCE) at bedrock level has been estimated using an artificial neural network (ANN) based attenuation relationship developed on the basis of synthetic ground motion data for the region. Using the PGA corresponding to the MCE, a spectrum compatible acceleration time history at bedrock level has been generated by using a wavelet based computer program, WAVEGEN. This spectrum compatible time history at bedrock level has been converted to the same at surface level using SHAKE2000 for 144 borehole locations in the study region. Using the predicted values of PGA and PGV at the surface, corresponding contours for the region have been drawn. For the MCE, the PGA at bedrock level of Kolkata city has been obtained as 0.184 g, while that at the surface level varies from 0.22 g to 0.37 g. Finally, Kolkata has been subdivided into eight seismic subzones, and for each subzone a response spectrum equation has been derived using polynomial regression analysis. This will be very helpful for structural and geotechnical engineers to design safe and economical earthquake resistant structures.
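    The final subzone step, fitting a response-spectrum equation by polynomial regression, can be sketched as follows; the period/spectral-acceleration pairs below are hypothetical, not the paper's data:

```python
import numpy as np

# Hypothetical (period s, spectral acceleration g) pairs for one subzone.
periods = np.array([0.1, 0.2, 0.3, 0.5, 0.7, 1.0, 1.5, 2.0])
sa      = np.array([0.55, 0.80, 0.85, 0.70, 0.55, 0.40, 0.25, 0.18])

# Cubic polynomial regression Sa(T) ~ c3*T^3 + c2*T^2 + c1*T + c0.
coeffs = np.polyfit(periods, sa, deg=3)
sa_fit = np.polyval(coeffs, periods)
rmse = float(np.sqrt(np.mean((sa_fit - sa) ** 2)))
```

    The fitted coefficients give a designer a closed-form Sa(T) for the subzone instead of a tabulated spectrum; the degree of the polynomial is a modelling choice.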

  9. Using SAR and GPS for Hazard Management and Response: Progress and Examples from the Advanced Rapid Imaging and Analysis (ARIA) Project

    Science.gov (United States)

    Owen, S. E.; Simons, M.; Hua, H.; Yun, S. H.; Agram, P. S.; Milillo, P.; Sacco, G. F.; Webb, F.; Rosen, P. A.; Lundgren, P.; Milillo, G.; Manipon, G. J. M.; Moore, A. W.; Liu, Z.; Polet, J.; Cruz, J.

    2014-12-01

    ARIA is a joint JPL/Caltech project to automate synthetic aperture radar (SAR) and GPS imaging capabilities for scientific understanding, hazard response, and societal benefit. We have built a prototype SAR and GPS data system that forms the foundation for hazard monitoring and response capability, as well as providing imaging capabilities important for science studies. Together, InSAR and GPS have the ability to capture surface deformation in high spatial and temporal resolution. For earthquakes, this deformation provides information that is complementary to seismic data on location, geometry and magnitude of earthquakes. Accurate location information is critical for understanding the regions affected by damaging shaking. Regular surface deformation measurements from SAR and GPS are useful for monitoring changes related to many processes that are important for hazard and resource management such as volcanic deformation, groundwater withdrawal, and landsliding. Observations of SAR coherence change have a demonstrated use for damage assessment for hazards such as earthquakes, tsunamis, hurricanes, and volcanic eruptions. These damage assessment maps can be made from imagery taken day or night and are not affected by clouds, making them valuable complements to optical imagery. The coherence change caused by the damage from hazards (building collapse, flooding, ash fall) is also detectable with intelligent algorithms, allowing for rapid generation of damage assessment maps over large areas at fine resolution, down to the spatial scale of single family homes. We will present the progress and results we have made on automating the analysis of SAR data for hazard monitoring and response using data from the Italian Space Agency's (ASI) COSMO-SkyMed constellation of X-band SAR satellites. Since the beginning of our project with ASI, our team has imaged deformation and coherence change caused by many natural hazard events around the world. We will present progress on our

  10. Detoxification and Disposal of Hazardous Organic Chemicals by Processing in Supercritical Water

    Science.gov (United States)

    1985-11-06

    Table 10 - Results at Critical Conditions (647 K; 22 MN/m²) ... Table 11 - Results of Reforming Maple Sawdust at 377 °C ... The Advisory Board and the technical staff of the Stellite Division of the Cabot Corporation have provided useful advice and technical support to MODAR ... business objectives are consistent with good technical practice. A unit at this scale will allow us to pin down the fluid-mechanical aspects of the process.

  11. Selection of Steady-State Process Simulation Software to Optimize Treatment of Radioactive and Hazardous Waste

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, Todd Travis; Barnes, Charles Marshall; Lauerhass, Lance; Taylor, Dean Dalton

    2001-06-01

    The process used for selecting a steady-state process simulator under conditions of high uncertainty and limited time is described. Multiple waste forms, treatment ambiguity, and the uniqueness of both the waste chemistries and alternative treatment technologies result in a large set of potential technical requirements that no commercial simulator can totally satisfy. The aim of the selection process was two-fold. First, determine the steady-state simulation software that best, albeit not completely, satisfies the requirements envelope. And second, determine if the best is good enough to justify the cost. Twelve simulators were investigated with varying degrees of scrutiny. The candidate list was narrowed to three final contenders: ASPEN Plus 10.2, PRO/II 5.11, and CHEMCAD 5.1.0. It was concluded from "road tests" that ASPEN Plus appears to satisfy the project's technical requirements the best and is worth acquiring. The final software decisions provide flexibility: they involve annual rather than multi-year licensing, and they include periodic re-assessment.

  12. Selection of Steady-State Process Simulation Software to Optimize Treatment of Radioactive and Hazardous Waste

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, T. T.; Barnes, C. M.; Lauerhass, L.; Taylor, D. D.

    2001-06-01

    The process used for selecting a steady-state process simulator under conditions of high uncertainty and limited time is described. Multiple waste forms, treatment ambiguity, and the uniqueness of both the waste chemistries and alternative treatment technologies result in a large set of potential technical requirements that no commercial simulator can totally satisfy. The aim of the selection process was two-fold. First, determine the steady-state simulation software that best, albeit not completely, satisfies the requirements envelope. And second, determine if the best is good enough to justify the cost. Twelve simulators were investigated with varying degrees of scrutiny. The candidate list was narrowed to three final contenders: ASPEN Plus 10.2, PRO/II 5.11, and CHEMCAD 5.1.0. It was concluded from "road tests" that ASPEN Plus appears to satisfy the project's technical requirements the best and is worth acquiring. The final software decisions provide flexibility: they involve annual rather than multi-year licensing, and they include periodic re-assessment.

  13. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xueqin [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); School of Social Development and Public Policy, Beijing Normal University, Beijing 100875 (China); Li, Ning [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); Yuan, Shuai, E-mail: syuan@nmemc.org.cn [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); Xu, Ning; Shi, Wenqin; Chen, Weibin [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China)

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism, and comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Based on the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, while the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • The 2D function allows better fitting results at the lower tail of hazard factors. • The three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to reality.
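    The copula machinery behind such joint return periods can be sketched for the two-dimensional Frank copula; the quantile levels, the dependence parameter theta, and the choice of an "AND" (both factors exceed) return period are illustrative assumptions:

```python
import math

def frank_copula(u, v, theta):
    """Bivariate Frank copula C(u, v; theta), theta != 0."""
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    den = math.exp(-theta) - 1.0
    return -math.log(1.0 + num / den) / theta

def joint_return_period_and(u, v, theta, mu_years):
    """'AND' joint return period: both hazard factors exceed their quantiles.
    mu_years is the mean interarrival time of events."""
    p_both_exceed = 1.0 - u - v + frank_copula(u, v, theta)
    return mu_years / p_both_exceed

mu = 24.0 / 91.0  # ~0.26 yr between severe dust storms over the study period
T = joint_return_period_and(0.9, 0.9, theta=5.0, mu_years=mu)
```

    In practice theta is fitted to the observed dependence between the hazard factors, and the three-dimensional case uses the trivariate extension of the same construction.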

  14. An Introduction to the Analysis of Paired Hazard Rates in Studies of the Family.

    Science.gov (United States)

    Smith, Ken R.; McClean, Sally I.

    1998-01-01

    Hazard rate models are described, and selected techniques are used to analyze paired hazard rates when event times are right censored. The techniques are illustrated by looking at mortality patterns in husbands and wives. Recently developed measures and models are introduced. The advantages and disadvantages of the measures are discussed.…

  15. Formation of hazardous inorganic by-products during electrolysis of seawater as a disinfection process for desalination.

    Science.gov (United States)

    Oh, Byung Soo; Oh, Sang Guen; Hwang, Youn Young; Yu, Hye-Weon; Kang, Joon-Wun; Kim, In S

    2010-11-01

    From our previous study, an electrochemical process was determined to be a promising tool for disinfection in a seawater desalination system, but an investigation of the production of several hazardous by-products was still required. In this study, a more intensive exploration of the formation patterns of perchlorate and bromate during the electrolysis of seawater was conducted. In addition, the rejection efficiencies of the targeted by-products by membrane processes (microfiltration and seawater reverse osmosis) were investigated to determine the concentrations remaining in the final product of a membrane-based seawater desalination system for the production of drinking water. On electrolysis of seawater, perchlorate did not pose any problem due to the low concentrations formed, but bromate was produced at a much higher level, resulting in a critical limitation on the application of the electrochemical process to the desalination of seawater. Even though the formed bromate was rejected via microfiltration and reverse osmosis during the 1st and 2nd passes, the residual concentration was a few orders of magnitude higher than the USEPA regulation. Consequently, it was concluded that the application of the electrochemical process to seawater desalination cannot be recommended without the control of bromate. Copyright © 2010 Elsevier B.V. All rights reserved.
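    The residual-concentration argument reduces to multiplying the feed concentration by (1 - rejection) for each pass in series; the feed level and per-pass rejection rates below are hypothetical, chosen only to show the calculation against the USEPA bromate limit of 10 µg/L:

```python
# Hypothetical bromate mass balance through serial membrane passes.
# Feed concentration and per-pass rejections are illustrative only.
USEPA_BROMATE_LIMIT_UG_L = 10.0

def residual_after_passes(feed_ug_l, rejections):
    """Concentration remaining after applying each pass's rejection in series."""
    c = feed_ug_l
    for r in rejections:
        c *= (1.0 - r)
    return c

residual = residual_after_passes(1.0e5, [0.90, 0.90])  # 1st- and 2nd-pass RO
ratio_to_limit = residual / USEPA_BROMATE_LIMIT_UG_L   # how far over the limit
```

    Even with 90% rejection per pass, a sufficiently high feed concentration leaves a residual well above the regulatory limit, which is the abstract's conclusion.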

  16. Hazardous Waste

    Science.gov (United States)

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  17. System and method for integrating hazard-based decision making tools and processes

    Science.gov (United States)

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  18. Direct analysis in real time mass spectrometry for the rapid identification of four highly hazardous pesticides in agrochemicals.

    Science.gov (United States)

    Wang, Lei; Zhao, Pengyue; Zhang, Fengzu; Li, Yanjie; Pan, Canping

    2012-08-30

    Direct analysis in real time (DART) is a new ion source technique, conducted in the open air under ambient conditions, applied to the rapid and direct analysis of any material (gases, liquids, and solids) with minimal or no sample preparation. In order to take advantage of the capacity of DART mass spectrometry for the real-time analysis of hazardous ingredients in commercial agrochemicals, a pilot study of rapid qualitative determination of hazardous pesticides was performed. Highly hazardous pesticides were identified by DART ionization coupled to a single-quadrupole mass spectrometer (DART-MS). Acetonitrile was chosen for dissolving samples prior to the analysis. Samples were analyzed by this technique in as little as 5 s. Phorate, carbofuran, ethoprophos and fipronil could be detected directly in commercial agrochemicals. The ionization-related parameters (DART temperature, grid voltage and MS fragment) of these compounds were optimized to obtain high response. Isotope patterns were taken into consideration for qualitative identification. Relative standard deviations (RSDs, n = 5) of 2.3-15.0% were obtained by measuring the relative abundance of selected isotopes. This study showed that DART-MS technology is able to qualitatively determine the presence of highly hazardous pesticides in commercial pesticide formulations. It is suggested that this technology be applied for routine monitoring in the market. Copyright © 2012 John Wiley & Sons, Ltd.
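    The reported precision figure is a relative standard deviation of replicate isotope-abundance measurements; a minimal sketch with hypothetical replicate values:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (sample stdev / mean) in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical relative isotope abundances from five replicate DART-MS runs.
replicates = [0.302, 0.310, 0.295, 0.318, 0.305]
rsd = rsd_percent(replicates)
```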

  19. 78 FR 64425 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-10-29

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Food for Animals; Public... risk-based preventive controls for animal food. This proposed rule is one of several proposed rules... system. Among other things, FSMA requires FDA to issue regulations requiring preventive controls for...

  20. Flood Hazard Assessment of the coastal lowland in the Kujukuri Plain of Chiba Prefecture, Japan, using GIS and multicriteria decision analysis

    Science.gov (United States)

    CHEN, Huali; Tokunaga, Tomochika; Ito, Yuka; Sawamukai, Marie

    2014-05-01

    The flood hazard map was obtained using an algorithm that combines factors in weighted linear combinations. The assignment of the weight/rank values and their analysis were realized by applying the Analytic Hierarchy Process (AHP) method. This study is preliminary work to investigate the flood hazard in the Kujukuri Plain. Flood hazard maps for other years will be analyzed to investigate the temporal change of the flood hazard area, and more data will be collected and added to improve the assessment.
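    The AHP step can be sketched as follows: factor priorities are derived from a pairwise comparison matrix (here via the common geometric-mean approximation rather than a full eigenvector computation) and then used in a weighted linear combination. The factors, comparison values, and normalized factor scores are hypothetical:

```python
import math

def ahp_weights(matrix):
    """Priorities from a pairwise comparison matrix via the geometric-mean
    approximation of the principal eigenvector (normalized to sum to 1)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical Saaty-scale comparisons of three flood factors:
# elevation vs. land use vs. distance to river.
pairwise = [
    [1.0,     3.0, 5.0],
    [1.0/3.0, 1.0, 2.0],
    [1.0/5.0, 0.5, 1.0],
]
w = ahp_weights(pairwise)
# Weighted linear combination of (hypothetical) normalized factor scores.
hazard_score = sum(wi * xi for wi, xi in zip(w, [0.8, 0.5, 0.3]))
```

    Repeating the combination cell-by-cell over the GIS layers yields the hazard map; a real application would also check the matrix's consistency ratio.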

  1. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies, and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating ground response spectra and probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability distribution functions (CPDF) and probability distribution functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared to the results from the different input parameter spaces.

  2. Multifractal Analysis in Mining Microseismicity and its Application to Seismic Hazard Analysis in Mines

    Science.gov (United States)

    Pasten, D.; Comte, D.; Vallejos, J.

    2013-05-01

    During the last decades, several authors have shown that the spatial distribution of earthquakes follows multifractal laws, and that the most interesting behavior is the decrease of the fractal dimensions before the occurrence of a large earthquake, and also before its main aftershocks. A multifractal analysis was applied to over 55,920 microseismic events recorded from January 2006 to January 2009 at Creighton mine, Canada. In order to work with a catalogue complete in magnitude, the data associated with the linear part of the Gutenberg-Richter law, with magnitudes greater than -1.5, were used. The multifractal analysis was performed on the microseismic data, considering as significant those earthquakes with magnitude MW ≥ 1.0. A moving window containing a constant number of events was used in order to guarantee precise estimation of the fractal dimensions. After different trials, we chose 200 events as the number of data points in each window, with two consecutive windows shifted by 20 events. The complete data set was separated into six sections, and the multifractal analysis was applied to each section of 9,320 events. The analysis of each section shows that there is a systematic decrease of the fractal dimension (Dq) with time before the occurrence of a rockburst or natural event with magnitude MW ≥ 1.0, as is observed in the seismic sequences of large earthquakes. This methodology was repeated for minimum magnitudes MW ≥ 1.5 and MW ≥ 2.0, obtaining the same results. The best result was obtained using MW ≥ 2.0, with a correct-prediction rate varying between fifty and eighty percent. The results show the possibility of systematically using the Dq parameter to detect the next rockburst or natural event in the studied mine. This project has been financially supported by FONDECyT Grant No. 3120237 (D.P.).
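    A two-scale box-counting estimate of the generalized dimension Dq conveys the idea (real analyses fit the partition function over many scales); the epicenter set below is a hypothetical uniform grid, for which Dq should come out close to 2:

```python
import math
from collections import Counter

def partition_log_sum(points, q, eps):
    """ln( sum_i p_i^q ) over occupied boxes of side eps, for 2-D points."""
    boxes = Counter((int(x / eps), int(y / eps)) for x, y in points)
    n = len(points)
    return math.log(sum((c / n) ** q for c in boxes.values()))

def dq_two_scale(points, q, eps1, eps2):
    """Two-scale estimate of D_q = [S(eps1) - S(eps2)] / [(q - 1) ln(eps1/eps2)]."""
    s1 = partition_log_sum(points, q, eps1)
    s2 = partition_log_sum(points, q, eps2)
    return (s1 - s2) / ((q - 1) * math.log(eps1 / eps2))

# Hypothetical epicenter set: a uniform 20x20 grid (offset from box edges),
# for which the correlation dimension D2 should be ~2.
pts = [((i + 0.5) / 20.0, (j + 0.5) / 20.0) for i in range(20) for j in range(20)]
d2 = dq_two_scale(pts, q=2.0, eps1=0.25, eps2=0.05)
```

    In the mining application, Dq computed over each 200-event moving window is tracked through time, and a sustained drop is the precursor signal.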

  3. When wanting to change is not enough: automatic appetitive processes moderate the effects of a brief alcohol intervention in hazardous-drinking college students

    OpenAIRE

    Brian D Ostafin; Palfai, Tibor P.

    2012-01-01

    Background Research indicates that brief motivational interventions are efficacious treatments for hazardous drinking. Little is known, however, about the psychological processes that may moderate intervention success. Based on growing evidence that drinking behavior may be influenced by automatic (nonvolitional) mental processes, the current study examined whether automatic alcohol-approach associations moderated the effect of a brief motivational intervention. Specifically, we examined whet...

  4. Using Websites to Convey Scientific Uncertainties for Volcanic Processes and Potential Hazards

    Science.gov (United States)

    Venezky, D. Y.; Lowenstern, J. B.; Hill, D. P.

    2005-12-01

    The Yellowstone Volcano Observatory (YVO) and Long Valley Observatory (LVO) websites have greatly increased the public's awareness and access to information about scientific uncertainties for volcanic processes by communicating at multiple levels of understanding and varied levels of detail. Our websites serve a broad audience ranging from visitors unaware of the calderas, to lay volcano enthusiasts, to scientists, federal agencies, and emergency managers. Both Yellowstone and Long Valley are highly visited tourist attractions with histories of caldera-forming eruptions large enough to alter global climate temporarily. Although it is much more likely that future activity would be on a small scale at either volcano, we are constantly posed questions about low-probability, high-impact events such as the caldera-forming eruption depicted in the recent BBC/Discovery movie, "Supervolcano". YVO and LVO website objectives include: providing monitoring data, explaining the likelihood of future events, summarizing research results, helping media provide reliable information, and expanding on information presented by the media. Providing detailed current information is a crucial website component as the public often searches online to augment information gained from often cryptic pronouncements by the media. In May 2005, for example, YVO saw an order of magnitude increase in page requests on the day MSNBC ran the misleading headline, "Yellowstone eruption threat high." The headline referred not to current events but a general rating of Yellowstone as one of 37 "high threat" volcanoes in the USGS National Volcano Early Warning System report. As websites become a more dominant source of information, we continuously revise our communication plans to make the most of this evolving medium. 
Because the internet gives equal access to all information providers, we find ourselves competing with various "doomsday" websites that sensationalize and distort the current understanding of

  5. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity.

    Science.gov (United States)

    Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.
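    The ranking step can be illustrated with a toy scoring function; the endpoint weights, rates, and material names below are hypothetical, not the EZ Metric's actual definition:

```python
# Toy EZ-Metric-style score: mortality plus averaged sublethal (morbidity)
# endpoint rates; the weights and data are hypothetical, not the published metric.
def ez_score(mortality, morbidity_rates, w_mort=1.0, w_morb=0.5):
    return w_mort * mortality + w_morb * sum(morbidity_rates) / len(morbidity_rates)

scores = {
    "NP_A_cationic_surface": ez_score(0.40, [0.30, 0.50, 0.20]),
    "NP_A_neutral_surface":  ez_score(0.05, [0.10, 0.05, 0.00]),
}
hazard_ranking = sorted(scores, key=scores.get, reverse=True)  # worst first
```

    Ranking materials that share a core but differ in surface chemistry, as here, is how the study isolates surface chemistry as a driver of toxicity.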

  6. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Harper, Bryan [Oregon State University (United States); Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan [Pacific Northwest National Laboratory (United States); Tang, Kaizhi [Intelligent Automation, Inc. (United States); Heredia-Langner, Alejandro [Pacific Northwest National Laboratory (United States); Lins, Roberto [CPqAM, Oswaldo Cruz Foundation, FIOCRUZ-PE (Brazil); Harper, Stacey, E-mail: stacey.harper@oregonstate.edu [Oregon State University (United States)

    2015-06-15

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships.

  7. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    Energy Technology Data Exchange (ETDEWEB)

    (NOEMAIL), R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order, DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map Hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL (1993) and EPRI (1988) PSHAs. The
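The logic-tree weighting quoted above (0.6 / 0.3 / 0.1 across the three GMAMs) amounts to a weighted mean of the alternative hazard estimates at each ground-motion level. A minimal sketch of that combination step, with invented annual exceedance frequencies (the real SRS hazard curves are not reproduced here):

```python
# Sketch of combining alternate ground-motion attenuation model (GMAM)
# hazard estimates with logic-tree weights, as in the SRS assessment.
# The exceedance frequencies below are invented for illustration.
weights = {"EPRI2004": 0.6, "USGS2002": 0.3, "Silva2004": 0.1}

# Hypothetical annual frequency of exceeding 0.2 g at hard rock, per model
exceedance = {"EPRI2004": 1.0e-4, "USGS2002": 2.0e-4, "Silva2004": 4.0e-4}

# Weighted-mean annual exceedance frequency across the logic-tree branches
combined = sum(weights[m] * exceedance[m] for m in weights)
print(combined)  # -> 1.6e-04
```

Repeating this at each spectral acceleration level yields the combined uniform hazard spectrum the report compares against the WSRC (1997) design spectra.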

  8. Correlation analysis of heat flux and fire behaviour and hazards of polycrystalline silicon photovoltaic panels

    Science.gov (United States)

    Ju, Xiaoyu; Zhou, Xiaodong; Peng, Fei; Wu, Zhibo; Lai, Dimeng; Hu, Yue; Yang, Lizhong

    2017-05-01

    This work aims to gain a better understanding of the fire behaviour and hazards of PV panels under different radiant heat fluxes. Cone calorimeter tests were used to simulate the situations in which the front and back surfaces, respectively, are exposed to heat flux in a fire. Comparison of ignition time, mass loss rate and heat release rate shows that the back-up condition is more hazardous than the face-up condition. Meanwhile, three key parameters (flashover propensity, total heat release, and fractional effective dose, FED) were introduced to quantitatively illustrate the fire hazards of a PV panel.

  9. The Resource Hazards Model for the Critical Infrastructure of the State Emergency Management Process

    Directory of Open Access Journals (Sweden)

    Ostrowska Teresa

    2014-08-01

    This paper presents an investigation of the relevant factors related to the construction of a resource model designed to be useful in the management processes of the operation of critical infrastructure (CI) for state emergencies. The genesis of the research lay in the perceived need for effective protection of multidimensional CI methodologies, and it was influenced by the nature of the physical characteristics of the available resources. It was necessary to establish a clear structure and well defined objectives and to assess the functional and structural resources required, as well as the potential relational susceptibilities deriving from a number of possible threats and the possible seriousness of a specific range of incidents and their possible consequences. The interdependence of CI stocks is shown by the use of tables of resource classes. The dynamics of the interaction of CI resources are modeled by examining how clusters of potential risks can at any given time create a class of compounds related to susceptibilities and threats to the resources. As a result, the model can be used to conduct multi-dimensional risk calculations for crisis management CI resource configurations.

  10. Testing to fulfill HACCP (Hazard Analysis Critical Control Points) requirements: principles and examples.

    Science.gov (United States)

    Gardner, I A

    1997-12-01

    On-farm HACCP (hazard analysis critical control points) monitoring requires cost-effective, yet accurate and reproducible tests that can determine the status of cows, milk, and the dairy environment. Tests need to be field-validated, and their limitations need to be established so that appropriate screening strategies can be initiated and test results can be rationally interpreted. For infections and residues of low prevalence, tests or testing strategies that are highly specific help to minimize false-positive results and excessive costs to the dairy industry. The determination of the numbers of samples to be tested in HACCP monitoring programs depends on the specific purpose of the test and the likely prevalence of the agent or residue at the critical control point. The absence of positive samples from a herd test should not be interpreted as freedom from a particular agent or residue unless the entire herd has been tested with a test that is 100% sensitive. The current lack of field-validated tests for most of the chemical and infectious agents of concern makes it difficult to ensure that the stated goals of HACCP programs are consistently achieved.
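The herd-testing caveat above follows from standard detection-probability arithmetic: with a perfectly sensitive and specific test and a large herd, the number of samples needed to detect at least one positive with confidence 1 − α at design prevalence p is n = ln(α) / ln(1 − p). The sketch below is a generic illustration of that rule, not a calculation from the paper:

```python
import math

def samples_for_detection(prevalence, confidence=0.95):
    """Samples needed to observe >= 1 positive with the given confidence,
    assuming a test with perfect sensitivity and specificity and a
    large (effectively infinite) population."""
    alpha = 1.0 - confidence                      # acceptable miss probability
    # P(all n samples negative) = (1 - p)^n <= alpha  =>  solve for n
    return math.ceil(math.log(alpha) / math.log(1.0 - prevalence))

print(samples_for_detection(0.05))  # -> 59 samples at 5% design prevalence
print(samples_for_detection(0.01))  # -> 299 samples at 1% design prevalence
```

The steep growth of n as prevalence falls is exactly why the abstract warns that a negative herd test cannot be read as freedom from a low-prevalence agent unless the whole herd is tested with a 100%-sensitive test.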

  11. HACCP (Hazard Analysis Critical Control Points): is it coming to the dairy?

    Science.gov (United States)

    Cullor, J S

    1997-12-01

    The risks and consequences of foodborne and waterborne pathogens are coming to the forefront of public health concerns, and strong pressure is being applied on agriculture for immediate implementation of on-farm controls. The FDA is considering HACCP (Hazard Analysis Critical Control Points) as the new foundation for revision of the US Food Safety Assurance Program because HACCP is considered to be a science-based, systematic approach to the prevention of food safety problems. In addition, the implementation of HACCP principles permits more government oversight through requirements for standard operating procedures and additional systems for keeping records, places primary responsibility for ensuring food safety on the food manufacturer or distributor, and may assist US food companies in competing more effectively in the world market. With the HACCP-based program in place, a government investigator should be able to determine and evaluate both current and past conditions that are critical to ensuring the safety of the food produced by the facility. When this policy is brought to the production unit, the impact for producers and veterinarians will be substantial.

  12. Landslide hazard analysis for pipelines: The case of the Simonette river crossing

    Energy Technology Data Exchange (ETDEWEB)

    Grivas, D.A.; Schultz, B.C. [Arista International, Inc., Niskayuna, NY (United States); O'Neil, G.; Rizkalla, M. [NOVA Gas Transmission Ltd., Calgary, Alberta (Canada); McGuffey, V.C.

    1995-12-31

    The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in the ground movement modes and the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement which may result in cumulative displacements over the pipeline design life (30--40 years) that are in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with important factors that control slope stability. Availability of information ranges from relatively well studied, instrumented installations to cases where data are limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data are available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.

  13. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control

    Science.gov (United States)

    Coulbert, C. D.

    1978-01-01

    A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.
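Computationally, the RERC envelope described above is just the pointwise minimum of the five constraint curves, with the controlling constraint changing as the fire develops. A hedged sketch with invented constraint curves (the real criteria are calculated from enclosure, fuel, and ventilation data):

```python
# Sketch of the relative energy release criteria (RERC) envelope: the
# bounding heat release rate at each time is the minimum of the five
# constraint curves (all curves below are invented placeholders, kW).
def rerc_envelope(t, criteria):
    """Return (bound, controlling_name) at time t for a dict mapping
    criterion name -> f(t), each giving a heat-release-rate bound."""
    values = {name: f(t) for name, f in criteria.items()}
    name = min(values, key=values.get)  # lowest curve controls the fire
    return values[name], name

criteria = {
    "flame_spread":  lambda t: 10.0 * t,                    # grows with spread
    "fuel_surface":  lambda t: 500.0,                       # burning-area cap
    "ventilation":   lambda t: 300.0,                       # opening-limited
    "enclosure_vol": lambda t: 800.0,
    "total_fuel":    lambda t: max(0.0, 900.0 - 5.0 * t),   # fuel burnout
}

bound, controlling = rerc_envelope(20.0, criteria)
print(controlling)  # -> flame_spread (governs early, before ventilation limit)
```

Evaluating the envelope over a time grid reproduces the graphic plot the abstract describes: early growth is flame-spread-controlled, the mid-fire plateau is ventilation-controlled, and decay is fuel-load-controlled.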

  14. Hazard analysis and possibilities for preventing botulism originating from meat products

    Directory of Open Access Journals (Sweden)

    Vasilev Dragan

    2008-01-01

    This paper presents important data on the bacterium Clostridium botulinum, the occurrence of botulism, hazard analysis, and the possibilities for preventing botulism. Proteolytic strains of C. botulinum Group I, whose spores are resistant to heat, create toxins predominantly in cans containing slightly sour food items when the spores are not inactivated in the course of sterilization. Non-proteolytic strains of Group II are more sensitive to high temperatures, but they have the ability to grow and create toxins at low temperatures. Type E most often creates a toxin in vacuum-packed smoked fish, and the non-proteolytic strain type B in dried hams and certain pasteurized meat products. The following play an important role in the prevention of botulism: reducing to a minimum the contamination of meat with spores of clostridia; implementing good hygiene measures and production practice during the slaughter of animals; the inactivation of spores of C. botulinum during sterilization (F > 3); and, in dried hams and pasteurized products, the prevention of bacterial growth and toxin formation by maintaining low temperatures during production and storage, as well as the correct use of substances that inhibit the multiplication of bacteria and the production of toxins (nitrites, table salt, etc.).

  15. LIFE CYCLE ANALYSIS OF HAZARDOUS WASTE AND RECYCLABLE ORIGIN OF HOUSEHOLD

    Directory of Open Access Journals (Sweden)

    Patrícia Raquel da Silva Sottoriva

    2011-09-01

    As the sustainable development that society aims for rests on economic, social and environmental factors, it can be said that the environmental crisis has as its component factors natural resources, population and pollution. To reduce the pressure that human activities place on the environment, it is necessary to know the production process, its inputs and outputs, in order to reduce potential problems such as waste and to identify opportunities for system optimization. In this context the life cycle of hazardous and recyclable household waste items was investigated to identify possibilities for reducing impact on supply chains. It was found that the raw materials most used by the paper industry are pine and eucalyptus plantations, and some industries also use sugar cane. From the growing process until the paper is industrialized, a long time is required: eucalyptus should be cut between 5 and 7 years, while pine requires 10 to 12 years. After use, paper can and should be recycled. Recycling 1 ton of paper can save 29.2 m³ of water, 3.51 MWh of electricity and 22 trees when compared to traditional production processes. The cultivation of trees also contributes to carbon capture and sequestration: eucalyptus at ages 2, 4, 6 and 8 years fixes 11.12, 18.55, 80.91 and 97.86 t/ha, respectively. Paper can also be composted due to its biodegradability. Metal, glass and plastics are not biodegradable, being inorganic in nature, and need to be recycled or reused. Recycling 1 ton of plastic saves 5.3 MWh of electricity and 500 kg of oil. Even with the environmental, social and economic gains of recycling compared to traditional processes, in Brazil the recycling percentages for paper, glass and PET bottles are less than 60%, while the recycling of aluminum and steel cans exceeds 90%. Lamps and batteries are materials whose inadequate disposal provides for contamination to the

  16. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    Science.gov (United States)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has also seen the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  17. Hazards assessment for the INEL Landfill Complex

    Energy Technology Data Exchange (ETDEWEB)

    Knudsen, J.K.; Calley, M.B.

    1994-02-01

    This report documents the hazards assessment for the INEL Landfill Complex (LC) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes the hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding the LC, the buildings and structures at the LC, and the processes that are used at the LC are described in this report. All hazardous materials, both radiological and nonradiological, at the LC were identified and screened against threshold quantities according to DOE Order 5500.3A guidance. Asbestos at the Asbestos Pit was the only hazardous material that exceeded its specified threshold quantity. However, the type of asbestos received and the packaging practices used are believed to limit the potential for an airborne release of asbestos fibers. Therefore, in accordance with DOE Order 5500.3A guidance, no further hazardous material characterization or analysis was required for this hazards assessment.
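The screening step described above (comparing each material's inventory against its threshold quantity) can be sketched in a few lines. All names and quantities below are invented placeholders, not values from the INEL assessment or from DOE Order 5500.3A:

```python
# Sketch of hazards-assessment screening: flag any material whose on-site
# inventory exceeds its screening threshold (all values are invented).
inventory = {"asbestos": 1200.0, "nitric acid": 40.0}   # kg on site (hypothetical)
thresholds = {"asbestos": 100.0, "nitric acid": 500.0}  # screening levels (hypothetical)

# Materials exceeding their threshold need further characterization/analysis
exceeds = [m for m in inventory if inventory[m] > thresholds[m]]
print(exceeds)  # -> ['asbestos']
```

Materials that screen out (inventory below threshold) are dropped from further analysis, which is exactly how the assessment narrowed its scope to the Asbestos Pit.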

  18. Bio-slurry reaction system and process for hazardous waste treatment

    Energy Technology Data Exchange (ETDEWEB)

    Castaldi, F.J.

    1993-08-03

    A method is described for improved slurry-phase bioremediation treatment of organic sludge and mixtures of organic sludge and organic-contaminated soils by dissolving the contaminants into an aqueous phase and microbially degrading same; comprising the steps of: (a) forming a high solids slurry of the sludge and soils with water and an active bioslurry consisting of large populations of acclimated hydrocarbon-utilizing bacteria and small amounts of biodegradation residue; the bacteria being selected from the genera Pseudomonas and Acinetobacter, and being capable of producing extracellular long-chain hydrocarbon-emulsifying and hydrocarbon-solubilizing agents for decreasing aqueous surface tension and lowering interfacial tension between oil and water; (b) passing the high solids slurry through a plurality of in-series bioreactors in each of which a low hydraulic shear is maintained to promote the development of a large population of microorganisms that will form flocculent suspensions; the first stage bioreactor in the series being a waste dissolution reactor operated under anoxic conditions to form a stable emulsion through the presence of the hydrocarbon-emulsifying and hydrocarbon-solubilizing agents produced by the bacteria; (c) continuously or semicontinuously flowing the output from the series of bioreactors to a liquid-solids separator to partition the mixed liquor bioslurry from the biodegraded waste residue; (d) returning the mixed liquor bioslurry containing small amounts of biodegradation residue to the slurry of step (a) for recycling; and (e) recirculating off-gas components from the system including one or more members of the group consisting of benzene, toluene, xylenes, and naphthalene back to one or more of the bioreactors, to return high volatility toxic constituents for increased microbial degradation and control of volatile toxic constituents emissions from the process.

  19. Restriction of the use of hazardous substances (RoHS) in the personal computer segment: analysis of the strategic adoption by the manufacturers settled in Brazil

    Directory of Open Access Journals (Sweden)

    Ademir Brescansin

    2015-09-01

    The enactment of the RoHS Directive (Restriction of Hazardous Substances) in 2003, limiting the use of certain hazardous substances in electronic equipment, has forced companies to adjust their products to comply with this legislation. Even in the absence of similar legislation in Brazil, manufacturers of personal computers located in this country have been seen to adopt RoHS for products sold in the domestic market and abroad. The purpose of this study is to analyze whether these manufacturers have really adopted RoHS, focusing on their motivations, concerns, and benefits. This is an exploratory study based on a literature review and interviews with HP, Dell, Sony, Lenovo, Samsung, LG, Itautec, and Positivo, using summative content analysis. The results showed that initially, global companies adopted RoHS to market products in Europe, and later expanded this practice to all products. Brazilian companies, however, adopted RoHS to participate in the government's sustainable procurement bidding processes. It is expected that this study can assist manufacturers in developing strategies for reducing or eliminating hazardous substances in their products and processes, as well as help the government formulate public policies on reducing risks of environmental contamination.

  20. Advanced Rapid Imaging & Analysis for Monitoring Hazards (ARIA-MH) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a service-oriented hazard/disaster monitoring data system enabling both science and decision-support communities to monitor ground motion in areas of...

  1. Readiness to implement Hazard Analysis and Critical Control Point (HACCP) systems in Iowa schools.

    Science.gov (United States)

    Henroid, Daniel; Sneed, Jeannie

    2004-02-01

    To evaluate current food-handling practices, food safety prerequisite programs, and employee knowledge and food safety attitudes and provide baseline data for implementing Hazard Analysis and Critical Control Point (HACCP) systems in school foodservice. One member of the research team visited each school to observe food-handling practices and assess prerequisite programs using a structured observation form. A questionnaire was used to determine employees' attitudes, knowledge, and demographic information. A convenience sample of 40 Iowa schools was recruited with input from the Iowa Department of Education. Descriptive statistics were used to summarize data. One-way analysis of variance was used to assess differences in attitudes and food safety knowledge among managers, cooks, and other foodservice employees. Multiple linear regression assessed the relationship between manager and school district demographics and the food safety practice score. Proper food-handling practices were not being followed in many schools, and prerequisite food safety programs for HACCP were found to be inadequate for many school foodservice operations. School foodservice employees were found to have a significant amount of food safety knowledge (15.9 ± 2.4 out of 20 possible points). School districts with managers (P=.019) and employees (P=.030) who had a food handler certificate were found to have higher food safety practice scores. Emphasis on implementing prerequisite programs in preparation for HACCP is needed in school foodservice. Training programs, both basic food safety such as ServSafe and HACCP, will support improvement of food-handling practices and implementation of prerequisite programs and HACCP.
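The one-way ANOVA used above to compare knowledge among managers, cooks, and other employees reduces to a ratio of between-group to within-group variance. A self-contained sketch with made-up score groups (not the study's data):

```python
# Minimal one-way ANOVA F statistic, as used to compare food safety
# knowledge scores across job categories (the groups below are invented).
def one_way_anova_f(groups):
    """F = (between-group mean square) / (within-group mean square)."""
    all_scores = [x for g in groups for x in g]
    n, k = len(all_scores), len(groups)
    grand = sum(all_scores) / n
    # Between-group sum of squares: group sizes times squared mean offsets
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group mean
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

managers = [18, 17, 19]   # hypothetical knowledge scores out of 20
cooks    = [15, 16, 17]
others   = [13, 14, 15]
print(round(one_way_anova_f([managers, cooks, others]), 2))  # -> 12.0
```

A large F (compared against the F distribution with k − 1 and n − k degrees of freedom) indicates the group means differ more than within-group noise would explain, which is how the study's P values for the job-category comparisons were obtained.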

  2. Workplace health hazards: analysis of hotline calls over a six-year period.

    Science.gov (United States)

    Quint, J; Handley, M; Cummings, K

    1990-01-01

    Between 1981 and 1986, a state-based occupational health telephone hotline received more than 8,000 inquiries on over 3,000 hazardous agents. Major caller groups were employees (37%), employers (20%), health care providers, primarily physicians (19%), government agencies (12%), and labor unions (6%). Employees were the fastest growing caller group. Callers inquired about general health hazards of chemicals (65%), the relation of symptoms to work (22%), and risks to pregnancy (13%). PMID:2297067

  3. [An analysis of occupational hazard in manufacturing industry in Guangzhou, China, in 2013].

    Science.gov (United States)

    Zhang, Haihong; Li, Yongqin; Zhou, Hailin; Rong, Xing; Zhu, Shaofang; He, Yinan; Zhai, Ran; Liu, Yiming

    2015-08-01

    To provide data for occupational health supervision by analyzing the occupational health status of the manufacturing industry in Guangzhou, China. The occupational health investigation was performed in 280 enterprises randomly selected from 8 industries based on industry stratification. According to the occupational health standards, 198 of the 280 enterprises were supervised and monitored. Sample testing was performed in 3 to 5 workplaces where workers were exposed to the highest concentration/intensity of occupational hazards for the longest time. Comparative analyses of the exceedance rates of hazards were performed among enterprises, workplaces, and testing items from different industries. The concentrations of occupational hazards in 42.93% (85/198) of enterprises and 22.96% (200/871) of workplaces were above the limit concentration. The most severe hazards were the noise in the shipbuilding and wooden furniture industries and the welding fumes in the shipbuilding industry. Less than 30% of enterprises were able to provide occupational health examination reports and periodic test reports of occupational hazards in workplaces. The rate of workers with abnormal occupational health examination results and the need for reexamination reached 6.63% (832/12 549), and they were mostly from the shipbuilding, wooden furniture, and chemical industries. Occupational health supervision should be strengthened in enterprises, and hazards from noise and dust should be selectively controlled or reduced. The publication of relevant occupational health data and information by enterprises should be promoted to enhance social supervision.

  4. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities shall be designed, constructed, and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to those criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, a summary of important application issues is presented, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for design of DOE facilities.

  5. Hazard analysis of critical control points assessment as a tool to respond to emerging infectious disease outbreaks.

    Directory of Open Access Journals (Sweden)

    Kelly L Edmunds

    Highly pathogenic avian influenza virus (HPAI) strain H5N1 has had direct and indirect economic impacts arising from direct mortality and control programmes in over 50 countries reporting poultry outbreaks. HPAI H5N1 is now reported as the most widespread and expensive zoonotic disease recorded and continues to pose a global health threat. The aim of this research was to assess the potential of utilising Hazard Analysis of Critical Control Points (HACCP) assessments in providing a framework for a rapid response to emerging infectious disease outbreaks. This novel approach applies a scientific process, widely used in food production systems, to assess risks related to a specific emerging health threat within a known zoonotic disease hotspot. We conducted a HACCP assessment for HPAI viruses within Vietnam's domestic poultry trade and relate our findings to the existing literature. Our HACCP assessment identified poultry flock isolation, transportation, slaughter, preparation and consumption as critical control points for Vietnam's domestic poultry trade. Introduction of the preventative measures highlighted through this HACCP evaluation would reduce the risks posed by HPAI viruses and pressure on the national economy. We conclude that this HACCP assessment provides compelling evidence for the future potential that HACCP analyses could play in initiating a rapid response to emerging infectious diseases.

  6. Hazard analysis of critical control points assessment as a tool to respond to emerging infectious disease outbreaks.

    Science.gov (United States)

    Edmunds, Kelly L; Hunter, Paul R; Few, Roger; Bell, Diana J

    2013-01-01

    Highly pathogenic avian influenza virus (HPAI) strain H5N1 has had direct and indirect economic impacts arising from direct mortality and control programmes in over 50 countries reporting poultry outbreaks. HPAI H5N1 is now reported as the most widespread and expensive zoonotic disease recorded and continues to pose a global health threat. The aim of this research was to assess the potential of utilising Hazard Analysis of Critical Control Points (HACCP) assessments in providing a framework for a rapid response to emerging infectious disease outbreaks. This novel approach applies a scientific process, widely used in food production systems, to assess risks related to a specific emerging health threat within a known zoonotic disease hotspot. We conducted a HACCP assessment for HPAI viruses within Vietnam's domestic poultry trade and relate our findings to the existing literature. Our HACCP assessment identified poultry flock isolation, transportation, slaughter, preparation and consumption as critical control points for Vietnam's domestic poultry trade. Introduction of the preventative measures highlighted through this HACCP evaluation would reduce the risks posed by HPAI viruses and pressure on the national economy. We conclude that this HACCP assessment provides compelling evidence for the future potential that HACCP analyses could play in initiating a rapid response to emerging infectious diseases.

  7. Tank farms hazards assessment

    Energy Technology Data Exchange (ETDEWEB)

    Broz, R.E.

    1994-09-30

    Hanford contractors are writing new facility specific emergency procedures in response to new and revised US Department of Energy (DOE) Orders on emergency preparedness. Emergency procedures are required for each Hanford facility that has the potential to exceed the criteria for the lowest level emergency, an Alert. The set includes: (1) a facility specific procedure on Recognition and Classification of Emergencies, (2) area procedures on Initial Emergency Response and, (3) an area procedure on Protective Action Guidance. The first steps in developing these procedures are to identify the hazards at each facility, identify the conditions that could release the hazardous material, and calculate the consequences of the releases. These steps are called a Hazards Assessment. The final product is a document that is similar in some respects to a Safety Analysis Report (SAR). The document could be produced in a month for a simple facility but could take much longer for a complex facility. Hanford has both types of facilities. A strategy has been adopted to permit completion of the first version of the new emergency procedures before all the facility Hazards Assessments are complete. The procedures will initially be based on input from a task group for each facility. This strategy will put improved emergency procedures in place sooner and therefore enhance Hanford emergency preparedness. The purpose of this document is to summarize the applicable information contained within the Waste Tank Facility "Interim Safety Basis Document," WHC-SD-WM-ISB-001, as a resource, since the SARs covering Waste Tank Operations are not current in all cases. This hazards assessment serves to collect, organize, document and present the information utilized during the determination process.

  8. Comprehensive baseline hazard assessments

    Energy Technology Data Exchange (ETDEWEB)

    Warren, S.B.; Amundson, T.M.

    1994-10-01

    Westinghouse Hanford Company (WHC) has developed and implemented a cost-effective, value-added program/process that assists in fulfilling key elements of the Occupational Safety and Health Administration's (OSHA) Voluntary Protection Program (VPP) requirements. WHC is the prime contractor for the US Department of Energy (US DOE) at the Hanford site, located in Richland, Washington. The site covers over 560 square miles, contains over 1100 facilities and employs approximately 18,000 people. WHC is currently in the application review phase for the US DOE equivalent of OSHA-VPP "merit" program status. The program involves setting up a team consisting of industrial safety and health (industrial hygiene) professionals, members of the maintenance and operations work force, and facility management. This team performs a workplace hazard characterization/analysis and then applies a risk assessment approach to prioritize observed and potential hazards in need of abatement. The process involves using checklists that serve as a guide for evaluation/inspection criteria. Forms are used to document meetings, field observations, instrument calibration and performance testing. Survey maps are generated to document quality records of measurement results. A risk assessment code matrix with a keyword index was developed to facilitate consistency. The end product is useful in communicating hazards to facility management, health and safety professionals, audit/appraisal groups, and most importantly, facility workers.
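    The risk assessment code matrix mentioned in the abstract can be sketched as a simple severity-by-probability lookup. The matrix values below are illustrative of the common MIL-STD-882-style pattern, not the actual WHC matrix:

    ```python
    # Hypothetical risk assessment code (RAC) matrix: severity category
    # (I = catastrophic .. IV = negligible) crossed with probability
    # (A = frequent .. D = unlikely) yields a priority code, 1 = highest.
    # The specific cell values here are illustrative assumptions.
    RAC = {
        ("I", "A"): 1, ("I", "B"): 1, ("I", "C"): 2, ("I", "D"): 3,
        ("II", "A"): 1, ("II", "B"): 2, ("II", "C"): 3, ("II", "D"): 4,
        ("III", "A"): 2, ("III", "B"): 3, ("III", "C"): 4, ("III", "D"): 4,
        ("IV", "A"): 3, ("IV", "B"): 4, ("IV", "C"): 4, ("IV", "D"): 4,
    }

    def risk_code(severity, probability):
        """Look up the abatement priority for an observed hazard."""
        return RAC[(severity, probability)]

    # A frequent, serious hazard outranks a rare, minor one:
    print(risk_code("II", "A"), risk_code("IV", "D"))  # → 1 4
    ```

    A keyword index, as described, would simply map hazard descriptions to (severity, probability) pairs so different assessors assign consistent codes.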

  9. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  10. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge

    2005-01-01

    Discussion of the paper "Residual analysis for spatial point processes" by A. Baddeley, M. Hazelton, J. Møller and R. Turner. Journal of the Royal Statistical Society, Series B, vol. 67, pages 617-666, 2005.

  11. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
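    The knowledge-based alarm filtering described above can be sketched as a rule base that suppresses consequential alarms when their root cause is already annunciated, so the operator sees the disturbance rather than the cascade. The causal map below is a hypothetical example, not from the paper:

    ```python
    # Hypothetical cause-consequence knowledge base: when a root-cause
    # alarm is active, the alarms it is known to produce are suppressed.
    CAUSES = {
        "coolant_pump_trip": {"low_coolant_flow", "high_core_temp"},
    }

    def filter_alarms(active):
        """Return only alarms not explained by an active root cause."""
        suppressed = set()
        for root, consequences in CAUSES.items():
            if root in active:
                suppressed |= consequences & active
        return sorted(active - suppressed)

    print(filter_alarms({"coolant_pump_trip", "low_coolant_flow",
                         "high_core_temp"}))  # → ['coolant_pump_trip']
    ```

    A real system would also monitor whether the plant's response to the root cause is proper, as the abstract suggests, before deciding to keep the consequential alarms hidden.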

  12. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of, and minimize, uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.
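    The probabilistic approach described combines, for each site, the activity rates of multiple seismic sources with a ground-motion model: the annual rate of exceeding an acceleration a is the sum over sources of (rate of earthquakes on the source) times (probability the resulting site motion exceeds a). A minimal sketch, with illustrative source rates and a toy lognormal ground-motion model:

    ```python
    # Minimal PSHA rate combination. All numeric values are illustrative
    # assumptions, not taken from the report.
    import math

    def exceedance_rate(a, sources):
        """sources: list of (annual_rate, prob_exceed_fn) pairs."""
        return sum(rate * p_exceed(a) for rate, p_exceed in sources)

    def lognormal_exceed(median, beta):
        """Toy ground-motion model: P(ln A > ln a) for lognormal A."""
        def p_exceed(a):
            z = (math.log(a) - math.log(median)) / beta
            return 0.5 * math.erfc(z / math.sqrt(2))
        return p_exceed

    sources = [(0.05, lognormal_exceed(0.10, 0.6)),   # nearby low-yield zone
               (0.01, lognormal_exceed(0.25, 0.6))]   # distant stronger zone

    print(exceedance_rate(0.2, sources))  # annual rate of exceeding 0.2 g
    ```

    In the expert-opinion framework of the report, the source rates and ground-motion parameters would themselves carry weights elicited from each expert, and the hazard curves would be combined across experts.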

  13. The RiskScape System - a tool for quantitative multi-risk analysis for natural hazards.

    Science.gov (United States)

    Schmidt, J.; Reese, S.; Matcham, I.; King, A.; Bell, R.

    2009-04-01

    This paper introduces a generic framework for multi-risk modelling developed in the project 'Regional RiskScape' at the Research Organization GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand. Our goal was to develop a generic technology for modelling risks from multiple natural hazards and for multiple risk elements. The framework is independent of the specific nature of the individual hazard and individual risk element. A software prototype has been developed which is capable of 'plugging in' various natural hazards and risk elements without reconfiguring or adapting the generic software framework. To achieve that goal we developed a set of standards for treating the fundamental components of a risk model: hazards, assets (risk elements), and vulnerability models (or fragility functions). Thus, the developed prototype system is able to understand any hazard, asset, or fragility model which is provided to the system according to that standard. We tested the software prototype for modelling earthquake, volcanic, flood, wind, and tsunami risks for urban centres in New Zealand.
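    The plug-in standard described can be sketched as a pair of interfaces that any hazard or fragility model must implement; the engine then combines them without knowing their nature. The class and function names below are illustrative, not RiskScape's actual API:

    ```python
    # Sketch of a pluggable hazard/fragility standard. A generic engine
    # computes expected loss for any models implementing the interfaces.
    from abc import ABC, abstractmethod

    class Hazard(ABC):
        @abstractmethod
        def intensity(self, location): ...

    class Fragility(ABC):
        @abstractmethod
        def damage_ratio(self, intensity): ...

    class UniformFlood(Hazard):          # toy hazard: 1.5 m depth everywhere
        def intensity(self, location):
            return 1.5

    class LinearDepthDamage(Fragility):  # toy fragility: 40% loss per metre
        def damage_ratio(self, intensity):
            return min(1.0, 0.4 * intensity)

    def expected_loss(hazard, fragility, assets):
        """Generic engine: works for any plug-ins meeting the standard."""
        return sum(value * fragility.damage_ratio(hazard.intensity(loc))
                   for loc, value in assets)

    assets = [((0, 0), 100_000.0), ((1, 2), 50_000.0)]
    print(round(expected_loss(UniformFlood(), LinearDepthDamage(), assets)))  # → 90000
    ```

    Swapping in an earthquake hazard and a shaking fragility curve requires no change to `expected_loss`, which is the point of the standard.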

  14. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB[registered] functions and routines are available for download online.

  15. Vygotsky's Analysis of Children's Meaning Making Processes

    Science.gov (United States)

    Mahn, Holbrook

    2012-01-01

    Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…

  16. Novel head and neck cancer survival analysis approach: random survival forests versus Cox proportional hazards regression.

    Science.gov (United States)

    Datema, Frank R; Moya, Ana; Krause, Peter; Bäck, Thomas; Willmes, Lars; Langeveld, Ton; Baatenburg de Jong, Robert J; Blom, Henk M

    2012-01-01

    Electronic patient files generate an enormous amount of medical data. These data can be used for research, such as prognostic modeling. Automatization of statistical prognostication processes allows automatic updating of models when new data is gathered. The increase of power behind an automated prognostic model makes its predictive capability more reliable. Cox proportional hazard regression is most frequently used in prognostication. Automatization of a Cox model is possible, but we expect the updating process to be time-consuming. A possible solution lies in an alternative modeling technique called random survival forests (RSFs). RSF is easily automated and is known to handle the proportionality assumption coherently and automatically. Performance of RSF has not yet been tested on a large head and neck oncological dataset. This study investigates performance of head and neck overall survival of RSF models. Performances are compared to a Cox model as the "gold standard." RSF might be an interesting alternative modeling approach for automatization when performances are similar. RSF models were created in R (Cox also in SPSS). Four RSF splitting rules were used: log-rank, conservation of events, log-rank score, and log-rank approximation. Models were based on historical data of 1371 patients with primary head-and-neck cancer, diagnosed between 1981 and 1998. Models contain 8 covariates: tumor site, T classification, N classification, M classification, age, sex, prior malignancies, and comorbidity. Model performances were determined by Harrell's concordance error rate, in which 33% of the original data served as a validation sample. RSF and Cox models delivered similar error rates. The Cox model performed slightly better (error rate, 0.2826). The log-rank splitting approach gave the best RSF performance (error rate, 0.2873). In accord with Cox and RSF models, high T classification, high N classification, and severe comorbidity are very important covariates in the
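    Harrell's concordance, used above to score both the Cox and RSF models, is the fraction of comparable patient pairs (the shorter observed time ended in an event) for which the higher predicted risk belongs to the shorter survival time; the reported error rate is one minus this. A minimal sketch, not the authors' implementation:

    ```python
    # Harrell's concordance index for right-censored survival data.
    def concordance(times, events, risks):
        """times: observed times; events: 1 = death observed, 0 = censored;
        risks: model-predicted risk scores (higher = worse prognosis)."""
        concordant, ties, comparable = 0, 0, 0
        n = len(times)
        for i in range(n):
            for j in range(n):
                # pair is comparable if subject i failed before j was observed
                if events[i] == 1 and times[i] < times[j]:
                    comparable += 1
                    if risks[i] > risks[j]:
                        concordant += 1
                    elif risks[i] == risks[j]:
                        ties += 1
        return (concordant + 0.5 * ties) / comparable

    # risks perfectly ordered against survival time give concordance 1.0
    print(concordance([2, 5, 9], [1, 1, 0], [3.0, 2.0, 1.0]))  # → 1.0
    ```

    An error rate of 0.2826, as reported for the Cox model, thus corresponds to a concordance of about 0.717 on the validation sample.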

  17. Introduction to special section on phenomenology, underlying processes, and hazard implications of aseismic slip and nonvolcanic tremor

    Science.gov (United States)

    Gomberg, Joan

    2010-01-01

    This paper introduces the special section on the "phenomenology, underlying processes, and hazard implications of aseismic slip and nonvolcanic tremor" by highlighting key results of the studies published in it. Many of the results indicate that seismic and aseismic manifestations of slow slip reflect transient shear displacements on the plate interface, with the outstanding exception of northern Cascadia where tremor sources have been located on and above the plate interface (differing models of the plate interface there also need to be reconciled). Slow slip phenomena appear to result from propagating deformation that may develop with persistent gaps and segment boundaries. Results add to evidence that when tectonic deformation is relaxed via slow slip, most relaxation occurs aseismically but with seismic signals providing higher-resolution proxies for the aseismic slip. Instead of two distinct slip modes as suggested previously, lines between "fast" and "slow" slip more appropriately may be described as blurry zones. Results reported also show that slow slip sources do not coincide with a specific temperature or metamorphic reaction. Their associations with zones of high conductivity and low shear to compressional wave velocity ratios corroborate source models involving pore fluid pressure buildup and release. These models and spatial anticorrelations between earthquake and tremor activity also corroborate a linkage between slow slip and frictional properties transitional between steady state and stick-slip. Finally, this special section highlights the benefits of global and multidisciplinary studies, which demonstrate that slow phenomena are not confined to beneath the locked zone but exist in many settings.

  18. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.

  19. ANALYSIS AND MITIGATION OF X-RAY HAZARD GENERATED FROM HIGH INTENSITY LASER-TARGET INTERACTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z.; /SLAC

    2011-03-21

    Interaction of a high intensity laser with matter may generate an ionizing radiation hazard. Very limited studies have been made, however, on the laser-induced radiation protection issue. This work reviews available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, electron angular distribution, electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 × 10^18 W/cm^2), were analysed. A graded approach to mitigate the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.

  20. Development and Analysis of a Hurricane Hazard Model for Disaster Risk Assessment in Central America

    Science.gov (United States)

    Pita, G. L.; Gunasekera, R.; Ishizawa, O. A.

    2014-12-01

    Hurricane and tropical storm activity in Central America has consistently caused thousands of casualties, significant population displacement, and substantial property and infrastructure losses over the past decades. As a component to estimate future potential losses, we present a new regional probabilistic hurricane hazard model for Central America. Currently, there are very few openly available hurricane hazard models for Central America. The resulting hazard model would be used in conjunction with exposure and vulnerability components as part of a World Bank project to create country disaster risk profiles that will assist in improving risk estimation and provide decision makers with better tools to quantify disaster risk. This paper describes the hazard model methodology, which involves the development of a wind field model that simulates gust speeds at terrain height at a fine resolution. The HURDAT dataset has been used in this study to create synthetic events that assess average hurricane landfall angles and their variability at each location. The hazard model then estimates the average track angle at multiple geographical locations in order to provide a realistic range of possible hurricane paths that will be used for risk analyses in all the Central American countries. This probabilistic hurricane hazard model is also useful for relating synthetic wind estimates to loss and damage data to develop and calibrate existing empirical building vulnerability curves. To assess accuracy and applicability, modeled results are evaluated against historical events, their tracks and wind fields. Deeper analyses of results are also presented with special reference to Guatemala. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the

  1. A hazard rate analysis of fertility using duration data from Malaysia.

    Science.gov (United States)

    Chang, C

    1988-01-01

    Data from the Malaysia Fertility and Family Planning Survey (MFLS) of 1974 were used to investigate the effects of biological and socioeconomic variables on fertility based on the hazard rate model. Another study objective was to investigate the robustness of the findings of Trussell et al. (1985) by comparing the findings of this study with theirs. The hazard rate of conception for the jth fecundable spell of the ith woman, hij, is determined by duration dependence, tij, measured by the waiting time to conception; unmeasured heterogeneity, HETi; the time-invariant variables, Yi (race, cohort, education, age at marriage); and time-varying variables, Xij (age, parity, opportunity cost, income, child mortality, child sex composition). In this study, all the time-varying variables were constant over a spell. An asymptotic X2 test for the equality of constant hazard rates across birth orders, allowing time-invariant variables and heterogeneity, showed the importance of time-varying variables and duration dependence. Under the assumption of fixed effects heterogeneity and the Weibull distribution for the duration of waiting time to conception, the empirical results revealed a negative parity effect, a negative impact from male children, and a positive effect from child mortality on the hazard rate of conception. The estimates of step functions for the hazard rate of conception showed parity-dependent fertility control, evidence of heterogeneity, and the possibility of nonmonotonic duration dependence. In a hazard rate model with piecewise-linear-segment duration dependence, the socioeconomic variables such as cohort, child mortality, income, and race had significant effects, after controlling for the length of the preceding birth. The duration dependence was consistent with the common finding, i.e., first increasing and then decreasing at a slow rate. The effects of education and opportunity cost on fertility were insignificant.
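    The Weibull duration dependence assumed for waiting time to conception has hazard h(t) = (k/λ)(t/λ)^(k-1), which is monotone: rising for shape k > 1, falling for k < 1, constant (exponential) for k = 1. That monotonicity is why the study also fits piecewise-linear segments to capture the nonmonotonic (first increasing, then decreasing) pattern it finds. A small sketch with illustrative parameters:

    ```python
    # Weibull hazard: h(t) = (k/lam) * (t/lam)**(k-1).
    # Shape k and scale lam below are illustrative, not estimates from MFLS.
    def weibull_hazard(t, k, lam):
        return (k / lam) * (t / lam) ** (k - 1)

    # rising hazard when k > 1
    assert weibull_hazard(2.0, 1.5, 6.0) > weibull_hazard(1.0, 1.5, 6.0)
    # constant hazard when k == 1 (exponential waiting time)
    assert weibull_hazard(2.0, 1.0, 6.0) == weibull_hazard(9.0, 1.0, 6.0)
    ```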

  2. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Gürpinar, Aybars, E-mail: aybarsgurpinar2007@yahoo.com [Nuclear & Risk Consultancy, Anisgasse 4, 1221 Vienna (Austria); Serva, Leonello, E-mail: lserva@alice.it [Independent Consultant, Via dei Dauni 1, 00185 Rome (Italy); Livio, Franz, E-mail: franz.livio@uninsubria.it [Dipartimento di Scienza ed Alta Tecnologia, Università degli Studi dell’Insubria, Via Velleggio, 11, 22100 Como (Italy); Rizzo, Paul C., E-mail: paul.rizzo@rizzoasoc.com [RIZZO Associates, 500 Penn Center Blvd., Suite 100, Pittsburgh, PA 15235 (United States)

    2017-01-15

    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  3. Use of fragile geologic structures as indicators of unexceeded ground motions and direct constraints on probabilistic seismic hazard analysis

    Science.gov (United States)

    Baker, J.W.; Whitney, John W.; Hanks, Thomas C.; Abramson, Norman A.; Board, Mark P.

    2013-01-01

    We present a quantitative procedure for constraining probabilistic seismic hazard analysis results at a given site, based on the existence of fragile geologic structures at that site. We illustrate this procedure by analyzing precarious rocks and undamaged lithophysae at Yucca Mountain, Nevada. The key metric is the probability that the feature would have survived to the present day, assuming that the hazard results are correct. If the fragile geologic structure has an extremely low probability of having survived (which would be inconsistent with the observed survival of the structure), then the calculations illustrate how much the hazard would have to be reduced to result in a nonnegligible survival probability. The calculations are able to consider structures the predicted failure probabilities of which are a function of one or more ground‐motion parameters, as well as structures that either rapidly or slowly evolved to their current state over time. These calculations are the only way to validate seismic hazard curves over long periods of time.
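    The key metric described, the probability that a fragile structure would have survived if the hazard curve were correct, can be sketched by convolving the hazard curve (annual rate of exceeding each ground-motion level) with the structure's fragility and assuming Poisson occurrences over the structure's age. All numbers below are illustrative, not the Yucca Mountain values:

    ```python
    # Survival probability of a fragile structure under a given hazard curve.
    # pga levels in g, exceed_rate in events/year, p_fail = failure
    # probability given motion in that bin. Illustrative values only.
    import math

    def survival_probability(pga, exceed_rate, p_fail, years):
        rate = 0.0
        for i in range(len(pga) - 1):
            d_rate = exceed_rate[i] - exceed_rate[i + 1]  # rate of events in this bin
            rate += d_rate * p_fail[i]
        rate += exceed_rate[-1] * p_fail[-1]              # events above the top level
        return math.exp(-rate * years)                    # Poisson survival over T years

    pga = [0.2, 0.5, 1.0]
    exceed_rate = [1e-3, 1e-4, 1e-5]
    p_fail = [0.0, 0.3, 0.9]
    print(survival_probability(pga, exceed_rate, p_fail, 10_000))
    ```

    If this probability came out near zero for a structure that demonstrably survived, the hazard curve would have to be reduced until the survival probability became non-negligible, which is the constraint the paper formalizes.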

  4. Use of remote sensing and seismotectonic parameters for seismic hazard analysis of Bangalore

    Directory of Open Access Journals (Sweden)

    T. G. Sitharam

    2006-01-01

    Full Text Available Deterministic Seismic Hazard Analysis (DSHA) for Bangalore, India has been carried out by considering the past earthquakes, assumed subsurface fault rupture lengths and a point source synthetic ground motion model. The sources have been identified using satellite remote sensing images, the seismotectonic atlas map of India and relevant field studies. The Maximum Credible Earthquake (MCE) has been determined by considering the regional seismotectonic activity in about a 350 km radius around Bangalore. The seismotectonic map has been prepared by considering the faults, lineaments and shear zones in the area and more than 470 past moderate earthquakes with moment magnitude 3.5 and above. In addition, 1300 earthquake tremors with moment magnitude less than 3.5 have been considered for the study. The shortest distance from Bangalore to the different sources is measured and then the Peak Horizontal Acceleration (PHA) is calculated for the different sources and moment magnitudes of events using a regional attenuation relation for peninsular India. Based on the Wells and Coppersmith (1994) relationship, a subsurface fault rupture length of about 3.8% of the total length of the fault was shown to match past earthquake events in the area. To simulate synthetic ground motions, the Boore (1983, 2003) SMSIM programs have been used and the PHA for the different locations is evaluated. From the above approaches, a PHA of 0.15 g was established. This value was obtained for a maximum credible earthquake having a moment magnitude of 5.1 for the Mandya-Channapatna-Bangalore lineament source. This particular source has been identified as a vulnerable source for Bangalore. From this study, it is very clear that the Bangalore area can be described as a seismically moderately active region. It is also recommended that the southern part of Karnataka, in particular Bangalore, Mandya and Kolar, be upgraded from the current Indian Seismic Zone II to Seismic Zone III

  5. International collaboration towards a global analysis of volcanic hazards and risk

    Science.gov (United States)

    Loughlin, Susan; Duncan, Melanie; Volcano Model Network, Global

    2017-04-01

    Approximately 800 million people live within 100 km of an active volcano, and such environments are often subject to multiple natural hazards. Volcanic eruptions and related volcanic hazards are less frequent than many other natural hazards, but when they occur they can have immediate and long-lived impacts, so it is important that they are not overlooked in a multi-risk assessment. Based on experiences to date, it is clear that natural hazards communities need to address a series of challenges in order to move to a multi-hazard approach to risk assessment. Firstly, we need to further develop synergies and coordination within our own communities at local to global scales. Secondly, we must collaborate and identify opportunities for harmonisation across natural hazards communities: for instance, by ensuring our databases are accessible and meet certain standards, a variety of users will then be able to contribute and access data. Thirdly, the scale and breadth of multi-risk assessments needs to be co-defined with decision-makers, which will constrain the relevant potential cascading/compounding hazards to consider. Fourthly, and related to all previous points, multi-risk assessments require multi-risk knowledge, requiring interdisciplinary perspectives as well as discipline specific expertise. The Global Volcano Model network (GVM) is a growing international network of (public and private) institutions and organisations, which have the collective aim of identifying and reducing volcanic risks. GVM's values embody collaboration, scientific excellence, open-access (wherever possible) and, above all, public good. GVM highlights and builds on the best research available within the volcanological community, drawing on the work of IAVCEI Commissions and other research initiatives. It also builds on the local knowledge of volcano observatories and collaborating scientists, ensuring that global efforts are underpinned by local evidence. Some of GVM's most

  6. Fire hazard analysis of alcohol aqueous solution and Chinese liquor based on flash point

    Science.gov (United States)

    Chen, Qinpei; Kang, Guoting; Zhou, Tiannian; Wang, Jian

    2017-10-01

    In this paper, a series of experiments were conducted to study the flash point of alcohol aqueous solution and Chinese liquor. The fire hazard implied by the experimental results was analysed based on China's standard GB50160-2008. The results show that the open-cup method is not suitable for alcohol aqueous solutions. The closed-cup method, on the other hand, shows good applicability. There is a non-linear relationship between closed-cup flash point and alcohol volume concentration, and the prediction equation established in this paper fits the flash point and fire hazard classification of Chinese liquor well.

  7. Water-molten uranium hazard analysis. Final report. LATA report No. 92

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, P.S.; Rigdon, L.D.; Donham, B.J.

    1979-08-21

    The hazard potential of cooling water leakage into the crucible of molten uranium in the MARS laser isotope separation experiment was investigated. A vapor-phase explosion is highly unlikely in any of the scenarios defined for MARS. For the operating basis accident, the gas pressure transient experienced by the vessel wall is 544 psia peak with a duration of 200 μs, and the peak hoop stress is about 20,000 psi in a 0.5-in. wall. Design and procedural recommendations are given for reducing the hazard. (DLC)
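    The quoted numbers are consistent with the thin-wall hoop stress formula σ = P·r/t: a 544 psia peak and ~20,000 psi hoop stress in a 0.5-in wall imply a vessel radius of roughly σ·t/P ≈ 18.4 in. The radius below is therefore inferred from the abstract's figures, not stated in the report:

    ```python
    # Thin-wall pressure vessel hoop stress: sigma = P * r / t.
    def hoop_stress(pressure_psi, radius_in, wall_in):
        return pressure_psi * radius_in / wall_in

    radius = 20000.0 * 0.5 / 544.0   # ≈ 18.4 in, inferred (assumption)
    print(round(hoop_stress(544.0, radius, 0.5)))  # → 20000
    ```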

  8. Ground landslide hazard potency using geoelectrical resistivity analysis and VS30, case study at geophysical station, Lembang, Bandung

    Science.gov (United States)

    Rohadi, Supriyanto; Sakya, Andi Eka; Masturyono, Murjaya, Jaya; Sunardi, Bambang; Rasmid, Ngadmanto, Drajat; Susilanto, Pupung; Nugraha, Jimmi; Pakpahan, Suliyanti

    2017-07-01

    We have conducted a geoelectric resistivity and shear wave velocity (Vs30) study to identify the landslide hazard potential around the Geophysics Station Lembang, Bandung (107.617° E, 6.825° S). The geoelectric analysis used a dipole-dipole resistivity configuration, while the shear wave velocity analysis was performed using the Multichannel Analysis of Surface Waves (MASW). The study results indicate that the soil or clay depth inferred from the electrical resistivity observations agreed with the depth confirmed by the MASW investigation. These conditions indicate a high landslide potential in this area, further supported by the steep slopes.
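    Vs30, the site metric used above, is the time-averaged shear-wave velocity over the top 30 m of the profile: 30 m divided by the total vertical travel time through the layers. A minimal sketch with an illustrative layered profile:

    ```python
    # Vs30 = 30 m / (total shear-wave travel time through the top 30 m).
    # Layer thicknesses and velocities below are illustrative assumptions.
    def vs30(thicknesses_m, velocities_ms):
        depth, travel_time = 0.0, 0.0
        for h, v in zip(thicknesses_m, velocities_ms):
            h = min(h, 30.0 - depth)   # clip the profile at 30 m depth
            if h <= 0.0:
                break
            travel_time += h / v
            depth += h
        return 30.0 / travel_time

    # 10 m at 200 m/s over 20 m at 400 m/s (deeper layers don't count)
    print(round(vs30([10.0, 20.0, 50.0], [200.0, 400.0, 800.0])))  # → 300
    ```

    Because Vs30 is a harmonic (travel-time) average, slow near-surface soil layers dominate it, which is why it flags soft, potentially unstable ground.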

  9. Hazards assessment for the Waste Experimental Reduction Facility

    Energy Technology Data Exchange (ETDEWEB)

    Calley, M.B.; Jones, J.L. Jr.

    1994-09-19

This report documents the hazards assessment for the Waste Experimental Reduction Facility (WERF) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. It describes the WERF, the surrounding area, associated buildings and structures, and the processes performed at WERF. All radiological and nonradiological hazardous materials stored, used, or produced at WERF were identified and screened. Although the screening indicated that these materials could be excluded from further analysis because their inventories fell below the screening thresholds specified by DOE and DOE-ID guidance for DOE Order 5500.3A, the nonradiological hazardous materials were analyzed further because those screening thresholds were judged to be too high.

  10. Incorporating the effects of topographic amplification in the analysis of earthquake-induced landslide hazards using logistic regression

    Science.gov (United States)

    Lee, S. T.; Yu, T. T.; Peng, W. F.; Wang, C. L.

    2010-12-01

Seismic-induced landslide hazards are studied using seismic shaking intensity that accounts for the topographic amplification effect. The estimation of the topographic effect includes the theoretical topographic amplification factors and the corresponding amplified ground motion. Digital elevation models (DEM) with a 5-m grid spacing are used. The logistic regression model and a geographic information system (GIS) are used to perform the seismic landslide hazard analysis. The 99 Peaks area, located 3 km from the fault ruptured in the Chi-Chi earthquake, is used to test the proposed hypothesis. An inventory map of earthquake-triggered landslides is used to produce a dependent variable that takes a value of 0 (no landslide) or 1 (landslide). A set of independent parameters is used, including lithology, elevation, slope gradient, slope aspect, terrain roughness, land use, and Arias intensity (Ia) incorporating the topographic effect. Logistic regression is then used to find the best-fitting function describing the relationship between the occurrence and absence of landslides within an individual grid cell. The results of the seismic landslide hazard analysis that includes the topographic effect (AUROC = 0.890) are better than those of the analysis without it (AUROC = 0.874).
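The AUROC values used to compare the two models can be computed from predicted hazard scores with the rank-based (Mann-Whitney) formulation. A minimal sketch, with hypothetical per-cell probabilities rather than the study's data:

```python
def auroc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive (landslide) cell
    scores higher than a random negative one, counting ties as 1/2."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical logistic-regression probabilities per grid cell
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.5, 0.3, 0.2, 0.1]
print(auroc(labels, scores))  # 11/12, about 0.917
```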

  11. Smartphones for post-event analysis: a low-cost and easily accessible approach for mapping natural hazards

    Science.gov (United States)

    Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo

    2015-04-01

A real opportunity and challenge for hazard mapping is offered by smartphones combined with a low-cost, flexible photogrammetric technique ('Structure-from-Motion', SfM). Unlike traditional photogrammetric methods, SfM can reconstruct three-dimensional geometries (Digital Surface Models, DSMs) from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones' built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner (TLS), airborne lidar) (Tarolli, 2014). Through fast, simple, repeated field surveys, anyone with a smartphone can take many pictures of the same study area. In this way, high-resolution, multi-temporal DSMs can be obtained and used to better monitor and understand erosion and deposition processes. These topographic data also make it possible to quantify volumes of material eroded by landslides and to recognize the major critical issues that typically arise during a natural hazard (e.g. river bank erosion or collapse due to floods). In this work we considered case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also used as a benchmark against which to compare the SfM data. DSMs derived from SfM at centimeter grid-cell resolution proved effective for automatically recognizing areas subject to surface instability and for quantitatively estimating erosion and deposition volumes. Morphometric indices such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique with smartphones offers a fast, simple and affordable alternative to lidar
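Erosion and deposition volumes from multi-temporal DSMs are typically estimated by differencing the two elevation grids (a "DSM of Difference") and multiplying each cell's elevation change by the cell area. A minimal sketch, with made-up elevations rather than survey data:

```python
def dod_volumes(dsm_before, dsm_after, cell_area_m2):
    """DSM of Difference: returns (erosion, deposition) volumes in m^3.
    Negative elevation change = erosion, positive = deposition."""
    erosion = deposition = 0.0
    for row_b, row_a in zip(dsm_before, dsm_after):
        for zb, za in zip(row_b, row_a):
            dz = za - zb
            if dz < 0:
                erosion += -dz * cell_area_m2
            else:
                deposition += dz * cell_area_m2
    return erosion, deposition

# Two tiny 2x2 DSM snapshots with 10 cm (0.01 m^2) cells
before = [[10.0, 10.0], [10.0, 10.0]]
after  = [[ 9.5, 10.0], [10.2, 10.0]]
ero, dep = dod_volumes(before, after, cell_area_m2=0.01)
print(ero, dep)  # about 0.005 m^3 eroded, 0.002 m^3 deposited
```

In practice a threshold on |dz| (tied to the DSM uncertainty) is applied before summing, so that photogrammetric noise is not counted as real change.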

  12. The Integration of an Operational Fire Hot Spots Processing Chain in a Multi-Hazard Emergency Management Service Platform (PHAROS)

    OpenAIRE

    Strobl, Christian; Stein, Enrico; Tungalagsaikhan, Padsuren; Ebke, Walter; Schwarz, Egbert; Ruppert, Thomas; Aravena Pelizari, Patrick; Raape, Ulrich

    2015-01-01

    The project PHAROS (Project on a Multi-Hazard Open Platform for Satellite Based Downstream Services) designs and implements a multi-hazard open service platform which integrates space-based earth observation, satellite communications and navigation (Galileo/GNSS) assets to provide sustainable (pre-operational) services for a wide variety of users in multi-application domains, such as prediction/early detection of emergencies, population alerting, environmental monitoring and crisis management...

  13. A cross-hazard analysis of terse message retransmission on Twitter

    Science.gov (United States)

    Sutton, Jeannette; Gibson, C. Ben; Phillips, Nolan Edward; Spiro, Emma S.; League, Cedar; Johnson, Britta; Fitzhugh, Sean M.; Butts, Carter T.

    2015-01-01

For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes—local network properties, message content, and message style—that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates. PMID:26627233

  14. ON-SITE MERCURY ANALYSIS OF SOIL AT HAZARDOUS WASTE SITES BY IMMUNOASSAY AND ASV

    Science.gov (United States)

    Two field methods for Hg, immunoassay and anodic stripping voltammetry (ASV), that can provide onsite results for quick decisions at hazardous waste sites were evaluated. Each method was applied to samples from two Superfund sites that contain high levels of Hg; Sulphur Bank Me...

  15. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation.

    Science.gov (United States)

    Liu, Xiang; Saat, Mohd Rapik; Barkan, Christopher P L

    2014-07-15

Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. In contrast to truck accidents, a train accident may involve multiple hazardous materials cars derailing and releasing their contents, with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train, and tank car safety design. The effects of train speed, tank car safety design and tank car position in a train on the number of cars that release their contents in a derailment were evaluated. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrence. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail. Copyright © 2014 Elsevier B.V. All rights reserved.
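A probability distribution of this kind can be sketched with a Monte Carlo simulation. Everything below (train length, tank car placement, the uniform derailment behaviour, the release probability) is an illustrative assumption, not the paper's calibrated model:

```python
import random

def simulate_releases(train_len=100, tank_positions=frozenset(range(40, 60)),
                      p_release=0.2, trials=100_000, seed=42):
    """Monte Carlo distribution of the number of tank cars releasing.
    A derailment starts at a random car; a random number of the
    following cars derail; each derailed tank car releases its
    contents with probability p_release."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(trials):
        first = rng.randint(1, train_len)                 # first car derailed
        n_derailed = rng.randint(1, train_len - first + 1)
        derailed = range(first, first + n_derailed)
        released = sum(1 for car in derailed
                       if car in tank_positions and rng.random() < p_release)
        counts[released] = counts.get(released, 0) + 1
    return {k: v / trials for k, v in sorted(counts.items())}

dist = simulate_releases()
print(dist)  # P(0 releases), P(1 release), ...
```

Changing `tank_positions` or `p_release` shows how tank car placement and safety design shift the distribution, which is the qualitative point of the paper's analysis.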

  16. A cross-hazard analysis of terse message retransmission on Twitter.

    Science.gov (United States)

    Sutton, Jeannette; Gibson, C Ben; Phillips, Nolan Edward; Spiro, Emma S; League, Cedar; Johnson, Britta; Fitzhugh, Sean M; Butts, Carter T

    2015-12-01

For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes (local network properties, message content, and message style) that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates.

  17. Hazard analysis and critical control point evaluation of school food programs in Bahrain.

    Science.gov (United States)

    Ali, A A; Spencer, N J

    1996-03-01

    Hazard analyses were conducted in six food preparation sites and 16 school canteens in the State of Bahrain. Sandwiches made with cheese, meat, eggs, liver, and beef burgers were prepared in small shops or a bakery outside schools. Foods were cooked between 4 and 5 A.M. Time-temperature exposure during cooking was adequate to kill vegetative microbes and their spores, but potential for recontamination existed from the hands of food workers, utensils, and cloths and sponges used for wiping. All foods were left at room temperature before they were transported in vans to schools where they were also kept at room temperature between 17 degrees C and 41 degrees C. Air temperature inside the canteens during this investigation was between 18.5 and 28 degrees C with a relative humidity of 65 to 70%. Hazard analyses, which included observation of operations inside school canteens and sites of food preparation, measuring temperatures, and interviewing workers and consumers (teachers, students) were carried out. Hazards were primarily associated with preparation of foods long before they were consumed, physical touching of products, and holding foods at room temperature after preparation. Holding foods at room temperature would have allowed germination of bacterial spores and multiplication of microbes. Reheating of foods was not practiced. Health promoters must be aware of these hazards and need to educate food workers, administrators, and the public on the methods of prevention.

  18. Southwestern Oregon's Biscuit Fire: An Analysis of Forest Resources, Fire Severity, and Fire Hazard

    Science.gov (United States)

    David L. Azuma; Glenn A. Christensen

    2005-01-01

    This study compares pre-fire field inventory data (collected from 1993 to 1997) in relation to post-fire mapped fire severity classes and the Fire and Fuels Extension of the Forest Vegetation Simulator growth and yield model measures of fire hazard for the portion of the Siskiyou National Forest in the 2002 Biscuit fire perimeter of southwestern Oregon. Post-fire...

  19. Probabilistic seismic hazard maps from seismicity patterns analysis: the Iberian Peninsula case

    Directory of Open Access Journals (Sweden)

    A. Jiménez

    2004-01-01

Earthquake prediction is a main topic in seismology. Here, the goal is to know how the seismicity at a certain place and time correlates with the seismicity at the same place during a following interval of time. Exact prediction is not possible, but one can ask about the causality relations between the seismic characteristics of one time interval and another in a region. In this paper, a new approach to this kind of study is presented, using tools from cellular automata theory and Shannon's entropy. First, the catalogue is divided into time intervals, and the region into cells. The activity or inactivity of each cell at a certain time is described using an energy criterion, giving a pattern which evolves over time. The aim is to find the rules of the stochastic cellular automaton which best fit the evolution of the pattern. The neighborhood used is the cross template (CT). A grid search is made to choose the best model, with the mutual information between the different times as the function to be maximized. This function depends on the size of the cells β and on the time interval τ considered for studying the activity of a cell. With these β and τ, a set of probabilities characterizing the evolution rules is calculated, giving a probabilistic approach to the spatiotemporal evolution of the region. The sample catalogue for the Iberian Peninsula covers the period from 1970 to 2001. The results point out that the seismic activity must be deduced not only from the past activity at the same region but also from its surrounding activity. The strongest temporal and spatial interactions for the catalogue used are around 3.3 years and 290×165 km², respectively; if a cell is inactive, it will remain inactive with high probability, while an active cell has around a 60% probability of remaining active in the future. The Probabilistic Seismic Hazard Map obtained marks the main seismic active
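The mutual information objective used to select β and τ can be illustrated for binary cell-activity patterns. A minimal sketch over a toy pair of activity snapshots (invented, not the Iberian catalogue):

```python
import math
from collections import Counter

def mutual_information(x, y):
    """I(X;Y) in bits between two equal-length binary sequences,
    estimated from the empirical joint distribution."""
    n = len(x)
    pxy = Counter(zip(x, y))
    px, py = Counter(x), Counter(y)
    mi = 0.0
    for (a, b), c in pxy.items():
        pj = c / n
        mi += pj * math.log2(pj / ((px[a] / n) * (py[b] / n)))
    return mi

# Activity of 8 cells at time t and at t + tau (1 = active)
now   = [1, 0, 1, 1, 0, 0, 1, 0]
later = [1, 0, 1, 0, 0, 0, 1, 0]
print(mutual_information(now, now))   # 1.0 bit: the pattern fully determines itself
print(mutual_information(now, later))
```

In the grid search described above, this quantity would be maximized over candidate cell sizes and time lags to pick the (β, τ) pair at which the past pattern is most informative about the future one.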

  20. Limitations of Cox Proportional Hazards Analysis in Mortality Prediction of Patients with Acute Coronary Syndrome

    Directory of Open Access Journals (Sweden)

    Babińska Magdalena

    2015-12-01

The aim of this study was to evaluate the possibility of incorrect assessment of mortality risk factors in a group of patients affected by acute coronary syndrome, due to the lack of hazard proportionality in the Cox regression model. One hundred and fifty consecutive patients with acute coronary syndrome (ACS), with no age limit, were enrolled. Univariable and multivariable Cox proportional hazard analyses were performed. The proportional hazard assumptions were verified using Schoenfeld residuals, the χ2 test and the rank correlation coefficient τ between residuals and time. In the total group of 150 patients, 33 (22.0%) deaths from any cause were registered over the follow-up period of 64 months. The non-survivors were significantly older and had an increased prevalence of diabetes and erythrocyturia, a longer history of coronary artery disease, higher concentrations of serum creatinine, cystatin C, uric acid, glucose, C-reactive protein (CRP), homocysteine and B-type natriuretic peptide (NT-proBNP), and lower concentrations of serum sodium. No significant differences in echocardiography parameters were observed between groups. The following factors were risk factors for death and fulfilled the proportional hazard assumption in the univariable model: smoking, occurrence of diabetes and anaemia, duration of coronary artery disease, and abnormal serum concentrations of uric acid, sodium, homocysteine, cystatin C and NT-proBNP; in the multivariable model, the risk factors for death were smoking and elevated concentrations of homocysteine and NT-proBNP. The study demonstrated that violation of the proportional hazard assumption in the Cox regression model may lead to a false model that does not include only time-independent predictive factors.
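The proportional-hazards check described, correlating Schoenfeld residuals with time, amounts to testing whether that correlation is zero. A minimal sketch with fabricated residuals; in practice the residuals come from a fitted Cox model (e.g. via the `lifelines` package's `CoxPHFitter.check_assumptions`), and a rank correlation such as Kendall's τ may be preferred:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Fabricated Schoenfeld residuals for one covariate, ordered by event time.
# A systematic trend of residuals over time signals a violated PH assumption.
event_times = [2, 5, 9, 14, 20, 31, 44, 60]
residuals   = [-0.8, -0.5, -0.4, -0.1, 0.0, 0.3, 0.5, 0.9]

r = pearson_r(event_times, residuals)
print(round(r, 2))  # strong positive trend: PH assumption is suspect here
```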

  1. Sensitivity analysis of seismic hazard for the northwestern portion of the state of Gujarat, India

    Science.gov (United States)

    Petersen, M.D.; Rastogi, B.K.; Schweig, E.S.; Harmsen, S.C.; Gomberg, J.S.

    2004-01-01

We test the sensitivity of seismic hazard to three fault source models for the northwestern portion of Gujarat, India. The models incorporate different characteristic earthquake magnitudes on three faults with individual recurrence intervals of either 800 or 1600 years. These recurrence intervals imply that large earthquakes occur on one of these faults every 266-533 years, similar to the rate of historic large earthquakes in this region during the past two centuries and for earthquakes in intraplate environments like the New Madrid region in the central United States. If one assumes a recurrence interval of 800 years for large earthquakes on each of three local faults, the peak ground accelerations (PGA; horizontal) and 1-Hz spectral acceleration ground motions (5% damping) are greater than 1 g over a broad region for a 2% probability of exceedance in 50 years' hazard level. These probabilistic PGAs at this hazard level are similar to median deterministic ground motions. The PGAs for 10% in 50 years' hazard level are considerably lower, generally ranging between 0.2 g and 0.7 g across northwestern Gujarat. Ground motions calculated from our models that consider fault interevent times of 800 years are considerably higher than other published models even though they imply similar recurrence intervals. These higher ground motions are mainly caused by the application of intraplate attenuation relations, which account for less severe attenuation of seismic waves when compared to the crustal interplate relations used in these previous studies. For sites in Bhuj and Ahmedabad, magnitude (M) 7 3/4 earthquakes contribute most to the PGA and the 0.2- and 1-s spectral acceleration ground motion maps at the two considered hazard levels. © 2004 Elsevier B.V. All rights reserved.
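The statement that an 800-year interval on each of three faults implies a large earthquake somewhere in the system every 266-533 years follows from adding the per-fault rates, assuming the faults behave independently:

```python
def combined_recurrence(intervals_years):
    """Combined recurrence interval for several independent sources:
    annual rates add, so T_combined = 1 / sum(1 / T_i)."""
    return 1.0 / sum(1.0 / t for t in intervals_years)

print(combined_recurrence([800, 800, 800]))     # 800/3, about 266.7 years
print(combined_recurrence([1600, 1600, 1600]))  # 1600/3, about 533.3 years
```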

  2. Risk prediction of Critical Infrastructures against extreme natural hazards: local and regional scale analysis

    Science.gov (United States)

    Rosato, Vittorio; Hounjet, Micheline; Burzel, Andreas; Di Pietro, Antonio; Tofani, Alberto; Pollino, Maurizio; Giovinazzi, Sonia

    2016-04-01

Natural hazard events can induce severe impacts on the built environment; they can hit wide and densely populated areas containing large numbers of (inter)dependent technological systems, whose damage can cause the failure or malfunctioning of further services, spreading the impacts over wider geographical areas. The EU project CIPRNet (Critical Infrastructures Preparedness and Resilience Research Network) is realizing an unprecedented Decision Support System (DSS) which makes it possible to operationally perform risk prediction on Critical Infrastructures (CI): predicting the occurrence of natural events (from long-term weather to short-term nowcast predictions), correlating the intrinsic vulnerabilities of CI elements with the strengths of the different events' manifestations, and analysing the resulting Damage Scenario. The Damage Scenario is then transformed into an Impact Scenario, in which damage to individual CI elements is translated into service outages at the micro (local area) or meso (regional) scale. At the smaller scale, the DSS simulates detailed city models (where CI dependencies are explicitly accounted for), which provide important input for crisis management organizations, whereas at the regional scale, using an approximate system-of-systems model describing systemic interactions, the focus is on raising awareness. The DSS has enabled the development of a novel simulation framework for predicting the shake maps originating from a given seismic event, considering shock wave propagation in inhomogeneous media and estimating the resulting building damage on the basis of a phenomenological vulnerability model [1, 2]. Moreover, for areas containing river basins, when abundant precipitation is expected, the DSS solves 1D/2D hydrodynamic models of the river basins to predict the runoff and the corresponding flood dynamics. This calculation allows the estimation of the Damage Scenario and triggers the evaluation of the Impact Scenario.

  3. Profitability Analysis of Soybean Oil Processes

    Directory of Open Access Journals (Sweden)

    Ming-Hsun Cheng

    2017-10-01

Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for actual operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry; they are profitable when the capacities of annual oil production are larger than 12 and 173 million kg, respectively. The solvent-free approach, known as the enzyme-assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.
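The NPV and payback-time benchmarks mentioned can be computed from a cash flow series. The cash flows and discount rate below are illustrative placeholders, not the paper's plant data:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 investment (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_years(cash_flows):
    """First year in which cumulative (undiscounted) cash flow turns non-negative."""
    cum = 0.0
    for t, cf in enumerate(cash_flows):
        cum += cf
        if cum >= 0:
            return t
    return None  # venture never pays back within the horizon

# Illustrative plant: $10M investment, $2.5M net cash flow for 8 years
flows = [-10_000_000] + [2_500_000] * 8

print(round(npv(0.10, flows)))  # positive NPV at a 10% interest rate
print(payback_years(flows))     # pays back in year 4
```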

  4. Profitability Analysis of Soybean Oil Processes.

    Science.gov (United States)

    Cheng, Ming-Hsun; Rosentrater, Kurt A

    2017-10-07

Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for actual operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry; they are profitable when the capacities of annual oil production are larger than 12 and 173 million kg, respectively. The solvent-free approach, known as the enzyme-assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  5. Flux Analysis in Process Models via Causality

    Directory of Open Access Journals (Sweden)

    Ozan Kahramanoğulları

    2010-02-01

We present an approach for flux analysis in process algebra models of biological systems. We perceive flux as the flow of resources in stochastic simulations. We resort to an established correspondence between event structures, a broadly recognised model of concurrency, and the state transitions of process models, seen as Petri nets. We show that in this way we can extract the causal resource dependencies between individual state transitions in simulations as partial orders of events. We propose transformations on the partial orders that provide means for further analysis, and introduce a software tool which implements these ideas. By means of an example of a published model of the Rho GTP-binding proteins, we argue that this approach can provide a substitute for flux analysis techniques on ordinary differential equation models within the stochastic setting of process algebras.
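The causal dependency extraction described, where event B depends on event A when B consumes a resource A produced, can be sketched over a toy simulation trace. The event and resource names below are invented for illustration, and the sketch deliberately simplifies (it tracks only the latest producer of each resource, not individual resource instances):

```python
def causal_order(trace):
    """trace: list of (event_name, consumed, produced) in simulation order.
    Returns the set of causal pairs (i, j): event j consumed a resource
    most recently produced by event i."""
    producer_of = {}   # resource -> index of latest producing event
    depends = set()
    for j, (_, consumed, produced) in enumerate(trace):
        for res in consumed:
            if res in producer_of:
                depends.add((producer_of[res], j))
        for res in produced:
            producer_of[res] = j
    return depends

# Toy trace loosely inspired by a GTP-binding cycle (invented names)
trace = [
    ("bind_GTP",   [],          ["Rho_GTP"]),
    ("effector",   ["Rho_GTP"], ["signal"]),
    ("hydrolysis", ["Rho_GTP"], ["Rho_GDP"]),
]
print(sorted(causal_order(trace)))  # [(0, 1), (0, 2)]
```

The resulting set of pairs is the partial order of events; transitive reduction or projection onto particular resources would correspond to the transformations the paper proposes for further analysis.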

  6. Systemic analysis of the caulking assembly process

    Directory of Open Access Journals (Sweden)

    Rodean Claudiu

    2017-01-01

The present paper highlights the importance of the caulking process, which is currently little studied in comparison with its growing use in the automotive industry. Because the caulking operation is used in safety-critical domains such as shock absorbers and brake systems, this paper details the parameters that characterize the process, viewed as input and output data, together with the requirements for the final product. The paper presents the measurement methods currently used to analyse the performance of the caulking assembly. These parameters lead to a performance analysis algorithm for the caulking process, which is then applied in an experimental study. This study provides a basis for further research aimed at optimizing the process.

  7. Hazard Identification and Risk Assessment of Health and Safety Approach JSA (Job Safety Analysis) in Plantation Company

    Science.gov (United States)

    Sugarindra, Muchamad; Ragil Suryoputro, Muhammad; Tiya Novitasari, Adi

    2017-06-01

Plantation companies need to identify hazards and perform risk assessments for occupational health and safety, approached here using JSA (Job Safety Analysis). The identification aimed to find the potential hazards that could lead to workplace accidents, so that preventive action could be taken to minimize them. Data were collected by direct observation of the workers concerned, and the results were recorded on a Job Safety Analysis form. The jobs analysed were forklift operator, macerator worker, creeper worker, shredder worker, workshop worker, mechanical line worker, trolley cleaning worker and crepe decline worker. The results showed that the shredder worker value was 30 and