WorldWideScience

Sample records for final hazard analysis

  1. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews the NASA hazard analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts.

  2. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  3. K Basin Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  4. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  5. K Basins Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WEBB, R.H.

    1999-12-29

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062, Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  6. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  7. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure, and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective seismic-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
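
    A short sketch can illustrate the slip-weighted Green's function summation described above; the array names, shapes, and numbers are assumptions for illustration only, not the authors' code.

```python
import numpy as np

# Illustrative Green's-function summation: the tsunami waveform at a coastal
# point for an arbitrary slip distribution is approximated as the slip-weighted
# sum of precomputed unit-slip subfault waveforms (linear superposition).

def synthesize_waveform(unit_waveforms: np.ndarray, slip: np.ndarray) -> np.ndarray:
    """Combine precomputed subfault waveforms (n_subfaults x n_samples), each
    computed for 1 m of slip, into the waveform for the given slip vector."""
    return slip @ unit_waveforms

# Hypothetical example: 3 subfaults, 5 time samples of sea-surface height (m)
unit_waveforms = np.array([
    [0.00, 0.05, 0.20, 0.10, 0.02],
    [0.00, 0.02, 0.15, 0.25, 0.05],
    [0.00, 0.01, 0.05, 0.10, 0.20],
])
slip = np.array([2.0, 4.0, 1.0])  # slip (m) on each subfault for one scenario
print(synthesize_waveform(unit_waveforms, slip))
```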

  8. Canister storage building hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    POWERS, T.B.

    1999-05-11

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis was performed in accordance with the DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports", and meets the intent of HNF-PRO-704, "Hazard and Accident Analysis Process". This hazard analysis implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports".

  9. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM rule, the reader should refer to the handbook, "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  10. Canister storage building hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  11. Fact Sheet About the Hazardous Waste Generator Improvements Final Rule

    Science.gov (United States)

    On October 28, 2016, EPA finalized a rule that revises the hazardous waste generator regulations by making them easier to understand and providing greater flexibility in how hazardous waste is managed to better fit today's business operations.

  12. Final Safety Analysis Addenda to Hazards Summary Report, Experimental Breeder Reactor II (EBR-II): upgrading of plant protection system. Volume II

    Energy Technology Data Exchange (ETDEWEB)

    Allen, N. L.; Keeton, J. M.; Sackett, J. I. [comps.]

    1980-06-01

    This report is the second in a series of compilations of the formal Final Safety Analysis Addenda (FSAAs) to the EBR-II Hazard Summary Report and Addendum. Sections 2 and 3 are edited versions of the original FSAAs prepared in support of certain modifications to the reactor-shutdown-system portion of the EBR-II plant-protection system. Section 4 is an edited version of the original FSAA prepared in support of certain modifications to a system classified as an engineered safety feature. These sections describe the pre- and postmodification system, the rationale for the modification, and required supporting safety analysis. Section 5 provides an updated description and analysis of the EBR-II emergency power system. Section 6 summarizes all significant modifications to the EBR-II plant-protection system to date.

  13. Final Hazard Categorization and Auditable Safety Analysis for the Remediation of the 118-D-1, 118-D-2, 118-D-3, 118-H-1, 118-H-2 and 118-H-3 Solid Waste Burial Grounds

    Energy Technology Data Exchange (ETDEWEB)

    T. J. Rodovsky

    2006-03-01

    This report presents the initial hazard categorization, final hazard categorization and auditable safety analysis for the remediation of the 118-D-1, 118-D-2, and 118-D-3 Burial Grounds located within the 100-D/DR Area of the Hanford Site and the 118-H-1, 118-H-2, and 118-H-3 Burial Grounds located within the 100-H Area of the Hanford Site.

  14. MGR External Events Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  15. Frequent Questions about the Hazardous Waste Generator Improvements Final Rule

    Science.gov (United States)

    FAQs including: What are the benefits of these revisions to the generator regulations? What changed in the final regulations since proposal? How and why will the hazardous waste generator regulations be reorganized? When will this rule become effective?

  16. Probabilistic analysis of tsunami hazards

    Science.gov (United States)

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
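
    For the site-specific, empirically based case described above, a minimal sketch of an empirical exceedance-rate curve built from a runup catalog follows; the catalog values and the assumed completeness period are invented for illustration.

```python
# Illustrative empirical tsunami hazard curve: the annual rate of exceeding a
# runup level is estimated as the number of catalog events exceeding that level
# divided by the catalog length. All values below are invented.

catalog_runups_m = [0.3, 0.5, 0.8, 1.2, 2.5, 4.0]  # observed runups at the site
catalog_years = 100.0                              # assumed complete catalog span

def exceedance_rate(threshold_m: float) -> float:
    """Annual rate at which runup exceeds threshold_m, from the catalog."""
    exceedances = sum(1 for r in catalog_runups_m if r > threshold_m)
    return exceedances / catalog_years

for h in (0.5, 1.0, 2.0, 3.0):
    print(f"runup > {h:.1f} m: {exceedance_rate(h):.3f} per year")
```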

  17. Hazardous Waste Sites not making the final EPA National Priority List of Hazardous Waste Sites

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — These are sites from the EPA CERCLIS list that are not final National-Priority-List Hazardous Waste sites. The data was obtained from EPA's LandView CDs.

  18. Hazardous Materials Hazard Analysis, Portland, Oregon.

    Science.gov (United States)

    1981-06-01

    [Abstract not recoverable: the record text consists of garbled OCR fragments of Oregon hazardous-materials accident statistics for 1976-1979 (injury and fatality rates per 100 million miles) and truck-accident narratives.]

  19. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    Energy Technology Data Exchange (ETDEWEB)

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) The occurrence of a fire related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees. (3) Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  20. 75 FR 51678 - Hazardous Waste Management System; Identification and Listing of Hazardous Waste; Final Exclusion

    Science.gov (United States)

    2010-08-23

    ...; Final Exclusion AGENCY: Environmental Protection Agency. ACTION: Final rule. SUMMARY: Environmental... Software (DRAS), EPA has concluded that the petitioned waste is not hazardous waste. This exclusion applies.... What are the limits of this exclusion? D. How will OxyChem manage the waste if it is delisted? E....

  1. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  2. The Integrated Hazard Analysis Integrator

    Science.gov (United States)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort in selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and

  3. 77 FR 65314 - Missouri: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2012-10-26

    ... AGENCY 40 CFR Part 271 Missouri: Final Authorization of State Hazardous Waste Management Program Revisions AGENCY: Environmental Protection Agency (EPA). ACTION: Direct final rule. SUMMARY: The Solid Waste..., Missouri received final authorization to implement its hazardous waste management program effective...

  4. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standards DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to performance of the Final Safety Analysis Report performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  5. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of hazard modeling distributions that approximate different distributions.

  6. Cold Vacuum Drying (CVD) Facility Hazards Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    CROWE, R.D.

    2000-08-07

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) Hazard Analysis to support the CVDF Final Safety Analysis Report and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports," and implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports."

  7. 14 CFR 437.29 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with §...

  8. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  9. 75 FR 51671 - Hazardous Waste Management System; Identification and Listing of Hazardous Waste; Final Exclusion

    Science.gov (United States)

    2010-08-23

    ... exclude (or delist) a wastewater treatment plant (WWTP) sludge filter cake (called sludge hereinafter... to the petition submitted by Tokusen, to delist the WWTP sludge. After careful analysis and use of... waste. This exclusion applies to 2,000 cubic yards per year of the WWTP sludge with Hazardous Waste...

  10. 78 FR 32161 - Oklahoma: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2013-05-29

    ... AGENCY 40 CFR Part 271 Oklahoma: Final Authorization of State Hazardous Waste Management Program Revision... applied to the EPA for Final authorization of the changes to its hazardous waste program under the.... Therefore, we grant Oklahoma Final authorization to operate its hazardous waste program with the changes...

  11. 77 FR 60919 - Tennessee: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2012-10-05

    ... AGENCY 40 CFR Part 271 Tennessee: Final Authorization of State Hazardous Waste Management Program... has applied to EPA for final authorization of the changes to its hazardous waste program under the... Tennessee final authorization to operate its hazardous waste program with the changes described in the...

  12. 78 FR 35766 - North Carolina: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2013-06-14

    ... AGENCY 40 CFR Part 271 North Carolina: Final Authorization of State Hazardous Waste Management Program... Carolina has applied to EPA for final authorization of changes to its hazardous waste program under the... final complete program revision application, seeking authorization of changes to its hazardous waste...

  13. 77 FR 69788 - Colorado: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2012-11-21

    ... AGENCY 40 CFR Part 271 Colorado: Final Authorization of State Hazardous Waste Management Program... applied to the EPA for final authorization of changes to its hazardous waste program under the Resource Conservation and Recovery Act (RCRA). The EPA proposes to grant final authorization to the hazardous waste...

  14. 76 FR 37021 - Louisiana: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2011-06-24

    ... AGENCY 40 CFR Part 271 Louisiana: Final Authorization of State Hazardous Waste Management Program... has applied to the EPA for final authorization of the changes to its hazardous waste program under the... opportunity to apply for final authorization to operate all aspects of their hazardous waste management...

  15. 77 FR 15273 - Oklahoma: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2012-03-15

    ... AGENCY 40 CFR Part 271 Oklahoma: Final Authorization of State Hazardous Waste Management Program Revision... applied to the EPA for Final authorization of the changes to its hazardous waste program under the... established by RCRA. Therefore, we grant Oklahoma Final authorization to operate its hazardous waste program...

  16. 77 FR 13200 - Texas: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2012-03-06

    ... AGENCY 40 CFR Part 271 Texas: Final Authorization of State Hazardous Waste Management Program Revision... has applied to the EPA for Final authorization of the changes to its hazardous waste program under the... established by RCRA. Therefore, we grant the State of Texas Final Authorization to operate its hazardous waste...

  17. 78 FR 25579 - Georgia: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2013-05-02

    ...-.07(1). Treatment Exemptions for 10/04/05......... Hazardous Waste Mixtures (``Headworks exemptions... AGENCY 40 CFR Part 271 Georgia: Final Authorization of State Hazardous Waste Management Program Revisions... to EPA for final authorization of changes to its hazardous waste program under the Resource...

  18. 14 CFR 437.55 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.55 Section 437.55 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee...

  19. Cold Vacuum Drying Facility hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  20. 75 FR 50932 - Massachusetts: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2010-08-18

    ... AGENCY 40 CFR Part 271 Massachusetts: Final Authorization of State Hazardous Waste Management Program Revisions AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: The Commonwealth of Massachusetts applied to EPA for final authorization of certain changes to its hazardous waste program under...

  1. 75 FR 43478 - Rhode Island: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2010-07-26

    ... AGENCY 40 CFR Part 271 Rhode Island: Final Authorization of State Hazardous Waste Management Program... Island has applied to EPA for final authorization of changes to its hazardous waste program under the Resource Conservation and Recovery Act (RCRA). EPA proposes to grant final authorization to Rhode Island...

  2. 77 FR 38566 - Louisiana: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2012-06-28

    ... AGENCY 40 CFR Part 271 Louisiana: Final Authorization of State Hazardous Waste Management Program... Louisiana has applied to EPA for Final authorization of the changes to its hazardous waste program under the Resource Conservation and Recovery Act (RCRA). EPA proposes to grant Final authorization to the State of...

  3. 77 FR 15343 - Oklahoma: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2012-03-15

    ... AGENCY 40 CFR Part 271 Oklahoma: Final Authorization of State Hazardous Waste Management Program... Oklahoma has applied to EPA for Final authorization of the changes to its hazardous waste program under the Resource Conservation and Recovery Act (RCRA). EPA proposes to grant Final authorization to the State of...

  4. 76 FR 6564 - Florida: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2011-02-07

    ... AGENCY 40 CFR Part 271 Florida: Final Authorization of State Hazardous Waste Management Program Revisions AGENCY: Environmental Protection Agency (EPA). ACTION: Immediate final rule. SUMMARY: Florida has applied to EPA for final authorization of the changes to its hazardous waste program under the Resource...

  5. 75 FR 58328 - Nebraska: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2010-09-24

    ... AGENCY 40 CFR Part 271 Nebraska: Final Authorization of State Hazardous Waste Management Program Revisions AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: The Solid Waste... final authorization for these revisions to its Federally-authorized hazardous waste program, along with...

  6. 77 FR 47797 - Arkansas: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2012-08-10

    ... AGENCY 40 CFR Part 271 Arkansas: Final Authorization of State Hazardous Waste Management Program... Arkansas has applied to EPA for Final authorization of the changes to its hazardous waste program under the Resource Conservation and Recovery Act (RCRA). EPA proposes to grant Final authorization to the State of...

  7. 76 FR 37048 - Louisiana; Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2011-06-24

    ... AGENCY 40 CFR Part 271 Louisiana; Final Authorization of State Hazardous Waste Management Program... Louisiana has applied to EPA for Final authorization of the changes to its hazardous waste program under the Resource Conservation and Recovery Act (RCRA). EPA proposes to grant Final authorization to the State of...

  8. 77 FR 38530 - Louisiana: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2012-06-28

    ... AGENCY 40 CFR Part 271 Louisiana: Final Authorization of State Hazardous Waste Management Program Revision AGENCY: Environmental Protection Agency (EPA). ACTION: Immediate final rule. SUMMARY: Louisiana has applied to the EPA for final authorization of the changes to its hazardous waste program under the...

  9. 78 FR 15338 - New York: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2013-03-11

    ... AGENCY 40 CFR Part 271 New York: Final Authorization of State Hazardous Waste Management Program... applied to EPA for final authorization of changes to its hazardous waste program under the Solid Waste... proposes to grant final authorization to New York for these changes, with limited exceptions. EPA has...

  10. 77 FR 69765 - Colorado: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2012-11-21

    ... AGENCY 40 CFR Part 271 Colorado: Final Authorization of State Hazardous Waste Management Program Revisions AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: The Solid Waste... established by RCRA. Therefore, we grant Colorado Final Authorization to operate its hazardous waste program...

  11. 78 FR 70255 - West Virginia: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2013-11-25

    ... AGENCY 40 CFR Part 271 West Virginia: Final Authorization of State Hazardous Waste Management Program... applied to EPA for final authorization of revisions to its hazardous waste program under the Resource Conservation and Recovery Act (RCRA). EPA proposes to grant final authorization to West Virginia. In the...

  12. 75 FR 35720 - Massachusetts: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2010-06-23

    ... AGENCY 40 CFR Part 271 Massachusetts: Final Authorization of State Hazardous Waste Management Program... Massachusetts has applied to EPA for final authorization of changes to its hazardous waste program under the Resource Conservation and Recovery Act (RCRA). EPA proposes to grant final authorization to Massachusetts...

  13. Final Report: Seismic Hazard Assessment at the PGDP

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhinmeng [KY Geological Survey, Univ of KY]

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it not only depends on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) difficulty in characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.

  14. Final Hazard Categorization for the Remediation of the 116-C-3 Chemical Waste Tanks

    Energy Technology Data Exchange (ETDEWEB)

    T. M. Blakley; W. D. Schofield

    2007-09-10

    This final hazard categorization (FHC) document examines the hazards, identifies appropriate controls to manage the hazards, and documents the commitments for the 116-C-3 Chemical Waste Tanks Remediation Project. The remediation activities analyzed in this FHC are based on recommended treatment and disposal alternatives described in the Engineering Evaluation for the Remediation to the 116-C-3 Chemical Waste Tanks (BHI 2005e).

  15. 75 FR 9345 - Michigan: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2010-03-02

    ... necessary to assure that all hazardous waste generated is designated for treatment, storage, or disposal in...'' enclosed treatment facility''. deleted and words ``of a hazardous waste'' added. MAC R 299.9108(k) 6/21... AGENCY 40 CFR Part 271 Michigan: Final Authorization of State Hazardous Waste Management Program Revision...

  16. Phase 2 fire hazard analysis for the canister storage building

    Energy Technology Data Exchange (ETDEWEB)

    Sadanaga, C.T., Westinghouse Hanford

    1996-07-01

    The fire hazard analysis assesses the risk from fire in a facility to ascertain whether the fire protection policies are met. This document provides a preliminary FHA for the CSB facility. Open items have been noted in the document. A final FHA will be required at the completion of definitive design, prior to operation of the facility.

  17. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  18. Assessment of technologies for hazardous waste site remediation: Non-treatment technologies and pilot scale facility implementation -- excavation -- storage technology -- safety analysis and review statement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, H.R.; Overbey, W.K. Jr.; Koperna, G.J. Jr.

    1994-02-01

    The purpose of this study is to assess the state-of-the-art of excavation technology as related to environmental remediation applications. A further purpose is to determine which of the excavation technologies reviewed could be used by the US Corps of Engineers in remediating contaminated soil to be excavated in the near future for construction of a new Lock and Dam at Winfield, WV. The study is designed to identify excavation methodologies and equipment which can be used at any environmental remediation site but more specifically at the Winfield site on the Kanawha River in Putnam County, West Virginia. A technical approach was determined whereby a functional analysis was prepared to determine the functions to be conducted during the excavation phase of the remediation operations. A number of excavation technologies were identified from the literature. A set of screening criteria was developed that would examine the utility and ranking of the technologies with respect to the operations that needed to be conducted at the Winfield site. These criteria were performance, reliability, implementability, environmental safety, public health, and legal and regulatory compliance. The Loose Bulk excavation technology was ranked as the best technology applicable to the Winfield site. The literature was also examined to determine the success of various methods of controlling fugitive dust. Depending upon any changes in the results of chemical analyses, or prior remediation of the VOCs from the vadose zone, consideration should be given to testing a new "Pneumatic Excavator" which removes the VOCs liberated during the excavation process as they outgas from the soil. This equipment, however, would not be needed on locations with low levels of VOC emissions.

  19. Schedulability Analysis for Java Finalizers

    DEFF Research Database (Denmark)

    Bøgholm, Thomas; Hansen, Rene Rydhof; Ravn, Anders P.;

    2010-01-01

    Java finalizers perform clean-up and finalisation of objects at garbage collection time. In real-time Java profiles the use of finalizers is either discouraged (RTSJ, Ravenscar Java) or even disallowed (JSR-302), mainly because of the unpredictability of finalizers and in particular their impact on the schedulability analysis. In this paper we show that a controlled scoped memory model results in a structured and predictable execution of finalizers, more reminiscent of C++ destructors than Java finalizers. Furthermore, we incorporate finalizers into a (conservative) schedulability analysis for Predictable Java programs. Finally, we extend the SARTS tool for automated schedulability analysis of Java bytecode programs to handle finalizers in a fully automated way.

  20. Frequent Questions about the Hazardous Waste Export-Import Revisions Final Rule

    Science.gov (United States)

    Answers questions such as: What new requirements did EPA finalize in the Hazardous Waste Export-Import Revisions Final Rule? Why did EPA implement these changes now? What are the benefits of the final rule? What are the compliance dates for the final rule?

  1. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  2. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  3. Repository Subsurface Preliminary Fire Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features, fire, and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c) which was prepared in accordance with AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities"; Attachment 4 of AP-ESH-008, "Hazards Analysis System"; and AP-3.11Q, "Technical Reports". The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  4. 40 CFR 68.67 - Process hazard analysis.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 15 2010-07-01 2010-07-01 false Process hazard analysis. 68.67 Section...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a) The owner or operator shall perform an initial process hazard analysis (hazard evaluation)...

  5. Schedulability Analysis for Java Finalizers

    DEFF Research Database (Denmark)

    Bøgholm, Thomas; Hansen, Rene Rydhof; Søndergaard, Hans

    2010-01-01

    Java finalizers perform clean-up and finalisation of objects at garbage collection time. In real-time Java profiles the use of finalizers is either discouraged (RTSJ, Ravenscar Java) or even disallowed (JSR-302), mainly because of the unpredictability of finalizers and in particular their impact on the schedulability analysis. In this paper we show that a controlled scoped memory model results in a structured and predictable execution of finalizers, more reminiscent of C++ destructors than Java finalizers. Furthermore, we incorporate finalizers into a (conservative) schedulability analysis for Predictable Java programs. Finally, we extend the SARTS tool for automated schedulability analysis of Java bytecode programs to handle finalizers in a fully automated way.

  6. Evaluating Ecological Risk to Invertebrate Receptors from PAHs in Sediments at Hazardous Waste Sites (Final Report)

    Science.gov (United States)

    EPA's Ecological Risk Assessment Support Center (ERASC) announced the release of the final report, Evaluating Ecological Risk to Invertebrate Receptors from PAHs in Sediments at Hazardous Waste Sites. The report provides an overview of an approach for assessing risk to ...

  8. Probabilistic earthquake hazard analysis for Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt and the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo has great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. The logic tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the eastern zone's districts (e.g., El Nozha) and the lowest values in the northern and western zone's districts (e.g., El Sharabiya and El Khalifa).
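
    A minimal sketch of the logic-tree treatment mentioned in this record, in which hazard curves computed for alternative source models and ground motion prediction equations are averaged with branch weights, follows; all curves and weights are invented for illustration.

```python
import numpy as np

# Illustrative logic-tree combination: the mean hazard curve is the weighted
# average of the hazard curves from each (source model, GMPE) branch.
# All rates and weights below are invented.

pga_levels_g = np.array([0.05, 0.10, 0.20, 0.30])       # ground-motion levels
branch_curves = {                                       # annual exceedance rates
    "modelA_gmpe1": np.array([2e-2, 8e-3, 2e-3, 6e-4]),
    "modelA_gmpe2": np.array([3e-2, 1e-2, 3e-3, 9e-4]),
    "modelB_gmpe1": np.array([1e-2, 5e-3, 1e-3, 3e-4]),
}
branch_weights = {"modelA_gmpe1": 0.4, "modelA_gmpe2": 0.4, "modelB_gmpe1": 0.2}

mean_curve = sum(branch_weights[name] * curve for name, curve in branch_curves.items())
for pga, rate in zip(pga_levels_g, mean_curve):
    print(f"PGA {pga:.2f} g exceeded at {rate:.1e} per year")
```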

  9. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. Distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen’s seismic hazard are the events from the West Arabian Shield seismic zone.
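
    The Cairo and Yemen records above quote hazard at fixed probabilities of exceedance over a 50-year exposure and at the corresponding return periods. Under the usual Poisson assumption these are related by T = -t / ln(1 - P); the sketch below reproduces the commonly cited 475-year figure and is an illustration of that standard conversion, not code from either study.

```python
import math

def return_period(p_exceed: float, exposure_years: float) -> float:
    """Return period implied by a probability of exceedance over an exposure
    time, assuming Poisson (memoryless) occurrence of exceedances."""
    return -exposure_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))  # ~475 years for 10% in 50 years
print(round(return_period(0.50, 50)))  # ~72 years for 50% in 50 years
```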

  10. Fire hazard analysis for fusion energy experiments

    Energy Technology Data Exchange (ETDEWEB)

    Alvares, N.J.; Hasegawa, H.K.

    1979-01-01

    The 2XIIB mirror fusion facility at Lawrence Livermore Laboratory (LLL) was used to evaluate the fire safety of state-of-the-art fusion energy experiments. The primary objective of this evaluation was to ensure the parallel development of fire safety and fusion energy technology. Through fault-tree analysis, we obtained a detailed engineering description of the 2XIIB fire protection system. This information helped us establish an optimum level of fire protection for experimental fusion energy facilities as well as evaluate the level of protection provided by various systems. Concurrently, we analyzed the fire hazard inherent to the facility using techniques that relate the probability of ignition to the flame spread and heat-release potential of construction materials, electrical and thermal insulations, and dielectric fluids. A comparison of the results of both analyses revealed that the existing fire protection system should be modified to accommodate the range of fire hazards inherent to the 2XIIB facility.
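
    The fault-tree analysis mentioned in this record can be illustrated with elementary gate arithmetic for independent basic events; the gates, events, and probabilities below are invented and are not drawn from the 2XIIB study.

```python
# Illustrative fault-tree arithmetic for independent basic events: an AND gate
# multiplies probabilities, an OR gate combines them as 1 minus the product of
# the complements. All probabilities are invented.

def p_and(*probs: float) -> float:
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs: float) -> float:
    out = 1.0
    for p in probs:
        out *= 1.0 - p
    return 1.0 - out

p_ignition = p_or(1e-3, 5e-4)            # e.g., electrical fault OR hot work
p_suppression_fails = p_and(1e-2, 5e-2)  # detection fails AND sprinklers fail
p_uncontrolled_fire = p_and(p_ignition, p_suppression_fails)
print(f"top event (uncontrolled fire): {p_uncontrolled_fire:.2e}")
```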

  11. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2... REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a)...

  12. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods; for example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize the hazard scenarios to which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
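
    To make the prioritization idea above concrete, the following minimal Python sketch ranks hazard scenarios by the three metrics named in the abstract (severity, likelihood, and modeling difficulty). The ordinal scales, the scoring rule, and the example scenarios are illustrative assumptions, not the authors' actual framework.

    ```python
    # Illustrative sketch (not the authors' tool): rank hazard scenarios for
    # quantitative analysis using the three metrics named in the abstract.
    # The ordinal scales and the scoring rule are assumptions for illustration.

    from dataclasses import dataclass

    @dataclass
    class Scenario:
        name: str
        severity: int             # 1 (minor) .. 5 (catastrophic), assumed scale
        likelihood: int           # 1 (improbable) .. 5 (frequent), assumed scale
        modeling_difficulty: int  # 1 (easy to model) .. 5 (very hard), assumed scale

    def priority(s: Scenario) -> float:
        # Favor severe, likely scenarios that are still tractable to model.
        return s.severity * s.likelihood / s.modeling_difficulty

    scenarios = [
        Scenario("wake encounter on parallel approach", 4, 3, 2),
        Scenario("simultaneous missed approaches", 5, 2, 4),
        Scenario("runway incursion", 5, 1, 1),
    ]

    for s in sorted(scenarios, key=priority, reverse=True):
        print(f"{s.name}: priority score {priority(s):.1f}")
    ```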

  13. Results of the probabilistic volcanic hazard analysis project

    Energy Technology Data Exchange (ETDEWEB)

    Youngs, R.; Coppersmith, K.J.; Perman, R.C. [Geomatrix Consultants, Inc., San Francisco, CA (United States)

    1996-12-01

    The Probabilistic Volcanic Hazard Analysis (PVHA) project, sponsored by the U.S. Department of Energy (DOE), has been conducted to assess the probability of a future volcanic event disrupting the potential repository at Yucca Mountain. The methodology for the PVHA project is summarized in Coppersmith and others (this volume). The judgments of ten earth scientists who were members of an expert panel were elicited to ensure that a wide range of approaches were considered. Each expert identified one or more approaches for assessing the hazard and they quantified their uncertainties in models and parameter values. Aggregated results are expressed as a probability distribution on the annual frequency of intersecting the proposed repository block. This paper presents some of the key results of the PVHA assessments. These results are preliminary; the final report for the study is planned to be submitted to DOE in April 1996.

  14. Final Hazard Categorization for the Remediation of Six 300-FF-2 Operable Unit Solid Waste Burial Grounds

    Energy Technology Data Exchange (ETDEWEB)

    J. D. Ludowise

    2006-12-12

    This report provides the final hazard categorization (FHC) for the remediation of six solid waste disposal sites (referred to as burial grounds) located in the 300-FF-2 Operable Unit (OU) on the Hanford Site. These six sites (618-1, 618-2, 618-3, 618-7, 618-8, and 618-13 Burial Grounds) were determined to have a total radionuclide inventory (WCH 2005a, WCH 2005d, WCH 2005e and WCH 2006b) that exceeds the DOE-STD-1027 Category 3 threshold quantity (DOE 1997) and are the subject of this analysis. This FHC document examines the hazards, identifies appropriate controls to manage the hazards, and documents the FHC and commitments for the 300-FF-2 Burial Grounds Remediation Project.

  15. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    Science.gov (United States)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  16. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-03-01

    When training for a job in which human error has the potential of producing catastrophic results, an understanding of the hazards that may be encountered is of paramount importance. In high consequence activities, it is important that the training program be conducted in a safe environment and yet emphasize the potential hazards. Because of the high consequence of a human error the use of a high-fidelity simulation is of great importance to provide the safe environment the worker needs to learn and hone required skills. A hazards analysis identifies the operation hazards, potential human error, and associated positive measures that aid in the mitigation or prevention of the hazard. The information gained from the hazards analysis should be used in the development of training. This paper will discuss the integration of information from the hazards analysis into the development of simulation components of a training program.

  17. Hazardous materials transportation: a risk-analysis-based routing methodology.

    Science.gov (United States)

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; arc costs are then defined to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed and, finally, further research developments are proposed.
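
    As an illustration of the 'minimum cost flow problem' formulation described above, the short Python sketch below solves a toy instance with networkx. The network, arc costs, and capacities are invented; in the paper the arc costs combine out-of-pocket and risk-related costs, the capacities come from comparing arc risk measures with transportation risk criteria, and the actual computation is done in the OPTIPATH code.

    ```python
    # Toy minimum cost flow instance (invented network, costs, and capacities).
    import networkx as nx

    G = nx.DiGraph()
    # Negative demand = supply at the origin, positive demand = requirement at destination.
    G.add_node("origin", demand=-10)       # 10 truck shipments to route
    G.add_node("destination", demand=10)

    # Arcs: weight = cost per vehicle (expenses + monetised risk), capacity = max vehicles
    G.add_edge("origin", "A", weight=4, capacity=6)
    G.add_edge("origin", "B", weight=6, capacity=10)
    G.add_edge("A", "destination", weight=3, capacity=6)
    G.add_edge("B", "destination", weight=2, capacity=10)

    flow = nx.min_cost_flow(G)
    print(flow)                         # e.g. {'origin': {'A': 6, 'B': 4}, ...}
    print(nx.cost_of_flow(G, flow))     # total cost of the optimal routing
    ```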

  18. Fire hazards analysis of transuranic waste storage and assay facility

    Energy Technology Data Exchange (ETDEWEB)

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  19. Total system hazards analysis for the western area demilitarization facility

    Science.gov (United States)

    Pape, R.; Mniszewski, K.; Swider, E.

    1984-08-01

    The results of a hazards analysis of the Western Area Demilitarization facility (WADF) at Hawthorne, Nevada are summarized. An overview of the WADF systems, the hazards analysis methodology that was applied, a general discussion of the fault tree analysis results, and a compilation of the conclusions and recommendations for each area of the facility are given.

  20. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-12-01

    A hazards analysis identifies the operation hazards and the positive measures that aid in the mitigation or prevention of the hazard. If the tasks are human intensive, the hazard analysis often credits the personnel training as contributing to the mitigation of the accident's consequence or prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase (analysis, design, development, implementation, and evaluation) model for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground, and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and of the institutional training program, can provide valuable information to the hazards analysis effort. This paper will discuss the integration of the information from the hazards analysis into a training program. The paper will use the installation and removal of a piece of tooling that is used in a high-explosive operation. This example will be used to follow the systematic development of a training program and demonstrate the interaction and cooperation between the hazards analysis and training program.

  1. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    Science.gov (United States)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
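
    The empirical hazard-curve step described above can be sketched in a few lines of Python: given a set of simulated events, each with an annual occurrence rate and a simulated intensity measure (here inundation depth) at the site, the mean annual rate of exceedance is the sum of the rates of all events exceeding a threshold. The event list and rates below are invented for illustration and do not reproduce the authors' Bayesian fitting.

    ```python
    # Empirical tsunami hazard curve from a small, invented stochastic event set.
    import numpy as np

    rates  = np.array([1e-3, 5e-4, 2e-4, 1e-4, 5e-5])   # annual rates of scenario events
    depths = np.array([0.5,  1.2,  2.0,  3.5,  6.0])    # simulated depth (m) at the site

    thresholds = np.linspace(0.0, 6.0, 13)
    exceedance_rate = np.array([rates[depths > x].sum() for x in thresholds])

    # Convert to probability of exceedance in a 50-year window (Poisson assumption)
    poe_50yr = 1.0 - np.exp(-exceedance_rate * 50.0)

    for x, lam, p in zip(thresholds, exceedance_rate, poe_50yr):
        print(f"depth > {x:4.1f} m: rate {lam:.2e}/yr, P(50 yr) = {p:.4f}")
    ```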

  2. 327 Building fire hazards analysis implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    Eggen, C.D.

    1998-09-16

    In March 1998, the 327 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the US Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (B and WHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in five areas and provided nine recommendations (11 items) to bring the 327 Building into compliance. To date, actions for five of the 11 items have been completed. Exemption requests will be transmitted to DOE-RL for two of the items. Corrective actions have been identified for the remaining four items. The completed actions address combustible loading requirements associated with the operation of the cells and support areas. The status of the recommendations and actions was confirmed during the July 1998 Fire Protection Assessment. B and WHC will use this Implementation Plan to bring the 327 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7.

  3. 14 CFR 417.223 - Flight hazard area analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard...

  4. Seismic hazard methodology for the Central and Eastern United States. Volume 1: methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, R.K.; Veneziano, D.; Toro, G.; O' Hara, T.; Drake, L.; Patwardhan, A.; Kulkarni, R.; Kenney, R.; Winkler, R.; Coppersmith, K.

    1986-07-01

    A methodology to estimate the hazard of earthquake ground motion at a site has been developed. The methodology consists of systematic procedures to characterize earthquake sources, the seismicity parameters of those sources, and functions for the attenuation of seismic energy, incorporating multiple input interpretations by earth scientists. Uncertainties reflecting permissible alternative interpretations are quantified by use of probability logic trees and are propagated through the hazard results. The methodology is flexible and permits, for example, interpretations of seismic sources that are consistent with earth-science practice in the need to depict complexity and to accommodate alternative hypotheses. This flexibility is achieved by means of a tectonic framework interpretation from which alternative seismic sources are derived. To estimate rates of earthquake recurrence, maximum use is made of the historical earthquake database in establishing a uniform measure of earthquake size, in identifying independent events, and in determining the completeness of the earthquake record in time, space, and magnitude. Procedures developed as part of the methodology permit relaxation of the usual assumption of homogeneous seismicity within a source and provide unbiased estimates of recurrence parameters. The methodology incorporates the Poisson-exponential earthquake recurrence model and an extensive assessment of its applicability is provided. Finally, the methodology includes procedures to aggregate hazard results from a number of separate input interpretations to obtain a best-estimate value of hazard, together with its uncertainty, at a site.
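
    Two ingredients named in the abstract can be illustrated with a short Python sketch: the Poisson model for the probability of exceedance over an exposure time, and the aggregation of hazard results from alternative input interpretations using logic-tree weights. The rates and weights below are invented for illustration.

    ```python
    # Sketch of (1) the Poisson exceedance model and (2) logic-tree aggregation
    # of hazard results from alternative interpretations. Numbers are invented.
    import numpy as np

    # (1) Poisson model: probability of at least one exceedance in t years
    def poisson_poe(annual_rate: float, t_years: float) -> float:
        return 1.0 - np.exp(-annual_rate * t_years)

    # (2) Three alternative interpretations give different annual exceedance
    # rates for PGA > 0.2 g at the site; weights come from the logic tree.
    rates   = np.array([2.0e-4, 5.0e-4, 1.2e-3])
    weights = np.array([0.25, 0.50, 0.25])          # must sum to 1

    best_estimate_rate = np.sum(weights * rates)
    print("best-estimate annual exceedance rate:", best_estimate_rate)
    print("P(exceedance in 50 yr):", poisson_poe(best_estimate_rate, 50.0))
    ```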

  5. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-15

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  6. A LiDAR based analysis of hydraulic hazard mapping

    Science.gov (United States)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend the area assigned to use limitations beyond what is necessary. The availability of a high-resolution topographic survey nowadays allows this task to be faced with innovative procedures, both in the planning (mapping) and in the map validation phases. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore does not intend in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically considering the vegetation as a geometric obstacle to water flow. In some cases the topographic model construction requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed
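
    A minimal sketch of the reference elevation model described above (DTM+DBM): building heights are added to the bare-earth terrain so that buildings act as obstacles, while vegetation canopy from the DSM is deliberately excluded. The arrays are invented for illustration.

    ```python
    # Building the DTM+DBM reference surface from two small, invented rasters.
    import numpy as np

    dtm = np.array([[101.2, 101.5],
                    [101.8, 102.0]])          # bare-earth elevation (m)
    building_height = np.array([[0.0, 6.5],
                                [0.0, 0.0]])  # building heights from the DBM (m)

    # Buildings are kept as geometric obstacles; vegetation is not added.
    reference_surface = dtm + building_height
    print(reference_surface)
    ```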

  7. Hazard Analysis of Japanese Boxed Lunches (Bento).

    Science.gov (United States)

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary…

  8. Landslide hazard zonation assessment using GIS analysis at Golmakan Watershed, northeast of Iran

    Institute of Scientific and Technical Information of China (English)

    Mohammad Reza MANSOURI DANESHVAR; Ali BAGHERZADEH

    2011-01-01

    Landslide hazard is one of the major environmental hazards in geomorphic studies in mountainous areas. To help planners select suitable locations for development projects, a landslide hazard zonation map has been produced for the Golmakan Watershed as part of the northern hillsides of Binaloud (northeast of Iran). For this purpose, after preparation of a landslide inventory of the study area, some 15 major parameters were examined for integrated analysis of landslide hazard in the region. The analyses of the parameters were done by geo-referencing and lateral model making, satellite imaging of the study area, and spatial analyses using a geographical information system (GIS). The produced factor maps were weighted with the analytic hierarchy process (AHP) method and then classified. The study area was classified into four classes of relative landslide hazard: negligible, low, moderate, and high. The final map produced for landslide hazard zonation in the Golmakan Watershed revealed that: 1) the parameters of land slope and geologic formation have a strong correlation (R2 = 0.79 and 0.83, respectively) with the dependent variable landslide hazard (p < 0.05); 2) about 18.8% of the study area has low or negligible hazard from future landslides, while 81.2% of the land area of the Golmakan Watershed falls into the high and moderate categories.
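
    The AHP weighting step mentioned above can be sketched as follows: factor weights are taken as the normalized principal eigenvector of a pairwise-comparison matrix, with a consistency check. The 3×3 matrix below (slope, geologic formation, land use) and its entries are invented; the study actually weighted some 15 parameters.

    ```python
    # AHP factor weights from an invented 3x3 pairwise-comparison matrix.
    import numpy as np

    # Saaty-style reciprocal matrix: entry (i, j) = importance of factor i over j
    A = np.array([
        [1.0,  2.0,  4.0],    # slope
        [0.5,  1.0,  3.0],    # geologic formation
        [0.25, 1/3,  1.0],    # land use
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, principal].real)
    weights /= weights.sum()
    print("factor weights:", np.round(weights, 3))

    # Consistency ratio check (random index RI = 0.58 for a 3x3 matrix)
    ci = (eigvals.real[principal] - len(A)) / (len(A) - 1)
    print("consistency ratio:", round(ci / 0.58, 3))   # should be < 0.1
    ```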

  9. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    Science.gov (United States)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
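
    A small worked example of how the relationship types above change the joint probability of hazards: for an independent relationship the joint annual probability is the product of the individual probabilities, while for a mutex (mutually exclusive) relationship it is zero; parallel and series relationships would additionally require conditional (triggering) probabilities. The probabilities below are invented.

    ```python
    # Joint annual probability of two hazards under two of the relationship types.
    p_flood = 0.10        # annual probability of hazard A (invented)
    p_landslide = 0.05    # annual probability of hazard B (invented)

    # Independent relationship: joint probability is the product
    p_both_independent = p_flood * p_landslide              # 0.005

    # Mutex (mutually exclusive) relationship: they cannot occur together
    p_both_mutex = 0.0

    # Probability of at least one hazard in the two cases
    p_any_independent = p_flood + p_landslide - p_both_independent   # 0.145
    p_any_mutex = p_flood + p_landslide                              # 0.150

    print(p_both_independent, p_any_independent, p_any_mutex)
    ```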

  10. Seismic hazard analysis for Jayapura city, Papua

    Energy Technology Data Exchange (ETDEWEB)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A. [Geological Agency, Diponegoro Road No.57, Bandung, 40122 (Indonesia)

    2015-04-24

    Jayapura city had destructive earthquake which occurred on June 25, 1976 with the maximum intensity VII MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Earthquake source models using three types of source models are subduction model; comes from the New Guinea Trench subduction zone (North Papuan Thrust), fault models; derived from fault Yapen, TareraAiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya, and 7 background models to accommodate unknown earthquakes. Amplification factor using geomorphological approaches are corrected by the measurement data. This data is related to rock type and depth of soft soil. Site class in Jayapura city can be grouped into classes B, C, D and E, with the amplification between 0.5 – 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  11. Integrated sequence analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.; Pyy, P

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR, and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by using comparisons of different case studies. These lessons include areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally follows a discussion about the project and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  12. The Yucca Mountain probabilistic volcanic hazard analysis project

    Energy Technology Data Exchange (ETDEWEB)

    Coppersmith, K.J.; Perman, R.C.; Youngs, R.R. [Geomatrix Consultants, Inc., San Francisco, CA (United States)] [and others

    1996-12-01

    The Probabilistic Volcanic Hazard Analysis (PVHA) project, sponsored by the U.S. Department of Energy (DOE), was conducted to assess the probability of a future volcanic event disrupting the potential repository at Yucca Mountain. The PVHA project is one of the first major expert judgment studies that DOE has authorized for technical assessments related to the Yucca Mountain project. The judgments of members of a ten-person expert panel were elicited to ensure that a wide range of approaches were considered for the hazard analysis. The results of the individual elicitations were then combined to develop an integrated assessment of the volcanic hazard that reflects the diversity of alternative scientific interpretations. This assessment, which focused on the volcanic hazard at the site, expressed as the probability of disruption of the potential repository, will provide input to an assessment of volcanic risk, which expresses the probability of radionuclide release due to volcanic disruption.

  13. 76 FR 36879 - Minnesota: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2011-06-23

    ..., September 16, 1992 (57 FR 42832) Standards Applicable to Owners and Operators of Hazardous Waste Treatment... Characteristic Wastes Whose Treatment Standards Were Vacated, Checklist 124, May 24, 1993 (58 FR 29860) Hazardous... State Hazardous Waste Programs, Checklist 153, July 1, 1996 (61 FR 34252) Hazardous Waste Treatment...

  14. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... processing plant environment, including food safety hazards that can occur before, during, and after harvest... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product...

  15. Hazard screening application guide. Safety Analysis Report Update Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.

  16. A Bayesian Seismic Hazard Analysis for the city of Naples

    Science.gov (United States)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples with its neighboring area is one of the most densely populated places in Italy. In addition, the risk is also increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). An active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults located around the study area that are well known from geological investigations and could shake the city, but are not associated with any recorded earthquake, has been taken into account in our studies. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure so that information on past earthquakes is included in the probabilistic seismic hazard analysis. This strategy allows, on the one hand, the information used in the evaluation of the hazard to be enlarged, from alternative models for the earthquake generation process to past shaking, and, on the other hand, all kinds of information and their uncertainties to be explicitly accounted for. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times (50, 10 and 5 years) on a dense grid that covers the municipality of Naples, considering bedrock soil
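
    The following is a heavily simplified sketch of Bayesian updating in the spirit of the approach described above; it is not the authors' five-model scheme. A Gamma prior on the annual rate of damaging shaking is updated with the count of historical intensity records exceeding a threshold, using the conjugate Poisson-Gamma relation. All numbers are invented.

    ```python
    # Simplified Poisson-Gamma Bayesian update of an annual shaking rate (invented numbers).
    import numpy as np

    # Prior on the annual rate lambda ~ Gamma(alpha, beta) from a hazard model
    alpha_prior, beta_prior = 2.0, 100.0        # prior mean = 0.02 events/yr

    # Observed record: n events with felt intensity >= VI in t years of catalogue
    n_events, t_years = 12, 900.0

    # Conjugate Poisson-Gamma update
    alpha_post = alpha_prior + n_events
    beta_post = beta_prior + t_years

    rate_mean = alpha_post / beta_post
    print("posterior mean annual rate:", round(rate_mean, 4))
    print("P(at least one event in 50 yr):", round(1 - np.exp(-rate_mean * 50), 3))
    ```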

  17. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, W.S.

    1994-08-23

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Upon further analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision incorporates the recommendations provided from the original hazards analysis as well. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for Safety Classification of thermal stabilization equipment.

  18. Federal Register Notice: Final Rule Listing as Hazardous Wastes Certain Dioxin Containing Wastes

    Science.gov (United States)

    EPA is amending the regulations for hazardous waste management under RCRA by listing as hazardous wastes certain wastes containing particular chlorinated dioxins, -dibenzofurans, and -phenols, and by specifying management standards for these wastes.

  19. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    Science.gov (United States)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."

  20. Landslide hazards and systems analysis: A Central European perspective

    Science.gov (United States)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments with the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures or rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits starts to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  1. Hazardous-waste analysis plan for LLNL operations

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan will address methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  2. Fire Hazards Analysis for the Inactive Equipment Storage Sprung Structure

    Energy Technology Data Exchange (ETDEWEB)

    MYOTT, C.F.

    2000-02-03

    The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection so as to ascertain whether the fire protection objectives of DOE Order 5480.1A are met. The order acknowledges a graded approach commensurate with the hazards involved.

  3. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    Energy Technology Data Exchange (ETDEWEB)

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  4. Implementation of hazard analysis critical control point in jameed production.

    Science.gov (United States)

    Al-Saed, A K; Al-Groum, R M; Al-Dabbas, M M

    2012-06-01

    The average standard plate count and the coliform, Staphylococcus aureus and Salmonella counts for three home-made jameed samples, a traditional fermented dairy product, before applying the hazard analysis critical control point system were 2.1 × 10(3), 8.9 × 10(1), 4 × 10(1) and less than 10 cfu/g, respectively. The developed hazard analysis critical control point plan resulted in identifying ten critical control points in the flow chart of jameed production. The critical control points included fresh milk receiving, pasteurization, addition of starter, water and salt, straining, personnel hygiene, drying and packaging. After applying the hazard analysis critical control point system, there was significant improvement in the microbiological quality of the home-made jameed. The standard plate count was reduced to 3.1 × 10(2) cfu/g, whereas coliform and Staphylococcus aureus counts were less than 10 cfu/g and Salmonella was not detected. Sensory evaluation results for the color and flavor of sauce prepared from jameed showed a significant increase in the average scores given after hazard analysis critical control point application.

  5. Hazard analysis of Clostridium perfringens in the Skylab Food System

    Science.gov (United States)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
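
    The kind of growth check that underlies a limit such as 100/g can be sketched with the standard exponential-growth relation N(t) = N0 · 2^(t/g), where g is the generation time measured at the holding temperature. The numbers below are illustrative and are not the Skylab data.

    ```python
    # Exponential-growth check against an assumed 100/g limit (illustrative numbers).
    def count_after_hold(n0: float, hold_minutes: float, generation_minutes: float) -> float:
        return n0 * 2 ** (hold_minutes / generation_minutes)

    n0 = 10.0     # initial C. perfringens per gram (assumed)
    g = 20.0      # generation time (min) at the holding temperature (assumed)
    hold = 120.0  # 2-hour warm hold

    n_final = count_after_hold(n0, hold, g)
    verdict = "exceeds" if n_final > 100 else "within"
    print(f"count after hold: {n_final:.0f}/g -> {verdict} the 100/g limit")
    ```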

  6. PO*WW*ER mobile treatment unit process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous constituents into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  7. Analysis of SEAFP containment strategies regarding hydrogen hazard

    Energy Technology Data Exchange (ETDEWEB)

    Maunier, F.; Arnould, F. [Technicatome, Dir. de l' Ingenierie, SEPS, 13 - Aix-en-Provence (France); Marbach, G. [CEA/Cadarache, Dept. d' Etudes des Reacteurs (DER), 13 - Saint-Paul-lez-Durance (France)

    1998-07-01

    Since SEAFP is a safety-directed study, safety considerations dominate the confinement concept for the different options defined. The containment strategy is the principal safety function and includes all the measures required to ensure that uncontrolled release of radioactive and chemical materials will not occur. The study presented here corresponds to the safety analysis of the three containment strategies for SEAFP model 2 (Water Cooled) regarding the hydrogen hazard. The objective is to compare the different containment strategies and to define, for each containment strategy, the necessary Safety Systems in order to reduce the frequency of the H2 Hazard to a very low value (

  8. Frequency Analysis of Aircraft hazards for License Application

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.
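
    A hedged sketch of the generic crash-frequency formula often used in analyses of this kind (for example, the four-factor form of DOE-STD-3014): expected crashes per year are summed over aircraft categories as the product of annual flights, the crash rate per flight per unit area at the site, and the facility's effective target area. The categories and numbers below are invented and are not the values of the cited analysis.

    ```python
    # Generic crash-frequency estimate: F = sum over categories of N * P(x, y) * A
    #   N = flights per year, P = crash rate per flight per square mile at the site,
    #   A = effective target area (square miles). All inputs are invented.

    flights_per_year = {"military overflight": 5000, "general aviation": 1200}
    crash_rate_per_sq_mile = {"military overflight": 4e-8, "general aviation": 1e-7}
    effective_area_sq_miles = 0.02   # footprint plus skid/fly-in area (assumed)

    F = sum(flights_per_year[k] * crash_rate_per_sq_mile[k] * effective_area_sq_miles
            for k in flights_per_year)
    print(f"estimated crash frequency: {F:.2e} per year")
    ```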

  9. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    Science.gov (United States)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of Swiss nuclear power plants - the PEGASOS project (2000-2004) - challenges to seismic hazard analysis of critical infrastructures from the perspective of a professional safety analyst are discussed. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessment) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies are, to a large extent, driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project showed an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * the ambiguous solution of PSHA logic trees; * the inadequate mathematical treatment of the results of expert elicitations, based on the assumption of bias-free expert estimates; * the problems associated with the "think model" of the separation of epistemic and aleatory uncertainties; * the consequences of the ergodic assumption used to justify the transfer of attenuation equations of other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants, as required by U.S. NRC RG 1.165, will be evaluated. As a principal alternative for the development of a

  10. D0 Detector Collision Hall Oxygen Deficiancy Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wu, J.; /Fermilab

    1992-08-06

    EN-258, D0 Platform ODH Analysis, provided the oxygen deficiency hazard analysis for the D0 detector in the Assembly Hall. This note covers the same analysis, but revised for the Collision Hall. Liquid cryogens, released and warming to atmospheric conditions, expand to, on average, seven hundred times their liquid volume, and displace vital atmospheric oxygen. An oxygen deficiency hazard analysis assesses the increased risk to personnel in areas containing cryogenic systems. The D0 detector Collision Hall ODH analysis has been approached five different ways using established methods. If the low beta quad magnets are powered, and the exhaust rate is below 4220 scfm, the area is ODH class 1. In any other case, the analysis shows the area to be ODH class 0 as equipped (with ventilation fans) and requiring no special safety provisions. System designers have provided for a reduced-oxygen-level detection and warning system as well as emergency procedures to address fault conditions.
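
    A common steady-state estimate used in ODH analyses of ventilated enclosures, under a perfect-mixing assumption, is C_O2 = 0.21·Q/(Q + R), where Q is the ventilation rate and R is the inert-gas release rate. The sketch below applies it with invented numbers (the 4220 scfm exhaust rate cited above is reused only as an example); it is not the five-method analysis performed in the note.

    ```python
    # Steady-state oxygen fraction in a ventilated hall under perfect mixing.
    def steady_state_o2(ventilation_cfm: float, release_cfm: float) -> float:
        return 0.21 * ventilation_cfm / (ventilation_cfm + release_cfm)

    # Example: cryogen boil-off (liquid expanding ~700x) released at 1000 scfm
    # into a hall exhausted at 4220 scfm. Both numbers are illustrative.
    c = steady_state_o2(4220.0, 1000.0)
    print(f"steady-state O2 fraction: {c:.3f} ({c*100:.1f}%)")
    ```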

  11. Approaches and practices related to hazardous waste management, processing and final disposal in germany and Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Passos, J.A.L.; Pereira, F.A.; Tomich, S. [CETREL S.A., Camacari, BA (Brazil)

    1993-12-31

    A general overview of the existing management and processing of hazardous wastes technologies in Germany and Brazil is presented in this work. Emphasis has been given to the new technologies and practices adopted in both countries, including a comparison of the legislation, standards and natural trends. Two case studies of large industrial hazardous waste sites are described. 9 refs., 2 figs., 9 tabs.

  12. 75 FR 65432 - New Mexico: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2010-10-25

    ... relevant) authority 1. Zinc Fertilizer Rule. 67 FR 48393-48415, New Mexico Statute (Checklist 200). July 24... Contaminated October 7, 2002. Annotated (NMSA) Batteries. (Checklist 201). 1978, Section 74- 4-1. Hazardous... November 3, 2008, effective March 1, 2009. 3. Hazardous Air Pollutant 67 FR 77687-77692, New Mexico...

  13. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    Science.gov (United States)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the subsequent damaging earthquakes in Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task, it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analysis at critical installations within Europe. In this paper, various salient European applications are given.
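
    A minimal sketch of the kernel-smoothing idea described above: past epicentres are smoothed with a Gaussian kernel to give a spatially continuous activity-rate density, in place of uniform polygonal source zones. The coordinates, bandwidth, and catalogue duration below are invented; in practice the bandwidth is often magnitude-dependent.

    ```python
    # Gaussian-kernel smoothing of an invented epicentre catalogue.
    import numpy as np

    epicentres = np.array([[13.2, 42.7], [13.3, 42.6], [13.1, 42.8], [13.5, 42.5]])  # lon, lat
    catalogue_years = 100.0
    bandwidth_deg = 0.2     # kernel bandwidth (assumed fixed here)

    def activity_density(lon: float, lat: float) -> float:
        """Events per year per square degree at (lon, lat)."""
        d2 = ((epicentres - np.array([lon, lat])) ** 2).sum(axis=1)
        kernel = np.exp(-d2 / (2 * bandwidth_deg ** 2)) / (2 * np.pi * bandwidth_deg ** 2)
        return kernel.sum() / catalogue_years

    print(activity_density(13.25, 42.65))
    ```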

  14. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    Science.gov (United States)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) the LandScanTM global population distribution; and (2) the frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. This analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).
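
    An illustrative sketch (not the authors' scoring) of how the layers described above could be combined for a single grid cell: hazard frequency, the resilience of the drinking-water technology in use, and national adaptive capacity. The combination rule and all values are assumptions.

    ```python
    # Illustrative per-cell vulnerability score (assumed combination rule and values).
    def vulnerability(hazard_freq: float, tech_resilience: float, adaptive_capacity: float) -> float:
        """Higher = more vulnerable. All inputs scaled to 0..1."""
        exposure = hazard_freq * (1.0 - tech_resilience)
        return exposure * (1.0 - adaptive_capacity)

    # Example cell: frequent flooding, unprotected dug wells, low readiness score
    print(round(vulnerability(hazard_freq=0.8, tech_resilience=0.2, adaptive_capacity=0.3), 3))
    ```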

  15. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    Science.gov (United States)

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  16. Long term volcanic hazard analysis in the Canary Islands

    Science.gov (United States)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the huge number of citizens and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on the development of hazard maps for the islands of Lanzarote and Tenerife, especially for land use planning. The main handicap for these studies in the Canary Islands is the lack of well-reported historical eruptions, as well as the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are really useful, they need to be fed with a huge amount of data that sometimes, as in the case of the Canary Islands, is not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS, helping to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000 scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated in the database. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information to assess long-term volcanic hazard analysis. HADA will permit

  17. 76 FR 62303 - California: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2011-10-07

    ... and Mineral Processing Wastes; (7) Hazardous Soils Treatment Standards and Exclusions; (8... Compliance Date for Characteristic Slags; (11) Treatment Standards for Spent Potliners from Primary Aluminum... for PCBs in Soil; and (14) Certain Land Disposal Restrictions Technical Corrections and...

  18. Rankine bottoming cycle safety analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lewandowski, G.A.

    1980-02-01

    Vector Engineering Inc. conducted a safety and hazards analysis of three Rankine Bottoming Cycle Systems in public utility applications: a Thermo Electron system using Fluorinol-85 (a mixture of 85 mole % trifluoroethanol and 15 mole % water) as the working fluid; a Sundstrand system using toluene as the working fluid; and a Mechanical Technology system using steam and Freon-11 as the working fluids. The properties of the working fluids considered are flammability, toxicity, and degradation, and the risks to both plant workers and the community at large are analyzed.

  19. Landslide Hazard Zonation Mapping and Comparative Analysis of Hazard Zonation Maps

    Institute of Scientific and Technical Information of China (English)

    S. Sarkar; R. Anbalagan

    2008-01-01

    Landslide hazard zonation mapping at the regional level of a large area provides a broad trend of landslide potential zones. A macro-level landslide hazard zonation for a small area may provide better insight into the landslide hazards. The main objective of the present work was to carry out macro landslide hazard zonation mapping at 1:50,000 scale in an area where regional-level zonation mapping was conducted earlier. In the previous work the regional landslide hazard zonation maps of the Srinagar-Rudraprayag area of Garhwal Himalaya in the state of Uttarakhand were prepared using subjective and objective approaches. In the present work the landslide hazard zonation mapping at macro level was carried out in a small area using a Landslide Hazard Evaluation Factor rating scheme. The hazard zonation map produced by this technique classifies the area into relative hazard classes in which the high hazard zones correspond well with a high frequency of landslides. When compared with the regional zonation maps prepared earlier, the results show that the present technique identifies the hazard zones in more detail than the broad zones shown in the earlier maps.

  20. Flood Hazard and Risk Analysis in Urban Area

    Science.gov (United States)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about the quality of life, but also about the safety of life and property. The impact of disasters on life and property is therefore the residents' most serious concern. To mitigate disaster impacts, flood hazard and risk analysis plays an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung City was evaluated using statistics on social development factors. The hazard factors of Kaohsiung City were calculated from simulated flood depths for six different return periods and four typhoon events that caused serious flooding in Kaohsiung City. The flood risk was then obtained by combining the flood hazard and social vulnerability. The analysis results provide the authorities with a basis to strengthen disaster preparedness and to allocate more resources to high-risk areas.

  1. A Hazard Analysis for a Generic Insulin Infusion Pump

    Science.gov (United States)

    Zhang, Yi; Jones, Paul L.; Jetley, Raoul

    2010-01-01

    Background Researchers at the Food and Drug Administration (FDA)/Center for Device and Radiological Health/Office of Science and Engineering Laboratories have been exploring the concept of model-based engineering as a means for improving the quality of medical device software. Insulin pumps were chosen as a research subject because their design provides the desired degree of research complexity and these types of devices present an ongoing regulatory challenge. Methods Insulin pump hazards and their contributing factors are considered in the context of a highly abstract generic insulin infusion pump (GIIP) model. Hazards were identified by consulting with manufacturers, pump users, and clinicians; by reviewing national and international standards and adverse event reports collected by the FDA; and from workshops sponsored by Diabetes Technology Society. This information has been consolidated in tabular form to facilitate further community analysis and discussion. Results A generic insulin infusion pump model architecture has been established. A fairly comprehensive hazard analysis document, corresponding to the GIIP model, is presented in this article. Conclusions We believe that this work represents the genesis of an insulin pump safety reference standard upon which future insulin pump designs can be based to help ensure a basic level of safety. More interaction with the diabetes community is needed to assure the quality of this safety modeling process. PMID:20307387

  2. Lithium-thionyl chloride cell system safety hazard analysis

    Science.gov (United States)

    Dampier, F. W.

    1985-03-01

    This system safety analysis for the lithium thionyl chloride cell is a critical review of the technical literature pertaining to cell safety and draws conclusions and makes recommendations based on this data. The thermodynamics and kinetics of the electrochemical reactions occurring during discharge are discussed with particular attention given to unstable SOCl2 reduction intermediates. Potentially hazardous reactions between the various cell components and discharge products or impurities that could occur during electrical or thermal abuse are described and the most hazardous conditions and reactions identified. Design factors influencing the safety of Li/SOCl2 cells, shipping and disposal methods and the toxicity of Li/SOCl2 battery components are additional safety issues that are also addressed.

  3. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Sevougian, S. David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report is concentrated more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  4. Solvent substitution: an analysis of comprehensive hazard screening indices.

    Science.gov (United States)

    Debia, M; Bégin, D; Gérin, M

    2011-06-01

    The air index (ψ(i)(air)) of the PARIS II software (Environmental Protection Agency), the Indiana Relative Chemical Hazard Score (IRCHS), and the Final Hazard Score (FHS) used in the P2OASys system (Toxics Use Reduction Institute) are comprehensive hazard screening indices that can be used in solvent substitution. The objective of this study was to evaluate these indices using a list of 67 commonly used or recommended solvents. The indices ψ(i)(air), IRCHS and FHS were calculated considering 9, 13, and 33 parameters, respectively, that summarized health and safety hazards, and environmental impacts. Correlation and sensitivity analyses were performed. The vapor hazard ratio (VHR) was used as a reference point. Two good correlations were found: (1) between VHR and ψ(i)(air) (ρ = 0.84), and (2) between IRCHS and FHS (ρ = 0.81). Values of sensitivity ratios above 0.2 were found with ψ(i)(air) (4 of 9 parameters) and IRCHS (3 of 13 parameters), but not with FHS. Overall, the three indices exhibited important differences in the way they integrate key substitution factors, such as volatility, occupational exposure limit, skin exposure, flammability, carcinogenicity, photochemical oxidation potential, atmospheric global effects, and environmental terrestrial and aquatic effects. These differences can result in different choices of alternatives between indices, including the VHR. IRCHS and FHS are the most comprehensive indices but are very tedious and complex to use and lack sensitivity to several solvent-specific parameters. The index ψ(i)(air) is simpler to calculate but does not cover some parameters important to solvents. There is presently no suitably comprehensive tool available for the substitution of solvents. A two-tier approach for the selection of solvents is recommended to avoid errors that could be made using only a global index or the consideration of the simple VHR. As a first tier, one would eliminate solvent candidates having crucial impacts. As a
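
    For orientation, the vapor hazard ratio used as the reference point above is commonly computed as the saturated vapor concentration divided by the occupational exposure limit. The Python sketch below illustrates that calculation; the solvent properties and helper name are invented for illustration only.

      # Minimal sketch of a vapor hazard ratio (VHR) calculation: saturated
      # vapor concentration (ppm) divided by the occupational exposure limit.
      # The solvent properties used here are invented for illustration.
      def vapor_hazard_ratio(vapor_pressure_kpa, oel_ppm, atm_pressure_kpa=101.325):
          saturated_conc_ppm = vapor_pressure_kpa / atm_pressure_kpa * 1e6
          return saturated_conc_ppm / oel_ppm

      # Hypothetical solvent: 2.9 kPa vapor pressure, 20 ppm exposure limit
      print(round(vapor_hazard_ratio(2.9, 20)))   # roughly 1431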

  5. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan....

  6. Seismic Hazard analysis of Adjaria Region in Georgia

    Science.gov (United States)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects— where the consequences of failure are more serious, such as dams and chemical plants—it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general, using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes distant sources from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
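
    To make the classical recipe sketched above concrete, the following Python fragment assembles a toy hazard curve for a single site from a truncated exponential Gutenberg-Richter recurrence model, a placeholder ground-motion relation and the Poisson assumption. All coefficients and source parameters are invented for illustration and are not those used in the Adjaria study.

      # Schematic single-site PSHA: truncated Gutenberg-Richter recurrence,
      # a toy ground-motion model, and the Poisson assumption.
      import numpy as np
      from scipy.stats import norm

      a_val, b_val, m_min, m_max = 4.0, 1.0, 4.5, 7.0
      beta = b_val * np.log(10.0)
      rate_mmin = 10 ** (a_val - b_val * m_min)        # annual rate of M >= m_min

      mags = np.linspace(m_min, m_max, 26)             # magnitude bin centres
      dm = mags[1] - mags[0]
      pdf_m = beta * np.exp(-beta * (mags - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))
      pmf_m = pdf_m * dm / np.sum(pdf_m * dm)          # discrete magnitude weights

      # Toy attenuation: ln PGA[g] = c1 + c2*M - c3*ln(R), with sigma = 0.6
      r_km, c1, c2, c3, sigma = 30.0, -4.0, 1.0, 1.0, 0.6
      ln_pga_med = c1 + c2 * mags - c3 * np.log(r_km)

      pga_levels = np.logspace(-2, 0, 30)              # 0.01 g to 1 g
      rates = np.array([rate_mmin * np.sum(norm.sf((np.log(a) - ln_pga_med) / sigma) * pmf_m)
                        for a in pga_levels])
      poe_50yr = 1 - np.exp(-rates * 50.0)             # Poisson: POE in 50 years
      print(poe_50yr[:5].round(3))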

  7. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    Science.gov (United States)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable models for Thailand. For the PSHA mapping, both the ground shaking and probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, a 2 and 10 % POE in the next 50 years of a 0.1-0.4 g and 0.1-0.2 g ground shaking, respectively, was found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected specific provinces within Thailand, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore, effective mitigation plans for these areas should be made. Although Bangkok was defined as being within a low seismic hazard in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
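
    Under the usual Poisson assumption, the 10 % and 2 % probabilities of exceedance in 50 years quoted above correspond to return periods of roughly 475 and 2475 years. A minimal check of that arithmetic:

      # Poisson relation between probability of exceedance and return period:
      # POE = 1 - exp(-t / T), so T = -t / ln(1 - POE).
      import math

      def return_period(poe, t_years=50.0):
          return -t_years / math.log(1.0 - poe)

      for poe in (0.10, 0.02):
          print(f"{poe:.0%} in 50 yr -> T = {return_period(poe):.0f} years")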

  8. Comparative analysis of hazardous household waste in two Mexican regions.

    Science.gov (United States)

    Delgado, Otoniel Buenrostro; Ojeda-Benítez, Sara; Márquez-Benavides, Liliana

    2007-01-01

    Household hazardous waste (HHW) generation in two Mexican regions was examined, a northern region (bordering with the USA) and a central region. The aim of this work was to determine the dynamics of solid waste generation and to be able to compare the results of both regions, regarding consumption patterns and solid waste generation rates. In the northern region, household solid waste was analysed quantitatively. In order to perform this analysis, the population was categorized into three socioeconomic strata (lower, middle, upper). Waste characterization revealed the presence of products that give origin to household hazardous waste. In the northern region (Mexicali city), household hazardous waste comprised 3.7% of municipal solid waste, the largest categories in this fraction were home care products (29.2%), cleaning products (19.5%) and batteries and electronic equipment (15.7%). In the central region, HHW comprised 1.03% of municipal solid waste; the main categories in this fraction were represented by cleaning products (39%), self care products (27.3%), and insecticides (14.4%). In Mexicali, the socioeconomic study demonstrated that the production of HHW is independent of the income level. Furthermore, the composition of the solid waste stream in both regions suggested the influence of another set of variables such as local climate, migration patterns and marketing coverage. Further research is needed in order to establish the effect of low quantities of HHW upon the environment and public health.

  9. CLEOPATRA holds strong in final analysis.

    Science.gov (United States)

    2014-12-01

    According to the final analysis of CLEOPATRA, first-line treatment with pertuzumab plus trastuzumab and docetaxel significantly improves overall survival for patients with HER2-positive metastatic breast cancer. As such, dual HER2 blockade plus chemotherapy should be the standard of care in this setting, researchers say.

  10. 78 FR 15299 - New York: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2013-03-11

    ..., additional testing, reporting and emergency procedures, and closure requirements: 374-2.5(a)(5) introductory.... New York regulates used oil containing greater than 50 ppm of PCB wastes as hazardous waste, unless... wastes under the Federal RCRA program. PCB wastes are regulated under the Federal Toxic...

  11. 78 FR 70225 - West Virginia: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2013-11-25

    ... significant economic impact on a substantial number of small entities under the Regulatory Flexibility Act (5... Waste Management System'' (33 CSR 20), effective June 16, 2011; and Title 45, Series 25 ``Control of Air Pollution from Hazardous Waste Treatment, Storage and Disposal Facilities'' (45 CSR 25), effective June 16...

  12. 78 FR 54178 - Virginia: Final Authorization of State Hazardous Waste Management Program Revisions

    Science.gov (United States)

    2013-09-03

    ... limitations of the Hazardous and Solid Waste Amendments of 1984 (HSWA). New Federal requirements and... Processed in a Gasification System to Produce Synthetic Gas, Revision Checklist 216. National Emission... FR 77954, 9 VAC Sec. Sec. 20- Fuel Exclusion, Revision December 19, 60-18, 20-60-261. Checklist...

  13. A compendium on mobile robots used in hazardous environments: Final report

    Energy Technology Data Exchange (ETDEWEB)

    Meieran, H.B.

    1987-05-01

    This report presents a compendium of information regarding specific mobile robotic/teleoperated vehicles which are employed in a spectrum of hazardous environments, including the nuclear industry. These devices can be used to inspect, locate, identify, manipulate, and/or maintain components and items in the hazardous environment as well as act as mobile surveillance/sensing/reconnaissance platforms while monitoring the various constituents in the hazardous atmospheres. Furthermore, they can be used as transporters of tools, equipment, and material to and from the hazardous environment. This initial compendium is restricted to those devices which are able to maneuver on the surface of floors and other terrain features in both out-of-water and shallow water situations; subsequent compendiums will present information regarding underwater devices which can be used in the nuclear/fossil fueled/hydraulic based electric power generating and other utility industries and information updates for the initial volume. This compendium excludes those devices which are being considered for applications in outer space. 17 refs.

  14. Fire hazards analysis for W030 tank farm ventilation upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Huckfeldt, R.A.

    1996-07-17

    This Fire Hazard Analysis (FHA) was prepared according to the requirements of U.S. Department of Energy (DOE) Order 5480.7A, FIRE PROTECTION, 2-17-93. The purpose of this FHA is to ascertain whether the objectives of DOE 5480.7A are being met. This purpose is accomplished through a conservative comprehensive assessment of the risk from fire and other perils within individual fire areas of a DOE facility in relation to proposed fire protection. This FHA is based on conditions set forth within this document and is valid only under these conditions.

  15. Two-dimensional hazard estimation for longevity analysis

    DEFF Research Database (Denmark)

    Fledelius, Peter; Guillen, M.; Nielsen, J.P.

    2004-01-01

    We investigate developments in Danish mortality based on data from 1974-1998 working in a two-dimensional model with chronological time and age as the two dimensions. The analyses are done with non-parametric kernel hazard estimation techniques. The only assumption is that the mortality surface ... the two-dimensional mortality surface. Furthermore we look at aggregated synthetic population metrics as 'population life expectancy' and 'population survival probability'. For Danish women these metrics indicate decreasing mortality with respect to chronological time. The metrics can not directly be used for analysis of economic implications arising from mortality changes.

  16. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    Full Text Available The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  17. Implementation of the Hazard Analysis Critical Control Point (HACCP) system to UF white cheese production line

    Directory of Open Access Journals (Sweden)

    Mahmoud El-Hofi

    2010-09-01

    Full Text Available Background. HACCP, or the Hazard Analysis and Critical Control Point System, has been recognised as an effective and rational means of assuring food safety from primary production through to final consumption, using a “farm to table” methodology. The application of this prevention-oriented approach gives the food producer better control over operations, better manufacturing practices and greater efficiencies, including reduced waste. Material and methods. The steps taken to put HACCP in place are described and the process was monitored to assess its impact. Assessment of the hygiene quality of the UF white cheese product line before and after HACCP showed an improvement in quality and an overall improvement in conditions at the company. Results. HACCP was introduced for the UF White Cheese line at Misr Milk and Food, Mansoura, Egypt, to ensure safe, good-quality food products. All necessary quality control procedures were verified for completeness and to determine whether they were being implemented to the required standards. A hazard analysis was conducted to identify hazards that may occur in the product cycle, and Critical Control Points (CCPs) were determined to control the identified hazards. CCP signs were then posted on the factory floor. Critical limits were established at each CCP, and corrective actions to be taken when monitoring indicates deviation or loss of control were established. Verification procedures were established to confirm that the HACCP system is working effectively. Documentation concerning all procedures and records was established, and HACCP was integrated with ISO 9000 under one management system. Conclusions. The HACCP system in this study for UF White Cheese line manufacture was developed step by step based on the twelve steps mentioned in the literature review. The prerequisite programme was provided to deal with some hazards before production, to simplify the HACCP plan.

  18. Fire hazard analysis for Plutonium Finishing Plant complex

    Energy Technology Data Exchange (ETDEWEB)

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7, [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, "Fire Protection" [DOE Order 5480.7A, 2/17/93] and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  19. Regional Hazard Analysis For Use In Vulnerability And Risk Assessment

    Directory of Open Access Journals (Sweden)

    Maris Fotios

    2015-09-01

    Full Text Available A method for supporting an operational regional risk and vulnerability analysis for hydrological hazards is suggested and applied in the Island of Cyprus. The method aggregates the output of a hydrological flow model forced by observed temperatures and precipitations, with observed discharge data. A scheme supported by observed discharge is applied for model calibration. A comparison of different calibration schemes indicated that the same model parameters can be used for the entire country. In addition, it was demonstrated that, for operational purposes, it is sufficient to rely on a few stations. Model parameters were adjusted to account for land use and thus for vulnerability of elements at risk by comparing observed and simulated flow patterns, using all components of the hydrological model. The results can be used for regional risk and vulnerability analysis in order to increase the resilience of the affected population.

  20. Analyzing Distributed Functions in an Integrated Hazard Analysis

    Science.gov (United States)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  1. Uncertainty treatment and sensitivity analysis of the European Probabilistic Seismic Hazard Assessment

    Science.gov (United States)

    Woessner, J.; Danciu, L.; Giardini, D.

    2013-12-01

    Probabilistic seismic hazard assessment (PSHA) aims to characterize the best available knowledge on seismic hazard of a study area, ideally taking into account all sources of uncertainty. The EC-FP7 funded project Seismic Hazard Harmonization for Europe (SHARE) generated a time-independent community-based hazard model for the European region for ground motion parameters spanning from spectral ordinates of PGA to 10s and annual exceedance probabilities from one-in-ten to one-in-ten thousand years. The results will serve as a reference to define engineering applications within EuroCode 8 and provide homogeneous input for state-of-the-art seismic safety assessment of critical infrastructure. The SHARE model accounts for uncertainties, whether aleatory or epistemic, via a logic tree. Epistemic uncertainties within the seismic source-model are represented by three source models including a traditional area source model, a model that characterizes fault sources, and an approach that uses kernel-smoothing for seismicity and fault source moment release. Activity rates and maximum magnitudes in the source models are treated as aleatory uncertainties. For practical implementation and computational purposes, some of the epistemic uncertainties in the source model (i.e. dip and strike angles) are treated as aleatory, and a mean seismicity model is considered. Epistemic uncertainties for ground motions are considered by multiple Ground Motion Prediction Equations as a function of tectonic settings and treated as being correlated. The final results contain the full distribution of ground motion variability. We show how we used the logic-tree approach to consider the alternative models and how, based on the degree-of-belief in the models, we defined the weights of the individual branches. This contribution features results and sensitivity analysis of the entire European hazard model and selected sites.
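
    As a schematic illustration of the logic-tree mechanics described above, the fragment below combines the hazard curves of three hypothetical branches using degree-of-belief weights that sum to one; the curves and weights are invented and bear no relation to the actual SHARE branch set.

      # Weighted combination of logic-tree branch hazard curves.
      # Branch curves and weights are invented for illustration.
      import numpy as np

      pga = np.array([0.05, 0.1, 0.2, 0.4])        # ground-motion levels (g)
      branch_curves = np.array([                    # annual exceedance rates
          [2.0e-2, 6.0e-3, 1.0e-3, 1.0e-4],         # branch 1: area sources
          [3.0e-2, 8.0e-3, 2.0e-3, 3.0e-4],         # branch 2: fault sources
          [2.5e-2, 7.0e-3, 1.5e-3, 2.0e-4],         # branch 3: smoothed seismicity
      ])
      weights = np.array([0.5, 0.3, 0.2])           # degree-of-belief, sums to 1

      mean_curve = weights @ branch_curves          # weighted mean hazard curve
      print(dict(zip(pga, mean_curve.round(5))))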

  2. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Science.gov (United States)

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbial hazards from the pathogens previously found in Nham, sodium nitrite and metal were identified as the chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points. These steps are the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.
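
    The critical limits quoted above (nitrite at 100-200 ppm during weighing, pH below 4.6 after fermentation) lend themselves to a simple monitoring check. The toy Python function below illustrates such a check; the function name and record layout are illustrative and not part of the published HACCP plan.

      # Toy monitoring check for two of the critical limits stated above.
      def check_ccp(nitrite_ppm, ph_after_fermentation):
          problems = []
          if not (100 <= nitrite_ppm <= 200):
              problems.append(f"nitrite out of range: {nitrite_ppm} ppm")
          if ph_after_fermentation >= 4.6:
              problems.append(f"pH too high after fermentation: {ph_after_fermentation}")
          return problems or ["all monitored CCPs within critical limits"]

      print(check_ccp(nitrite_ppm=150, ph_after_fermentation=4.4))
      print(check_ccp(nitrite_ppm=250, ph_after_fermentation=4.8))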

  3. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, in order to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises the data obtained from an analysis of historical data (aerial photo imagery, field surveys, documentation of past events) or from environmental modeling (estimations of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area to overview previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a tool for detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a

  4. Subtask 1.11 -- Spectroscopic field screening of hazardous waste and toxic spills. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Grisanti, A.A.

    1997-10-01

    Techniques for the field characterization of soil contamination due to spillage of hazardous waste or toxic chemicals are time-consuming and expensive. Thus more economical, less time-intensive methods are needed to facilitate rapid field screening of contaminated sites. The overall objective of this project is to study the feasibility of using an evanescent field absorbance Fourier transform infrared (FT-IR) spectroscopic sensor coupled with cone penetrometry as a field screening method. The specific objectives of this project are as follows: design an accessory for use with FT-IR that interfaces the spectrometer to a cone penetrometer; characterize the response of the FT-IR accessory to selected hydrocarbons in a laboratory-simulated field environment; and determine the ability of the FT-IR-CPT instrument to measure hydrocarbon contamination in soil by direct comparison with a reference method (e.g., Soxhlet extraction followed by gas chromatography) to quantify hydrocarbons from the same soil.

  5. Portable sensor for hazardous waste. Final report, March 31, 1995--May 31, 1997

    Energy Technology Data Exchange (ETDEWEB)

    Piper, L.G.; Hunter, A.J.R.; Fraser, M.E.; Davis, S.H.; Finson, M.L.

    1997-12-31

    This report summarizes accomplishments for the second phase of a 5-year program designed to develop a portable monitor for sensitive hazardous waste detection. The approach is to excite atomic fluorescence by the technique of Spark-Induced Breakdown Spectroscopy (SIBS). The principal goals for this second phase of the program were to demonstrate sensitive detection of additional species, both RCRA metals (Sb, Be, Cd, Cr, Pb, As, Hg) and radionuclides (U, Th, Tc); to identify potential applications and develop instrument component processes, including sample collection and excitation, measurement and test procedures, and calibration procedures; and to design a prototype instrument. Successful completion of these tasks will make it possible to fabricate and field test a prototype of the instrument during the program's third phase.

  6. Probabilistic Rockfall Hazard Analysis in the area affected by the Christchurch Earthquakes, New Zealand

    Science.gov (United States)

    Frattini, P.; Lari, S.; Agliardi, F.; Crosta, G. B.; Salzmann, H.

    2012-04-01

    To limit damages to human lives and property in case of natural disasters, land planning and zonation, as well as the design of countermeasures, are fundamental tools, requiring however a rigorous quantitative risk analysis. As a consequence of the 3rd September 2010 (Mw 7.1) Darfield Earthquake, and the 22nd February 2011 (Mw 6.2), the 16th April 2011 (Mw 5.3) and the 13th June 2011 (Mw 6.2) aftershock events, about 6000 rockfalls were triggered in the Port Hills of Christchurch, New Zealand. Five people were killed by falling rocks in the area, and several hundred homes were damaged or evacuated. In this work, we present a probabilistic rockfall hazard analysis for a small area located on the south-eastern slope of Richmond Hill (0.6 km2, Sumner, Christchurch, NZ). For the analysis, we adopted a new methodology (Probabilistic Rockfall Hazard Analysis, PRHA), which allows quantification of the exceedance probability for a given slope location of being affected by a rockfall event with a specific level of kinetic energy, integrating the contribution of different rockfall magnitude (volume) scenarios. The methodology requires the calculation of onset annual frequency, rockfall runout, and spatially-varying kinetic energy. Onset annual frequencies for different magnitude scenarios were derived from a frequency-magnitude relationship adapted from the literature. The probability distribution of kinetic energy for a given slope location and volume scenario was obtained by rockfall runout modeling of non-interacting blocks through the 3D Hy-Stone simulation code. The reference simulation was calibrated by back-analysis of rockfall events that occurred during the earthquake. For each rockfall magnitude scenario, 20 rockfall trajectories have been simulated for each source cell using stochastically variable values of restitution parameters. Finally, probabilistic analysis integrating over six rockfall magnitude scenarios (ranging from 0.001 m3 to 1000 m3) was carried out to produce
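
    The aggregation step described above, in which the exceedance rate at a slope cell sums the contributions of the volume scenarios, can be sketched in a few lines. In the fragment below the onset frequencies and reach/energy-exceedance probabilities are invented placeholders, not outputs of the Hy-Stone modelling.

      # Schematic PRHA aggregation for one slope cell: annual rate of being
      # reached by a block with kinetic energy above a threshold, summed over
      # volume scenarios. All numbers are illustrative.
      import numpy as np

      onset_freq = {0.001: 2.0, 0.1: 0.3, 10.0: 0.02, 1000.0: 0.001}     # events / yr
      p_reach_exceed = {0.001: 0.00, 0.1: 0.05, 10.0: 0.30, 1000.0: 0.60}

      rate = sum(onset_freq[v] * p_reach_exceed[v] for v in onset_freq)
      poe_50yr = 1 - np.exp(-rate * 50.0)
      print(f"annual exceedance rate = {rate:.4f}, 50-yr POE = {poe_50yr:.2f}")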

  7. Application of a Data Mining Model and It's Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis. These parameters are topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. Then the landslide hazard indices were calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using the artificial neural network model derived not only from the data for that area but also using the parameter weights calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.
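
    As a rough illustration of how trained back-propagation weights turn the ten input factors into a hazard index for one raster cell, the fragment below runs a tiny forward pass; the layer sizes, random weights and normalisation are placeholders, not the trained values from the study.

      # Toy forward pass of a trained network: ten normalised input factors
      # (slope, aspect, curvature, ...) -> landslide hazard index in (0, 1).
      # Weights here are random placeholders, not the trained values.
      import numpy as np

      rng = np.random.default_rng(0)
      W1, b1 = rng.normal(size=(10, 6)), np.zeros(6)   # input -> hidden
      W2, b2 = rng.normal(size=(6, 1)), np.zeros(1)    # hidden -> output

      def hazard_index(factors):
          hidden = np.tanh(factors @ W1 + b1)
          out = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))   # sigmoid output
          return float(out[0])

      cell = rng.uniform(0.0, 1.0, size=10)
      print(f"hazard index for sample cell: {hazard_index(cell):.3f}")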

  8. 76 FR 56708 - Ohio: Final Authorization of State Hazardous Waste Management Program Revision

    Science.gov (United States)

    2011-09-14

    ... February 16, 2009. From Production of Dyes, Pigments, and Food, Drug and Cosmetic Colorants; Mass Loadings...; 3745-51-30; 3745- From Production of Dyes, 51-32; 3745-270-20; 3745-270-40; Effective Pigments, and Food, Drug and February 16, 2009. Cosmetic Colorants; Mass Loadings-Based Listing; Final...

  9. UPDATE TO THE PROBABILISTIC VOLCANIC HAZARD ANALYSIS, YUCCA MOUNTAIN, NEVADA

    Energy Technology Data Exchange (ETDEWEB)

    K.J. Coppersmith

    2005-09-14

    A probabilistic volcanic hazard analysis (PVHA) was conducted in 1996 for the proposed repository at Yucca Mountain, Nevada. Based on data gathered by the Yucca Mountain Project over the course of about 15 years, the analysis integrated the judgments of a panel of ten volcanic experts using methods of formal expert elicitation. PVHA resulted in a probability distribution of the annual frequency of a dike intersecting the repository, which ranges from 10E-7 to 10E-10 (mean 1.6 x 10E-8). The analysis incorporates assessments of the future locations, rates, and types of volcanic dikes that could intersect the repository, which lies about 300 m below the surface. A particular focus of the analysis is the quantification of uncertainties. Since the 1996 PVHA, additional aeromagnetic data have been collected in the Yucca Mountain region, including a high-resolution low-altitude survey. A number of anomalies have been identified within alluvial areas and modeling suggests that some of these may represent buried eruptive centers (basaltic cinder cones). A program is currently underway to drill several of the anomalies to gain information on their origin and, if basalt, their age and composition. To update the PVHA in light of the new aeromagnetic and drilling data as well as other advancements in volcanic hazard modeling over the past decade, the expert panel has been reconvened and the expert elicitation process has been fully restarted. The analysis requires assessments of the spatial distribution of igneous events, temporal distributions, and geometries and characteristics of future events (both intrusive and extrusive). The assessments are for future time periods of 10,000 years and 1,000,000 years. Uncertainties are being quantified in both the conceptual models that define these elements as well as in the parameters for the models. The expert elicitation process is centered around a series of workshops that focus on the available data; alternative approaches to

  10. Hazard analysis system of urban post-earthquake fire based on GIS

    Institute of Scientific and Technical Information of China (English)

    李杰; 江建华; 李明浩

    2001-01-01

    The authors study the structure, functions and data organization of a hazard analysis system for urban post-earthquake fire on a GIS platform. A general hazard analysis model of post-earthquake fire is presented. Taking the Shanghai central district as background, a system for hazard analysis of post-earthquake fire and auxiliary decision support against fire is developed.

  11. ORNL necessary and sufficient standards for environment, safety, and health. Final report of the Identification Team for other industrial, radiological, and non-radiological hazard facilities

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-01

    This Necessary and Sufficient (N and S) set of standards is for Other Industrial, Radiological, and Non-Radiological Hazard Facilities at Oak Ridge National Laboratory (ORNL). These facility classifications are based on a laboratory-wide approach to classify facilities by hazard category. An analysis of the hazards associated with the facilities at ORNL was conducted in 1993. To identify standards appropriate for these Other Industrial, Radiological, and Non-Radiological Hazard Facilities, the activities conducted in these facilities were assessed, and the hazards associated with the activities were identified. A preliminary hazards list was distributed to all ORNL organizations. The hazards identified in prior hazard analyses are contained in the list, and a category of other was provided in each general hazard area. A workshop to assist organizations in properly completing the list was held. Completed hazard screening lists were compiled for each ORNL division, and a master list was compiled for all Other Industrial, Radiological Hazard, and Non-Radiological facilities and activities. The master list was compared against the results of prior hazard analyses by research and development and environment, safety, and health personnel to ensure completeness. This list, which served as a basis for identifying applicable environment, safety, and health standards, appears in Appendix A.

  12. Hazard and socioenvironmental weakness: radioactive waste final disposal in the perception of the Abadia de Goias residents, GO, Brazil; Risco e vulnerabilidade socioambiental: o deposito definitivo de rejeitos radioativos na percepcao dos moradores de Abadia de Goias

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Elaine Campos

    2005-07-01

    This work investigates the hazard and the socio-environmental vulnerability of the community living around the radioactive waste final disposal facility located in the Abadia de Goias municipality, Goias state, Brazil. To gain a deeper understanding of the hazards characteristic of modernity, the sociological aspects under discussion were studied in the works of Anthony Giddens and Ulrich Beck. The phenomenon was analyzed on the basis of the subjective experiences of the residents, who have lived there for approximately 16 years. This temporal analysis is related to the social impact suffered by the residents due to the radioactive wastes originating from the cesium-137 radiation accident in Goiania, GO, Brazil, in 1987. In spite of the site's safety measures, residents identified the disposal facility as a source of hazard, although longer-term residents have adapted better. The vulnerability of the area is heightened by the proximity of residences to the radioactive waste final disposal site. (author)

  13. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in PLCs and FPGAs used to develop I and C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and indicates that software hazard analysis should be performed across the phases of the software life cycle, such as requirements analysis, design, detailed design, and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is well suited to the use of guide phrases. HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for Korean nuclear power plant software developed for PLCs; in that work, appropriate guide phrases and analysis processes were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches: NUREG/CR-6430, and HAZOP using general guide words. We also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with HAZOP using general guide words, and it is sufficiently applicable to analyzing the software requirements specification of an FPGA.

  14. Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing

    Science.gov (United States)

    Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.

    2008-01-01

    Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics and the overall all guidance navigation and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.

  15. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    Science.gov (United States)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    of glacial lakes and their hazard potential. This phase of glacial lake hazard assessment aims to be geographically comprehensive in order to identify potentially dangerous lakes that may have previously been ignored. A second phase of analysis that includes site visits will be necessary for a thorough analysis at each lake to determine the potential hazard for downstream communities. The objective of the work presented here is to identify potentially dangerous lakes that warrant further study rather than provide a final hazard assessment for each lake of the glacial lake inventory in the Cordillera Blanca. References: Emmer, A. and Vilímek, V.: New method for assessing the potential hazardousness of glacial lakes in the Cordillera Blanca, Peru, Hydrol. Earth Syst. Sci. Discuss., 11, 2391-2439, 2014. UGRH - Unidad de Glaciologia y Recursos Hidricos. Inventario de Lagunas Glaciares del Peru. Ministerio de Agricultura y Riego, Autoridad Nacional del Agua, Direcccion de Conservacion y Planeamiento de Recursos Hidricos, Huaraz, Peru, 2014. Wang, W., Yao, T., Gao, Y., Yang, X., and Kattel, D. B.: A first-order method to identify potentially dangerous glacial lakes in a region of the southeastern Tibetan Plateau, Mountain Res. Develop., 31, 122-130, 2011.

  16. Site-specific probabilistic seismic hazard analyses for the Idaho National Engineering Laboratory. Volume 1: Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-05-01

    This report describes and summarizes a probabilistic evaluation of ground motions for the Idaho National Engineering Laboratory (INEL). The purpose of this evaluation is to provide a basis for updating the seismic design criteria for the INEL. In this study, site-specific seismic hazard curves were developed for seven facility sites as prescribed by DOE Standards 1022-93 and 1023-96. These sites include: the Advanced Test Reactor (ATR); Argonne National Laboratory West (ANL); Idaho Chemical Processing Plant (ICPP or CPP); Power Burst Facility (PBF); Radioactive Waste Management Complex (RWMC); Naval Reactor Facility (NRF); and Test Area North (TAN). The results, probabilistic peak ground accelerations and uniform hazard spectra, contained in this report are not to be used for purposes of seismic design at INEL. A subsequent study will be performed to translate the results of this probabilistic seismic hazard analysis into site-specific seismic design values for the INEL as per the requirements of DOE Standard 1020-94. These site-specific seismic design values will be incorporated into the INEL Architectural and Engineering Standards.

  17. Hazard function analysis for flood planning under nonstationarity

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
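
    A minimal Monte-Carlo sketch of the idea above: with an upward trend in the log-mean of a two-parameter lognormal flood series, the simulated "time to failure" of a fixed design level is no longer exponential, and a two-parameter Weibull can be fitted to the sample instead. The trend, variance and design level below are illustrative assumptions, not values from the paper.

      # Monte-Carlo illustration: time to first exceedance of a fixed design
      # level under a nonstationary lognormal flood series, with a Weibull fit.
      import numpy as np
      from scipy.stats import weibull_min

      rng = np.random.default_rng(42)
      n_rep, horizon = 5000, 500
      mu0, trend, sigma = 0.0, 0.005, 0.5          # log-mean, trend per year, log-sd
      design_level = np.exp(mu0 + 2.0 * sigma)     # a fixed design flood

      times = []
      for _ in range(n_rep):
          for t in range(1, horizon + 1):
              x = rng.lognormal(mean=mu0 + trend * t, sigma=sigma)
              if x > design_level:
                  times.append(t)
                  break

      shape, loc, scale = weibull_min.fit(times, floc=0)
      print(f"mean time to failure = {np.mean(times):.1f} yr, "
            f"fitted Weibull shape = {shape:.2f}, scale = {scale:.1f}")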

  18. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  19. Landslide and debris-flow hazard analysis and prediction using GIS in Minamata Hougawachi area, Japan

    Science.gov (United States)

    Wang, Chunxiang; Esaki, Tetsuro; Xie, Mowen; Qiu, Cheng

    2006-10-01

    On July 20, 2003, following a short duration of heavy rainfall, a debris-flow disaster occurred in the Minamata Hougawachi area, Kumamoto Prefecture, Japan. This disaster was triggered by a landslide. In order to assess the landslide and debris-flow hazard potential of this mountainous region, the study of historic landslides is critical. The objective of the study is to couple 3D slope-stability analysis models and 2D numerical simulation of debris flow within a geographical information system in order to identify the potential landslide-hazard area. Based on field observations, the failure mechanism of the past landslide is analyzed and the mechanical parameters for 3D slope-stability analysis are calculated from the historic landslide. Then, to locate potential new landslides, the studied area is divided into slope units. Based on 3D slope-stability analysis models and on Monte Carlo simulation, the spots of potential landslides are identified. Finally, we propose a depth-averaged 2D numerical model, in which the debris and water mixture is assumed to be a uniform, continuous, incompressible, unsteady Newtonian fluid. The method accurately models the historic debris flow. According to the 2D numerical simulation, the results of the debris-flow model, including the potentially inundated areas, are analyzed, and potentially affected houses, river and road are mapped.

  20. Analysis Landslide Hazard in Banjarmangu Sub District, Banjarnegara District

    Directory of Open Access Journals (Sweden)

    Kuswaji Dwi Priyono

    2016-05-01

    Full Text Available The objective of the research is to find the most suitable soil conservation practice that may be applied to control landslide hazard. In order to achieve that objective, the following research steps were carried out: (1) identifying the land characteristics of the study area based on an understanding of the factors that cause and trigger landslides, i.e. slope morphology, rock/soil characteristics, climatic conditions, and land use; (2) studying the types of landslide that occur in every landform and determining the areas having an ideal landslide form. Landslide in this research is defined as the process of mass wasting down-slope resulting from the action of gravity on sliding materials. The landslide types include creep, slide, slump, and rock/soil fall. The methods applied in the research include field surveys and the determination of landslide hazard using geographic information techniques. The field survey was intended to characterize the location of every landslide that has occurred in the study area. The field survey results were used as the basis for determining the grade of landslide hazard. Scoring and weighting of the factors that influence landslides were applied to determine the grade of landslide hazard; the scores and weights were not the same for every parameter used in the evaluation. The field research shows that landslides occur in every landform unit. The study area can be divided into 9 landform units. The landform units were differentiated into landslide hazard classes; five classes of landslide hazard were found in the study area, namely: (1) very low hazard, 16.65% (1 landform unit); (2) low hazard, 7.63% (1 landform unit); (3) medium hazard, 37.58% (3 landform units); (4) high hazard, 25.41% (2 landform units); and (5) highest hazard, 12.73% (2 landform units). Evaluation of landslide hazard shows that most of the study area
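
    The scoring-and-weighting step described above can be illustrated with a few lines of Python: each factor receives a score per landform unit, the factor weights differ, and the weighted index is binned into the five hazard classes. The scores, weights and class breaks below are placeholders, not those used in the study.

      # Illustrative weighted scoring of landslide factors for one landform unit.
      def landslide_hazard_class(scores, weights, breaks=(0.2, 0.4, 0.6, 0.8)):
          """scores/weights: dicts keyed by factor name, scores in [0, 1]."""
          total_w = sum(weights.values())
          index = sum(scores[f] * weights[f] for f in scores) / total_w
          labels = ["very low", "low", "medium", "high", "highest"]
          for cut, label in zip(breaks, labels):
              if index < cut:
                  return index, label
          return index, labels[-1]

      unit = {"slope": 0.7, "lithology": 0.5, "rainfall": 0.9, "landuse": 0.4}
      w = {"slope": 3.0, "lithology": 2.0, "rainfall": 2.5, "landuse": 1.0}
      print(landslide_hazard_class(unit, w))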

  1. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    Science.gov (United States)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform especially decision makers in the insurance industry, the administration, and politicians on potential consequences and are the basis for appropriate risk management strategies. Thereby, results (i) on an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The first analyses are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee accordant reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis to estimate spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  2. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis

    Science.gov (United States)

    Schaefer, K.; Prakash, A.; Witte, W.

    2011-12-01

    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area that is dominated by a semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wildland fires, earthquakes, extreme winter storms and other natural and man-made hazards. Because the borough covers a large area of 19,065 km2 and has a population of approximately 97,000 residents, providing emergency services in a timely manner is a challenge. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network analysis based decision support tool that we have created for planning and emergency response. This tool will be used by Emergency Service (Fire/EMS), Emergency Management, Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare for and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis resulted in determining response times and classifying the borough by response times to facilitate allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) a major wildfire threatening Fairbanks, (ii) a major earthquake, (iii) loss of power during flooding in a flood-prone area. We also combined the network analysis results with high resolution imagery and elevation data to determine
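
    A small, self-contained sketch of the kind of network response-time calculation described above; the road graph, travel times, and time bands are made up (the study itself used ESRI's ArcGIS network tools):

      import heapq

      def response_times(graph, station):
          # Dijkstra shortest travel times (minutes) from one station to all nodes;
          # a stand-in for the GIS network analysis described above.
          dist = {station: 0.0}
          queue = [(0.0, station)]
          while queue:
              d, node = heapq.heappop(queue)
              if d > dist.get(node, float("inf")):
                  continue
              for nbr, minutes in graph.get(node, []):
                  nd = d + minutes
                  if nd < dist.get(nbr, float("inf")):
                      dist[nbr] = nd
                      heapq.heappush(queue, (nd, nbr))
          return dist

      # Hypothetical road network: node -> [(neighbour, travel time in minutes), ...]
      roads = {
          "station": [("A", 4.0), ("B", 9.0)],
          "A": [("B", 3.0), ("C", 7.0)],
          "B": [("C", 2.0)],
          "C": [],
      }
      times = response_times(roads, "station")
      # Classify locations by response-time band, e.g. <5, 5-10, >10 minutes.
      print({n: ("<5" if t < 5 else "5-10" if t <= 10 else ">10") for n, t in times.items()})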

  3. Analysis of hazardous biological material by MALDI mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community in this potential field of application is evidenced by the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and selection of optimal experimental parameters such as solvent and matrix.

  4. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially
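
    A toy illustration of the systematic-ranging idea described above; the grid bounds, the placeholder scoring function, and the 0.01 au cut are assumptions for the sketch, not Scout's actual algorithm:

      import numpy as np

      # Hypothetical grid in topocentric range (au) and range rate (au/day); the real
      # system ties plane-of-sky position and motion to the astrometry and scores
      # every grid point against the observations.
      ranges = np.geomspace(1e-3, 5.0, 60)         # au
      range_rates = np.linspace(-0.05, 0.05, 40)   # au/day

      def orbit_weight(rho, rho_dot):
          # Placeholder goodness-of-fit for the orbit implied by (rho, rho_dot);
          # in practice this would come from fitting the short astrometric arc.
          return np.exp(-((rho - 0.1) ** 2) / 0.02 - (rho_dot ** 2) / 0.001)

      weights = np.array([[orbit_weight(r, rr) for rr in range_rates] for r in ranges])
      weights /= weights.sum()

      # Fraction of orbit probability at very small ranges: a crude close-approach flag.
      print("P(range < 0.01 au) =", float(weights[ranges < 0.01, :].sum()))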

  5. Additional EIPC Study Analysis. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, Stanton W [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gotham, Douglas J. [Purdue Univ., West Lafayette, IN (United States); Luciani, Ralph L. [Navigant Consultant Inc., Suwanee, GA (United States)

    2014-12-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 14 topics was developed for further analysis. This paper brings together the earlier interim reports of the first 13 topics plus one additional topic into a single final report.

  6. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While STPA has proven to be very effective on real systems, no formal structure has been defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  7. Open Source Seismic Hazard Analysis Software Framework (OpenSHA)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — OpenSHA is an effort to develop object-oriented, web- & GUI-enabled, open-source, and freely available code for conducting Seismic Hazard Analyses (SHA). Our...

  8. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    , the flexibility of NDSHA allows for generation of ground shaking maps at specified long-term return times, which may permit a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic waves propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most of the regions worldwide) with a satisfactory completeness level for M>5, which warrants the results of analysis. By analysing in some detail seismicity in the Vrancea region, we show that well constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g. State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.

  9. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    Energy Technology Data Exchange (ETDEWEB)

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What/If' Hazards Analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operation sequence steps for the mechanical handling operations: preparing the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  10. Final Report: Hydrogen Storage System Cost Analysis

    Energy Technology Data Exchange (ETDEWEB)

    James, Brian David [Strategic Analysis Inc., Arlington, VA (United States); Houchins, Cassidy [Strategic Analysis Inc., Arlington, VA (United States); Huya-Kouadio, Jennie Moton [Strategic Analysis Inc., Arlington, VA (United States); DeSantis, Daniel A. [Strategic Analysis Inc., Arlington, VA (United States)

    2016-09-30

    The Fuel Cell Technologies Office (FCTO) has identified hydrogen storage as a key enabling technology for advancing hydrogen and fuel cell power technologies in transportation, stationary, and portable applications. Consequently, FCTO has established targets to chart the progress of developing and demonstrating viable hydrogen storage technologies for transportation and stationary applications. This cost assessment project supports the overall FCTO goals by identifying the current technology system components, performance levels, and manufacturing/assembly techniques most likely to lead to the lowest system storage cost. Furthermore, the project forecasts the cost of these systems at a variety of annual manufacturing rates to allow comparison to the overall 2017 and “Ultimate” DOE cost targets. The cost breakdown of the system components and manufacturing steps can then be used to guide future research and development (R&D) decisions. The project was led by Strategic Analysis Inc. (SA) and aided by Rajesh Ahluwalia and Thanh Hua from Argonne National Laboratory (ANL) and Lin Simpson at the National Renewable Energy Laboratory (NREL). Since SA coordinated the project activities of all three organizations, this report includes a technical description of all project activity. This report represents a summary of contract activities and findings under SA’s five year contract to the US Department of Energy (Award No. DE-EE0005253) and constitutes the “Final Scientific Report” deliverable. Project publications and presentations are listed in the Appendix.

  11. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as aircraft, automotive and wind energy industry. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are connected with notable tool wear. On the other hand, thermal processing methods are critical as the two components matrix and reinforcement have widely differing thermophysical properties, possibly leading to damages of the composite structure in terms of pores or delamination. An emerging innovative method for processing of CFRP materials is the laser technology. As principally thermal method, laser processing is connected with the release of potentially hazardous, gaseous and particulate substances. Detailed knowledge of these process emissions is the basis to ensure the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) in the course of the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  12. Hazard Detection Analysis for a Forward-Looking Interferometer

    Science.gov (United States)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew; hide

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements that were conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards.

  13. Evaluation of potential surface rupture and review of current seismic hazards program at the Los Alamos National Laboratory. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1991-12-09

    This report summarizes the authors' review and evaluation of the existing seismic hazards program at Los Alamos National Laboratory (LANL). The report recommends that the original program be augmented with a probabilistic analysis of seismic hazards involving assignment of weighted probabilities of occurrence to all potential sources. This approach yields a more realistic evaluation of the likelihood of large earthquake occurrence, particularly in regions where seismic sources may have recurrence intervals of several thousand years or more. The report reviews the locations and geomorphic expressions of identified fault lines along with the known displacements of these faults and the last known occurrence of seismic activity. Faults are mapped and categorized by their potential for actual movement. Based on geologic site characterization, recommendations are made for increased seismic monitoring; age-dating studies of faults and geomorphic features; increased use of remote sensing and aerial photography for surface mapping of faults; the development of a landslide susceptibility map; and the development of seismic design standards for all existing and proposed facilities at LANL.

  14. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, R.D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
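
    A hedged sketch of the kind of Monte Carlo reliability calculation described above for a single treatment process; the influent, removal, and discharge-limit values are illustrative only, not taken from the study:

      import numpy as np

      rng = np.random.default_rng(1)

      def reliability(n=100_000):
          # Monte Carlo sketch of a treatment-process reliability calculation
          # (e.g. packed tower aeration); the distributions and limit are assumptions.
          influent = rng.lognormal(mean=np.log(500.0), sigma=0.4, size=n)  # ug/L
          removal = rng.beta(a=40.0, b=2.0, size=n)                        # removal fraction
          effluent = influent * (1.0 - removal)
          limit = 30.0                                                     # ug/L discharge limit
          return float(np.mean(effluent <= limit))

      # Reliability = probability that the effluent meets the limit; repeating this for
      # designs with different safety factors shows how reliability scales with the factor.
      print("reliability:", reliability())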

  15. Hazardous Glaciers In Switzerland: A Statistical Analysis of Inventory Data

    Science.gov (United States)

    Raymond, M.; Funk, M.; Wegmann, M.

    Because of the recent increase in both occupation and economic activities in high mountain areas, a systematic overview of potential glacier hazard zones is needed to avoid the construction of settlements and infrastructure in endangered areas in the future. Historical information about glacier disasters shows that catastrophic events can happen repeatedly for the same causes and with the same dramatic consequences. Past catastrophic events are not only useful to identify potentially dangerous glaciers, but represent an indication of the kind of glacier hazards to expect for any given glacier. An inventory containing all known events that caused damage in the past has been compiled for Switzerland. Three different types of glacier hazards are distinguished, e.g. ice avalanches, glacier floods and glacier length changes. Hazardous glaciers have been identified in the alpine cantons of Bern, Grison, Uri, Vaud and Valais so far. The inventory data were analysed in terms of the periodicity of different types of events as well as of the damage that occurred.

  16. Estimating Source Recurrence Rates for Probabilistic Tsunami Hazard Analysis (PTHA)

    Science.gov (United States)

    Geist, E. L.; Parsons, T.

    2004-12-01

    A critical factor in probabilistic tsunami hazard analysis (PTHA) is estimating the average recurrence rate for tsunamigenic sources. Computational PTHA involves aggregating runup values derived from numerical simulations for many far-field and local sources, primarily earthquakes, each with a specified probability of occurrence. Computational PTHA is the primary method used in the ongoing FEMA pilot study at Seaside, Oregon. For a Poissonian arrival time model, the probability for a given source is dependent on a single parameter: the mean inter-event time of the source. In other probability models, parameters such as aperiodicity are also included. In this study, we focus on methods to determine the recurrence rates for large, shallow subduction zone earthquakes. For earthquakes below about M=8, recurrence rates can be obtained from modified Gutenberg-Richter distributions that are constrained by the tectonic moment rate for individual subduction zones. However, significant runup from far-field sources is commonly associated with the largest magnitude earthquakes, for which the recurrence rates are poorly constrained by the tail of empirical frequency-magnitude relationships. For these earthquakes, paleoseismic evidence of great earthquakes can be used to establish recurrence rates. Because the number of geologic horizons representing great earthquakes along a particular subduction zone is limited, special techniques are needed to account for open intervals before the first and after the last observed events. Uncertainty in age dates for the horizons also has to be included in estimating recurrence rates and aperiodicity. A Monte Carlo simulation is performed in which a random sample of earthquake times is drawn from a specified probability distribution with varying average recurrence rates and aperiodicities. A recurrence rate can be determined from the mean rate of all random samples that fit the observations, or a range of rates can be carried through the
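
    A simplified sketch of the Monte Carlo recurrence-rate constraint described above; the lognormal inter-event model, the aperiodicity of about 0.5, the consistency cut, and the hypothetical paleoseismic record are assumptions for illustration, not the authors' choices:

      import numpy as np

      rng = np.random.default_rng(2)

      def acceptable_median_intervals(n_observed, record_years, trials=200):
          # Which median recurrence intervals are consistent with observing n_observed
          # great-earthquake horizons in a paleoseismic record of record_years?
          accepted = []
          for median_ri in np.linspace(200.0, 2000.0, 91):   # candidate intervals, years
              hits = 0
              for _ in range(trials):
                  intervals = rng.lognormal(np.log(median_ri), 0.5, size=100)
                  event_times = np.cumsum(intervals)
                  if np.sum(event_times <= record_years) == n_observed:
                      hits += 1
              if hits / trials > 0.05:                        # crude consistency cut
                  accepted.append(median_ri)
          return (min(accepted), max(accepted)) if accepted else None

      # Hypothetical record: 7 event horizons preserved in a 3,500-year record.
      print(acceptable_median_intervals(n_observed=7, record_years=3500.0))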

  17. Probabilistic Tsunami Hazard Analysis for Eastern Sicily (Italy)

    Science.gov (United States)

    Lorito, Stefano; Piatanesi, Alessio; Romano, Fabrizio; Basili, Roberto; Kastelic, Vanja; Tiberti, Mara Monica; Valensise, Gianluca; Selva, Jacopo

    2010-05-01

    We present preliminary results of a Probabilistic Tsunami Hazard Analysis (PTHA) for the coast of eastern Sicily. We only consider earthquake-generated tsunamis. We focus on important cities such as Messina, Catania, and Augusta. We consider different potentially tsunamigenic Source Zones (SZ) in the Mediterranean basin, based on geological and seismological evidence. Considering many synthetic earthquakes for each SZ, we numerically simulate the entire tsunami propagation, from sea-floor displacement to inundation. We evaluate different tsunami damage metrics, such as maximum runup, current speed, momentum and Froude number. We use a finite difference scheme in the shallow-water approximation for the tsunami propagation at open sea, and a finite volumes scheme for the inundation phase. For the shoaling and inundation stages, we have built a bathy-topo model by merging the GEBCO database, multibeam soundings, and topographic data at 10 m resolution. Accounting for their relative probability of occurrence, deterministic scenarios are merged together to assess PTHA at the selected target sites, expressed as a probability of exceedance of a given threshold (e.g. 1 m wave height) in a given time (e.g. 100 yr). First order epistemic and aleatory uncertainties are assessed through a logic tree, accounting for changes in the variables judged to have a major impact on PTHA, and for possible incompleteness of the SZs. The SZs are located at short, intermediate and large distances with respect to the target coastlines. We thus highlight, for different source-target distances, the relative importance of the different source parameters, and/or the role of the uncertainties in the input parameters estimation. Our results suggest that in terms of inundation extent the Hellenic Arc SZ has the highest impact on the selected target coastlines. In terms of exceedance probability instead, there is a larger variability depending not only on location and recurrence but also on
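
    A compact sketch of how scenario results can be aggregated into an exceedance probability under a Poisson assumption; the rates, wave heights, and thresholds below are placeholders, not the study's values:

      import math

      # Hypothetical scenarios: (annual rate of occurrence, simulated wave height in metres
      # at one target site). In the study these come from the source zones and simulations.
      scenarios = [(1e-3, 2.4), (5e-4, 0.8), (2e-3, 1.3), (1e-4, 4.0)]

      def prob_exceedance(threshold_m, exposure_years):
          # P(at least one exceedance of the threshold in the exposure time), Poisson model.
          rate = sum(r for r, h in scenarios if h > threshold_m)
          return 1.0 - math.exp(-rate * exposure_years)

      print("P(>1 m in 100 yr) =", round(prob_exceedance(1.0, 100.0), 3))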

  18. Hazard analysis for 300 Area N Reactor Fuel Fabrication and Storage Facilty

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, D.J.; Brehm, J.R.

    1994-01-25

    This hazard analysis (HA) has been prepared for the 300 Area N Reactor Fuel Fabrication and Storage Facility (Facility), in compliance with the requirements of Westinghouse Hanford Company (Westinghouse Hanford) controlled manual WHC-CM-4-46, Nonreactor Facility Safety Analysis Manual, and to the direction of WHC-IP-0690, Safety Analysis and Regulation Desk Instructions, (WHC 1992). An HA identifies potentially hazardous conditions in a facility and the associated potential accident scenarios. Unlike the Facility hazard classification documented in WHC-SD-NR-HC-004, Hazard Classification for 300 Area N Reactor Fuel Fabrication and Storage Facility, (Huang 1993), which is based on unmitigated consequences, credit is taken in an HA for administrative controls or engineered safety features planned or in place. The HA is the foundation for the accident analysis. The significant event scenarios identified by this HA will be further evaluated in a subsequent accident analysis.

  19. RHDM procedure for analysis of the potential specific risk due to a rockfall hazard

    Directory of Open Access Journals (Sweden)

    Blažo Đurović

    2005-06-01

    Full Text Available Theoretical basis and practical legislation (Water Law and regulation acts) would allow in future the determination and classification of endangered territorial zones due to various natural hazards, among them also due to rock collapse and rockfall hazard as forms of the mass movement hazard. Interdisciplinary risk analysis, assessment and management of natural hazards are factors of harmonious spatial development in future. Especially risk analysis is the essential part of preventive mitigation actions and forms the basis for evaluation of the spatial plans, programs and policies. In accordance with the basic principles of the risk analysis the Rockfall Hazard Determination Method (RHDM) for estimation of the potential specific risk degree due to a rockfall hazard along roadways and in the hinterland is introduced. The method is derived from the Rockfall Hazard Rating System (RHRS) and adjusted to a holistic concept of the risk analysis procedure. The outcomes of the phenomenon simulation with a computer programme for rock mass movement analysis at local scale are included, as well as climate and seismic conditions criteria which are newly introduced, thus making this method more adequate for specific geologic conditions in Slovenia.

  20. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-04-26

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... the proposed rule, ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  1. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-02-19

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  2. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-11-20

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... 3646), entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk- Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  3. Development of hazard analysis by critical control points (HACCP) procedures to control organic chemical hazards in the agricultural production of raw food commodities.

    Science.gov (United States)

    Ropkins, Karl; Ferguson, Andrew; Beck, Angus J

    2003-01-01

    Hazard Analysis by Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards in the food chain. Effective HACCP requires the consideration of all chemical, microbiological, and physical hazards. However, current procedures focus primarily on microbiological and physical hazards, while chemical aspects of HACCP have received relatively little attention. In this article we discuss the application of HACCP to organic chemical contaminants and the problems that are likely to be encountered in agriculture. We also present generic templates for the development of organic chemical contaminant HACCP procedures for selected raw food commodities, that is, cereal crops, raw meats, and milk.

  4. Hazard Analysis and Critical Control Point Program for Foodservice Establishments.

    Science.gov (United States)

    Control Point (HACCP) inspections in foodservice operations throughout the state. The HACCP system, which first emerged in the late 1960s, is a rational...has been adopted for use in the foodservice industry. The HACCP system consists of three main components which are the: (1) Assessment of the hazards...to monitor critical control points. This system has shown promise as a tool to reduce the frequency of foodborne disease outbreaks in foodservice

  5. Rockfall hazard analysis using LiDAR and spatial modeling

    Science.gov (United States)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.

  6. [Investigation and analysis on occupational hazards in a carbon enterprise].

    Science.gov (United States)

    Lu, C D; Ding, Q F; Wang, Z X; Shao, H; Sun, X C; Zhang, F

    2017-04-20

    Objective: To investigate the occupational disease hazards in a carbon enterprise workplace and the occupational health examinations of its personnel, providing a basis for occupational disease prevention and control in the industry. Methods: Field occupational health surveys and testing methods were used to study the situation and degree of occupational disease hazards in the carbon enterprise from 2013 to 2015. Occupational health monitoring was carried out for the workers, and the results of physical examinations and of the detection of occupational hazard factors were analyzed comprehensively. Results: Dust, coal tar pitch volatiles, and noise in the carbon enterprise were more serious than other hazards. Among them, the over-standard rate of coal tar pitch volatiles was 76.67%, the maximum point detection was 1.06 mg/m(3), and the maximum individual detection was 0.67 mg/m(3). There was no statistical difference among the 3 years (P>0.05). There were no significant differences in the rates of abnormal findings for chest X-ray, skin, audiometry, blood routine, blood pressure, and electrocardiogram in the occupational health examinations between the 3 years (P>0.05), although the abnormal rates for skin and audiometry were higher than 10% per year. Conclusion: Dust, coal tar, and noise are the main occupational hazard factors of the carbon enterprise, and the corresponding protection should be strengthened.

  7. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

    Science.gov (United States)

    Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

    2016-04-01

    measures in the area. After designing measures, the users can re-calculate risk by updating hazard intensity and object layers. This is achieved by manual editing of shape (vector) layers in the web-GIS interface interactively. Within the application, a cost-benefit analysis tool is also integrated to support the decision-making process for the selection of different protection measures. Finally, the resultant risk information (vector layers and data) can be exported in the form of shapefiles and excel sheets. A prototype application is realized using open-source geospatial software and technologies. Boundless framework with its client-side SDK environment is applied for the rapid prototyping. Free and open source components such as PostGIS spatial database, GeoServer and GeoWebCache, GeoExt and OpenLayers are used for the development of the platform. This developed prototype is demonstrated with a case study area located in Les Diablerets, Switzerland. This research work is carried out within a project funded by the Canton of Vaud, Switzerland. References: Bründl, M., Romang, H. E., Bischof, N., and Rheinberger, C. M.: The risk concept and its application in natural hazard risk management in Switzerland, Nat. Hazards Earth Syst. Sci., 9, 801-813, 2009. DGE: Valdorisk - Direction Générale de l'Environnement, www.vd.ch, accessed 9 January 2016, 2016. OFEV: EconoMe - Office fédéral de l'environnement, www.econome.admin.ch, accessed 9 January 2016, 2016.

  8. Hazard Analysis and Risk Assessment for an Automated Unmanned Protective Vehicle

    OpenAIRE

    Stolte, Torben; Bagschik, Gerrit; Reschka, Andreas; Maurer, and Markus

    2017-01-01

    For future application of automated vehicles in public traffic, ensuring functional safety is essential. In this context, a hazard analysis and risk assessment is an important input for designing functionally safe vehicle automation systems. In this contribution, we present a detailed hazard analysis and risk assessment (HARA) according to the ISO 26262 standard for a specific Level 4 application, namely an unmanned protective vehicle operated without human supervision for motorway hard shoulder r...

  9. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    Science.gov (United States)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific Risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake to the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
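
    A toy illustration of the risk product described above (expected loss as the product of quantified hazard, vulnerability, and exposure), with entirely made-up numbers:

      # Hypothetical site: annual probability of damaging shaking, a fragility-style mean
      # damage ratio given that shaking, and the exposed replacement value.
      hazard_annual_prob = 0.01       # P(shaking >= damaging level) per year
      vulnerability = 0.25            # mean damage ratio given that level of shaking
      exposure_usd = 40_000_000       # replacement value of the exposed construction

      expected_annual_loss = hazard_annual_prob * vulnerability * exposure_usd
      print(f"expected annual loss: ${expected_annual_loss:,.0f}")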

  10. Progresses in geology and hazards analysis of Tianchi Volcano

    Institute of Scientific and Technical Information of China (English)

    WEI Hai-quan; JIN Bo-lu; LIU Yong-shun

    2004-01-01

    A number of different lahars have been recognized from a systematic survey of a mapping project. The high setting temperature of the deposits indicates a relationship between the lahars and the Millennium eruption event of Tianchi Volcano. The lahars caused a dramatic disaster. Recognition of the huge avalanche scars and deposits around Tianchi Volcano implies another highly destructive hazard. Three types of avalanche deposits with different textures have been recognized. Magma mixing processes often occurred during the Millennium eruption of Tianchi Volcano, indicating a mixing and co-eruption regime of the eruption.

  11. Bedded-salt repository analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Guiffre, M.S.; Kaplan, M.F.; Ensminger, D.A.; Oston, S.G.; Nalbandian, J.Y.

    1980-03-31

    This report contains a description of an analysis of a generic nuclear waste repository in bedded salt. This analysis was performed by TASC for inclusion in a major Lawrence Livermore Laboratory report to NRC; this report therefore should be viewed as providing more complete and detailed information about this analysis than was possible to include in the LLL report. The analysis is performed with the NUTRAN computer codes, which are described in the report. The model to be analyzed is defined, and the results of a series of possible waste migration scenarios are presented. Several of these scenarios are used as the basis for a sensitivity analysis, and an uncertainty analysis utilizing Monte Carlo techniques is also performed. A new method for defining the consequences to users of a well drilled near the repository is also described, and results are presented based on two of the waste migration scenarios.

  12. Debris flow and landslide hazard mapping and risk analysis in China

    Institute of Scientific and Technical Information of China (English)

    Xilin LIU; Chengjun YU; Peijun SHI; Weihua FANG

    2012-01-01

    This paper assesses the hazardousness, vulnerability and risk of debris flow and landslide in China and compiles maps at a scale of 1:6,000,000, based on Geographical Information System (GIS) technology, a hazard regionalization map, and socioeconomic data from 2000. Integrated hazardousness of debris flow and landslide is equivalent to the sum of debris flow hazardousness and landslide hazardousness. Vulnerability is assessed by employing a simplified assessment model. Risk is calculated by the following formula: Risk = Hazardousness × Vulnerability. The analysis results of the assessment of hazardousness, vulnerability and risk show that there are extremely high risk regions of 104 km2, high risk regions of 283,008 km2, moderate risk regions of 3,161,815 km2, low risk regions of 3,299,604 km2, and extremely low risk regions of 2,681,709 km2. Exploitation activities should be prohibited in extremely high risk and high risk regions and restricted in moderate risk regions. The present study on risk analysis of debris flow and landslide not only sheds new light on the future work in this direction but also provides a scientific basis for disaster prevention and mitigation policy making.
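
    A minimal sketch of the grid-based combination described above (Risk = Hazardousness × Vulnerability), using made-up raster values and class breaks rather than the study's data:

      import numpy as np

      # Hypothetical rasters on a common grid, scaled to [0, 1]: hazardousness is the sum
      # of the debris-flow and landslide components, vulnerability comes from a simplified
      # socioeconomic model.
      debris_flow = np.array([[0.1, 0.2, 0.4, 0.1],
                              [0.3, 0.5, 0.6, 0.2],
                              [0.2, 0.4, 0.7, 0.3],
                              [0.1, 0.2, 0.3, 0.1]])
      landslide = np.full((4, 4), 0.2)
      vulnerability = np.array([[0.1, 0.1, 0.6, 0.8],
                                [0.1, 0.2, 0.7, 0.9],
                                [0.1, 0.2, 0.5, 0.6],
                                [0.0, 0.1, 0.2, 0.3]])

      hazardousness = np.clip(debris_flow + landslide, 0.0, 1.0)
      risk = hazardousness * vulnerability

      # Assumed class breaks; five classes from extremely low (0) to extremely high (4).
      classes = np.digitize(risk, bins=[0.05, 0.15, 0.30, 0.50])
      print(classes)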

  13. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    Science.gov (United States)

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station includes various hazard factors, hazard assessment is needed and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed to be the weight of index. In contrast to the index weight of other methods, cloud weight is shown by cloud descriptors; hence, the randomness and fuzziness of cloud weight will make it effective to reflect the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD); the calculation algorithm of CCD is also worked out. By utilizing the CCD, the hazard assessment results are shown by some normal clouds, and the normal clouds are reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP based SPA, respectively. The comparison of assessment results illustrates that the CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA will make the assessment results more reasonable and scientific. PMID:28076440

  14. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations.

    Science.gov (United States)

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station includes various hazard factors, hazard assessment is needed and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed to be the weight of index. In contrast to the index weight of other methods, cloud weight is shown by cloud descriptors; hence, the randomness and fuzziness of cloud weight will make it effective to reflect the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD); the calculation algorithm of CCD is also worked out. By utilizing the CCD, the hazard assessment results are shown by some normal clouds, and the normal clouds are reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP based SPA, respectively. The comparison of assessment results illustrates that the CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA will make the assessment results more reasonable and scientific.

  15. Comparison of Hazard Analysis Requirements for Instrumentation and Control System of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo [KAERI, Daejeon (Korea, Republic of); Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2014-08-15

    A hazard, in general, is defined as 'potential for harm.' In this paper, the scope of 'harm' is limited to the loss of a safety function in a Nuclear Power Plant (NPP). The Hazard Analysis (HA) of Instrumentation and Control (I and C) systems is to identify the relationship between the logical faults, errors, and failures of I and C systems and the physical harm to the nuclear power plant, and also to find the impact of external hazards to the plant, e.g., a tsunami, on the I and C systems. This paper includes a survey of the existing hazard analysis requirements in the nuclear industry. The purpose of the paper is to compare the HA requirements in various international standards in the nuclear domain, specifically the safety requirements and guidance for the instrumentation and control systems of nuclear power plants from IAEA, IEC, IEEE, and NRC.

  16. Verification of C. G. Jung's analysis of Rowland Hazard and the history of Alcoholics Anonymous.

    Science.gov (United States)

    Bluhm, Amy Colwell

    2006-11-01

    Extant historical scholarship in the Jungian literature and the Alcoholics Anonymous (AA) literature does not provide a complete picture of the treatment of Rowland Hazard by C. G. Jung, an analysis that AA co-founder Bill Wilson claimed was integral to the foundation of AA in theory and practice. Wilson's original report resulted in archivists and historians incorrectly calibrating their searches to the wrong date. The current work definitively solves the mystery of the timing of Hazard's treatment with Jung by placing his preliminary analysis with Jung in the year 1926, rather than 1930 or 1931. Previously unexamined correspondence originating from Jung, Hazard, his cousin Leonard Bacon, his uncle Irving Fisher, and his aunt Margaret Hazard Fisher is supplemented by relevant primary and secondary source material.

  17. The recent trends and perspectives for final refusing of the hazardous waste in the Republic of Macedonia

    OpenAIRE

    Krstev, Boris; Lazarov, Aleksandar; Krstev, Aleksandar; Danevski, Tome; Trajkova, Sofce; Golomeov, Blagoj; Golomeova, Mirjana

    2012-01-01

    As a result of the development of industrial production and of increased consumption, with the hazardous waste they produce, and although technological progress has been reported in the last two to three decades, the amount of hazardous waste has increased significantly, which is a worrying problem for today's civilization. The current state of waste treatment can be qualified as irregular and chaotic. This unfavorable situation is a result of the lack of a system for integrated management of the waste ...

  18. Two-dimensional hazard estimation for longevity analysis

    DEFF Research Database (Denmark)

    Fledelius, Peter; Guillen, M.; Nielsen, J.P.

    2004-01-01

    We investigate developments in Danish mortality based on data from 1974-1998 working in a two-dimensional model with chronological time and age as the two dimensions. The analyses are done with non-parametric kernel hazard estimation techniques. The only assumption is that the mortality surface ... the two-dimensional mortality surface. Furthermore we look at aggregated synthetic population metrics as 'population life expectancy' and 'population survival probability'. For Danish women these metrics indicate decreasing mortality with respect to chronological time. The metrics can not directly be used ... for prediction purposes. However, we suggest that life insurance companies use the estimation technique and the cross-validation for bandwidth selection when analyzing their portfolio mortality. The non-parametric approach may give valuable information prior to developing more sophisticated prediction models...
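
    A much-simplified sketch of a two-dimensional kernel occurrence/exposure hazard estimate over age and calendar time; this is only a stand-in for the estimator used in the paper, and the portfolio data and bandwidths below are invented:

      import numpy as np

      def kernel_hazard(ages, years, deaths, exposure, bw=(5.0, 5.0)):
          # Kernel-weighted occurrence/exposure hazard over age and calendar time;
          # the bandwidths are arbitrary here (cross-validation would normally pick them).
          age_grid = np.arange(60, 100)
          year_grid = np.arange(1974, 1999)
          est = np.empty((age_grid.size, year_grid.size))
          for i, a in enumerate(age_grid):
              for j, y in enumerate(year_grid):
                  w = np.exp(-0.5 * (((ages - a) / bw[0]) ** 2 + ((years - y) / bw[1]) ** 2))
                  est[i, j] = np.sum(w * deaths) / np.sum(w * exposure)
          return age_grid, year_grid, est

      # Hypothetical aggregated portfolio data: one record per (age, calendar year) cell.
      rng = np.random.default_rng(3)
      ages = np.repeat(np.arange(60, 100), 25)
      years = np.tile(np.arange(1974, 1999), 40)
      exposure = rng.uniform(5_000.0, 20_000.0, ages.size)
      deaths = rng.poisson(0.005 * np.exp(0.09 * (ages - 60)) * exposure)

      _, _, surface = kernel_hazard(ages, years, deaths, exposure)
      print(surface.shape, float(surface.mean()))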

  19. Towards increased reliability by objectification of Hazard Analysis and Risk Assessment (HARA) of automated automotive systems

    OpenAIRE

    Khastgir, Siddartha; Birrell, Stewart A.; Dhadyalla, Gunwant; Sivencrona, Håkan; Jennings, P. A. (Paul A.)

    2017-01-01

    Hazard Analysis and Risk Assessment (HARA) in various domains such as automotive, aviation, and the process industry suffers from issues of validity and reliability. While there has been an increasing appreciation of this subject, there have been limited approaches to overcome these issues. In the automotive domain, HARA is influenced by the ISO 26262 international standard which details functional safety of road vehicles. While ISO 26262 was a major step towards analysing hazards and risks, lik...

  20. Site specific seismic hazard analysis and determination of response spectra of Kolkata for maximum considered earthquake

    Science.gov (United States)

    Shiuly, Amit; Sahu, R. B.; Mandal, Saroj

    2017-06-01

    This paper presents a site-specific seismic hazard analysis of Kolkata city, former capital of India and present capital of the state of West Bengal, situated on the world's largest delta island in the Bengal basin. For this purpose, peak ground acceleration (PGA) for a maximum considered earthquake (MCE) at bedrock level has been estimated using an artificial neural network (ANN) based attenuation relationship developed on the basis of synthetic ground motion data for the region. Using the PGA corresponding to the MCE, a spectrum compatible acceleration time history at bedrock level has been generated by using a wavelet based computer program, WAVEGEN. This spectrum compatible time history at bedrock level has been converted to the same at surface level using SHAKE2000 for 144 borehole locations in the study region. Using the predicted values of PGA and PGV (peak ground velocity) at the surface, corresponding contours for the region have been drawn. For the MCE, the PGA at bedrock level of Kolkata city has been obtained as 0.184 g, while that at the surface level varies from 0.22 g to 0.37 g. Finally, Kolkata has been subdivided into eight seismic subzones, and for each subzone a response spectrum equation has been derived using polynomial regression analysis. This will be very helpful for structural and geotechnical engineers to design safe and economical earthquake resistant structures.
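
    A small sketch of fitting a subzone response-spectrum equation by polynomial regression, in the spirit of the final step described above; the spectrum points and polynomial degree are assumptions, not the study's values:

      import numpy as np

      # Hypothetical design-spectrum points for one seismic subzone: period (s) versus
      # spectral acceleration (g). In the study these come from the SHAKE2000 surface motions.
      periods = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.75, 1.0, 1.5, 2.0])
      sa = np.array([0.30, 0.55, 0.72, 0.70, 0.55, 0.42, 0.33, 0.22, 0.16])

      # Fit a polynomial response-spectrum equation Sa(T) = c4*T^4 + ... + c1*T + c0,
      # analogous in spirit to the subzone-wise regression described above.
      coeffs = np.polyfit(periods, sa, deg=4)
      spectrum = np.poly1d(coeffs)

      print("Sa(0.4 s) ~", round(float(spectrum(0.4)), 3), "g")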

  1. Hangmen and Associations: The Final Analysis.

    Science.gov (United States)

    Garner, Mark; Newsome, Bernard

    1979-01-01

    Applies Ferdinand de Saussure's linguistic theories on the construction of a text to the literary analysis of texts. Recounts the use of this derivation in a literature class, showing that sensitivity to student experiences facilitates their understanding and appreciation of literary works. (RL)

  2. Fallout radiation effects analysis methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1988-03-31

    Fallout radiation is viewed by the weapons effects community as a potentially serious impediment to maintaining or restoring critical National Security Emergency Preparedness (NSEP) telecommunication capabilities in a nuclear environment. The OMNCS' Electromagnetic Pulse Mitigation Program is designed, in part, to identify the survival probability (survivability) of the nation's NSEP telecommunications infrastructure against fallout radiation effects. The OMNCS (Office of the Manager National Communications System) is developing a balanced approach consisting of fallout radiation stress tests on the electronic piece-parts and the use of estimated performance measures of telecommunication network elements in network simulation models to predict user connectivity levels. It is concluded that, given limited available data, the proposed method can predict fallout radiation effects on network telecommunication equipment. The effects of fallout radiation are small at low dosage levels (bin 1 and bin 2). More pronounced variations in equipment performance were exhibited for radiation dosage in the 1k-5k Rads(Si) bin. Finally, the results indicate that by increasing the sample size to approximately 200 samples, the statistical quality of survivability predictions can be significantly improved.

  3. Logic-tree Approach for Probabilistic Tsunami Hazard Analysis and its Applications to the Japanese Coasts

    Science.gov (United States)

    Annaka, Tadashi; Satake, Kenji; Sakakiyama, Tsutomu; Yanagisawa, Ken; Shuto, Nobuo

    2007-03-01

    For Probabilistic Tsunami Hazard Analysis (PTHA), we propose a logic-tree approach to construct tsunami hazard curves (relationship between tsunami height and probability of exceedance) and present some examples for Japan for the purpose of quantitative assessments of tsunami risk for important coastal facilities. A hazard curve is obtained by integration over the aleatory uncertainties, and numerous hazard curves are obtained for different branches of logic-tree representing epistemic uncertainty. A PTHA consists of a tsunami source model and coastal tsunami height estimation. We developed the logic-tree models for local tsunami sources around Japan and for distant tsunami sources along the South American subduction zones. Logic-trees were made for tsunami source zones, size and frequency of tsunamigenic earthquakes, fault models, and standard error of estimated tsunami heights. Numerical simulation rather than empirical relation was used for estimating the median tsunami heights. Weights of discrete branches that represent alternative hypotheses and interpretations were determined by the questionnaire survey for tsunami and earthquake experts, whereas those representing the error of estimated value were determined on the basis of historical data. Examples of tsunami hazard curves were illustrated for the coastal sites, and uncertainty in the tsunami hazard was displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves.
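
    A toy sketch of combining logic-tree branches into a weighted hazard curve as described above; the branch curves and weights are invented, not those of the study:

      import numpy as np

      heights = np.linspace(0.5, 10.0, 20)   # tsunami height thresholds, m

      def branch_curve(scale):
          # Toy annual exceedance rates decaying with height; in the study each logic-tree
          # branch (source zones, magnitude-frequency model, fault model, height error)
          # yields its own curve from numerical simulation of the tsunami heights.
          return 0.05 * np.exp(-heights / scale)

      # (weight, curve) pairs for three hypothetical epistemic branches.
      branches = [(0.4, branch_curve(1.5)), (0.4, branch_curve(2.0)), (0.2, branch_curve(3.0))]
      weights = np.array([w for w, _ in branches])
      curves = np.vstack([c for _, c in branches])

      mean_curve = np.average(curves, axis=0, weights=weights)

      # Probability of at least one exceedance of 2 m in 50 years from the weighted-mean curve.
      rate_2m = float(np.interp(2.0, heights, mean_curve))
      print("P(>= 2 m in 50 yr) =", round(1.0 - np.exp(-rate_2m * 50.0), 3))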

  4. Final evaluation of PETC coal conversion solid and hazardous wastes. Final report, September 15, 1977-November 30, 1979. [PETC's own operations

    Energy Technology Data Exchange (ETDEWEB)

    Neufeld, R.D.; Shapiro, M.; Bern, J.

    1979-08-01

    Hazards and pollutional impacts from residuals generated at the Pittsburgh Energy Technology Center are explained in the context of hazardous waste regulations proposed by the federal government (RCRA). Nine hazard characteristics are defined and an overview of their significance to PETC is presented. Pollutional impacts on air, water and land are discussed in the energy research perspective. Legislative and statutory relationships between the Center and local, county, state and federal enforcement agencies are listed and analyzed. Expected liability resting on the Center in this framework is outlined. One hundred seven different chemical and indeterminate wastes were reported in an inventory conducted as an earlier task of this project. All of these are tabulated, classified in accordance with the latest proposed federal regulations, with recommended treatment and disposal methodologies included. The existing residuals management system is described to establish baseline conditions in preparing the recommended system. Management policies as they are presently practiced are included in the presentation. A recommended residuals management plan is offered for consideration. It includes the organizational arrangement of PETC personnel, a description of authority and responsibilities of the various human elements of the plan, an information network with detailed data sheets and installation of a mandatory manifest system, a carefully designed hazardous chemical storage area, and short as well as long term choices.

  5. Manpower analysis in transportation safety. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, C.S.; Bowden, H.M.; Colford, C.A.; DeFilipps, P.J.; Dennis, J.D.; Ehlert, A.K.; Popkin, H.A.; Schrader, G.F.; Smith, Q.N.

    1977-05-01

    The project described provides a manpower review of national, state and local needs for safety skills, and projects future manning levels for transportation safety personnel in both the public and private sectors. Survey information revealed that there are currently approximately 121,000 persons employed directly in transportation safety occupations within the air carrier, highway and traffic safety, motor carrier, pipeline, rail carrier, and marine carrier transportation industry groups. The projected need for 1980 is over 145,000 of which over 80 percent will be in highway safety. An analysis of transportation tasks is included, and shows ten general categories about which the majority of safety activities are focused. A skills analysis shows a generally high level of educational background and several years of experience are required for most transportation safety jobs. An overall review of safety programs in the transportation industry is included, together with chapters on the individual transportation modes.

  6. Final report on the Pathway Analysis Task

    Energy Technology Data Exchange (ETDEWEB)

    Whicker, F.W.; Kirchner, T.B. [Colorado State Univ., Fort Collins, CO (United States)

    1993-04-01

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979-91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere.

  7. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high intensity rainstorms can drive torrential systems past a tipping point resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is a rising interest in potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria. Precipitation is hypothesized to be the main forcing factor of torrential events. (ii) How do thresholds vary in space and time? (iii) The effect of external triggers is strongly mediated by the internal disposition of catchments to respond. Which internal conditions are critical for susceptibility? (iv) Is there a change in magnitude or frequency in the recent past and what can be expected for the future? The 71 km2 catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) is monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-) conditions can be inferred from a dense station network. Changing bedload transport rates and

  8. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    Science.gov (United States)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  9. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    Science.gov (United States)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk combines three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions describing the distribution of percentage loss for a set of intensity measure levels. Seismic hazard at Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2,475 years), according to the Taiwan Earthquake Model (TEM), which assesses two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, as they provide detailed information about the location, value, and vulnerability classification of the exposed elements. The results from this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard in an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
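
    The following sketch illustrates, with invented numbers rather than TEM/OpenQuake outputs, how a hazard curve, a vulnerability function and an exposed value can be combined into an average annual loss for a single grid cell:

```python
import numpy as np

pga = np.linspace(0.05, 1.0, 20)               # intensity measure levels (g)
annual_rate = 0.02 * np.exp(-pga / 0.15)       # annual exceedance rate per level

# Mean damage ratio as a function of PGA (a toy vulnerability function).
mean_damage_ratio = 1.0 / (1.0 + np.exp(-(pga - 0.4) / 0.08))

exposed_value = 5.0e6                          # replacement cost of the assets

# Occurrence rate in each PGA bin = -d(exceedance rate); the loss contribution
# is the bin's damage ratio times the exposed value times the occurrence rate.
occurrence = -np.diff(annual_rate)
bin_dr = 0.5 * (mean_damage_ratio[:-1] + mean_damage_ratio[1:])
average_annual_loss = np.sum(occurrence * bin_dr * exposed_value)
print(f"average annual loss ≈ {average_annual_loss:,.0f}")
```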

  10. Regulatory analysis technical evaluation handbook. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-01-01

    The purpose of this Handbook is to provide guidance to the regulatory analyst to promote preparation of quality regulatory analysis documents and to implement the policies of the Regulatory Analysis Guidelines of the US Nuclear Regulatory Commission (NUREG/BR-0058 Rev. 2). This Handbook expands upon policy concepts included in the NRC Guidelines and translates the six steps in preparing regulatory analyses into implementable methodologies for the analyst. It provides standardized methods of preparation and presentation of regulatory analyses, with the inclusion of input that will satisfy all backfit requirements and requirements of NRC's Committee to Review Generic Requirements. Information on the objectives of the safety goal evaluation process and potential data sources for preparing a safety goal evaluation is also included. Consistent application of the methods provided here will result in more directly comparable analyses, thus aiding decision-makers in evaluating and comparing various regulatory actions. The handbook is being issued in loose-leaf format to facilitate revisions. NRC intends to periodically revise the handbook as new and improved guidance, data, and methods become available.

  11. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  12. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    Science.gov (United States)

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights of measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods.
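
    A minimal sketch of the regression calibration step mentioned above, assuming a Gaussian covariate and error with known error variance; the subsequent hazard-model fit is not shown:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(2.0, 1.0, n)            # true covariate (unobserved)
u = rng.normal(0.0, 0.5, n)            # measurement error, known variance 0.25
w = x + u                              # observed, error-prone covariate

var_u = 0.25
var_x_hat = w.var(ddof=1) - var_u      # estimate Var(X) as Var(W) - Var(U)
lam = var_x_hat / (var_x_hat + var_u)  # attenuation (reliability) factor
x_calibrated = w.mean() + lam * (w - w.mean())   # E[X | W] under normality

# x_calibrated would then replace w as the covariate in the additive (or Cox)
# hazards fit, reducing the attenuation bias caused by measurement error.
print(f"reliability factor ≈ {lam:.2f}")
```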

  13. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    Science.gov (United States)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.
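
    The return periods quoted above follow from the standard Poisson relation between the exceedance probability over an exposure time and the return period; a short sketch:

```python
import math

def return_period(p_exceed: float, t_years: float) -> float:
    """Return period T such that P(at least one exceedance in t) = p_exceed."""
    return -t_years / math.log(1.0 - p_exceed)

for p in (0.10, 0.02):
    print(f"{p:.0%} in 50 years -> T ≈ {return_period(p, 50.0):.0f} years")
# 10% -> ~475 years, 2% -> ~2,475 years
```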

  14. Volcano Hazard Tracking and Disaster Risk Mitigation: A Detailed Gap Analysis from Data-Collection to User Implementation

    Science.gov (United States)

    Faied, D.; Sanchez, A.

    2009-04-01

    While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University's Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. This gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. In the effort to address the gaps, a framework evolved at system level. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. VIDA also offers substantial educational potential: the framework includes a centralized clearinghouse for volcanology data which could support education at a variety of levels. Basic geophysical data, satellite maps, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource naturally lends itself to research efforts in the subject but also research in operational tools, system architecture, and human/machine interaction in civil protection or emergency scenarios.

  15. Seismic hazard methodology for the Central and Eastern United States: Volume 1: Part 2, Methodology (Revision 1): Final report

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, R.K.; Veneziano, D.; Van Dyck, J.; Toro, G.; O' Hara, T.; Drake, L.; Patwardhan, A.; Kulkarni, R.; Keeney, R.; Winkler, R.

    1988-11-01

    Aided by its consultant, the US Geological Survey (USGS), the Nuclear Regulatory Commission (NRC) reviewed "Seismic Hazard Methodology for the Central and Eastern United States." This topical report was submitted jointly by the Seismicity Owners Group (SOG) and the Electric Power Research Institute (EPRI) in July 1986 and was revised in February 1987. The NRC staff concludes that the SOG/EPRI Seismic Hazard Methodology, as documented in the topical report and associated submittals, is an acceptable methodology for use in calculating seismic hazard in the Central and Eastern United States (CEUS). These calculations will be based upon the data and information documented in the material that was submitted as the SOG/EPRI topical report and ancillary submittals. However, as part of the review process the staff conditions its approval by noting areas in which problems may arise unless precautions detailed in the report are observed. 23 refs.

  16. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    Excerpts recoverable from the report include HAZOP guide words for software or system interface analysis (Table 3), an example system-of-systems architecture table (Table 4), and the HAZOP planning step (establish the analysis goals, definitions, worksheets, schedule, and process, then divide the system for review). One guide-word example, "subtle incorrect," denotes an output whose value is wrong but cannot be detected.
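
    A small illustration of the HAZOP guide-word technique referenced in these fragments, crossing hypothetical interface items with guide words to produce a deviation worksheet:

```python
from itertools import product

interface_items = ["command message", "sensor value", "status flag"]
guide_words = ["no", "more", "less", "late", "early", "subtle incorrect"]

# Each (item, guide word) pair becomes a candidate deviation for analysts to
# assess; cause, consequence and safeguard are filled in during the study.
worksheet = [
    {"item": item, "guide_word": gw, "deviation": f"{item}: {gw}",
     "cause": "", "consequence": "", "safeguard": ""}
    for item, gw in product(interface_items, guide_words)
]
for row in worksheet[:4]:
    print(row["deviation"])
```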

  17. A New Methodology for Decreasing Uncertainties in the Seismic Hazard Assessment Results by Using Sensitivity Analysis. An Application to Sites in Eastern Spain

    Science.gov (United States)

    Giner, J. J.; Molina, S.; Jáuregui, P.; Delgado, J.

    In this study a sensitivity analysis has been carried out by means of the seismic hazard results obtained using the non-zoning methodology (Epstein and Lomnitz, 1966) and the extreme value distribution functions proposed by Gumbel (1958), via a logic-tree procedure. The aim of the sensitivity analysis is to identify the input parameters that have the largest impact on the assessed hazard and its uncertainty. The findings from the study of these parameters can serve as a useful guide for further research on seismic hazard evaluation, because they allow us to identify parameters that have little or no effect on the seismic hazard results as well as parameters that have great effects on them. Using the obtained results, we have proposed criteria for assigning probabilities to the different logic-tree branches in a more objective way. It should be noted that, although the sensitivity of the logic-tree branches depends on the site, it does not always do so in the same way. Finally, re-evaluation of seismic hazard using the proposed methodology applied to eastern Spain leads to a reduction of uncertainty from 52% to 27% of the expected acceleration with 10% probability of exceedance, at the site with the highest value of seismic hazard (Site 1: Torrevieja).
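
    A minimal sketch of a one-at-a-time logic-tree sensitivity check of the kind described above; the node names, branch values and hazard_model() placeholder are invented, not the study's inputs:

```python
nodes = {  # hypothetical logic-tree nodes and alternative branches
    "b_value": [0.8, 0.9, 1.0],
    "max_magnitude": [6.5, 7.0],
    "attenuation": ["model_A", "model_B"],
}

def hazard_model(b_value, max_magnitude, attenuation):
    """Placeholder for the PGA with 10% in 50 yr; replace with a real PSHA run."""
    base = 0.10 * b_value + 0.02 * (max_magnitude - 6.5)
    return base * (1.1 if attenuation == "model_B" else 1.0)

# For each node, vary its branch while holding the others at a reference
# choice; the spread in results is a crude sensitivity measure for that node.
sensitivity = {}
for node, options in nodes.items():
    results = []
    for choice in options:
        kwargs = {k: v[0] for k, v in nodes.items()}
        kwargs[node] = choice
        results.append(hazard_model(**kwargs))
    sensitivity[node] = max(results) - min(results)

for node, spread in sorted(sensitivity.items(), key=lambda kv: -kv[1]):
    print(f"{node}: spread in PGA = {spread:.3f} g")
```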

  18. Risk analysis for roadways subjected to multiple landslide-related hazards

    Science.gov (United States)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both the risk analysis and assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs regarding the repair of the roadway, the damage of vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess of distance travelled, the time differences, and tolls. The type of slope instabilities that may affect a roadway may vary and its effects as well. Most current approaches either consider a single hazardous phenomenon each time, or if applied at small (for example national) scale, they do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types that include rockfalls, debris flows and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flow a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls we use an indicator-based model that provides a hazard index. The model parameters consist in the design safety factor, and further anchorage design and construction parameters. The probability of failure is evaluated in function of the hazard index and next corrected (in terms of order of magnitude) according to in situ observations for increase of two
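
    A short sketch of the common-term idea described above: expressing each hazard as an annual probability, combining them under an independence assumption, and summing expected consequences. The probabilities and costs are illustrative only:

```python
# Multiple landslide-related hazards on one roadway section.
hazards = {
    "rockfall":            {"annual_p": 0.05,  "consequence_cost": 80_000},
    "debris_flow":         {"annual_p": 0.01,  "consequence_cost": 250_000},
    "retaining_wall_fail": {"annual_p": 0.002, "consequence_cost": 400_000},
}

p_none = 1.0
expected_annual_cost = 0.0
for h in hazards.values():
    p_none *= (1.0 - h["annual_p"])                       # no event of this type
    expected_annual_cost += h["annual_p"] * h["consequence_cost"]

print(f"P(at least one event per year) = {1.0 - p_none:.4f}")
print(f"expected annual direct cost    = {expected_annual_cost:,.0f}")
```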

  19. Assessing sensitivity of Probabilistic Seismic Hazard Analysis (PSHA) to fault parameters: Sumatra case study

    Science.gov (United States)

    Omang, A.; Cummins, P. R.; Horspool, N.; Hidayati, S.

    2012-12-01

    Slip rate data and fault geometry are two important inputs in determining seismic hazard, because they are used to estimate earthquake recurrence intervals which strongly influence the hazard level in an area. However, the uncertainty of slip-rates and geometry of the fault are rarely considered in any probabilistic seismic hazard analysis (PSHA), which is surprising given the estimates of slip-rates can vary significantly from different data sources (e.g. geological vs. Geodetic). We use the PSHA method to assess the sensitivity of seismic hazard to fault slip-rates along the Great Sumatran Fault in Sumatra, Indonesia. We will consider the epistemic uncertainty of fault slip rate by employing logic trees to include alternative slip rate models. The weighting of the logic tree is determined by the probability density function of the slip rate estimates using the approach of Zechar and Frankel (2009). We consider how the PSHA result accounting for slip rate uncertainty differs from that for a specific slip rate by examining hazard values as a function of return period and distance from the fault. We also consider the geometry of the fault, especially the top and the bottom of the rupture area within a fault, to study the effect from different depths. Based on the results of this study, in some cases the uncertainty in fault slip-rates, fault geometry and maximum magnitude have a significant effect on hazard level and area impacted by earthquakes and should be considered in PSHA studies.
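
    To illustrate why slip rate strongly influences recurrence (and hence hazard), the following sketch balances the seismic moment of a characteristic earthquake against the fault's moment accumulation rate; the geometry, rigidity and magnitude are illustrative, not Sumatran values:

```python
def recurrence_interval(mw, fault_length_km, fault_width_km, slip_rate_mm_yr,
                        rigidity_pa=3.0e10):
    """Mean recurrence (years) = seismic moment / moment accumulation rate."""
    m0 = 10 ** (1.5 * mw + 9.05)                       # seismic moment, N*m
    area_m2 = fault_length_km * 1e3 * fault_width_km * 1e3
    moment_rate = rigidity_pa * area_m2 * slip_rate_mm_yr * 1e-3  # N*m per year
    return m0 / moment_rate

for slip in (5.0, 15.0, 25.0):                          # slip rate in mm/yr
    t = recurrence_interval(mw=7.2, fault_length_km=100, fault_width_km=15,
                            slip_rate_mm_yr=slip)
    print(f"slip rate {slip:>4.1f} mm/yr -> recurrence ≈ {t:,.0f} years")
```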

  20. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    Science.gov (United States)

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-08

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes-closed and eyes-open conditions. We found that participants in the HAC group presented higher beta AP at the centroparietal region, as well as lower beta MF at frontal and centroparietal regions in the eyes-closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanisms behind alcohol dependence and hazardous alcohol consumption. Similarities could be explained considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition.
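
    A sketch of the kind of quantitative EEG band measures used above (absolute power, relative power, mean frequency), computed from a Welch spectrum of a synthetic signal rather than recorded EEG:

```python
import numpy as np
from scipy.signal import welch

fs = 256.0                                   # sampling rate, Hz
t = np.arange(0, 30.0, 1.0 / fs)
eeg = (20e-6 * np.sin(2 * np.pi * 10 * t)    # synthetic 10 Hz component
       + 5e-6 * np.random.randn(t.size))     # broadband noise

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))

def band_measures(freqs, psd, lo, hi):
    band = (freqs >= lo) & (freqs < hi)
    ap = np.trapz(psd[band], freqs[band])                     # absolute power
    rp = ap / np.trapz(psd, freqs)                            # relative power
    mf = np.trapz(freqs[band] * psd[band], freqs[band]) / ap  # mean frequency
    return ap, rp, mf

for name, (lo, hi) in {"theta": (4, 8), "beta": (13, 30)}.items():
    ap, rp, mf = band_measures(freqs, psd, lo, hi)
    print(f"{name}: AP={ap:.3e} V^2, RP={rp:.2f}, MF={mf:.1f} Hz")
```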

  1. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  2. Use of quantitative hazard analysis to evaluate risk associated with US Department of Energy Nuclear Explosive Operations

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, S.R.; O`Brien, D.A.; Martinez, J.; LeDoux, M.

    1996-03-01

    Quantitative hazard assessments (QHAs) are being used to support the US Department of Energy (DOE) Integrated Safety Process (SS-21), Nuclear Explosive Safety Studies (NESS), and Environmental Safety and Health (ES&H) initiatives. The QHAs are used to identify hazards associated with DOE nuclear explosive operations. In 1994, Los Alamos National Laboratory, Sandia National Laboratory, and the Pantex Plant participated in a joint effort to demonstrate the utility of performing hazard assessments (HAs) concurrently with process design and development efforts. Early identification of high risk operations allow for process modifications before final process design is completed. This demonstration effort, which used an integrated design process (SS-21), resulted in the redesign of the dismantlement process for the B61 center case. The SS-21 program integrates environment, safety, and health (ES&H) and nuclear explosive safety requirements. QHAs are used to identify accidents that have the potential for worker injury or public health or environmental impact. The HA is to evaluate the likelihood of accident sequences that have the potential for worker or public injury or environmental damage; identify safety critical tooling and procedural steps; identify operational safety controls; identify safety-class/significant systems, structures and components; identify dominant accident sequences; demonstrate that the facility Safety Analysis Report (SAR) design-basis accident envelops process-specific accidents; and support future change control activities.

  3. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    Science.gov (United States)

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    The concept of Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes, identification of all hazards that are likely to occur in the production establishment, the identification of critical points in the process at which these hazards may be introduced into product and therefore should be controlled, the establishment of critical limits for control at those points, the verification of these prescribed steps, and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.
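
    A minimal sketch of how the HACCP elements described above (a CCP, critical limits, monitoring, corrective action) can be represented; the CCP and its limits are generic examples, not taken from the review:

```python
from dataclasses import dataclass

@dataclass
class CriticalControlPoint:
    name: str
    parameter: str
    low_limit: float
    high_limit: float

    def within_limits(self, measured: float) -> bool:
        return self.low_limit <= measured <= self.high_limit

# Illustrative CCP: a cooking step with a minimum internal temperature.
cooking_step = CriticalControlPoint(
    name="Cooking", parameter="internal temperature (°C)",
    low_limit=71.0, high_limit=100.0)

monitoring_log = [72.5, 74.0, 69.8, 73.1]     # measured values during monitoring
deviations = [m for m in monitoring_log if not cooking_step.within_limits(m)]
if deviations:
    print(f"corrective action required at {cooking_step.name}: {deviations}")
```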

  4. Application of the Hazard Analysis Critical Control Point (HACCP) System in the Production Process of Tempe Chips

    Directory of Open Access Journals (Sweden)

    Rahmi Yuniarti

    2015-06-01

    Malang is one of the industrial centers of tempe chip production. To maintain quality and food safety, an analysis is required to identify the hazards during the production process. This study was conducted to identify the hazards during the production process of tempe chips and to provide recommendations for developing a HACCP system. The phases of the tempe chip production process are slicing the tempe, moving it to the kitchen, coating it with flour dough, frying it in the pan, draining it, packaging it, and then storing it. There are three types of potential hazard during the production process: biological, physical, and chemical. CCP identification shows that three process steps have critical control points: slicing the tempe, immersing the tempe in the flour mixture, and draining. Recommendations for the development of the HACCP system include recommendations related to employee hygiene, supporting equipment, 5-S analysis, and the production layout.

  5. Liquefaction hazard analysis for infrastructure development in gulf of Jakarta

    Science.gov (United States)

    Dinata, Indra A.; Darlan, Yudi; Sadisun, Imam A.; Pindratno, Haris; Saryanto, Agus

    2016-05-01

    The Gulf of Jakarta is an area of active sedimentation, and there is a wide sediment deposition area on the north coast of Jakarta. Generally, these sediments have not been consolidated, so their condition is an important factor in determining liquefaction in this area. Liquefaction may occur because of earthquakes that cause loss of strength and stiffness in soils. The analysis of liquefaction potential is based on SPT data taken at the Gulf of Jakarta and includes the susceptibility rating and the triggering factors. Liquefaction analysis methods were compared with each other to obtain the factor of safety against liquefaction according to the characteristics of the soil. Liquefaction analysis at the surface uses the susceptibility rating factor (SRF). The SRF method is controlled by four factors: history, geology, composition, and groundwater. Each factor has parameters that determine the value of SRF. From the analysis, the Gulf of Jakarta has a susceptibility rating to liquefaction with SRF values of 12-35. These values show that the Gulf of Jakarta is dominated by areas with a susceptibility rating from medium to high. High susceptibility ratings for liquefaction are concentrated in the coastal area.
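
    A sketch of the factor-of-safety bookkeeping that SPT-based liquefaction methods share, using the Seed-Idriss simplified cyclic stress ratio; CRR is taken here as a given input from an SPT correlation, and all values are illustrative rather than Jakarta data:

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Seed & Idriss (1971) simplified CSR with a common rd approximation."""
    if depth_m <= 9.15:
        rd = 1.0 - 0.00765 * depth_m
    else:
        rd = 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

sigma_v, sigma_v_eff = 95.0, 55.0        # total / effective stress (kPa) at 5 m
csr = cyclic_stress_ratio(a_max_g=0.3, sigma_v=sigma_v,
                          sigma_v_eff=sigma_v_eff, depth_m=5.0)
crr = 0.18                               # from an SPT (N1)60 correlation chart
fs = crr / csr                           # factor of safety against liquefaction
print(f"CSR={csr:.2f}, CRR={crr:.2f}, FS={fs:.2f} ->",
      "liquefaction likely" if fs < 1.0 else "liquefaction unlikely")
```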

  6. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatory uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective: aleatory uncertainty, being a property of the system under study, cannot be reduced, although practical actions can be taken to circumvent the potentially dangerous effects of such variability; epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (lab or in situ survey), improving the measurement methods, evaluating calculation procedures with model tests, or confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely possibility distributions (e

  7. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    In recent decades, critical systems have increasingly been developed using computers and software, even in the space domain, where the project approach is usually very conservative. In projects for rockets, satellites and their facilities, such as ground support systems and simulators, among other critical operations for a space mission, a hazard analysis must be applied. The ELICERE process was created to perform hazard analysis mainly on computer-based critical systems, in order to define or evaluate their safety and dependability requirements; it is strongly based on the Hazard and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or to understand the potential hazards of existing systems, improving their functions related to functional or non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. The process was initially designed to be applied manually and gradually. A software tool called PRO-ELICERE has since been developed to facilitate the analysis process and to store the results for reuse in other system analyses. To illustrate how ELICERE and its tool work, a small space case study was carried out, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  8. Damage functions for climate-related hazards: unification and uncertainty analysis

    Science.gov (United States)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.

  9. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    Science.gov (United States)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  10. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    Science.gov (United States)

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values and the greatest impact on the process are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed with a labeling system, with forms designed to comply with the standards in force, and by starting implementation of a cryopreservation management module.
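
    A small sketch of the Risk Priority Number scoring described above (severity times occurrence times detectability); the failure modes and scores are invented examples, not the center's data:

```python
failure_modes = [
    {"step": "labelling",        "mode": "loss of tracking",            "S": 9, "O": 3, "D": 6},
    {"step": "volume reduction", "mode": "loss of cell dose",           "S": 8, "O": 4, "D": 5},
    {"step": "data entry",       "mode": "manual transcription error",  "S": 7, "O": 5, "D": 4},
]

# RPN = severity x occurrence x detectability, used to rank the hazards.
for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

for fm in sorted(failure_modes, key=lambda f: -f["RPN"]):
    print(f'{fm["step"]:>16}: {fm["mode"]:<28} RPN={fm["RPN"]}')
```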

  11. A comparative analysis of hazard models for predicting debris flows in Madison County, VA

    Science.gov (United States)

    Morrissey, Meghan M.; Wieczorek, Gerald F.; Morgan, Benjamin A.

    2001-01-01

    During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a sixteen-hour period, initiating floods and over 600 debris flows in a small area (130 km2) of Madison County, Virginia. Field studies showed that the majority (70%) of these debris flows initiated with a thickness of 0.5 to 3.0 m in colluvium on slopes from 17° to 41° (Wieczorek et al., 2000). This paper evaluated and compared the approaches of SINMAP, LISA, and Iverson's (2000) transient response model for slope stability analysis by applying each model to the landslide data from Madison County. Of these three stability models, only Iverson's transient response model evaluated stability conditions as a function of time and depth. Iverson's model would be the preferred method of the three to evaluate landslide hazards on a regional scale in areas prone to rain-induced landslides, as it considers both the transient and spatial response of pore pressure in its calculation of slope stability. The stability calculation used in SINMAP and LISA is similar and utilizes probability distribution functions for certain parameters. Unlike SINMAP, which only considers soil cohesion, internal friction angle and rainfall-rate distributions, LISA allows the use of distributed data for all parameters, so it is preferred over SINMAP for evaluating slope stability. Results from all three models suggested similar soil and hydrologic properties for triggering the landslides that occurred during the 1995 storm in Madison County, Virginia. The colluvium probably had cohesion of less than 2 kPa. The root-soil system is above the failure plane, and consequently root strength and tree surcharge had negligible effect on slope stability. The result that the final location of the water table was near the ground surface is supported by the water budget analysis of the rainstorm conducted by Smith et al. (1996).
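
    For reference, the infinite-slope limit-equilibrium calculation underlying SINMAP-, LISA- and Iverson-type analyses can be sketched as follows, with illustrative parameters chosen in the range reported for the colluvium:

```python
import math

def infinite_slope_fs(c_kpa, phi_deg, slope_deg, depth_m,
                      gamma_soil=19.0, gamma_w=9.81, m=1.0):
    """FS = resisting / driving stress; m = saturated fraction of the depth."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    sigma = gamma_soil * depth_m * math.cos(beta) ** 2      # normal stress (kPa)
    u = gamma_w * m * depth_m * math.cos(beta) ** 2         # pore pressure (kPa)
    tau = gamma_soil * depth_m * math.sin(beta) * math.cos(beta)
    return (c_kpa + (sigma - u) * math.tan(phi)) / tau

for m in (0.0, 0.5, 1.0):   # dry, half-saturated, water table at the surface
    fs = infinite_slope_fs(c_kpa=2.0, phi_deg=33.0, slope_deg=30.0,
                           depth_m=1.5, m=m)
    print(f"m={m:.1f} -> FS={fs:.2f}")
```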

  12. Example process hazard analysis of a Department of Energy water chlorination process

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Title 29 of the Code of Federal Regulations (CFR) Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18-21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  13. Analysis of the Correlation between GDP and the Final Consumption

    Directory of Open Access Journals (Sweden)

    Constantin ANGHELACHE

    2011-09-01

    This paper presents the results of the author's research regarding the evolution of the Gross Domestic Product. One of the main aspects of GDP analysis is its correlation with final consumption, an important macroeconomic indicator. The evolution of the Gross Domestic Product is highly influenced by the evolution of final consumption. To analyze the correlation, the paper proposes the use of the linear regression model, as one of the most appropriate instruments for such a scientific approach. The regression model described in the article uses GDP as the resultant (dependent) variable and final consumption as the factorial (explanatory) variable.
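
    A minimal sketch of the proposed regression, with GDP as the resultant variable and final consumption as the factorial variable; the series below are placeholders, not the data analysed in the paper:

```python
import numpy as np

final_consumption = np.array([310.0, 325.0, 342.0, 361.0, 380.0, 398.0])
gdp               = np.array([520.0, 545.0, 570.0, 602.0, 640.0, 668.0])

# Ordinary least squares fit GDP = intercept + slope * final_consumption.
slope, intercept = np.polyfit(final_consumption, gdp, deg=1)
predicted = intercept + slope * final_consumption
r_squared = 1.0 - np.sum((gdp - predicted) ** 2) / np.sum((gdp - gdp.mean()) ** 2)

print(f"GDP ≈ {intercept:.1f} + {slope:.2f} * final_consumption, R² = {r_squared:.3f}")
```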

  14. Reduction of uncertainties in probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-02-01

    An integrated research program for the reduction of conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas, as follows. Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the Region of Korea for Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of medium-size earthquakes in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  15. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    Science.gov (United States)

    Sung, Chih-Hsuan; Gao, Jia-Cian; Lee, Chyi-Tyi

    2015-04-01

    In modern probabilistic seismic hazard analysis (PSHA), the standard deviation (sigma) of the total variability is considered in the integration for the seismic exceedance rate, and this leads to increased seismic hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and has nothing to do with either the temporal or the spatial variation of ground motions. It cannot be treated in the integration; instead, epistemic variability may be included in the logic trees. This study uses Taiwan data as an example to test a case in Taipei. Results reveal that if only the aleatory variability is considered in the integration, the hazard level could be reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is considered in the logic trees while the aleatory variability is considered in the integration, then the hazard level is similar to that obtained using the total variability; it is only slightly smaller at long return periods. Much effort in reducing the hazard level to a reasonable value still remains to be studied.
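
    The effect described above can be illustrated with a toy hazard integration: including the ground-motion sigma in the exceedance probability raises the computed rate relative to using the median prediction alone. The scenario rates and median PGAs are invented:

```python
import math

def exceedance_rate(target_pga_g, scenarios, sigma_ln=None):
    """Annual rate of exceeding target PGA; sigma_ln=None -> median-only model."""
    rate = 0.0
    for annual_rate, median_pga in scenarios:
        if sigma_ln is None:
            p = 1.0 if median_pga > target_pga_g else 0.0
        else:
            z = (math.log(target_pga_g) - math.log(median_pga)) / sigma_ln
            p = 0.5 * math.erfc(z / math.sqrt(2.0))   # P(lognormal PGA > target)
        rate += annual_rate * p
    return rate

scenarios = [(0.01, 0.12), (0.002, 0.25), (0.0005, 0.40)]  # (rate, median PGA g)
target = 0.3
print("median only :", exceedance_rate(target, scenarios))
print("with sigma  :", exceedance_rate(target, scenarios, sigma_ln=0.6))
```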

  16. Spatial temporal analysis of urban heat hazard in Tangerang City

    Science.gov (United States)

    Wibowo, Adi; Kuswantoro; Ardiansyah; Rustanto, Andry; Putut Ash Shidiq, Iqbal

    2016-11-01

    Urban heat is a natural phenomenon which might be caused by human activities. The human activities were represented by various types of land use, such as urban and non-urban areas. The aim of this study is to identify urban heat behavior in Tangerang City, as it might threaten the urban environment. This study used three types of remote sensing data, namely Landsat TM, Landsat ETM+, and Landsat OLI-TIRS, to capture urban heat behavior and to analyze the urban heat signature of Tangerang City in 2001, 2012, 2013, 2014, 2015, and 2016. The results showed that the urban heat signature changes dynamically each month, depending on solar radiation. The urban heat island covered only a small part of Tangerang City in 2001, but it increased significantly and reached 50% of the area in 2012. Based on the urban heat signature, the threshold for a threatening condition is 30 °C, as recognized from the land surface temperature (LST). The effective temperature (ET) index describes this condition as warm and uncomfortable, with increased stress due to sweating and blood flow, and possibly causing cardiovascular disorders.

  17. The research of mine rock burst hazard identification based on fault tree analysis

    Institute of Scientific and Technical Information of China (English)

    LI Wen; JI Hong-guang; CHENG Jiu-long; CAI Si-jing

    2007-01-01

    In order to identify the rock burst hazard in coal mines and thus give a credible forecast, we first analyzed effect factors such as natural geological factors and mining technological conditions, based on an investigation of more than one hundred mine rock burst cases. Second, fault tree analysis (FTA) was adopted for mine rock burst hazard identification for the first time, and twelve kinds of basic events were confirmed: large mining depth, burst-prone coal seams, solid roof and floor strata, proximity to faults with large throw, folds, changes in seam thickness, other regional tectonic deformation or stress strips, drilling, blasting and extraction operations, unscientific extraction methods, illogical extraction sequence, residual pillars, and a working face too close to residual areas or stopping lines. A fault tree of mine rock burst was then worked out. Finally, qualitative and quantitative analyses were made, and the rock burst hazard was forecast according to the characteristics of the geological structure and exploitation technology conditions in a certain mine in Shandong Province, China; the rock burst accidents that happened during subsequent exploitation validated that adopting FTA to identify the mine rock burst hazard is feasible and accurate.
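
    A sketch of how such a fault tree can be quantified once basic-event probabilities are assigned: OR and AND gates combined under an independence assumption. The gate structure and probabilities here are illustrative, not the paper's tree:

```python
def or_gate(*probs):   # P(at least one input event occurs)
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(*probs):  # P(all input events occur)
    result = 1.0
    for p in probs:
        result *= p
    return result

geology = or_gate(0.30,   # large mining depth
                  0.20,   # burst-prone coal seams
                  0.15)   # stiff roof and floor strata
mining  = or_gate(0.10,   # illogical extraction sequence
                  0.05,   # residual pillars
                  0.08)   # working face too close to mined-out areas

top_event = and_gate(geology, mining)   # assume both groups must be present
print(f"P(rock burst hazard) ≈ {top_event:.3f}")
```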

  18. In silico analysis of nanomaterials hazard and risk.

    Science.gov (United States)

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    false positive relative to false negative predictions and the reliability of toxicity data. To establish the environmental impact of ENMs identified as toxic, researchers will need to estimate the potential level of environmental exposure concentration of ENMs in the various media such as air, water, soil, and vegetation. When environmental monitoring data are not available, models of ENMs fate and transport (at various levels of complexity) serve as alternative approaches for estimating exposure concentrations. Risk management decisions regarding the manufacturing, use, and environmental regulations of ENMs would clearly benefit from both the assessment of potential ENMs exposure concentrations and suitable toxicity metrics. The decision process should consider the totality of available information: quantitative and qualitative data and the analysis of nanomaterials toxicity, and fate and transport behavior in the environment. Effective decision-making to address the potential impacts of nanomaterials will require considerations of the relevant environmental, ecological, technological, economic, and sociopolitical factors affecting the complete lifecycle of nanomaterials, while accounting for data and modeling uncertainties. Accordingly, researchers will need to establish standardized data management and analysis tools through nanoinformatics as a basis for the development of rational decision tools.

  19. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    Science.gov (United States)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are among the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property as well as severe damage to natural resources. The local geology, with steep slopes coupled with high-intensity rainfall and unplanned human activities in the study area, causes many landslides in this region. The study area attracts tourists throughout the year, so it must be considered for preventive measures. Geospatial multicriteria decision analysis (MCDA) techniques are increasingly used for landslide vulnerability and hazard zonation mapping. They enable the integration of different data layers with different levels of uncertainty. In this study, the analytic hierarchy process (AHP) method was used to prepare landslide hazard zones for Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys, and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road, and NDVI. These factor layers were extracted from the various related spatial data sets. The factors were evaluated, and individual factor weights and class weights were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique, based on the weights and ratings given by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zones I to IV. About 3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II, and 4.61% in zone I.
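
    A small sketch of the AHP weighting step referred to above: a priority vector derived from a pairwise comparison matrix via the principal eigenvector, with Saaty's consistency check. The 3x3 matrix is illustrative, not the study's ten-factor matrix:

```python
import numpy as np

# Pairwise comparisons (Saaty scale) for three hypothetical factors:
# slope angle, rainfall, land use.
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # normalized priority vector

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)               # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
cr = ci / ri                                  # consistency ratio (< 0.1 is good)
print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))
```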

  20. 78 FR 48636 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-08-09

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food... period. These two proposals are related to the proposed rule ``Current Good Manufacturing Practice...

  1. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Science.gov (United States)

    2010-02-24

    ... Service 7 CFR Parts 210 and 220 RIN 0584-AD65 School Food Safety Program Based on Hazard Analysis and... rule entitled School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Program (SBP) to develop a school food safety program for the preparation and service of school meals...

  2. Investigation of lithium-thionyl chloride battery safety hazards. Final report 28 Sep 81-31 Dec 82

    Energy Technology Data Exchange (ETDEWEB)

    Attia, A.I.; Gabriel, K.A.; Burns, R.P.

    1983-01-01

    In the ten years since the feasibility of a lithium-thionyl chloride cell was first recognized (1), remarkable progress has been made in hardware development. Cells as large as 16,000 Ah (2) and batteries of 10.8 MWh (3) have been demonstrated. In a low-rate configuration, energy densities of 500 to 600 Wh/kg are easily achieved. Even in the absence of reported explosions, safety would be a concern for such a dense energetic package; the energy density of a lithium-thionyl chloride cell is approaching that of dynamite (924 Wh/kg). In fact, explosions have occurred. In general, the hazards associated with lithium-thionyl chloride batteries may be divided into four categories: explosions as a result of an error in battery design (very large cells were in prototype development prior to a full appreciation of the hazards of the system, and it is possible that some of the remaining safety issues are related to cell design); explosions as a result of external physical abuse such as cell incineration and puncture; explosions due to short circuiting, which could lead to thermal runaway reactions (these problems appear to have been solved by changes in the battery design (4)); and explosions due to abnormal electrical operation, i.e., charging (5) and overdischarging (6), and in partially or fully discharged cells on storage (7 and 8).

  3. Sandia irradiator for dried sewage solids. Final safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Morris, M.

    1980-07-01

    Analyses of the hazards associated with the operation of the Sandia irradiator for dried sewage solids, as well as methods and design considerations to minimize these hazards, are presented in accordance with DOE directives.

  4. Current issues and related activities in seismic hazard analysis in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong-Moon [Korea Atomic Energy Research Inst., Taejon (Korea, Republic of); Lee, Jong-Rim; Chang, Chun-Joong

    1997-03-01

    This paper discusses some technical issues identified from the seismic hazard analyses for probabilistic safety assessment of the operating Korean nuclear power plants and the related activities to resolve the issues. Since there are no strong instrumental earthquake records in Korea, the seismic hazard analysis is mainly dependent on the historical earthquake records. Results of the past seismic hazard analyses show that there are many uncertainties in attenuation function and intensity level and that there is a need to improve the statistical method. The identification of the activity of the Yangsan Fault, which is close to nuclear power plant sites, has been an important issue. But the issue has not been resolved yet in spite of much research work done. Recently, some capable faults were found in the offshore area of Gulupdo Island in the Yellow Sea. It is anticipated that the results of research on both the Yangsan Fault and reduction of uncertainty in seismic hazard analysis will have a significant influence on seismic design and safety assessment of nuclear power plants in the future. (author)

  5. Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)

    Science.gov (United States)

    Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.

    2016-10-01

    Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean and overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions, the return period of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.

  6. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  7. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    Science.gov (United States)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly-generated EO-based data, and quantitatively characterises their vulnerabilities. RASOR also adapts the newly-developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented in order to illustrate the platform's potential.

  8. A Hazard Analysis-based Approach to Improve the Landing Safety of a BWB Remotely Piloted Vehicle

    Institute of Scientific and Technical Information of China (English)

    LU Yi; ZHANG Shuguang; LI Xueqing

    2012-01-01

    The BUAA-BWB remotely piloted vehicle (RPV) designed by our research team encountered an unexpected landing safety problem in flight tests. It has clearly affected the further research project on blended-wing-body (BWB) aircraft configuration characteristics, so finding a safety improvement is an urgent requirement in the development of the RPV. In view of the vehicle characteristics, a new systemic method called system-theoretic process analysis (STPA) has been tentatively applied to the hazardous factor analysis of the RPV flight tests. An uncontrolled system behavior, the “path sagging phenomenon”, is identified by implementing a three-degrees-of-freedom simulation based on wind tunnel test data and establishing a landing safety system dynamics archetype. To obtain higher safety design effectiveness, and considering safety design precedence, a longitudinal “belly-flap” control surface is innovatively introduced and designed to eliminate hazards in landing. Finally, flight tests show that the unsafe factor has been correctly identified and the landing safety has been efficiently improved.

  9. Research on the spatial analysis method of seismic hazard for island

    Science.gov (United States)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: at the micro scale its results provide parameters for seismic design, and at the macro scale it is prerequisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, during the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then proposed to support further earthquake disaster prevention planning work, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS’s Model Builder platform.
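    The fuzzy comprehensive evaluation step at the core of SAMSHI can be illustrated with a minimal sketch. The index names, weight vector, membership matrix and hazard grades below are invented placeholders; the model's actual 11 indices and their weights are not reproduced here.

```python
import numpy as np

# Hypothetical weights for a few hazard indices (the SAMSHI model uses 11
# indices derived from faults, historical earthquakes, geology and Bouguer
# gravity anomalies); values here are placeholders.
weights = np.array([0.30, 0.25, 0.25, 0.20])

# Membership matrix R: degree to which each index supports each hazard
# grade (low, moderate, high) at one grid cell of the island.
R = np.array([
    [0.1, 0.5, 0.4],   # fault density
    [0.2, 0.6, 0.2],   # historical seismicity
    [0.3, 0.5, 0.2],   # site geology
    [0.2, 0.4, 0.4],   # gravity-anomaly gradient
])

# Weighted-average fuzzy operator: B = w . R, then normalise.
B = weights @ R
B /= B.sum()

grades = ["low", "moderate", "high"]
print(dict(zip(grades, B.round(3))))
print("assigned grade:", grades[int(np.argmax(B))])  # maximum-membership rule
```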

  10. Chlorine hazard evaluation for the zinc-chlorine electric vehicle battery. Final technical report. [50 kWh

    Energy Technology Data Exchange (ETDEWEB)

    Zalosh, R. G.; Bajpai, S. N.; Short, T. P.; Tsui, R. K.

    1980-04-01

    Hazards associated with conceivable accidental chlorine releases from zinc-chlorine electric vehicle batteries are evaluated. Since commercial batteries are not yet available, this hazard assessment is based on both theoretical chlorine dispersion models and small-scale and large-scale spill tests with chlorine hydrate (which is the form of chlorine storage in the charged battery). Six spill tests involving the chlorine hydrate equivalent of a 50-kWh battery indicate that the danger zone in which chlorine vapor concentrations intermittently exceed 100 ppm extends at least 23 m directly downwind of a spill onto a warm (30 to 38°C) road surface. Other accidental chlorine release scenarios may also cause some distress, but are not expected to produce the type of life-threatening chlorine exposures that can result from large hydrate spills. Chlorine concentration data from the hydrate spill tests compare favorably with calculations based on a quasi-steady area source dispersion model and empirical estimates of the hydrate decomposition rate. The theoretical dispersion model was combined with assumed hydrate spill probabilities and current motor vehicle accident statistics in order to project expected chlorine-induced fatality rates. These calculations indicate that expected chlorine fatality rates are several times higher in a city such as Los Angeles, with a warm and calm climate, than in a colder and windier city such as Boston. Calculated chlorine-induced fatality rate projections for various climates are presented as a function of hydrate spill probability in order to illustrate the degree of vehicle/battery crashworthiness required to maintain chlorine-induced fatality rates below current vehicle fatality rates due to fires and asphyxiations. 37 figures, 19 tables.
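    The report's quasi-steady area-source dispersion model is not reproduced here. As a rough, hypothetical illustration of the kind of downwind screening calculation involved, the sketch below uses a textbook Gaussian plume for a continuous ground-level point source with Briggs rural class-D dispersion coefficients; the release rate, wind speed and conversion temperature are arbitrary placeholders, not values from the study.

```python
import numpy as np

def briggs_sigma_D(x):
    """Briggs open-country dispersion coefficients, stability class D (metres)."""
    sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    return sigma_y, sigma_z

def centerline_ppm(q_g_per_s, u_m_per_s, x_m, mol_weight=70.9):
    """Ground-level centreline concentration for a ground-level point source."""
    sy, sz = briggs_sigma_D(x_m)
    c = q_g_per_s / (np.pi * u_m_per_s * sy * sz)      # g/m^3
    # convert g/m^3 to ppm at ~25 C, 1 atm (24.45 L/mol); Cl2 MW = 70.9 g/mol
    return c * 24450.0 / mol_weight

# Placeholder source term: chlorine evolving from a decomposing hydrate spill.
x = np.arange(5.0, 200.0, 5.0)
ppm = centerline_ppm(q_g_per_s=5.0, u_m_per_s=2.0, x_m=x)
danger = x[ppm >= 100.0]
print("100 ppm exceeded out to ~%.0f m downwind" % (danger.max() if danger.size else 0))
```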

  11. Environmental justice implications of industrial hazardous waste generation in India: a national scale analysis

    Science.gov (United States)

    Basu, Pratyusha; Chakraborty, Jayajit

    2016-12-01

    While rising air and water pollution have become issues of widespread public concern in India, the relationship between spatial distribution of environmental pollution and social disadvantage has received less attention. This lack of attention becomes particularly relevant in the context of industrial pollution, as India continues to pursue industrial development policies without sufficient regard to its adverse social impacts. This letter examines industrial pollution in India from an environmental justice (EJ) perspective by presenting a national scale study of social inequities in the distribution of industrial hazardous waste generation. Our analysis connects district-level data from the 2009 National Inventory of Hazardous Waste Generating Industries with variables representing urbanization, social disadvantage, and socioeconomic status from the 2011 Census of India. Our results indicate that more urbanized and densely populated districts with a higher proportion of socially and economically disadvantaged residents are significantly more likely to generate hazardous waste. The quantity of hazardous waste generated is significantly higher in more urbanized but sparsely populated districts with a higher proportion of economically disadvantaged households, after accounting for other relevant explanatory factors such as literacy and social disadvantage. These findings underscore the growing need to incorporate EJ considerations in future industrial development and waste management in India.

  12. Geological Hazards analysis in Urban Tunneling by EPB Machine (Case study: Tehran subway line 7 tunnel

    Directory of Open Access Journals (Sweden)

    Hassan Bakhshandeh Amnieh

    2016-06-01

    Full Text Available Technological progress in tunneling has led to modern and efficient tunneling methods in vast underground spaces even under inappropriate geological conditions. Identification and access to appropriate and sufficient geological hazard data are key elements to successful construction of underground structures. Choice of the method, excavation machine, and prediction of suitable solutions to overcome undesirable conditions depend on geological studies and hazard analysis. Identifying and investigating the ground hazards in excavating urban tunnels by an EPB machine could augment the strategy for improving soil conditions during excavation operations. In this paper, challenges such as geological hazards, abrasion of the machine cutting tools, clogging around these tools and inside the chamber, diverse work front, severe water level fluctuations, existence of water, and fine-grained particles in the route were recognized in a study of Tehran subway line 7, for which solutions such as low speed boring, regular cutter head checks, application of soil improving agents, and appropriate grouting were presented and discussed. Due to the presence of fine particles in the route, foam employment was suggested as the optimum strategy where no filler is needed.

  13. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    Energy Technology Data Exchange (ETDEWEB)

    Adelman, D.D. [Water Resources Engineer, Lincoln, NE (United States); Stansbury, J. [Univ. of Nebraska-Lincoln, Omaha, NE (United States)

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes on the national level. Key to this waste management is guidance on design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (Leaf Area Index) conditions. The computer model HELP3, which has the capability to simulate the double bottom liner systems called for in hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume was to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to Leaf Area Index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.

  14. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    Science.gov (United States)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing effects of a M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.

  15. Probabilistic Earthquake-Tsunami Multi-Hazard Analysis: Application to the Tohoku Region, Japan.

    Directory of Open Access Journals (Sweden)

    Raffaele De Risi

    2016-10-01

    Full Text Available This study develops a novel simulation-based procedure for the estimation of the likelihood that seismic intensity (in terms of spectral acceleration) and tsunami inundation (in terms of wave height), at a particular location, will exceed given hazard levels. The procedure accounts for a common physical rupture process for shaking and tsunami. Numerous realizations of stochastic slip distributions of earthquakes having different magnitudes are generated using scaling relationships of source parameters for subduction zones and then using a stochastic synthesis method of earthquake slip distribution. Probabilistic characterization of earthquake and tsunami intensity parameters is carried out by evaluating spatially correlated strong motion intensity through the adoption of ground motion prediction equations as a function of magnitude and shortest distance from the rupture plane and by solving nonlinear shallow water equations for tsunami wave propagation and inundation. The minimum number of simulations required to obtain stable estimates of seismic and tsunami intensity measures is investigated through a statistical bootstrap analysis. The main output of the proposed procedure is the earthquake-tsunami hazard curves representing, for each mean annual rate of occurrence, the corresponding seismic and inundation tsunami intensity measures. This simulation-based procedure facilitates the earthquake-tsunami hazard deaggregation with respect to magnitude and distance. Results are particularly useful for multi-hazard mapping purposes and the developed framework can be further extended to probabilistic earthquake-tsunami risk assessment.
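    The generic step of turning a large set of stochastic scenarios into hazard curves, and bootstrapping their stability, can be sketched as follows. The scenario rates, the lognormal wave heights and the bootstrap size are synthetic placeholders, not the Tohoku results, and the sketch ignores the spatially correlated ground-motion and inundation modelling described above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic scenario set: each stochastic rupture carries an annual rate of
# occurrence and a simulated tsunami wave height at the target site.
n_scen = 2000
annual_rate = np.full(n_scen, 1e-4)                              # per scenario
wave_height = rng.lognormal(mean=0.0, sigma=0.8, size=n_scen)    # metres

thresholds = np.linspace(0.5, 10.0, 20)

def hazard_curve(rates, intensities, thresholds):
    """Mean annual rate of exceeding each intensity threshold."""
    return np.array([rates[intensities > t].sum() for t in thresholds])

curve = hazard_curve(annual_rate, wave_height, thresholds)

# Bootstrap over scenarios to check how stable the curve estimate is,
# i.e. whether enough stochastic realisations were generated.
boot = []
for _ in range(200):
    idx = rng.integers(0, n_scen, size=n_scen)
    boot.append(hazard_curve(annual_rate[idx], wave_height[idx], thresholds))
ci_low, ci_high = np.percentile(np.array(boot), [2.5, 97.5], axis=0)

print(np.column_stack([thresholds, curve, ci_low, ci_high])[:5].round(6))
```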

  16. Application of Hazard Analysis and Critical Control Points in Cherry Juice Processing Enterprises

    OpenAIRE

    Peilong Xu; Na Na

    2015-01-01

    Qingdao is one of the homelands of the cherry in China, and in recent years the deep-processing industry for cherries has been developing rapidly. In this study, the Hazard Analysis and Critical Control Points (HACCP) quality control system is introduced into the production process of cherry juice, which has effectively controlled food safety risks in the production process. Practice has proved that application of the HACCP system effectively reduced the probability of contamination in the cherry juice production process. ...

  17. Assessment of industrial hazardous waste practices, storage and primary batteries industries. Final report, Apr--Sep 1974

    Energy Technology Data Exchange (ETDEWEB)

    McCandless, L.C.; Wetzel, R.; Casana, J.; Slimak, K.

    1975-01-01

    This report, which covers battery manufacturing operations, is one of a series of several studies which examine land-destined wastes from selected industries. The battery industry is divided into two groups by the Bureau of Census: Standard Industrial Classification (SIC) 3691 Storage Batteries (such as lead--acid automobile batteries) and SIC 3692 Primary Batteries (such as carbon--zinc flashlight batteries). The battery industry was studied because heavy metals such as mercury, cadmium, zinc, and lead are used in some of its manufacturing processes. These metals can be toxic in certain concentrations and forms. The potentially hazardous wastes destined for land disposal from the battery industry consist of industrial processing wastes, reject cells, and sludges from water pollution control devices. The amount of sludges destined for land disposal is expected to experience a large short term increase as water effluent guidelines are implemented. The impact of water effluent guidelines on land disposal of wastes is the largest single factor in determining future trends for this industry.

  18. Mapping the hazard of extreme rainfall by peaks-over-threshold extreme value analysis and spatial regression techniques

    NARCIS (Netherlands)

    Beguería, S.; Vicente-Serrano, S.M.

    2006-01-01

    The occurrence of rainfalls of high magnitude constitutes a primary natural hazard in many parts of the world, and the elaboration of maps showing the hazard of extreme rainfalls has great theoretical and practical interest. In this work a procedure based on extreme value analysis and spatial

  19. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    Energy Technology Data Exchange (ETDEWEB)

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  20. Application of Hazard Analysis and Critical Control Points (HACCP) to the Cultivation Line of Mushroom and Other Cultivated Edible Fungi.

    Science.gov (United States)

    Pardo, José E; de Figueirêdo, Vinícius Reis; Alvarez-Ortí, Manuel; Zied, Diego C; Peñaranda, Jesús A; Dias, Eustáquio Souza; Pardo-Giménez, Arturo

    2013-09-01

    The Hazard Analysis and Critical Control Points (HACCP) system is a preventive approach which seeks to ensure food safety and security. It allows product protection and correction of errors, reduces the costs derived from quality defects and reduces the need for final over-control. In this paper, the system is applied to the cultivation line of mushrooms and other edible cultivated fungi. Of all the stages of the process, only the reception of covering materials (stage 1) and compost (stage 3), pre-fruiting and induction (stage 6) and the harvest (stage 7) have been considered critical control points (CCPs). The main hazards found were the presence of unauthorized phytosanitary products or doses above the permitted levels (stages 6 and 7), and the presence of pathogenic bacteria (stages 1 and 3) and/or heavy metals (stage 3). The implementation of this knowledge will allow any plant dedicated to mushroom or other edible fungi cultivation to self-control its production based on the HACCP system.

  1. Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria

    Science.gov (United States)

    Maltzkait, Anika; Pfurtscheller, Clemens

    2014-05-01

    The extreme flood events of 2002, 2005 and 2013 in Austria underlined the importance of local emergency services being able to withstand and reduce the adverse impacts of natural hazards. Although municipal emergency and crisis management plans exist in Austria for legal reasons, they mostly do not cover risk analyses of natural hazards, i.e. a sound, comparable assessment to identify and evaluate risks. Moreover, total losses and operational emergencies triggered by natural hazards have increased in recent decades. Given sparse public funds, objective budget decisions are needed to ensure the efficient provision of operating resources, such as personnel, vehicles and equipment, in the case of natural hazards. We present a case study of the municipality of Au, Austria, which was hit hard during the 2005 floods. Our approach is primarily based on a qualitative risk analysis, combining existing hazard plans, GIS data, field mapping and data on the operational efforts of the fire departments. The risk analysis includes a map of phenomena discussed in a workshop with local experts and a list of risks as well as a risk matrix prepared at that workshop. On this basis, the exact requirements for technical and non-technical mitigation measures for each natural hazard risk were analysed in close collaboration with members of the municipal operation control and members of the local emergency services (fire brigade, Red Cross). The measures include warning, evacuation and technical interventions with heavy equipment and personnel. These results are used, first, to improve the municipal emergency and crisis management plan by providing a risk map, and a

  2. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply, respectively, a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard, DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Category (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  3. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette Jackson [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide the confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  4. Hazard analysis of EUCLIDIAN: an image-guided robotic brachytherapy system.

    Science.gov (United States)

    Hu, Yida; Podder, Tarun; Buzurovic, Ivan; Yan, Kaiguo; Ng, Wan Sing; Yu, Yan

    2007-01-01

    Robotic assistance can help clinicians to improve the flexibility of needle insertion and the accuracy of seed deposition. However, the robotic platform is a safety-critical system because of its automated operational mode. Thus, it is important to perform Hazard Identification & Safety Insurance Control (HISIC) to secure the safety of a medical robotic system. In this paper, we have performed HISIC for our robotic platform, called Endo-Uro Computer Lattice for Intratumoral Delivery, Implementation, and Ablation with Nanosensing (EUCLIDIAN). The definition and requirements of the system are described by the Unified Modeling Language (UML). Failure Mode and Effect Analysis (FMEA) is executed following the principles of HISIC, such as hazard identification, safety insurance control, safety critical limits, monitoring and control. FMEA combined with UML can also be implemented to ensure the reliability of the human operation. On the basis of a safety control index and fuzzy mathematics, a safety effectiveness value is derived to assess the validity of safety insurance control for the robotic system. The above principles and methods are feasible and effective for hazard analysis during the development of the robotic system.
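    The paper's fuzzy safety effectiveness value is not reproduced here; as a minimal illustration of the FMEA step used for hazard identification, the sketch below ranks failure modes by the conventional risk priority number (severity x occurrence x detection). The failure modes, ratings and the mitigation threshold of 100 are invented for illustration only.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int      # 1 (negligible) .. 10 (catastrophic)
    occurrence: int    # 1 (remote) .. 10 (frequent)
    detection: int     # 1 (certain detection) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        """Risk priority number = S x O x D."""
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for an automated needle-insertion module.
modes = [
    FailureMode("needle driver overtravel", 9, 3, 4),
    FailureMode("seed cartridge jam", 5, 5, 3),
    FailureMode("encoder signal dropout", 7, 2, 6),
]

# Rank hazards for safety insurance control; flag RPNs above a chosen limit.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    flag = "MITIGATE" if m.rpn >= 100 else "monitor"
    print(f"{m.name:28s} RPN={m.rpn:3d}  {flag}")
```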

  5. Ergonomics hazards analysis of linemen's power line fixing work in China.

    Science.gov (United States)

    Yu, Ming; Sun, Linyan; Du, Jianhua; Wu, Fengge

    2009-01-01

    This study used qualitative and quantitative methods, such as OWAS (Ovako working posture analysis system) and behavior observation, to analyze musculoskeletal disorder (MSD) risk factors of power line fixing work in China. Video-based sampling was used to record and analyze the frequency and posture of on-pole activities, and the key subtasks revealed the ergonomic characteristics of on-pole fixing tasks. Insulator-fixing was the longest subtask (33% of total working time); bar-installing was the second longest (26% of total working time). It was evident that bar-installing and insulator-fixing carried the most hazardous risks: the OWAS action categories of these two subtasks were higher than those of the other subtasks, and the two subtasks were also time-consuming, difficult and MSD-inducing. Assistant linemen faced more hazardous factors than chief linemen.

  6. Probabilistic hazard analysis of Citlaltépetl (Pico de Orizaba) Volcano, eastern Mexican Volcanic Belt

    Science.gov (United States)

    De la Cruz-Reyna, Servando; Carrasco-Núñez, Gerardo

    2002-03-01

    Citlaltépetl or Pico de Orizaba is the highest active volcano in the North American continent. Although Citlaltépetl is at present in repose, its eruptive history reveals repetitive explosive eruptions in the past. Its relatively low eruption rate has favored significant population growth in areas that may be affected by a potential eruptive activity. The need of some criteria for hazards assessment and land-use planning has motivated the use of statistical methods to estimate the time and space distribution of volcanic hazards around this volcano. The analysis of past activity, from late Pleistocene to historic times, and the extent of some well-identified deposits are used to calculate the recurrence probabilities of eruptions of various size during time periods useful for land-use planning.
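    A minimal sketch of how recurrence probabilities of this kind are typically computed from an eruption catalogue, assuming a stationary Poisson process; the eruption rate below is a placeholder, not the published value for Citlaltépetl.

```python
import math

def poisson_eruption_probability(mean_rate_per_yr: float, window_yr: float) -> float:
    """P(at least one eruption in the window) for a stationary Poisson process."""
    return 1.0 - math.exp(-mean_rate_per_yr * window_yr)

# Placeholder: 4 explosive eruptions of a given size in ~4000 years of record.
rate = 4 / 4000.0
for window in (50, 100, 500):
    p = poisson_eruption_probability(rate, window)
    print(f"P(>=1 eruption in {window:4d} yr) = {p:.3f}")
```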

  7. Rockfall Hazard Analysis From Discrete Fracture Network Modelling with Finite Persistence Discontinuities

    Science.gov (United States)

    Lambert, Cédric; Thoeni, Klaus; Giacomini, Anna; Casagrande, Davide; Sloan, Scott

    2012-09-01

    Developing an accurate representation of the rock mass fabric is a key element in rock fall hazard analysis. The orientation, persistence and density of fractures control the volume and shape of unstable blocks or compartments. In this study, the discrete fracture modelling technique and digital photogrammetry were used to accurately depict the fabric. A volume distribution of unstable blocks was derived combining polyhedral modelling and kinematic analyses. For each block size, probabilities of failure and probabilities of propagation were calculated. A complete energy distribution was obtained by considering, for each block size, its occurrence in the rock mass, its probability of falling, its probability to reach a given location, and the resulting distribution of energies at each location. This distribution was then used with an energy-frequency diagram to assess the hazard.

  8. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Gürpinar, Aybars, E-mail: aybarsgurpinar2007@yahoo.com [Nuclear & Risk Consultancy, Anisgasse 4, 1221 Vienna (Austria); Serva, Leonello, E-mail: lserva@alice.it [Independent Consultant, Via dei Dauni 1, 00185 Rome (Italy); Livio, Franz, E-mail: franz.livio@uninsubria.it [Dipartimento di Scienza ed Alta Tecnologia, Università degli Studi dell’Insubria, Via Velleggio, 11, 22100 Como (Italy); Rizzo, Paul C., E-mail: paul.rizzo@rizzoasoc.com [RIZZO Associates, 500 Penn Center Blvd., Suite 100, Pittsburgh, PA 15235 (United States)

    2017-01-15

    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  9. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    Science.gov (United States)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions drop out, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA which can be summarized in the following steps: i) to perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) to apply a filtering procedure which uses a cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) to perform high resolution numerical simulations only for these representative scenarios and for a subset of near field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
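    Step (ii), the cluster-based filtering of scenarios, can be illustrated with a short sketch. The use of k-means on log wave amplitudes at a few offshore control points, and all the counts below, are assumptions made for illustration; they are not necessarily the authors' exact clustering variant or feature set.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Synthetic regional-scale output: offshore tsunami amplitudes predicted by
# each candidate source scenario at a few control points near the target.
n_scenarios, n_points = 5000, 6
features = rng.lognormal(mean=0.0, sigma=0.6, size=(n_scenarios, n_points))

# Cluster scenarios with similar coarse-scale impact and keep, for each
# cluster, the member closest to its centroid as the representative to be
# re-run with high-resolution inundation modelling.
k = 50
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(np.log(features))
representatives = []
for c in range(k):
    members = np.flatnonzero(km.labels_ == c)
    d = np.linalg.norm(np.log(features[members]) - km.cluster_centers_[c], axis=1)
    representatives.append(members[np.argmin(d)])

print(f"{n_scenarios} scenarios reduced to {len(representatives)} representatives")
```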

  10. Seismic Hazard Analysis Using the Adaptive Kernel Density Estimation Technique for Chennai City

    Science.gov (United States)

    Ramanna, C. K.; Dodagoudar, G. R.

    2012-01-01

    The conventional method of probabilistic seismic hazard analysis (PSHA) using the Cornell-McGuire approach requires identification of homogeneous source zones as the first step. This criterion brings along many issues and, hence, several alternative methods of hazard estimation have come up in the last few years. Methods such as zoneless or zone-free methods and modelling of the earth's crust using numerical methods with finite element analysis have been proposed. Delineating a homogeneous source zone in regions of distributed and/or diffused seismicity is rather a difficult task. In this study, the zone-free method using the adaptive kernel technique for hazard estimation is explored for regions having distributed and diffused seismicity. Chennai city is in such a region with low to moderate seismicity, so it has been used as a case study. The adaptive kernel technique is statistically superior to the fixed kernel technique primarily because the bandwidth of the kernel is varied spatially depending on the clustering or sparseness of the epicentres. Although the fixed kernel technique has proven to work well in general density estimation cases, it fails to perform in the case of multimodal and long-tail distributions. In such situations, the adaptive kernel technique serves the purpose and is more relevant in earthquake engineering, as the activity rate probability density surface is multimodal in nature. The peak ground acceleration (PGA) obtained from all three approaches (i.e., the Cornell-McGuire approach, fixed kernel and adaptive kernel techniques) for 10% probability of exceedance in 50 years is around 0.087 g. The uniform hazard spectra (UHS) are also provided for different structural periods.
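    A minimal sketch of the adaptive kernel idea (a pilot fixed-bandwidth estimate followed by Abramson-type local bandwidths, wider where epicentres are sparse) is given below. The synthetic epicentres, the pilot bandwidth and the query points are placeholders and do not reproduce the study's catalogue or smoothing parameters.

```python
import numpy as np

def gaussian_kde_2d(points, query, bandwidths):
    """Average of 2-D Gaussian kernels with per-point bandwidths."""
    d2 = ((query[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    h2 = bandwidths ** 2
    return (np.exp(-0.5 * d2 / h2) / (2 * np.pi * h2)).mean(axis=1)

rng = np.random.default_rng(3)
# Synthetic epicentres (degrees) with a dense cluster and scattered events.
epi = np.vstack([rng.normal([80.2, 13.1], 0.05, size=(80, 2)),
                 rng.uniform([79.5, 12.5], [81.0, 13.8], size=(40, 2))])

h0 = 0.15                                     # pilot (fixed) bandwidth
pilot = gaussian_kde_2d(epi, epi, np.full(len(epi), h0))

# Abramson local bandwidth factors: wider kernels where the pilot density
# is low (sparse seismicity), narrower inside clusters.
g = np.exp(np.log(pilot).mean())
local_h = h0 * (pilot / g) ** -0.5

grid = np.array([[80.27, 13.08], [79.7, 13.6]])   # placeholder query points
print(gaussian_kde_2d(epi, grid, local_h))
```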

  11. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada); Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)

    2016-09-15

    Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectra (UHS), obtained from the PSHA, are characterized by high frequency content which differs from the original plant design basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high frequency effects of the UHS can be mitigated through site response analysis with site-specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building in terms of High Confidence of Low Probability of Failure (HCLPF) can be improved. (author)
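    HCLPF capacities of this kind are commonly derived from a lognormal fragility model; a minimal sketch of that relation is given below, using the composite-variability approximation HCLPF ≈ Am·exp(−2.326·βc). The median capacity and the aleatory/epistemic log-standard deviations are placeholders, not the plant-specific values.

```python
import math
from scipy.stats import norm

def fragility_pf(a, a_median, beta_c):
    """Probability of failure at PGA `a` for a lognormal fragility curve."""
    return norm.cdf(math.log(a / a_median) / beta_c)

def hclpf(a_median, beta_c):
    """High Confidence of Low Probability of Failure capacity (composite form)."""
    return a_median * math.exp(-2.326 * beta_c)

# Placeholder fragility parameters for a service-building failure mode.
a_median, beta_r, beta_u = 1.2, 0.25, 0.35       # g, aleatory, epistemic
beta_c = math.hypot(beta_r, beta_u)              # composite log-std deviation

print(f"HCLPF = {hclpf(a_median, beta_c):.2f} g")
print(f"P(fail | 0.3 g) = {fragility_pf(0.3, a_median, beta_c):.3e}")
```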

  12. Analysis of occupational health hazards and associated risks in fuzzy environment: a case research in an Indian underground coal mine.

    Science.gov (United States)

    Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar

    2017-09-01

    This paper presents a unique hierarchical structure on various occupational health hazards including physical, chemical, biological, ergonomic and psychosocial hazards, and associated adverse consequences in relation to an underground coal mine. The study proposes a systematic health hazard risk assessment methodology for estimating extent of hazard risk using three important measuring parameters: consequence of exposure, period of exposure and probability of exposure. An improved decision making method using fuzzy set theory has been attempted herein for converting linguistic data into numeric risk ratings. The concept of 'centre of area' method for generalized triangular fuzzy numbers has been explored to quantify the 'degree of hazard risk' in terms of crisp ratings. Finally, a logical framework for categorizing health hazards into different risk levels has been constructed on the basis of distinguished ranges of evaluated risk ratings (crisp). Subsequently, an action requirement plan has been suggested, which could provide guideline to the managers for successfully managing health hazard risks in the context of underground coal mining exercise.
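    The 'centre of area' defuzzification of a triangular fuzzy rating mentioned above reduces to a simple centroid formula; the sketch below shows it together with an illustrative action-band mapping. The membership parameters, the 0-10 scale and the band limits are placeholders, not the paper's calibrated values.

```python
from dataclasses import dataclass

@dataclass
class TriangularFuzzyNumber:
    a: float  # lower bound
    b: float  # modal (most likely) value
    c: float  # upper bound

    def centre_of_area(self) -> float:
        """Centroid of a triangular membership function: (a + b + c) / 3."""
        return (self.a + self.b + self.c) / 3.0

def risk_level(crisp_rating: float) -> str:
    """Map a crisp 0-10 risk rating to an action band (illustrative limits)."""
    if crisp_rating >= 7.5:
        return "intolerable - stop work, redesign controls"
    if crisp_rating >= 5.0:
        return "substantial - reduce within a set deadline"
    if crisp_rating >= 2.5:
        return "moderate - planned improvement"
    return "acceptable - monitor"

# Hypothetical aggregated rating for dust exposure (consequence, period and
# probability of exposure judged linguistically and combined into one number).
dust = TriangularFuzzyNumber(4.8, 6.2, 7.4)
crisp = dust.centre_of_area()
print(f"crisp rating = {crisp:.2f} -> {risk_level(crisp)}")
```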

  13. Pasteurised milk and implementation of HACCP (Hazard Analysis Critical Control Point

    Directory of Open Access Journals (Sweden)

    T.B Murdiati

    2004-10-01

    Full Text Available The purpose of pasteurisation is to destroy pathogenic bacteria without affecting the taste, flavor, and nutritional value. A study on the implementation of HACCP (Hazard Analysis Critical Control Point) in producing pasteurised milk was carried out in four pasteurised milk processing units, one in Jakarta, two in Bandung and one in Bogor. The critical control points in the production line were identified. Milk samples were collected from the critical points and were analysed for the total number of microbes. Antibiotic residues were detected in raw milk. The study indicated that one unit in Bandung and one unit in Jakarta produced pasteurised milk with a lower number of microbes than the other units, due to better management and control applied along the production chain. Penicillin residues were detected in raw milk used by the unit in Bogor. Six critical points, the hazards that might arise at those points, and how to prevent those hazards were identified. A quality assurance system such as HACCP would be able to ensure high-quality, safe pasteurised milk, and should be implemented gradually.

  14. Spatial hazard analysis and prediction on rainfall-induced landslide using GIS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The application of a landslide hazard model coupled with GIS provides an effective means of spatial hazard analysis and prediction of rainfall-induced landslides. A modified SINMAP model is established based upon a systematic investigation of previous GIS-based landslide analysis models. By integrating the landslide deterministic model with the DEM-based hydrological distribution model, this model examines in depth the effect of the rainfall-driven underground water distribution on slope stability and landslide occurrence, including the effect of the dynamic water pressure resulting from the downslope seepage process as well as that of the static water pressure. Its applicability has been verified in the Xiaojiang watershed, an area in Southeast China where rainfall-induced landslides are widespread. A detailed discussion is given of the spatial distribution characteristics of landslide hazard and its extending trend, as well as the quantitative relationships between landslide hazard and precipitation, slope angle and specific catchment area in the Xiaojiang watershed, and the precipitation threshold for landslide occurrence is estimated. These analytical results are useful for geohazard control and engineering decision-making in the Xiaojiang watershed.
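    The infinite-slope stability relation at the core of SINMAP-type models can be sketched compactly; the dynamic seepage-pressure extension described in this record is not reproduced. The parameter values below (slope, specific catchment area, R/T ratio, dimensionless cohesion, friction angle, density ratio) are placeholders chosen only to exercise the formula.

```python
import math

def wetness(r_over_t: float, spec_catchment: float, slope_rad: float) -> float:
    """Relative wetness w = min(R/T * a / sin(theta), 1)."""
    return min(r_over_t * spec_catchment / math.sin(slope_rad), 1.0)

def factor_of_safety(c_dimless, slope_rad, w, phi_rad, dens_ratio=1.0):
    """Infinite-slope FS used in SINMAP-type stability indices.

    FS = (C + cos(theta) * (1 - w*r) * tan(phi)) / sin(theta),
    with C the combined dimensionless cohesion and r = rho_w / rho_s.
    """
    return (c_dimless + math.cos(slope_rad) * (1.0 - w * dens_ratio)
            * math.tan(phi_rad)) / math.sin(slope_rad)

# Placeholder cell: 32 deg slope, 400 m specific catchment area,
# R/T = 0.0005 1/m, C = 0.1, phi = 33 deg, rho_w/rho_s = 0.5.
theta = math.radians(32.0)
w = wetness(0.0005, 400.0, theta)
fs = factor_of_safety(0.1, theta, w, math.radians(33.0), dens_ratio=0.5)
print(f"wetness = {w:.2f}, FS = {fs:.2f}  ({'unstable' if fs < 1 else 'stable'})")
```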

  15. Laser hazard analysis for airborne AURA (Big Sky variant) Proteus platform.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2004-02-01

    A laser safety and hazard analysis was performed for the airborne AURA (Big Sky Laser Technology) lidar system based on the 2000 version of the American National Standard Institute's (ANSI) Standard Z136.1, for the Safe Use of Lasers and the 2000 version of the ANSI Standard Z136.6, for the Safe Use of Lasers Outdoors. The AURA lidar system is installed in the instrument pod of a Proteus airframe and is used to perform laser interaction experiments and tests at various national test sites. The targets are located at various distances or ranges from the airborne platform. In order to protect personnel, who may be in the target area and may be subjected to exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength, calculate the Nominal Ocular Hazard Distance (NOHD), and determine the maximum 'eye-safe' dwell times for various operational altitudes and conditions. It was also necessary to calculate the appropriate minimum Optical Density (ODmin) of the laser safety eyewear used by authorized personnel who may receive hazardous exposures during ground base operations of the airborne AURA laser system (system alignment and calibration).
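    The NOHD and minimum optical density calculations mentioned above follow standard ANSI Z136-style small-source relations for a CW beam; a minimal sketch is given below. The power, divergence, aperture and MPE values are arbitrary placeholders (the real MPE depends on wavelength and exposure duration), not the AURA system's parameters.

```python
import math

def nohd_m(power_w: float, divergence_rad: float, mpe_w_per_cm2: float,
           aperture_cm: float = 0.0) -> float:
    """Nominal Ocular Hazard Distance for a CW beam (small-source approximation).

    NOHD = (sqrt(4*P / (pi*MPE)) - a) / divergence, with lengths in cm,
    returned in metres.
    """
    return ((math.sqrt(4.0 * power_w / (math.pi * mpe_w_per_cm2)) - aperture_cm)
            / divergence_rad / 100.0)

def min_optical_density(exposure_w_per_cm2: float, mpe_w_per_cm2: float) -> float:
    """Minimum eyewear OD so the transmitted irradiance falls below the MPE."""
    return math.log10(exposure_w_per_cm2 / mpe_w_per_cm2)

# Placeholder CW example: 1 W beam, 0.5 mrad divergence, 1 cm exit aperture,
# MPE taken as 1e-3 W/cm^2 for illustration only.
print(f"NOHD ~= {nohd_m(1.0, 0.5e-3, 1e-3, aperture_cm=1.0):.0f} m")
print(f"OD_min = {min_optical_density(2.0, 1e-3):.1f}")
```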

  16. Seismic hazard analysis of Sinop province, Turkey using probabilistic and statistical methods

    Indian Academy of Sciences (India)

    Recai Feyiz Kartal; Günay Beyhan; Ayhan Keskinsezer

    2014-04-01

    Using earthquakes of magnitude 4.0 and greater which occurred between 1 January 1900 and 31 December 2008 in the Sinop province of Turkey, this study presents a seismic hazard analysis based on probabilistic and statistical methods. According to the earthquake zonation map, Sinop is divided into first-, second-, third- and fourth-degree earthquake regions. Our study area covered the coordinates between 40.66°–42.82°N and 32.20°–36.55°E. The different magnitudes of the earthquakes recorded on varied scales during the last 108 years were converted to a common scale (Mw). The earthquake catalog was then recompiled to evaluate the potential seismic sources in the aforesaid province. Using the attenuation relationships given by Boore et al. (1997) and Kalkan and Gülkan (2004), the largest ground accelerations corresponding to a recurrence period of 475 years are found to be 0.14 g for bedrock at the central district. Comparing the seismic hazard curves, we show the spatial variations of seismic hazard potential in this province for a recurrence period of the order of 475 years.
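    Under the Poisson assumption used in such hazard studies, the 475-year recurrence period quoted above corresponds to a 10% probability of exceedance in a 50-year exposure time; a minimal sketch of the conversion:

```python
import math

def return_period(prob_exceedance: float, exposure_yr: float) -> float:
    """Poisson-model return period for a given exceedance probability."""
    return -exposure_yr / math.log(1.0 - prob_exceedance)

def prob_exceedance(return_period_yr: float, exposure_yr: float) -> float:
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

print(f"10% in 50 yr -> T = {return_period(0.10, 50):.0f} yr")    # ~475 yr
print(f"2% in 50 yr  -> T = {return_period(0.02, 50):.0f} yr")    # ~2475 yr
print(f"T = 475 yr over 50 yr -> P = {prob_exceedance(475, 50):.3f}")
```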

  17. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    Science.gov (United States)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2017-01-01

    Ground motions are affected by directivity effects at near-fault regions which result in low-frequency cycle pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to significant increase in the risk of earthquake-induced damage on engineering structures. The ordinary probabilistic seismic hazard analysis (PSHA) does not take into account such effects; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop the seismic hazard mapping of Tehran City according to near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of the simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region to derive purely empirically based pulse prediction models. The results show that the directivity effects can significantly affect the estimate of regional seismic hazard.

  18. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, North-Western Italy

    Science.gov (United States)

    Turconi, L.; Tropeano, D.; Savio, G.; De, S. Kumar; Mason, P. J.

    2015-04-01

    A Civil Protection Plan has been drafted for a 600 km2 mountainous region in NW Italy consisting of the Orco and Soana valleys. The area is part of the oldest natural park in Italy and attracts several thousand tourists every year. The work is concerned with the analysis of the relevant physiographic characteristics of this Alpine landscape, which has extremely variable geomorphology and a long history of instability. Thousands of records as well as digital maps (involving overlay and comparison of up to 90 GIS layers) have been analyzed and cross-correlated to find out the details of the events. The study area has experienced different types of natural hazards, typical of the whole Alpine environment, and has therefore been selected for multi-hazard research in which several natural processes have been investigated with regard to their damaging effects on the land. At least 250 deaths have been recorded in the area since the 18th century, due to 36 different severe hazardous events.

  19. Standards Applicable to Generators of Hazardous Waste; Alternative Requirements for Hazardous Waste Determination and Accumulation of Unwanted Material at Laboratories Owned by Colleges and Universities and Other Eligible Academic Entities Formally Affiliated With Colleges and Universities. Final Rule. Federal Register, Environmental Protection Agency. 40 CFR Parts 261 and 262. Part II

    Science.gov (United States)

    National Archives and Records Administration, 2008

    2008-01-01

    The Environmental Protection Agency (EPA or the Agency) is finalizing an alternative set of generator requirements applicable to laboratories owned by eligible academic entities, as defined in this final rule. The rule provides a flexible and protective set of regulations that address the specific nature of hazardous waste generation and…

  20. Final Hazard Categorization for the Remediation of the 118-D-1, 118-D-2, 118-D-3, 118-H-1, 118-H-2, and 118-H-3 Solid Waste Burial Grounds

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Ludowise

    2009-06-17

    This report presents the final hazard categorization for the remediation of the 118-D-1, 118-D-2, 118-D-3 Burial Grounds located within the 100-D/DR Area of the Hanford Site and the 118-H-1, 118-H-2, and 118-H-3 Burial Grounds located within the 100-H Area of the Hanford Site. A material at risk calculation was performed that determined the radiological inventory for each burial ground to be Hazard Category 3.

  1. Criticality analysis for hazardous materials transportation; Classificacao da criticidade das rotas do transporte rodoviario de produtos perigosos da BRASKEM

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Katia; Brady, Mariana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Diniz, Americo [BRASKEM S.A., Sao Paulo, SP (Brazil)

    2008-07-01

    The poor condition of Brazilian roads drives companies to be more demanding about the transportation of hazardous materials, in order to avoid accidents or releases of materials, and to have actions in place to contain releases that could reach communities and water sources. To address this situation, DNV and BRASKEM developed a methodology for risk analysis called Criticality Analysis for Hazardous Materials Transportation. The objective of this methodology is to identify the most critical points of the routes so that actions can be taken to avoid accidents. (author)

  2. Analysis of Final Energy Consumption Patterns in 10 Arab Countries

    Science.gov (United States)

    Al-Hinti, I.; Al-Ghandoor, A.

    2009-08-01

    This study presents an analysis of the energy consumption patterns in 10 Arab countries: Saudi Arabia, Kuwait, United Arab Emirates (UAE), Syria, Lebanon, Jordan, Egypt, Libya, Tunisia, and Algeria. Commonalities and variations between these countries are discussed and explained through key economic and energy indicators, and the relationship between the overall final energy consumption per capita and the GDP per capita is examined. The distribution of the final energy consumption across different sectors is also analysed, and the patterns of consumption in the industrial, transportation, and residential sectors are discussed with focus on the types of energy consumed, and the main drivers of this consumption. The findings and the conclusions of this study are believed to be beneficial to the national energy policy planners in identifying possible strengths, weaknesses, and areas of emphasis and improvement in their strategic energy plans.

  3. Critical load analysis in hazard assessment of metals using a Unit World Model.

    Science.gov (United States)

    Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L

    2011-09-01

    A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), and the chemicals have a singular ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries for a UWM. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (Kd) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations. Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic
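
    As a rough illustration of why the reported four-orders-of-magnitude spread in Kd matters, the sketch below maps a partition coefficient and a suspended-solids concentration to the dissolved fraction of a metal; the values are illustrative placeholders, not inputs or outputs of the Unit World Model described above.

```python
# Hedged sketch: dissolved fraction of a metal as a function of the particle-water
# partition coefficient Kd and the suspended particulate matter (TSS) concentration.
# Kd and TSS values are illustrative only, not taken from the UWM study.

def dissolved_fraction(kd_l_per_kg: float, tss_kg_per_l: float) -> float:
    """Fraction of total metal in the dissolved phase for a given Kd and TSS."""
    return 1.0 / (1.0 + kd_l_per_kg * tss_kg_per_l)

tss = 5e-6  # 5 mg/L of suspended solids expressed in kg/L
for kd in (1e3, 1e5, 1e7):  # a four-orders-of-magnitude span in Kd
    print(f"Kd = {kd:.0e} L/kg -> dissolved fraction = {dissolved_fraction(kd, tss):.3f}")
```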

  4. The hazard analysis and critical control point system in food safety.

    Science.gov (United States)

    Herrera, Anavella Gaitan

    2004-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series.

  5. Region-specific deterministic and probabilistic seismic hazard analysis of Kanpur city

    Indian Academy of Sciences (India)

    Anbazhagan P; Ketan Bajaj; Nairwita Dutta; Sayed S R Moustafa; Nassir S N Al-Arifi

    2017-02-01

    A seismic hazard map of Kanpur city has been developed considering the region-specific seismotectonic parameters within a 500-km radius by deterministic and probabilistic approaches. The maximum probable earthquake magnitude (Mmax) for each seismic source has been estimated by considering the regional rupture characteristics method and has been compared with the maximum observed magnitude (Mobs max), Mobs max + 0.5, and the Kijko method. The best suitable ground motion prediction equations (GMPEs) were selected from 27 applicable GMPEs based on the ‘efficacy test’. Furthermore, different weight factors were assigned to the different Mmax values and the selected GMPEs to calculate the final hazard value. Peak ground acceleration and spectral acceleration at 0.2 and 1 s were estimated and mapped for the worst-case scenario and for 2 and 10% probability of exceedance in 50 years. Peak ground acceleration (PGA) showed a variation from 0.04 to 0.36 g for DSHA, and from 0.02 to 0.32 g and 0.092 to 0.1525 g for 2 and 10% probability in 50 years, respectively. A normalised site-specific design spectrum has been developed considering three vulnerable sources based on deaggregation at the city center, and the results are compared with the recent 2011 Sikkim and 2015 Nepal earthquakes and the Indian seismic code IS 1893.
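
    The weighting step mentioned above (combining several Mmax estimates and the selected GMPEs) is, in effect, a logic-tree average. The sketch below shows that combination in its simplest form; the branch names, weights and PGA values are invented placeholders, not the ones used in the Kanpur study.

```python
# Hedged sketch of a logic-tree weighted hazard value: each branch pairs an Mmax
# model with a GMPE and carries a weight; the final hazard estimate is the
# weighted mean. All numbers below are illustrative placeholders.
branches = [
    # (description, weight, PGA in g at the chosen return period)
    ("Mmax from regional rupture / GMPE A", 0.4, 0.20),
    ("Mmax observed + 0.5 / GMPE A",        0.3, 0.16),
    ("Mmax from Kijko method / GMPE B",     0.3, 0.12),
]

assert abs(sum(w for _, w, _ in branches) - 1.0) < 1e-9, "weights must sum to 1"

weighted_pga = sum(w * pga for _, w, pga in branches)
print(f"weighted-mean PGA = {weighted_pga:.3f} g")
```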

  6. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-02-01

    The need for remote-handled low-level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest ranked alternative for providing continued, uninterrupted remote-handled LLW disposal capability for remote-handled LLW that is generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  7. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Project

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-10-01

    The need for remote-handled low-level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest ranked alternative for providing continued, uninterrupted remote-handled LLW disposal capability for remote-handled LLW that is generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  8. Asymptotics on Semiparametric Analysis of Multivariate Failure Time Data Under the Additive Hazards Model

    Institute of Scientific and Technical Information of China (English)

    Huan-bin Liu; Liu-quan Sun; Li-xing Zhu

    2005-01-01

    Many survival studies record the times to two or more distinct failures on each subject. The failures may be events of different natures or may be repetitions of the same kind of event. In this article, we consider the regression analysis of such multivariate failure time data under the additive hazards model. Simple weighted estimating functions for the regression parameters are proposed, and the asymptotic distribution theory of the resulting estimators is derived. In addition, a class of generalized Wald and generalized score statistics for hypothesis testing and model selection is presented, and the asymptotic properties of these statistics are examined.
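
    For readers unfamiliar with the model, the additive hazards specification referred to above can be stated as follows; this is the generic (Lin-Ying type) form in standard notation, not an excerpt from the article itself.

```latex
% Generic additive hazards model and weighted estimating function (standard
% notation; not reproduced from the article itself).
\[
  \lambda(t \mid Z(t)) \;=\; \lambda_0(t) + \beta^{\top} Z(t),
\]
% where \lambda_0(t) is an unspecified baseline hazard and Z(t) is the covariate
% process. The regression parameters solve a weighted estimating equation of the form
\[
  U(\beta) \;=\; \sum_{i=1}^{n} \int_0^{\tau} w_i(t)
  \bigl\{ Z_i(t) - \bar{Z}(t) \bigr\}
  \bigl\{ \mathrm{d}N_i(t) - Y_i(t)\, \beta^{\top} Z_i(t)\,\mathrm{d}t \bigr\} \;=\; 0,
\]
% with N_i the counting process, Y_i the at-risk indicator and \bar{Z}(t) a
% weighted average of covariates among subjects at risk at time t.
```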

  9. [Incorporation of the Hazard Analysis and Critical Control Point system (HACCP) in food legislation].

    Science.gov (United States)

    Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R

    2004-01-01

    The Hazard Analysis and Critical Control Point system (HACCP), recommended by different international organizations such as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE) and the International Convention for Vegetables Protection (ICPV), amongst others, contributes to ensuring the innocuity (safety) of food along the agro-alimentary chain and requires Good Manufacturing Practices (GMP) for its implementation; GMPs are legislated in most countries. Since 1997, Colombia has set rules and legislation for application of the HACCP system in agreement with international standards. This paper discusses the potential and difficulties of the legislation's enforcement and suggests some policy implications towards food safety.

  10. Application of Hazard Analysis and Critical Control Points in Cherry Juice Processing Enterprises

    Directory of Open Access Journals (Sweden)

    Peilong Xu

    2015-09-01

    Qingdao is one of the homelands of the cherry in China, and in recent years its cherry deep-processing industry has been developing rapidly. In this study, a Hazard Analysis and Critical Control Points (HACCP) quality control system is introduced into the production process of cherry juice, effectively controlling food safety risks in the production process. Practice has shown that application of the HACCP system effectively reduced the probability of contamination in the cherry juice production process. The application of this risk control system in cherry juice production supports standardization of the production process and helps food safety supervision of production processes.

  11. Fuel Storage Facility Final Safety Analysis Report. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Linderoth, C.E.

    1984-03-01

    The Fuel Storage Facility (FSF) is an integral part of the Fast Flux Test Facility. Its purpose is to provide long-term storage (20-year design life) for spent fuel core elements used to provide the fast flux environment in FFTF, and for test fuel pins, components and subassemblies that have been irradiated in the fast flux environment. This Final Safety Analysis Report (FSAR) and its supporting documentation provides a complete description and safety evaluation of the site, the plant design, operations, and potential accidents.

  12. Fast Flux Test Facility final safety analysis report. Amendment 73

    Energy Technology Data Exchange (ETDEWEB)

    Gantt, D.A.

    1993-08-01

    This report provides Final Safety Analysis Report (FSAR) Amendment 73 for incorporation into the Fast Flux Test Facility (FFTF) FSAR set. This page change incorporates Engineering Change Notices (ECNs) issued subsequent to Amendment 72 and approved for incorporation before May 6, 1993. These changes include: Chapter 3, design criteria for structures, equipment, and systems; Chapter 5B, reactor coolant system; Chapter 7, instrumentation and control systems; Chapter 9, auxiliary systems; Chapter 11, reactor refueling system; Chapter 12, radiation protection and waste management; Chapter 13, conduct of operations; Chapter 17, technical specifications; Chapter 20, FFTF criticality specifications; Appendix C, local fuel failure events; and Appendix F1, operation at 680°F inlet temperature.

  14. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steve [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ma, Zhegang [Idaho National Lab. (INL), Idaho Falls, ID (United States); Spears, Bob [Idaho National Lab. (INL), Idaho Falls, ID (United States); Szilard, Ronaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kosbab, Ben [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-07-26

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC), Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE code PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  15. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    Science.gov (United States)

    Grasso, S.; Maugeri, M.

    After the Summit held in Washington on August 20-22, 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area has been formed. The Group has so far determined the following hazards: (1) seismic hazard (including the hazard for historical buildings); (2) hazard linked to the quantity and quality of water; (3) landslide hazard; (4) volcanic hazard. The analysis of such hazards implies the creation and the management of data banks, which can only be used if the data are properly geo-referenced to allow their combined use. The obtained results must therefore be represented on geo-referenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for the Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). Nowadays the south-eastern area of Sicily, called the "Iblea" seismic area, is considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces, and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. So the correct evaluation of seismic hazard is highly affected by risk factors due to the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either

  16. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    Science.gov (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard, namely taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located
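
    The performance-based combination step described above can be written as a double sum over the PSHA disaggregation: the annual rate of liquefaction is the sum, over ground-motion levels and magnitudes, of the conditional probability of liquefaction times the corresponding joint rate increment. The sketch below shows only that bookkeeping; the rate increments and the logistic placeholder for the conditional probability are invented, standing in for the Cetin or Idriss-Boulanger relationships actually used in the study.

```python
# Hedged sketch of a performance-based liquefaction rate calculation:
#   rate_liq = sum over (a, m) of P[liquefaction | a, m] * delta_lambda(a, m)
# delta_lambda comes from the PSHA disaggregation; here it and the conditional
# probability function are illustrative placeholders.
import numpy as np

pga_levels = np.array([0.05, 0.10, 0.20, 0.40])   # g
magnitudes = np.array([5.0, 6.0, 7.0])

# delta_lambda[i, j]: annual rate of events with PGA near pga_levels[i] and M near magnitudes[j]
delta_lambda = np.array([[4e-3, 2e-3, 5e-4],
                         [1e-3, 8e-4, 3e-4],
                         [2e-4, 2e-4, 1e-4],
                         [2e-5, 4e-5, 3e-5]])

def p_liquefaction(pga, mag):
    """Placeholder conditional probability of liquefaction given PGA and magnitude."""
    return 1.0 / (1.0 + np.exp(-(8.0 * pga + 0.5 * (mag - 6.0) - 1.5)))

rate = sum(p_liquefaction(a, m) * delta_lambda[i, j]
           for i, a in enumerate(pga_levels)
           for j, m in enumerate(magnitudes))
print(f"annual rate of liquefaction ~ {rate:.2e} (return period ~ {1.0 / rate:.0f} yr)")
```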

  17. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    Directory of Open Access Journals (Sweden)

    H. Apel

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. All hazards – fluvial, pluvial and combined – were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median
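
    As a pointer to how the pluvial side of such an analysis is typically set up, the sketch below fits a generalized Pareto distribution to peaks over a threshold and converts the fit to return levels; the synthetic rainfall series, threshold choice and record length are illustrative and are not the Can Tho rain gauge data.

```python
# Hedged sketch of a peak-over-threshold (POT) frequency estimate: exceedances of
# a high threshold are fitted with a generalized Pareto distribution (GPD) and
# converted to T-year return levels. The synthetic series below stands in for a
# real rain gauge record.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
years = 30
daily_rain = rng.gamma(shape=0.4, scale=12.0, size=365 * years)  # daily totals, mm

threshold = np.quantile(daily_rain, 0.99)
excesses = daily_rain[daily_rain > threshold] - threshold
peaks_per_year = len(excesses) / years

shape, _, scale = genpareto.fit(excesses, floc=0.0)

def return_level(T_years):
    """Daily rainfall depth exceeded on average once every T years."""
    p = 1.0 / (T_years * peaks_per_year)   # exceedance probability per peak
    return threshold + genpareto.ppf(1.0 - p, shape, loc=0.0, scale=scale)

for T in (10, 50, 100):
    print(f"{T:>3}-yr daily rainfall ~ {return_level(T):.1f} mm")
```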

  18. Analysis of Flood Hazards for the Materials and Fuels Complex at the Idaho National Laboratory Site

    Energy Technology Data Exchange (ETDEWEB)

    Skaggs, Richard; Breithaupt, Stephen A.; Waichler, Scott R.; Kim, Taeyun; Ward, Duane L.

    2010-11-01

    Researchers at Pacific Northwest National Laboratory conducted a flood hazard analysis for the Materials and Fuels Complex (MFC) site located at the Idaho National Laboratory (INL) site in southeastern Idaho. The general approach for the analysis was to determine the maximum water elevation levels associated with the design-basis flood (DBFL) and compare them to the floor elevations at critical building locations. Two DBFLs for the MFC site were developed using different precipitation inputs: probable maximum precipitation (PMP) and 10,000 year recurrence interval precipitation. Both precipitation inputs were used to drive a watershed runoff model for the surrounding upland basins and the MFC site. Outflows modeled with the Hydrologic Engineering Center's Hydrologic Modeling System were input to the Hydrologic Engineering Center's River Analysis System hydrodynamic flood routing model.

  20. Strong Ground-Motion Prediction in Seismic Hazard Analysis: PEGASOS and Beyond

    Science.gov (United States)

    Scherbaum, F.; Bommer, J. J.; Cotton, F.; Bungum, H.; Sabetta, F.

    2005-12-01

    The SSHAC Level 4 approach to probabilistic seismic hazard analysis (PSHA), which could be considered to define the state-of-the-art in PSHA using multiple expert opinions, has been fully applied only twice, firstly in the multi-year Yucca Mountain study and subsequently (2002-2004) in the PEGASOS project. The authors of this paper participated as ground-motion experts in this latter project, the objective of which was comprehensive seismic hazard analysis for four nuclear power plant sites in Switzerland, considering annual exceedance frequencies down to 1/10,000,000. Following SSHAC procedure, particular emphasis was put on capturing both the aleatory and epistemic uncertainties. As a consequence, ground motion prediction was performed by combining several empirical ground motion models within a logic tree framework with the weights on each logic tree branch expressing the personal degree-of-belief of each ground-motion expert. In the present paper, we critically review the current state of ground motion prediction methodology in PSHA in particular for regions of low seismicity. One of the toughest lessons from PEGASOS was that in systematically and rigorously applying the laws of uncertainty propagation to all of the required conversions and adjustments of ground motion models, a huge price has to be paid in an ever-growing aleatory variability. Once this path has been followed, these large sigma values will drive the hazard, particularly for low annual frequencies of exceedance. Therefore, from a post-PEGASOS perspective, the key issues in the context of ground-motion prediction for PSHA for the near future are to better understand the aleatory variability of ground motion and to develop suites of ground-motion prediction equations that employ the same parameter definitions. The latter is a global rather than a regional challenge which might be a desirable long-term goal for projects similar to the PEER NGA (Pacific Earthquake Engineering Research Center, Next

  1. Final Safety Analysis Report (FSAR) for Building 332, Increment III

    Energy Technology Data Exchange (ETDEWEB)

    Odell, B. N.; Toy, Jr., A. J.

    1977-08-31

    This Final Safety Analysis Report (FSAR) supplements the Preliminary Safety Analysis Report (PSAR), dated January 18, 1974, for Building 332, Increment III of the Plutonium Materials Engineering Facility located at the Lawrence Livermore Laboratory (LLL). The FSAR, in conjunction with the PSAR, shows that the completed increment provides facilities for safely conducting the operations as described. These documents satisfy the requirements of ERDA Manual Appendix 6101, Annex C, dated April 8, 1971. The format and content of this FSAR comply with the basic requirements of the letter of request from ERDA San to LLL, dated March 10, 1972. Included as appendices in support of the FSAR are the Building 332 Operational Safety Procedure and the LLL Disaster Control Plan.

  2. Fire Hazard Analysis for the Cold Vacuum Drying facility (CVD) Facility

    CERN Document Server

    Singh, G

    2000-01-01

    The CVDF is a nonreactor nuclear facility that will process the Spent Nuclear Fuels (SNF) presently stored in the 105-KE and 105-KW SNF storage basins. Multi-canister overpacks (MCOs) will be loaded (filled) with K Basin fuel transported to the CVDF. The MCOs will be processed at the CVDF to remove free water from the fuel cells (packages). Following processing at the CVDF, the MCOs will be transported to the CSB for interim storage until a long-term storage solution can be implemented. This operation is expected to start in November 2000. A Fire Hazard Analysis (FHA) is required for all new facilities and all nonreactor nuclear facilities, in accordance with U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection. This FHA has been prepared in accordance with DOE 5480.7A and HNF-PRO-350, Fire Hazard Analysis Requirements. Additionally, requirements or criteria contained in DOE Richland Operations Office (RL) Implementing Directive (RLID) 5480.7, Fire Protection, or other DOE documentation are cite...

  3. Geomorphological analysis of sinkhole and landslide hazard in a karst area of the Venetian Prealps- Italy

    Science.gov (United States)

    Tiberi, Valentina

    2010-05-01

    In the piedmont area of the Asiago Plateau (Venetian Prealps, NE Italy), sinkholes and landslides in many cases represent a complex response to karst processes. Field survey showed that both soil and bedrock are involved, mainly represented by colluvial-alluvial sediments and carbonate rocks. Preliminary observations also reveal the key role of piping and cave-collapse phenomena and the importance of human remedial measures. Within the study area, these processes cause damage mainly to agricultural and pasture activities and expose people and farm animals to very high hazards. This work provides preliminary results of a geomorphological analysis carried out to define sinkhole and landslide hazard and its connections with karst processes. During the first phases of the research programme, an inventory of relevant phenomena was compiled using GIS technologies. The database has been constantly revised and enriched with new field measurements and thematic maps (i.e., geomorphological, geo-structural, hydrogeological, and cave development maps). Specifically, the field survey focused on the morphodynamic definition of instability elements, allowing a wide range of morphotypes (mainly with regard to sinkholes) and polygenic morphologies (i.e., mixed sinkhole-landslide configurations) to be recognized. The geomorphological analysis also revealed specific evolutionary trends of the instability processes; these could be usefully employed to program more effective mitigation strategies.

  4. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    Science.gov (United States)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short
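
    One concrete piece of the workflow described above is turning a synthetic catalogue into a frequency-magnitude distribution. The sketch below does this with the standard Aki maximum-likelihood b-value estimator on a randomly generated catalogue; it is not RSQSIM output, and the magnitudes and b-value are illustrative.

```python
# Hedged sketch: Gutenberg-Richter parameters estimated from a synthetic
# earthquake catalogue (random here, standing in for physics-based simulation
# output). Uses the Aki (1965) maximum-likelihood b-value estimator.
import numpy as np

rng = np.random.default_rng(1)
m_min = 2.0          # lower magnitude of interest (the abstract extends hazard down to ~M2)
b_true = 1.1
mags = m_min + rng.exponential(scale=1.0 / (b_true * np.log(10.0)), size=5000)

b_hat = 1.0 / (np.log(10.0) * (mags.mean() - m_min))      # Aki ML estimate
a_hat = np.log10(len(mags)) + b_hat * m_min               # so that N(>=m) = 10**(a - b*m)

predicted_m3 = 10 ** (a_hat - b_hat * 3.0)
print(f"b ~ {b_hat:.2f}")
print(f"events with M >= 3: predicted {predicted_m3:.0f}, observed {(mags >= 3.0).sum()}")
```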

  5. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    OpenAIRE

    2015-01-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and doc...

  6. Incorporating Climate Change Projections into a Hydrologic Hazard Analysis for Friant Dam

    Science.gov (United States)

    Holman, K. D.; Novembre, N.; Sankovich-Bahls, V.; England, J. F.

    2015-12-01

    The Bureau of Reclamation's Dam Safety Office has initiated a series of pilot studies focused on exploring potential impacts of climate change on hydrologic hazards at specific dam locations across the Western US. Friant Dam, located in Fresno, California, was chosen for study because the site had recently undergone a high-level hydrologic hazard analysis using the Stochastic Event Flood Model (SEFM). SEFM is a deterministic flood-event model that treats input parameters as variables, rather than fixed values. Monte Carlo sampling allows the hydrometeorological input parameters to vary according to observed relationships. In this study, we explore the potential impacts of climate change on the hydrologic hazard at Friant Dam using historical and climate-adjusted hydrometeorological inputs to the SEFM. Historical magnitude-frequency relationships of peak inflow and reservoir elevation were developed at Friant Dam for the baseline study using observed temperature and precipitation data between 1966 and 2011. Historical air temperatures, antecedent precipitation, mean annual precipitation, and the precipitation-frequency curve were adjusted for the climate change study using the delta method to create climate-adjusted hydrometeorological inputs. Historical and future climate projections are based on the Bias-Corrected Spatially-Disaggregated CMIP5 dataset (BCSD-CMIP5). The SEFM model was run thousands of times to produce magnitude-frequency relationships of peak reservoir inflow, inflow volume, and reservoir elevation, based on historical and climate-adjusted inputs. Results suggest that peak reservoir inflow and peak reservoir elevation increase (decrease) for all return periods under mean increases (decreases) in precipitation, independently of changes in surface air temperature.

  7. Microbiological quality of food in relation to hazard analysis systems and food hygiene training in UK catering and retail premises.

    Science.gov (United States)

    Little, C L; Lock, D; Barnes, J; Mitchell, R T

    2003-09-01

    A meta-analysis of eight UK food studies was carried out to determine the microbiological quality of food and its relationship with the presence in food businesses of hazard analysis systems and food hygiene training. Of the 19,022 premises visited to collect food samples in these studies between 1997 and 2002, two thirds (66%) were catering premises and one third (34%) were retail premises. Comparison with PHLS Microbiological Guidelines revealed that significantly more ready-to-eat food samples from catering premises (20%; 2,511/12,703) were of unsatisfactory or unacceptable microbiological quality compared to samples from retail premises (12%; 1,039/8,462). Unsatisfactory or unacceptable results were also more likely from premises where the manager had not received food hygiene training than from premises where the manager had received food hygiene training (11% retail, 19% catering), and from premises where there was no hazard analysis system in place compared to premises that had a documented hazard analysis system in place (10% retail, 18% catering). The poorer microbiological quality of samples from catering premises compared with those collected from retail premises may reflect differences in management food hygiene training and the presence of a hazard analysis system. The importance of adequate training for food handlers and their managers as a pre-requisite for effective hazard analysis and critical control point (HACCP) based controls is therefore emphasised.

  8. Regional analysis assessment of landslide hazard and zoning map for transmission line route selection using GIS

    Science.gov (United States)

    Baharuddin, I. N. Z.; Omar, R. C.; Usman, F.; Mejan, M. A.; Abd Halim, M. K.; Zainol, M. A.; Zulkarnain, M. S.

    2013-06-01

    The stability of the ground as a foundation for infrastructure development is always associated with geological and geomorphological aspects. Failure to carefully analyze these aspects may lead to ground instability such as subsidence and landslides, which can eventually cause catastrophe for the infrastructure, e.g., instability of transmission towers. However, in some cases, such as the study area, this is unavoidable. A GIS-based route analysis was favoured to perform optimal route prediction and selection by incorporating multiple influence factors, in particular a Landslide Hazard Map (LHM) produced on the basis of slope, aspect, land use and geological maps with the help of ArcGIS using the weighted overlay method, as sketched below. Based on the LHM, it is safe to conclude that the proposed route for the Ulu Jelai-Neggiri-Lebir-LILO transmission line has very low risk in terms of landslides.
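
    The weighted overlay step mentioned above reduces, numerically, to a weighted sum of reclassified factor rasters. The sketch below shows that operation on small NumPy arrays; the class values, weights and zone thresholds are invented for illustration, and a real workflow would read the slope, aspect, land use and geology layers from GIS data rather than hard-coding them.

```python
# Hedged sketch of a weighted-overlay landslide hazard index: each factor raster
# is already reclassified to a common 1 (low) .. 5 (high) scale, then combined
# with weights. All arrays, weights and thresholds are illustrative placeholders.
import numpy as np

slope    = np.array([[1, 2, 4], [2, 3, 5], [1, 1, 2]])
aspect   = np.array([[2, 2, 3], [1, 2, 4], [1, 2, 2]])
land_use = np.array([[1, 3, 4], [2, 3, 3], [1, 1, 2]])
geology  = np.array([[2, 2, 5], [2, 4, 5], [1, 2, 3]])

weights = {"slope": 0.4, "aspect": 0.1, "land_use": 0.2, "geology": 0.3}
assert abs(sum(weights.values()) - 1.0) < 1e-9

hazard_index = (weights["slope"] * slope + weights["aspect"] * aspect
                + weights["land_use"] * land_use + weights["geology"] * geology)

# Classify the continuous index into hazard zones (0 = very low .. 3 = high).
zones = np.digitize(hazard_index, bins=[2.0, 3.0, 4.0])
print(hazard_index.round(2))
print(zones)
```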

  9. A Sensitivity Study for an Evaluation of Input Parameters Effect on a Preliminary Probabilistic Tsunami Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Hyun-Me; Kim, Min Kyu; Choi, In-Kil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Dong-Hoon [Chonnam National University, Gwangju (Korea, Republic of)

    2014-10-15

    Tsunami hazard analysis has been based on seismic hazard analysis, which has been performed using both deterministic and probabilistic methods. To account for the uncertainties in the hazard analysis, the probabilistic method has been regarded as the more attractive approach. The various parameters and their weights are considered using the logic tree approach in the probabilistic method. Because many parameters are used in the hazard analysis, their uncertainties should be characterized through sensitivity analysis. To apply probabilistic tsunami hazard analysis, a preliminary study for the Ulchin NPP site had been performed. Information on the fault sources published by the Atomic Energy Society of Japan (AESJ) had been used in that preliminary study. The tsunami propagation was simulated using TSUNAMI 1.0, which was developed by the Japan Nuclear Energy Safety Organization (JNES). The wave parameters have been estimated from the results of the tsunami simulation. In this study, a sensitivity analysis for the fault sources selected in the previous studies has been performed. To analyze the effect of the parameters, a sensitivity analysis for the E3 fault source published by AESJ was performed. The effects of the recurrence interval, the potential maximum magnitude, and the beta value were indicated by the sensitivity analysis results. The level of annual exceedance probability is affected by the recurrence interval, while wave heights are influenced by the potential maximum magnitude and the beta value. In the future, a sensitivity analysis for all fault sources in the western part of Japan published by AESJ will be performed.

  10. Evaluation of the Potential of NASA Multi-satellite Precipitation Analysis in Global Landslide Hazard Assessment

    Science.gov (United States)

    Hong, Yang; Adler, Robert F.; Huffman, George J.

    2007-01-01

    Landslides are one of the most widespread natural hazards on Earth, responsible for thousands of deaths and billions of dollars in property damage every year. In the U.S. alone, landslides occur in every state, causing an estimated $2 billion in damage and 25-50 deaths each year. The annual average loss of life from landslide hazards in Japan is 170. The situation is much worse in developing countries and remote mountainous regions due to lack of financial resources and inadequate disaster management ability. Recently, a landslide buried an entire village on the Philippine island of Leyte on February 17, 2006, with at least 1,800 reported deaths and only 3 houses left standing of the original 300. Intense storms with high-intensity, long-duration rainfall have great potential to trigger rapidly moving landslides, resulting in casualties and property damage across the world. In recent years, through the availability of remotely sensed datasets, it has become possible to conduct global-scale landslide hazard assessment. This paper evaluates the potential of the real-time NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA) system to advance our understanding of and predictive ability for rainfall-triggered landslides. Early results show that landslide occurrences are closely associated with the spatial patterns and temporal distribution of rainfall characteristics. In particular, the number of landslide occurrences and the relative importance of rainfall in triggering landslides depend on rainfall attributes (e.g., rainfall climatology, antecedent rainfall accumulation, and intensity-duration of rainstorms). TMPA precipitation data are available in both real-time and post-real-time versions, which are useful for assessing the location and timing of rainfall-triggered landslide hazards by monitoring landslide-prone areas while they receive heavy rainfall. For the purpose of identifying rainfall-triggered landslides, an empirical global rainfall intensity
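
    Where the abstract refers to an empirical rainfall intensity-duration relationship, the usual form is a power-law threshold of the type I = alpha * D**beta: a storm whose mean intensity exceeds the threshold for its duration is flagged as potentially landslide-triggering. The sketch below implements that check with invented coefficients; they are not the values used with the TMPA data.

```python
# Hedged sketch of an empirical intensity-duration threshold check for
# rainfall-triggered landslides. ALPHA and BETA are illustrative placeholders,
# not calibrated values from the study.
ALPHA, BETA = 12.0, -0.6   # threshold intensity (mm/h) at D = 1 h, and exponent

def exceeds_threshold(rain_mm: float, duration_h: float) -> bool:
    """True if the storm's mean intensity exceeds the I = ALPHA * D**BETA threshold."""
    intensity = rain_mm / duration_h            # mean intensity, mm/h
    return intensity > ALPHA * duration_h ** BETA

storms = [(90.0, 6.0), (40.0, 24.0), (200.0, 48.0)]  # (accumulated rain in mm, duration in h)
for rain, dur in storms:
    flagged = exceeds_threshold(rain, dur)
    print(f"{rain:5.0f} mm over {dur:4.0f} h -> potential trigger: {flagged}")
```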

  11. [Epidemiologic aspects of a new approach to monitoring hygienic food handling using the hazard analysis critical control points (HACCP) system].

    Science.gov (United States)

    Matyás, Z

    1992-10-01

    The traditional control of food hygiene used hitherto has focused on assessing whether the controlled sanitary and technological practice is consistent with regulatory requirements, and sometimes also covers details of minor importance. To put it briefly, there are many check-up points in the course of the production process, but only some, or possibly only one, is a critical control point. Moreover, with periodic supervision the hygienist is able to record only the hygienic and technological state typical of the time of the control. Microbiological examination of final products can reveal only the negative consequences of microbial processes; it does not provide information on the conditions of contamination, nor ensure protection against it. For these and other reasons, the conclusion is reached that the traditional approach to hygiene supervision used hitherto is not fully effective and must be replaced by a more active approach focused on the control of factors threatening wholesomeness during the production process itself. The new approach to supervision of food hygiene is the HACCP system (hazard analysis critical control points). The system works rationally, as it is based on the analysis of systematically assembled data on the causes and conditions which gave rise to consumer illness from food products or meals. HACCP can be described as prompt, as health or quality problems are revealed immediately after they arise during production or processing and are eliminated immediately. The system is also comprehensive, as it comprises not only the basic technological process, including the processing or modification of ingredients, but also takes into account the handling of the given food product after production and, in particular, final culinary processing. The system can be applied to all pathogenic agents transmitted by foods to man, from bacteria and their toxins, viruses, parasites, moulds and mycotoxins, and biotoxins, to contaminants and radionuclides. The system

  12. Using of BEPU methodology in a final safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Menzel, Francine; Sabundjian, Gaiane, E-mail: fmenzel@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); D'Auria, Francesco, E-mail: f.dauria@ing.unipi.it [Universita degli Studi di Pisa, Gruppo di Ricerca Nucleare San Piero a Grado (GRNSPG), Pisa (Italy); Madeira, Alzira A., E-mail: alzira@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach, addressing the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to present the background of the licensing process through the main licensing requirements. (author)

  13. Adoption and Foster Care Analysis and Reporting System. Final rule.

    Science.gov (United States)

    2016-12-14

    The Social Security Act (the Act) requires that ACF regulate a national data collection system that provides comprehensive demographic and case-specific information on children who are in foster care and adopted. This final rule replaces existing Adoption and Foster Care Analysis and Reporting System (AFCARS) regulations and the appendices to require title IV-E agencies to collect and report data to ACF on children in out-of-home care, and who exit out-of-home care to adoption or legal guardianship, children in out-of-home care who are covered by the Indian Child Welfare Act, and children who are covered by a title IV-E adoption or guardianship assistance agreement.

  14. Urban Integrated Industrial Cogeneration Systems Analysis. Phase II final report

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    Through the Urban Integrated Industrial Cogeneration Systems Analysis (UIICSA), the City of Chicago embarked upon an ambitious effort to identify and measure the overall industrial cogeneration market in the city and to evaluate in detail the most promising market opportunities. This report discusses the background of the work completed during Phase II of the UIICSA and presents the results of economic feasibility studies conducted for three potential cogeneration sites in Chicago. Phase II focused on the feasibility of cogeneration at the three most promising sites: the Stockyards and Calumet industrial areas, and the Ford City commercial/industrial complex. Each feasibility case study considered the energy load requirements of the existing facilities at the site and the potential for attracting and serving new growth in the area. Alternative fuels and technologies, and ownership and financing options were also incorporated into the case studies. Finally, site-specific considerations such as development incentives, zoning and building code restrictions and environmental requirements were investigated.

  15. Health care system hazard vulnerability analysis: an assessment of all public hospitals in Abu Dhabi.

    Science.gov (United States)

    Fares, Saleh; Femino, Meg; Sayah, Assaad; Weiner, Debra L; Yim, Eugene Sun; Douthwright, Sheila; Molloy, Michael Sean; Irfan, Furqan B; Karkoukli, Mohamed Ali; Lipton, Robert; Burstein, Jonathan L; Mazrouei, Mariam Al; Ciottone, Gregory

    2014-04-01

    Hazard vulnerability analysis (HVA) is used to risk-stratify potential threats, measure the probability of those threats, and guide disaster preparedness. The primary objective of this project was to analyse the level of disaster preparedness in public hospitals in the Emirate of Abu Dhabi, utilising the HVA tool in collaboration with the Disaster Medicine Section at Harvard Medical School. The secondary objective was to review each facility's disaster plan and make recommendations based on the HVA findings. Based on the review, this article makes eight observations, including on the need for more accurate data; better hazard assessment capabilities; enhanced decontamination capacities; and the development of hospital-specific emergency management programmes, a hospital incident command system, and a centralised, dedicated regional disaster coordination centre. With this project, HVAs were conducted successfully for the first time in health care facilities in Abu Dhabi. This study thus serves as another successful example of multidisciplinary emergency preparedness processes. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.

  16. Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment

    Science.gov (United States)

    Catelli, J.; Nong, S.

    2014-12-01

    Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the United States Gulf of Mexico and East Coasts are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic US landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations were performed using spatial analysis in a geographic information system (GIS) at both the SLOSH basin and the synthetic event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to utilize extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach to moving raster data into NumPy array structures using the Python programming language. The data are then connected through a Python library to an outside statistical package like R to fit cell values to extreme value theory distributions and return values for specified recurrence intervals. While this is not a new process, the value behind this work is the ability to keep this process in a single geospatial environment and be able to easily replicate this process for other natural hazard applications and extreme event modeling.
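
    To make the per-cell statistical step concrete, the sketch below fits a GEV distribution to a set of annual (block) maxima of surge elevation at one grid cell and reads off the 1% and 0.2% annual-chance levels. It uses scipy for brevity, whereas the abstract describes handing the data to an external package such as R; the synthetic maxima are placeholders, not SLOSH results.

```python
# Hedged sketch: per-cell extreme value analysis of storm-surge maxima. A GEV is
# fitted to the cell's annual maxima and its quantiles give the 100-yr (1%) and
# 500-yr (0.2%) levels. The synthetic data stand in for SLOSH-derived maxima.
import numpy as np
from scipy.stats import genextreme

cell_maxima = genextreme.rvs(c=-0.1, loc=2.0, scale=0.5, size=200, random_state=42)  # metres

c, loc, scale = genextreme.fit(cell_maxima)

for annual_prob in (0.01, 0.002):   # 1% and 0.2% annual exceedance probabilities
    level = genextreme.ppf(1.0 - annual_prob, c, loc=loc, scale=scale)
    print(f"{1.0 / annual_prob:>5.0f}-yr surge elevation ~ {level:.2f} m")
```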

  17. Analysis of aerosol emission and hazard evaluation of electrical discharge machining (EDM) process.

    Science.gov (United States)

    Jose, Mathew; Sivapirakasam, S P; Surianarayanan, M

    2010-01-01

    The safety and environmental aspects of a manufacturing process are important due to increased environmental regulations and life quality. In this paper, the concentration of aerosols in the breathing zone of the operator of Electrical Discharge Machining (EDM), a commonly used non-traditional manufacturing process, is presented. The pattern of aerosol emissions from this process with varying process parameters such as peak current, pulse duration, dielectric flushing pressure and the level of dielectric was evaluated. Further, the HAZOP technique was employed to identify the inherent safety aspects and fire risk of the EDM process under different working conditions. The analysis of aerosol exposure showed that the concentration of aerosol increased with increases in the peak current, pulse duration and dielectric level, and decreased with increasing flushing pressure. It was also found that at higher values of peak current (7 A) and pulse duration (520 μs), the concentration of aerosols in the breathing zone of the operator was above the permissible exposure limit value for respirable particulates (5 mg/m³). The HAZOP study of the EDM process showed that this process is vulnerable to fire and explosion hazards. A detailed discussion on preventing the fire and explosion hazard is presented in this paper. The emission and risk of fire of the EDM process can be minimized by selecting proper process parameters and employing an appropriate control strategy.

  18. Final Hazard Categorization for the Remediation of the 118-D-1, 118-D-2, 118-D-3, 118-H-1, 118-H-2, and 118-H-3 Solid Waste Burial Grounds

    Energy Technology Data Exchange (ETDEWEB)

    T. J. Rodovsky

    2007-04-12

    This report presents the final hazard categorization (FHC) for the remediation of the 118-D-1, 118-D-2, and 118-D-3 Burial Grounds located within the 100-D/DR Area of the Hanford Site and the 118-H-1, 118-H-2, and 118-H-3 Burial Grounds located within the 100-H Area of the Hanford Site.

  19. Final Hazard Categorization for the Remediation of the 118-D-1, 118-D-2, 118-D-3, 118-H-1, 118-H-2 and 118-H-3 Solid Waste Burial Grounds

    Energy Technology Data Exchange (ETDEWEB)

    K. L. Vialetti

    2008-05-20

    This report presents the final hazard categorization for the remediation of the 118-D-1, 118-D-2, and 118-D-3 Burial Grounds located within the 100-D/DR Area of the Hanford Site and the 118-H-1, 118-H-2, and 118-H-3 Burial Grounds located within the 100-H Area of the Hanford Site.

  20. Final Hazard Categorization for the Remediation of the 118-D-1, 118-D-2, 118-D-3, 118-H-1, 118-H-2, and 118-H-3 Solid Waste Burial Grounds

    Energy Technology Data Exchange (ETDEWEB)

    T. J. Rodovsky

    2006-12-06

    This report presents the final hazard categorization (FHC) for the remediation of the 118-D-1, 118-D-2, and 118-D-3 Burial Grounds located within the 100-D/DR Area of the Hanford Site and the 118-H-1, 118-H-2, and 118-H-3 Burial Grounds located within the 100-H Area of the Hanford Site.

  1. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    Science.gov (United States)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with the goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews of the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork, in which participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with fieldwork for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at
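
    A hedged illustration of the H/V spectral ratio (HVSR) computation mentioned above; it is not taken from the Institute materials. It reduces the method to its core (spectra of the horizontal components divided by the vertical component) and omits the windowing, detrending and averaging over many time windows used in practice; the sampling rate and taper choice are assumptions.

```python
import numpy as np

def hvsr(north, east, vertical, fs):
    """Return frequencies and a simple H/V spectral ratio for one time window."""
    n = len(vertical)
    taper = np.hanning(n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec_n = np.abs(np.fft.rfft(north * taper))
    spec_e = np.abs(np.fft.rfft(east * taper))
    spec_v = np.abs(np.fft.rfft(vertical * taper))
    horizontal = np.sqrt(0.5 * (spec_n ** 2 + spec_e ** 2))  # quadratic mean of N and E
    return freqs[1:], horizontal[1:] / spec_v[1:]            # drop the zero-frequency bin

# Synthetic example: 60 s of three-component noise sampled at 100 Hz
fs = 100.0
rng = np.random.default_rng(1)
n, e, v = (rng.standard_normal(int(60 * fs)) for _ in range(3))
freqs, ratio = hvsr(n, e, v, fs)
print("Peak H/V at about", freqs[np.argmax(ratio)], "Hz")
```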

  2. The use of representative cases in hazard analysis of the tank waste remediation system at Hanford. The information in this document is a combination of HNF-SA-3168-A {ampersand} HNF-SA-3169-A - The control identification process

    Energy Technology Data Exchange (ETDEWEB)

    Niemi, B.J.

    1997-04-24

    During calendar year 1996, Duke Engineering and Services Hanford, Inc. conducted a safety analysis in accordance with DOE-STD-3009-94 as part of the development of a Final Safety Analysis Report (FSAR) for the Tank Waste Remediation System (TWRS) at the DOE Hanford site. The scope of the safety analysis of TWRS primarily addressed 177 large underground liquid waste storage tanks and the associated equipment for transferring waste to and from the tanks. The waste in the tanks was generated by the nuclear production and processing facilities at Hanford. The challenge facing the safety analysis team was to efficiently analyze the system within the time and budget allotted and to provide the necessary and sufficient information for accident selection, control identification, and justification of the acceptability of the level of safety of TWRS. It was clear from the start that a hazard and accident analysis for each of the 177 similar tanks and supporting equipment was neither practical nor necessary. For example, many of the tanks were similar enough that the results of the analysis of one tank would apply to many tanks. This led to the development and use of a tool called the "Hazard Topography". The use of the Hazard Topography assured that all tank operations and configurations were adequately assessed in the hazard analysis and that the results (e.g., hazard identification and control decisions) were appropriately applied to all tanks and associated systems. The TWRS Hazard Topography was a database of all the TWRS facilities (e.g., tanks, diversion boxes, transfer lines, and related facilities) along with data on their configuration, material at risk (MAR), hazards, and known safety-related phenomenological issues. Facilities were then classified into groups based on similar combinations of configuration, MAR, hazards and phenomena. A hazard evaluation was performed for a tank or facility in each group. The results of these evaluations, also contained in
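
    The grouping step behind the Hazard Topography can be pictured with a small sketch like the one below; the facility records, field names and values are hypothetical and are not drawn from the TWRS database.

```python
from collections import defaultdict

# Hypothetical facility records: configuration, material at risk (MAR) class and hazards
facilities = [
    {"id": "TK-101", "configuration": "single-shell tank", "mar": "high", "hazards": ("flammable gas",)},
    {"id": "TK-102", "configuration": "single-shell tank", "mar": "high", "hazards": ("flammable gas",)},
    {"id": "DB-01",  "configuration": "diversion box",     "mar": "low",  "hazards": ("spray leak",)},
]

# Group facilities that share the same combination of attributes; one representative
# hazard evaluation is then performed per group and applied to all members.
groups = defaultdict(list)
for fac in facilities:
    key = (fac["configuration"], fac["mar"], fac["hazards"])
    groups[key].append(fac["id"])

for key, members in groups.items():
    print(key, "->", members)
```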

  3. Off-Road Terrain Traversability Analysis and Hazard Avoidance for UGVs

    Science.gov (United States)

    2011-01-01

    vehicle to perform hazard detection and avoidance at speeds of up to 10 mph (4.5 m/s), as long as the hazards can be detected at sufficient ranges. The maximum ranges of hazard detection in this data set are provided in Table I (hazard detection ranges; the excerpt lists hazard features such as a steep slope at 115.1 and a steep hill). Figure 10 shows a Google sky-view image of the off-road course. Future work: we have noticed that even in off-road environments, there is usually some

  4. Grand Junction projects office mixed-waste treatment program, VAC*TRAX mobile treatment unit process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bloom, R.R.

    1996-04-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  5. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    Science.gov (United States)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA required at the design and permitting stage before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates, for input to management and mitigation strategies, using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity is the nuisance perceived in nearby communities from small, local felt earthquakes, which generally occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high
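
    To make the underlying hazard calculation concrete, the sketch below shows a generic, time-independent PSHA exceedance-rate computation: an assumed Gutenberg-Richter recurrence model for induced events is combined with a toy ground-motion relation and lognormal variability. All numerical values and the ground-motion relation are placeholders, and the time-dependence and catalog simulation described in the abstract are not represented.

```python
import numpy as np
from scipy.stats import norm

mags = np.arange(1.0, 4.05, 0.1)            # magnitude bins for induced events
a_value, b_value = 2.5, 1.0                 # hypothetical Gutenberg-Richter parameters
cum_rates = 10 ** (a_value - b_value * mags)
incr_rates = cum_rates[:-1] - cum_rates[1:] # annual rate per magnitude bin
mag_centers = 0.5 * (mags[:-1] + mags[1:])

distance_km = 3.0                           # injection zone to site (assumed)
sigma_ln = 0.6                              # ground-motion variability, natural log units
target_pga = 0.05                           # ground-motion level of interest, g

def median_pga(m, r_km):
    """Toy ground-motion relation; stands in for a real GMPE."""
    return 10 ** (-1.5 + 0.5 * m - 1.3 * np.log10(r_km + 1.0))

# P(PGA > target | m, r) with lognormal ground-motion variability
prob_exceed = 1.0 - norm.cdf(
    (np.log(target_pga) - np.log(median_pga(mag_centers, distance_km))) / sigma_ln)

annual_rate = np.sum(incr_rates * prob_exceed)
print(f"Annual rate of PGA > {target_pga} g: {annual_rate:.4f}")
```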

  6. A regional analysis of elements at risk exposed to mountain hazards in the Eastern European Alps

    Science.gov (United States)

    Fuchs, Sven; Zischg, Andreas

    2014-05-01

    We present a method to quantify the number and value of buildings exposed to torrents and snow avalanches in the Austrian Alps, as well as the number of exposed people. Based on a unique population and building register dataset, a relational SQL database was developed that, in combination with GIS data, allows a rule-based, nation-wide automated analysis. Furthermore, possibilities and challenges are discussed with respect to the use of such data in vulnerability assessment and with respect to resilience measures. We comprehensively address the challenge of data accuracy, scale and uncertainties. Of the total of approximately 2.4 million buildings with a clearly attributable geographical location, around 120,000 are exposed to torrent processes (5 %) and snow avalanches (0.4 %); exposure was defined here as being located within the digitally available hazard maps of the Austrian Torrent and Avalanche Control Service. Around 5 % of the population (360,000 out of 8.5 million inhabitants), based on those people compulsorily listed in the population register, are located in these areas. The analysis by building category resulted in 2.05 million residential buildings in Austria (85 %), 93,000 of which (4.5 %) are exposed to these hazards. In contrast, 37,300 buildings (1.6 %) throughout the country belong to the category of accommodation facilities, 5,600 of which are exposed (15 %). Out of the 140,500 commercial buildings, 8,000 (5 %) are exposed. A considerable spatial variation was detectable within the communities and Federal States. In general, an above-average exposure of buildings to torrent processes and snow avalanches was detectable in communities located in the Federal States of Salzburg, Styria and Vorarlberg (torrents), and Tyrol and Vorarlberg (snow avalanches). In the alpine part of Austria, the share of exposed accommodation buildings was two times (Salzburg) and three times (Vorarlberg) higher than the regional average of exposed buildings
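
    A hedged sketch of the exposure overlay that underlies figures like those above: buildings are counted as exposed when they fall inside the digital hazard maps. The file names, column names and use of GeoPandas are assumptions for illustration; the study itself combined a relational SQL database with GIS layers.

```python
import geopandas as gpd

buildings = gpd.read_file("building_register.gpkg")        # one point per building (assumed)
hazard_zones = gpd.read_file("torrent_hazard_zones.gpkg")  # digital hazard-map polygons (assumed)

# Spatial join: keep buildings located within any hazard polygon
exposed = gpd.sjoin(buildings, hazard_zones, predicate="within", how="inner")

total = len(buildings)
n_exposed = exposed.index.unique().size
print(f"{n_exposed} of {total} buildings ({100 * n_exposed / total:.1f} %) exposed")

# Breakdown by building category (column name assumed)
print(exposed.groupby("building_category").size())
```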

  7. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    Energy Technology Data Exchange (ETDEWEB)

    Tang, A; Samost, A [Massachusetts Institute of Technology, Cambridge, Massachusetts (United States); Viswanathan, A; Cormack, R; Damato, A [Dana-Farber Cancer Institute - Brigham and Women’s Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology are demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers, 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in
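
    The core STPA bookkeeping can be illustrated with a small sketch: each control action is screened against the four standard STPA guide phrases to enumerate candidate unsafe control actions (UCAs), which are then reviewed in context. The controllers and actions below are hypothetical examples, not the ten controllers of this study.

```python
from dataclasses import dataclass

GUIDE_PHRASES = (
    "not providing causes hazard",
    "providing causes hazard",
    "too early / too late / wrong order",
    "stopped too soon / applied too long",
)

@dataclass
class ControlAction:
    controller: str
    action: str

actions = [
    ControlAction("physician", "approve treatment plan"),
    ControlAction("physicist", "transfer plan to afterloader"),
    ControlAction("afterloader", "start source delivery"),
]

# Each candidate is reviewed manually to decide whether it is actually unsafe
# in context and which causal scenarios could produce it.
for ca in actions:
    for phrase in GUIDE_PHRASES:
        print(f"UCA candidate: [{ca.controller}] '{ca.action}' - {phrase}")
```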

  8. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    Science.gov (United States)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams, etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS is developed based on open source software and follows open standards for code, data formats and service interfaces. The architecture of the system is modular: the various parts are loosely coupled, extensible, interoperable through standards, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  9. Explosive Potential Analysis of AB Process-Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, J.S.; Giles, G.E. jr.; Wendel, M.W.; Sulfredge, C.D.

    2001-10-12

    A need arose to define the hazards associated with the operation of a process. The process involved the evolution of a hydrogen gas stream from thermal decomposition of uranium hydride at approximately 400 °C into the interior of a purged, argon-filled glove box. Specific hazards of interest included the potential reaction severity of the evolved hydrogen with atmospheric oxygen, either downstream in the vent system or inside the box in the event of serious air inleakage. Another potential hazard was the energetic reaction of inleaked air with the hot uranium and uranium hydride powder bed, possibly resulting in the dispersion of powders into an air atmosphere and the rapid combustion of the powders. This was approached as a problem in calculational simulation. Given the parameters associated with the process and the properties of the glove box system, certain scenarios were defined and the potential for flammable or detonation reactions estimated. Calculation tools included a comprehensive fluid dynamics code, a spreadsheet, a curve-fitting program, an equation solver, and a thermochemistry software package. Results are reported which suggest that the process can be operated without significant hazard to operators or significant damage to equipment, provided that operators take account of potential upset scenarios.
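
    A back-of-the-envelope sketch of the kind of screening such an analysis starts from (the report itself used CFD and thermochemistry codes): estimate the steady-state hydrogen fraction in the purged box and in the air-diluted vent stream, and compare the latter with the roughly 4 vol% lower flammability limit of hydrogen in air. All flow rates and the box volume are hypothetical.

```python
h2_flow = 0.5          # L/min, hydrogen evolved from hydride decomposition (assumed)
argon_purge = 20.0     # L/min, argon purge through the glove box (assumed)
dilution_air = 100.0   # L/min, air added downstream in the vent system (assumed)
box_volume = 1500.0    # L, glove box free volume (assumed)
LFL_H2 = 0.04          # approximate lower flammability limit of H2 in air, volume fraction

# Steady-state H2 fraction inside the (oxygen-free) box and in the vent stream
c_box = h2_flow / (h2_flow + argon_purge)
c_vent = h2_flow / (h2_flow + argon_purge + dilution_air)

# Time constant for a well-mixed box to approach steady state
tau_min = box_volume / (h2_flow + argon_purge)

print(f"Box H2 fraction at steady state:  {c_box:.3%}")
print(f"Vent H2 fraction after dilution:  {c_vent:.3%} "
      f"({'below' if c_vent < LFL_H2 else 'at or above'} the ~4% LFL)")
print(f"Well-mixed box time constant:     {tau_min:.0f} min")
```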

  10. Analysis and evaluation of "noise" of occupational hazards in pumped storage power station

    Science.gov (United States)

    Zhao, Xin; Yang, Hongjian; Zhang, Huafei; Chen, Tao

    2017-05-01

    Aiming at the influence of occupational noise hazards on the physical health of workers, the noise intensity in the working areas of a hydropower station in China was evaluated comprehensively. Under power generation conditions, noise measurements were conducted in the main patrol areas of the operator, and the noise samples from different regions were analyzed and processed using single-factor analysis of variance. The results show that noise intensity differs significantly between working areas: the turbine floor has the highest overall noise level, which exceeds the national standard, so protection measures there need to be strengthened, while the noise intensity in the remaining areas is normal.

  11. Expressed breast milk on a neonatal unit: a hazard analysis and critical control points approach.

    Science.gov (United States)

    Cossey, Veerle; Jeurissen, Axel; Thelissen, Marie-José; Vanhole, Chris; Schuermans, Annette

    2011-12-01

    With the increasing use of human milk and growing evidence of the benefits of mother's milk for preterm and ill newborns, guidelines to ensure its quality and safety are an important part of daily practice in neonatal intensive care units. Operating procedures based on hazard analysis and critical control points can standardize the handling of mother's expressed milk, thereby improving nutrition and minimizing the risk of breast milk-induced infection in susceptible newborns. Because breast milk is not sterile, microorganisms can multiply when the milk is not handled properly. Additional exogenous contamination should be prevented. Strict hygiene and careful temperature and time control are important during the expression, collection, transport, storage, and feeding of maternal milk. In contrast to formula milk, no legal standards exist for the use of expressed maternal milk. The need for additional measures, such as bacteriological screening or heat treatment, remains unresolved.

  12. Developing Sustainable Modeling Software and Necessary Data Repository for Volcanic Hazard Analysis -- Some Lessons Learnt

    Science.gov (United States)

    Patra, A. K.; Connor, C.; Webley, P.; Jones, M.; Charbonnier, S. J.; Connor, L.; Gallo, S.; Bursik, M. I.; Valentine, G.; Hughes, C. G.; Aghakhani, H.; Renschler, C. S.; Kosar, T.

    2014-12-01

    We report here on an effort to improve the sustainability, robustness and usability of the core modeling and simulation tools housed in the collaboratory VHub.org and used in the study of complex volcanic behavior. In particular, we focus on tools that support large-scale mass flows (TITAN2D), ash deposition/transport and dispersal (Tephra2 and PUFF), and lava flows (Lava2). These tools have become very popular in the community, especially due to the availability of an online usage modality. Redevelopment of the tools to take advantage of new hardware and software advances was a primary thrust of the effort. However, as work started we reoriented the effort to also take advantage of significant new opportunities for supporting the complex workflows and use of distributed data resources that will enable effective and efficient hazard analysis.

  13. Assessment of hygiene standards and Hazard Analysis Critical Control Points implementation on passenger ships.

    Science.gov (United States)

    Mouchtouri, Varavara; Malissiova, Eleni; Zisis, Panagiotis; Paparizou, Evina; Hadjichristodoulou, Christos

    2013-01-01

    The level of hygiene on ferries can have an impact on travellers' health. The aim of this study was to assess the hygiene standards of ferries in Greece and to investigate whether Hazard Analysis Critical Control Points (HACCP) implementation contributes to the hygiene status, and particularly food safety, aboard passenger ships. Hygiene inspections on 17 ferries in Greece were performed using a standardized inspection form with a 135-point scale. Thirty-four water and 17 food samples were collected and analysed. About 65% (11/17) of ferries were scored with >100 points. Ferries with HACCP received higher scores during inspection compared to those without HACCP, suggesting that HACCP implementation contributes to hygiene aboard passenger ships.

  14. [The Hazard Analysis Critical Control Point approach (HACCP) in meat production].

    Science.gov (United States)

    Berends, B R; Snijders, J M

    1994-06-15

    The Hazard Analysis Critical Control Point (HACCP) approach is a method that could transform the current system of safety and quality assurance of meat into a really effective and flexible integrated control system. This article discusses the origin and the basic principles of the HACCP approach. It also discusses why the implementation of the approach is not as widespread as might be expected. It is concluded that a future implementation of the approach in the entire chain of meat production, i.e. from conception to consumption, is possible. Prerequisites are, however, that scientifically validated risk analyses become available, that future legislation forms a framework that actively supports the approach, and that all parties involved in meat production not only become convinced of the advantages, but also are trained to implement the HACCP approach with insight.

  15. Pathogen Reduction and Hazard Analysis and Critical Control Point (HACCP) systems for meat and poultry. USDA.

    Science.gov (United States)

    Hogue, A T; White, P L; Heminover, J A

    1998-03-01

    The United States Department of Agriculture (USDA) Food Safety Inspection Service (FSIS) adopted Hazard Analysis and Critical Control Point Systems and established finished product standards for Salmonella in slaughter plants to improve food safety for meat and poultry. In order to make significant improvements in food safety, measures must be taken at all points in the farm-to-table chain including production, transportation, slaughter, processing, storage, retail, and food preparation. Since pathogens can be introduced or multiplied anywhere along the continuum, success depends on consideration and comparison of intervention measures throughout the continuum. Food animal and public health veterinarians can create the necessary preventative environment that mitigates risks for food borne pathogen contamination.

  16. [Monitoring of a HACCP (Hazard Analysis Critical Control Point) plan for Listeria monocytogenes control].

    Science.gov (United States)

    Mengoni, G B; Apraiz, P M

    2003-01-01

    The monitoring of a HACCP (Hazard Analysis Critical Control Point) plan for Listeria monocytogenes control in the cooked and frozen meat section of a thermo-processing meat plant was evaluated. Seventy "non-product-contact" surface samples and fourteen finished product samples were examined. Thirty-eight sites were positive for the presence of Listeria sp. Twenty-two isolates were identified as L. monocytogenes, two as L. seeligeri and fourteen as L. innocua. No isolates were obtained from finished product samples. The detection of L. monocytogenes in the environment of the cooked and frozen meat section showed the need for the HACCP plan to eliminate or prevent product contamination in the post-thermal step.

  17. Validation of acid washes as critical control points in hazard analysis and critical control point systems.

    Science.gov (United States)

    Dormedy, E S; Brashears, M M; Cutter, C N; Burson, D E

    2000-12-01

    A 2% lactic acid wash used in a large meat-processing facility was validated as an effective critical control point (CCP) in a hazard analysis and critical control point (HACCP) plan. We examined the microbial profiles of beef carcasses before the acid wash, beef carcasses immediately after the acid wash, beef carcasses 24 h after the acid wash, beef subprimal cuts from the acid-washed carcasses, and ground beef made from acid-washed carcasses. Total mesophilic, psychrotrophic, coliform, generic Escherichia coli, lactic acid bacteria, pseudomonad, and acid-tolerant microorganism counts were enumerated on all samples. The presence of Salmonella spp. was also determined. Acid washing significantly reduced all counts except for pseudomonads, which were present at very low numbers before acid washing. All other counts remained significantly lower after washing. Acid washes can thus serve as effective CCPs in HACCP plans and can significantly reduce the total number of microorganisms present on the carcass and during further processing.

  18. 230Th/U ages Supporting Hanford Site‐Wide Probabilistic Seismic Hazard Analysis

    Science.gov (United States)

    Paces, James B.

    2014-01-01

    This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages, from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  19. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Paces, James B. [U.S. Geological Survey

    2014-08-31

    This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages, from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  20. IMPORTANCE OF APPLICATION OF HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) IN MONTENEGRO TOURISM

    Directory of Open Access Journals (Sweden)

    Vesna Vujacic

    2014-01-01

    Tourism is Montenegro's leading economic sector, and food, as a culinary product, is an important element of the tourist offer. With the development of tourism in Montenegro there is a need to provide high-quality as well as safe and healthy food according to international standards. This paper presents the HACCP concept and the importance of its application in the tourism and hospitality industry. HACCP is a food safety management system based on the analysis and control of biological, chemical and physical hazards throughout the entire process, from raw material production, procurement and handling to manufacturing, distribution and consumption of the finished product. HACCP is designed to act preventively and its principles present the most effective solution for providing safe and healthy food. The aim of this paper is to present the importance of applying the HACCP concept in Montenegrin tourism as a recognizable and accepted international standard.

  1. Prediction of gas pressurization and hydrogen generation for shipping hazard analysis: Six unstabilized PuO2 samples

    Energy Technology Data Exchange (ETDEWEB)

    Moody, E. W. (Eddie W.); Veirs, D. K. (Douglas Kirk); Lyman, J. L. (John L.)

    2001-01-01

    Radiolysis of water to form hydrogen gas is a safety concern for the safe storage and transport of plutonium-bearing materials. Hydrogen gas is considered a safety hazard if its concentration in the container exceeds five percent by volume (DOE Docket No. 00-1 1-9965). Unfortunately, water cannot be entirely avoided in a processing environment, and these samples inherently contain a range of water contents. Thermodynamic, chemical, and radiolysis modeling was used to predict gas generation and changes in gas composition as a function of time within sealed containers holding plutonium-bearing materials. The results are used in support of the safety analysis for shipping six unstabilized (i.e. uncalcined) samples from the Rocky Flats Environmental Technology Site (RFETS) to the Material Identification and Surveillance (MIS) program at Los Alamos National Laboratory (LANL). The intent of this work is to establish a time window in which safe shipping can occur.
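
    The sketch below shows, in simplified form, the kind of estimate such modeling supports: hydrogen production from an assumed radiolytic G-value and the decay energy absorbed by adsorbed water, followed by an ideal-gas estimate of the time to reach 5 vol% hydrogen in the container headspace. Every numeric input is a hypothetical placeholder, and the actual analysis also treated chemistry and thermodynamics not shown here.

```python
# --- Hypothetical container and material parameters ---
specific_power_w_per_g = 2.3e-3     # W/g, decay heat of the isotopic mix (assumed)
oxide_mass_g = 500.0                # g of plutonium-bearing material (assumed)
water_energy_fraction = 0.01        # fraction of decay energy absorbed in water (assumed)
g_value_h2 = 0.05                   # molecules H2 per 100 eV absorbed (assumed)
free_volume_l = 2.0                 # container free gas volume, L (assumed)
temperature_k = 298.0
initial_pressure_pa = 101325.0      # sealed at 1 atm of fill gas

EV_PER_J = 1.0 / 1.602e-19
AVOGADRO = 6.022e23
R = 8.314                           # J/(mol K)

# Hydrogen generation rate from radiolysis
power_w = specific_power_w_per_g * oxide_mass_g
ev_per_s_in_water = power_w * water_energy_fraction * EV_PER_J
mol_h2_per_s = g_value_h2 * ev_per_s_in_water / 100.0 / AVOGADRO

# Moles of H2 that bring the mole (volume) fraction to 5%
v_m3 = free_volume_l * 1e-3
n_initial = initial_pressure_pa * v_m3 / (R * temperature_k)
n_h2_limit = 0.05 * n_initial / (1.0 - 0.05)   # x / (x + n_initial) = 0.05

days_to_limit = n_h2_limit / mol_h2_per_s / 86400.0
print(f"H2 generation rate: {mol_h2_per_s:.3e} mol/s")
print(f"Time to reach 5 vol% H2: {days_to_limit:.0f} days (under these assumptions)")
```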

  2. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  3. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  4. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    Science.gov (United States)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are one of the key factors in a country's economic growth. Inadequate infrastructure networks can be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can also trigger other natural hazards such as landslides and debris flows. Disruptions caused by landslides are similar to those caused by floods and increase maintenance costs considerably. The effect of natural disasters on society is likely to increase under a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards; determining susceptible areas and incorporating them in the decision process may reduce infrastructural harm. Spatial multi-criteria analysis (SMCA) is a branch of decision analysis that provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least-cost-path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historic incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.
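
    As an illustration of the SMCA step, the sketch below combines standardized criterion rasters with expert-judged weights into a susceptibility surface and perturbs one weight as a crude sensitivity test. The layers, weights and 0-1 rescaling are hypothetical stand-ins for the hydrological, geological and land-use factors used in the study.

```python
import numpy as np

# Hypothetical criterion rasters on a common grid, oriented so that higher = more susceptible
rng = np.random.default_rng(7)
criteria = {
    "slope": rng.random((200, 200)),
    "soil_wetness": rng.random((200, 200)),
    "land_use": rng.random((200, 200)),
}
weights = {"slope": 0.5, "soil_wetness": 0.3, "land_use": 0.2}   # expert judgement (assumed)

def standardize(arr):
    """Rescale a criterion to 0-1 so the weights are comparable."""
    return (arr - arr.min()) / (arr.max() - arr.min())

susceptibility = sum(w * standardize(criteria[name]) for name, w in weights.items())

# Crude sensitivity test: perturb the slope weight, renormalize, and compare maps
perturbed = dict(weights, slope=0.6)
total = sum(perturbed.values())
perturbed = {k: v / total for k, v in perturbed.items()}
alternative = sum(w * standardize(criteria[name]) for name, w in perturbed.items())

print("Mean absolute change in susceptibility:", float(np.abs(susceptibility - alternative).mean()))
```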

  5. GIS-supported geomorphological landslide hazard analysis in the Lainbach catchment, Upper Bavaria

    Science.gov (United States)

    Trau, J.; Ergenzinger, P.

    2003-04-01

    The Lainbach basin is located at the fringe of the Northern Limestone Alps. Predominant mass movements such as translational and rotational slides as well as debris flows are mainly linked to glacial deposits (Pleistocene valley fill) and Flysch series covering approximately 50% of the basin. The pre-Pleistocene relief is buried beneath up to 170 m of till and glacio-limnic and glacio-fluvial sediments. The spatial and temporal distributions of mass movements are coupled with different stages of fluvial incision. Recent fluvial processes are mainly bedrock controlled in the lower reaches. A special geomorphological map at a scale of 1:10,000 illustrates the relief evolution. In addition, the map focuses on past and recent process forms related to mass movements, so that areas of active and inactive mass movements can be easily distinguished. Zones of activity and the hazard potential can be deduced from the map. Hazard assessment is supported by GIS modelling, DEM analysis, multi-temporal time series analysis and aerial photo interpretation. Geophysical soundings are important for detailed site-specific information such as shear planes and sediment thickness. A GIS model based on the parameters geology, topography (slope angle, curvature), thickness of loosely consolidated material, vegetation and hydrology (proximity to the receiving stream) was developed. Calculation of failure rates allows a specific value to be assigned to each parameter class indicating its role in the mass movement process. About 90% of the mapped mass movements were correctly classified by the model. Although the overall match seems to be quite good, there are some localities where the modelled and the mapped results differ significantly. In the future, the mapped results should be considered together with further “expert knowledge” for an improvement of the GIS model.
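
    A hedged sketch of the weights-of-evidence style calculation (W+ and W-) implied by assigning a value to each parameter class: the pixel counts below are hypothetical and would in practice come from overlaying the mapped mass movements with each parameter-class raster in the GIS.

```python
import math

# Hypothetical pixel counts for one parameter class (e.g., one slope-angle class)
n_class_slide = 120    # pixels in the class containing mapped mass movements
n_class_stable = 880   # pixels in the class without mass movements
n_out_slide = 380      # pixels outside the class with mass movements
n_out_stable = 8620    # pixels outside the class without mass movements

total_slide = n_class_slide + n_out_slide
total_stable = n_class_stable + n_out_stable

# W+ = ln[ P(class | slide) / P(class | stable) ]
w_plus = math.log((n_class_slide / total_slide) / (n_class_stable / total_stable))

# W- = ln[ P(outside class | slide) / P(outside class | stable) ]
w_minus = math.log((n_out_slide / total_slide) / (n_out_stable / total_stable))

contrast = w_plus - w_minus   # overall strength of the spatial association
print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, contrast C = {contrast:.2f}")
```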

  6. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disputes over the selection of ground motion for design at a given site. In order to review the present state of the art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state of the art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation of the state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.

  7. Rapid, reliable geodetic data analysis for hazard response: Results from the Advanced Rapid Imaging and Analysis (ARIA) project

    Science.gov (United States)

    Owen, S. E.; Simons, M.; Hua, H.; Yun, S.; Cruz, J.; Webb, F.; Rosen, P. A.; Fielding, E. J.; Moore, A. W.; Polet, J.; Liu, Z.; Agram, P. S.; Lundgren, P.

    2013-12-01

    ARIA is a joint JPL/Caltech coordinated project to automate InSAR and GPS imaging capabilities for scientific understanding, hazard response, and societal benefit. Geodetic imaging's unique ability to capture surface deformation at high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with earthquakes in high spatial and temporal detail. In certain cases, it can be complementary to seismic data, providing constraints on location, geometry, or magnitude that are difficult to determine with seismic data alone. In addition, remote sensing with SAR provides change detection and damage assessment capabilities for earthquakes, floods and other disasters, and can image even at night or through clouds. We have built an end-to-end prototype geodetic imaging data system that forms the foundation for a hazard response and science analysis capability integrating InSAR, high-rate GPS, seismology, and modeling to deliver monitoring, science, and situational awareness products. This prototype incorporates state-of-the-art InSAR and GPS analysis algorithms from technologists and scientists. The products have been designed, and a feasibility study conducted, in collaboration with USGS scientists in the earthquake and volcano science programs. We will present results that show the capabilities of this data system in terms of latency, data processing capacity, quality of automated products, and feasibility of use for analysis of large SAR and GPS data sets and for earthquake response activities.

  8. A proxy analysis of urban air quality hazards in Bergen, Norway under a changing climate.

    Science.gov (United States)

    Wolf, Tobias; Esau, Igor; Reuder, Joachim

    2014-05-01

    The urban air quality in Bergen, Norway is characterized by clean air throughout most of the year, interrupted by short episodes of hazardous pollution levels, especially in close proximity to major road-emission sources. These pollution episodes are linked to wintertime anti-cyclonic weather conditions with persistent stable temperature stratification (inversions) in the atmospheric boundary layer. Although the pollution episodes are local events, the high pollution episodes are linked to large-scale persistent blockings in the atmospheric circulation. Here we present an atmospheric circulation proxy for the pollution episodes based on the ECMWF ERA-Interim reanalysis. The proxy is based on local 3-hourly instantaneous wind speeds and directions at the 1000 hPa pressure level, and on 1-day running mean temperature deviations at 2 m above ground from the 1-day running mean temperatures averaged over the full ERA-Interim record length. We tuned the thresholds for each quantity to the occurrence of events with an hourly mean NO2 concentration > 150 μg/m3 at a high-pollution reference station. A condition on cloud cover had only little effect; sea-level pressure was not applicable. High pollution episodes predicted during typical low-traffic days (Sundays, Christmas, New Year) were removed. The final proxy had a detection rate of 82 %, a false alarm rate of 77 % and a correct null prediction rate of 96 %. The high false alarm rate was expected because of the relaxed thresholds chosen in order to include a large fraction of possible states of atmospheric circulation that lead to hazardous air quality. Additionally, the false alarm rate was high because no constraint on the persistence of adverse meteorological conditions was set, and because the high variability of traffic does not always lead to hazardous pollution levels even if the atmospheric circulation would allow for it. The Scandinavian index, an often used proxy for the occurrence of atmospheric circulation
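
    A small, hedged illustration of how verification scores like those quoted above can be computed from a 2x2 contingency table of proxy predictions versus observed high-pollution days. The boolean series are synthetic, and the score definitions (false alarm rate as the fraction of alarms that verify false, correct null prediction rate as the fraction of quiet predictions that verify quiet) are assumptions about the paper's usage.

```python
import numpy as np

rng = np.random.default_rng(0)
observed = rng.random(1000) < 0.05                       # observed high-pollution days (synthetic)
missed = rng.random(1000) < 0.2
spurious = rng.random(1000) < 0.15
predicted = (observed & ~missed) | spurious              # a deliberately relaxed proxy (synthetic)

hits = np.sum(predicted & observed)
misses = np.sum(~predicted & observed)
false_alarms = np.sum(predicted & ~observed)
correct_nulls = np.sum(~predicted & ~observed)

detection_rate = hits / (hits + misses)                  # fraction of events that were predicted
false_alarm_rate = false_alarms / (hits + false_alarms)  # fraction of alarms that were false
correct_null_rate = correct_nulls / (correct_nulls + false_alarms)

print(f"Detection rate:               {detection_rate:.0%}")
print(f"False alarm rate:             {false_alarm_rate:.0%}")
print(f"Correct null prediction rate: {correct_null_rate:.0%}")
```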

  9. An OSHA based approach to safety analysis for nonradiological hazardous materials

    Energy Technology Data Exchange (ETDEWEB)

    Yurconic, M.

    1992-08-01

    The PNL method for chemical hazard classification defines major hazards by means of a list of hazardous substances (or chemical groups) with associated trigger quantities. In addition, the functional characteristics of the facility being classified are also factored into the classification. In this way, installations defined as major hazards will only be those which have the potential for causing very serious incidents both on and off site. Because of the diversity of operations involving chemicals, it may not be possible to restrict major hazard facilities to certain types of operations. However, this hazard classification method recognizes that in the industrial sector major hazards are most commonly associated with activities involving very large quantities of chemicals and inherently energetic processes. These include operations such as petrochemical plants, chemical production, LPG storage, explosives manufacturing, and facilities which use chlorine, ammonia, or other highly toxic gases in bulk quantities. The basis for this methodology is derived from concepts used by OSHA in its proposed chemical process safety standard, the Dow Fire and Explosion Index Hazard Classification Guide, and the International Labor Office's program on chemical safety. For the purpose of identifying major hazard facilities, this method uses two sorting criteria, (1) facility function and processes and (2) quantity of substances, to identify facilities requiring classification. Then, a measure of chemical energy potential (material factor) is used to identify high hazard class facilities.

  10. An OSHA based approach to safety analysis for nonradiological hazardous materials

    Energy Technology Data Exchange (ETDEWEB)

    Yurconic, M.

    1992-08-01

    The PNL method for chemical hazard classification defines major hazards by means of a list of hazardous substances (or chemical groups) with associated trigger quantities. In addition, the functional characteristics of the facility being classified are also factored into the classification. In this way, installations defined as major hazards will only be those which have the potential for causing very serious incidents both on and off site. Because of the diversity of operations involving chemicals, it may not be possible to restrict major hazard facilities to certain types of operations. However, this hazard classification method recognizes that in the industrial sector major hazards are most commonly associated with activities involving very large quantities of chemicals and inherently energetic processes. These include operations such as petrochemical plants, chemical production, LPG storage, explosives manufacturing, and facilities which use chlorine, ammonia, or other highly toxic gases in bulk quantities. The basis for this methodology is derived from concepts used by OSHA in its proposed chemical process safety standard, the Dow Fire and Explosion Index Hazard Classification Guide, and the International Labor Office's program on chemical safety. For the purpose of identifying major hazard facilities, this method uses two sorting criteria, (1) facility function and processes and (2) quantity of substances, to identify facilities requiring classification. Then, a measure of chemical energy potential (material factor) is used to identify high hazard class facilities.

  11. Hazardous Waste

    Science.gov (United States)

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  12. Analysis of Member State RED implementation. Final Report (Task 2)

    Energy Technology Data Exchange (ETDEWEB)

    Peters, D.; Alberici, S.; Toop, G. [Ecofys, Utrecht (Netherlands); Kretschmer, B. [Institute for European Environmental Policy IEEP, London (United Kingdom)

    2012-12-15

    This report describes the way EU Member States have transposed the sustainability and chain of custody requirements for biofuels as laid down in the Renewable Energy Directive (RED) and Fuel Quality Directive (FQD). In the assessment of Member States' implementation, the report mainly focuses on effectiveness and administrative burden. Have Member States transposed the Directives in such a way that compliance with the sustainability criteria can be ensured as effectively as possible? To what extent does the Member States' implementation lead to unnecessary administrative burden for economic operators in the (bio)fuel supply chain? The report focuses specifically on the transposition of the sustainability and chain of custody requirements, not on the target for renewables in transport. This means that, for example, the double counting provision is not included in the scope of this report. The report starts with an introduction covering the implementation of the Renewable Energy (and Fuel Quality) Directive into national legislation, the methodology by which Member States were assessed for effectiveness and administrative burden, and the categorisation of Member States' national systems for RED implementation (Chapter 1). The report continues with a high-level description of each Member State system assessed (Chapter 2). Following this, the report analyses the Member States with respect to the effectiveness and administrative burden of a number of key ('major') measures (Chapter 3). The final chapter presents the conclusions and recommendations (Chapter 4).

  13. Fast Flux Test Facility final safety analysis report. Amendment 72

    Energy Technology Data Exchange (ETDEWEB)

    Gantt, D. A.

    1992-08-01

    This document provides the Final Safety Analysis Report (FSAR) Amendment 72 for incorporation into the Fast Flux Test Facility (FFTF) FSAR set. This amendment change incorporates Engineering Change Notices issued subsequent to Amendment 71 and approved for incorporation before June 24, 1992. These include changes in: Chapter 2, Site Characteristics; Chapter 3, Design Criteria Structures, Equipment, and Systems; Chapter 5B, Reactor Coolant System; Chapter 7, Instrumentation and Control Systems; Chapter 8, Electrical Systems - The description of the Class 1E, 125 Vdc systems is updated for the higher capacity of the newly installed, replacement batteries; Chapter 9, Auxiliary Systems - The description of the inert cell NASA systems is corrected to list the correct number of spare sample points; Chapter 11, Reactor Refueling System; Chapter 12, Radiation Protection and Waste Management; Chapter 13, Conduct of Operations; Chapter 16, Quality Assurance; Chapter 17, Technical Specifications; Chapter 19, FFTF Fire Specifications for Fire Detection, Alarm, and Protection Systems; Chapter 20, FFTF Criticality Specifications; and Appendix B, Primary Piping Integrity Evaluation.

  14. Vulnerability analysis of Landslide hazard area: Case study of South Korea

    Science.gov (United States)

    Oh, Chaeyeon; Jun, Kyewon; Kim, Younghwan

    2017-04-01

    Recently, sediment-related disasters such as landslides and debris flows have been occurring frequently in mountainous areas due to climate change. A scientific analysis of landslide risk areas, along with the collection and analysis of a variety of spatial information, is critical for minimizing damage in the event of mountainous disasters such as landslides and debris flows. We carried out a case study of selected areas at Inje, Gangwon Province, which suffered serious landslides due to flash floods caused by Typhoon Ewiniar in 2006. Landslide and debris flow locations were identified in the study area from interpretation of airborne images and field surveys. We used GIS to construct a spatial information database integrating the data required for a comprehensive analysis of landslide risk areas, including geography, hydrology, pedology, and forestry. Furthermore, this study evaluates the slope stability of the affected areas using SINMAP (Stability Index Mapping) and analyzes the spatial data most strongly correlated with the selected landslide areas using likelihood ratios; weight values (W+ and W-) were calculated for each element by applying the weight-of-evidence technique. We then analyzed the spatial data significantly correlated with landslide occurrence, predicted the mountainous areas with elevated landslide risk that are vulnerable to disasters, and generated a hazard map using GIS. Acknowledgments This research was supported by Basic Science Research Program through the National Research Foundation of Korea(NRF) funded by the Ministry of Science, ICT & Future Planning(No.NRF-2014R1A1A3050495).

  15. The value of integrating information from multiple hazards for flood risk analysis and management

    Science.gov (United States)

    Castillo-Rodríguez, J. T.; Escuder-Bueno, I.; Altarejos-García, L.; Serrano-Lombillo, A.

    2014-02-01

    This article presents a methodology for estimating flood risk in urban areas that integrates pluvial flooding, river flooding and the failure of both small and large dams. The first part reviews basic concepts of flood risk analysis, evaluation and management. Flood risk analyses may be developed at local, regional and national levels; however, a general methodology for performing a quantitative flood risk analysis that includes different flood hazards is still required. The second part describes the proposed methodology, an integrated approach combining pluvial flooding, river flooding and flooding from dam failure, as applied to a case study: an urban area located downstream of a dam under construction. The methodology enhances the approach developed within the SUFRI project ("Sustainable Strategies of Urban Flood Risk Management to cope with the residual risk", 2009-2011). This article also shows how the outcomes of flood risk analysis provide better and more complete information to authorities, local entities and the stakeholders involved in decision-making with regard to flood risk management.

  16. Use of hazard analysis critical control point and alternative treatments in the production of apple cider.

    Science.gov (United States)

    Senkel, I A; Henderson, R A; Jolbitado, B; Meng, J

    1999-07-01

    The purpose of this study was to evaluate the practices of Maryland cider producers and determine whether implementing hazard analysis critical control point (HACCP) would reduce the microbial contamination of cider. Cider producers (n = 11) were surveyed to determine existing manufacturing practices and sanitation. A training program was then conducted to inform operators of safety issues, including contamination with Escherichia coli O157:H7, and teach HACCP concepts and principles, sanitation procedures, and good manufacturing practice (GMP). Although all operators used a control strategy from one of the model HACCP plans provided, only one developed a written HACCP plan. None developed specific GMP, sanitation standard operating procedures, or sanitation monitoring records. Six operators changed or added production controls, including the exclusion of windfall apples, sanitizing apples chemically and by hot dip, and cider treatment with UV light or pasteurization. Facility inspections indicated improved sanitation and hazard control but identified ongoing problems. Microbiological evaluation of bottled cider before and after training, in-line apples, pomace, cider, and inoculated apples was conducted. E. coli O157:H7, Salmonella, or Staphylococcus aureus were not found in samples of in-line apple, pomace, and cider, or bottled cider. Generic E. coli was not isolated on in-coming apples but was found in 4 of 32 (13%) in-line samples and 3 of 17 (18%) bottled fresh cider samples, suggesting that E. coli was introduced during in-plant processing. To produce pathogen-free cider, operators must strictly conform to GMP and sanitation procedures in addition to HACCP controls. Controls aimed at preventing or eliminating pathogens on source apples are critical but alone may not be sufficient for product safety.

  17. The ARIA project: Advanced Rapid Imaging and Analysis for Natural Hazard Monitoring and Response

    Science.gov (United States)

    Owen, S. E.; Webb, F.; Simons, M.; Rosen, P. A.; Cruz, J.; Yun, S.; Fielding, E. J.; Moore, A. W.; Hua, H.; Agram, P.; Lundgren, P.

    2012-12-01

    ARIA is a joint JPL/Caltech coordinated effort to automate geodetic imaging capabilities for hazard response and societal benefit. Over the past decade, space-based geodetic measurements such as InSAR and GPS have provided new assessment capabilities and situational awareness on the size and location of earthquakes following seismic disasters and on volcanic eruptions following magmatic events. Geodetic imaging's unique ability to capture surface deformation at high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with any given earthquake in correspondingly high spatial and temporal detail. In addition, remote sensing with radar provides change detection and damage assessment capabilities for earthquakes, floods and other disasters, and can image even at night or through clouds. These data sets are still essentially hand-crafted, and thus are not generated rapidly and reliably enough to inform decision-making agencies and the public following an earthquake. We are building an end-to-end prototype geodetic imaging data system that would form the foundation for an envisioned operational hazard response center integrating InSAR, GPS, seismology, and modeling to deliver monitoring, actionable science, and situational awareness products. This prototype exploits state-of-the-art analysis algorithms from technologists and scientists. These algorithms enable the delivery of actionable products from larger data sets with enhanced modeling and interpretation, and the development of next generation techniques. We are collaborating with USGS scientists in both the earthquake and volcano science programs for our initial data product infusion. We present our progress to date on development of the prototype data system and demonstration data products, and example responses we have run, such as generating products for the 2011 M9.0 Tohoku-oki and M6.3 Christchurch earthquakes, the 2011 M7.1 Van earthquake, and several simulated

  18. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
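
    For illustration of step 4, ensemble modelling of alternative implementations can be sketched in a few lines of Python (a minimal sketch with invented hazard curves and weights, not the SPTHA codes used in the study): each alternative model contributes one hazard curve, and the ensemble summarizes the weighted mean and the epistemic spread.

      import numpy as np

      # Hypothetical ensemble: each row is one alternative model implementation's
      # annual probability of exceeding three tsunami intensity thresholds.
      curves = np.array([
          [1e-2, 3e-3, 5e-4],   # model A
          [2e-2, 4e-3, 8e-4],   # model B
          [8e-3, 1e-3, 2e-4],   # model C
      ])
      weights = np.array([0.5, 0.3, 0.2])   # epistemic weights, summing to 1

      # Weighted mean hazard curve (aleatory variability averaged over models).
      mean_curve = weights @ curves

      # Weighted percentiles per threshold, expressing epistemic uncertainty.
      def weighted_percentile(values, w, q):
          order = np.argsort(values)
          v, ws = values[order], w[order]
          cdf = np.cumsum(ws) / ws.sum()
          return np.interp(q, cdf, v)

      p16 = [weighted_percentile(curves[:, j], weights, 0.16) for j in range(curves.shape[1])]
      p84 = [weighted_percentile(curves[:, j], weights, 0.84) for j in range(curves.shape[1])]
      print(mean_curve, p16, p84)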

  19. The dilemma in prioritizing chemicals for environmental analysis: known versus unknown hazards.

    Science.gov (United States)

    Anna, Sobek; Sofia, Bejgarn; Christina, Rudén; Magnus, Breitholtz

    2016-08-10

    A major challenge for society is to manage the risks posed by the many chemicals continuously emitted to the environment. Not all chemicals in production and use can be monitored, so science-based strategies for prioritization are essential. In this study we review available data to investigate which substances are included in environmental monitoring programs and published research studies reporting analyses of chemicals in Baltic Sea fish between 2000 and 2012. Our aim is to contribute to the discussion of priority setting in environmental chemical monitoring and research, which is closely linked to chemical management. In total, 105 different substances or substance groups were analyzed in Baltic Sea fish. Polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans (PCDD/Fs) and polychlorinated biphenyls (PCBs) were the most studied substances or substance groups. The majority, 87%, of all analyses comprised 20% of the substances or substance groups, whereas 46 substance groups (44%) were analyzed only once. Almost three quarters of all analyses concerned a POP (persistent organic pollutant). These results demonstrate that the majority of analyses on environmental contaminants in Baltic Sea fish concern a small number of already regulated chemicals. Legacy pollutants such as POPs pose a high risk to the Baltic Sea due to their hazardous properties. Yet, there may be a risk that prioritizations for chemical analyses are biased based on the knowns of the past. Such biases may lead to society failing to identify risks posed by yet unknown hazardous chemicals. Alternative and complementary ways to identify priority chemicals are needed. More transparent communication between risk assessments performed as part of the risk assessment process within REACH and monitoring programs, and information on chemicals contained in consumer articles, would offer ways to identify chemicals for environmental analysis.

  20. Safety and Hazard Analysis for the Coherent/Acculite Laser Based Sandia Remote Sensing System (Trailer B70).

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2005-09-01

    A laser safety and hazard analysis is presented for the Coherent® driven Acculite® laser central to the Sandia Remote Sensing System (SRSS). The analysis is based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The trailer (B70) based SRSS laser system is a mobile platform which is used to perform laser interaction experiments and tests at various national test sites. The trailer based SRSS laser system is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band) and calculate the appropriate minimum Optical Density (ODmin) necessary for the laser safety eyewear used by authorized personnel. Also, the Nominal Ocular Hazard Distance (NOHD) and the Extended Ocular Hazard Distance (EOHD) are calculated in order to protect unauthorized personnel who may have violated the boundaries of the control area and might enter into the laser's NHZ during testing outside the trailer.
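
    For orientation, the two protective quantities named above have simple closed forms for a single-wavelength, continuous-wave, small-source beam under ANSI Z136.1 (a schematic reminder only; the report evaluates several wavelength bands and outdoor conditions):

      \mathrm{OD_{min}} = \log_{10}\!\left(\frac{H_0}{\mathrm{MPE}}\right), \qquad
      \mathrm{NOHD} = \frac{1}{\phi}\left(\sqrt{\frac{4\,\Phi}{\pi\,\mathrm{MPE}}} - a\right),

    where H_0 is the worst-case exposure at the eye, Φ the emitted power, a the emergent beam diameter, and φ the beam divergence.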

  1. Risk analysis procedure for post-wildfire natural hazards in British Columbia

    Science.gov (United States)

    Jordan, Peter

    2010-05-01

    Following a severe wildfire season in 2003, and several subsequent damaging debris flow and flood events, the British Columbia Forest Service developed a procedure for analysing risks to public safety and infrastructure from such events. At the same time, the Forest Service undertook a research program to determine the extent of post-wildfire hazards, and examine the hydrologic and geomorphic processes contributing to the hazards. The risk analysis procedure follows the Canadian Standards Association decision-making framework for risk management (which in turn is based on international standards). This has several steps: identification of risk, risk analysis and estimation, evaluation of risk tolerability, developing control or mitigation strategies, and acting on these strategies. The Forest Service procedure deals only with the first two steps. The results are passed on to authorities such as the Provincial Emergency Program and local government, who are responsible for evaluating risks, warning residents, and applying mitigation strategies if appropriate. The objective of the procedure is to identify and analyse risks to public safety and infrastructure. The procedure is loosely based on the BAER (burned area emergency response) program in the USA, with some important differences. Our procedure focuses on identifying risks and warning affected parties, not on mitigation activities such as broadcast erosion control measures. Partly this is due to limited staff and financial resources. Also, our procedure is not multi-agency, but is limited to wildfires on provincial forest land; in British Columbia about 95% of forest land is in the publicly-owned provincial forest. Each fire season, wildfires are screened by size and proximity to values at risk such as populated areas. For selected fires, when the fire is largely contained, the procedure begins with an aerial reconnaissance of the fire, and photography with a hand-held camera, which can be used to make a

  2. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xueqin [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); School of Social Development and Public Policy, Beijing Normal University, Beijing 100875 (China); Li, Ning [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); Yuan, Shuai, E-mail: syuan@nmemc.org.cn [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); Xu, Ning; Shi, Wenqin; Chen, Weibin [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China)

    2015-12-15

    As random events, natural disasters have complex occurrence mechanisms. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the multi-factor analysis and calculation. Considering the importance and the deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in this paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events in 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, whereas the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method may be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • The 2D function gives better fitting results at the lower tail of hazard factors. • Three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to the reality
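
    As a schematic reminder of the quantities involved (standard bivariate notation, not taken from the paper): with marginal non-exceedance probabilities u = F_X(x) and v = F_Y(y) joined by a Frank copula

      C_\theta(u,v) = -\frac{1}{\theta}\,\ln\!\left[1 + \frac{(e^{-\theta u}-1)(e^{-\theta v}-1)}{e^{-\theta}-1}\right],

    the joint return period for both hazard factors being exceeded is

      T_{\wedge}(x,y) = \frac{\mu}{1 - u - v + C_\theta(u,v)},

    where μ is the mean interarrival time of events; the three-dimensional case used in the study extends the same construction with a trivariate copula.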

  3. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    Science.gov (United States)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Particia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis in natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks (magnitudes) are governed by the exponential distribution implied by the Gutenberg-Richter relation. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions, commonly used for natural seismicity, can be safely applied to IIS hazard problems or not. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity, IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution model born of the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready
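
    The two reference models being tested can be stated compactly (standard textbook forms, not fits to the SHEER data sets): the Gutenberg-Richter relation log10 N(≥m) = a − bm, equivalent to an exponential magnitude density

      f(m) = \beta\, e^{-\beta (m - m_{\min})}, \quad m \ge m_{\min}, \qquad \beta = b \ln 10,

    and a stationary Poisson occurrence process with rate λ,

      P[N(t) = n] = \frac{(\lambda t)^{n}}{n!}\, e^{-\lambda t}.

    The statistical tests mentioned in the abstract ask whether observed IIS catalogues are consistent with these two forms.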

  4. Climatic change and health. Which problems are caused by thermophile hazardous organisms? Final report. Environment and health: climatic change; Klimawandel und Gesundheit. Welche Probleme verursachen Waerme liebende Schadorganismen? Abschlussbericht. Umwelt and Gesundheit: Klimawandel

    Energy Technology Data Exchange (ETDEWEB)

    Augustin, Jobst; Muecke, Hans-Guido (comps.)

    2010-03-15

    Climatic changes can cause health hazards due to thermophilic harmful organisms, especially those with increased allergenic potential. The meeting covered the following topics: climate-change-induced health hazards and the German adaptation strategies; the complex relation between climatic change and allergies; Ambrosia propagation in Germany - hazards for health and biodiversity; climate-change-induced reactions of hygienically precarious organisms in urban regions; monitoring and abatement of Thaumetopoea processionea in Bavarian woods; climatic change and pollen flight dynamics; Thaumetopoea processionea as a cause of non-specific respiratory system diseases; risk and protection factors for the development of asthma and allergies during infancy; abatement of pathogenic or invasive harmful organisms in Switzerland; health hazards in connection with Thaumetopoea processionea - examples from Bavaria; retrospective analysis of EPS diseases during 2004 and 2005 in the Kleve region.

  5. Hazard, Vulnerability and Capacity Mapping for Landslides Risk Analysis using Geographic Information System (GIS)

    Science.gov (United States)

    Sari, D. A. P.; Innaqa, S.; Safrilah

    2017-06-01

    This research analyzed the levels of disaster risk in the Citeureup sub-District, Bogor Regency, West Java, based on its potential hazard, vulnerability and capacity, using maps to represent the results; Miles and Huberman analytical techniques were then used to analyze the qualitative interviews. The analysis conducted in this study is based on the concept of disaster risk by Wisner. The results show that the Citeureup sub-District has a medium-to-low risk of landslides. Of the 14 villages, three have a moderate risk level, namely Hambalang, Tajur, and Tangkil, corresponding to 49.58% of the total land area. Eleven villages have a low level of risk, namely Pasir Mukti, Sanja, Tarikolot, Gunung Sari, Puspasari, East Karang Asem, Citeureup, Leuwinutug, Sukahati, West Karang Asem and Puspanegara, or 48.68% of the total land area, while high-risk areas make up only around 1.74%, within Hambalang village. The analysis using a Geographic Information System (GIS) proves that areas with a high hazard potential do not necessarily have a high level of risk. The capacity of the community plays an important role in minimizing the risk of a region. The disaster risk reduction strategy is implemented by creating safe conditions, which has intensified the disaster risk reduction movement.
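
    The Wisner risk concept referred to above is often summarized by the schematic relation (an illustrative shorthand, not the study's exact scoring scheme)

      R = \frac{H \times V}{C},

    where R is disaster risk, H the hazard, V the vulnerability and C the capacity; the GIS analysis effectively maps each term per village and combines them into the risk classes reported.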

  6. Analysis on Topological Properties of Dalian Hazardous Materials Road Transportation Network

    Directory of Open Access Journals (Sweden)

    Pengyun Chong

    2015-01-01

    Full Text Available To analyze the topological properties of the hazardous materials road transportation network (HMRTN), this paper proposed two different ways to construct the cyberspace of the HMRTN and constructed their complex network models, respectively. One was the physical network model of the HMRTN based on the primal approach and the other was the service network model of the HMRTN based on neighboring nodes. The two complex network models were built using the case of the Dalian HMRTN. The physical network model contained 154 nodes and 238 edges, and the statistical analysis results showed that (1) the cumulative node degree of the physical network followed an exponential distribution, showing the properties of a random network, and that (2) the HMRTN had a small characteristic path length and a large network clustering coefficient, which is typical of a small-world network. The service network model contained 569 nodes and 1318 edges, and the statistical analysis results showed that (1) the cumulative node degree of the service network followed a power-law distribution, showing the properties of a scale-free network, and that (2) the relationship between node strength and its descending-order rank and the relationship between node strength and cumulative node strength both followed power-law distributions, also showing the properties of a scale-free network.
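
    The quantities reported for the two network models can be reproduced with standard graph tooling; the sketch below (Python with networkx, using a placeholder graph rather than the Dalian data) shows how the cumulative degree distribution and the small-world indicators would be computed.

      import networkx as nx
      import numpy as np

      # Placeholder topology standing in for the HMRTN physical network; in the
      # study, nodes are road intersections/facilities and edges are road segments.
      G = nx.karate_club_graph()

      # Cumulative degree distribution P(K >= k), the curve fitted to exponential
      # (physical network) or power-law (service network) forms in the paper.
      degrees = np.array([d for _, d in G.degree()])
      ks = np.arange(1, degrees.max() + 1)
      cum_pk = [(degrees >= k).mean() for k in ks]

      # Small-world indicators: characteristic path length and clustering coefficient.
      L = nx.average_shortest_path_length(G)
      C = nx.average_clustering(G)
      print(list(zip(ks, np.round(cum_pk, 3))), L, C)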

  7. Comparison of Predicted Probabilities of Proportional Hazards Regression and Linear Discriminant Analysis Methods Using a Colorectal Cancer Molecular Biomarker Database

    Directory of Open Access Journals (Sweden)

    Upender Manne

    2007-01-01

    Full Text Available Background: Although a majority of studies in cancer biomarker discovery claim to use proportional hazards regression (PHREG) to study the ability of a biomarker to predict survival, few studies use the predicted probabilities obtained from the model to test the quality of the model. In this paper, we compared the quality of predictions by a PHREG model to that of a linear discriminant analysis (LDA) in both training and test set settings. Methods: The PHREG and LDA models were built on a 491 colorectal cancer (CRC) patient dataset comprised of demographic and clinicopathologic variables, and phenotypic expression of p53 and Bcl-2. Two variable selection methods, stepwise discriminant analysis and backward selection, were used to identify the final models. The endpoint of prediction in these models was five-year post-surgery survival. We also used a linear regression model to examine the effect of bin size in the training set on the accuracy of prediction in the test set. Results: The two variable selection techniques resulted in different models when stage was included in the list of variables available for selection. However, the proportion of survivors and non-survivors correctly identified was identical in both of these models. When stage was excluded from the variable list, the error rate for the LDA model was 42% as compared to an error rate of 34% for the PHREG model. Conclusions: This study suggests that a PHREG model can perform as well as or better than a traditional classifier such as LDA to classify patients into prognostic classes. Also, this study suggests that in the absence of the tumor stage as a variable, Bcl-2 expression is a strong prognostic molecular marker of CRC.
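
    The comparison can be prototyped with open-source survival and classification libraries; the sketch below (Python, with a small simulated stand-in for the 491-patient CRC dataset and invented covariates) shows one way to turn a Cox model's predicted 5-year survival probability into a classifier and compare it with LDA.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Simulated stand-in data: survival time in months, event indicator, and two
      # illustrative covariates (e.g. age and a binary marker such as Bcl-2 status).
      rng = np.random.default_rng(0)
      n = 200
      df = pd.DataFrame({
          "age": rng.normal(65, 10, n),
          "marker": rng.integers(0, 2, n),
          "time": np.round(rng.exponential(70, n), 1),
      })
      df["event"] = (rng.random(n) < 0.7).astype(int)   # 1 = death observed

      # PHREG: fit a Cox model, then classify by the predicted probability of
      # surviving beyond 60 months (five years).
      cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
      surv60 = cph.predict_survival_function(df[["age", "marker"]], times=[60]).iloc[0]
      phreg_pred = (surv60.values >= 0.5).astype(int)

      # LDA on the same crude endpoint (this label ignores censoring subtleties and
      # is used only to illustrate the comparison).
      observed = (df["time"] >= 60).astype(int)
      lda_pred = LinearDiscriminantAnalysis().fit(df[["age", "marker"]], observed).predict(df[["age", "marker"]])

      print("PHREG accuracy:", (phreg_pred == observed.values).mean())
      print("LDA accuracy:", (lda_pred == observed.values).mean())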

  8. Hazardous Waste: Learn the Basics of Hazardous Waste

    Science.gov (United States)

    EPA overview page covering hazardous waste transportation, recycling, treatment, storage and disposal. Treatment, Storage and Disposal Facilities (TSDFs) provide temporary storage and final treatment or disposal for hazardous wastes.

  9. A hazard analysis via an improved timed colored petri net with time-space coupling safety constraint

    Institute of Scientific and Technical Information of China (English)

    Li Zelin; Wang Shihai; Zhao Tingdi; Liu Bin

    2016-01-01

    Petri nets are graphical and mathematical tools that are applicable to many systems for modeling, simulation, and analysis. With the emergence of the concept of partitioning in time and space domains proposed in avionics application standard software interface (ARINC 653), it has become difficult to analyze time–space coupling hazards resulting from resource partitioning using classical or advanced Petri nets. In this paper, we propose a time–space coupling safety constraint and an improved timed colored Petri net with imposed time–space coupling safety constraints (TCCP-NET) to fill this requirement gap. Time–space coupling hazard analysis is conducted in three steps: specification modeling, simulation execution, and results analysis. A TCCP-NET is employed to model and analyze integrated modular avionics (IMA), a real-time, safety-critical system. The analysis results are used to verify whether there exist time–space coupling hazards at runtime. The method we propose demonstrates superior modeling of safety-critical real-time systems as it can specify resource allocations in both time and space domains. TCCP-NETs can effectively detect underlying time–space coupling hazards.

  10. A hazard analysis via an improved timed colored petri net with time–space coupling safety constraint

    Directory of Open Access Journals (Sweden)

    Li Zelin

    2016-08-01

    Full Text Available Petri nets are graphical and mathematical tools that are applicable to many systems for modeling, simulation, and analysis. With the emergence of the concept of partitioning in time and space domains proposed in avionics application standard software interface (ARINC 653), it has become difficult to analyze time–space coupling hazards resulting from resource partitioning using classical or advanced Petri nets. In this paper, we propose a time–space coupling safety constraint and an improved timed colored Petri net with imposed time–space coupling safety constraints (TCCP-NET) to fill this requirement gap. Time–space coupling hazard analysis is conducted in three steps: specification modeling, simulation execution, and results analysis. A TCCP-NET is employed to model and analyze integrated modular avionics (IMA), a real-time, safety-critical system. The analysis results are used to verify whether there exist time–space coupling hazards at runtime. The method we propose demonstrates superior modeling of safety-critical real-time systems as it can specify resource allocations in both time and space domains. TCCP-NETs can effectively detect underlying time–space coupling hazards.

  11. Multi-Hazard Analysis for the Estimation of Ground Motion Induced by Landslides and Tectonics

    Science.gov (United States)

    Iglesias, Rubén; Koudogbo, Fifame; Ardizzone, Francesca; Mondini, Alessandro; Bignami, Christian

    2016-04-01

    Space-borne synthetic aperture radar (SAR) sensors allow all-day, all-weather acquisition of terrain complex reflectivity images, which can be processed by means of Persistent Scatterer Interferometry (PSI) for the monitoring of displacement episodes with extremely high accuracy. In the work presented, different PSI strategies to measure ground surface displacements for multi-scale multi-hazard mapping are proposed in the context of landslide and tectonic applications. This work is developed in the framework of the ESA General Studies Programme (GSP). The present project, called Multi Scale and Multi Hazard Mapping Space based Solutions (MEMpHIS), investigates new Earth Observation (EO) methods and new Information and Communications Technology (ICT) solutions to improve the understanding and management of disasters, with special focus on Disaster Risk Reduction rather than Rapid Mapping. In this paper, the results of the investigation on the key processing steps for measuring large-scale ground surface displacements (such as those caused by plate tectonics or active faults) as well as local displacements at high resolution (such as those related to active slopes) will be presented. The core of the proposed approaches is based on the Stable Point Network (SPN) algorithm, which is the advanced PSI processing chain developed by ALTAMIRA INFORMATION. Regarding tectonic applications, the accurate displacement estimation over large-scale areas characterized by low magnitude motion gradients (3-5 mm/year), such as those induced by inter-seismic or Earth tidal effects, still remains an open issue. In this context, a low-resolution approach based on the integration of differential phase increments of velocity and topographic error (obtained through the fitting of a linear model adjustment function to the data) will be evaluated. Data from the default mode of Sentinel-1, the Interferometric Wide Swath Mode, will be considered for this application. Regarding landslides

  12. Combined fluvial and pluvial urban flood hazard analysis: concept development and application to Can Tho city, Mekong Delta, Vietnam

    Science.gov (United States)

    Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen

    2016-04-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either a fluvial or pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are hardly available. Thus this study aims to analyse a fluvial and a pluvial flood hazard individually, but also to develop a method for the analysis of a combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment, the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of
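
    Under the independence assumption, the combination is arithmetically straightforward (a schematic version; the study additionally accounts for seasonality): if a fluvial event has annual exceedance probability p_f and a damaging rainstorm occurs during the fluvial flood peak with probability p_p, then

      p_{f \cap p} = p_f \, p_p, \qquad T_{f \cap p} = \frac{1}{p_f \, p_p},

    so, for example, a 10-year river flood combined with a rainstorm that falls in the critical window in one year out of five gives a joint return period of roughly 50 years.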

  13. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    Directory of Open Access Journals (Sweden)

    Yu-Ting Hung

    2015-09-01

    Full Text Available To ensure the safety of peanut butter ice cream manufacturing, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping then followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management.

  14. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability functions (CPDF) and probability functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared with the corresponding results from different input parameter spaces.
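
    The probabilistic part of such codes is typically organized around the Cornell-McGuire hazard integral (standard form, quoted for orientation only; the report's implementation details may differ):

      \lambda(A > a) = \sum_{i} \nu_i \int\!\!\int P[A > a \mid m, r]\; f_{M_i}(m)\, f_{R_i}(r)\; dm\, dr,

    with the annual exceedance probability following from the Poisson assumption as P = 1 − exp[−λ(A > a)]; the CPDF and PDF of the annual exceedance discussed above then describe how λ spreads over the input parameter space.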

  15. Ballistic analysis during multiscale explosive eruption at Vesuvius and hazard implications

    Science.gov (United States)

    De Novellis, Vincenzo

    2016-04-01

    concentration/overpressure in the vent. These initial conditions are then inserted into a ballistic model for the purpose of calculating the maximum range of BP of different sizes (0.20-1.2 m), varying the drag coefficient as a function of BP velocity and varying the air density as a function of the takeoff point along the eruptive column of the event examined. Furthermore, the gas expansion in the column reduces the drag force on BP and assists their vertical-lateral transport. In agreement with previous studies, a zone of reduced drag, determined from the size of the vents that were active at Vesuvius during past eruptions, is also included in the ballistic calculations. Finally, the results for BP range (from 3 to 14 km) raise some significant implications regarding hazard zones for different future eruptive scenarios.
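
    A minimal version of such a ballistic calculation with velocity-dependent drag can be written in a few lines (Python sketch with invented values; the study additionally varies the drag coefficient with velocity, the air density with height, and includes a reduced-drag zone above the vent):

      import numpy as np

      g = 9.81           # m/s^2
      rho_air = 1.0      # kg/m^3, held constant in this sketch
      rho_rock = 2500.0  # kg/m^3
      d = 0.5            # block diameter, m
      cd = 1.0           # drag coefficient, held constant in this sketch
      area = np.pi * (d / 2) ** 2
      mass = rho_rock * (4.0 / 3.0) * np.pi * (d / 2) ** 3

      # Launch at 150 m/s and 45 degrees; integrate until the block returns to the ground.
      v = 150.0 * np.array([np.cos(np.radians(45)), np.sin(np.radians(45))])
      x = np.array([0.0, 0.0])
      dt = 0.01
      while x[1] >= 0.0:
          drag = -0.5 * rho_air * cd * area * np.linalg.norm(v) * v / mass
          v = v + (drag + np.array([0.0, -g])) * dt
          x = x + v * dt

      print(f"range with drag: {x[0]:.0f} m")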

  16. 77 FR 17573 - Hazard Communication

    Science.gov (United States)

    2012-03-26

    ..., 1915 and 1926 Hazard Communication; Final Rule. Federal Register / Vol. 77, No. 58 / Monday... Administration 29 CFR Parts 1910, 1915, and 1926 RIN 1218-AC20 Hazard Communication AGENCY: Occupational Safety... modifying its Hazard Communication Standard (HCS) to conform to the United Nations' Globally Harmonized...

  17. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA): towards PTHA assessment for the coasts of Italy

    Science.gov (United States)

    Selva, Jacopo; Tonini, Roberto; Molinari, Irene; Tiberti, Mara M.; Romano, Fabrizio; Grezio, Anita; Melini, Daniele; Piatanesi, Alessio; Basili, Roberto; Lorito, Stefano

    2016-04-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes. Unlike classical approaches that commonly adopt the hazard integral and logic tree, we use an event tree approach and ensemble modelling. The procedure was developed in the framework of the EC projects ASTARTE and STREST, of the Italian National Flagship project RITMARE, and of the agreement between Italian Civil Protection and INGV. A total of about 2 × 10^7 different potential seismic sources covering the entire Mediterranean Sea, and more than 1 × 10^5 alternative model implementations, have been considered to quantify both the aleatory variability and the epistemic uncertainty. A set of hazard curves is obtained along the coasts of the entire Italian territory. They are the prototype of the first homogeneous Italian national SPTHA map.

  18. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Harper, Bryan [Oregon State University (United States); Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan [Pacific Northwest National Laboratory (United States); Tang, Kaizhi [Intelligent Automation, Inc. (United States); Heredia-Langner, Alejandro [Pacific Northwest National Laboratory (United States); Lins, Roberto [CPqAM, Oswaldo Cruz Foundation, FIOCRUZ-PE (Brazil); Harper, Stacey, E-mail: stacey.harper@oregonstate.edu [Oregon State University (United States)

    2015-06-15

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships.
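
    The clustering step can be prototyped with standard tools; the sketch below (Python, with random numbers standing in for the EZ Metric response profiles of the 68 nanomaterials) shows the basic hierarchical clustering used to group materials with similar toxicity patterns.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # Random stand-in profiles: rows = nanomaterials, columns = scored endpoints
      # (e.g. mortality and morbidity measures across exposure concentrations).
      rng = np.random.default_rng(1)
      profiles = rng.random((12, 6))

      # Agglomerative (Ward) clustering of materials with similar response patterns.
      Z = linkage(profiles, method="ward")
      clusters = fcluster(Z, t=3, criterion="maxclust")
      print(clusters)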

  19. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    Energy Technology Data Exchange (ETDEWEB)

    (NOEMAIL), R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order DOE O 420.1B and the DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are EPRI (2004), USGS (2002) and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL (1993) and EPRI (1988) PSHAs. The

  20. Cadmium and lead residue control in a hazard analysis and critical control point (HACCP) environment.

    Science.gov (United States)

    Pagan-Rodríguez, Doritza; O'Keefe, Margaret; Deyrup, Cindy; Zervos, Penny; Walker, Harry; Thaler, Alice

    2007-02-21

    In 2003-2004, the U.S. Department of Agriculture Food Safety and Inspection Service (FSIS) conducted an exploratory assessment to determine the occurrence and levels of cadmium and lead in randomly collected samples of kidney, liver, and muscle tissues of mature chickens, boars/stags, dairy cows, and heifers. The data generated in the study were qualitatively compared to data that FSIS gathered in a 1985-1986 study in order to identify trends in the levels of cadmium and lead in meat and poultry products. The exploratory assessment was necessary to verify that Hazard Analysis and Critical Control Point plans and efforts to control exposure to these heavy metals are effective and result in products that meet U.S. export requirements. A comparison of data from the two FSIS studies suggests that the incidence and levels of cadmium and lead in different slaughter classes have remained stable since the first study was conducted in 1985-1986. This study was conducted to fulfill the FSIS mandate to ensure that meat, poultry, and egg products entering commerce in the United States are free of adulterants, including elevated levels of environmental contaminants such as cadmium and lead.

  1. Hazard analysis and possibilities for preventing botulism originating from meat products

    Directory of Open Access Journals (Sweden)

    Vasilev Dragan

    2008-01-01

    Full Text Available The paper presents the most important data on the bacterium Clostridium botulinum, the appearance of botulism, hazard analysis and the possibilities for preventing botulism. Proteolytic strains of C. botulinum Group I, whose spores are resistant to heat, create toxins predominantly in cans containing slightly sour food items, in the event that the spores are not inactivated in the course of sterilization. Non-proteolytic strains of Group II are more sensitive to high temperatures, but they have the ability to grow and create toxins at low temperatures. Type E most often creates a toxin in vacuum-packed smoked fish, and the non-proteolytic strain of type B in dried hams and certain pasteurized meat products. The following play an important role in the prevention of botulism: reducing meat contamination with clostridial spores to a minimum, implementing good hygiene measures and production practices during the slaughter of animals, the inactivation of spores of C. botulinum during sterilization (F > 3), and, in dried hams and pasteurized products, the prevention of bacterial growth and toxin formation by maintaining low temperatures in the course of production and storage, as well as the correct use of substances that inhibit the multiplication of bacteria and the production of toxins (nitrites, table salt, etc.).
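
    The sterilization criterion cited above (F > 3) refers to the standard lethality integral (conventional parameters for proteolytic C. botulinum spores, quoted for orientation rather than taken from this paper):

      F_0 = \int_0^{t} 10^{\,(T(t') - 121.1\,^{\circ}\mathrm{C})/z}\; dt', \qquad z = 10\,^{\circ}\mathrm{C},

    so an F_0 of at least 3 minutes at the 121.1 °C reference temperature corresponds approximately to the customary 12-decimal ("botulinum cook") reduction of spores.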

  2. ANALYSIS OF THE TANK 6F FINAL CHARACTERIZATION SAMPLES-2012

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L.; Diprete, D.; Coleman, C.; Hay, M.; Shine, G.

    2012-06-28

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to provide sample preparation and analysis of the Tank 6F final characterization samples to determine the residual tank inventory prior to grouting. Fourteen residual Tank 6F solid samples from three areas on the floor of the tank were collected and delivered to SRNL between May and August 2011. These Tank 6F samples were homogenized and combined into three composite samples based on a proportion compositing scheme and the resulting composite samples were analyzed for radiological, chemical and elemental components. Additional measurements performed on the Tank 6F composite samples include bulk density and water leaching of the solids to account for water soluble components. The composite Tank 6F samples were analyzed and the data reported in triplicate. Sufficient quality assurance standards and blanks were utilized to demonstrate adequate characterization of the Tank 6F samples. The main evaluation criteria were target detection limits specified in the technical task request document. While many of the target detection limits were met for the species characterized for Tank 6F some were not met. In a few cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. The isotopes whose detection limits were not met in all cases included Sn-126, Sb-126, Sb-126m, Eu-152, Cm-243 and Cf-249. SRNL, in conjunction with the customer, reviewed all of these cases and determined that the impacts of not meeting the target detection limits were acceptable. Based on the analyses of variance (ANOVA) for the inorganic constituents of Tank 6F, all the inorganic constituents displayed heterogeneity. The inorganic results demonstrated consistent differences across the composite samples: lowest concentrations for Composite Sample 1, intermediate-valued concentrations for Composite
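
    The heterogeneity conclusion rests on a one-way analysis of variance across the three composites; a minimal sketch of that test (Python, with invented triplicate concentrations rather than the report's data) is:

      from scipy import stats

      # Invented triplicate results (e.g. mg/kg of one inorganic constituent) for
      # the three Tank 6F composite samples.
      composite_1 = [10.2, 10.8, 10.5]
      composite_2 = [14.1, 13.7, 14.4]
      composite_3 = [18.9, 19.3, 18.6]

      # A small p-value indicates the constituent differs across composites,
      # i.e. it is heterogeneous across the tank floor.
      f_stat, p_value = stats.f_oneway(composite_1, composite_2, composite_3)
      print(f_stat, p_value)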

  3. Analysis Of The Tank 6F Final Characterization Samples-2012

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L. N.; Diprete, D. P.; Coleman, C. J.; Hay, M. S.; Shine, E. P.

    2012-09-27

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to provide sample preparation and analysis of the Tank 6F final characterization samples to determine the residual tank inventory prior to grouting. Fourteen residual Tank 6F solid samples from three areas on the floor of the tank were collected and delivered to SRNL between May and August 2011. These Tank 6F samples were homogenized and combined into three composite samples based on a proportion compositing scheme and the resulting composite samples were analyzed for radiological, chemical and elemental components. Additional measurements performed on the Tank 6F composite samples include bulk density and water leaching of the solids to account for water soluble components. The composite Tank 6F samples were analyzed and the data reported in triplicate. Sufficient quality assurance standards and blanks were utilized to demonstrate adequate characterization of the Tank 6F samples. The main evaluation criteria were target detection limits specified in the technical task request document. While many of the target detection limits were met for the species characterized for Tank 6F some were not met. In a few cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. The isotopes whose detection limits were not met in all cases included Sn-126, Sb-126, Sb-126m, Eu-152, Cm-243 and Cf-249. SRNL, in conjunction with the customer, reviewed all of these cases and determined that the impacts of not meeting the target detection limits were acceptable. Based on the analyses of variance (ANOVA) for the inorganic constituents of Tank 6F, all the inorganic constituents displayed heterogeneity. The inorganic results demonstrated consistent differences across the composite samples: lowest concentrations for Composite Sample 1, intermediate-valued concentrations for Composite

  4. Analysis Of The Tank 5F Final Characterization Samples-2011

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L. N.; Diprete, D.; Coleman, C. J.; Hay, M. S.

    2012-09-27

    The Savannah River National Laboratory (SRNL) was requested by SRR to provide sample preparation and analysis of the Tank 5F final characterization samples to determine the residual tank inventory prior to grouting. Two types of samples were collected and delivered to SRNL: floor samples across the tank and subsurface samples from mounds near risers 1 and 5 of Tank 5F. These samples were taken from Tank 5F between January and March 2011. These samples from individual locations in the tank (nine floor samples and six mound Tank 5F samples) were each homogenized and combined in a given proportion into 3 distinct composite samples to mimic the average composition in the entire tank. These Tank 5F composite samples were analyzed for radiological, chemical and elemental components. Additional measurements performed on the Tank 5F composite samples include bulk density and water leaching of the solids to account for water soluble species. With analyses for certain challenging radionuclides as the exception, all composite Tank 5F samples were analyzed and reported in triplicate. The target detection limits for isotopes analyzed were based on customer desired detection limits as specified in the technical task request documents. SRNL developed new methodologies to meet these target detection limits and provide data for the extensive suite of components. While many of the target detection limits were met for the species characterized for Tank 5F, as specified in the technical task request, some were not met. In a few cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. The Technical Task Request allows that while the analyses of these isotopes is needed, meeting the detection limits for these isotopes is a lower priority than meeting detection limits for the other specified isotopes. The isotopes whose detection limits were not met in all cases included the

  5. ANALYSIS OF THE TANK 5F FINAL CHARACTERIZATION SAMPLES-2011

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L.; Diprete, D.; Coleman, C.; Hay, M.

    2012-08-03

    The Savannah River National Laboratory (SRNL) was requested by SRR to provide sample preparation and analysis of the Tank 5F final characterization samples to determine the residual tank inventory prior to grouting. Two types of samples were collected and delivered to SRNL: floor samples across the tank and subsurface samples from mounds near risers 1 and 5 of Tank 5F. These samples were taken from Tank 5F between January and March 2011. These samples from individual locations in the tank (nine floor samples and six mound Tank 5F samples) were each homogenized and combined in a given proportion into 3 distinct composite samples to mimic the average composition in the entire tank. These Tank 5F composite samples were analyzed for radiological, chemical and elemental components. Additional measurements performed on the Tank 5F composite samples include bulk density and water leaching of the solids to account for water soluble species. With analyses for certain challenging radionuclides as the exception, all composite Tank 5F samples were analyzed and reported in triplicate. The target detection limits for isotopes analyzed were based on customer desired detection limits as specified in the technical task request documents. SRNL developed new methodologies to meet these target detection limits and provide data for the extensive suite of components. While many of the target detection limits were met for the species characterized for Tank 5F, as specified in the technical task request, some were not met. In a few cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. The Technical Task Request allows that while the analyses of these isotopes is needed, meeting the detection limits for these isotopes is a lower priority than meeting detection limits for the other specified isotopes. The isotopes whose detection limits were not met in all cases included the

  6. ANALYSIS OF THE TANK 5F FINAL CHARACTERIZATION SAMPLES-2011

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L.; Diprete, D.; Coleman, C.; Hay, M.

    2012-01-20

    The Savannah River National Laboratory (SRNL) was requested by SRR to provide sample preparation and analysis of the Tank 5F final characterization samples to determine the residual tank inventory prior to grouting. Two types of samples were collected and delivered to SRNL: floor samples across the tank and subsurface samples from mounds near risers 1 and 5 of Tank 5F. These samples were taken from Tank 5F between January and March 2011. These samples from individual locations in the tank (nine floor samples and six mound Tank 5F samples) were each homogenized and combined in a given proportion into 3 distinct composite samples to mimic the average composition in the entire tank. These Tank 5F composite samples were analyzed for radiological, chemical and elemental components. Additional measurements performed on the Tank 5F composite samples include bulk density and water leaching of the solids to account for water soluble species. With analyses for certain challenging radionuclides as the exception, all composite Tank 5F samples were analyzed and reported in triplicate. The target detection limits for isotopes analyzed were based on customer desired detection limits as specified in the technical task request documents. SRNL developed new methodologies to meet these target detection limits and provide data for the extensive suite of components. While many of the target detection limits were met for the species characterized for Tank 5F, as specified in the technical task request, some were not met. In a few cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. The Technical Task Request allows that while the analyses of these isotopes is needed, meeting the detection limits for these isotopes is a lower priority than meeting detection limits for the other specified isotopes. The isotopes whose detection limits were not met in all cases included the

  7. Analysis of the Tank 6F Final Characterization Samples-2012

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L. N.; Diprete, D. P.; Coleman, C. J.; Hay, M. S.; Shine, E. P.

    2013-01-31

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to provide sample preparation and analysis of the Tank 6F final characterization samples to determine the residual tank inventory prior to grouting. Fourteen residual Tank 6F solid samples from three areas on the floor of the tank were collected and delivered to SRNL between May and August 2011. These Tank 6F samples were homogenized and combined into three composite samples based on a proportion compositing scheme and the resulting composite samples were analyzed for radiological, chemical and elemental components. Additional measurements performed on the Tank 6F composite samples include bulk density and water leaching of the solids to account for water soluble components. The composite Tank 6F samples were analyzed and the data reported in triplicate. Sufficient quality assurance standards and blanks were utilized to demonstrate adequate characterization of the Tank 6F samples. The main evaluation criteria were target detection limits specified in the technical task request document. While many of the target detection limits were met for the species characterized for Tank 6F some were not met. In a few cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. The isotopes whose detection limits were not met in all cases included Sn-126, Sb-126, Sb-126m, Eu-152, Cm- 243 and Cf-249. SRNL, in conjunction with the customer, reviewed all of these cases and determined that the impacts of not meeting the target detection limits were acceptable. Based on the analyses of variance (ANOVA) for the inorganic constituents of Tank 6F, all the inorganic constituents displayed heterogeneity. The inorganic results demonstrated consistent differences across the composite samples: lowest concentrations for Composite Sample 1, intermediate-valued concentrations for Composite

  8. The influence of Alpine soil properties on shallow movement hazards, investigated through factor analysis

    Directory of Open Access Journals (Sweden)

    S. Stanchi

    2012-06-01

    shallow soil movements involving the upper soil horizons. We assessed a great number of soil properties that are known to be related to vulnerability to the main hazards present in the area. These properties were evaluated at the two depths, and a factor analysis was performed to simplify the dataset interpretation, and to hypothesise the most decisive parameters that were potentially related to vulnerability. The factors (soil structure, aggregation, consistency, texture and parent material, cation exchange complex and other chemical properties) were a first step towards identifying soil quality indexes in the studied environment.
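
    The factor analysis step can be illustrated with standard tooling; the sketch below (Python, with random numbers standing in for the measured soil-property table) shows how latent factors and their loadings would be extracted from standardized properties.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis
      from sklearn.preprocessing import StandardScaler

      # Random stand-in table: rows = sampled horizons, columns = measured properties
      # (e.g. clay content, aggregate stability, consistency limits, CEC, pH, ...).
      rng = np.random.default_rng(2)
      X = rng.random((40, 8))

      # Standardize, then extract a few latent factors; the loadings show which
      # properties group together (structure, texture, chemistry, ...).
      Xs = StandardScaler().fit_transform(X)
      fa = FactorAnalysis(n_components=3, random_state=0).fit(Xs)
      loadings = fa.components_.T      # (n_properties, n_factors)
      scores = fa.transform(Xs)        # factor scores per sample
      print(np.round(loadings, 2))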

  9. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    Science.gov (United States)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has observed the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami, which caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption that resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  10. Parameter estimation in Probabilistic Seismic Hazard Analysis: current problems and some solutions

    Science.gov (United States)

    Vermeulen, Petrus

    2017-04-01

    A typical Probabilistic Seismic Hazard Analysis (PSHA) comprises identification of seismic source zones, determination of hazard parameters for these zones, selection of an appropriate ground motion prediction equation (GMPE), and integration over probabilities according to the Cornell-McGuire procedure. Determination of hazard parameters often does not receive the attention it deserves, and, therefore, problems therein are often overlooked. Here, many of these problems are identified, and some of them addressed. The parameters that need to be identified are those associated with the frequency-magnitude law, those associated with the earthquake recurrence law in time, and the parameters controlling the GMPE. This study is concerned with the frequency-magnitude law and the temporal distribution of earthquakes, and not with GMPEs. The Gutenberg-Richter relation is usually adopted for the frequency-magnitude law, and a Poisson process for earthquake recurrence in time. Accordingly, the parameters that need to be determined are the slope parameter of the Gutenberg-Richter frequency-magnitude law, i.e. the b-value, the maximum magnitude at which the Gutenberg-Richter law applies, mmax, and the mean recurrence frequency, λ, of earthquakes. If, instead of the Cornell-McGuire procedure, the "Parametric-Historic procedure" is used, these parameters do not have to be known before the PSHA computations; they are estimated directly during the PSHA computation. The resulting relation for the frequency of ground motion vibration parameters has a functional form analogous to the frequency-magnitude law, described by the parameters γ (analogous to the b-value of the Gutenberg-Richter law) and the maximum possible ground motion amax (analogous to mmax). Originally, the approach could be applied only to simple GMPEs; however, the method has recently been extended to incorporate more complex forms of GMPEs. With regards to the parameter mmax, there are numerous methods of estimation
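
    Of the estimation methods alluded to, those for the first two parameters are simple enough to quote (standard estimators, for orientation only): the Aki maximum-likelihood b-value

      \hat{b} = \frac{\log_{10} e}{\bar{m} - (m_{\min} - \Delta m / 2)},

    where m̄ is the mean magnitude of events with m ≥ m_min and Δm is the magnitude binning, and the mean rate λ̂ = N/T for N events observed over a period T; estimating mmax is considerably harder, which is why the abstract points to the numerous dedicated methods.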

  11. Hazard analysis in active landslide areas in the State of Veracruz, Mexico

    Science.gov (United States)

    Wilde, Martina; Morales Barrera, Wendy V.; Rodriguez Elizarrarás, Sergio R.; Solleiro Rebolledo, Elizabeth; Sedov, Sergey; Terhorst, Birgit

    2016-04-01

    mass movements are analyzed in order to reconstruct complex interrelations of the causes and effects of landslide events. One of the major objectives of this research is to evaluate the potential hazard of active landslide areas. Detailed field analyses were performed to investigate the situations and dynamics of the slope movements. Therefore, geomorphological mapping, sediment characterization as well as geophysical methods are applied. On the one hand, a detailed sediment characterization aims to identify the type of material (e.g. geotechnical attributes); on the other hand, sediments can provide information on different activity phases and movement processes within the slide masses. Furthermore, the focus is placed on the determination of landslide-relevant parameters and thresholds. Digital elevation models, which were generated before the onset of slope movements, are integrated into the geomorphological analysis. The poster presents the specific study sites in Veracruz and the situation of endangered slopes before and after the landslide events. It is planned to use this knowledge to model susceptibility maps for the region in the future. Moreover, field data will be used as basic information for further monitoring plans. Resulting susceptibility maps will be provided to the responsible authorities in order to support sustainable planning of settlements and infrastructure in hazardous regions.

  12. Advanced Rapid Imaging & Analysis for Monitoring Hazards (ARIA-MH) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a service-oriented hazard/disaster monitoring data system enabling both science and decision-support communities to monitor ground motion in areas of...

  13. SCEC Community Modeling Environment (SCEC/CME) - Seismic Hazard Analysis Applications and Infrastructure

    Science.gov (United States)

    Maechling, P. J.; Jordan, T. H.; Kesselman, C.; Moore, R.; Minster, B.; SCEC ITR Collaboration

    2003-12-01

    The Southern California Earthquake Center (SCEC) has formed a Geoscience/IT partnership to develop an advanced information infrastructure for system-level earthquake science in Southern California. This SCEC/ITR partnership comprises SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Institutions for Research in Seismology (IRIS), and the U.S. Geological Survey. This collaboration recently completed the second year in a five-year National Science Foundation (NSF) funded ITR project called the SCEC Community Modeling Environment (SCEC/CME). The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed by project collaborators include a Probabilistic Seismic Hazard Analysis system called OpenSHA [Field et al., this meeting]. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERF's). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. A Rupture Dynamic Model (RDM) has also been developed that couples a rupture dynamics simulation into an anelastic wave model. The collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of SHA programs. To support computationally expensive simulations, we have constructed a grid-based system utilizing Globus software [Kesselman et al., this meeting]. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC, NPACI and Teragrid High Performance Computing Centers. We have

  14. Analysis of Risks in Hainan Island Typhoon Hazard Factor Based on GIS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The aim of this paper was to analyze the risks posed by typhoon hazard factors on Hainan Island. [Method] Taking the theory and methods of natural disaster evaluation as the starting point, and selecting Hainan Province, where typhoon disasters are relatively severe, as the research target, the analysis was based on typhoon data from 1958-2008, with the occurrence frequency of typhoon hazard-formative factors, maximum rainfall, and the potentially devastating effects of typhoon winds as evaluatio...

  15. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    Science.gov (United States)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions. Hence, relations between process intensities and the extent of losses, derived from the analysis of historic hazard events and object-specific restoration values, are used. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Building on these findings, our work is targeted at extending the findings and models of present risk research in the context of an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are gathered quantitatively, and spatially distributed, by the use of a large set of force transducers. The experimental tests are accomplished with artificial vertical and skewed plates, also including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  16. Readiness to implement Hazard Analysis and Critical Control Point (HACCP) systems in Iowa schools.

    Science.gov (United States)

    Henroid, Daniel; Sneed, Jeannie

    2004-02-01

    To evaluate current food-handling practices, food safety prerequisite programs, and employee knowledge and food safety attitudes and provide baseline data for implementing Hazard Analysis and Critical Control Point (HACCP) systems in school foodservice. One member of the research team visited each school to observe food-handling practices and assess prerequisite programs using a structured observation form. A questionnaire was used to determine employees' attitudes, knowledge, and demographic information. A convenience sample of 40 Iowa schools was recruited with input from the Iowa Department of Education. Descriptive statistics were used to summarize data. One-way analysis of variance was used to assess differences in attitudes and food safety knowledge among managers, cooks, and other foodservice employees. Multiple linear regression assessed the relationship between manager and school district demographics and the food safety practice score. Proper food-handling practices were not being followed in many schools and prerequisite food safety programs for HACCP were found to be inadequate for many school foodservice operations. School foodservice employees were found to have a significant amount of food safety knowledge (15.9+/-2.4 out of 20 possible points). School districts with managers (P=.019) and employees (P=.030) who had a food handler certificate were found to have higher food safety practice scores. Emphasis on implementing prerequisite programs in preparation for HACCP is needed in school foodservice. Training programs, both basic food safety such as ServSafe and HACCP, will support improvement of food-handling practices and implementation of prerequisite programs and HACCP.

  17. Seismic hazard analysis with PSHA method in four cities in Java.

    Science.gov (United States)

    Elistyawati, Y.; Palupi, I. R.; Suharsono

    2016-11-01

    In this study, tectonic earthquake hazard was assessed in terms of peak ground acceleration using the PSHA method, dividing the region into earthquake source zones. The study applied earthquake data from 1965-2015 that had been analyzed for completeness; the study area covered the whole of Java, with emphasis on four large earthquake-prone cities. The results comprise hazard maps for return periods of 500 and 2,500 years and hazard curves for four major cities (Jakarta, Bandung, Yogyakarta, and Banyuwangi). The 500-year PGA hazard map for Java shows peak ground accelerations ranging from 0 g to ≥ 0.5 g, while the 2,500-year map ranges from 0 g to ≥ 0.8 g. For the PGA hazard curves, the most influential earthquake sources were background fault sources such as the Cimandiri fault; for the city of Bandung, the controlling source was likewise a background fault source. For the city of Yogyakarta, the most influential source was the background seismicity of the Opak fault, and for Banyuwangi, the most influential sources were the Java and Sumba megathrust earthquakes.
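
    To make the PSHA workflow behind such hazard maps and curves concrete, the following sketch performs a Cornell-McGuire style integration for a single source at a fixed distance, with a doubly truncated Gutenberg-Richter magnitude distribution and a lognormal GMPE. The GMPE coefficients, source rate and distance are placeholders, not the zonation or attenuation models used in the study.

```python
import numpy as np
from scipy.stats import norm

# Placeholder GMPE of the form ln(PGA[g]) = C0 + C1*M - C2*ln(R + C3), sigma_ln.
# Coefficients are illustrative only.
C0, C1, C2, C3, SIGMA = -4.0, 1.0, 1.3, 10.0, 0.6

def annual_exceedance_rate(pga_levels, rate, b_value, m_min, m_max, distance_km, n_m=200):
    """Cornell-McGuire style integration for a single source at a fixed distance."""
    beta = b_value * np.log(10.0)
    m = np.linspace(m_min, m_max, n_m)
    # Doubly truncated exponential (Gutenberg-Richter) magnitude density
    f_m = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
    ln_median = C0 + C1 * m - C2 * np.log(distance_km + C3)
    rates = []
    for a in pga_levels:
        # P(PGA > a | m, r) under a lognormal GMPE
        p_exceed = 1.0 - norm.cdf((np.log(a) - ln_median) / SIGMA)
        rates.append(rate * np.trapz(p_exceed * f_m, m))
    return np.array(rates)

pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])          # g
lam = annual_exceedance_rate(pga, rate=0.5, b_value=1.0,
                             m_min=5.0, m_max=8.0, distance_km=30.0)
p50 = 1.0 - np.exp(-lam * 50.0)                      # Poisson probability in 50 years
for a, p in zip(pga, p50):
    print(f"PGA {a:.2f} g : P(exceedance in 50 yr) = {p:.3f}")
```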

  18. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear or non-nuclear facilities shall be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences commensurate with a graded approach which depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described, and compared to those criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, a summary of important application issues is described with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for design of DOE facilities.

  19. Flood hazard zoning in Yasooj region, Iran, using GIS and multi-criteria decision analysis

    Directory of Open Access Journals (Sweden)

    Omid Rahmati

    2016-05-01

    Full Text Available Flood is considered to be the most common natural disaster worldwide during the last decades. Flood hazard potential mapping is required for the management and mitigation of floods. The present research aimed to assess the efficiency of the analytical hierarchy process (AHP) in identifying potential flood hazard zones by comparing the results with those of a hydraulic model. Initially, four parameters, namely distance to river, land use, elevation and land slope, were used for part of the Yasooj River, Iran. In order to determine the weight of each effective factor, questionnaires of comparison ratings on Saaty's scale were prepared and distributed to eight experts. The normalized weights of the criteria/parameters were determined based on Saaty's nine-point scale and their importance in specifying flood hazard potential zones using the AHP and eigenvector methods. The set of criteria were integrated by the weighted linear combination method using ArcGIS 10.2 software to generate a flood hazard prediction map. The inundation simulation (extent and depth of flood) was conducted using the hydrodynamic program HEC-RAS for 50- and 100-year interval floods. The validation of the flood hazard prediction map was conducted based on flood extent and depth maps. The results showed that the AHP technique is promising for making accurate and reliable predictions of flood extent. Therefore, the AHP and geographic information system (GIS) techniques are suggested for assessment of the flood hazard potential, specifically in no-data regions.
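
    The AHP weighting and weighted linear combination steps described above can be reproduced in a few lines. The sketch below derives criterion weights from a pairwise comparison matrix via the principal eigenvector, checks Saaty's consistency ratio, and combines standardized criterion scores for a single cell; the matrix entries and scores are hypothetical, not the expert judgments collected in the study.

```python
import numpy as np

# Illustrative pairwise comparison matrix (Saaty 1-9 scale) for four criteria:
# distance to river, land use, elevation, slope. Values are hypothetical.
A = np.array([[1.0, 3.0, 5.0, 4.0],
              [1/3, 1.0, 3.0, 2.0],
              [1/5, 1/3, 1.0, 1/2],
              [1/4, 1/2, 2.0, 1.0]])

# Principal eigenvector gives the criterion weights
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()

# Consistency ratio (0.90 is Saaty's random index for n = 4; CR < 0.1 is acceptable)
lambda_max = eigvals.real[k]
n = A.shape[0]
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.90
print("weights:", np.round(weights, 3), " CR:", round(CR, 3))

# Weighted linear combination for one raster cell with normalized criterion scores
scores = np.array([0.8, 0.6, 0.9, 0.4])   # hypothetical standardized scores in [0, 1]
flood_hazard_index = float(weights @ scores)
print("flood hazard index:", round(flood_hazard_index, 3))
```

    In a GIS workflow the same weight vector would simply be applied cell by cell to the standardized criterion rasters to produce the hazard prediction map.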

  20. Preliminary assessment of hazardous-waste pretreatment as an air-pollution-control technique. Final report, 25 July 1983-31 July 1984

    Energy Technology Data Exchange (ETDEWEB)

    Spivey, J.J.; Allen, C.C.; Green, D.A.; Wood, J.P.; Stallings, R.L.

    1986-03-01

    The report evaluates twelve commercially available treatment techniques for their use in removing volatile constituents from hazardous and potentially hazardous waste streams. A case study of the cost of waste treatment is also provided for each technique. The results show that air stripping or evaporation coupled with carbon adsorption of the off gases; steam stripping; and batch distillation are the most widely applicable pretreatment techniques. The cost-effectiveness of pretreatment varies widely with waste-stream characteristics and type of pretreatment, with typical values being between $55 and $1,800 per megagram of volatile removed.

  1. Seismic risk analysis for General Electric Plutonium Facility, Pleasanton, California. Final report, part II

    Energy Technology Data Exchange (ETDEWEB)

    1980-06-27

    This report is the second of a two-part study addressing the seismic risk or hazard of the special nuclear materials (SNM) facility of the General Electric Vallecitos Nuclear Center at Pleasanton, California. The Part I companion to this report, dated July 31, 1978, presented the seismic hazard at the site that resulted from exposure to earthquakes on the Calaveras, Hayward and San Andreas faults and, additionally, from smaller unassociated earthquakes that could not be attributed to these specific faults. However, while this study was in progress, certain additional geologic information became available that could be interpreted in terms of the existence of a nearby fault. Although substantial geologic investigations were subsequently undertaken, the existence of this postulated fault, called the Verona Fault, remained very controversial. The purpose of the Part II study was to assume the existence of such a capable fault and, under this assumption, to examine the loads that the fault could impose on the SNM facility. This report first reviews the geologic setting with a focus on specifying sufficient geologic parameters to characterize the postulated fault. The report next presents the methodology used to calculate the vibratory ground motion hazard. Because of the complexity of the fault geometry, a slightly different methodology is used here compared to the Part I report. This section ends with the results of the calculation applied to the SNM facility. Finally, the report presents the methodology and results of the rupture hazard calculation.

  2. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    Science.gov (United States)

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Based on consideration of the importance and deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors and that the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails of the hazard factors. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest the multivariate analysis method may be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method laid the foundation for the prediction and warning of other natural disasters.
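
    The copula-based joint return periods described above can be sketched as follows, assuming the marginal non-exceedance probabilities and the Frank copula parameter have already been fitted elsewhere. The values used here (u, v, theta, mean inter-arrival time) are hypothetical; the study's own fits involve two- and three-dimensional copulas of the selected dust storm hazard factors.

```python
import numpy as np

def frank_copula(u, v, theta):
    """Bivariate Frank copula C(u, v; theta), theta != 0."""
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    den = np.exp(-theta) - 1.0
    return -np.log(1.0 + num / den) / theta

def joint_return_periods(u, v, theta, mu_years):
    """'OR' and 'AND' joint return periods for two hazard factors.

    u, v     : marginal non-exceedance probabilities of the two factors
    theta    : Frank copula dependence parameter (fitted elsewhere)
    mu_years : mean inter-arrival time of events, in years
    """
    c = frank_copula(u, v, theta)
    t_or = mu_years / (1.0 - c)             # either factor exceeded
    t_and = mu_years / (1.0 - u - v + c)    # both factors exceeded
    return t_or, t_and

# Hypothetical example: 90th-percentile levels of two hazard factors, theta fitted to data
t_or, t_and = joint_return_periods(u=0.9, v=0.9, theta=5.0, mu_years=0.25)
print(f"OR return period ~ {t_or:.1f} yr, AND return period ~ {t_and:.1f} yr")
```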

  3. Current and future pluvial flood hazard analysis for the city of Antwerp

    Science.gov (United States)

    Willems, Patrick; Tabari, Hossein; De Niel, Jan; Van Uytven, Els; Lambrechts, Griet; Wellens, Geert

    2016-04-01

    to two types of methods). These were finally transferred into future pluvial flash flood hazard maps for the city, together with the uncertainties, and are considered as a basis for spatial planning and adaptation.

  4. Health hazard evaluation/toxicity determination report 73-73-143, Inland Manufacturing Co. , General Motors Corporation, Dayton, Ohio. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Ruhe, R.L.

    1974-10-01

    NIOSH conducted a health hazard survey in a boiler room of a steam plant to evaluate worker exposure to coal dust containing silica and fly ash during the boiler clean-up operation. Based on these environmental measurements and on employee interviews, it was determined that the silica containing dusts were not toxic at the concentrations found on this survey. (GRA)

  5. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
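
    A minimal numerical sketch of the hazard-function view of a nonstationary POT series is given below, assuming Poisson arrivals, generalized Pareto exceedance magnitudes and an illustrative 1% per year growth in the GPD scale parameter; all numbers are placeholders rather than the derivations in the paper. It converts yearly exceedance probabilities into a failure-time distribution, a 50-year reliability and a (truncated) mean return period.

```python
import numpy as np
from scipy.stats import genpareto

def exceedance_prob(x, scale, shape, rate_per_year):
    """Annual probability that at least one POT event exceeds level x,
    given a Poisson arrival rate and GPD-distributed exceedance magnitudes."""
    p_event = genpareto.sf(x, c=shape, loc=0.0, scale=scale)
    return 1.0 - np.exp(-rate_per_year * p_event)

# Hypothetical nonstationarity: GPD scale grows 1% per year over a 100-year horizon
years = np.arange(1, 101)
scales = 10.0 * 1.01 ** (years - 1)
p_t = exceedance_prob(x=60.0, scale=scales, shape=0.1, rate_per_year=2.0)

# Failure-time distribution: f(t) = p_t * prod_{s<t}(1 - p_s)
survival = np.cumprod(1.0 - p_t)
f_t = p_t * np.concatenate(([1.0], survival[:-1]))
reliability_50yr = survival[49]
# Expectation truncated at the end of the horizon, so this is a lower bound on E[T]
avg_return_period = np.sum(years * f_t) + years[-1] * survival[-1]
print(f"50-yr reliability ~ {reliability_50yr:.3f}; "
      f"mean return period (lower bound) ~ {avg_return_period:.1f} yr")
```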

  6. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    Science.gov (United States)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate among disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where differences in opinion between response team members contribute to defining the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
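
    A stripped-down numerical analogue of such an event tree is shown below: conditional node probabilities are represented by Beta distributions (standing in for expert judgment plus analogue data) and multiplied along a branch by Monte Carlo sampling, which also propagates the uncertainty. The node structure and Beta parameters are hypothetical, not VDAP's operational trees.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Illustrative event-tree nodes with Beta-distributed conditional probabilities
# (alpha, beta parameters are hypothetical).
p_unrest_is_magmatic = rng.beta(8, 4, N)        # unrest -> magmatic
p_eruption_given_magmatic = rng.beta(5, 7, N)   # magmatic -> eruption
p_explosive_given_eruption = rng.beta(6, 3, N)  # eruption -> explosive eruption

# Probability of the terminal outcome: product of conditionals along the branch
p_branch = p_unrest_is_magmatic * p_eruption_given_magmatic * p_explosive_given_eruption
mean = p_branch.mean()
lo, hi = np.percentile(p_branch, [5, 95])
print(f"P(explosive eruption) ~ {mean:.2f} (90% credible interval {lo:.2f}-{hi:.2f})")
```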

  7. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations, that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  8. AschFlow - A dynamic landslide run-out model for medium scale hazard analysis.

    Science.gov (United States)

    Luna, Byron Quan; Blahut, Jan; van Asch, Theo; van Westen, Cees; Kappes, Melanie

    2015-04-01

    Landslides and debris flow hazard assessments require a scale-dependent analysis in order to mitigate damage and other negative consequences at the respective scales of occurrence. Medium- or large-scale landslide run-out modelling for many possible landslide initiation areas has been a cumbersome task in the past. This arises from the difficulty of precisely defining the location and volume of the released mass and from the inability of the run-out models to compute the displacement for a large number of individual initiation areas (computationally exhaustive). Most of the existing physically based run-out models have complications in handling such situations, and therefore empirical methods have been used as a practical means to predict landslide mobility at a medium scale (1:10,000 to 1:50,000). In this context, a simple medium-scale numerical model for rapid mass movements in urban and mountainous areas was developed. The deterministic nature of the approach makes it possible to calculate the velocity, height and increase in mass by erosion, resulting in the estimation of various forms of impacts exerted by debris flows at the medium scale. The established and implemented model ("AschFlow") is a 2-D one-phase continuum model that simulates the entrainment, spreading and deposition processes of a landslide or debris flow at a medium scale. The flow is thus treated as a single-phase material, whose behavior is controlled by rheology (e.g. Voellmy or Bingham). The developed regional model "AschFlow" was applied and evaluated in well documented areas with known past debris flow events.
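
    For reference, the Voellmy rheology mentioned above combines a Coulomb friction term with a velocity-dependent turbulent drag term. The sketch below evaluates that basal resistance for one set of flow conditions; the parameter values (mu, xi, density) are illustrative and not calibrated to any AschFlow case study.

```python
import numpy as np

def voellmy_resistance(h, v, slope_deg, mu=0.1, xi=500.0, rho=1800.0, g=9.81):
    """Voellmy-type basal flow resistance per unit area (Pa):
    Coulomb friction (mu * normal stress) plus turbulent drag (rho * g * v^2 / xi).
    Parameter values are illustrative only."""
    theta = np.radians(slope_deg)
    return mu * rho * g * h * np.cos(theta) + rho * g * v**2 / xi

# Example: 1.5 m deep flow moving at 6 m/s on a 15 degree slope
print(f"basal resistance ~ {voellmy_resistance(h=1.5, v=6.0, slope_deg=15.0):,.0f} Pa")
```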

  9. Simulating Social and Political Influences on Hazard Analysis through a Classroom Role Playing Exercise

    Science.gov (United States)

    Hales, T. C.; Cashman, K. V.

    2006-12-01

    Geological hazard mitigation is a complicated process that involves both detailed scientific research and negotiations between community members with competing interests in the solution. Geological hazards classes based around traditional lecture methods have difficulty conveying the decision-making processes that go into these negotiations. To address this deficiency, we have spent five years developing and testing a role- playing exercise based on mitigation of a dam outburst hazard on Ruapehu volcano, New Zealand. In our exercise, students are asked to undertake one of five different roles and decide the best way to mitigate the hazard. Over the course of their discussion students are challenged to reach a consensus decision despite the presence of strongly opposed positions. Key to the success of the exercise are (1) the presence of a facilitator and recorder for each meeting, (2) the provision of unique information for each interested party, and (3) the division of the class into multiple meeting groups, such that everyone is required to participate and individual groups can evolve to different conclusions. The exercise can be completed in a single hour and twenty minute classroom session that is divided into four parts: an introduction, a meeting between members of the same interested party to discuss strategy, a meeting between different interested parties, and a debriefing session. This framework can be readily translated to any classroom hazard problem. In our experience, students have responded positively to the use of role-playing to supplement lectures.

  10. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of, and minimize, uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.

  11. Regional-scale analysis of lake outburst hazards in the southwestern Pamir, Tajikistan, based on remote sensing and GIS

    Directory of Open Access Journals (Sweden)

    M. Mergili

    2011-05-01

    Full Text Available This paper presents an analysis of the hazards emanating from the sudden drainage of alpine lakes in South-Western Tajik Pamir. In the last 40 yr, several new lakes have formed in the front of retreating glacier tongues, and existing lakes have grown. Other lakes are dammed by landslide deposits or older moraines. In 2002, sudden drainage of a glacial lake in the area triggered a catastrophic debris flow. Building on existing approaches, a rating scheme was devised allowing quick, regional-scale identification of potentially hazardous lakes and possible impact areas. This approach relies on GIS, remote sensing and empirical modelling, largely based on medium-resolution international datasets. Out of the 428 lakes mapped in the area, 6 were rated very hazardous and 34 hazardous. This classification was used for the selection of lakes requiring in-depth investigation. Selected cases are presented and discussed in order to understand the potentials and limitations of the approach used. Such an understanding is essential for the appropriate application of the methodology for risk mitigation purposes.

  12. GIS-Based Spatial Analysis and Modeling for Landslide Hazard Assessment: A Case Study in Upper Minjiang River Basin

    Institute of Scientific and Technical Information of China (English)

    FENG Wenlan; ZHOU Qigang; ZHANG Baolei; ZHOU Wancun; LI Ainong; ZHANG Haizhen; XIAN Wei

    2006-01-01

    By analyzing the topographic features of past landslides since the 1980s and the main land-cover types (including change information) in the landslide-prone area, the spatial distribution of landslide hazard in the upper Minjiang River Basin was modeled in this paper based on GIS spatial analysis. Results of the GIS analysis showed that landslide occurrence in this region is closely related to topographic features. Most areas with high hazard probability were deeply sheared gorges. Most of the investigated landslides clustered in areas with elevations lower than 3,000 m, due to fragile topographic conditions and intensive human disturbance. Land-cover type, including its change information, was likely an important environmental factor in triggering landslides. Destruction of vegetation, driven by population growth and its demands, increased the probability of landslides on steep slopes.

  13. Utility guidelines for reactor noise analysis: Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sweeney, F.J.

    1987-02-01

    Noise analysis techniques have been extensively utilized to monitor the health and performance of nuclear power plant systems. However, few utilities have adequate programs to effectively utilize these techniques; inadequate programs usually provide low-quality data, which can lead to misinterpretation and false alarms. The objective of this work is to provide utilities and noise analysts with guidelines for data acquisition, data analysis, and interpretation of noise analysis results for surveillance and diagnosis of reactor systems.

  14. Operational Risk Management; An analysis of FSA Final Notices

    OpenAIRE

    van den Aarssen, Daniel

    2013-01-01

    In the last two decades, financial markets have been marked by large-scale financial failures due to incompetence and fraud, such as Barings, Daiwa, Allied Irish Banks, UBS, Société Génerale, and more recently JP Morgan. While previous research has focused on market and credit risk, and where it has addressed operational risk it has concentrated on the market reaction to operational losses, the current research addresses the root of the problem. It explores the final...

  15. Ground landslide hazard potency using geoelectrical resistivity analysis and VS30, case study at geophysical station, Lembang, Bandung

    Science.gov (United States)

    Rohadi, Supriyanto; Sakya, Andi Eka; Masturyono, Murjaya, Jaya; Sunardi, Bambang; Rasmid, Ngadmanto, Drajat; Susilanto, Pupung; Nugraha, Jimmi; Pakpahan, Suliyanti

    2017-07-01

    We have conducted a geoelectric resistivity and shear wave velocity (Vs30) study to identify the potential landslide hazard around Geophysics Station Lembang, Bandung (107.617° E, 6.825° S). The geoelectric analysis used a dipole-dipole resistivity configuration, while the shear wave velocity analysis was performed using the Multichannel Analysis of Surface Waves (MASW). The study results indicate that the soil or clay depth inferred from the electrical resistivity observations is in accordance with the soil or clay depth confirmed by the MASW investigation. These conditions indicate a high landslide potential in this area, further supported by the high slope angle.

  16. International collaboration towards a global analysis of volcanic hazards and risk

    Science.gov (United States)

    Loughlin, Susan; Duncan, Melanie; Volcano Model Network, Global

    2017-04-01

    Approximately 800 million people live within 100km of an active volcano and such environments are often subject to multiple natural hazards. Volcanic eruptions and related volcanic hazards are less frequent than many other natural hazards but when they occur they can have immediate and long-lived impacts so it is important that they are not overlooked in a multi-risk assessment. Based on experiences to date, it's clear that natural hazards communities need to address a series of challenges in order to move to a multi-hazard approach to risk assessment. Firstly, the need to further develop synergies and coordination within our own communities at local to global scales. Secondly, we must collaborate and identify opportunities for harmonisation across natural hazards communities: for instance, by ensuring our databases are accessible and meet certain standards, a variety of users will be then able to contribute and access data. Thirdly, identifying the scale and breadth of multi-risk assessments needs to be co-defined with decision-makers, which will constrain the relevant potential cascading/compounding hazards to consider. Fourthly, and related to all previous points, multi-risk assessments require multi-risk knowledge, requiring interdisciplinary perspectives, as well as discipline specific expertise. The Global Volcano Model network (GVM) is a growing international network of (public and private) institutions and organisations, which have the collective aim of identifying and reducing volcanic risks. GVM's values embody collaboration, scientific excellence, open-access (wherever possible) and, above all, public good. GVM highlights and builds on the best research available within the volcanological community, drawing on the work of IAVCEI Commissions and other research initiatives. It also builds on the local knowledge of volcano observatories and collaborating scientists, ensuring that global efforts are underpinned by local evidence. Some of GVM's most

  17. Flood hazards analysis based on changes of hydrodynamic processes in fluvial systems of Sao Paulo, Brazil.

    Science.gov (United States)

    Simas, Iury; Rodrigues, Cleide

    2016-04-01

    The metropolis of Sao Paulo, with its 7,940 km² and over 20 million inhabitants, is increasingly being consolidated with disregard for the dynamics of its fluvial systems and the natural limitations imposed by fluvial terraces, floodplains and slopes. Events such as floods and flash floods became particularly persistent, mainly in socially and environmentally vulnerable areas. The Aricanduva River basin was selected as the ideal area for the development of the flood hazard analysis since it presents the main geological and geomorphological features found in the urban site. According to studies carried out under the Anthropic Geomorphology approach in São Paulo, studying this phenomenon requires taking into account the original hydromorphological systems and their functional conditions, as well as the extent to which the anthropic factor changes the balance between the main variables of surface processes. Considering those principles, an alternative geographical data model was proposed, which enabled identification of the role of different driving forces in the spatial conditioning of certain flood events. Spatial relationships between different variables, such as anthropogenic and original morphology, were analyzed for that purpose in addition to climate data. The surface hydrodynamic tendency spatial model conceived for this study takes as key variables: (1) the land use present at the observed date combined with the predominant lithological group, represented by a value ranging from 0 to 100, based on indexes of the National Soil Conservation Service (NSCS-USA) and the Hydraulic Technology Center Foundation (FCTH-Brazil), to determine the resulting balance of runoff/infiltration; (2) the original slope, applying thresholds from which it is possible to determine a greater tendency for runoff (in percent); (3) the minimal features of relief, combining the curvature of the surface in plan and profile. Those three key variables were combined in a Geographic Information System in a series of

  18. Use of remote sensing and seismotectonic parameters for seismic hazard analysis of Bangalore

    Directory of Open Access Journals (Sweden)

    T. G. Sitharam

    2006-01-01

    Full Text Available Deterministic Seismic Hazard Analysis (DSHA) for Bangalore, India has been carried out by considering past earthquakes, assumed subsurface fault rupture lengths and a point-source synthetic ground motion model. The sources have been identified using satellite remote sensing images, the seismotectonic atlas map of India and relevant field studies. The Maximum Credible Earthquake (MCE) has been determined by considering the regional seismotectonic activity within a radius of about 350 km around Bangalore. The seismotectonic map has been prepared by considering the faults, lineaments and shear zones in the area and more than 470 past moderate earthquake events with moment magnitudes of 3.5 and above. In addition, about 1,300 earthquake tremors with moment magnitudes of less than 3.5 have been considered for the study. The shortest distance from Bangalore to each of the sources is measured, and the Peak Horizontal Acceleration (PHA) is then calculated for the different sources and event moment magnitudes using a regional attenuation relation for peninsular India. Based on the Wells and Coppersmith (1994) relationship, a subsurface fault rupture length of about 3.8% of the total fault length was shown to match past earthquake events in the area. To simulate synthetic ground motions, the Boore (1983, 2003) SMSIM programs have been used and the PHA for the different locations is evaluated. From the above approaches, a PHA of 0.15 g was established. This value was obtained for a maximum credible earthquake having a moment magnitude of 5.1 on the Mandya-Channapatna-Bangalore lineament source. This particular source has been identified as a vulnerable source for Bangalore. From this study, it is very clear that the Bangalore area can be described as a seismically moderately active region. It is also recommended that the southern part of Karnataka, in particular Bangalore, Mandya and Kolar, needs to be upgraded from the current Indian Seismic Zone II to Seismic Zone III
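
    The rupture-length step described above can be illustrated with the commonly quoted Wells and Coppersmith (1994) subsurface-rupture-length regression. The fault lengths below are hypothetical, and the regression coefficients are given only as they are usually cited, so they should be checked against the original paper before any real use.

```python
import numpy as np

# Hypothetical lineament lengths near a site (km); the 3.8% subsurface rupture
# fraction follows the approach described in the abstract.
fault_lengths_km = {"Lineament A": 105.0, "Lineament B": 130.0, "Lineament C": 52.0}
RUPTURE_FRACTION = 0.038

for name, length_km in fault_lengths_km.items():
    rld = RUPTURE_FRACTION * length_km
    # Wells & Coppersmith (1994), subsurface rupture length, all slip types (as commonly quoted):
    # Mw = 4.38 + 1.49 * log10(RLD [km])
    mw = 4.38 + 1.49 * np.log10(rld)
    print(f"{name}: rupture length ~ {rld:.1f} km -> Mw ~ {mw:.1f}")
```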

  19. Assessing the long-term probabilistic volcanic hazard for tephra fallout in Reykjavik, Iceland: a preliminary multi-source analysis

    Science.gov (United States)

    Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús

    2015-04-01

    Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered a major volcanic hazard in Iceland for their high frequency and potentially very devastating local impact. Tephra fallout is also frequent and can impact larger areas. It is driven by the wind direction, which can change with both altitude and season, making it impossible to predict a priori where the tephra will be deposited during the next eruptions. Most of the volcanic activity in Iceland occurs in the central eastern part, over 100 km to the east of the main population centre around the capital Reykjavík. Therefore, the hazard from tephra fallout in Reykjavík is expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as large urban areas or important infrastructure. In this work we present the preliminary analysis aiming to perform a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout focused on the target area which includes the municipality of Reykjavík and the Keflavík international airport. This approach reverses the more common perspective in which the hazard analysis is focused on the source (the volcanic system), and it follows a multi-source approach: indeed, the idea is to quantify, homogeneously, the hazard due to the main hazardous volcanoes that could pose a tephra fallout threat to the municipality of Reykjavík and the Keflavík airport. PVHA for each volcanic system is calculated independently and the results

  20. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding ''A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)'' is also included. (WHK)

  2. Development of a risk-analysis model. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    This report consists of a main body, which provides a presentation of risk analysis and its general and specific application to the needs of the Office of Buildings and Community Systems of the Department of Energy, and several case studies employing the risk-analysis model developed. The highlights include a discussion of how risk analysis is currently used in the private, regulated, and public sectors and how this methodology can be employed to meet the policy-analysis needs of the Office of Buildings and Community Systems of the Department of Energy (BCS/DOE). After a review of the primary methodologies available for risk analysis, it was determined that Monte Carlo simulation techniques provide the greatest degree of visibility into uncertainty in the decision-making process. Although the data-collection requirements can be demanding, the benefits, when compared to other methods, are substantial. The data-collection problem can be significantly reduced, without sacrificing proprietary-information rights, if prior arrangements are made with RD and D contractors to provide responses to reasonable requests for base-case data. A total of three case studies were performed on BCS technologies: a gas-fired heat pump; a 1000 ton/day anaerobic digestion plant; and a district heating and cooling system. The three case studies plus the risk-analysis methodology were issued as separate reports. It is concluded, based on the overall research of risk analysis and the case-study experience, that the risk-analysis methodology has significant potential as a policy-evaluation tool within BCS.
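
    To give a flavour of the Monte Carlo approach recommended in the report, the sketch below propagates uncertain capital cost, annual savings and lifetime through a net-present-value calculation and summarizes the resulting distribution. The distributions and figures are invented for illustration and do not come from the case studies.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Hypothetical uncertain inputs for a community energy technology (all distributions illustrative)
capital_cost = rng.triangular(0.8e6, 1.0e6, 1.5e6, N)   # $
annual_savings = rng.normal(150_000, 30_000, N)          # $/yr
lifetime_years = rng.integers(10, 21, N)                  # yr (10..20)
discount_rate = 0.07

# Net present value of each simulated outcome
annuity_factor = (1 - (1 + discount_rate) ** -lifetime_years) / discount_rate
npv = annual_savings * annuity_factor - capital_cost

print(f"Mean NPV: ${npv.mean():,.0f}")
print(f"P(NPV < 0): {(npv < 0).mean():.2%}")
print(f"5th-95th percentile: ${np.percentile(npv, 5):,.0f} to ${np.percentile(npv, 95):,.0f}")
```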

  3. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Science.gov (United States)

    2010-01-01

    ... fully loaded propellant storage tanks or pressurized motor segments. (vii) Worst case combustion or... of each accident experienced by the launch operator involving the release of a toxic propellant; and..., including the launch operator's ground safety plan, hazard area surveillance and clearance plan,...

  4. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation.

    Science.gov (United States)

    Liu, Xiang; Saat, Mohd Rapik; Barkan, Christopher P L

    2014-07-15

    Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. By contrast to truck accidents, it is possible that a train accident may involve multiple hazardous materials cars derailing and releasing contents with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train and tank car safety design. The effect of train speed, tank car safety design and tank car positions in a train were evaluated regarding the number of cars that release their contents in a derailment. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrences. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail. Copyright © 2014 Elsevier B.V. All rights reserved.
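
    The kind of probabilistic model described above can be approximated with a simple Monte Carlo sketch: sample the position of the first car derailed and the number of cars derailed, count the tank cars in the derailed block, and draw releases per derailed tank car. The distributional choices and parameter values below are illustrative assumptions, not the calibrated model of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_releases(n_sims, train_length, tank_positions, mean_cars_derailed, p_release):
    """Monte Carlo sketch of the number of tank cars releasing in a derailment.

    Assumptions (all illustrative): the first car derailed is uniform over the train,
    the number of cars derailed is geometric with the given mean, and each derailed
    tank car releases independently with probability p_release.
    """
    tank_positions = np.asarray(sorted(tank_positions))
    releases = np.empty(n_sims, dtype=int)
    for i in range(n_sims):
        first = rng.integers(1, train_length + 1)
        n_derailed = rng.geometric(1.0 / mean_cars_derailed)
        last = min(train_length, first + n_derailed - 1)
        derailed_tanks = int(((tank_positions >= first) & (tank_positions <= last)).sum())
        releases[i] = rng.binomial(derailed_tanks, p_release)
    return releases

# 100-car train with 20 tank cars in positions 41-60, illustrative parameters
rel = simulate_releases(50_000, 100, range(41, 61), mean_cars_derailed=8.0, p_release=0.25)
vals, counts = np.unique(rel, return_counts=True)
for v, c in zip(vals[:5], counts[:5]):
    print(f"P({v} cars release) ~ {c / len(rel):.3f}")
```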

  5. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2010-07-14

    ... procedures to fruit and vegetable juice processing. HACCP is a preventative system of hazard control that can be used by all food processors to ensure the safety of their products to consumers. A HACCP system of... and recordkeeping are essential parts of any HACCP system. The information collection requirements...

  6. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2013-11-20

    ... principles to the processing of fruit and vegetable juices. HACCP is a preventive system of hazard control... regulations for the efficient enforcement of that act. The rationale in establishing an HACCP system of... development and recordkeeping are essential parts of any HACCP system. The information collection...

  7. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiang, E-mail: liu94@illinois.edu; Saat, Mohd Rapik, E-mail: mohdsaat@illinois.edu; Barkan, Christopher P.L., E-mail: cbarkan@illinois.edu

    2014-07-15

    Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. By contrast to truck accidents, it is possible that a train accident may involve multiple hazardous materials cars derailing and releasing contents with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train and tank car safety design. The effect of train speed, tank car safety design and tank car positions in a train were evaluated regarding the number of cars that release their contents in a derailment. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrences. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail.

  8. Final recommendations for reference materials in black carbon analysis

    Science.gov (United States)

    Schmidt, Michael W. I.; Masiello, Caroline A.; Skjemstad, Jan O.

    Last summer, a symposium was held to discuss aspects of global biogeochemical cycles, including organic matter cycling in soils, rivers, and marine environments; black carbon particle fluxes and the biological pump; dissolved organic matter; and organic matter preservation. Seventy scientists from various disciplines, including oceanography, soil science, geology, and chemistry attended the 3-day meeting at the Friday Harbor Laboratories, a research station of the University of Washington.“New Approaches in Marine Organic Biogeochemistry” commemorated the life and science of a colleague and friend, John I. Hedges, who was also involved in several groups developing chemical reference materials. Part of this symposium included a workshop on chemical reference materials, where final recommendations of the Steering Committee for Black Carbon Reference Materials were presented.

  9. Systems Analysis of NASA Aviation Safety Program: Final Report

    Science.gov (United States)

    Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen

    2013-01-01

    A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe was conducted by using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.

  10. On the predictive information criteria for model determination in seismic hazard analysis

    Science.gov (United States)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given the parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, in which the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above mentioned criteria are global summary measures of model performance, but more detailed analysis could be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed the Bayesian analysis of Italian data sets by four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate the results of their performance evaluation using the Bayes factor, predictive information criteria and retrospective predictive analysis.
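
    As an example of the predictive information criteria discussed above, the sketch below computes WAIC from a matrix of pointwise log-likelihoods evaluated at posterior draws (the quantity whose applicability under dependence is questioned in this record). The toy Poisson model and the stand-in posterior draws are assumptions for illustration only, not the stress release models analysed in the study.

```python
import numpy as np
from scipy.stats import poisson

def waic(log_lik):
    """Widely Applicable Information Criterion from an (S draws x N observations)
    matrix of pointwise log-likelihoods evaluated at posterior samples."""
    S = log_lik.shape[0]
    # log pointwise predictive density, computed stably with log-sum-exp
    lppd = np.sum(np.logaddexp.reduce(log_lik, axis=0) - np.log(S))
    # effective number of parameters: posterior variance of the pointwise log-likelihood
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

# Illustrative use with fake posterior samples of a Poisson rate and a small data set
rng = np.random.default_rng(3)
y = rng.poisson(4.0, size=30)                         # "observed" counts
lam_draws = rng.gamma(40, 0.1, size=2000)             # stand-in posterior draws of the rate
log_lik = poisson.logpmf(y[None, :], lam_draws[:, None])   # shape (2000, 30)
print("WAIC:", round(waic(log_lik), 1))
```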

  11. Treatment of Uncertainties in Probabilistic Tsunami Hazard

    Science.gov (United States)

    Thio, H. K.

    2012-12-01

    Over the last few years, we have developed a framework for producing probabilistic tsunami inundation maps, which includes comprehensive quantification of earthquake recurrence as well as uncertainties, and applied it to the development of a tsunami hazard map of California. The various uncertainties in tsunami source and propagation models are an integral part of a comprehensive probabilistic tsunami hazard analysis (PTHA), and often drive the hazard at low probability levels (i.e. long return periods). There is no unique manner in which uncertainties are included in the analysis, although in general we distinguish between "natural" or aleatory variability, such as slip distribution and event magnitude, and uncertainties due to an incomplete understanding of the behavior of the earth, called epistemic uncertainties, such as scaling relations and rupture segmentation. Aleatory uncertainties are typically included through integration over distribution functions based on regression analyses, whereas epistemic uncertainties are included using logic trees. We will discuss how the different uncertainties were included in our recent probabilistic tsunami inundation maps for California, and their relative importance to the final results. Including these uncertainties in offshore exceedance waveheights is straightforward, but the problem becomes more complicated once the non-linearity of near-shore propagation and inundation is encountered. By using the probabilistic offshore waveheights as the input level for the inundation models, the uncertainties up to that point can be included in the final maps. PTHA provides a consistent analysis of tsunami hazard and will become an important tool in diverse areas such as coastal engineering and land use planning. The inclusive nature of the analysis, where few assumptions are made a priori as to which sources are significant, means that a single analysis can provide a comprehensive view of the hazard and its dominant sources
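
    As an illustration of how the two kinds of uncertainty enter a hazard curve, the sketch below combines hypothetical offshore exceedance-rate curves from three epistemic logic-tree branches into a weighted mean curve and fractiles; the rates, weights and the Poisson exposure calculation are illustrative assumptions, not values from the California study.

```python
import numpy as np

# hypothetical offshore exceedance curves (annual rate of exceeding each wave height)
# from three epistemic logic-tree branches, e.g. alternative scaling relations
wave_height = np.array([0.5, 1.0, 2.0, 3.0, 5.0])            # metres
branch_rates = np.array([[2e-2, 8e-3, 2e-3, 6e-4, 8e-5],
                         [3e-2, 1e-2, 3e-3, 9e-4, 1e-4],
                         [1e-2, 5e-3, 1e-3, 3e-4, 4e-5]])
branch_weights = np.array([0.5, 0.3, 0.2])                    # must sum to 1

mean_rate = branch_weights @ branch_rates                     # weighted-mean hazard curve
fractiles = np.quantile(branch_rates, [0.16, 0.5, 0.84], axis=0)  # epistemic spread

# probability of at least one exceedance in a 50-year exposure (Poisson assumption)
p_50yr = 1.0 - np.exp(-mean_rate * 50.0)
print(dict(zip(wave_height, p_50yr)))
```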

  12. Hazard categorization for 300 area N reactor fuel fabrication and storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Brehm, J.R., Fluor Daniel Hanford

    1997-02-12

    A final hazard categorization has been prepared for the 300 Area Fuel Supply Shutdown (FSS) facility in accordance with DOE-STD-1027-92, ''Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports'' (DOE 1992). Prior to using the hazard category methodology, hazard classifications were prepared in accordance with the requirements of the Westinghouse Hanford Company (Westinghouse Hanford) controlled manual, WHC-CM-4-46, ''Safety Analysis Manual'', Chapter 4.0, ''Hazard Classification.'' A hazard classification (Huang 1995) was previously prepared for the FSS in accordance with WHC-CM-4-46. That analysis led to the conclusion that the FSS should be declared a nuclear facility with a Moderate Hazard Class rating. The analysis and results contained in the hazard classification can be used to provide additional information to support other safety analysis documentation. Also, the hazard classification provides analyses of the toxicological hazards inherent in the FSS inventory, whereas a hazard categorization prepared in accordance with DOE-STD-1027-92 considers only the radiological component of the inventory.

  13. Commercial building systems analysis. Final report, January 1988-July 1989

    Energy Technology Data Exchange (ETDEWEB)

    Glazer, J.; Henninger, R.H.

    1991-07-01

    The report describes the methodology used for conducting an economic analysis of Gas Heat Pumps (GHP's) and competing space conditioning equipment in the light commercial range. The economic analysis began by obtaining equipment installed costs and determining the yearly energy usage for each type of space conditioning equipment applied to a small office building and quick service restaurant in 17 cities. The installed costs of competing technologies were obtained via a survey in various cities throughout the United States. The yearly energy costs for this equipment and GHP's were calculated by using the DOE-2.1C Hourly Energy Analysis Computer Program. The DOE-2 program was modified specifically to simulate the complexity of gas heat pumps by incorporating special FORTRAN algorithms. Performance curves were developed and included in the GHP computer model based on empirical data. An economic analysis was conducted comparing GHP's and competing equipment using both payback and net present value methods which included an examination of the effect of the change in future utility costs. The results of the analysis are not included in the report because they are based on proprietary performance information.
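
    The economic comparison described above rests on two standard calculations, simple payback and net present value with escalating utility costs; a minimal sketch is given below (all costs, savings and rates are hypothetical, not the proprietary results withheld from the report).

```python
def simple_payback(extra_installed_cost, annual_energy_savings):
    """Years needed for energy-cost savings to repay the installed-cost premium."""
    return extra_installed_cost / annual_energy_savings

def npv(extra_installed_cost, annual_energy_savings, years, discount_rate, escalation=0.0):
    """Net present value of paying a premium for the more efficient option,
    with utility-cost savings optionally escalating each year."""
    value = -extra_installed_cost
    for t in range(1, years + 1):
        value += annual_energy_savings * (1.0 + escalation) ** t / (1.0 + discount_rate) ** t
    return value

# hypothetical comparison: a $6,000 gas heat pump premium saving $900/yr over 15 years
print(simple_payback(6000.0, 900.0))
print(npv(6000.0, 900.0, years=15, discount_rate=0.08, escalation=0.03))
```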

  14. An Earthquake Source Ontology for Seismic Hazard Analysis and Ground Motion Simulation

    Science.gov (United States)

    Zechar, J. D.; Jordan, T. H.; Gil, Y.; Ratnakar, V.

    2005-12-01

    Representation of the earthquake source is an important element in seismic hazard analysis and earthquake simulations. Source models span a range of conceptual complexity - from simple time-independent point sources to extended fault slip distributions. Further computational complexity arises because the seismological community has established many source description formats and variations thereof, which means that conceptually equivalent source models are often expressed in different ways. Despite the resultant practical difficulties, there exists a rich semantic vocabulary for working with earthquake sources. For these reasons, we feel it is appropriate to create a semantic model of earthquake sources using an ontology, a computer science tool from the field of knowledge representation. Unlike the domain of most ontology work to date, earthquake sources can be described by a very precise mathematical framework. Another distinctive aspect of developing such an ontology is that earthquake sources are often used as computational objects. A seismologist generally wants more than to simply construct a source and have it be well-formed and properly described; additionally, the source will be used for performing calculations. Representation and manipulation of complex mathematical objects present a challenge to the ontology development community. In order to enable simulations involving many different types of source models, we have completed preliminary development of a seismic point source ontology. The use of an ontology to represent knowledge provides machine interpretability and the ability to validate logical consistency and completeness. Our ontology, encoded using the OWL Web Ontology Language - a standard from the World Wide Web Consortium, contains the conceptual definitions and relationships necessary for source translation services. For example, specification of strike, dip, rake, and seismic moment will automatically translate into a double
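
    As a concrete example of the kind of source translation the ontology is meant to support, the sketch below converts strike, dip, rake and scalar moment into a double-couple moment tensor using the standard Aki & Richards (north-east-down) convention; it is an independent illustration, not code from the ontology project.

```python
import numpy as np

def double_couple_moment_tensor(strike, dip, rake, m0):
    """Moment tensor (N, E, Down axes, Aki & Richards convention) for a
    double-couple source given strike/dip/rake in degrees and scalar moment m0."""
    phi, delta, lam = np.radians([strike, dip, rake])
    sd, cd = np.sin(delta), np.cos(delta)
    s2d, c2d = np.sin(2 * delta), np.cos(2 * delta)
    sl, cl = np.sin(lam), np.cos(lam)
    sp, cp = np.sin(phi), np.cos(phi)
    s2p, c2p = np.sin(2 * phi), np.cos(2 * phi)
    mxx = -m0 * (sd * cl * s2p + s2d * sl * sp ** 2)
    mxy =  m0 * (sd * cl * c2p + 0.5 * s2d * sl * s2p)
    mxz = -m0 * (cd * cl * cp + c2d * sl * sp)
    myy =  m0 * (sd * cl * s2p - s2d * sl * cp ** 2)
    myz = -m0 * (cd * cl * sp - c2d * sl * cp)
    mzz =  m0 * s2d * sl
    return np.array([[mxx, mxy, mxz],
                     [mxy, myy, myz],
                     [mxz, myz, mzz]])

# example: a vertical strike-slip source of roughly Mw 6.0 (m0 ~ 1.26e18 N*m)
print(double_couple_moment_tensor(35.0, 90.0, 0.0, 1.26e18))
```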

  15. Risk prediction of Critical Infrastructures against extreme natural hazards: local and regional scale analysis

    Science.gov (United States)

    Rosato, Vittorio; Hounjet, Micheline; Burzel, Andreas; Di Pietro, Antonio; Tofani, Alberto; Pollino, Maurizio; Giovinazzi, Sonia

    2016-04-01

    Natural hazard events can induce severe impacts on the built environment; they can hit wide and densely populated areas, where there are large numbers of (inter)dependent technological systems whose damage could cause the failure or malfunctioning of further services, spreading the impacts over wider geographical areas. The EU project CIPRNet (Critical Infrastructures Preparedness and Resilience Research Network) is realizing an unprecedented Decision Support System (DSS) which enables operational risk prediction for Critical Infrastructures (CI) by predicting the occurrence of natural events (from long-term weather forecasts to short-term nowcasts), correlating the intrinsic vulnerabilities of CI elements with the strengths of the different events, and analysing the resulting Damage Scenario. The Damage Scenario is then transformed into an Impact Scenario, in which punctual CI element damages are transformed into micro-scale (local area) or meso-scale (regional) service outages. At the smaller scale, the DSS simulates detailed city models (where CI dependencies are explicitly accounted for) that provide important input for crisis management organizations, whereas at the regional scale, by using an approximate system-of-systems model describing systemic interactions, the focus is on raising awareness. The DSS has allowed the development of a novel simulation framework for predicting the shake maps originating from a given seismic event, considering the shock-wave propagation in inhomogeneous media and the subsequently produced damages by estimating building vulnerabilities on the basis of a phenomenological model [1, 2]. Moreover, in the presence of areas containing river basins, when abundant precipitation is expected, the DSS solves the hydrodynamic 1D/2D models of the river basins to predict the flux runoff and the corresponding flood dynamics. This calculation allows the estimation of the Damage Scenario and triggers the evaluation of the Impact Scenario

  16. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Torrential processes like flooding, heavy bedload transport or debris flows in steep mountain channels emerge during intense, highly localized rainfall events. They pose a serious risk to the densely populated Alpine region. Hydrogeomorphic hazards are profoundly nonlinear, threshold-mediated phenomena that frequently cause costly damage to infrastructure and people. Thus, in the context of climate change, there is ever-rising interest in whether sediment cascades of small alpine catchments react to changing precipitation patterns and how the climate signal is propagated through the fluvial system. We intend to answer the following research questions: (i) What are the critical meteorological characteristics triggering torrential events in the Eastern Alps of Austria? (ii) The effect of external triggers is strongly mediated by the internal disposition of catchments to respond; which factors control this internal susceptibility? (iii) Do torrential processes show an increase in magnitude and frequency or a shift in seasonality in the recent past? (iv) Which future changes can be expected under different climate scenarios? Quantifications of bedload transport in small alpine catchments are rare and often associated with high uncertainties. Detailed knowledge, though, exists for the Schöttlbach catchment, a 71 km2 study area in Styria in the Eastern Alps. The torrent has been monitored since a heavy precipitation event resulted in a disastrous flood in July 2011. Sediment mobilisation from slopes as well as within-channel storage and fluxes are regularly measured by photogrammetric methods and sediment impact sensors (SIS). The associated hydro-meteorological conditions are known from a dense station network. Changing states of connectivity can thus be related to precipitation and internal dynamics (sediment availability, cut-and-fill cycles). The site-specific insights are then conceptualized for application to a broader scale. Therefore, a Styria-wide database of torrential

  17. Debris-flow susceptibility and hazard assessment at a regional scale from GIS analysis

    Science.gov (United States)

    Bertrand, M.; Liébault, F.; Piégay, H.

    2012-12-01

    Small torrents of the Southern French Alps are prone to extreme events. Depending on the rainfall conditions, the sediment supply from hillslopes, and the gravitational energy, these events can take different forms, from floods to debris flows. Debris flows are recognized as the most dangerous phenomena and may have dramatic consequences for exposed people and infrastructure. As a first step of hazard assessment, we evaluated debris-flow susceptibility, i.e. the likelihood that an event occurs in an area under particular physical conditions, not including the temporal dimension. The susceptibility is determined by (i) the morphometric controls of small upland catchments on debris-flow triggering and propagation, and by (ii) sediment supply conditions, i.e. erosion patterns feeding the channels. The morphometric controls are evaluated with indicators calculated from basic topographic variables. The sediment supply is evaluated by considering the cumulated surface of erosion areas connected to the hydrographic network. We developed a statistical model to predict the geomorphic responses of the catchments (fluvial vs. debris-flow) and we applied this model within a GIS for regional-scale prediction. The model is based on two morphometric indicators, i.e. fan/channel slope and the Melton ruggedness index, and draws on a wide data set covering the Southern French Alps. We developed a GIS procedure to extract the indicators automatically using a 25 m DEM and the hydrographic network as raw data. This model and its application have been validated with historical data. Sediment sources feeding debris-flow prone torrents are identified by first automatically mapping erosion patches from infrared orthophoto analysis and then identifying those connected to the stream network. A classification method has been developed (segmentation into homogeneous objects classified with a neural network algorithm) and validated with expert interpretation on the
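
    A minimal sketch of the two morphometric ingredients named above (the Melton ruggedness index and a fan-slope screening rule) is given below; the threshold values are illustrative assumptions, not the calibrated values of the cited statistical model.

```python
import numpy as np

def melton_ruggedness(dem, cell_size):
    """Melton ruggedness index of a catchment: relief divided by the square root
    of drainage area, computed from a DEM clipped to the catchment (NaN outside)."""
    relief = np.nanmax(dem) - np.nanmin(dem)                   # metres
    area = np.count_nonzero(~np.isnan(dem)) * cell_size ** 2   # m^2
    return relief / np.sqrt(area)

def likely_debris_flow(melton, fan_slope_deg, melton_threshold=0.5, slope_threshold=4.0):
    """Crude morphometric screening rule: steep fans in rugged catchments are
    flagged as debris-flow prone (thresholds here are illustrative only)."""
    return melton > melton_threshold and fan_slope_deg > slope_threshold

# toy catchment: synthetic 25 m DEM
rng = np.random.default_rng(0)
dem = 1200.0 + rng.random((200, 200)) * 900.0
print(melton_ruggedness(dem, 25.0), likely_debris_flow(melton_ruggedness(dem, 25.0), 6.0))
```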

  18. Data sources and methods for industrial energy analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-08-01

    Following an introductory and overview section of industrial energy-use patterns, Section II of this report describes a number of the major industrial-energy-use data bases often used to analyze industrial energy use. Section III gives the results of an analysis which used a number of energy and industrial-location data bases to estimate plant-specific energy use in ten of the largest energy-using industries. The section summarizes the results of the analysis and discusses the implications of the energy use per plant distributions for the industrial market for high- and low-Btu coal gasification and coal liquefaction. Section IV outlines a methodology for segmenting the industrial energy market and evaluating the competitiveness of low- and medium-Btu gas relative to other alternatives. The methodology demonstrates the uses of the industrial energy data bases in performing market penetration analysis.

  19. Hazard Identification and Risk Assessment of Health and Safety Approach JSA (Job Safety Analysis) in Plantation Company

    Science.gov (United States)

    Sugarindra, Muchamad; Ragil Suryoputro, Muhammad; Tiya Novitasari, Adi

    2017-06-01

    The plantation company needed to identify hazards and perform a health and safety risk assessment, which was approached using JSA (Job Safety Analysis). The identification was aimed at the potential hazards that might pose a risk of workplace accidents, so that preventive action could be taken to minimize accidents. The data were collected by direct observation of the workers concerned, and the results were recorded on a Job Safety Analysis form. The jobs assessed were forklift operator, macerator worker, creeper worker, shredder worker, workshop worker, mechanical line worker, trolley cleaning worker, and crepe decline worker. The results showed that the shredder job had a risk value of 30, placing it at the extreme risk level (risk values above 20). To minimize accidents, the company should provide appropriate Personal Protective Equipment (PPE), provide information about health and safety, supervise the activities of workers, and reward workers who obey the rules that apply in the plantation.
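
    For illustration, the risk-scoring step of a JSA worksheet of this kind can be sketched as a likelihood-times-consequence product banded into levels; the ratings and band limits below are assumptions, not the company's actual matrix.

```python
def risk_level(likelihood, consequence):
    """Band a likelihood x consequence product into risk levels (illustrative bands)."""
    value = likelihood * consequence
    if value > 20:
        return value, "extreme"
    if value > 12:
        return value, "high"
    if value > 6:
        return value, "moderate"
    return value, "low"

print(risk_level(likelihood=6, consequence=5))   # (30, 'extreme'), cf. the shredder job
```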

  20. The impact of overlapping processes on rockfall hazard analysis - the Bolonia Bay study (southern Spain)

    Science.gov (United States)

    Fernandez-Steeger, T.; Grützner, C.; Reicherter, K.; Braun, A.; Höbig, N.

    2009-04-01

    from the described investigation show that on a screening and planning level the results of the empirical methods are quite good. Especially for numerical simulation, where back analysis is common to parameterize the models, the identification of "ideal" rockfalls is essential for a good simulation performance and subsequently for an appropriate planning of protection measures. References Corominas, J. 1996. The angle of reach as a mobility index for small and large landslides. Canadian Geotechnical Journal, 33, 260 - 271. Dorren, L.K. 2003. A review of rockfall mechanics and modeling approaches. Progress in Physical Geography, 27 (1), 69 - 87. Evans, S. & Hungr, O. 1993. The assessment of rockfall hazard at the base of talus slopes. Canadian Geotechnical Journal, 30, 620 - 636. Heim, A. 1932. Bergsturz und Menschenleben. Vjschr. d. Naturforsch Ges. Zürich, 216 pp. Silva P.G., Reicherter K., Grützner C., Bardají T., Lario J., Goy J.L., Zazo C., & Becker-Heidmann P. 2009. Surface and subsurface paleoseismic records at the ancient Roman city of Baelo Claudia and the Bolonia Bay area, Cádiz (South Spain). Geol Soc of London Spec. Vol.: Paleoseismology: Historical and prehistorical records of earthquake ground effects for seismic hazard assessment. In press. Spang, R. M. & Sonser, Th. 1995. Optimized rockfall protection by "ROCKFALL". Proc 8th Int Congress Rock Mechanics, 3, 1233-1242.

  1. Photovoltaic venture analysis. Final report. Volume I. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    The objective of the study, government programs under investigation, and a brief review of the approach are presented. Potential markets for photovoltaic systems relevant to the study are described. The response of the photovoltaic supply industry is then considered. A model which integrates the supply and demand characteristics of photovoltaics over time was developed. This model also calculates the economic benefits associated with various government subsidy programs. Results are derived under alternative possible supply, demand, and macroeconomic conditions. A probabilistic analysis of the costs and benefits of a $380 million federal photovoltaic procurement initiative, as well as certain alternative strategies, is summarized. Conclusions and recommendations based on the analysis are presented.

  2. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant.

    Science.gov (United States)

    Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen

    2015-09-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving the production management. Copyright © 2015. Published by Elsevier B.V.

  3. Final Report: Hydrogen Production Pathways Cost Analysis (2013 – 2016)

    Energy Technology Data Exchange (ETDEWEB)

    James, Brian David [Strategic Analysis Inc., Arlington, VA (United States); DeSantis, Daniel Allan [Strategic Analysis Inc., Arlington, VA (United States); Saur, Genevieve [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-30

    This report summarizes work conducted under a three-year Department of Energy (DOE)-funded project awarded to Strategic Analysis, Inc. (SA) to analyze multiple hydrogen (H2) production technologies and project their corresponding levelized production cost of H2. The analysis was conducted using the H2A Hydrogen Analysis Tool developed by the DOE and the National Renewable Energy Laboratory (NREL). The project was led by SA but conducted in close collaboration with NREL and Argonne National Laboratory (ANL). In-depth techno-economic analyses (TEA) of five different H2 production methods were conducted. These TEAs developed projections for capital costs, fuel/feedstock usage, energy usage, indirect capital costs, land usage, labor requirements, and other parameters for each H2 production pathway, and used the resulting cost and system parameters as inputs into the H2A discounted cash flow model to project the production cost of H2 ($/kgH2). Five technologies were analyzed as part of the project and are summarized in this report: proton exchange membrane (PEM) electrolysis, high-temperature solid oxide electrolysis cell (SOEC) technology, dark fermentation of biomass for H2 production, H2 production via monolithic piston-type reactors with rapid swing reforming and regeneration reactions, and Reformer-Electrolyzer-Purifier (REP) technology developed by Fuel Cell Energy, Inc. (FCE).
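
    A minimal sketch of the levelized-cost logic underlying such an analysis is given below; it is a bare-bones discounted cash flow that omits the tax, depreciation, replacement and feedstock-escalation detail of the actual H2A model, and all input values are hypothetical.

```python
def levelized_cost_of_h2(capex, annual_opex, annual_kg, lifetime_years, discount_rate):
    """Simplified levelized cost ($/kg H2): discounted lifetime costs divided by
    discounted lifetime production."""
    disc_costs = capex
    disc_kg = 0.0
    for t in range(1, lifetime_years + 1):
        disc = (1.0 + discount_rate) ** t
        disc_costs += annual_opex / disc
        disc_kg += annual_kg / disc
    return disc_costs / disc_kg

# hypothetical electrolysis plant
print(levelized_cost_of_h2(capex=60e6, annual_opex=25e6,
                           annual_kg=18e6, lifetime_years=20, discount_rate=0.10))
```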

  4. Vertically integrated analysis of human DNA. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Olson, M.

    1997-10-01

    This project has been oriented toward improving the vertical integration of the sequential steps associated with the large-scale analysis of human DNA. The central focus has been on an approach to the preparation of "sequence-ready" maps, which is referred to as multiple-complete-digest (MCD) mapping, primarily directed at cosmid clones. MCD mapping relies on simple experimental steps, supported by advanced image-analysis and map-assembly software, to produce extremely accurate restriction-site and clone-overlap maps. We believe that MCD mapping is one of the few high-resolution mapping systems that has the potential for high-level automation. Successful automation of this process would be a landmark event in genome analysis, and the approach could then be extended to other higher organisms, paving the way for cost-effective sequencing of these genomes. Critically, MCD mapping has the potential to provide built-in quality control for sequencing accuracy and to make possible a highly integrated end product even if there are large numbers of discontinuities in the actual sequence.

  5. Seismic hazard analysis of nuclear installations in France. Current practice and research

    Energy Technology Data Exchange (ETDEWEB)

    Mohammadioun, B. [CEA Centre d`Etudes de Fontenay-aux-Roses, 92 (France). Inst. de Protection et de Surete Nucleaire

    1997-03-01

    The methodology put into practice in France for the evaluation of seismic hazard on the sites of nuclear facilities is founded on data assembled country-wide over the past 15 years, in geology, geophysics and seismology. It is appropriate to the regional seismotectonic context (interplate), characterized notably by diffuse seismicity. Extensive use is made of information drawn from historical seismicity. The regulatory practice described in the RFS I.2.c is reexamined periodically and is subject to up-dating so as to take advantage of new earthquake data and of the results gained from research work. Acquisition of the basic data, such as the identification of active faults and the quantification of site effect, which will be needed to achieve improved preparedness versus severe earthquake hazard in the 21st century, will necessarily be the fruit of close international cooperation and collaboration, which should accordingly be actively promoted. (J.P.N.)

  6. Analysis of root causes of major hazard precursors (hydrocarbon leaks) in the Norwegian offshore petroleum industry

    Energy Technology Data Exchange (ETDEWEB)

    Vinnem, Jan Erik, E-mail: jev@preventor.n [Preventor AS/University of Stavanger, Rennebergstien 30, 4021 Stavanger (Norway); Hestad, Jon Andreas [Safetec Nordic AS, Bergen (Norway); Kvaloy, Jan Terje [Department of Mathematics and Natural Sciences, University of Stavanger (Norway); Skogdalen, Jon Espen [Department of Industrial Economics, Risk Management and Planning, University of Stavanger (Norway)

    2010-11-15

    The offshore petroleum industry in Norway reports major hazard precursors to the authorities, and data are available for the period 1996 through 2009. Barrier data have been reported since 2002, as have data from an extensive questionnaire survey covering working environment, organizational culture and perceived risk among all employees on offshore installations. Several attempts have been made to analyse different data sources in order to discover relations that may cast some light on possible root causes of major hazard precursors. These previous attempts were inconclusive. The study presented in this paper is the most extensive study performed so far. The data were analysed using linear regression. The conclusion is that there are significant correlations between number of leaks and safety climate indicators. The discussion points to possible root causes of major accidents.

  7. Non-parametric seismic hazard analysis in the presence of incomplete data

    Science.gov (United States)

    Yazdani, Azad; Mirzaei, Sajjad; Dadkhah, Koroush

    2017-01-01

    The distribution of earthquake magnitudes plays a crucial role in the estimation of seismic hazard parameters. Due to the complexity of earthquake magnitude distribution, non-parametric approaches are recommended over classical parametric methods. The main deficiency of the non-parametric approach is the lack of complete magnitude data in almost all cases. This study aims to introduce an imputation procedure for completing earthquake catalog data that will allow the catalog to be used for non-parametric density estimation. Using a Monte Carlo simulation, the efficiency of the introduced approach is investigated. This study indicates that when a magnitude catalog is incomplete, the imputation procedure can provide an appropriate tool for seismic hazard assessment. As an illustration, the imputation procedure was applied to estimate the earthquake magnitude distribution in Tehran, the capital city of Iran.
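
    One simple way to illustrate the idea of completing a catalog before non-parametric density estimation is sketched below; the truncated-exponential imputation model and all numbers are assumptions for illustration, not the procedure of the cited study.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
m_min, m_c = 3.0, 4.0                        # lower bound and completeness magnitude

# hypothetical catalog: magnitudes above m_c observed; events below m_c were
# recorded without usable magnitudes and must be imputed
observed = m_c + rng.exponential(1.0 / 2.3, size=400)
n_missing = 150

# fit an exponential (Gutenberg-Richter-like) rate from the complete part ...
beta = 1.0 / np.mean(observed - m_c)
# ... then draw the missing magnitudes from that model truncated to [m_min, m_c)
u = rng.uniform(size=n_missing)
imputed = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_c - m_min)))) / beta

# non-parametric density estimate over the completed catalog
density = gaussian_kde(np.concatenate([observed, imputed]))
grid = np.linspace(m_min, 7.0, 200)
pdf = density(grid)
```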

  8. On adjustment for auxiliary covariates in additive hazard models for the analysis of randomized experiments

    DEFF Research Database (Denmark)

    Vansteelandt, S.; Martinussen, Torben; Tchetgen, E. J Tchetgen

    2014-01-01

    We consider additive hazard models (Aalen, 1989) for the effect of a randomized treatment on a survival outcome, adjusting for auxiliary baseline covariates. We demonstrate that the Aalen least-squares estimator of the treatment effect parameter is asymptotically unbiased, even when the hazard's dependence on time or on the auxiliary covariates is misspecified, and even away from the null hypothesis of no treatment effect. We furthermore show that adjustment for auxiliary baseline covariates does not change the asymptotic variance of the estimator of the effect of a randomized treatment. We conclude that, in view of its robustness against model misspecification, Aalen least-squares estimation is attractive for evaluating treatment effects on a survival outcome in randomized experiments, and the primary reasons to consider baseline covariate adjustment in such settings could be interest in subgroup...
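
    The Aalen least-squares estimator discussed above has a simple closed-form increment at each event time; a minimal numpy sketch (assuming right-censored data and a design matrix that already contains the treatment indicator and any auxiliary covariates) is given below.

```python
import numpy as np

def aalen_additive(times, events, X):
    """Aalen least-squares estimates of the cumulative regression functions B(t).
    times: (n,) observed times; events: (n,) 1=event, 0=censored; X: (n, p) covariates
    (an intercept column is added internally)."""
    n, p = X.shape
    Z = np.column_stack([np.ones(n), X])           # design with intercept
    event_times = np.sort(np.unique(times[events == 1]))
    B = np.zeros((len(event_times), p + 1))        # cumulative coefficients over time
    cum = np.zeros(p + 1)
    for k, t in enumerate(event_times):
        at_risk = times >= t
        Y = Z[at_risk]
        dN = ((times == t) & (events == 1))[at_risk].astype(float)
        cum += np.linalg.pinv(Y.T @ Y) @ Y.T @ dN  # least-squares increment
        B[k] = cum
    return event_times, B

# toy usage: binary treatment plus one auxiliary baseline covariate
rng = np.random.default_rng(0)
X = np.column_stack([rng.integers(0, 2, 200), rng.normal(size=200)])
T = rng.exponential(1.0 / (0.5 + 0.3 * X[:, 0]))
C = rng.exponential(2.0, size=200)
times, events = np.minimum(T, C), (T <= C).astype(int)
grid, B_hat = aalen_additive(times, events, X)     # B_hat[:, 1]: cumulative treatment effect
```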

  9. Hazardous Waste Management in South African Mining ; A CGE Analysis of the Economic Impacts

    OpenAIRE

    Wiebelt, Manfred

    1999-01-01

    There is no doubt that improved hazardous waste management in mining and mineral processing will reduce environmental and health risks in South Africa. However, skeptics fear that waste reduction, appropriate treatment and disposal are not affordable within the current economic circumstances, neither from an economic nor from a social point of view. This paper mainly deals with the first aspect and touches upon the second. It investigates the short-run and long-run sectoral impacts of an e...

  10. DOWNFLOW code and LIDAR technology for lava flow analysis and hazard assessment at Mount Etna

    OpenAIRE

    Alessandro Fornaciai; Simone Tarquini; Massimiliano Favalli

    2011-01-01

    The use of a lava-flow simulation (DOWNFLOW) probabilistic code and airborne light detection and ranging (LIDAR) technology are combined to analyze the emplacement of compound lava flow fields at Mount Etna (Sicily, Italy). The goal was to assess the hazard posed by lava flows. The LIDAR-derived time series acquired during the 2006 Mount Etna eruption records the changing topography of an active lava-flow field. These short-time-interval, high-resolution topographic surveys provide a detailed...

  11. Safety, Health and Environmental Hazards Associated with Composites: A Complete Analysis

    Science.gov (United States)

    1992-11-01

  12. A framework for the assessment and analysis of multi-hazards induced risk resulting from space vehicles operations

    Science.gov (United States)

    Sala-Diakanda, Serge N.

    2007-12-01

    With the foreseeable increase in traffic frequency to and from orbit, the safe operation of current and future space vehicles at designated spaceports has become a serious concern. Due to their high explosive energy potential, operating those launch vehicles presents a real risk to: (1) the spaceport infrastructure and personnel, (2) the communities surrounding the spaceport and (3) flying aircraft whose routes could be relatively close to spaceport launch and reentry routes. Several computer models aimed at modeling the effects of the different hazards generated by the breakup of such vehicles (e.g., fragmentation of debris, release of toxic gases, propagation of blast waves, etc.) have been developed, and are used to assist in Go-No Go launch decisions. They can simulate a total failure scenario of the vehicle and estimate the number of casualties to be expected as a result of such a failure. However, because all of these models, which can be very elaborate and complex, consider only one specific explosion hazard in their simulations, the decision of whether or not a launch should occur is currently based on the evaluation of several estimates of an expected number of casualties. As such, current practices ignore the complex, nonlinear interactions between the different hazards as well as the interdependencies between the estimates. In this study, we developed a new framework which makes use of information fusion theory, hazard dispersion modeling, and the geographical statistical analysis and visualization capabilities of geographic information systems to assess the risk generated by the operation of space launch vehicles. A new risk metric, which effectively addresses the lack of a common risk metric with current methods, is also proposed. A case study based on a proposed spaceport in the state of Oklahoma showed that the estimates we generate through our framework consistently outperform estimates provided by any individual hazard, or by the independent

  13. Medieval monastic mortality: hazard analysis of mortality differences between monastic and nonmonastic cemeteries in England.

    Science.gov (United States)

    DeWitte, Sharon N; Boulware, Jessica C; Redfern, Rebecca C

    2013-11-01

    Scholarship on life in medieval European monasteries has revealed a variety of factors that potentially affected mortality in these communities. Though there is some evidence based on age-at-death distributions from England that monastic males lived longer than members of the general public, what is missing from the literature is an explicit examination of how the risks of mortality within medieval monastic settings differed from those within contemporaneous lay populations. This study examines differences in the hazard of mortality for adult males between monastic cemeteries (n = 528) and non-monastic cemeteries (n = 368) from London, all of which date to between AD 1050 and 1540. Age-at-death data from all cemeteries are pooled to estimate the Gompertz hazard of mortality, and "monastic" (i.e., buried in a monastic cemetery) is modeled as a covariate affecting this baseline hazard. The estimated effect of the monastic covariate is negative, suggesting that individuals in the monastic communities faced reduced risks of dying compared to their peers in the lay communities. These results suggest better diets, the positive health benefits of religious behavior, better living conditions in general in monasteries, or selective recruitment of healthy or higher socioeconomic status individuals.
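
    A minimal sketch of fitting a Gompertz hazard with a binary "monastic" covariate by maximum likelihood is given below; the ages and starting values are hypothetical, and the parameterization is one common choice rather than the exact model of the study.

```python
import numpy as np
from scipy.optimize import minimize

def gompertz_neg_loglik(params, age, dead, monastic):
    """Negative log-likelihood for a Gompertz hazard with a binary covariate:
    h(t|x) = a*exp(b*t)*exp(g*x), H(t|x) = (a/b)*(exp(b*t)-1)*exp(g*x)."""
    log_a, b, g = params
    a = np.exp(log_a)                        # keep the baseline level positive
    h = a * np.exp(b * age + g * monastic)
    H = (a / b) * (np.exp(b * age) - 1.0) * np.exp(g * monastic)
    return -(np.sum(dead * np.log(h)) - np.sum(H))

# hypothetical adult ages-at-death (years past age 15) and burial-context indicator
age = np.array([20.0, 35.0, 41.0, 55.0, 63.0, 28.0, 47.0, 60.0])
dead = np.ones_like(age)                     # skeletal samples: all events observed
monastic = np.array([1, 1, 0, 1, 0, 0, 1, 0])
fit = minimize(gompertz_neg_loglik, x0=[-4.0, 0.05, 0.0],
               args=(age, dead, monastic), method="Nelder-Mead")
log_a_hat, b_hat, g_hat = fit.x              # g_hat < 0 suggests lower risk for monastic burials
```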

  14. Using Monte Carlo techniques and parallel processing for debris hazard analysis of rocket systems

    Energy Technology Data Exchange (ETDEWEB)

    LaFarge, R.A.

    1994-02-01

    Sandia National Laboratories has been involved with rocket systems for many years. Some of these systems have carried high explosive onboard, while others have had flight termination systems (FTS) for destruction purposes whenever a potential hazard is detected. Recently, Sandia has also been involved with flight tests in which a target vehicle is intentionally destroyed by a projectile. Such endeavors always raise questions about the safety of personnel and the environment in the event of a premature detonation of the explosive or an activation of the FTS, as well as intentional vehicle destruction. Previous attempts to investigate fragmentation hazards for similar configurations have analyzed fragment size and shape in detail but have computed only a limited number of trajectories to determine the probabilities of impact and casualty expectations. A computer program, SAFETIE, has been written in support of various SNL flight experiments to compute better approximations of the hazards. SAFETIE uses the AMEER trajectory computer code and the Engineering Sciences Center LAN of Sun workstations to determine more realistically the probability of impact for an arbitrary number of exclusion areas. The various debris generation models are described.
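
    A toy Monte Carlo of the underlying idea (sample fragment ejection velocities, propagate drag-free ballistic trajectories, and count impacts inside an exclusion area) is sketched below; it is an independent illustration, not SAFETIE or AMEER, and every number is a made-up assumption.

```python
import numpy as np

rng = np.random.default_rng(42)
g = 9.81
n_fragments = 100_000

# hypothetical breakup state: altitude, vehicle velocity, and randomised ejection velocities
altitude = 8000.0                                  # m above a flat ground plane
vx0, vy0 = 600.0, 0.0                              # vehicle velocity at breakup, m/s
dvx = rng.normal(0.0, 150.0, n_fragments)          # downrange ejection spread
dvy = rng.normal(0.0, 150.0, n_fragments)          # crossrange ejection spread
dvz = rng.normal(0.0, 100.0, n_fragments)          # upward ejection spread

# time of flight from height h with initial upward velocity dvz (no drag)
t = (dvz + np.sqrt(dvz ** 2 + 2.0 * g * altitude)) / g
x = (vx0 + dvx) * t                                # downrange impact point, m
y = (vy0 + dvy) * t                                # crossrange impact point, m

# probability of impact inside a circular exclusion area (hypothetical geometry)
cx, cy, radius = 24_000.0, 0.0, 2_000.0
p_impact = np.mean((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2)
print(p_impact)
```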

  15. Hazardous waste characterization among various thermal processes in South Korea: a comparative analysis.

    Science.gov (United States)

    Shin, Sun Kyoung; Kim, Woo-Il; Jeon, Tae-Wan; Kang, Young-Yeul; Jeong, Seong-Kyeong; Yeon, Jin-Mo; Somasundaram, Swarnalatha

    2013-09-15

    The Ministry of Environment, Republic of Korea (South Korea), is in the process of converting its current hazardous waste classification system to harmonize it with the international standard and to set up regulatory standards for toxic substances present in hazardous waste. In the present work, the concentrations and trends of 13 heavy metals, F(-), CN(-) and 19 PAHs present in the hazardous waste generated by various thermal processes (11 processes) in South Korea were analyzed, along with their leaching characteristics. In all thermal processes, the median concentrations of Cu (3.58-209,000 mg/kg), Ni (BDL-1560 mg/kg), Pb (7.22-5132.25 mg/kg) and Zn (83.02-31419 mg/kg) were comparatively higher than those of the other heavy metals. The iron & steel thermal process showed the highest median values of the heavy metals Cd (14.76 mg/kg), Cr (166.15 mg/kg) and Hg (2.38 mg/kg). Low-molecular-weight PAHs (BDL-37.59 mg/kg) were predominant in the sludge & filter cake samples from most of the thermal processes. By comparison, the flue gas dust present in most of the thermal processing units showed higher leaching of the heavy metals.

  16. Analysis of factors related to man-induced hazard for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Soon; Jung, Jea Hee; Lee, Keun O; Son, Ki Sang; Wang, Sang Chul; Lee, Chang Jin; Ku, Min Ho; Park, Nam Young [Seoul National Univ. of Technology, Seoul (Korea, Republic of)

    2003-03-15

    This study provides guidance for siting hazardous facilities adjacent to nuclear power plants, based on an assessment of how strongly such facilities could affect the plant. A nuclear power plant is an important facility, closely connected with public life, industrial activity, and the conduct of public business, and it must not be damaged. Therefore, if hazardous and harmful facilities exist near the plant, they must be evaluated by size, type, and shape. First of all, any factor that could cause a man-induced accident must be investigated, and the extent to which it could damage the plant facilities must be evaluated accurately. The purpose of this study is to set a technical standard for the installation of these facilities by evaluating man-induced accidents, and to develop evaluation methods by investigating the hazardous facilities located near the plant. Korea currently uses the CFR standards, regulatory guides, and the IAEA safety series. However, neither the technical standards related to man-induced accidents nor the evaluation methods for such facilities have yet been laid down. As mentioned above, these facilities should be evaluated adequately, and such evaluation methods must be developed.

  17. Flood Hazard Area

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  18. Flood Hazard Boundaries

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  19. Well test analysis for Devonian-shale wells. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Serra, K.; Chen, C.C.; Yeh, N.S.; Ohaeri, C.; Reynolds, A.C.; Raghavan, R.

    1981-09-30

    This work presents broad interpretive rules for analyzing Devonian Shale Wells based on simulated drawdown and buildup tests. The report consists of four parts: (1) New Pressure Transient Analysis Methods for Naturally Fractured Reservoirs, (2) Pressure Transient Analysis Methods for Bounded Naturally Fractured Reservoirs, (3) Pressure Response at Observation Wells in Fractured Reservoirs, and (4) Unsteady Flow to a Well Produced at a Constant Pressure in a Fractured Reservoir. Each of these sections is an independent unit; that is, knowledge of the other sections, even though desirable, is not necessary to understand the material in a given section. The principal contribution of this work is the identification of a new flow regime during the early transient period. The discovery of this flow regime represents a major advance in our ability to analyze pressure transient tests. The identification of the new flow regime also explains the response of wells in fractured reservoirs that until now have been considered anomalous. Systematic procedures to analyze single well (drawdown and buildup) tests and multiwell (interference) tests are discussed.

  20. A decision analysis framework for estimating the potential hazards for drinking water resources of chemicals used in hydraulic fracturing fluids.

    Science.gov (United States)

    Yost, Erin E; Stanek, John; Burgoon, Lyle D

    2017-01-01

    Despite growing concerns over the potential for hydraulic fracturing to impact drinking water resources, there are limited data available to identify chemicals used in hydraulic fracturing fluids that may pose public health concerns. In an effort to explore these potential hazards, a multi-criteria decision analysis (MCDA) framework was employed to analyze and rank selected subsets of these chemicals by integrating data on toxicity, frequency of use, and physicochemical properties that describe transport in water. Data used in this analysis were obtained from publicly available databases compiled by the United States Environmental Protection Agency (EPA) as part of a larger study on the potential impacts of hydraulic fracturing on drinking water. Starting with nationwide hydraulic fracturing chemical usage data from EPA's analysis of the FracFocus Chemical Disclosure Registry 1.0, MCDAs were performed on chemicals that had either noncancer toxicity values (n=37) or cancer-specific toxicity values (n=10). The noncancer MCDA was then repeated for subsets of chemicals reported in three representative states (Texas, n=31; Pennsylvania, n=18; and North Dakota, n=20). Within each MCDA, chemicals received scores based on relative toxicity, relative frequency of use, and physicochemical properties (mobility in water, volatility, persistence). Results show a relative ranking of these chemicals based on hazard potential, and provide preliminary insight into chemicals that may be more likely than others to impact drinking water resources. Comparison of nationwide versus state-specific analyses indicates regional differences in the chemicals that may be of more concern to drinking water resources, although many chemicals were commonly used and received similar overall hazard rankings. Several chemicals highlighted by these MCDAs have been reported in groundwater near areas of hydraulic fracturing activity. This approach is intended as a preliminary analysis, and represents one
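
    A stripped-down version of the weighted-sum step of such an MCDA is sketched below; the chemicals, criterion values and weights are placeholders, not data from the EPA analysis.

```python
import numpy as np

# hypothetical criteria for four chemicals: toxicity, frequency of use, mobility in
# water, and persistence, each oriented so that larger values mean greater hazard potential
chemicals = ["chem_A", "chem_B", "chem_C", "chem_D"]
criteria = np.array([[0.9, 0.05, 0.7, 0.4],
                     [0.4, 0.60, 0.9, 0.2],
                     [0.7, 0.30, 0.2, 0.9],
                     [0.2, 0.80, 0.5, 0.5]])
weights = np.array([0.4, 0.3, 0.2, 0.1])            # must sum to 1; an assumption here

# min-max normalise each criterion, then combine with a weighted sum and rank
span = criteria.max(axis=0) - criteria.min(axis=0)
normalised = (criteria - criteria.min(axis=0)) / span
scores = normalised @ weights
ranking = [chemicals[i] for i in np.argsort(-scores)]
print(ranking)
```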

  1. Analysis of Intellectual Capital Effect toward Financial Performance and Growth

    Directory of Open Access Journals (Sweden)

    Sasya Sabrina

    2015-11-01

    The purpose of this research is to investigate the influence of a firm's intellectual capital on its financial performance and growth. The Value Added Intellectual Coefficient (VAIC™) is used to measure intellectual capital. The indicators for VAIC™ are Value Added Capital Employed (VACA), Value Added Human Capital (VAHU), and Structural Capital Value Added (STVA). The indicators for financial performance are Current Ratio (CR), Total Assets Turnover (TATO), Return on Investment (ROI), and Return on Equity (ROE). The indicators for growth are Earnings Growth (EG) and Assets Growth (AG). This research uses data drawn from 92 publicly listed manufacturing companies in the Indonesian Stock Exchange in 2010, 2011, and 2012. Partial Least Squares (PLS) is used as the method of data analysis, implemented in SmartPLS 3.2.0. The results show that intellectual capital does not influence financial performance, while intellectual capital positively influences growth.
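
    For reference, Pulic's VAIC and the three components named above can be computed directly from a firm's value added, capital employed and human-capital (salary) expense; the sketch below uses hypothetical figures.

```python
def vaic(value_added, capital_employed, human_capital):
    """Pulic's Value Added Intellectual Coefficient and its components (VACA, VAHU, STVA)."""
    vaca = value_added / capital_employed          # value added capital employed
    vahu = value_added / human_capital             # value added human capital
    structural_capital = value_added - human_capital
    stva = structural_capital / value_added        # structural capital value added
    return vaca + vahu + stva, (vaca, vahu, stva)

# hypothetical firm-year (monetary units)
print(vaic(value_added=1_200.0, capital_employed=5_000.0, human_capital=700.0))
```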

  2. Probabilistic finite elements for fatigue and fracture analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Belytschko, T.; Liu, W.K.

    1993-04-01

    An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.

  3. Caucasus Seismic Information Network: Data and Analysis Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Randolph Martin; Mary Krasovec; Spring Romer; Timothy O' Connor; Emanuel G. Bombolakis; Youshun Sun; Nafi Toksoz

    2007-02-22

    The geology and tectonics of the Caucasus region (Armenia, Azerbaijan, and Georgia) are highly variable. Consequently, generating a structural model and characterizing seismic wave propagation in the region require data from local seismic networks. As of eight years ago, there was only one broadband digital station operating in the region – an IRIS station at Garni, Armenia – and few analog stations. The Caucasus Seismic Information Network (CauSIN) project is part of a multi-national effort to build a knowledge base of seismicity and tectonics in the region. During this project, three major tasks were completed: 1) collection of seismic data, both event catalogs and phase arrival-time picks; 2) development of a 3-D P-wave velocity model of the region obtained through crustal tomography; 3) advances in geological and tectonic models of the region. The first two tasks are interrelated. A large suite of historical and recent seismic data was collected for the Caucasus. These data were mainly analog prior to 2000; more recently, in Georgia and Azerbaijan, the data are digital. Based on the most reliable data from regional networks, a crustal model was developed using 3-D tomographic inversion. The results of the inversion are presented, and the supporting seismic data are reported. The third task was carried out on several fronts. Geologically, the goal of obtaining an integrated geological map of the Caucasus at a scale of 1:500,000 was initiated. The map for Georgia has been completed. This map serves as a guide for the final incorporation of the data from Armenia and Azerbaijan. Descriptions of the geological units across borders have been worked out and formation boundaries across borders have been agreed upon. Currently, Armenia and Azerbaijan are working with scientists in Georgia to complete this task. The successful integration of the geologic data also required addressing and mapping active faults throughout the greater Caucasus. Each of the major

  4. The median hazard ratio: a useful measure of variance and general contextual effects in multilevel survival analysis.

    Science.gov (United States)

    Austin, Peter C; Wagner, Philippe; Merlo, Juan

    2017-03-15

    Multilevel data occur frequently in many research areas, such as health services research and epidemiology. A suitable way to analyze such data is through the use of multilevel regression models (MLRM). MLRM incorporate cluster-specific random effects which allow one to partition the total individual variance into between-cluster variation and between-individual variation. Statistically, MLRM account for the dependency of the data within clusters and provide correct estimates of uncertainty around regression coefficients. Substantively, the magnitude of the effect of clustering provides a measure of the General Contextual Effect (GCE). When outcomes are binary, the GCE can also be quantified by measures of heterogeneity like the Median Odds Ratio (MOR) calculated from a multilevel logistic regression model. Time-to-event outcomes within a multilevel structure occur commonly in epidemiological and medical research. However, the Median Hazard Ratio (MHR) that corresponds to the MOR in multilevel (i.e., 'frailty') Cox proportional hazards regression is rarely used. Analogously to the MOR, the MHR is the median relative change in the hazard of the occurrence of the outcome when comparing identical subjects from two randomly selected different clusters that are ordered by risk. We illustrate the application and interpretation of the MHR in a case study analyzing the hazard of mortality in patients hospitalized for acute myocardial infarction at hospitals in Ontario, Canada. We provide R code for computing the MHR. The MHR is a useful and intuitive measure for expressing cluster heterogeneity in the outcome and, thereby, estimating general contextual effects in multilevel survival analysis. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
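
    The authors provide R code; purely as an illustration, the closed form commonly used for a log-normal shared frailty, MHR = exp(sqrt(2*sigma^2) * Phi^{-1}(0.75)), can be computed as follows (the variance value is hypothetical).

```python
import numpy as np
from scipy.stats import norm

def median_hazard_ratio(frailty_variance):
    """Median hazard ratio for a multilevel (shared log-normal frailty) Cox model,
    computed in direct analogy with the median odds ratio."""
    return np.exp(np.sqrt(2.0 * frailty_variance) * norm.ppf(0.75))

# hypothetical between-hospital variance on the log-hazard scale
print(median_hazard_ratio(0.15))   # ~1.45: median relative hazard between two random clusters
```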

  5. Final Report. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Coles, T. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Spantini, A. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Tosatto, L. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2013-09-30

    The project was a collaborative effort among MIT, Sandia National Laboratories (local PI Dr. Habib Najm), the University of Southern California (local PI Prof. Roger Ghanem), and The Johns Hopkins University (local PI Prof. Omar Knio, now at Duke University). Our focus was the analysis and reduction of large-scale dynamical systems emerging from networks of interacting components. Such networks underlie myriad natural and engineered systems. Examples important to DOE include chemical models of energy conversion processes, and elements of national infrastructure—e.g., electric power grids. Time scales in chemical systems span orders of magnitude, while infrastructure networks feature both local and long-distance connectivity, with associated clusters of time scales. These systems also blend continuous and discrete behavior; examples include saturation phenomena in surface chemistry and catalysis, and switching in electrical networks. Reducing size and stiffness is essential to tractable and predictive simulation of these systems. Computational singular perturbation (CSP) has been effectively used to identify and decouple dynamics at disparate time scales in chemical systems, allowing reduction of model complexity and stiffness. In realistic settings, however, model reduction must contend with uncertainties, which are often greatest in large-scale systems most in need of reduction. Uncertainty is not limited to parameters; one must also address structural uncertainties—e.g., whether a link is present in a network—and the impact of random perturbations, e.g., fluctuating loads or sources. Research under this project developed new methods for the analysis and reduction of complex multiscale networks under uncertainty, by combining computational singular perturbation (CSP) with probabilistic uncertainty quantification. CSP yields asymptotic approximations of reduceddimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing

  6. Quantitative analysis of the 1981 and 2001 Etna flank eruptions: a contribution for future hazard evaluation and mitigation

    Directory of Open Access Journals (Sweden)

    Cristina Proietti

    2011-12-01

    Lava flows produced during Etna flank eruptions represent severe hazards for the nearby inhabited areas, which can be protected by adopting prompt mitigation actions, such as the building of diversion barriers. Lava diversion measures were attempted recently during the 1983, 1991-93, 2001 and 2002 Etna eruptions, although with different degrees of success. In addition to the complexity of barrier construction (due to the adverse physical conditions), the time available to successfully slow the advance of a lava flow depends on the lava effusion rate, which is not easily measurable. One method to estimate the average lava effusion rate over a specified period of time is based on a volumetric approach, i.e. the measurement of the volume changes of the lava flow over that period. Here, this has been compared to an approach based on thermal image processing, as applied to estimate the average effusion rates of lava flows during the 1981 and 2001 Etna eruptions. The final volumes were measured by the comparison of pre-eruption and post-eruption photogrammetric digital elevation models and orthophotographs. Lava volume growth during these eruptions was estimated by locating the flow-front positions from analyses of scientific papers and newspaper reports, as well as from helicopter photographs. The analyses of these two eruptions contribute to the understanding of the different eruptive mechanisms, highlighting the role of the peak effusion rate, which represents a critical parameter for the planning of mitigation actions and for hazard evaluation.
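
    A minimal sketch of the volumetric approach (difference two co-registered DEMs, sum the positive thickness change, and divide by the elapsed time) is given below; the synthetic DEM pair is only a placeholder for real photogrammetric data.

```python
import numpy as np

def mean_effusion_rate(dem_pre, dem_post, cell_size, elapsed_seconds):
    """Volumetric estimate of the time-averaged effusion rate (m^3/s) from the
    difference of pre- and post-eruption DEMs on the same grid."""
    thickness = dem_post - dem_pre                   # m, positive where lava accumulated
    volume = np.nansum(np.clip(thickness, 0.0, None)) * cell_size ** 2
    return volume / elapsed_seconds

# hypothetical 2 m-resolution DEM pair spanning a 30-day effusive phase
rng = np.random.default_rng(3)
pre = rng.normal(1500.0, 5.0, size=(500, 500))
post = pre + np.where(rng.random((500, 500)) < 0.1, 8.0, 0.0)
print(mean_effusion_rate(pre, post, cell_size=2.0, elapsed_seconds=30 * 86400))
```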

  7. Solar thermal repowering utility value analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, R.; Day, J.; Reed, B.; Malone, M.

    1979-12-01

    The retrofit of solar central receiver energy supply systems to existing steam-electric generating stations (repowering) is being considered as a major programmatic thrust by DOE. The determination of a government response appropriate to the opportunities of repowering is an important policy question, and is the major reason for the analysis. The study objective is to define a government role in repowering that constitutes an efficient program investment in pursuit of viable private markets for heliostat-based energy systems. In support of that objective, the study is designed to identify the scope and nature of the repowering opportunity within the larger context of its contributions to central receiver technology development and commercialization. The Supply and Integration Tasks are documented elsewhere. This report documents the Demand Task, determining and quantifying the sources of the value of repowering and of central receiver technology in general to electric utilities. The modeling tools and assumptions used in the Demand Task are described and the results are presented and interpreted. (MCW)

  8. Production cost analysis of Euphorbia lathyris. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Mendel, D.A.

    1979-08-01

    The purpose of this study is to estimate costs of production for Euphorbia lathyris (hereafter referred to as Euphorbia) in commercial-scale quantities. Selection of five US locations for analysis was based on assumed climatic and cultivation requirements. The five areas are: nonirrigated areas (Southeast Kansas and Central Oklahoma, Northeast Louisiana and Central Mississippi, Southern Illinois), and irrigated areas: (San Joaquin Valley and the Imperial Valley, California and Yuma, Arizona). Cost estimates are tailored to reflect each region's requirements and capabilities. Variable costs for inputs such as cultivation, planting, fertilization, pesticide application, and harvesting include material costs, equipment ownership, operating costs, and labor. Fixed costs include land, management, and transportation of the plant material to a conversion facility. Euphorbia crop production costs, on the average, range between $215 per acre in nonirrigated areas to $500 per acre in irrigated areas. Extraction costs for conversion of Euphorbia plant material to oil are estimated at $33.76 per barrel of oil, assuming a plant capacity of 3000 dry ST/D. Estimated Euphorbia crop production costs are competitive with those of corn. Alfalfa production costs per acre are less than those of Euphorbia in the Kansas/Oklahoma and Southern Illinois site, but greater in the irrigated regions. This disparity is accounted for largely by differences in productivity and irrigation requirements.

  9. Case study applications of venture analysis: fluidized bed. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Mosle, R.

    1978-05-01

    In order to appraise the case for government intervention in the case of atmospheric fluid-bed combustion, Energy Resources Company and Rotan Mosle have developed a methodology containing four key elements. The first is an economic and environmental characterization of the new technology; the second, a survey of its prospective users and vendors; the third, a cost-benefit analysis of its prospective social benefits; and the fourth, an analytical model of its market penetration and the effects thereon of a basket of government incentives. Three major technical obstacles exist to continued AFBC development: feeding coal and limestone reliably to the boiler, tube erosion and corrosion, and developing boiler turndown capability. The review of the economic, environmental and technical attributes of the new technology has suggested that the preliminary venture can be selected with confidence as a commercial prospect capable of detailed evaluation from both private and public perspectives. The venture choice can therefore be considered firm: it will be the equipment required for the combustion of coal in atmospheric fluid beds as applied to industrial process steam in boilers of at least 83 Kpph capacity. The most effective demonstration of the potential of AFBC in the eyes of prospective industrial users is that provided by a project conducted by the private sector with minimal government direction. Unlike the ''experimental'' style of existing mixed public-private demonstration projects, the pressure to achieve reliability in more commercial applications would serve rapidly to reveal more clearly the potential of AFBC. The marketplace can be allowed to decide its fate thereafter. Once AFBC has been successfully demonstrated, the relative merits of AFBC and coal-FGD are best left to prospective users to evaluate.

  10. Photosynthesis energy factory: analysis, synthesis, and demonstration. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1978-11-01

    This quantitative assessment of the potential of a combined dry-land Energy Plantation, wood-fired power plant, and algae wastewater treatment system demonstrates the cost-effectiveness of recycling certain by-products and effluents from one subsystem to another. Designed to produce algae up to the limit of the amount of carbon in municipal wastewater, the algae pond provides a positive cash credit, resulting mainly from the wastewater treatment credit, which may be used to reduce the cost of the Photosynthesis Energy Factory (PEF)-generated electricity. The algae pond also produces fertilizer, which reduces the cost of the biomass produced on the Energy Plantation, and some gas. The cost of electricity was as low as 35 mills per kilowatt-hour for a typical municipally-owned PEF consisting of a 65-MWe power plant, a 144-acre algae pond, and a 33,000-acre Energy Plantation. Using only conventional or near-term technology, the most cost-effective algae pond for a PEF is the carbon-limited secondary treatment system. This system does not recycle CO2 from the flue gas. Analysis of the Energy Plantation subsystem at 15 sites revealed that plantations of 24,000 to 36,000 acres produce biomass at the lowest cost per ton. The following sites are recommended for more detailed evaluation as potential demonstration sites: Pensacola, Florida; Jamestown, New York; Knoxville, Tennessee; Martinsville, Virginia; and Greenwood, South Carolina. A major possible extension of the PEF concept is to include irrigation.

  11. An Investigation of Homogeneous and Heterogeneous Sonochemistry for Destruction of Hazardous Waste - Final Report - 09/15/1996 - 09/14/2000

    Energy Technology Data Exchange (ETDEWEB)

    Hua, Inez

    2000-09-14

    During the last 20 years, various legislative acts have mandated the reduction and elimination of water and land pollution. In order to fulfill these mandates, effective control and remediation methods must be developed and implemented. The drawbacks of current hazardous waste control methods motivate the development of new technology, and the need for new technology is further driven by the large number of polluted sites across the country. This research explores the application and optimization of ultrasonic waves as a novel method by which aqueous contaminants are degraded. The primary objective of the investigation is to acquire a deeper fundamental knowledge of acoustic cavitation and cavitation chemistry, and in doing so, to ascertain how ultrasonic irradiation can be more effectively applied to environmental problems. Special consideration is given to the types of problems and hazardous chemical substrates found specifically at Department of Energy (DOE) sites. The experimental work is divided into five broad tasks, to be completed over a period of three years. The first task is to explore the significance of physical variables during sonolysis, such as ultrasonic frequency. The second aim is an understanding of sonochemical degradation kinetics and by-products, complemented by information from the detection of reactive intermediates with electron paramagnetic resonance. The sonolytic decomposition studies will focus on polychlorinated biphenyls (PCBs). Investigation of activated carbon regeneration during ultrasonic irradiation extends sonochemical applications in homogeneous systems to heterogeneous systems of environmental interest. Lastly, the physics and hydrodynamics of cavitation bubbles and bubble clouds will be correlated with sonochemical effects by performing high-speed photographic studies of acoustically cavitating aqueous solutions. The most important benefit will be fundamental information which will allow a more optimal application of

  12. DOWNFLOW code and LIDAR technology for lava flow analysis and hazard assessment at Mount Etna

    Directory of Open Access Journals (Sweden)

    Alessandro Fornaciai

    2011-12-01

    Full Text Available The use of a lava-flow simulation (DOWNFLOW) probabilistic code and airborne light detection and ranging (LIDAR) technology are combined to analyze the emplacement of compound lava flow fields at Mount Etna (Sicily, Italy). The goal was to assess the hazard posed by lava flows. The LIDAR-derived time series acquired during the 2006 Mount Etna eruption records the changing topography of an active lava-flow field. These short-time-interval, high-resolution topographic surveys provide a detailed quantitative picture of the topographic changes. The results highlight how the flow field evolves as a number of narrow (5-15 m wide) disjointed flow units that are fed simultaneously by uneven lava pulses that advance within formed channels. These flow units have widely ranging advance velocities (3-90 m/h). Overflows, bifurcations and braiding are also clearly displayed. In such a complex scenario, the suitability of deterministic codes for lava-flow simulation can be hampered by the fundamental difficulty of measuring the flow parameters (e.g., the lava discharge rate or the lava viscosity) of a single flow unit. However, the DOWNFLOW probabilistic code approaches this point statistically and needs no direct knowledge of flow parameters. DOWNFLOW intrinsically accounts for complexities and perturbations of lava flows by randomly varying the pre-eruption topography. This DOWNFLOW code is systematically applied here over Mount Etna, to derive a lava-flow hazard map based on: (i) the topography of the volcano; (ii) the probability density function for vent opening; and (iii) a law for the expected lava-flow length for all of the computational vents considered. Changes in the hazard due to the recent morphological evolution of Mount Etna have also been addressed.
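
    As described above, DOWNFLOW accounts for flow complexity by randomly perturbing the pre-eruption topography and recomputing steepest-descent paths many times. The sketch below is a minimal toy version of that idea on a synthetic DEM; the grid, vent location, perturbation amplitude and run count are illustrative assumptions, and the real code's calibration of the perturbation and flow length is not reproduced.

```python
import numpy as np

# Minimal DOWNFLOW-style sketch: repeatedly perturb a DEM with random noise
# and trace steepest-descent paths from a vent, accumulating how often each
# cell is reached. The synthetic grid, vent location, perturbation amplitude
# DH and number of runs are illustrative assumptions, not parameters of the
# actual DOWNFLOW code.
rng = np.random.default_rng(0)

ny, nx = 80, 80
y, x = np.mgrid[0:ny, 0:nx]
dem = 0.4 * (ny - 1 - y) + 3.0 * np.sin(x / 7.0)    # synthetic topography sloping "south"
vent = (5, 40)
DH, N_RUNS, MAX_STEPS = 1.0, 500, 400               # perturbation (m), runs, max path length

hits = np.zeros_like(dem)
neigh = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

for _ in range(N_RUNS):
    z = dem + rng.uniform(-DH, DH, size=dem.shape)  # random topographic perturbation
    i, j = vent
    for _ in range(MAX_STEPS):
        hits[i, j] += 1
        # move to the lowest of the 8 neighbours (steepest descent)
        cand = [(z[i + di, j + dj], i + di, j + dj)
                for di, dj in neigh
                if 0 <= i + di < ny and 0 <= j + dj < nx]
        zmin, ni, nj = min(cand)
        if zmin >= z[i, j]:                          # local minimum: the flow stops
            break
        i, j = ni, nj

inundation_prob = hits / N_RUNS                      # relative frequency a cell is reached
print("cells reached in more than 10% of runs:", int((inundation_prob > 0.1).sum()))
```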

  13. Numerical and probabilistic analysis of asteroid and comet impact hazard mitigation

    Energy Technology Data Exchange (ETDEWEB)

    Plesko, Catherine S [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2010-09-09

    The possibility of asteroid and comet impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear munitions are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about the use of nuclear munitions for hazard mitigation. There are large uncertainties in a PHO's physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects at all sizes may be 'rubble piles' with large porosities and little strength. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community's understanding of these objects. Computational power and simulation capabilities have improved such that it is possible to numerically model the hazard mitigation problem from first principles. Before we know that explosive yield Y at height h or depth -h from the target surface will produce a momentum change in or dispersion of a PHO, we must quantify energy deposition into the system of particles that make up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.

  14. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-10-01

    This document is the third volume of a 3 volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of appendices C through U of the report.

  15. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-10-01

    This document is the first volume of a 3 volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of an introduction, summary/conclusion, site description and assessment, description of facility, and description of operation.

  17. Application of Hazard Analysis Critical Control Point in the local manufacture of ready-to-use therapeutic foods (RUTFs).

    Science.gov (United States)

    Henry, C Jeya K; Xin, Janice Lim Wen

    2014-06-01

    The local manufacture of ready-to-use therapeutic foods (RUTFs) is increasing, and there is a need to develop methods to ensure their safe production. We propose the application of Hazard Analysis Critical Control Point (HACCP) principles to achieve this goal. The basic principles of HACCP in the production of RUTFs are outlined. It is concluded that the implementation of an HACCP system in the manufacture of RUTFs is not only feasible but also attainable. The introduction of good manufacturing practices, coupled with an effective HACCP system, will ensure that RUTFs are produced in a cost-effective, safe, and hygienic manner.

  18. Updated laser safety & hazard analysis for the ARES laser system based on the 2007 ANSI Z136.1 standard.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2007-08-01

    A laser safety and hazard analysis was performed for the temperature-stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system, based on the 2007 version of the American National Standards Institute (ANSI) Standard Z136.1, Safe Use of Lasers, and the 2005 version of ANSI Standard Z136.6, Safe Use of Lasers Outdoors. The ARES laser system is a van/truck-based mobile platform used to perform laser interaction experiments and tests at various national test sites.
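
    The abstract does not give the BSLT beam parameters or the specific Z136.1 equations applied; the sketch below simply evaluates the standard small-source Nominal Ocular Hazard Distance expression, NOHD = (sqrt(4*Phi/(pi*MPE)) - a)/phi, with hypothetical power, beam diameter, divergence and MPE values.

```python
import math

def nohd_m(power_w, mpe_w_cm2, beam_diam_cm, divergence_rad):
    """Nominal Ocular Hazard Distance (small-source, direct-beam form):
    NOHD = (sqrt(4*Phi / (pi*MPE)) - a) / phi, returned in metres."""
    nohd_cm = (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2)) - beam_diam_cm) / divergence_rad
    return max(nohd_cm, 0.0) / 100.0

# Hypothetical average-power beam parameters (NOT the ARES/BSLT values):
print(f"NOHD ~ {nohd_m(power_w=0.5, mpe_w_cm2=1.0e-3, beam_diam_cm=0.7, divergence_rad=1.0e-3):.0f} m")
```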

  19. Analysis of potential hazards associated with 241Am loaded resins from nitrate media

    Energy Technology Data Exchange (ETDEWEB)

    Schulte, Louis D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rubin, Jim [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fife, Keith William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ricketts, Thomas Edgar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tappan, Bryce C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chavez, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-19

    LANL has been contacted to provide possible assistance in safe disposition of a number of 241Am-bearing materials associated with local industrial operations. Among the materials are ion exchange resins which have been in contact with 241Am and nitric acid, and which might have potential for exothermic reaction. The purpose of this paper is to analyze and define the resin forms and quantities to the extent possible from available data to allow better bounding of the potential reactivity hazard of the resin materials. An additional purpose is to recommend handling procedures to minimize the probability of an uncontrolled exothermic reaction.

  20. Combustion diagnosis for analysis of solid propellant rocket abort hazards: Role of spectroscopy

    Science.gov (United States)

    Gill, W.; Cruz-Cabrera, A. A.; Donaldson, A. B.; Lim, J.; Sivathanu, Y.; Bystrom, E.; Haug, A.; Sharp, L.; Surmick, D. M.

    2014-11-01

    Solid rocket propellant plume temperatures have been measured using spectroscopic methods as part of an ongoing effort to specify the thermal-chemical-physical environment in and around a burning fragment of an exploded solid rocket at atmospheric pressures. Such specification is needed for launch safety studies where hazardous payloads become involved with large fragments of burning propellant. The propellant burns in an off-design condition producing a hot gas flame loaded with burning metal droplets. Each component of the flame (soot, droplets and gas) has a characteristic temperature, and it is only through the use of spectroscopy that their temperature can be independently identified.

  1. Hazard Management Dealt by Safety Professionals in Colleges: The Impact of Individual Factors

    Science.gov (United States)

    Wu, Tsung-Chih; Chen, Chi-Hsiang; Yi, Nai-Wen; Lu, Pei-Chen; Yu, Shan-Chi; Wang, Chien-Peng

    2016-01-01

    Identifying, evaluating, and controlling workplace hazards are important functions of safety professionals (SPs). The purpose of this study was to investigate the content and frequency of hazard management dealt by safety professionals in colleges. The authors also explored the effects of organizational factors/individual factors on SPs’ perception of frequency of hazard management. The researchers conducted survey research to achieve the objective of this study. The researchers mailed questionnaires to 200 SPs in colleges after simple random sampling, then received a total of 144 valid responses (response rate = 72%). Exploratory factor analysis indicated that the hazard management scale (HMS) extracted five factors, including physical hazards, biological hazards, social and psychological hazards, ergonomic hazards, and chemical hazards. Moreover, the top 10 hazards that the survey results identified that safety professionals were most likely to deal with (in order of most to least frequent) were: organic solvents, illumination, other chemicals, machinery and equipment, fire and explosion, electricity, noise, specific chemicals, human error, and lifting/carrying. Finally, the results of one-way multivariate analysis of variance (MANOVA) indicated there were four individual factors that impacted the perceived frequency of hazard management which were of statistical and practical significance: job tenure in the college of employment, type of certification, gender, and overall job tenure. SPs within colleges and industries can now discuss plans revolving around these five areas instead of having to deal with all of the separate hazards. PMID:27918474

  2. Hazard Management Dealt by Safety Professionals in Colleges: The Impact of Individual Factors

    Directory of Open Access Journals (Sweden)

    Tsung-Chih Wu

    2016-12-01

    Full Text Available Identifying, evaluating, and controlling workplace hazards are important functions of safety professionals (SPs). The purpose of this study was to investigate the content and frequency of hazard management dealt by safety professionals in colleges. The authors also explored the effects of organizational factors/individual factors on SPs’ perception of frequency of hazard management. The researchers conducted survey research to achieve the objective of this study. The researchers mailed questionnaires to 200 SPs in colleges after simple random sampling, then received a total of 144 valid responses (response rate = 72%). Exploratory factor analysis indicated that the hazard management scale (HMS) extracted five factors, including physical hazards, biological hazards, social and psychological hazards, ergonomic hazards, and chemical hazards. Moreover, the top 10 hazards that the survey results identified that safety professionals were most likely to deal with (in order of most to least frequent) were: organic solvents, illumination, other chemicals, machinery and equipment, fire and explosion, electricity, noise, specific chemicals, human error, and lifting/carrying. Finally, the results of one-way multivariate analysis of variance (MANOVA) indicated there were four individual factors that impacted the perceived frequency of hazard management which were of statistical and practical significance: job tenure in the college of employment, type of certification, gender, and overall job tenure. SPs within colleges and industries can now discuss plans revolving around these five areas instead of having to deal with all of the separate hazards.

  3. Hazard Management Dealt by Safety Professionals in Colleges: The Impact of Individual Factors.

    Science.gov (United States)

    Wu, Tsung-Chih; Chen, Chi-Hsiang; Yi, Nai-Wen; Lu, Pei-Chen; Yu, Shan-Chi; Wang, Chien-Peng

    2016-12-03

    Identifying, evaluating, and controlling workplace hazards are important functions of safety professionals (SPs). The purpose of this study was to investigate the content and frequency of hazard management dealt by safety professionals in colleges. The authors also explored the effects of organizational factors/individual factors on SPs' perception of frequency of hazard management. The researchers conducted survey research to achieve the objective of this study. The researchers mailed questionnaires to 200 SPs in colleges after simple random sampling, then received a total of 144 valid responses (response rate = 72%). Exploratory factor analysis indicated that the hazard management scale (HMS) extracted five factors, including physical hazards, biological hazards, social and psychological hazards, ergonomic hazards, and chemical hazards. Moreover, the top 10 hazards that the survey results identified that safety professionals were most likely to deal with (in order of most to least frequent) were: organic solvents, illumination, other chemicals, machinery and equipment, fire and explosion, electricity, noise, specific chemicals, human error, and lifting/carrying. Finally, the results of one-way multivariate analysis of variance (MANOVA) indicated there were four individual factors that impacted the perceived frequency of hazard management which were of statistical and practical significance: job tenure in the college of employment, type of certification, gender, and overall job tenure. SPs within colleges and industries can now discuss plans revolving around these five areas instead of having to deal with all of the separate hazards.
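
    The study reports exploratory factor analysis followed by one-way MANOVA on the five extracted hazard-management factors. The sketch below shows how such a one-way MANOVA can be run with statsmodels on synthetic stand-in data; the factor scores, the grouping variable and all sample values are fabricated for illustration and are not the survey's data.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Synthetic stand-in for the survey data: five hazard-management factor scores
# (physical, biological, social/psychological, ergonomic, chemical) and one
# individual factor (gender). Illustrates the one-way MANOVA reported above;
# the numbers are random, not the study's responses.
rng = np.random.default_rng(1)
n = 144
df = pd.DataFrame({
    "gender": rng.choice(["male", "female"], size=n),
    "physical": rng.normal(3.0, 0.8, n),
    "biological": rng.normal(2.5, 0.8, n),
    "psychosocial": rng.normal(2.8, 0.8, n),
    "ergonomic": rng.normal(2.6, 0.8, n),
    "chemical": rng.normal(3.2, 0.8, n),
})

fit = MANOVA.from_formula(
    "physical + biological + psychosocial + ergonomic + chemical ~ gender", data=df)
print(fit.mv_test())
```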

  4. Multi-scenario-based hazard analysis of high temperature extremes experienced in China during 1951-2010

    Institute of Scientific and Technical Information of China (English)

    YIN Zhan'e; YIN Jie; ZHANG Xiaowei

    2013-01-01

    China is physically and socio-economically susceptible to global warming-derived high temperature extremes because of its vast area and high urban population density. This article presents a scenario-based analysis method for high temperature extremes aimed at illustrating the latter's hazardous potential and exposure across China. Based on probability analysis, high temperature extreme scenarios with return periods of 5, 10, 20, and 50 years were designed, with a high temperature hazard index calculated by integrating two differentially-weighted extreme temperature indices (maximum temperature and high temperature days). To perform the exposure analysis, a land use map was employed to determine the spatial distribution of susceptible human activities under the different scenarios. The results indicate that there are two heat-prone regions and a sub-hotspot occupying a relatively small land area. However, the societal and economic consequences of such an environmental impact upon the North China Plain and middle/lower Yangtze River Basin would be substantial due to the concentration of human activities in these areas.
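
    A sketch of the scenario construction described above: empirical return levels for the 5-, 10-, 20- and 50-year scenarios, then a hazard index formed by differentially weighting the two normalised temperature indices. The annual series and the 0.6/0.4 weights are fabricated assumptions, not values from the paper.

```python
import numpy as np

# Derive return-period values from annual series via empirical quantiles, then
# combine two normalised indices with unequal weights. Data and the 0.6/0.4
# weights are illustrative assumptions only.
rng = np.random.default_rng(2)
years = 60
tmax_annual = rng.normal(38.0, 2.0, years)        # annual max temperature (deg C)
hot_days_annual = rng.poisson(12, years)          # annual count of high-temperature days

def return_level(series, T):
    """Empirical T-year return level: the (1 - 1/T) quantile of annual values."""
    return np.quantile(series, 1.0 - 1.0 / T)

for T in (5, 10, 20, 50):
    t_T = return_level(tmax_annual, T)
    d_T = return_level(hot_days_annual, T)
    # normalise each index to [0, 1] against the observed range
    t_norm = (t_T - tmax_annual.min()) / (tmax_annual.max() - tmax_annual.min())
    d_norm = (d_T - hot_days_annual.min()) / (hot_days_annual.max() - hot_days_annual.min())
    hazard_index = 0.6 * t_norm + 0.4 * d_norm    # differential weights (assumed)
    print(f"{T:>2}-yr scenario: Tmax={t_T:.1f} C, hot days={d_T:.0f}, hazard index={hazard_index:.2f}")
```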

  5. Superboom Caustic Analysis and Measurement Program (SCAMP) Final Report

    Science.gov (United States)

    Page, Juliet; Plotkin, Ken; Hobbs, Chris; Sparrow, Vic; Salamone, Joe; Cowart, Robbie; Elmer, Kevin; Welge, H. Robert; Ladd, John; Maglieri, Domenic; Piacsek, Andrew

    2015-01-01

    The objectives of the Superboom Caustic Analysis and Measurement (SCAMP) Program were to develop and validate, via flight-test measurements, analytical models for sonic boom signatures in and around focal zones as they are expected to occur during commercial aircraft transition from subsonic to supersonic flight, and to apply these models to focus boom prediction of low-boom aircraft designs. The SCAMP program has successfully investigated sonic boom focusing both analytically and experimentally, while gathering a comprehensive empirical flight test and acoustic dataset, and developing a suite of focused sonic boom prediction tools. An experimental flight and acoustic measurement test was designed during the initial year of the SCAMP program, with execution of the SCAMP flight test occurring in May 2011. The current SCAMP team, led by Wyle, includes partners from the Boeing Company, Pennsylvania State University, Gulfstream Aerospace, Eagle Aeronautics, and Central Washington University. Numerous collaborators have also participated by supporting the experiment with human and equipment resources at their own expense. The experiment involved precision flight of a McDonnell Douglas (now Boeing) F-18B executing different maneuvers that created focused sonic booms. The maneuvers were designed to center on the flight regime expected for commercial supersonic aircraft transonic transition, and also span a range of caustic curvatures in order to provide a variety of conditions for code validations. The SCAMP experiment was designed to capture concurrent F-18B on-board flight instrumentation data, high-fidelity ground-based and airborne acoustic data, and surface and upper air meteorological data. Close coordination with NASA Dryden resulted in the development of new experimental instrumentation and techniques to facilitate the SCAMP flight-test execution, including the development of an F-18B Mach rate cockpit display, TG-14 powered glider in-flight sonic boom measurement

  6. Final Report - Independent Verification Survey Report for the Waste Loading Area, Former Hazardous Waste Management Facility, Brookhaven National Laboratory, Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    P.C. Weaver

    2008-08-19

    The objective of the verification survey was to obtain evidence by means of measurements and sampling to confirm that the final radiological conditions were less than the established release criteria. This objective was achieved via multiple verification components including document reviews to determine the accuracy and adequacy of FSS documentation.

  7. Analysis of the comprehensibility of chemical hazard communication tools at the industrial workplace.

    Science.gov (United States)

    Ta, Goh Choo; Mokhtar, Mazlin Bin; Mohd Mokhtar, Hj Anuar Bin; Ismail, Azmir Bin; Abu Yazid, Mohd Fadhil Bin Hj

    2010-01-01

    Chemical classification and labelling systems may be roughly similar from one country to another, but there are significant differences too. In order to harmonize various chemical classification systems and ultimately provide consistent chemical hazard communication tools worldwide, the Globally Harmonized System of Classification and Labelling of Chemicals (GHS) was endorsed by the United Nations Economic and Social Council (ECOSOC). Several countries, including Japan, Taiwan, Korea and Malaysia, are now in the process of implementing GHS. It is essential to ascertain the comprehensibility of chemical hazard communication tools that are described in the GHS documents, namely the chemical labels and Safety Data Sheets (SDS). Comprehensibility Testing (CT) was carried out with a mixed group of industrial workers in Malaysia (n=150), and factors that influence comprehensibility were analysed using one-way ANOVA. The ability of the respondents to retrieve information from the SDS was also tested in this study. The findings show that almost all of the GHS pictograms meet the ISO comprehension criteria; it is concluded that training and education are the underlying core elements that enhance comprehension of the GHS pictograms and that are also essential in developing persons competent in the use of SDS.

  8. Hazards of volcanic lakes: analysis of Lakes Quilotoa and Cuicocha, Ecuador

    Directory of Open Access Journals (Sweden)

    G. Gunkel

    2008-01-01

    Full Text Available Volcanic lakes within calderas should be viewed as high-risk systems, and intensive lake monitoring must be carried out to evaluate the hazard of potential limnic or phreatic-magmatic eruptions. In Ecuador, two caldera lakes – Lakes Quilotoa and Cuicocha, located in the high Andean region >3000 m a.s.l. – have been the focus of these investigations. Both volcanoes are geologically young or historically active, and have formed large and deep calderas with lakes of 2 to 3 km in diameter, and 248 and 148 m in depth, respectively. In both lakes, visible gas emissions of CO2 occur, and an accumulation of CO2 in the deep water body must be taken into account.

    Investigations were carried out to evaluate the hazards of these volcanic lakes, and in Lake Cuicocha intensive monitoring was carried out for the evaluation of possible renewed volcanic activities. At Lake Quilotoa, a limnic eruption and diffuse CO2 degassing at the lake surface are to be expected, while at Lake Cuicocha, an increased risk of a phreatic-magmatic eruption exists.

  9. A new concept in seismic landslide hazard analysis for practical application

    Science.gov (United States)

    Lee, Chyi-Tyi

    2017-04-01

    A seismic landslide hazard model can be constructed using a deterministic approach (Jibson et al., 2000) or a statistical approach (Lee, 2014). Both approaches yield the spatial probability of landsliding under an earthquake of a given return period. For the statistical approach, our recent study found that there are common patterns among different landslide susceptibility models of the same region. This common susceptibility reflects the relative stability of slopes in a region; higher susceptibility indicates lower stability. Using the common susceptibility together with an earthquake-event landslide inventory and a map of topographically corrected Arias intensity, we can build a relationship among the probability of failure, Arias intensity and susceptibility. This relationship can immediately be used to construct a seismic landslide hazard map for the region in which the empirical relationship was built. If the common susceptibility model is further normalized and the empirical relationship is built with the normalized susceptibility, then the relationship may be practically applied to different regions with similar tectonic environments and climate conditions. This is feasible when a region has no existing earthquake-induced landslide data with which to train a susceptibility model and build the relationship. It is worth mentioning that a rain-induced landslide susceptibility model shows a common pattern similar to that of earthquake-induced landslide susceptibility in the same region, and can be used to build the relationship with an earthquake-event landslide inventory and a map of Arias intensity. These points will be illustrated with examples in the meeting.
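
    The abstract builds an empirical relationship among failure probability, Arias intensity and the common susceptibility. A logistic regression is one simple way to encode such a relationship; the functional form and the synthetic training data below are assumptions for illustration, not the author's model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Model the probability of slope failure as a logistic function of Arias
# intensity and a normalised susceptibility value. The "true" model and the
# synthetic inventory are fabricated for illustration only.
rng = np.random.default_rng(3)
n = 5000
arias = rng.lognormal(mean=-1.0, sigma=1.0, size=n)        # Arias intensity (m/s)
suscept = rng.uniform(0.0, 1.0, size=n)                    # normalised susceptibility
logit = -4.0 + 1.5 * np.log(arias + 1e-3) + 3.0 * suscept  # hidden "true" relationship
failed = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))      # event landslide inventory

X = np.column_stack([np.log(arias + 1e-3), suscept])
model = LogisticRegression().fit(X, failed)

# Hazard for one shaking scenario: P(failure) across a range of susceptibility
scenario = np.column_stack([np.full(5, np.log(0.5)), np.linspace(0.1, 0.9, 5)])
print(np.round(model.predict_proba(scenario)[:, 1], 3))
```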

  10. Analysis and GIS Mapping of Flooding Hazards on 10 May 2016, Guangzhou, China

    Directory of Open Access Journals (Sweden)

    Hai-Min Lyu

    2016-10-01

    Full Text Available On 10 May 2016, Guangdong Province, China, suffered a heavy rainstorm. This rainstorm flooded the whole city of Guangzhou. More than 100,000 people were affected by the flooding, in which eight people lost their lives. Subway stations, cars, and buses were submerged. In order to analyse the influential factors of this flooding, topographical characteristics were mapped using a Digital Elevation Model (DEM) in a Geographical Information System (GIS), and meteorological conditions were statistically summarised at both the whole-city level and the district level. To analyse the relationship between flood risk and urbanization, GIS was also adopted to map the effect of the subway system by applying the Multiple Buffer operator over the flooded area. Based on the analyses, one of the significant influential factors of flooding was identified as the degree of urbanization, e.g., construction of a subway system, which runs along flood-prone areas. The total economic loss due to flooding in highly urbanized city centers has become very serious. Based on the analyses, the traditional standard for the severity of flooding hazards (rainfall intensity grade) was modified: the rainfall intensity for severe flooding was decreased from 50 mm to 30 mm in urbanized city centers. In order to protect cities from flooding, a “Sponge City” planning approach is recommended to increase temporary water storage capacity during heavy rainstorms. In addition, for future city management, the combined use of GIS and Building Information Modelling (BIM) is recommended to evaluate flooding hazards.
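
    A toy version of the multiple-buffer overlay mentioned above, using geopandas/shapely: buffer a "subway line" at several distances and measure the share of a mapped flood polygon that falls inside each buffer. The geometries, buffer distances and projected CRS are illustrative assumptions, not the study's data.

```python
import geopandas as gpd
from shapely.geometry import LineString, Polygon

# Buffer a subway line at several distances and intersect each buffer with a
# flood polygon. A projected CRS (metres) is assumed so that areas and
# distances are meaningful; all coordinates are made up.
subway = gpd.GeoDataFrame(geometry=[LineString([(0, 0), (3000, 500), (6000, 400)])],
                          crs="EPSG:32649")
flood = gpd.GeoDataFrame(geometry=[Polygon([(1000, -800), (5000, -800),
                                            (5000, 1500), (1000, 1500)])],
                         crs="EPSG:32649")

flood_area = flood.geometry.area.sum()
previous = 0.0
for dist in (200, 500, 1000):                      # buffer distances in metres
    buf = gpd.GeoDataFrame(geometry=subway.buffer(dist), crs=subway.crs)
    inside = gpd.overlay(flood, buf, how="intersection").geometry.area.sum()
    print(f"flooded area within {dist:>4} m of the line: "
          f"{inside / flood_area:6.1%} (ring share {(inside - previous) / flood_area:6.1%})")
    previous = inside
```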

  11. Elemental analysis and radiation hazards parameters of bauxite located in Saudi Arabia

    Science.gov (United States)

    Alashrah, S.; E Taher, A.

    2017-04-01

    Since bauxite has been widely used in industry and in scientific investigations for producing aluminum, it is important to measure the radionuclide concentrations to determine the health effects. The bauxite mine is located at Az Zabirah in Saudi Arabia. The concentrations of the radionuclides in the bauxite samples were measured using a NaI(Tl) γ-ray spectrometer. The average (and range) values of the concentrations of 226Ra, 232Th and 40K were 102.2 (62.7-141.1), 156.3 (102.8-202.8) and 116.8 (48.9-191.7) Bq/kg, respectively. These results were compared with the ranges reported in the literature for other locations around the world. The radiation hazard parameters (radium equivalent activity, annual dose, and external hazard index) were also calculated and compared with the levels recommended by the International Commission on Radiological Protection (ICRP-60) and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) reports. There have been no previous studies of natural radioactivity at the Az Zabirah bauxite mine, so these results are a first step toward establishing a database for this location.
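
    The abstract reports average activities of 102.2, 156.3 and 116.8 Bq/kg for 226Ra, 232Th and 40K. The sketch below evaluates the widely used UNSCEAR-style expressions for radium equivalent activity, external hazard index and outdoor dose with those averages; the exact coefficients used in the paper are not stated, so treat the constants here as standard assumptions.

```python
# Radiation hazard parameters from the average activity concentrations
# reported above (Bq/kg). The expressions are the widely used UNSCEAR-style
# formulas; the paper may have applied slightly different coefficients.
c_ra, c_th, c_k = 102.2, 156.3, 116.8   # 226Ra, 232Th, 40K (Bq/kg)

ra_eq = c_ra + 1.43 * c_th + 0.077 * c_k              # radium equivalent activity (Bq/kg)
h_ex = c_ra / 370.0 + c_th / 259.0 + c_k / 4810.0     # external hazard index (should be <= 1)
d_rate = 0.462 * c_ra + 0.604 * c_th + 0.0417 * c_k   # absorbed dose rate in air (nGy/h)
# outdoor annual effective dose (mSv/y): 0.2 occupancy, 0.7 Sv/Gy conversion
aed_out = d_rate * 8760 * 0.2 * 0.7 * 1e-6

print(f"Ra_eq   = {ra_eq:.1f} Bq/kg (limit 370)")
print(f"H_ex    = {h_ex:.2f} (limit 1)")
print(f"D       = {d_rate:.1f} nGy/h")
print(f"AED_out = {aed_out:.3f} mSv/y")
```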

  12. Scale orientated analysis of river width changes due to extreme flood hazards

    Directory of Open Access Journals (Sweden)

    G. Krapesch

    2011-08-01

    Full Text Available This paper analyses the morphological effects of extreme floods (recurrence interval >100 years) and examines which parameters best describe the width changes due to erosion based on 5 affected alpine gravel bed rivers in Austria. The research was based on vertical aerial photos of the rivers before and after extreme floods, hydrodynamic numerical models and cross sectional measurements supported by LiDAR data of the rivers. Average width ratios (width after/before the flood) were calculated and correlated with different hydraulic parameters (specific stream power, shear stress, flow area, specific discharge). Depending on the geomorphological boundary conditions of the different rivers, a mean width ratio between 1.12 (Lech River) and 3.45 (Trisanna River) was determined on the reach scale. The specific stream power (SSP) best predicted the mean width ratios of the rivers especially on the reach scale and sub reach scale. On the local scale more parameters have to be considered to define the "minimum morphological spatial demand of rivers", which is a crucial parameter for addressing and managing flood hazards and should be used in hazard zone plans and spatial planning.
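
    A minimal sketch of the reach-scale analysis: compute each reach's width ratio and its specific stream power, omega = rho*g*Q*S/w, then correlate the two. The reach table (widths, peak discharge Q and slope S) is fabricated for illustration; only the general form of the calculation follows the abstract.

```python
import numpy as np

# Width ratio (post-/pre-flood width) versus specific stream power.
# The reach data below are made up; only the two reported mean width ratios
# (1.12 and 3.45) bracket realistic values.
rho, g = 1000.0, 9.81
reaches = {
    # reach: (width_before_m, width_after_m, peak_Q_m3s, slope)
    "reach_A": (25.0, 30.0, 180.0, 0.004),
    "reach_B": (18.0, 41.0, 320.0, 0.012),
    "reach_C": (22.0, 61.0, 400.0, 0.020),
    "reach_D": (30.0, 38.0, 250.0, 0.006),
}

ratios, powers = [], []
for name, (w0, w1, q, s) in reaches.items():
    width_ratio = w1 / w0
    ssp = rho * g * q * s / w0          # W/m^2, referenced to the pre-flood width
    ratios.append(width_ratio)
    powers.append(ssp)
    print(f"{name}: width ratio {width_ratio:.2f}, SSP {ssp:.0f} W/m^2")

r = np.corrcoef(np.log(powers), np.log(ratios))[0, 1]
print(f"log-log correlation (width ratio vs SSP): r = {r:.2f}")
```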

  13. Causal Analysis of the Inadvertent Contact with an Uncontrolled Electrical Hazardous Energy Source (120 Volts AC)

    Energy Technology Data Exchange (ETDEWEB)

    David E. James; Dennis E. Raunig; Sean S. Cunningham

    2014-10-01

    On September 25, 2013, a Health Physics Technician (HPT) was performing preparations to support a pneumatic transfer from the HFEF Decon Cell to the Room 130 Glovebox in HFEF, per HFEF OI 3165 section 3.5, Field Preparations. This activity involves an HPT setting up and climbing a portable ladder to remove the 14-C meter probe from above ball valve HBV-7. The HPT source-checks the meter and probe and then replaces the probe above HBV-7, which is located above Hood ID# 130 HP. At approximately 13:20, while reaching past the HBV-7 valve position indicator switches in an attempt to place the 14-C meter probe in the desired location, the HPT’s left forearm came in contact with one of the three sets of exposed terminals on the valve position indication switches for HBV-7. This resulted in the HPT receiving an electrical shock from a 120 Volt AC source. Upon moving the arm, following the electrical shock, the HPT noticed two exposed electrical connections on a switch. The HPT then notified the HFEF HPT Supervisor, who in turn notified the MFC Radiological Controls Manager and HFEF Operations Manager of the situation. Work was stopped in the area and the hazard was roped off and posted to prevent access. The HPT was escorted by the HPT Supervisor to the MFC Dispensary and then proceeded to CFA medical for further evaluation. The individual was evaluated and released without any medical restrictions. Causal Factor (Root Cause) A3B3C01/A5B2C08: knowledge-based error (attention was given to wrong issues); written communication content LTA (incomplete/situation not covered). The causal factor (root cause) was attention being given to the wrong issues during the creation, reviews, verifications, and actual performance of HFEF OI-3165, which covers the need to perform the weekly source check and ensure placement of the probe prior to performing a “rabbit” transfer. This resulted in the hazard not being identified and mitigated in the procedure. Work activities

  14. Standardized information for process hazard analysis based on ontology

    Institute of Scientific and Technical Information of China (English)

    吴重光; 许欣; 纳永良; 张卫华

    2012-01-01

    The principal objective of process hazard analysis is to identify hazard scenarios. Both the course of team brainstorming hazard evaluation and its resulting information can be expressed as hazard scenarios. An ontology of hazard scenarios is an accurate expression of standardized process hazard analysis information. An ontology is an explicit specification of a conceptualization. Following the design criteria for ontologies, a standardized representation of process hazard analysis information, called the scenario object model (SOM), was proposed. The SOM is used to represent the contents and structures of hazard evaluation information, and computer automatic reasoning and semi-quantitative algorithms can be implemented on it. Computer-aided automatic hazard evaluation and the transfer, auditing and sharing of safety information were realized effectively by using the SOM ontology.
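
    The SOM represents a hazard scenario (cause, consequence, safeguards, ratings) as a structured object that software can reason over. The dataclass below is a deliberately simplified stand-in for that idea; the field names and the screening rule are chosen for illustration rather than taken from the paper's ontology.

```python
from dataclasses import dataclass, field
from typing import List

# Simplified, illustrative stand-in for a scenario object model (SOM):
# a hazard scenario as a structured object that software can reason over.
# Field names and the screening rule are assumptions, not the paper's schema.
@dataclass
class HazardScenario:
    node: str
    deviation: str               # e.g. a HAZOP guideword deviation
    cause: str
    consequence: str
    safeguards: List[str] = field(default_factory=list)
    severity: int = 1            # 1 (minor) .. 6 (catastrophic)
    likelihood: int = 1          # 1 (rare)  .. 6 (frequent)

    def needs_further_review(self, threshold: int = 12) -> bool:
        """Semi-quantitative screen: flag scenarios whose severity*likelihood
        exceeds a threshold and that rely on fewer than two safeguards."""
        return self.severity * self.likelihood > threshold and len(self.safeguards) < 2

s = HazardScenario(node="reactor feed", deviation="more flow", cause="control valve fails open",
                   consequence="overpressure of downstream vessel",
                   safeguards=["relief valve PSV-101"], severity=5, likelihood=3)
print(s.needs_further_review())   # True under the assumed rule
```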

  15. Evolution and hazard analysis of high-mountain lakes in the Cordillera Vilcabamba (Southern Peru) from 1991 to 2014

    Science.gov (United States)

    Guardamino, Lucía; Drenkhan, Fabian

    2015-04-01

    In recent decades, glaciers in high-mountain regions have experienced unprecedented glacier retreat since the Little Ice Age (LIA). This development triggers the formation and growth of glacier lakes, which in combination with changes in glacier parameters might produce more frequent conditions for the occurrence of disasters, such as Glacier Lake Outburst Floods (GLOF). Facing such a scenario, the analysis of changing lake characteristics and identification of new glacier lakes are imperative in order to identify and reduce potential hazards and mitigate or prevent future disasters for adjacent human settlements. In this study, we present a multi-temporal analysis with Landsat TM 5 and OLI 8 images between 1991 and 2014 in the Cordillera Vilcabamba region (Southern Peru), a remote area with difficult access and climate and glaciological in-situ data scarcity. A semi-automatic model was developed using the band ratios Normalized Difference Snow Index (NDSI) and Normalized Difference Water Index (NDWI) in order to identify glacier and lake area changes. Results corroborate a strong glacier area reduction of about 51% from 1991 (200.3 km²) to 2014 (98.4 km²). At the same time, the number of lakes (total lake surface) has increased at an accelerated rate, from 0.77% (0.48%) in 1991 to 2.31% (2.49%) in 2014. In a multiple criteria analysis to identify potential hazards, 90 out of a total of 329 lakes in 2014 have been selected for further monitoring. Additionally, 29 population centers have been identified as highly exposed to lake related hazards from which 25 indicate a distance less than 1 km to an upstream lake and four are situated in a channel of potential debris flow. In these areas human risks are particularly high in view of a low HDI below Peru's average and hence pronounced vulnerability. We suggest more future research on measurements and monitoring of glacier and lake characteristics in these remote high-mountain regions, which include comprehensive risk
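
    The semi-automatic model described above uses NDSI and NDWI band ratios. The sketch below shows the usual definitions of those indices and a simple thresholding step on toy arrays; the Landsat band pairings are the standard ones (TM bands 2/4/5, OLI bands 3/5/6), while the 0.4 and 0.3 thresholds are typical values, not necessarily those used in the study.

```python
import numpy as np

def ndsi(green, swir1):
    """Normalized Difference Snow Index: (Green - SWIR1) / (Green + SWIR1).
    Landsat 5 TM: bands 2 and 5; Landsat 8 OLI: bands 3 and 6."""
    return (green - swir1) / (green + swir1 + 1e-12)

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters): (Green - NIR) / (Green + NIR).
    Landsat 5 TM: bands 2 and 4; Landsat 8 OLI: bands 3 and 5."""
    return (green - nir) / (green + nir + 1e-12)

# Toy reflectance arrays standing in for co-registered Landsat bands; the
# 0.4 / 0.3 thresholds are typical choices, not necessarily the study's.
rng = np.random.default_rng(4)
green = rng.uniform(0.02, 0.6, (100, 100))
nir = rng.uniform(0.02, 0.6, (100, 100))
swir1 = rng.uniform(0.02, 0.6, (100, 100))

glacier_mask = ndsi(green, swir1) > 0.4
lake_mask = (ndwi(green, nir) > 0.3) & ~glacier_mask
print(f"glacier pixels: {glacier_mask.sum()}, lake pixels: {lake_mask.sum()}")
```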

  16. Risk Assessment Analysis Using Process Hazard Analysis (PHA) and Safety Objective Analysis (SOA) at the Central Gathering Station (CGS) in Onshore Facilities

    Directory of Open Access Journals (Sweden)

    Dimas Jouhari

    2014-03-01

    Full Text Available Process safety has been a major topic for the chemical industries in recent years. One semi-quantitative method that can be used to identify, analyse, and rank hazard risk levels is the combination of Process Hazard Analysis (PHA) and Safety Objective Analysis (SOA). Hazard and Operability Studies (HAZOP) and What-If Analysis are qualitative hazard identification methods that are often applied simultaneously for PHA-SOA. Process Hazard Analysis (PHA) is a sequence of activities that identifies hazards, estimates consequences, estimates the likelihood of a process scenario together with its safeguards, and assigns a risk ranking on a 6x6 PHA matrix. Safety Objective Analysis (SOA), in turn, is a sequence of activities that, based on the scenario causes and the consequences from the PHA, yields the required Independent Protective Layers (IPLs) using a 6x6 SOA matrix. A risk ranking of 6 in the PHA assessment is categorised as safe only if the existing safeguards are always ready to reduce the risk arising from that scenario. However, not all safeguards are always ready to reduce that risk. Therefore, additional analysis is needed to ensure that the risk of the scenario can be reduced. The safety analysis of a scenario with SOA produces IPL requirements that can be closed by confirming suitable safeguards as IPLs. The PHA-SOA assessment results for CGS 1, CGS 3, CGS 4, and CGS 5 show that severity and PHA-SOA likelihood ratings differ between the CGSs even though their processes are identical, so a consistency analysis is required. The results of this consistency analysis can serve as guidance for safety reviews in future risk assessment workshops, which industry typically holds every three to five years.
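
    A generic 6x6 risk-matrix lookup in the spirit of the PHA ranking described above. The banding of severity x likelihood into rankings 1-6 is an assumption for illustration, since the operator's actual PHA and SOA matrices are not given in the abstract.

```python
# Generic 6x6 risk-matrix lookup (illustrative banding only).
def pha_risk_ranking(severity: int, likelihood: int) -> int:
    """Map severity (1=minor .. 6=catastrophic) and likelihood (1=rare ..
    6=frequent) to a risk ranking 1 (highest risk) .. 6 (lowest risk)."""
    if not (1 <= severity <= 6 and 1 <= likelihood <= 6):
        raise ValueError("severity and likelihood must be 1..6")
    score = severity * likelihood                 # 1 .. 36
    bands = [(25, 1), (18, 2), (12, 3), (8, 4), (4, 5)]
    for threshold, ranking in bands:
        if score >= threshold:
            return ranking
    return 6                                      # lowest-risk band

# A ranking of 6 is only as good as the safeguards behind it, which is why
# the SOA step confirms safeguards as Independent Protective Layers (IPLs).
print(pha_risk_ranking(severity=6, likelihood=5))   # 30 -> ranking 1 (highest risk band)
print(pha_risk_ranking(severity=1, likelihood=2))   # 2  -> ranking 6 (lowest risk band)
```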

  17. A new probabilistic shift away from seismic hazard reality in Italy?

    CERN Document Server

    Nekrasova, Anastasia; Kossobokov, Volodya; Panza, Giuliano F

    2014-01-01

    Objective testing is a key issue in the process of revision and improvement of seismic hazard assessments. Therefore we continue the rigorous comparative analysis of past and newly available hazard maps for the territory of Italy against the seismic activity observed in reality. The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of Seismic Hazard Harmonization in Europe (SHARE) project maps, along with the reference hazard maps for the Italian seismic code, all obtained by probabilistic seismic hazard assessment (PSHA), are cross-compared to the three ground shaking maps based on the duly physically and mathematically rooted neo-deterministic approach (NDSHA). These eight hazard maps for Italy are tested against the available data on ground shaking. The results of comparison between predicted macroseismic intensities and those reported for past earthquakes (in the time interval from the year 1000 to 2014) show that models provide rather conservative estimates, which ten...

  18. Spatial analysis of the Los Tuxtlas Volcanic Field (LTVF) and hazard implications

    Science.gov (United States)

    Sieron, K.; Alvarez, D.

    2013-05-01

    The Los Tuxtlas volcanic field (LTVF) is located in the southern part of Veracruz state (Mexico), adjacent to the Gulf of Mexico, and consists of 4 large volcanic edifices, 3 of them considered inactive and the active San Martin shield volcano. The monogenetic volcanoes belonging to the younger series are represented by hundreds of scoria cones and tens of maars and tuff cones, all of which show ages of less than 50,000 years. In comparison to other monogenetic fields, the scoria cone density is quite elevated, at 0.2 cones/km2, although the highest scoria cone density can be observed along narrow zones corresponding to the main NW-SE fault system, where it reaches 0.7 cones/km2. Scoria cones occur as single edifices and in clusters and show individual edifice volumes of 0.0009 km3 to 0.2 km3, with cone heights varying between 21.39 m and 299.21 m. Lava flows associated with scoria cones originate especially along the main NW-SE trending fault and present run-out distances of up to 11 kilometers. Only a few radiocarbon and Ar-Ar dates exist for the LTVF, mostly because of the high cone density and dense vegetation of the Los Tuxtlas region. Therefore, morphological parameters were used to estimate relative ages. In consequence, the scoria cones can be subdivided into four age groups; the members of each group do not seem to follow any particular trend and are rather scattered throughout the field. The explosive (or wet) equivalents of the mainly basaltic strombolian scoria cones are explosion craters, such as maars and tuff cones, which show the highest concentration along the border of the two main geological units to the S of the area with the highest scoria cone concentration. Although the relatively small-scale strombolian eruptions associated with scoria cone emplacement do not represent a considerable hazard for the surrounding population, lava flows can easily extend to the main urban zones, which accommodate about 262,384 inhabitants. Within the area prone to maar formation, the hazard

  19. Hazard screening of chemical releases and environmental equity analysis of populations proximate to toxic release inventory facilities in Oregon.

    Science.gov (United States)

    Neumann, C M; Forman, D L; Rothlein, J E

    1998-04-01

    A comprehensive approach using hazard screening, demographic analysis, and a geographic information system (GIS) for mapping is employed to address environmental equity issues in Oregon. A media-specific chronic toxicity index [or chronic index (CI)] was used to compare environmental chemical releases reported in the EPA's Toxic Chemical Release Inventory (TRI) database. In 1992, 254 facilities reportedly released more than 40 million pounds of toxic chemicals directly into the environment on-site or transferred them to sewage treatment plants or other off-site facilities for disposal and recycling. For each reported on-site TRI chemical release, a CI based on oral toxicity factors and total mass was calculated. CIs were aggregated on a media-, facility-, and chemical-specific basis. Glycol ethers, nickel, trichloroethylene, chloroform, and manganese were ranked as the top five chemicals released statewide based on total CI. In contrast, based on total mass, methanol, nickel, ammonia, acetone, and toluene were identified as the top five TRI chemicals released in Oregon. TRI facility rankings were related to the demographics and household income of surrounding neighborhoods using bivariate GIS mapping and statistical analysis. TRI facilities were disproportionately located in racial and ethnic minority neighborhoods. They were also located in areas with lower incomes compared to those in the surrounding county. No relationship was observed between the hazard ranking of the TRI facilities overall and socioeconomic characteristics of the community in which they were located.
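
    The abstract says the chronic index (CI) is based on oral toxicity factors and total released mass but does not give the formula. The sketch below uses one plausible form, released mass divided by an oral toxicity factor, aggregated per facility; the chemicals, masses and toxicity values are placeholders, not TRI or IRIS data.

```python
import pandas as pd

# One plausible form of a mass- and toxicity-weighted chronic index (CI):
# CI = released mass / oral toxicity factor, summed per facility. The exact
# formula used in the study is not given in the abstract, and the toxicity
# factors and release masses below are placeholders.
releases = pd.DataFrame([
    ("facility_A", "trichloroethylene", 12000.0),
    ("facility_A", "methanol",          250000.0),
    ("facility_B", "nickel compounds",  9000.0),
    ("facility_B", "toluene",           80000.0),
], columns=["facility", "chemical", "pounds_released"])

oral_tox_factor = {           # placeholder denominators (larger = less toxic)
    "trichloroethylene": 0.5,
    "methanol": 2000.0,
    "nickel compounds": 0.02,
    "toluene": 80.0,
}

releases["chronic_index"] = releases.apply(
    lambda r: r.pounds_released / oral_tox_factor[r.chemical], axis=1)

print(releases.sort_values("chronic_index", ascending=False))
print(releases.groupby("facility")["chronic_index"].sum().sort_values(ascending=False))
```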

  20. Analysis of the hazardous low-altitude snowfall, 8th March 2010, in Catalonia

    Science.gov (United States)

    Aran, M.; Rigo, T.; Bech, J.; Brucet, C.; Vilaclara, E.

    2010-09-01

    During the winter season, snow precipitation is quite frequent in the Pyrenees (north-east of the Iberian Peninsula). On average, the total amount of fresh snow at 2200 metres is 250 cm. However, important snow episodes at low altitudes are unlikely. From 1947 to 2009, 16 significant snow episodes took place in the Barcelona area and 18 in the Girona area. On 8th March 2010, a severe wet-snow event had a high social impact on these regions. One of the most remarkable features of this episode was the type of precipitation (wet snow) and the large amount of precipitation combined with strong wind gusts, which caused the collapse of electricity pylons and of forest trees. The damage was very important in the north-eastern part, and the regional government approved funds of 21.4 million € to reduce the impact caused by this event. Although other low-altitude snowfall events in Catalonia have been diagnosed previously, the analysis of this event can contribute to a better characterisation of these snow episodes. In this study, we will present the synoptic framework, characterised by the presence of a deep low in the north-east of Catalonia moving through the Ebro valley to the Catalan coast. To do this we will use ECMWF reanalyses and Meteosat images. The main features for predicting this snow event, and the critical points, were the total amount of precipitation and the snow level forecast by mesoscale models (MM5, WRF). The model outputs for precipitation, temperature and wind will be compared with automatic weather station, radar and radiosounding data. The snow level and the type of precipitation are compared with the information received from spotters. The main storm was characterised by moderate vertical development, with tops of 8 km (4 km was the average height during the initial and final phases of the event). Lightning activity was also observed: 310 intra-cloud and 128 cloud-to-ground flashes. The type of precipitation at a specific location in the eastern zone temporally changed because