WorldWideScience

Sample records for preliminary hazards analysis

  1. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during the facility's construction and testing and completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventive features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that the facility, depending on its radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.
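
    The DOE-STD-1027-92 hazard categorization step mentioned above is often applied as a sum-of-fractions comparison of the facility radionuclide inventory against Category 2 and Category 3 threshold quantities. The sketch below illustrates that kind of check only; the isotopes, inventories, and threshold values are invented placeholders, not figures from this report.

    ```python
    # Sketch of a DOE-STD-1027-style hazard categorization check.
    # Inventories and threshold values below are illustrative placeholders only.

    CAT2_THRESHOLDS_CI = {"Cs-137": 8.9e4, "Sr-90": 1.6e4}   # hypothetical values
    CAT3_THRESHOLDS_CI = {"Cs-137": 6.0e1, "Sr-90": 1.6e1}   # hypothetical values

    def sum_of_fractions(inventory_ci, thresholds_ci):
        """Sum of (isotope inventory / threshold) over all isotopes present."""
        return sum(q / thresholds_ci[iso] for iso, q in inventory_ci.items()
                   if iso in thresholds_ci)

    def categorize(inventory_ci):
        """Return a preliminary hazard category string for a given inventory."""
        if sum_of_fractions(inventory_ci, CAT2_THRESHOLDS_CI) >= 1.0:
            return "Category 2"
        if sum_of_fractions(inventory_ci, CAT3_THRESHOLDS_CI) >= 1.0:
            return "Category 3"
        return "Below Category 3 (radiologically exempt)"

    print(categorize({"Cs-137": 120.0, "Sr-90": 5.0}))  # -> "Category 3"
    ```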

  2. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  3. Repository Subsurface Preliminary Fire Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features and fire and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during the development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c), which was prepared in accordance with AP-2.21Q, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities''; Attachment 4 of AP-ESH-008, ''Hazards Analysis System''; and AP-3.11Q, ''Technical Reports''. The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  4. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  5. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, W.S.

    1994-08-23

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Following further analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as Revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision also incorporates the recommendations provided in the original hazards analysis. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for safety classification of the thermal stabilization equipment.

  6. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    Energy Technology Data Exchange (ETDEWEB)

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What-If' Hazards Analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operational sequence steps for the mechanical handling operations: preparing the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  7. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews the NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts.

  8. A Sensitivity Study for an Evaluation of Input Parameters Effect on a Preliminary Probabilistic Tsunami Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Hyun-Me; Kim, Min Kyu; Choi, In-Kil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Dong-Hoon [Chonnam National University, Gwangju (Korea, Republic of)

    2014-10-15

    Tsunami hazard analysis has been based on seismic hazard analysis, which can be performed using either a deterministic or a probabilistic method. To account for the uncertainties in hazard analysis, the probabilistic method is regarded as the more attractive approach. In the probabilistic method, the various parameters and their weights are treated using a logic tree approach. Because many parameters enter the hazard analysis, their uncertainties should be characterized through sensitivity analysis. As an application of probabilistic tsunami hazard analysis, a preliminary study for the Ulchin NPP site had previously been performed. That study used the fault source information published by the Atomic Energy Society of Japan (AESJ). Tsunami propagation was simulated using TSUNAMI 1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), and the wave parameters were estimated from the simulation results. In this study, a sensitivity analysis has been performed for the fault sources selected in the previous studies. To analyze the effect of the input parameters, the sensitivity analysis focused on the E3 fault source published by AESJ. The results show the effects of the recurrence interval, the potential maximum magnitude, and the beta value: the level of annual exceedance probability is affected by the recurrence interval, while wave heights are influenced by the potential maximum magnitude and the beta value. In the future, a sensitivity analysis will be performed for all fault sources in the western part of Japan published by AESJ.
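
    As a hedged illustration of the kind of sensitivity the abstract reports, the sketch below shows how the annual exceedance rate of a single fault source depends on its recurrence interval under a truncated Gutenberg-Richter (beta, Mmax) magnitude model. All parameter values are assumptions for illustration, not values from the study.

    ```python
    import numpy as np

    # Illustrative sensitivity of annual exceedance rate to source recurrence
    # interval and to the truncated Gutenberg-Richter parameters (beta, Mmax).
    # All numbers are assumptions, not values from the Ulchin study.

    def magnitude_exceedance_fraction(m, m_min, m_max, beta):
        """Fraction of events with magnitude >= m (truncated exponential model)."""
        num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
        den = 1.0 - np.exp(-beta * (m_max - m_min))
        return np.clip(num / den, 0.0, 1.0)

    def annual_exceedance_rate(m, recurrence_interval_yr, m_min, m_max, beta):
        """Rate of events per year with magnitude >= m for one fault source."""
        total_rate = 1.0 / recurrence_interval_yr        # events/yr, all magnitudes
        return total_rate * magnitude_exceedance_fraction(m, m_min, m_max, beta)

    for T in (300.0, 500.0, 1000.0):                      # recurrence intervals (yr)
        rate = annual_exceedance_rate(m=8.0, recurrence_interval_yr=T,
                                      m_min=7.0, m_max=8.5, beta=2.0)
        print(f"T = {T:6.0f} yr -> annual rate of M>=8.0 events: {rate:.2e}")
    ```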

  9. Assessing the long-term probabilistic volcanic hazard for tephra fallout in Reykjavik, Iceland: a preliminary multi-source analysis

    Science.gov (United States)

    Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús

    2015-04-01

    Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered the major volcanic hazard in Iceland because of their high frequency and potentially very devastating local impact. Tephra fallout is also frequent and can impact larger areas. It is driven by the wind direction, which can change with both altitude and season, making it impossible to predict a priori where the tephra will be deposited during the next eruptions. Most of the volcanic activity in Iceland occurs in the central eastern part, over 100 km to the east of the main population centre around the capital Reykjavík. Therefore, the hazard from tephra fallout in Reykjavík is expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as large urban areas or important infrastructures. In this work we present the preliminary analysis aiming to perform a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout focused on the target area which includes the municipality of Reykjavík and the Keflavík international airport. This approach inverts the more common perspective, in which the hazard analysis is focused on the source (the volcanic system), and follows a multi-source approach: the idea is to quantify, homogeneously, the hazard due to the main hazardous volcanoes that could pose a tephra fallout threat for the municipality of Reykjavík and the Keflavík airport. PVHA for each volcanic system is calculated independently and the results

  10. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full waveform tsunami computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
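
    The Green's function summation described above can be sketched in a few lines: precomputed unit-slip tsunami waveforms for each subfault are scaled by the slip of a given rupture scenario and summed at each coastal site. The array shapes and random values below are assumptions for illustration only.

    ```python
    import numpy as np

    # Sketch of Green's-function summation for scenario tsunami waveforms:
    # precomputed unit-slip waveforms for each subfault are weighted by the
    # scenario slip on that subfault and summed at each coastal site.
    # Shapes and values are illustrative assumptions.

    n_subfaults, n_sites, n_times = 20, 5, 1024
    rng = np.random.default_rng(0)

    # unit_waveforms[i, j, :] = waveform at site j from 1 m of slip on subfault i
    unit_waveforms = rng.normal(size=(n_subfaults, n_sites, n_times))
    scenario_slip_m = rng.uniform(0.0, 8.0, size=n_subfaults)   # slip per subfault

    # Weighted sum over subfaults -> (n_sites, n_times) scenario waveforms
    scenario_waveforms = np.tensordot(scenario_slip_m, unit_waveforms, axes=1)

    peak_amplitude = np.abs(scenario_waveforms).max(axis=1)     # per coastal site
    print(peak_amplitude)
    ```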

  11. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-02-01

    The need for remote-handled low-level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest-ranked alternative for providing continued, uninterrupted disposal capability for remote-handled LLW generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  12. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Project

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-10-01

    The need for remote-handled low-level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest-ranked alternative for providing continued, uninterrupted disposal capability for remote-handled LLW generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  13. Preliminary Earthquake Hazard Map of Afghanistan

    Science.gov (United States)

    Boyd, Oliver S.; Mueller, Charles S.; Rukstales, Kenneth S.

    2007-01-01

    ... Deformation here is expressed as a belt of major, north-northeast-trending, left-lateral strike-slip faults and abundant seismicity. The seismicity intensifies farther to the northeast and includes a prominent zone of deep earthquakes associated with northward subduction of the Indian plate beneath Eurasia that extends beneath the Hindu Kush and Pamir Mountains. Production of the seismic hazard maps is challenging because the geological and seismological data required to produce a seismic hazard model are limited. The data that are available for this project include historical seismicity and poorly constrained slip rates on only a few of the many active faults in the country. Much of the hazard is derived from a new catalog of historical earthquakes: from 1964 to the present, with magnitude equal to or greater than about 4.5, and with depth between 0 and 250 kilometers. We also include four specific faults in the model: the Chaman fault with an assigned slip rate of 10 mm/yr, the Central Badakhshan fault with an assigned slip rate of 12 mm/yr, the Darvaz fault with an assigned slip rate of 7 mm/yr, and the Hari Rud fault with an assigned slip rate of 2 mm/yr. For these faults and for shallow seismicity less than 50 km deep, we incorporate published ground-motion estimates from tectonically active regions of western North America, Europe, and the Middle East. Ground-motion estimates for deeper seismicity are derived from data in subduction environments. We apply estimates derived for tectonic regions where subduction is the main tectonic process for intermediate-depth seismicity between 50- and 250-km depth. Within the framework of these limitations, we have developed a preliminary probabilistic seismic-hazard assessment of Afghanistan, the type of analysis that underpins the seismic components of modern building codes in the United States. The assessment includes maps of estimated peak ground acceleration (PGA), 0.2-second spectral acceleration (SA), and 1.0-second SA.

  14. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  15. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  16. K Basin Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  17. K Basins Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WEBB, R.H.

    1999-12-29

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062, Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  18. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  19. Preliminary Feasibility, Design, and Hazard Analysis of a Boiling Water Test Loop Within the Idaho National Laboratory Advanced Test Reactor National Scientific User Facility

    Energy Technology Data Exchange (ETDEWEB)

    Douglas M. Gerstner

    2009-05-01

    The Advanced Test Reactor (ATR) is a pressurized light-water reactor with a design thermal power of 250 MW. The principal function of the ATR is to provide a high neutron flux for testing reactor fuels and other materials. The ATR and its support facilities are located at the Idaho National Laboratory (INL). A Boiling Water Test Loop (BWTL) is being designed for one of the irradiation test positions within the ATR. The objective of the new loop will be to simulate boiling water reactor (BWR) conditions to support clad corrosion and related reactor material testing. Further, it will accommodate power ramping tests of candidate high burn-up fuels and fuel pins/rods for the commercial BWR utilities. The BWTL will be much like the pressurized water loops already in service in 5 of the 9 “flux traps” (regions of enhanced neutron flux) in the ATR. The loop coolant will be isolated from the primary coolant system so that the loop’s temperature, pressure, flow rate, and water chemistry can be independently controlled. This paper presents the proposed general design of the in-core and auxiliary BWTL systems; the preliminary results of the neutronics and thermal hydraulics analyses; and the preliminary hazard analysis for safe normal and transient BWTL and ATR operation.

  20. MGR External Events Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design; Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  1. Probabilistic analysis of tsunami hazards

    Science.gov (United States)

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than the empirical attenuation relationships that PSHA uses to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
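
    As a hedged sketch of the empirical route described above, an exceedance-rate hazard curve can be built directly from a runup catalog by counting events above each runup level and dividing by the catalog length. The catalog values and exposure time below are invented for illustration, not data from the paper.

    ```python
    import numpy as np

    # Empirical tsunami hazard curve from a (hypothetical) runup catalog:
    # annual rate of exceeding each runup level = count above level / catalog years.

    catalog_runup_m = np.array([0.3, 0.5, 0.8, 1.2, 2.1, 3.4, 0.4, 0.9, 1.6])
    catalog_length_yr = 150.0

    levels_m = np.array([0.5, 1.0, 2.0, 3.0])
    annual_rate = np.array([(catalog_runup_m >= h).sum() for h in levels_m]) / catalog_length_yr

    # Poisson probability of at least one exceedance in a 50-year exposure time
    p_50yr = 1.0 - np.exp(-annual_rate * 50.0)

    for h, lam, p in zip(levels_m, annual_rate, p_50yr):
        print(f"runup >= {h:.1f} m: rate {lam:.3f}/yr, P(50 yr) = {p:.2f}")
    ```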

  2. Hazardous Materials Hazard Analysis, Portland, Oregon.

    Science.gov (United States)

    1981-06-01

    Fragments of the original report: a chart of hazardous-materials accidents in Oregon, 1976-1979, giving injury and fatality rates per 100 million miles, and an incident account of a commercial truck that jackknifed and skidded on icy roads, resulting in a hazardous material spill and release.

  3. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    Energy Technology Data Exchange (ETDEWEB)

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) The occurrence of a fire related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees. (3) Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  4. Application of Preliminary Hazard Analysis in Operation and Management at Secondary Surveillance Radar Station%预先危险分析方法在航管二次雷达站运行管理中的应用

    Institute of Scientific and Technical Information of China (English)

    舒涛

    2012-01-01

    This paper gives a brief introduction to preliminary hazard analysis (PHA) and focuses on the application of PHA in the operation and management of Secondary Surveillance Radar (SSR) stations. It identifies the hazard sources in the operation and management of the SSR station at Guanghan Airport and analyzes the risks qualitatively and quantitatively in terms of consequence severity and likelihood of occurrence. It proposes effective measures to control the risks, reduce the probability of accidents, ensure the normal operation of the SSR, and secure the flight training of our college.
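
    The qualitative severity-by-likelihood ranking that a PHA of this kind typically uses can be illustrated with a small risk matrix. The categories, matrix entries, and example hazards below are illustrative assumptions, not those used for the Guanghan Airport SSR station.

    ```python
    # Sketch of a qualitative PHA risk ranking: each identified hazard is given
    # a severity and a likelihood category, and the pair maps to a risk level.
    # All categories and entries are illustrative placeholders.

    SEVERITY = {"negligible": 0, "marginal": 1, "critical": 2, "catastrophic": 3}
    LIKELIHOOD = {"improbable": 0, "remote": 1, "occasional": 2, "frequent": 3}

    # risk_matrix[severity_index][likelihood_index]
    RISK_MATRIX = [
        ["low",    "low",    "low",    "medium"],
        ["low",    "low",    "medium", "high"],
        ["low",    "medium", "high",   "high"],
        ["medium", "high",   "high",   "high"],
    ]

    def rank_hazard(severity, likelihood):
        return RISK_MATRIX[SEVERITY[severity]][LIKELIHOOD[likelihood]]

    hazards = [
        ("loss of radar transmitter cooling", "critical", "occasional"),
        ("lightning strike on antenna",       "marginal", "remote"),
    ]
    for name, sev, lik in hazards:
        print(f"{name}: {rank_hazard(sev, lik)} risk")
    ```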

  5. Preliminary deformation model for National Seismic Hazard map of Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Meilano, Irwan; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z. [Geodesy Research Division, Faculty of Earth Science and Technology, Institute of Technology Bandung (Indonesia); Susilo,; Efendi, Joni [Agency for Geospatial Information (BIG) (Indonesia)

    2015-04-24

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed from block rotation and strain accumulation functions in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. Interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth, possibly in combination with creep in the shallower part. This research shows that rigid body motion dominates the deformation pattern, with magnitudes of more than 15 mm/year, except in narrow areas near subduction zones and active faults where deformation reaches 25 mm/year.
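
    The rigid-block part of such a model amounts to computing surface velocities from an Euler (rotation) vector, v = ω × r. The sketch below shows that calculation for a single point; the Euler pole and rotation rate are assumed example values, not parameters of the six blocks estimated in the paper.

    ```python
    import numpy as np

    # Rigid-block surface velocity from an Euler vector: v = omega x r, with
    # omega the block rotation vector and r the site position in Earth-centered
    # coordinates. Pole location and rate below are illustrative assumptions.

    R_EARTH_M = 6.371e6

    def ecef_unit(lat_deg, lon_deg):
        lat, lon = np.radians([lat_deg, lon_deg])
        return np.array([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)])

    def block_velocity_mm_per_yr(site_lat, site_lon, pole_lat, pole_lon, rate_deg_per_myr):
        omega = np.radians(rate_deg_per_myr) / 1.0e6 * ecef_unit(pole_lat, pole_lon)  # rad/yr
        r = R_EARTH_M * ecef_unit(site_lat, site_lon)                                 # m
        return np.cross(omega, r) * 1.0e3                                             # mm/yr, ECEF frame

    v = block_velocity_mm_per_yr(site_lat=-6.9, site_lon=107.6,   # near Bandung
                                 pole_lat=50.0, pole_lon=-95.0,   # assumed pole
                                 rate_deg_per_myr=0.3)            # assumed rate
    print(np.linalg.norm(v), "mm/yr (magnitude of rigid-block motion)")
    ```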

  6. The Integrated Hazard Analysis Integrator

    Science.gov (United States)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and

  7. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique is used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We illustrate the flexibility of the hazard modeling distribution and how it approaches different distributions.
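
    A hedged sketch of the kind of comparison the abstract describes: the hazard function h(t) = f(t) / (1 - F(t)) is evaluated for a few distributions commonly used in hazard and failure modeling. The parameter choices are arbitrary and serve only to show constant, increasing, and decreasing hazard shapes.

    ```python
    import numpy as np
    from scipy import stats

    # Compare hazard functions h(t) = f(t) / (1 - F(t)) for several candidate
    # distributions. Parameter values are arbitrary illustrative assumptions.

    t = np.linspace(0.1, 5.0, 50)

    def hazard(dist, t):
        return dist.pdf(t) / dist.sf(t)      # sf(t) = 1 - F(t)

    models = {
        "exponential (constant hazard)":      stats.expon(scale=1.0),
        "Weibull k=2 (increasing hazard)":    stats.weibull_min(2.0),
        "Weibull k=0.5 (decreasing hazard)":  stats.weibull_min(0.5),
    }
    for name, dist in models.items():
        h = hazard(dist, t)
        print(f"{name}: h(0.1)={h[0]:.2f}, h(5.0)={h[-1]:.2f}")
    ```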

  8. Phase 2 fire hazard analysis for the canister storage building

    Energy Technology Data Exchange (ETDEWEB)

    Sadanaga, C.T., Westinghouse Hanford

    1996-07-01

    The fire hazard analysis assesses the risk from fire in a facility to ascertain whether the fire protection policies are met. This document provides a preliminary FHA for the CSB facility. Open items have been noted in the document. A final FHA will be required at the completion of definitive design, prior to operation of the facility.

  9. 14 CFR 437.29 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space; COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION; ... Documentation; § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with §...

  10. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  11. Canister storage building hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    POWERS, T.B.

    1999-05-11

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', and meets the intent of HNF-PRO-704, ''Hazard and Accident Analysis Process''. This hazard analysis implements the requirements of DOE Order 5480.23, ''Nuclear Safety Analysis Reports''.

  12. UVISS preliminary visibility analysis

    DEFF Research Database (Denmark)

    Betto, Maurizio

    1998-01-01

    The goal of this work is to obtain a preliminary assessment of the sky visibility for an astronomical telescope located on the express pallet of the International Space Station (ISS), taking into account the major constraints imposed on the instrument by the ISS attitude and structure. Part of the w...

  13. UVISS preliminary visibility analysis

    DEFF Research Database (Denmark)

    Betto, Maurizio

    1998-01-01

    The goal of this work is to obtain a preliminary assessment of the sky visibility for an astronomical telescope located on the express pallet of the International Space Station (ISS), taking into account the major constraints imposed on the instrument by the ISS attitude and structure. Part...

  14. 14 CFR 437.55 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space; COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION; LICENSING EXPERIMENTAL PERMITS; Safety Requirements; § 437.55 Hazard analysis. (a) A permittee...

  15. Cold Vacuum Drying Facility hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  16. Canister storage building hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  17. Chemical Safety Alert: Identifying Chemical Reactivity Hazards Preliminary Screening Method

    Science.gov (United States)

    Introduces small-to-medium-sized facilities to a method developed by the Center for Chemical Process Safety (CCPS), based on a series of twelve yes-or-no questions, to help determine chemical reactivity hazards in warehousing, repackaging, blending, mixing, and processing.

  18. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment was performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented in the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  19. Preliminary volcano-hazard assessment for Iliamna Volcano, Alaska

    Science.gov (United States)

    Waythomas, Christopher F.; Miller, Thomas P.

    1999-01-01

    Iliamna Volcano is a 3,053-meter-high, ice- and snow-covered stratovolcano in the southwestern Cook Inlet region about 225 kilometers southwest of Anchorage and about 100 kilometers northwest of Homer. Historical eruptions of Iliamna Volcano have not been positively documented; however, the volcano regularly emits steam and gas, and small, shallow earthquakes are often detected beneath the summit area. The most recent eruptions of the volcano occurred about 300 years ago, and possibly as recently as 90-140 years ago. Prehistoric eruptions have generated plumes of volcanic ash, pyroclastic flows, and lahars that extended to the volcano flanks and beyond. Rock avalanches from the summit area have occurred numerous times in the past. These avalanches flowed several kilometers down the flanks and at least two large avalanches transformed to cohesive lahars. The number and distribution of known volcanic ash deposits from Iliamna Volcano indicate that volcanic ash clouds from prehistoric eruptions were significantly less voluminous and probably less common relative to ash clouds generated by eruptions of other Cook Inlet volcanoes. Plumes of volcanic ash from Iliamna Volcano would be a major hazard to jet aircraft using Anchorage International Airport and other local airports, and depending on wind direction, could drift at least as far as the Kenai Peninsula and beyond. Ashfall from future eruptions could disrupt oil and gas operations and shipping activities in Cook Inlet. Because Iliamna Volcano has not erupted for several hundred years, a future eruption could involve significant amounts of ice and snow that could lead to the formation of large lahars and downstream flooding. The greatest hazards in order of importance are described below and shown on plate 1.

  20. Concept Overview & Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, Mark

    2017-07-12

    'H2@Scale' is an opportunity for wide-scale use of hydrogen as an intermediate that carries energy from various production options to multiple uses. It is based on identifying and developing opportunities for low-cost hydrogen production and investigating opportunities for using that hydrogen across the electricity, industrial, and transportation sectors. One of the key production opportunities is the use of low-cost electricity that may be generated under high penetrations of variable renewable generators such as wind and solar photovoltaics. The technical potential demand for hydrogen across the sectors is 60 million metric tons per year. The U.S. has sufficient domestic renewable resources that each resource type alone could meet that demand, and the demand could readily be met using a portfolio of generation options. This presentation provides an overview of the concept and the technical potential demand and resources. It also motivates analysis and research on H2@Scale.

  1. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  2. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  3. Preliminary volcano-hazard assessment for Augustine Volcano, Alaska

    Science.gov (United States)

    Waythomas, Christopher F.; Waitt, Richard B.

    1998-01-01

    Augustine Volcano is a 1250-meter high stratovolcano in southwestern Cook Inlet about 280 kilometers southwest of Anchorage and within about 300 kilometers of more than half of the population of Alaska. Explosive eruptions have occurred six times since the early 1800s (1812, 1883, 1935, 1964-65, 1976, and 1986). The 1976 and 1986 eruptions began with an initial series of vent-clearing explosions and high vertical plumes of volcanic ash followed by pyroclastic flows, surges, and lahars on the volcano flanks. Unlike some prehistoric eruptions, a summit edifice collapse and debris avalanche did not occur in 1812, 1935, 1964-65, 1976, or 1986. However, early in the 1883 eruption, a portion of the volcano summit broke loose forming a debris avalanche that flowed to the sea. The avalanche initiated a small tsunami reported on the Kenai Peninsula at English Bay, 90 kilometers east of the volcano. Plumes of volcanic ash are a major hazard to jet aircraft using Anchorage International and other local airports. Ashfall from future eruptions could disrupt oil and gas operations and shipping activities in Cook Inlet. Eruptions similar to the historical and prehistoric eruptions are likely in Augustine's future.

  4. Potentially hazardous plants of Puerto Rico: preliminary guide

    Energy Technology Data Exchange (ETDEWEB)

    Ferguson, F.F.; Medina, F.R.

    1975-08-01

    General information is presented about the kinds of native and imported plants in Puerto Rico (weeds, grasses, vines, cactuses, shrubs, trees and parts thereof) that should be avoided, or not ingested. Small amounts of eaten wild plant materials are usually not likely to be hazardous although large amounts may be dangerous; the striking exception is mushrooms. While a number of Puerto Rican plants are lethal to cattle, only a few are known to cause death to man as, for example, the fruit of the Deadly Manchineel, Hippomane mancinella and the seed of the Rosary Pea, Abrus precatorius. Tourists especially should avoid tasting any green or yellowish apples growing on a medium-sized tree. The Hippomane fruit resembles the Crabapple of temperate zones. It is now unlawful to use the Rosary Pea in the local handicraft industry. An item of special interest is the delicious fruit of Mamey often offered for sale at roadside, the outer coating of which is poisonous. All of the light brown outer covering, including especially all of the inner whitish tunic, must be carefully removed from the golden yellow fruit before eating, or else illness may result. Relatively few of the plants presented here will produce major physical problems if only contacted or chewed, but ingestion of some plant parts produces severe toxic symptoms.

  5. Preliminary tsunami hazard assessment in British Columbia, Canada

    Science.gov (United States)

    Insua, T. L.; Grilli, A. R.; Grilli, S. T.; Shelby, M. R.; Wang, K.; Gao, D.; Cherniawsky, J. Y.; Harris, J. C.; Heesemann, M.; McLean, S.; Moran, K.

    2015-12-01

    Ocean Networks Canada (ONC), a not-for-profit initiative of the University of Victoria that operates several cabled ocean observatories, is developing a new generation of ocean observing systems (referred to as Smart Ocean Systems™) involving advanced undersea observation technologies, data networks, and analytics. The ONC Tsunami project is a Smart Ocean Systems™ project that addresses the need for a near-field tsunami detection system for the coastal areas of British Columbia. Recent studies indicate that there is a 40-80% probability over the next 50 years of a significant tsunami impacting the British Columbia (BC) coast with runups higher than 1.5 m. The NEPTUNE cabled ocean observatory, operated by ONC off the west coast of British Columbia, could be used to detect near-field tsunami events with existing instrumentation, including seismometers and bottom pressure recorders. As part of this project, new tsunami simulations are underway for the BC coast. Tsunami propagation is being simulated with the FUNWAVE-TVD model for a suite of new source models representing Cascadia megathrust rupture scenarios. Simulations are performed by one-way coupling in a series of nested model grids (from the source to the BC coast), whose bathymetry was developed based on digital elevation models (DEMs) of the area, to estimate both tsunami arrival time and coastal runup/inundation for different locations. Besides inundation, maps of additional parameters such as maximum current are being developed that will aid in tsunami hazard assessment and risk mitigation, as well as in developing evacuation plans. We will present initial results of this work for the Port Alberni inlet, in particular Ucluelet, based on new source models developed using the best available data. We will also present a model validation using measurements of the 2011 transpacific Tohoku-oki tsunami recorded in coastal BC by several instruments from various US and Canadian agencies.

  6. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Sevougian, S. David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drillstring emplacement mode.]
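
    The accident-frequency bookkeeping outlined above can be sketched as an initiating-event frequency propagated through event-tree branch probabilities. The initiating event, branches, and numbers below are invented placeholders for illustration, not values from the DBEMHA.

    ```python
    # Sketch of event-tree frequency propagation: an initiating-event frequency
    # is multiplied by branch probabilities representing success or failure of
    # prevention/mitigation features. All numbers are illustrative assumptions.

    initiating_event_per_yr = 1.0e-2              # e.g., handling fault during lowering
    branches = {
        "brake engages, load held":                  0.99,
        "brake fails, package drop in borehole":     0.01 * 0.9,
        "brake fails, drop with breach at surface":  0.01 * 0.1,
    }

    for outcome, p in branches.items():
        print(f"{outcome}: {initiating_event_per_yr * p:.2e} /yr")

    assert abs(sum(branches.values()) - 1.0) < 1e-9   # branch probabilities sum to 1
    ```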

  7. Results of the probabilistic volcanic hazard analysis project

    Energy Technology Data Exchange (ETDEWEB)

    Youngs, R.; Coppersmith, K.J.; Perman, R.C. [Geomatrix Consultants, Inc., San Francisco, CA (United States)

    1996-12-01

    The Probabilistic Volcanic Hazard Analysis (PVHA) project, sponsored by the U.S. Department of Energy (DOE), has been conducted to assess the probability of a future volcanic event disrupting the potential repository at Yucca Mountain. The methodology for the PVHA project is summarized in Coppersmith and others (this volume). The judgments of ten earth scientists who were members of an expert panel were elicited to ensure that a wide range of approaches were considered. Each expert identified one or more approaches for assessing the hazard and they quantified their uncertainties in models and parameter values. Aggregated results are expressed as a probability distribution on the annual frequency of intersecting the proposed repository block. This paper presents some of the key results of the PVHA assessments. These results are preliminary; the final report for the study is planned to be submitted to DOE in April 1996.
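
    The aggregation step described above, combining each expert's uncertainty into a single distribution on the annual frequency of intersection, can be sketched as an equally weighted mixture of per-expert distributions. The lognormal form, equal weighting, and numerical values below are assumptions for illustration, not the elicited PVHA inputs.

    ```python
    import numpy as np

    # Equal-weight aggregation of expert assessments of an annual frequency:
    # sample each expert's distribution and pool the samples. The per-expert
    # medians and uncertainties are hypothetical examples.

    rng = np.random.default_rng(1)

    experts = [(1.0e-8, 1.0), (5.0e-9, 0.8), (3.0e-8, 1.2)]   # (median, sigma_ln)

    samples = np.concatenate([
        rng.lognormal(mean=np.log(med), sigma=sig, size=10_000)
        for med, sig in experts
    ])                                                        # equal weights

    print("aggregate mean  :", samples.mean())
    print("aggregate median:", np.median(samples))
    print("5th-95th pct    :", np.percentile(samples, [5, 95]))
    ```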

  8. 40 CFR 68.67 - Process hazard analysis.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment; CHEMICAL ACCIDENT PREVENTION PROVISIONS; Program 3 Prevention Program; § 68.67 Process hazard analysis. (a) The owner or operator shall perform an initial process hazard analysis (hazard evaluation)...

  9. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    Science.gov (United States)

    Klügel, J.

    2005-12-01

    Based on the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of the Swiss nuclear power plants, the PEGASOS project (2000-2004), challenges to seismic hazard analysis of critical infrastructures are discussed from the perspective of a professional safety analyst. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessments) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies are to a large extent driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project showed an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * The ambiguous solution of PSHA logic trees * The inadequate mathematical treatment of the results of expert elicitations based on the assumption of bias-free expert estimates * The problems associated with the "think model" of the separation of epistemic and aleatory uncertainties * The consequences of the ergodic assumption used to justify the transfer of attenuation equations from other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants, as required by U.S. NRC RG 1.165, will be evaluated. As a principal alternative for the development of a

  10. Probabilistic earthquake hazard analysis for Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. The logic tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the eastern zone's districts (e.g., El Nozha) and the lowest values in the northern and western zone's districts (e.g., El Sharabiya and El Khalifa).

  11. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. The distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen’s seismic hazard are the events from the West Arabian Shield seismic zone.
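
    The 10%-in-50-years hazard level corresponds to the quoted 475-year return period under a Poisson occurrence assumption; the short sketch below illustrates that standard conversion (the Poisson assumption is ours, since the abstract does not state the occurrence model used).

```python
import math

def return_period(p_exceed: float, exposure_years: float) -> float:
    """Return period implied by an exceedance probability over an exposure
    time, assuming Poisson (memoryless) occurrence of exceedances."""
    return -exposure_years / math.log(1.0 - p_exceed)

def prob_of_exceedance(return_period_years: float, exposure_years: float) -> float:
    """Inverse relation: probability of at least one exceedance in the window."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# The 10%-in-50-years level quoted in the abstract corresponds to ~475 years.
print(round(return_period(0.10, 50)))         # -> 475
print(round(prob_of_exceedance(475, 50), 3))  # -> 0.1
```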

  12. Fire hazard analysis for fusion energy experiments

    Energy Technology Data Exchange (ETDEWEB)

    Alvares, N.J.; Hasegawa, H.K.

    1979-01-01

    The 2XIIB mirror fusion facility at Lawrence Livermore Laboratory (LLL) was used to evaluate the fire safety of state-of-the-art fusion energy experiments. The primary objective of this evaluation was to ensure the parallel development of fire safety and fusion energy technology. Through fault-tree analysis, we obtained a detailed engineering description of the 2XIIB fire protection system. This information helped us establish an optimum level of fire protection for experimental fusion energy facilities as well as evaluate the level of protection provided by various systems. Concurrently, we analyzed the fire hazard inherent to the facility using techniques that relate the probability of ignition to the flame spread and heat-release potential of construction materials, electrical and thermal insulations, and dielectric fluids. A comparison of the results of both analyses revealed that the existing fire protection system should be modified to accommodate the range of fire hazards inherent to the 2XIIB facility.

  13. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2... REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a)...

  14. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
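
    As a rough illustration of the prioritization idea (high severity and likelihood, low modeling difficulty first), the sketch below uses assumed ordinal scales, scenario names, and a simple combination rule; none of these are taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    severity: int             # assumed ordinal scale, 1 (minor) .. 4 (catastrophic)
    likelihood: int           # assumed ordinal scale, 1 (improbable) .. 4 (frequent)
    modeling_difficulty: int  # assumed ordinal scale, 1 (easy) .. 4 (hard to model)

def priority(s: Scenario) -> float:
    """Rank scenarios: high risk (severity x likelihood) but low modeling
    difficulty are the best candidates for quantitative analysis."""
    return (s.severity * s.likelihood) / s.modeling_difficulty

# Hypothetical scenarios for illustration only.
scenarios = [
    Scenario("wake encounter on parallel approach", severity=4, likelihood=2, modeling_difficulty=2),
    Scenario("runway incursion", severity=4, likelihood=1, modeling_difficulty=4),
]
for s in sorted(scenarios, key=priority, reverse=True):
    print(f"{s.name}: priority {priority(s):.2f}")
```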

  15. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    Science.gov (United States)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  16. Verification of C. G. Jung's analysis of Rowland Hazard and the history of Alcoholics Anonymous.

    Science.gov (United States)

    Bluhm, Amy Colwell

    2006-11-01

    Extant historical scholarship in the Jungian literature and the Alcoholics Anonymous (AA) literature does not provide a complete picture of the treatment of Rowland Hazard by C. G. Jung, an analysis that AA co-founder Bill Wilson claimed was integral to the foundation of AA in theory and practice. Wilson's original report resulted in archivists and historians incorrectly calibrating their searches to the wrong date. The current work definitively solves the mystery of the timing of Hazard's treatment with Jung by placing his preliminary analysis with Jung in the year 1926, rather than 1930 or 1931. Previously unexamined correspondence originating from Jung, Hazard, his cousin Leonard Bacon, his uncle Irving Fisher, and his aunt Margaret Hazard Fisher is supplemented by relevant primary and secondary source material.

  17. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-03-01

    When training for a job in which human error has the potential of producing catastrophic results, an understanding of the hazards that may be encountered is of paramount importance. In high consequence activities, it is important that the training program be conducted in a safe environment and yet emphasize the potential hazards. Because of the high consequence of a human error, the use of a high-fidelity simulation is of great importance to provide the safe environment the worker needs to learn and hone required skills. A hazards analysis identifies the operation hazards, potential human error, and associated positive measures that aid in the mitigation or prevention of the hazard. The information gained from the hazards analysis should be used in the development of training. This paper will discuss the integration of information from the hazards analysis into the development of simulation components of a training program.

  18. Fire hazards analysis of transuranic waste storage and assay facility

    Energy Technology Data Exchange (ETDEWEB)

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  19. Total system hazards analysis for the western area demilitarization facility

    Science.gov (United States)

    Pape, R.; Mniszewski, K.; Swider, E.

    1984-08-01

    The results of a hazards analysis of the Western Area Demilitarization facility (WADF) at Hawthorne, Nevada are summarized. An overview of the WADF systems, the hazards analysis methodology that was applied, a general discussion of the fault tree analysis results, and a compilation of the conclusions and recommendations for each area of the facility are given.

  20. A LiDAR based analysis of hydraulic hazard mapping

    Science.gov (United States)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure as it involves technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend the surface assigned to use limitations beyond what is necessary. The availability of a high-resolution topographic survey nowadays allows this task to be faced with innovative procedures, both in the planning (mapping) and in the map validation phases. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore does not intend in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically considering the vegetation as a geometric obstacle to water flow. In some cases the topographic model construction requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps is made of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible, and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated to the talweg) is often inverted. In the second step the segments are analysed
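
    The DTM+DBM reference surface can be assembled by letting building roof elevations override bare terrain wherever they are defined; the sketch below assumes co-registered rasters in a common vertical datum, which is our assumption rather than a detail given in the abstract.

```python
import numpy as np

def reference_surface(dtm: np.ndarray, dbm: np.ndarray) -> np.ndarray:
    """Combine a digital terrain model with a digital building model:
    where a roof elevation is defined (non-NaN), it overrides the terrain.
    Assumes both rasters are co-registered and share the vertical datum."""
    has_building = ~np.isnan(dbm)
    return np.where(has_building, np.maximum(dtm, dbm), dtm)

# Toy example: a 3x3 tile with one building cell 5 m above the ground surface.
dtm = np.full((3, 3), 100.0)
dbm = np.full((3, 3), np.nan)
dbm[1, 1] = 105.0
print(reference_surface(dtm, dbm))
```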

  1. Preliminary volcanic hazards evaluation for Los Alamos National Laboratory Facilities and Operations : current state of knowledge and proposed path forward

    Energy Technology Data Exchange (ETDEWEB)

    Keating, Gordon N.; Schultz-Fellenz, Emily S.; Miller, Elizabeth D.

    2010-09-01

    The integration of available information on the volcanic history of the region surrounding Los Alamos National Laboratory indicates that the Laboratory is at risk from volcanic hazards. Volcanism in the vicinity of the Laboratory is unlikely within the lifetime of the facility (ca. 50–100 years) but cannot be ruled out. This evaluation provides a preliminary estimate of recurrence rates for volcanic activity. If further assessment of the hazard is deemed beneficial to reduce risk uncertainty, the next step would be to convene a formal probabilistic volcanic hazards assessment.

  2. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-12-01

    A hazards analysis identifies the operation hazards and the positive measures that aid in the mitigation or prevention of the hazard. If the tasks are human intensive, the hazard analysis often credits the personnel training as contributing to the mitigation of the accident's consequence or prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase (analysis, design, development, implementation, and evaluation) model for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground, and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and of the institutional training program, can provide valuable information to the hazards analysis effort. This paper will discuss the integration of the information from the hazards analysis into a training program. The paper will use as an example the installation and removal of a piece of tooling that is used in a high-explosive operation. This example will be used to follow the systematic development of a training program and demonstrate the interaction and cooperation between the hazards analysis and training program.

  3. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    Science.gov (United States)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
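
    The hazard-curve concept described here (mean annual rate of exceedance of an intensity measure, derived from a stochastic event set) can be illustrated with a minimal empirical sketch; the event intensities, rates, and the Poisson conversion to a 50-year exceedance probability below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def empirical_hazard_curve(intensities, annual_rates, thresholds):
    """Mean annual rate of exceedance for each intensity threshold, computed
    from a stochastic event set (one intensity measure and one annual
    occurrence rate per simulated event)."""
    intensities = np.asarray(intensities)
    annual_rates = np.asarray(annual_rates)
    return np.array([annual_rates[intensities > h].sum() for h in thresholds])

# Illustrative event set: inundation depth (m) and annual rate per event.
depths = [0.3, 0.8, 1.5, 2.4, 4.0]
rates = [1e-2, 5e-3, 1e-3, 5e-4, 1e-4]
thresholds = np.array([0.5, 1.0, 2.0, 3.0])

lam = empirical_hazard_curve(depths, rates, thresholds)
# Probability of at least one exceedance in 50 years under a Poisson assumption.
print(1.0 - np.exp(-lam * 50.0))
```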

  4. 327 Building fire hazards analysis implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    Eggen, C.D.

    1998-09-16

    In March 1998, the 327 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the US Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (B and WHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in five areas and provided nine recommendations (11 items) to bring the 327 Building into compliance. To date, actions for five of the 11 items have been completed. Exemption requests will be transmitted to DOE-RL for two of the items. Corrective actions have been identified for the remaining four items. The completed actions address combustible loading requirements associated with the operation of the cells and support areas. The status of the recommendations and actions was confirmed during the July 1998 Fire Protection Assessment. B and WHC will use this Implementation Plan to bring the 327 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7.

  5. 14 CFR 417.223 - Flight hazard area analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard...

  6. Cold Vacuum Drying (CVD) Facility Hazards Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    CROWE, R.D.

    2000-08-07

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) Hazard Analysis to support the CVDF Final Safety Analysis Report and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports,'' and implements the requirements of DOE Order 5480.23, ''Nuclear Safety Analysis Reports.''

  7. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-15

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  8. UPDATE TO THE PROBABILISTIC VOLCANIC HAZARD ANALYSIS, YUCCA MOUNTAIN, NEVADA

    Energy Technology Data Exchange (ETDEWEB)

    K.J. Coppersmith

    2005-09-14

    A probabilistic volcanic hazard analysis (PVHA) was conducted in 1996 for the proposed repository at Yucca Mountain, Nevada. Based on data gathered by the Yucca Mountain Project over the course of about 15 years, the analysis integrated the judgments of a panel of ten volcanic experts using methods of formal expert elicitation. The PVHA resulted in a probability distribution of the annual frequency of a dike intersecting the repository, which ranges from 10^-7 to 10^-10 (mean 1.6 x 10^-8). The analysis incorporates assessments of the future locations, rates, and types of volcanic dikes that could intersect the repository, which lies about 300 m below the surface. A particular focus of the analysis is the quantification of uncertainties. Since the 1996 PVHA, additional aeromagnetic data have been collected in the Yucca Mountain region, including a high-resolution low-altitude survey. A number of anomalies have been identified within alluvial areas, and modeling suggests that some of these may represent buried eruptive centers (basaltic cinder cones). A program is currently underway to drill several of the anomalies to gain information on their origin and, if basalt, their age and composition. To update the PVHA in light of the new aeromagnetic and drilling data as well as other advancements in volcanic hazard modeling over the past decade, the expert panel has been reconvened and the expert elicitation process has been fully restarted. The analysis requires assessments of the spatial distribution of igneous events, temporal distributions, and geometries and characteristics of future events (both intrusive and extrusive). The assessments are for future time periods of 10,000 years and 1,000,000 years. Uncertainties are being quantified in both the conceptual models that define these elements as well as in the parameters for the models. The expert elicitation process is centered around a series of workshops that focus on the available data; alternative approaches to
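
    For the 10,000-year and 1,000,000-year assessment periods mentioned, an annual intersection frequency translates into a probability of at least one event under a Poisson assumption. This is a simplification for illustration only; the study's own aggregation of expert distributions is more involved.

```python
import math

def prob_at_least_one(annual_frequency: float, years: float) -> float:
    """Probability of at least one intersection in the period, assuming events
    follow a Poisson process with the given annual frequency."""
    return 1.0 - math.exp(-annual_frequency * years)

mean_freq = 1.6e-8  # mean annual frequency quoted in the abstract (1/yr)
for period in (1.0e4, 1.0e6):
    print(f"{period:.0e} yr: P = {prob_at_least_one(mean_freq, period):.2e}")
```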

  9. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330 MWt SMART (System-integrated Modular Advanced ReacTor), which has been developed by the Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, a further validated final safety analysis methodology will be developed. The current licensing safety analysis methodologies of the Westinghouse and KSNPP PWRs operating and under development in Korea, as well as the Russian licensing safety analysis methodology for integral reactors, have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or the KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary-to-secondary pipe break and the small break loss of coolant accident. The SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. The validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended that the nuclear regulatory authority establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  10. Hazard Analysis of Japanese Boxed Lunches (Bento).

    Science.gov (United States)

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary…

  11. Preliminary assessment of landslide-induced wave hazards, Tidal Inlet, Glacier Bay National Park, Alaska

    Science.gov (United States)

    Wieczorek, Gerald F.; Jakob, Matthias; Motyka, Roman J.; Zirnheld, Sandra L.; Craw, Patricia

    2003-01-01

    A large potential rock avalanche above the northern shore of Tidal Inlet, Glacier Bay National Park, Alaska, was investigated to determine hazards and risks of landslide-induced waves to cruise ships and other park visitors. Field and photographic examination revealed that the 5 to 10 million cubic meter landslide moved between AD 1892 and 1919 after the retreat of Little Ice Age glaciers from Tidal Inlet by AD 1890. The timing of landslide movement and the glacial history suggest that glacial debuttressing caused weakening of the slope and that the landslide could have been triggered by large earthquakes of 1899-1900 in Yakutat Bay. Evidence of recent movement includes fresh scarps, back-rotated blocks, and smaller secondary landslide movements. However, until there is evidence of current movement, the mass is classified as a dormant rock slump. An earthquake on the nearby active Fairweather fault system could reactivate the landslide and trigger a massive rock slump and debris avalanche into Tidal Inlet. Preliminary analyses show that waves induced by such a landslide could travel at speeds of 45 to 50 m/s and reach heights up to 76 m with wave runups of 200 m on the opposite shore of Tidal Inlet. Such waves would not only threaten vessels in Tidal Inlet, but would also travel into the western arm of Glacier Bay endangering large cruise ships and their passengers.

  12. The assessment of seismic hazard for Gori, (Georgia) and preliminary studies of seismic microzonation

    Science.gov (United States)

    Gogoladze, Z.; Moscatelli, M.; Giallini, S.; Avalle, A.; Gventsadze, A.; Kvavadze, N.; Tsereteli, N.

    2016-12-01

    Seismic risk is a crucial issue for the South Caucasus, which is the main gateway between Asia and Europe. The goal of this work is to propose new methods and criteria for defining an overall approach aimed at assessing and mitigating seismic risk in Georgia. In this regard, seismic microzonation represents a highly useful tool for seismic risk assessment in land management, for the design of buildings or structures, and for emergency planning. Seismic microzonation assesses the local seismic hazard, which is the component of seismicity resulting from specific local characteristics that cause local amplification and soil instability, through the identification of zones with seismically homogeneous behavior. This paper presents the results of a preliminary study of seismic microzonation of Gori, Georgia. Gori is located in the Shida Kartli region on both sides of the Liachvi and Mtkvari rivers, with an area of about 135 km2 around the Gori fortress. Gori lies in the Achara-Trialeti fold-thrust belt, which is tectonically unstable. Half of all earthquakes in the Gori area with magnitude M≥3.5 have happened along this fault zone, and on the basis of damage caused by previous earthquakes, this territory shows the highest level of risk (the maximum value of direct losses) in the central part of the town. The level 1 seismic microzonation map for Gori was produced using: 1) already available data (i.e., topographic maps and borehole data), 2) results of new geological surveys, and 3) geophysical measurements (i.e., MASW and noise measurements processed with the HVSR technique). Our preliminary results highlight the presence of both stable zones susceptible to local amplification and unstable zones susceptible to geological instability. Our results are directed at establishing a set of actions aimed at risk mitigation before the initial onset of an emergency, and at management of the emergency once a seismic event has occurred. The products obtained will contain the basic elements of an integrated system

  13. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    Science.gov (United States)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
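
    As an illustration of why the relationship type matters when combining hazards, the sketch below contrasts the probability that "either hazard occurs" for independent versus mutually exclusive (mutex) hazards; the probabilities are hypothetical, and the parallel and series cases, which depend on the paper's full definitions, are not sketched here.

```python
def p_either_independent(p_a: float, p_b: float) -> float:
    """Probability that at least one of two independent hazards occurs."""
    return p_a + p_b - p_a * p_b

def p_either_mutex(p_a: float, p_b: float) -> float:
    """Probability that either of two mutually exclusive hazards occurs."""
    return p_a + p_b

# Illustrative annual occurrence probabilities for two hazards in one area.
p_flood, p_landslide = 0.02, 0.01
print(p_either_independent(p_flood, p_landslide))  # 0.0298
print(p_either_mutex(p_flood, p_landslide))        # 0.03
```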

  14. Preliminary Analysis of Google+'s Privacy

    OpenAIRE

    2011-01-01

    In this paper we provide a preliminary analysis of Google+ privacy. We identified that Google+ shares photo metadata with users who can access the photograph and discuss its potential impact on privacy. We also identified that Google+ encourages the provision of other names including maiden name, which may help criminals performing identity theft. We show that Facebook lists are a superset of Google+ circles, both functionally and logically, even though Google+ provides a better user interfac...

  15. Seismic hazard analysis for Jayapura city, Papua

    Energy Technology Data Exchange (ETDEWEB)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A. [Geological Agency, Diponegoro Road No.57, Bandung, 40122 (Indonesia)

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors derived using geomorphological approaches are corrected with the measurement data. These data are related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  16. The Yucca Mountain probabilistic volcanic hazard analysis project

    Energy Technology Data Exchange (ETDEWEB)

    Coppersmith, K.J.; Perman, R.C.; Youngs, R.R. [Geomatrix Consultants, Inc., San Francisco, CA (United States)] [and others]

    1996-12-01

    The Probabilistic Volcanic Hazard Analysis (PVHA) project, sponsored by the U.S. Department of Energy (DOE), was conducted to assess the probability of a future volcanic event disrupting the potential repository at Yucca Mountain. The PVHA project is one of the first major expert judgment studies that DOE has authorized for technical assessments related to the Yucca Mountain project. The judgments of members of a ten-person expert panel were elicited to ensure that a wide range of approaches were considered for the hazard analysis. The results of the individual elicitations were then combined to develop an integrated assessment of the volcanic hazard that reflects the diversity of alternative scientific interpretations. This assessment, which focused on the volcanic hazard at the site, expressed as the probability of disruption of the potential repository, will provide input to an assessment of volcanic risk, which expresses the probability of radionuclide release due to volcanic disruption.

  17. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... processing plant environment, including food safety hazards that can occur before, during, and after harvest... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product...

  18. Hazard screening application guide. Safety Analysis Report Update Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.

  19. Preliminary seismic hazard assessment, shallow seismic refraction and resistivity sounding studies for future urban planning at the Gebel Umm Baraqa area, Egypt

    Science.gov (United States)

    Khalil, Mohamed H.; Hanafy, Sherif M.; Gamal, Mohamed A.

    2008-12-01

    Gebel Umm Baraqa Fan, on the western side of the Gulf of Aqaba, Sinai, is one of the most important tourism areas in Egypt. However, it is located on the active Dead Sea-Gulf of Aqaba Levant transform fault system. Geophysical studies, including fresh water aquifer delineation, shallow seismic refraction, soil characterization and a preliminary seismic hazard assessment, were conducted to help in future city planning. A total of 11 vertical electrical soundings (maximum AB/2 of 1000-3000 m) were acquired and three boreholes were drilled at the site for the analysis of groundwater, total dissolved solids (TDS) and fresh water aquifer properties. The interpretation of the one-dimensional (1D) inversion of the resistivity data delineated the fresh water aquifer and determined its hydro-geologic parameters. Eleven shallow seismic refraction profiles (125 m in length) were collected and interpreted using the generalized reciprocal method, and the resulting depth-velocity models were verified using an advanced finite difference (FD) technique. The shallow seismic refraction effectively delineates two subsurface layers (VP ~ 450 m/s and VP ~ 1000 m/s). A preliminary seismic hazard assessment for Umm Baraqa has produced an estimate of the probabilistic peak ground acceleration hazard in the study area. A recent and historical earthquake catalog for the time period 2200 BC to 2006 has been compiled for the area. New, accurate seismic source zoning is considered because such details affect the degree of hazard in the city. The estimated PGA values range from 250 to 260 cm/s2 in the bedrock of the Umm Baraqa area for a 100 year interval (a suitable time window for buildings). Recommendations as to suitable types of buildings, considering the amount of shaking and the aquifer properties given in this study, are expected to be helpful for the Umm Baraqa area.
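
    For the two-layer case reported (VP of roughly 450 m/s over roughly 1000 m/s), the refractor depth follows from the standard intercept-time relation for a flat two-layer model; the intercept time used below is hypothetical, since the abstract does not quote one.

```python
import math

def depth_from_intercept(t_i: float, v1: float, v2: float) -> float:
    """Depth to a flat refractor in a two-layer model from the intercept time
    of the refracted arrival: z = t_i * v1 * v2 / (2 * sqrt(v2^2 - v1^2))."""
    return t_i * v1 * v2 / (2.0 * math.sqrt(v2**2 - v1**2))

v1, v2 = 450.0, 1000.0   # layer velocities reported in the abstract (m/s)
t_intercept = 0.020      # hypothetical intercept time (s)
print(f"refractor depth ~ {depth_from_intercept(t_intercept, v1, v2):.1f} m")
```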

  20. A Bayesian Seismic Hazard Analysis for the city of Naples

    Science.gov (United States)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples with its neighboring area is one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). An active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults located around the study area, well known from geological investigations and capable of shaking the city but not associated with any catalogued earthquake, has been taken into account in our study. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure so as to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows, on the one hand, enlarging the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and, on the other hand, explicitly accounting for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil

  1. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    Science.gov (United States)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."

  2. Risk assessment of CST-7 proposed waste treatment and storage facilities Volume I: Limited-scope probabilistic risk assessment (PRA) of proposed CST-7 waste treatment & storage facilities. Volume II: Preliminary hazards analysis of proposed CST-7 waste storage & treatment facilities

    Energy Technology Data Exchange (ETDEWEB)

    Sasser, K.

    1994-06-01

    In FY 1993, the Los Alamos National Laboratory Waste Management Group [CST-7 (formerly EM-7)] requested the Probabilistic Risk and Hazards Analysis Group [TSA-11 (formerly N-6)] to conduct a study of the hazards associated with several CST-7 facilities. Among these facilities are the Hazardous Waste Treatment Facility (HWTF), the HWTF Drum Storage Building (DSB), and the Mixed Waste Receiving and Storage Facility (MWRSF), which are proposed for construction beginning in 1996. These facilities are needed to upgrade the Laboratory's storage capability for hazardous and mixed wastes and to provide treatment capabilities for wastes in cases where offsite treatment is not available or desirable. These facilities will assist Los Alamos in complying with federal and state regulations.

  3. Landslide hazards and systems analysis: A Central European perspective

    Science.gov (United States)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments with the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures or rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits is starting to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  4. Hazardous-waste analysis plan for LLNL operations

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan will address methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  5. Preliminary study of soil liquefaction hazard at Terengganu shoreline, Peninsular Malaysia

    Science.gov (United States)

    Hashim, H.; Suhatril, M.; Hashim, R.

    2017-06-01

    Terengganu is a coastal state in Peninsular Malaysia and a growing hub for port industries and tourism. The northern part offers pristine, relaxed beach areas, whereas the southern part is a growing centre for development. Serious erosion of the soil deposits along the beach line exposes ground conditions vulnerable to liquefaction, consisting of sandy soils with low plasticity and a shallow groundwater table. Moreover, local earthquakes from nearby faults have produced significant tremors over the past few years, which need to be considered in land use and future development to cater for seismic loading. Liquefaction analysis based on standard penetration test data is applied to 546 boreholes scattered along a 244 km stretch of shoreline. Based on the simplified approach, more than 70% of the studied areas pose high liquefaction potential, since saturated loose sand and silt deposits are present at depths ranging from 3 m to 20 m. The presence of clay deposits and hard strata in the remaining 30% of the studied areas provides good resistance to liquefaction, making those areas less susceptible to liquefaction hazard. The results indicate that ground improvement against liquefaction is advisable in the future development of the shoreline areas of Terengganu state.
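
    The "simplified approach" family of liquefaction checks compares a cyclic stress ratio (CSR) with a cyclic resistance ratio (CRR) derived from SPT blow counts. A minimal sketch of the triggering check is given below; the stress values, stress-reduction factor and CRR are hypothetical placeholders, and the exact variant used in the study is not specified in the abstract.

```python
def cyclic_stress_ratio(a_max_g: float, sigma_v: float, sigma_v_eff: float, r_d: float) -> float:
    """Cyclic stress ratio of the Seed-Idriss simplified procedure:
    CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * r_d."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

def factor_of_safety(crr: float, csr: float) -> float:
    """Factor of safety against liquefaction triggering; FS < 1 suggests
    liquefaction is likely for the design shaking level."""
    return crr / csr

# Hypothetical values for a shallow sandy layer with a high water table.
csr = cyclic_stress_ratio(a_max_g=0.12, sigma_v=55.0, sigma_v_eff=35.0, r_d=0.96)
print(round(csr, 3), round(factor_of_safety(crr=0.11, csr=csr), 2))
```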

  6. Fire Hazards Analysis for the Inactive Equipment Storage Sprung Structure

    Energy Technology Data Exchange (ETDEWEB)

    MYOTT, C.F.

    2000-02-03

    The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection so as to ascertain whether the fire protection objectives of DOE Order 5480.1A are met. The order acknowledges a graded approach commensurate with the hazards involved.

  7. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    Energy Technology Data Exchange (ETDEWEB)

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  8. Implementation of hazard analysis critical control point in jameed production.

    Science.gov (United States)

    Al-Saed, A K; Al-Groum, R M; Al-Dabbas, M M

    2012-06-01

    The average standard plate count and the counts of coliforms, Staphylococcus aureus and Salmonella for three home-made jameed samples, a traditional fermented dairy product, before applying the hazard analysis critical control point system were 2.1 × 10^3, 8.9 × 10^1, 4 × 10^1 and less than 10 cfu/g, respectively. The developed hazard analysis critical control point plan resulted in identifying ten critical control points in the flow chart of jameed production. The critical control points included fresh milk receiving, pasteurization, addition of starter, water and salt, straining, personnel hygiene, drying and packaging. After applying the hazard analysis critical control point system, there was significant improvement in the microbiological quality of the home-made jameed. The standard plate count was reduced to 3.1 × 10^2 cfu/g, whereas coliform and Staphylococcus aureus counts were less than 10 cfu/g and Salmonella was not detected. Sensory evaluation results for the color and flavor of sauce prepared from jameed showed a significant increase in the average scores given after hazard analysis critical control point application.

  9. Probabilistic Tsunami Hazard Analysis for Eastern Sicily (Italy)

    Science.gov (United States)

    Lorito, Stefano; Piatanesi, Alessio; Romano, Fabrizio; Basili, Roberto; Kastelic, Vanja; Tiberti, Mara Monica; Valensise, Gianluca; Selva, Jacopo

    2010-05-01

    We present preliminary results of a Probabilistic Tsunami Hazard Analysis (PTHA) for the coast of eastern Sicily. We only consider earthquake-generated tsunamis. We focus on important cities such as Messina, Catania, and Augusta. We consider different potentially tsunamigenic Source Zones (SZ) in the Mediterranean basin, based on geological and seismological evidence. Considering many synthetic earthquakes for each SZ, we numerically simulate the entire tsunami propagation, from sea-floor displacement to inundation. We evaluate different tsunami damage metrics, such as maximum runup, current speed, momentum and Froude number. We use a finite difference scheme in the shallow-water approximation for the tsunami propagation in the open sea, and a finite volume scheme for the inundation phase. For the shoaling and inundation stages, we have built a bathy-topo model by merging the GEBCO database, multibeam soundings, and topographic data at 10 m resolution. Accounting for their relative probability of occurrence, deterministic scenarios are merged together to assess PTHA at the selected target sites, expressed as a probability of exceedance of a given threshold (e.g. 1 m wave height) in a given time (e.g. 100 yr). First-order epistemic and aleatory uncertainties are assessed through a logic tree, accounting for changes in the variables judged to have a major impact on PTHA, and for possible incompleteness of the SZs. The SZs are located at short, intermediate and large distances with respect to the target coastlines. We thus highlight, for different source-target distances, the relative importance of the different source parameters, and/or the role of the uncertainties in the input parameter estimation. Our results suggest that in terms of inundation extent the Hellenic Arc SZ has the highest impact on the selected target coastlines. In terms of exceedance probability instead, there is a larger variability depending not only on location and recurrence but also on

  10. Hazard analysis of Clostridium perfringens in the Skylab Food System

    Science.gov (United States)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
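
    The generation-time data feed directly into a growth projection of the form N(t) = N0 · 2^(t/g), which can then be checked against the 100 cfu/g limit; the initial count and generation time used below are hypothetical, since the abstract does not tabulate them.

```python
def projected_count(n0_per_gram: float, hold_time_h: float, generation_time_h: float) -> float:
    """Project a C. perfringens count after a warm-holding period, assuming
    exponential growth with a constant generation (doubling) time."""
    return n0_per_gram * 2.0 ** (hold_time_h / generation_time_h)

LIMIT_PER_GRAM = 100.0  # Skylab limit quoted in the abstract

# Hypothetical inputs: initial contamination and a generation time measured
# at the food-warming temperature.
n = projected_count(n0_per_gram=10.0, hold_time_h=2.0, generation_time_h=0.75)
print(round(n, 1), n <= LIMIT_PER_GRAM)
```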

  11. PO*WW*ER mobile treatment unit process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous constituents into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  12. Analysis of SEAFP containment strategies regarding hydrogen hazard

    Energy Technology Data Exchange (ETDEWEB)

    Maunier, F.; Arnould, F. [Technicatome, Dir. de l' Ingenierie, SEPS, 13 - Aix-en-Provence (France); Marbach, G. [CEA/Cadarache, Dept. d' Etudes des Reacteurs (DER), 13 - Saint-Paul-lez-Durance (France)

    1998-07-01

    Since SEAFP is a safety-directed study, safety considerations dominate the confinement concept for each of the options defined. The containment strategy is the principal safety function and includes all the measures required to ensure that an uncontrolled release of radioactive and chemical materials will not occur. The study presented here corresponds to the safety analysis of the three containment strategies for SEAFP model 2 (Water Cooled) regarding the hydrogen hazard. The objective is: to compare the different containment strategies, and to define, for each containment strategy, the necessary safety systems in order to reduce the frequency of the H2 hazard to a very low value (

  13. Using video games for volcanic hazard education and communication: an assessment of the method and preliminary results

    Science.gov (United States)

    Mani, Lara; Cole, Paul D.; Stewart, Iain

    2016-07-01

    This paper presents the findings from a study aimed at understanding whether video games (or serious games) can be effective in enhancing volcanic hazard education and communication. Using the eastern Caribbean island of St. Vincent, we have developed a video game - St. Vincent's Volcano - for use in existing volcano education and outreach sessions. Its twin aims are to improve residents' knowledge of potential future eruptive hazards (ash fall, pyroclastic flows and lahars) and to integrate traditional methods of education in a more interactive manner. Here, we discuss the process of game development, including concept design through to the final implementation on St. Vincent. Preliminary results obtained from the final implementation (through pre- and post-test knowledge quizzes) for both student and adult participants provide indications that a video game of this style may be effective in improving a learner's knowledge. Both groups of participants demonstrated a post-test increase in their knowledge quiz score (9.3 % for adults and 8.3 % for students) and, when plotted as learning gains (Hake, 1998), showed similar overall improvements (0.11 for adults and 0.09 for students). These preliminary findings may provide a sound foundation for the increased integration of emerging technologies within traditional education sessions. This paper also shares some of the challenges and lessons learnt throughout the development and testing processes and provides recommendations for researchers looking to pursue a similar study.
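
    The learning gains cited follow Hake (1998), i.e. the raw gain normalized by the maximum possible gain given the pre-test score. A minimal sketch is below; the pre/post percentages are hypothetical and are not the study's group means.

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake (1998) normalized learning gain: actual gain divided by the
    maximum possible gain given the pre-test score."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical pre/post quiz percentages for one participant group:
# a raw gain of 9.3 points from a pre-test score of 70% gives g ~ 0.31.
print(round(normalized_gain(pre_pct=70.0, post_pct=79.3), 2))
```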

  14. Frequency Analysis of Aircraft hazards for License Application

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.

  15. D0 Detector Collision Hall Oxygen Deficiency Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wu, J.; /Fermilab

    1992-08-06

    EN-258, D0 Platform ODH Analysis, provided the oxygen deficiency hazard analysis for the D0 detector in the Assembly Hall. This note covers the same analysis, but revised for the Collision Hall. Liquid cryogens, released and warming to atmospheric conditions, expand to, on average, seven hundred times their liquid volume and displace vital atmospheric oxygen. An oxygen deficiency hazard analysis assesses the increased risk to personnel in areas containing cryogenic systems. The D0 detector Collision Hall ODH analysis has been approached five different ways using established methods. If the low beta quad magnets are powered and the exhaust rate is below 4220 scfm, the area is ODH class 1. In any other case, the analysis shows the area to be ODH class 0 as equipped (with ventilation fans), requiring no special safety provisions. System designers have provided for a reduced oxygen level detection and warning system as well as emergency procedures to address fault conditions.
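    As an illustration of the kind of calculation that underlies such an ODH assessment, the sketch below estimates the steady-state oxygen fraction in a well-mixed enclosure for a given cryogen spill rate and exhaust rate. The simple mixing model, the generic 700:1 expansion factor, and all numerical values are assumptions for illustration only, not the parameters or method of record for the D0 analysis.

```python
# Illustrative oxygen-deficiency estimate for a well-mixed enclosure with
# steady ventilation; the mixing model and all numbers are assumptions for
# illustration, not the values used in the D0 Collision Hall analysis.

LIQUID_TO_GAS_EXPANSION = 700.0   # typical liquid-to-gas expansion ratio
NORMAL_O2_FRACTION = 0.21

def spill_rate_from_liquid(liquid_release_cfm):
    """Gas generation rate (scfm) from a liquid cryogen release rate (cfm)."""
    return liquid_release_cfm * LIQUID_TO_GAS_EXPANSION

def steady_state_o2_fraction(spill_rate_scfm, exhaust_rate_scfm):
    """Steady-state O2 fraction when an inert-gas spill mixes with
    ventilation air supplied and exhausted at `exhaust_rate_scfm`."""
    total = spill_rate_scfm + exhaust_rate_scfm
    return NORMAL_O2_FRACTION * exhaust_rate_scfm / total

if __name__ == "__main__":
    gas_rate = spill_rate_from_liquid(1.0)        # 1 cfm liquid release (assumed)
    for exhaust in (2000.0, 4220.0, 8000.0):      # candidate exhaust rates (scfm)
        o2 = steady_state_o2_fraction(gas_rate, exhaust)
        print(f"exhaust {exhaust:7.0f} scfm -> O2 ~ {o2 * 100:.1f} %")
```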

  16. Geomorphological analysis of sinkhole and landslide hazard in a karst area of the Venetian Prealps - Italy

    Science.gov (United States)

    Tiberi, Valentina

    2010-05-01

    In the piedmont area of the Asiago Plateau (Venetian Prealps - NE Italy), sinkholes and landslides in many cases represent a complex response to karst processes. Field surveys showed that both soil and bedrock are involved, mainly represented by colluvial-alluvial sediments and carbonate rocks. Preliminary observations also reveal the key role of piping and cave-collapse phenomena and the importance of human remedial measures. Within the study area, these processes cause damage mainly to agricultural and pasture activities and expose people and farm animals to very high hazards. This work provides preliminary results of a geomorphological analysis carried out to define sinkhole and landslide hazard and its connections with karst processes. During the first phases of the research program, an inventory of relevant phenomena was compiled using GIS technologies. The database has been constantly revised and enriched with new field measurements and thematic maps (i.e., geomorphological, geo-structural, hydrogeological, and cave-development maps). Specifically, the field survey focused on the morphodynamic definition of instability elements, allowing a wide range of morphotypes (mainly with regard to sinkholes) and polygenic morphologies (i.e., mixed sinkhole-landslide configurations) to be recognized. The geomorphological analysis also revealed specific evolutionary trends of the instability processes; these could be usefully employed to plan more effective mitigation strategies.

  17. Preliminary overview map of volcanic hazards in the 48 conterminous United States

    Science.gov (United States)

    Mullineaux, D.R.

    1976-01-01

    Volcanic eruptions and related phenomena can be expected to occur in the Western United States, and in some places are potentially hazardous enough to be considered in long-range land-use planning. But the immediate risk from volcanic hazards is low because eruptions are so infrequent in the conterminous United States that few, if any, occur during any one person's lifetime. Furthermore, severely destructive effects of eruptions, other than extremely rare ones of catastrophic scale, probably would be limited to areas within a few tens of kilometers downvalley or downwind from a volcano. Thus, the area seriously endangered by any one eruption would be only a very small part of the Western United States. The accompanying map identifies areas in which volcanic hazards pose some degree of risk, and shows that the problem is virtually limited to the far western States. The map also shows the possible areal distribution of several kinds of dangerous eruptive events and indicates the relative likelihood of their occurrence at various volcanoes. The kinds of events described here as hazards are those that can occur suddenly and with little or no warning; they do not include long-term geologic processes. Table 1 summarizes the origin and some characteristics of potentially hazardous volcanic phenomena. The map is diagrammatic. It does not show the specific location of the next expected eruption, because such an event cannot be reliably predicted. Instead, the map shows general areas or zones that, over a long period of time, are relatively likely to be affected in one or more places by various kinds of hazardous volcanic events. However, only a small part of one of these areas would be affected by any single eruption.

  18. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    Science.gov (United States)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the subsequent damaging earthquakes in Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task, it also has a material effect on the seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard assessments at critical installations within Europe. In this paper, various salient European applications are given.
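    A minimal sketch of the spatial kernel smoothing idea described above: an epicentre catalogue is turned into an activity-rate surface by summing kernel contributions at each evaluation point. The fixed-bandwidth isotropic Gaussian kernel and the toy catalogue are simplifying assumptions for illustration; the method discussed in the paper uses more sophisticated (e.g. adaptive) kernels.

```python
import numpy as np

def smoothed_rate(epicentres_km, sites_km, bandwidth_km, catalogue_years):
    """Kernel-smoothed activity rate (events / yr / km^2) at each site.

    epicentres_km : (N, 2) array of projected event locations (km)
    sites_km      : (M, 2) array of evaluation points (km)
    A fixed isotropic Gaussian kernel is assumed for illustration.
    """
    d2 = ((sites_km[:, None, :] - epicentres_km[None, :, :]) ** 2).sum(axis=2)
    kernel = np.exp(-0.5 * d2 / bandwidth_km**2) / (2 * np.pi * bandwidth_km**2)
    return kernel.sum(axis=1) / catalogue_years

# toy catalogue: 3 events over 50 years, rate evaluated at 2 sites
events = np.array([[0.0, 0.0], [5.0, 2.0], [40.0, 10.0]])
sites = np.array([[0.0, 0.0], [30.0, 0.0]])
print(smoothed_rate(events, sites, bandwidth_km=10.0, catalogue_years=50.0))
```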

  19. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    Science.gov (United States)

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  20. Preliminary re-evaluation of probabilistic seismic hazard assessment in Chile: from Arica to Taitao Peninsula

    Directory of Open Access Journals (Sweden)

    F. Leyton

    2009-12-01

    Full Text Available Chile is one of the most seismically active countries in the world; indeed, it has witnessed very large earthquakes associated with high horizontal peak ground accelerations, making probabilistic hazard assessment an important decision-making tool. In the present study, we review all the available information to improve the estimation of the probabilistic seismic hazard caused by two main sources: shallow interplate thrust earthquakes and intermediate-depth intraplate earthquakes. Using previously defined seismic zones, we compute Gutenberg-Richter laws and, along with appropriate attenuation laws, re-evaluate the probabilistic seismic hazard assessments in Chile. We obtain expected horizontal peak ground accelerations with a 10 % probability of being exceeded in 50 years, ranging from 0.6 g up to 1.0 g along the coast and between 0.4 g and 0.6 g towards the Andes Mountains, with larger values in the northern part of the country. The present study improves our knowledge of geological hazards in Chile, enabling the mitigation of important human and material losses due to large earthquakes in the future.
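    For readers unfamiliar with the recurrence side of such a study, the sketch below shows the standard maximum-likelihood (Aki) estimate of the Gutenberg-Richter b value, together with the corresponding a value, from a catalogue assumed complete above a threshold magnitude. The toy catalogue is a placeholder, not data from the study.

```python
import math

def gutenberg_richter_fit(magnitudes, m_min, years):
    """Aki maximum-likelihood b value and annual a value above m_min.

    Assumes a catalogue complete above m_min spanning `years` years.
    """
    mags = [m for m in magnitudes if m >= m_min]
    b = math.log10(math.e) / (sum(mags) / len(mags) - m_min)
    rate_above_m_min = len(mags) / years           # N(M >= m_min) per year
    a = math.log10(rate_above_m_min) + b * m_min   # log10 N(M >= 0) per year
    return a, b

# toy catalogue of magnitudes, assumed complete above M 4.5 over 50 years
a_val, b_val = gutenberg_richter_fit(
    [4.6, 4.8, 5.1, 4.7, 5.5, 6.2, 4.9, 5.0, 4.6, 5.3], m_min=4.5, years=50.0)
print(f"a = {a_val:.2f}, b = {b_val:.2f}")
```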

  1. Probabilistic tsunami hazard assessment for the coasts of Italy: preliminary results in the frame of the RITMARE Project

    Science.gov (United States)

    Armigliato, Alberto; Tinti, Stefano; Pagnoni, Gianluca; Zaniboni, Filippo; Bressan, Lidia

    2013-04-01

    The five-year project called RITMARE ("La Ricerca ITaliana per il MARE") is a very ambitious national research and innovation program focussed on all aspects relevant to marine and coastal research, technology and management, with emphasis on networking and international cooperation. The program objectives fit into the overall European Commission vision documents and strategic programs and cover five major themes, one of which deals with technologies for the sustainable management of the coastal areas. The theme is further articulated in work-packages and specific actions, including the systematic and quantitative tsunami hazard assessment for the whole Italian coastline. The University of Bologna takes part in the project RITMARE, being a member of the University Consortium Conisma, which is a direct partner in the project. We present here some preliminary results obtained by the Tsunami Research Team of the University of Bologna (TRT-UNIBO) by applying a modified version of a hybrid statistical-deterministic approach to the southern Tyrrhenian, Ionian and Adriatic coasts. A widely adopted approach formulates the problem of the tsunami hazard assessment in terms of the probability of occurrence of tsunamigenic earthquakes, which is appropriate in basins where the number of known historical tsunamis is too small to be used in reliable statistical analyses, and where the largest part of tsunamis have tectonic origin. The TRT-UNIBO approach starts by building a single homogeneous earthquake catalogue covering the whole national territory, as well as the adjacent areas that are believed to have the potential to produce tsunamis with relevant far-field effects along the Italian coasts. A proper statistical analysis of the catalogue allows retrieving the earthquake occurrence rate at a regional scale as well as in a set of cells into which the studied geographical domain is divided. The final result of the statistical analysis is the computation for each cell of the

  2. Long term volcanic hazard analysis in the Canary Islands

    Science.gov (United States)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on the development of hazard maps for the islands of Lanzarote and Tenerife, especially for land-use planning. The main handicap for these studies in the Canary Islands is the lack of well-reported historical eruptions, but also the lack of data such as geochronological, geochemical or structural information. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they need to be fed with a large amount of data that sometimes, as in the case of the Canary Islands, are not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic hazard analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS, helping to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000-scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated into the database. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information to assess long-term volcanic hazard analysis. HADA will permit

  3. Landslide Hazard Zonation Mapping and Comparative Analysis of Hazard Zonation Maps

    Institute of Scientific and Technical Information of China (English)

    S. Sarkar; R. Anbalagan

    2008-01-01

    Landslide hazard zonation mapping at a regional level over a large area provides a broad trend of landslide potential zones. A macro-level landslide hazard zonation for a small area may provide a better insight into the landslide hazards. The main objective of the present work was to carry out macro landslide hazard zonation mapping on a 1:50,000 scale in an area where regional-level zonation mapping was conducted earlier. In the previous work, the regional landslide hazard zonation maps of the Srinagar-Rudraprayag area of Garhwal Himalaya in the state of Uttarakhand were prepared using subjective and objective approaches. In the present work, landslide hazard zonation mapping at the macro level was carried out in a small area using a Landslide Hazard Evaluation Factor rating scheme. The hazard zonation map produced using this technique classifies the area into relative hazard classes in which the high hazard zones correspond well with a high frequency of landslides. The results of this map, when compared with the regional zonation maps prepared earlier, show that the present technique identified the hazard zones in more detail than the earlier zonation maps, in which they are shown only broadly.

  4. Flood Hazard and Risk Analysis in Urban Area

    Science.gov (United States)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about the quality of life but also about the safety of life and property. The impact of disasters on life and property is therefore the most serious concern of residents. To mitigate disaster impacts, flood hazard and risk analysis plays an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung City was evaluated from statistics on social development factors. The hazard factors of Kaohsiung City were calculated from the simulated flood depths of six different return periods and of four typhoon events that caused serious flooding in Kaohsiung City. The flood risk can then be obtained by combining the flood hazard and social vulnerability. The analysis results provide the authorities with a basis to strengthen disaster preparedness and to allocate more resources to high-risk areas.
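    A minimal sketch of how normalized hazard and vulnerability layers can be combined into risk classes on a grid. The product rule, the class breaks, and the toy arrays are illustrative assumptions, not the weighting actually used for Kaohsiung.

```python
import numpy as np

def risk_map(hazard, vulnerability, breaks=(0.2, 0.4, 0.6, 0.8)):
    """Combine normalized hazard and vulnerability (0-1) into risk classes 1-5.

    The product rule and the class breaks are illustrative assumptions.
    """
    risk = hazard * vulnerability
    return np.digitize(risk, breaks) + 1

hazard = np.array([[0.1, 0.5], [0.9, 0.7]])         # e.g. from flood-depth simulation
vulnerability = np.array([[0.3, 0.8], [0.9, 0.2]])  # e.g. from social indicators
print(risk_map(hazard, vulnerability))
```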

  5. A Hazard Analysis for a Generic Insulin Infusion Pump

    Science.gov (United States)

    Zhang, Yi; Jones, Paul L.; Jetley, Raoul

    2010-01-01

    Background Researchers at the Food and Drug Administration (FDA)/Center for Devices and Radiological Health/Office of Science and Engineering Laboratories have been exploring the concept of model-based engineering as a means for improving the quality of medical device software. Insulin pumps were chosen as a research subject because their design provides the desired degree of research complexity and these types of devices present an ongoing regulatory challenge. Methods Insulin pump hazards and their contributing factors are considered in the context of a highly abstract generic insulin infusion pump (GIIP) model. Hazards were identified by consulting with manufacturers, pump users, and clinicians; by reviewing national and international standards and adverse event reports collected by the FDA; and from workshops sponsored by Diabetes Technology Society. This information has been consolidated in tabular form to facilitate further community analysis and discussion. Results A generic insulin infusion pump model architecture has been established. A fairly comprehensive hazard analysis document, corresponding to the GIIP model, is presented in this article. Conclusions We believe that this work represents the genesis of an insulin pump safety reference standard upon which future insulin pump designs can be based to help ensure a basic level of safety. More interaction with the diabetes community is needed to assure the quality of this safety modeling process. PMID:20307387

  6. BASE Flexible Array Preliminary Lithospheric Structure Analysis

    Science.gov (United States)

    Yeck, W. L.; Sheehan, A. F.; Anderson, M. L.; Siddoway, C. S.; Erslev, E.; Harder, S. H.; Miller, K. C.

    2009-12-01

    The Bighorns Arch Seismic Experiment (BASE) is a Flexible Array experiment integrated with EarthScope. The goal of BASE is to develop a better understanding of how basement-involved foreland arches form and what their link is to plate tectonic processes. To achieve this goal, the crustal structure under the Bighorn Mountain range, Bighorn Basin, and Powder River Basin of northern Wyoming and southern Montana is investigated through the deployment of 35 broadband seismometers, 200 short period seismometers, 1600 “Texan” instruments using active sources and 800 “Texan” instruments monitoring passive sources, together with field structural analysis of brittle structures. The novel combination of these approaches and anticipated simultaneous data inversion will give a detailed structural crustal image of the Bighorn region at all levels of the crust. Four models have been proposed for the formation of the Bighorn foreland arch: subhorizontal detachment within the crust, lithospheric buckling, pure shear lithospheric thickening, and fault blocks defined by lithosphere-penetrating thrust faults. During the summer of 2009, we deployed 35 broadband instruments, which have already recorded several magnitude 7+ teleseismic events. Through P wave receiver function analysis of these 35 stations folded in with many EarthScope Transportable Array stations in the region, we present a preliminary map of the Mohorovicic discontinuity. This crustal map is our first test of how the unique Moho geometries predicted by the four hypothesized models of basement involved arches fit seismic observations for the Bighorn Mountains. In addition, shear-wave splitting analysis for our first few recorded teleseisms helps us determine if strong lithospheric deformation is preserved under the range. These analyses help lead us to our final goal, a complete 4D (3D spatial plus temporal) lithospheric-scale model of arch formation which will advance our understanding of the mechanisms

  7. Lithium-thionyl chloride cell system safety hazard analysis

    Science.gov (United States)

    Dampier, F. W.

    1985-03-01

    This system safety analysis for the lithium thionyl chloride cell is a critical review of the technical literature pertaining to cell safety and draws conclusions and makes recommendations based on this data. The thermodynamics and kinetics of the electrochemical reactions occurring during discharge are discussed with particular attention given to unstable SOCl2 reduction intermediates. Potentially hazardous reactions between the various cell components and discharge products or impurities that could occur during electrical or thermal abuse are described and the most hazardous conditions and reactions identified. Design factors influencing the safety of Li/SOCl2 cells, shipping and disposal methods and the toxicity of Li/SOCl2 battery components are additional safety issues that are also addressed.

  8. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Food and Drug Administration, Department of Health and Human Services (continued); Food for Human Consumption; Hazard Analysis and Critical Control Point (HACCP) Systems; General Provisions; § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. ...

  9. Preliminary design package for Sunair SEC-601 solar collector

    Science.gov (United States)

    1978-01-01

    The preliminary design of the Owens-Illinois model Sunair SEC-601 tubular air solar collector is presented. Information in this package includes the subsystem design and development approaches, the hazard analysis, and the detailed drawings available at the preliminary design review.

  10. Seismic Hazard analysis of Adjaria Region in Georgia

    Science.gov (United States)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., fixing a site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during the hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution.
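    The sketch below illustrates the classical hazard-curve integration that tools such as CRISIS2007 implement: annual exceedance rates are accumulated over discretized magnitudes and distances, with a lognormal ground-motion model. The attenuation coefficients, recurrence rates, and source discretization are placeholders for illustration, not the models used in the Adjaria study.

```python
import numpy as np
from scipy.stats import norm

def hazard_curve(pga_levels_g, mags, annual_rates, dists_km, dist_probs,
                 c0=-1.0, c1=0.5, c2=-1.3, sigma_ln=0.6):
    """Annual exceedance rate for each PGA level from one discretized source.

    ln(PGA) = c0 + c1*M + c2*ln(R) + eps, eps ~ N(0, sigma_ln); the
    coefficients are illustrative placeholders, not a published GMPE.
    """
    rates = np.zeros_like(pga_levels_g, dtype=float)
    for m, nu in zip(mags, annual_rates):          # nu: annual rate of magnitude bin m
        for r, pr in zip(dists_km, dist_probs):    # pr: probability of distance bin r
            mu_ln = c0 + c1 * m + c2 * np.log(r)
            p_exceed = 1.0 - norm.cdf(np.log(pga_levels_g), mu_ln, sigma_ln)
            rates += nu * pr * p_exceed
    return rates

pga = np.array([0.05, 0.1, 0.2, 0.4])              # PGA levels in g
mags = [5.0, 6.0, 7.0]                             # magnitude bins
annual_rates = [0.1, 0.01, 0.001]                  # e.g. from a G-R recurrence
dists = [20.0, 50.0]                               # km, discretized source geometry
dist_probs = [0.5, 0.5]
lam = hazard_curve(pga, mags, annual_rates, dists, dist_probs)
print(dict(zip(pga.tolist(), lam.round(5).tolist())))
```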

  11. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    Science.gov (United States)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g at a 2 % and 10 % POE, respectively, in the next 50 years was found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces within Thailand, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was found to lie within a low seismic hazard zone in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
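    The probability-of-exceedance figures quoted above follow from the usual Poisson relation between POE over an exposure time and the return period, P = 1 - exp(-t/T); a tiny sketch:

```python
import math

def return_period(poe, exposure_years):
    """Return period implied by a probability of exceedance over an exposure
    time, under the usual Poisson assumption: P = 1 - exp(-t / T)."""
    return -exposure_years / math.log(1.0 - poe)

print(round(return_period(0.10, 50)))   # ~475 years
print(round(return_period(0.02, 50)))   # ~2475 years
```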

  12. Hazardous materials transportation: a risk-analysis-based routing methodology.

    Science.gov (United States)

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; arc costs are then defined in order to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed and further research developments are finally proposed.
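    A minimal sketch of the minimum cost flow formulation described above, using networkx on a toy network in which each arc carries a generalized cost (transport plus monetized risk) and a risk-derived capacity. The network, costs, and capacities are invented for illustration and are unrelated to OPTIPATH.

```python
import networkx as nx

# Toy road network: edge 'weight' is a generalized cost per shipment
# (out-of-pocket transport cost plus monetized risk) and 'capacity' is the
# maximum number of shipments allowed by the individual/societal risk criteria.
G = nx.DiGraph()
G.add_edge("origin", "A", weight=4, capacity=10)
G.add_edge("origin", "B", weight=2, capacity=6)
G.add_edge("A", "destination", weight=3, capacity=10)
G.add_edge("B", "destination", weight=6, capacity=6)
G.add_edge("B", "A", weight=1, capacity=4)

# 8 shipments must leave the origin and reach the destination.
G.nodes["origin"]["demand"] = -8
G.nodes["destination"]["demand"] = 8

flow = nx.min_cost_flow(G)          # cheapest feasible flow honouring capacities
print(flow)
print("total cost:", nx.cost_of_flow(G, flow))
```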

  13. Comparative analysis of hazardous household waste in two Mexican regions.

    Science.gov (United States)

    Delgado, Otoniel Buenrostro; Ojeda-Benítez, Sara; Márquez-Benavides, Liliana

    2007-01-01

    Household hazardous waste (HHW) generation in two Mexican regions was examined, a northern region (bordering with the USA) and a central region. The aim of this work was to determine the dynamics of solid waste generation and to be able to compare the results of both regions, regarding consumption patterns and solid waste generation rates. In the northern region, household solid waste was analysed quantitatively. In order to perform this analysis, the population was categorized into three socioeconomic strata (lower, middle, upper). Waste characterization revealed the presence of products that give origin to household hazardous waste. In the northern region (Mexicali city), household hazardous waste comprised 3.7% of municipal solid waste, the largest categories in this fraction were home care products (29.2%), cleaning products (19.5%) and batteries and electronic equipment (15.7%). In the central region, HHW comprised 1.03% of municipal solid waste; the main categories in this fraction were represented by cleaning products (39%), self care products (27.3%), and insecticides (14.4%). In Mexicali, the socioeconomic study demonstrated that the production of HHW is independent of the income level. Furthermore, the composition of the solid waste stream in both regions suggested the influence of another set of variables such as local climate, migration patterns and marketing coverage. Further research is needed in order to establish the effect of low quantities of HHW upon the environment and public health.

  14. A GIS Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (Italy)

    Science.gov (United States)

    Grasso, S.; Maugeri, M.

    rigorous complex methods of analysis or qualitative procedures. A semi-quantitative procedure based on the definition of a geotechnical hazard index has been applied for the zonation of the seismic geotechnical hazard of the city of Catania. In particular, this procedure has been applied to define the influence of the geotechnical properties of the soil in a central area of the city of Catania, where some historical buildings of great importance are sited. An investigation was also performed, based on the inspection of more than one hundred historical ecclesiastical buildings of great importance located in the city. Then, in order to identify the amplification effects due to site conditions, a geotechnical survey form was prepared to allow a semi-quantitative evaluation of the seismic geotechnical hazard for all these historical buildings. In addition, to evaluate the foundation soil time-history response, a 1-D dynamic soil model was employed for all these buildings, considering the nonlinearity of soil behaviour. Using a GIS, a map of the seismic geotechnical hazard, a map of the liquefaction hazard and a preliminary map of the seismic hazard for the city of Catania have been obtained. From the analysis of the obtained results it may be noticed that the high hazard zones are mainly clayey sites

  15. Mapping a Volcano Hazard Area of Mount Sinabung Using Drone: Preliminary Results

    Science.gov (United States)

    Tarigan, A. P. M.; Suwardhi, D.; Fajri, M. N.; Fahmi, F.

    2017-03-01

    Mount Sinabung has remained active since its first eruption in 2010 and has been declared a national disaster. The persistent eruptions since then have severely affected the surrounding villages located within 5 km of its crater. The purpose of this study is to explore drone technology and its applicability to mapping a volcanic hazard area. The first essential step in this study is to have a well-defined flight mission in order to acquire air photos that can be processed in the subsequent procedures. The following steps, including geometry correction and photo stitching, were conducted automatically using appropriate software. It was found that the resulting photo mosaic and 3D map can be obtained in an effective and efficient manner, and several important interpretations can be made from them.

  16. Preliminary Assessment of Operational Hazards and Safety Requirements for Airborne Trajectory Management (ABTM) Roadmap Applications

    Science.gov (United States)

    Cotton, William B.; Hilb, Robert; Koczo, Stefan, Jr.; Wing, David J.

    2016-01-01

    A set of five developmental steps building from the NASA TASAR (Traffic Aware Strategic Aircrew Requests) concept are described, each providing incrementally more efficiency and capacity benefits to airspace system users and service providers, culminating in a Full Airborne Trajectory Management capability. For each of these steps, the incremental Operational Hazards and Safety Requirements are identified for later use in future formal safety assessments intended to lead to certification and operational approval of the equipment and the associated procedures. Two established safety assessment methodologies that are compliant with the FAA's Safety Management System were used leading to Failure Effects Classifications (FEC) for each of the steps. The most likely FEC for the first three steps, Basic TASAR, Digital TASAR, and 4D TASAR, is "No effect". For step four, Strategic Airborne Trajectory Management, the likely FEC is "Minor". For Full Airborne Trajectory Management (Step 5), the most likely FEC is "Major".

  17. Tsunami hazard assessment at Port Alberni, BC, Canada: preliminary model results

    Science.gov (United States)

    Grilli, S. T.; Insua, T. L.; Grilli, A. R.; Douglas, K. L.; Shelby, M. R.; Wang, K.; Gao, D.

    2016-12-01

    Located in the heart of Vancouver Island, BC, Port Alberni has a well-known history of tsunamis. Many of the Nuu-Chah-Nulth First Nations share oral stories about a strong fight between a thunderbird and a whale that caused big waves on a winter night, a story that is compatible with the recently recognized great Cascadia tsunami in January 1700. Port Alberni, with a total population of approximately 20,000 people, lies beside the Somass River, at the very end of Barkley Sound Inlet. The narrow canal connecting this town to the Pacific Ocean runs for more than 64 km (about 40 miles) between steep mountains, providing an ideal setting for the amplification of tsunami waves through funnelling effects. The devastating effects of tsunamis are still fresh in residents' memories from the impact of the 1964 Alaska tsunami that caused serious damage to the city. In June 2016, Emergency Management BC ran a coastal exercise in Port Alberni, simulating the response to an earthquake and a tsunami. Over three days, the emergency teams in the City of Port Alberni practiced and learned from the experience. Ocean Networks Canada contributed to this exercise with the development of preliminary simulations of tsunami impact on the city from a buried rupture of the Cascadia Subduction Zone, including the Explorer segment. Wave propagation was simulated with the long-wave model FUNWAVE-TVD. Preliminary results indicate a strong amplification of tsunami waves in the Port Alberni area. The inundation zone in Port Alberni had a footprint similar to that of the 1700 Cascadia and 1964 Alaska tsunamis, inundating the area surrounding the Somass River and preferentially following the Kitsuksis and Roger Creek river margins into the city. Several other tsunami source scenarios, including splay faulting and trench-breaching ruptures, are currently being modeled for the city of Port Alberni following a similar approach. These results will be presented at the conference.

  18. Fire hazards analysis for W030 tank farm ventilation upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Huckfeldt, R.A.

    1996-07-17

    This Fire Hazard Analysis (FHA) was prepared according to the requirements of U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection, 2-17-93. The purpose of this FHA is to ascertain whether the objectives of DOE 5480.7A are being met. This purpose is accomplished through a conservative, comprehensive assessment of the risk from fire and other perils within individual fire areas of a DOE facility in relation to the proposed fire protection. This FHA is based on conditions set forth within this document and is valid only under these conditions.

  19. Two-dimensional hazard estimation for longevity analysis

    DEFF Research Database (Denmark)

    Fledelius, Peter; Guillen, M.; Nielsen, J.P.

    2004-01-01

    We investigate developments in Danish mortality based on data from 1974-1998, working in a two-dimensional model with chronological time and age as the two dimensions. The analyses are done with non-parametric kernel hazard estimation techniques. The only assumption is that the mortality surface ... the two-dimensional mortality surface. Furthermore we look at aggregated synthetic population metrics such as 'population life expectancy' and 'population survival probability'. For Danish women these metrics indicate decreasing mortality with respect to chronological time. The metrics can not directly be used for analysis of economic implications arising from mortality changes.
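    A minimal sketch of a two-dimensional kernel hazard (mortality) estimator in the spirit of the paper: at each (age, calendar-year) point the hazard is a kernel-weighted ratio of deaths to exposure. The product Gaussian kernel, fixed bandwidths, and toy data are assumptions for illustration, not the estimator or data used by the authors.

```python
import numpy as np

def kernel_hazard(ages, years, deaths, exposure, grid_age, grid_year,
                  h_age=2.0, h_year=2.0):
    """Kernel-weighted occurrence/exposure hazard at paired (age, year) points.

    deaths[i] and exposure[i] are the death count and person-years observed
    at (ages[i], years[i]); a product Gaussian kernel is assumed.
    """
    w = (np.exp(-0.5 * ((grid_age[:, None] - ages[None, :]) / h_age) ** 2) *
         np.exp(-0.5 * ((grid_year[:, None] - years[None, :]) / h_year) ** 2))
    return (w @ deaths) / (w @ exposure)

# toy data: three observed (age, year) cells
ages = np.array([70.0, 71.0, 72.0])
years = np.array([1980.0, 1990.0, 1998.0])
deaths = np.array([30.0, 25.0, 20.0])
exposure = np.array([1000.0, 1100.0, 1200.0])
print(kernel_hazard(ages, years, deaths, exposure,
                    grid_age=np.array([70.0, 72.0]),
                    grid_year=np.array([1985.0, 1995.0])))
```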

  20. Dual-fuel, dual-throat engine preliminary analysis

    Science.gov (United States)

    Obrien, C. J.

    1979-01-01

    A propulsion system analysis of the dual fuel, dual throat engine for launch vehicle applications was conducted. Basic dual throat engine characterization data were obtained to allow vehicle optimization studies to be conducted. A preliminary baseline engine system was defined.

  1. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    Full Text Available The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  2. DEEP-South: Preliminary Lightcurves of Potentially Hazardous Asteroids from the First Year Operation

    Science.gov (United States)

    Moon, Hong-Kyu; Kim, Myung-Jin; Choi, Young-Jun; Yim, Hong-Suh; Park, Jintae; Roh, Dong-Goo; Lee, Hee-Jae; Oh, Young-Seok; Bae, Young-Ho

    2016-10-01

    Deep Ecliptic Patrol of the Southern Sky (DEEP-South) observations are being made during the off-season for the exoplanet search. They started in October 2015, using the Korea Microlensing Telescope Network (KMTNet), a network of three identical telescopes with 1.6 m apertures equipped with 18K × 18K CCDs, located in Chile (CTIO), South Africa (SAAO), and Australia (SSO). The combination of KMTNet's prime focus optics and the 340-million-pixel CCD provides a four-square-degree field of view with a 0.4 arcsec/pixel plate scale. Most of the allocated time for DEEP-South is devoted to targeted photometry of PHAs and NEAs to increase the number of those objects with known physical properties. This is efficiently achieved by multiband, time-series photometry. This Opposition Census (OC) mode targets objects near their opposition, beginning with km-sized PHAs in its early stage and going down to sub-km objects. Continuous monitoring of the sky with KMTNet is optimized for the spin characterization of various kinds of asteroids, including binaries, slow/fast rotators and non-principal-axis rotators, and is hence expected to facilitate the debiasing of previously reported lightcurve observations. We present preliminary lightcurves of PHAs from year one of the DEEP-South project.

  3. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    Science.gov (United States)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change, which is causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (e.g., Emmer and Vilimek, 2014; Wang et al., 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain if any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to arrive at a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics
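    A minimal sketch of the kind of weighted combination described: each parameter is normalized to [0, 1] and merged into a single hazard score. The parameter names and weights below are invented for illustration and are not the study's equation.

```python
def lake_hazard_score(params, weights):
    """Weighted hazard score in [0, 1] from normalized parameters.

    `params` maps parameter name -> value already normalized to [0, 1];
    the parameter set and weights used here are illustrative assumptions.
    """
    total_w = sum(weights.values())
    return sum(weights[k] * params[k] for k in weights) / total_w

weights = {"lake_area_growth": 0.3, "slope_above_lake": 0.25,
           "glacier_proximity": 0.25, "moraine_dam_steepness": 0.2}
example_lake = {"lake_area_growth": 0.8, "slope_above_lake": 0.6,
                "glacier_proximity": 0.9, "moraine_dam_steepness": 0.4}
print(f"hazard score: {lake_hazard_score(example_lake, weights):.2f}")
```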

  4. Fire hazard analysis for Plutonium Finishing Plant complex

    Energy Technology Data Exchange (ETDEWEB)

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, "Fire Protection" [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  5. Hazards by shock waves during explosive eruptions: preliminary results of experimental investigations.

    Science.gov (United States)

    Scolamacchia, Teresa; Alatorre Ibarguengoïtia, Miguel; Spieler, Oliver; Dingwell, Donald B.

    2010-05-01

    A recent study (Scolamacchia and Shouwenaars, 2009) investigated the nature of microscopic craters on the steel surface of a basketball pole left standing in one of the villages destroyed by the 1982 eruption of El Chichón volcano. The craters were attributed to the impacts of ash particles (70-280 μm) accelerated by shock waves through an efficient momentum coupling with the gas phase, such that the sudden expansion of the gas caused by shock wave propagation drags the particles up to speeds of 710 to 980 m/s. Several open questions existed about this kind of phenomenon. Preliminary tests were performed to investigate the correlation between particle size and the high velocities calculated on the basis of the internal deformation of the steel and the crater geometry. We used a shock tube apparatus consisting of a high-pressure (HP) steel autoclave, pressurized with Ar gas, and a low-pressure (LP) tank at atmospheric conditions. We used ash and lapilli bulk samples from El Chichón trachyandesites, and lapilli with random irregular shapes obtained by crushing and abrading dacitic blocks from pyroclastic flow deposits of Unzen volcano. The samples were placed inside an autoclave at ambient T and P, located between the HP autoclave and the LP tank. Steel plates (of the same type as the original impacted material) were fixed to the LP tank walls, 10 cm above the autoclave that contained the samples. Shock waves were generated by the sudden decompression of the Ar gas due to the systematic failure of a diaphragm (which separates the LP from the HP section). Gas expansion accelerated the particles from below toward the steel plate. The speed of the particles was measured using a system of 4 copper wires conducting an electric signal; the signals dropped when the particles reached the wires. We used low pressure ranges (3.1 to 9.8 MPa) for all experimental runs, obtaining particle velocities between 40 and 257 m/s. These velocities can be attained by pyroclastic density currents. Higher

  6. Regional Hazard Analysis For Use In Vulnerability And Risk Assessment

    Directory of Open Access Journals (Sweden)

    Maris Fotios

    2015-09-01

    Full Text Available A method for supporting an operational regional risk and vulnerability analysis for hydrological hazards is suggested and applied on the island of Cyprus. The method aggregates the output of a hydrological flow model, forced by observed temperatures and precipitation, with observed discharge data. A calibration scheme supported by observed discharge is applied. A comparison of different calibration schemes indicated that the same model parameters can be used for the entire country. In addition, it was demonstrated that, for operational purposes, it is sufficient to rely on a few stations. Model parameters were adjusted to account for land use, and thus for the vulnerability of elements at risk, by comparing observed and simulated flow patterns, using all components of the hydrological model. The results can be used for regional risk and vulnerability analysis in order to increase the resilience of the affected population.

  7. [Tuscan Chronic Care Model: a preliminary analysis].

    Science.gov (United States)

    Barbato, Angelo; Meggiolaro, Angela; Rossi, Luigi; Fioravanti, C; Palermita, F; La Torre, Giuseppe

    2015-01-01

    The aim of this study is to present a preliminary analysis of the efficacy and effectiveness of a model of care for the chronically ill (Chronic Care Model, CCM). The analysis took into account 106 territorial modules, 1,016 general practitioners and 1,228,595 patients. The diagnostic and therapeutic pathways activated (PDTA) involved four chronic conditions, selected according to their prevalence and incidence in the Tuscany Region: diabetes mellitus (DM), heart failure (SC), chronic obstructive pulmonary disease (COPD) and stroke. Six epidemiological indicators of process and output were selected in order to measure the model of care before and after its application: adherence to the specific follow-up for each pathology (use of clinical and laboratory indicators), average annual per capita expenditure (in euros) for laboratory and instrumental diagnostic tests, average annual per capita expenditure for specialist visits, hospitalization rate for diseases related to the main pathology, hospitalization rate for long-term complications, and rate of access to the emergency department (ED). Data were collected through the database; the differences before and after the intervention, and between exposed and unexposed patients, were analyzed with the "Before-After (Controlled and Uncontrolled) Studies" method. The impact of the intervention was calculated as the DD (difference of differences). DM management showed increased adherence to follow-up (DD: +8.1%), greater use of laboratory diagnostics (DD: +4.9 €/year per capita), fewer hospitalizations for long-term complications and for endocrine-related diseases (DD: 5.8/1000 and +1.2/1000, respectively), and a smaller increase in access to the ED (DD: -1.6/1000), despite a slight increase in specialist visits (DD: +0.38 €/year per capita). The management of SC initially showed rising adherence to follow-up (DD: +2.3%), a decrease in specialist visits (DD: 1.03 €/year per capita), and decreases in hospitalization and access to the ED for exacerbations (DD: -4.4/1000 and DD: -6
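    For clarity, the DD (difference of differences) impact measure reported above is simply the before-after change in the exposed group minus the before-after change in the unexposed group; a tiny sketch with placeholder numbers:

```python
def difference_of_differences(pre_exposed, post_exposed, pre_control, post_control):
    """DD = change in the exposed group minus change in the control group."""
    return (post_exposed - pre_exposed) - (post_control - pre_control)

# placeholder values: follow-up adherence (%) before and after CCM activation
dd = difference_of_differences(pre_exposed=52.0, post_exposed=61.0,
                               pre_control=50.0, post_control=50.9)
print(f"DD: {dd:+.1f} percentage points")
```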

  8. Analyzing Distributed Functions in an Integrated Hazard Analysis

    Science.gov (United States)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  9. Preliminary hazard assessment and site characterization of Meşelik campus area, Eskişehir-Turkey

    Directory of Open Access Journals (Sweden)

    A. Orhan

    2013-01-01

    Full Text Available Limited knowledge of ground conditions, such as geotechnical parameters, is one of the main causes of foundation failure. Unknown ground conditions can also impose additional costs. Due to a lack of geotechnical parameters for the foundation soil, problems can be observed during and after construction.

    In this study, a comprehensive field study was conducted to make a preliminary hazard assessment of the Meşelik campus area, Eskişehir, Turkey. In this context, the experimental studies were performed in two stages. In the first stage, boreholes were drilled in the field, a standard penetration test (SPT) was performed, and disturbed/undisturbed samples were collected from certain levels. In the second stage, laboratory tests were performed in order to identify and classify the samples. Unconfined compression strength and triaxial compression tests were conducted on undisturbed samples to determine their engineering characteristics. XRD (X-ray diffraction) tests were performed and the swelling potential of the samples was evaluated. The liquefaction potential of the area was also assessed using an SPT-based method. Thus, the geotechnical parameters and the liquefaction potential of the subsurface in the study area were thoroughly analyzed and presented for use in further studies.

  10. Preliminary volcano-hazard assessment for the Katmai volcanic cluster, Alaska

    Science.gov (United States)

    Fierstein, Judy; Hildreth, Wes

    2000-01-01

    The world’s largest volcanic eruption of the 20th century broke out at Novarupta (fig. 1) in June 1912, filling with hot ash what came to be called the Valley of Ten Thousand Smokes and spreading downwind more fallout than all other historical Alaskan eruptions combined. Although almost all the magma vented at Novarupta, most of it had been stored beneath Mount Katmai 10 km away, which collapsed during the eruption. Airborne ash from the 3-day event blanketed all of southern Alaska, and its gritty fallout was reported as far away as Dawson, Ketchikan, and Puget Sound (fig. 21). Volcanic dust and sulfurous aerosol were detected within days over Wisconsin and Virginia; within 2 weeks over California, Europe, and North Africa; and in latter-day ice cores recently drilled on the Greenland ice cap. There were no aircraft in Alaska in 1912—fortunately! Corrosive acid aerosols damage aircraft, and ingestion of volcanic ash can cause abrupt jet-engine failure. Today, more than 200 flights a day transport 20,000 people and a fortune in cargo within range of dozens of restless volcanoes in the North Pacific. Air routes from the Far East to Europe and North America pass over and near Alaska, many flights refueling in Anchorage. Had this been so in 1912, every airport from Dillingham to Dawson and from Fairbanks to Seattle would have been enveloped in ash, leaving pilots no safe option but to turn back or find refuge at an Aleutian airstrip west of the ash cloud. Downwind dust and aerosol could have disrupted air traffic anywhere within a broad swath across Canada and the Midwest, perhaps even to the Atlantic coast. The great eruption of 1912 focused scientific attention on Novarupta, and subsequent research there has taught us much about the processes and hazards associated with such large explosive events (Fierstein and Hildreth, 1992). Moreover, work in the last decade has identified no fewer than 20 discrete volcanic vents within 15 km of Novarupta (Hildreth and others

  11. Preliminary Cost Benefit Assessment of Systems for Detection of Hazardous Weather. Volume II. Appendices,

    Science.gov (United States)

    1981-07-01

    intensification went undetected. The straight-line winds from the storm, a microburst in Dr. Ted Fujita's analysis, damaged 65 homes and mobile homes... Science and Technology, 95th Congress, No. 32, U.S. GPO, Washington, D.C. ... U.S. House of Representatives, 1978, Weather Forecasting - Past, Present

  12. Hazard analysis system of urban post-earth-quake fire based on GIS

    Institute of Scientific and Technical Information of China (English)

    李杰; 江建华; 李明浩

    2001-01-01

    The authors study the structure, functions and data organization of a hazard analysis system for urban post-earthquake fire on a GIS platform. A general hazard analysis model for post-earthquake fire is presented. Taking the Shanghai central district as the study background, a system for post-earthquake fire hazard analysis and auxiliary decision-making against fire is developed.

  13. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of)]; Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    Software in PLCs and FPGAs, which are used to develop I&C systems, should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and indicates that software hazard analysis should be performed across the phases of the software life cycle, such as requirements analysis, design, detailed design, and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is a useful technique for applying guide phrases. HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC software development in Korean nuclear power plants. In those studies, appropriate guide phrases and analysis processes were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words (GW), and we also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with the HAZOP approach using general guide words, and it is sufficiently applicable to analyzing the software requirements specification of an FPGA.

  14. Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing

    Science.gov (United States)

    Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.

    2008-01-01

    Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics, and the overall guidance, navigation, and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.

  15. Hazard function analysis for flood planning under nonstationarity

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
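
    A minimal Monte Carlo sketch of the idea, under assumed (not the authors') parameter values: annual maxima are drawn from a lognormal distribution whose location parameter trends upward over time, and the return period T is recorded as the first year in which a fixed design event is exceeded.

```python
# Minimal sketch (not the authors' code): simulate the return period T of
# exceeding a fixed design flood x0 when annual maxima follow a lognormal
# distribution with a linearly increasing location parameter.
# All parameter values below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(42)

mu0, sigma, trend = 3.0, 0.4, 0.01   # log-space location, scale, per-year trend (assumed)
x0 = np.exp(mu0 + 2.0 * sigma)       # design event: roughly the initial 1-in-40-year flood
n_years, n_sims = 500, 5000

def sample_T():
    """First year in which the annual maximum exceeds the design event x0."""
    for t in range(1, n_years + 1):
        if rng.lognormal(mean=mu0 + trend * t, sigma=sigma) > x0:
            return t
    return n_years  # censored; rare when the trend is upward

T = np.array([sample_T() for _ in range(n_sims)])
print("mean return period:", round(T.mean(), 1))
print("coefficient of variation:", round(T.std() / T.mean(), 2))
# Under stationarity T is approximately geometric (exponential in continuous
# time); with the trend its distribution is closer to a two-parameter Weibull.
```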

  16. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  17. Analysis Landslide Hazard in Banjarmangu Sub District, Banjarnegara District

    Directory of Open Access Journals (Sweden)

    Kuswaji Dwi Priyono

    2016-05-01

    The objective of the research is to find the most suitable soil conservation practice that may be applied to control landslide hazard. To achieve that objective, several research steps were carried out: (1) identify the land characteristics of the study area based on an understanding of the factors that cause and trigger landslide hazard, i.e. slope morphology, rock/soil characteristics, climatic conditions, and land use; (2) study the types of landslide that occur on each landform and determine the areas having typical landslide forms. Landslide is understood here as the process of mass wasting downslope resulting from the action of gravity on sliding materials; the landslide types include creep, slide, slump, and rock/soil fall. The methods applied in the research include field survey and determination of landslide hazard using geographic information techniques. The field survey was intended to characterize the location of every landslide that has occurred in the study area, and its results were used as the basis for determining the grade of landslide hazard. Scoring and weighting of the factors that influence landslides were applied to determine the hazard grade; scores and weights were not the same for every parameter used in the evaluation. The field research shows that landslides occur in every landform unit; the study area can be divided into 9 landform units, which were differentiated into 5 classes of landslide hazard: (1) very low hazard, 16.65% (1 landform unit); (2) low hazard, 7.63% (1 landform unit); (3) medium hazard, 37.58% (3 landform units); (4) high hazard, 25.41% (2 landform units); and (5) highest hazard, 12.73% (2 landform units). Evaluation of landslide hazard shows that most of the study area
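
    The scoring-and-weighting step can be illustrated with a small sketch; the parameter scores, weights, and class thresholds below are invented for illustration and are not those of the study.

```python
# Minimal sketch: combine per-parameter scores and unequal weights into a
# landslide hazard index per landform unit and bin it into five classes.
# Scores, weights and thresholds are invented placeholders.

slope_score = {"unit_1": 3, "unit_2": 7, "unit_3": 9}
lithology   = {"unit_1": 2, "unit_2": 6, "unit_3": 8}
landuse     = {"unit_1": 4, "unit_2": 5, "unit_3": 7}

weights = {"slope": 0.5, "lithology": 0.3, "landuse": 0.2}  # weights differ per parameter

def hazard_index(unit):
    return (weights["slope"] * slope_score[unit]
            + weights["lithology"] * lithology[unit]
            + weights["landuse"] * landuse[unit])

def hazard_class(index):
    # illustrative thresholds for: very low / low / medium / high / highest
    for bound, label in [(3, "very low"), (5, "low"), (7, "medium"), (8.5, "high")]:
        if index < bound:
            return label
    return "highest"

for unit in slope_score:
    idx = hazard_index(unit)
    print(unit, round(idx, 2), hazard_class(idx))
```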

  18. A decision analysis framework for estimating the potential hazards for drinking water resources of chemicals used in hydraulic fracturing fluids.

    Science.gov (United States)

    Yost, Erin E; Stanek, John; Burgoon, Lyle D

    2017-01-01

    Despite growing concerns over the potential for hydraulic fracturing to impact drinking water resources, there are limited data available to identify chemicals used in hydraulic fracturing fluids that may pose public health concerns. In an effort to explore these potential hazards, a multi-criteria decision analysis (MCDA) framework was employed to analyze and rank selected subsets of these chemicals by integrating data on toxicity, frequency of use, and physicochemical properties that describe transport in water. Data used in this analysis were obtained from publicly available databases compiled by the United States Environmental Protection Agency (EPA) as part of a larger study on the potential impacts of hydraulic fracturing on drinking water. Starting with nationwide hydraulic fracturing chemical usage data from EPA's analysis of the FracFocus Chemical Disclosure Registry 1.0, MCDAs were performed on chemicals that had either noncancer toxicity values (n=37) or cancer-specific toxicity values (n=10). The noncancer MCDA was then repeated for subsets of chemicals reported in three representative states (Texas, n=31; Pennsylvania, n=18; and North Dakota, n=20). Within each MCDA, chemicals received scores based on relative toxicity, relative frequency of use, and physicochemical properties (mobility in water, volatility, persistence). Results show a relative ranking of these chemicals based on hazard potential, and provide preliminary insight into chemicals that may be more likely than others to impact drinking water resources. Comparison of nationwide versus state-specific analyses indicates regional differences in the chemicals that may be of more concern to drinking water resources, although many chemicals were commonly used and received similar overall hazard rankings. Several chemicals highlighted by these MCDAs have been reported in groundwater near areas of hydraulic fracturing activity. This approach is intended as a preliminary analysis, and represents one
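
    A minimal sketch of this kind of MCDA aggregation, with invented chemicals, criterion values, and weights (not the EPA data or code): each criterion is normalized so that higher means more hazardous, and the weighted sum gives a relative hazard-potential ranking.

```python
# Minimal sketch of an MCDA ranking of the kind described above.
# All chemical names, criterion values and weights are invented.

import numpy as np

# rows: chemicals; columns: toxicity score, frequency of use, mobility, persistence
names = ["chem_A", "chem_B", "chem_C"]
raw = np.array([
    [8.0, 0.30, 0.9, 0.4],
    [5.0, 0.80, 0.5, 0.7],
    [9.5, 0.05, 0.2, 0.9],
])
criterion_weights = np.array([0.4, 0.3, 0.2, 0.1])  # assumed weighting

# min-max normalize each criterion to [0, 1] so higher always means more hazard
normed = (raw - raw.min(axis=0)) / (raw.max(axis=0) - raw.min(axis=0))
scores = normed @ criterion_weights

for name, s in sorted(zip(names, scores), key=lambda t: -t[1]):
    print(f"{name}: hazard potential score = {s:.2f}")
```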

  19. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    Science.gov (United States)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform especially decision makers in the insurance industry, the administration, and politicians on potential consequences and are the basis for appropriate risk management strategies. Thereby, results (i) with an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The first analyses are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio were the basis for estimating spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  20. Preliminary safety design analysis of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Suk, Soo Dong; Kwon, Y. M.; Kim, K. D. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-03-01

    The national long-term R and D program updated in 1997 requires the Korea Atomic Energy Research Institute (KAERI) to complete by the year 2006 the basic design of the Korea Advanced Liquid Metal Reactor (KALIMER), along with supporting R and D work, with the capability of resolving the issue of spent fuel storage as well as with significantly enhanced safety. KALIMER is a 150 MWe pool-type sodium-cooled prototype reactor that uses metallic fuel. The conceptual design is currently under way to establish a self-consistent design meeting a set of major safety design requirements for accident prevention. Current emphases include inherent and passive means of negative reactivity insertion and decay heat removal, high shutdown reliability, prevention of and protection from sodium chemical reactions, and high seismic margin, among others. All of these requirements affect the reactor design significantly and involve supporting R and D programs of substance. This document first introduces the set of safety design requirements and accident evaluation criteria established for the conceptual design of KALIMER and then summarizes some of the preliminary results of engineering and design analyses performed for the safety of KALIMER. 19 refs., 19 figs., 6 tabs. (Author)

  1. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis

    Science.gov (United States)

    Schaefer, K.; Prakash, A.; Witte, W.

    2011-12-01

    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area dominated by a semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wildland fires, earthquakes, extreme winter storms, and other natural and man-made hazards. With a large area of 19,065 km2 and a population of approximately 97,000 residents, providing emergency services in a timely manner is a challenge. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network analysis based decision support tool that we have created for planning and emergency response. This tool will be used by Emergency Services (Fire/EMS), Emergency Management, the Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare for and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis resulted in determining response times and classifying the borough by response time to facilitate allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) a major wildfire threatening Fairbanks, (ii) a major earthquake, and (iii) loss of power during flooding in a flood-prone area. We also combined the network analysis results with high resolution imagery and elevation data to determine
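
    A minimal sketch of the network-analysis step, using the open-source networkx package rather than the ArcGIS Network Analyst/HAZUS-MH workflow described above; the road graph, travel times, and response-time classes are invented placeholders.

```python
# Minimal sketch: classify address nodes by shortest travel time from an
# emergency station over a toy road network. Edges and thresholds are invented.

import networkx as nx

G = nx.Graph()
# edges: (node_a, node_b, travel time in minutes)
G.add_weighted_edges_from([
    ("station", "A", 4), ("A", "B", 6), ("B", "C", 9),
    ("A", "D", 12), ("D", "C", 5), ("station", "D", 15),
])

# travel time from the station to every reachable address node
times = nx.single_source_dijkstra_path_length(G, "station", weight="weight")

def response_class(minutes):
    if minutes <= 8:
        return "within 8 min"
    if minutes <= 15:
        return "within 15 min"
    return "over 15 min"

for node, t in sorted(times.items()):
    if node != "station":
        print(f"{node}: {t} min ({response_class(t)})")
```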

  2. Preliminary analysis of patent trends for magnetic fusion technology

    Energy Technology Data Exchange (ETDEWEB)

    Levine, L.O.; Ashton, W.B.; Campbell, R.S.

    1984-02-01

    This study presents a preliminary analysis of development trends in magnetic fusion technology based on data from US patents. The research is limited to identification and description of general patent activity and ownership characteristics for 373 patents. The results suggest that more detailed studies of fusion patents could provide useful R and D planning information.

  3. Analysis of hazardous biological material by MALDI mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community in this potential field of application is evident by the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and selection of optimal experimental parameters such as solvent and matrix.

  4. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially
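
    A schematic sketch of the systematic-ranging scan, with the orbit fit replaced by a placeholder residual function; a real implementation would propagate each trial orbit and compare it with the recorded astrometry, and the grid bounds and weighting here are illustrative assumptions.

```python
# Schematic sketch of a systematic-ranging grid scan. The astrometric fit is a
# placeholder; grid bounds and the Gaussian weighting are illustrative only.

import numpy as np

ranges      = np.logspace(-3, 1, 60)        # topocentric range grid [au]
range_rates = np.linspace(-0.03, 0.03, 41)  # range-rate grid [au/day]

def astrometric_residual(rho, rho_dot):
    """Placeholder for the RMS fit of a trial orbit to the observations."""
    return (np.log10(rho) + 1.0) ** 2 + (rho_dot / 0.01) ** 2  # arbitrary bowl shape

# score every grid node; lower residual -> higher weight for that trial orbit
scores = np.array([[astrometric_residual(r, rd) for rd in range_rates] for r in ranges])
weights = np.exp(-0.5 * scores)
weights /= weights.sum()

best = np.unravel_index(np.argmax(weights), weights.shape)
print("most likely trial orbit: range %.4f au, range rate %+.4f au/day"
      % (ranges[best[0]], range_rates[best[1]]))
print("probability mass inside 0.05 au:", weights[ranges < 0.05, :].sum())
```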

  5. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    Science.gov (United States)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

  6. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    Science.gov (United States)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA required at the design and permitting stage before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that in addition to potentially damaging ground motions, a significant risk associated with induced seismicity in general is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which in general occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high
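
    As one hedged illustration of the time-dependence issue (an assumed toy model, not the authors'), the sketch below treats induced events as a nonhomogeneous Poisson process whose expected count scales with cumulative injected volume, so the probability of exceeding a target magnitude grows with injection time instead of being a constant annual value.

```python
# Minimal sketch (assumed toy model): nonhomogeneous Poisson occurrence of
# induced events, with expected counts tied to cumulative injected volume and
# a Gutenberg-Richter magnitude tail. All parameter values are illustrative.

import numpy as np

days = np.arange(1, 366)
injection_rate = 3000.0                # m^3/day, illustrative
cum_volume = injection_rate * days     # cumulative injected volume

events_per_m3 = 2.0e-4                 # illustrative seismicity productivity
b_value, m_min, m_target = 1.0, 0.0, 3.0

# expected number of events above m_target up to each day
n_above = events_per_m3 * cum_volume * 10 ** (-b_value * (m_target - m_min))

# probability of at least one M >= m_target event by day t
p_exceed = 1.0 - np.exp(-n_above)

for t in (30, 180, 365):
    print(f"day {t:3d}: P(at least one M>={m_target}) = {p_exceed[t - 1]:.3f}")
```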

  7. Preliminary analysis of alternative fuel cycles for proliferation evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Steindler, M. J.; Ripfel, H. C.F.; Rainey, R. H.

    1977-01-01

    The ERDA Division of Nuclear Research and Applications proposed 67 nuclear fuel cycles for assessment as to their nonproliferation potential. The object of the assessment was to determine which fuel cycles pose inherently low risk for nuclear weapon proliferation while retaining the major benefits of nuclear energy. This report is a preliminary analysis of these fuel cycles to develop the fuel-recycle data that will complement reactor data, environmental data, and political considerations, which must be included in the overall evaluation. This report presents the preliminary evaluations from ANL, HEDL, ORNL, and SRL and is the basis for a continuing in-depth study. (DLC)

  8. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  9. Open Source Seismic Hazard Analysis Software Framework (OpenSHA)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — OpenSHA is an effort to develop object-oriented, web- & GUI-enabled, open-source, and freely available code for conducting Seismic Hazard Analyses (SHA). Our...

  10. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

    Science.gov (United States)

    Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

    2016-04-01

    Following changes in the system of Swiss subsidy in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and prioritization of mitigation projects. These tools serve as a useful decision-making instrument to the community of practitioners and responsible authorities for natural hazard risk management in Switzerland. However, there are several aspects which could be improved, in particular, the integration and visualization of spatial information interactively through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by using a simplified cost-benefit analysis within the context of different protection projects. This application allows the integration of the different layers which are necessary to calculate risk, in particular, hazard intensity (vector) maps for different scenarios (such as 30, 100 and 300 years of return periods based on Swiss guidelines), exposed objects (such as buildings) and vulnerability information of these objects. Based on the provided information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. The users can modify this input information and the parameters to create different risk scenarios. Based on the resultant risk scenarios, the users can propose and visualize (preliminary) risk reduction measures before realizing the actual design and dimensions of such protective
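
    A minimal sketch of the simplified risk and cost-benefit calculation such a tool might perform (an assumed structure, not the EconoMe or Valdorisk formulas): annual risk is the frequency-weighted damage summed over hazard scenarios and exposed objects, before and after a protection measure.

```python
# Minimal sketch of a simplified scenario-based risk and cost-benefit check.
# Return periods, object values, vulnerabilities and costs are invented.

scenarios = {  # return period [years] -> list of (object value [CHF], vulnerability 0..1)
    30:  [(500_000, 0.10), (300_000, 0.05)],
    100: [(500_000, 0.30), (300_000, 0.20), (800_000, 0.10)],
    300: [(500_000, 0.60), (300_000, 0.50), (800_000, 0.40)],
}

def annual_risk(scenarios, reduction=0.0):
    """Frequency-weighted damage summed over scenarios; `reduction` mimics a measure."""
    risk = 0.0
    for return_period, objects in scenarios.items():
        frequency = 1.0 / return_period
        damage = sum(value * vuln * (1.0 - reduction) for value, vuln in objects)
        risk += frequency * damage
    return risk

before = annual_risk(scenarios)
after = annual_risk(scenarios, reduction=0.7)   # measure assumed to cut damage by 70%
annual_cost = 20_000.0                          # assumed annualized cost of the measure
print(f"risk before: {before:,.0f} CHF/yr, after: {after:,.0f} CHF/yr")
print(f"benefit/cost ratio: {(before - after) / annual_cost:.2f}")
```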

  11. Preliminary Integrated Safety Analysis Status Report

    Energy Technology Data Exchange (ETDEWEB)

    D. Gwyn

    2001-04-01

    This report provides the status of the potential Monitored Geologic Repository (MGR) Integrated Safety Analysis (ISA) by identifying the initial work scope scheduled for completion during the ISA development period, the schedules associated with the tasks identified, safety analysis issues encountered, and a summary of accomplishments during the reporting period. This status covers the period from October 1, 2000 through March 30, 2001.

  12. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are connected with notable tool wear. On the other hand, thermal processing methods are critical as the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damage of the composite structure in terms of pores or delamination. An emerging innovative method for processing of CFRP materials is laser technology. As a principally thermal method, laser processing is connected with the release of potentially hazardous, gaseous and particulate substances. Detailed knowledge of these process emissions is the basis to ensure the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) in the course of the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  13. Risk analysis procedure for post-wildfire natural hazards in British Columbia

    Science.gov (United States)

    Jordan, Peter

    2010-05-01

    preliminary map of vegetation burn severity if desired. The next steps include mapping catchment boundaries, field traverses to collect data on soil burn severity and water repellency, identification of unstable hillslopes and channels, and inspection of values at risk from hazards such as debris flows or flooding. BARC (burned area reflectance classification) maps based on satellite imagery are prepared for some fires, although these are typically not available for several weeks. Our objective is to make a preliminary risk analysis report available about two weeks after the fire is contained. If high risks to public safety or infrastructure are identified, the risk analysis reports may make recommendations for mitigation measures to be considered; however, acting on these recommendations is the responsibility of local land managers, local government, or landowners. Mitigation measures for some fires have included engineering treatments to reduce the hydrologic impact of logging roads, protective structures such as dykes or berms, and straw mulching to reduce runoff and erosion on severely burned areas. The Terrace Mountain Fire, which burned 9000 hectares in the Okanagan Valley in 2009, is used as an example of the application of the procedure.

  14. Los Alamos National Laboratory corrugated metal pipe saw facility preliminary safety analysis report. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-09-19

    This Preliminary Safety Analysis Report addresses site assessment, facility design and construction, and design operation of the processing systems in the Corrugated Metal Pipe Saw Facility with respect to normal and abnormal conditions. Potential hazards are identified, credible accidents relative to the operation of the facility and the process systems are analyzed, and the consequences of postulated accidents are presented. The risk associated with normal operations, abnormal operations, and natural phenomena are analyzed. The accident analysis presented shows that the impact of the facility will be acceptable for all foreseeable normal and abnormal conditions of operation. Specifically, under normal conditions the facility will have impacts within the limits posted by applicable DOE guidelines, and in accident conditions the facility will similarly meet or exceed the requirements of all applicable standards. 16 figs., 6 tabs.

  15. Hazard Detection Analysis for a Forward-Looking Interferometer

    Science.gov (United States)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew; hide

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements that were conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards

  16. Fort Drum Preliminary Fiscal Impact Analysis.

    Science.gov (United States)

    1986-02-01

    of inmigrants; fiscal histories, projections, and impacts for counties, cities, towns, villages, school districts, and the state. The results of...distribution of the inmigrating population within the three counties. Thus, an accurate forecast of the expected distribution of the inmigrating population is a...The distribution of inmigration to the school districts was made using the analysis explained in Chapter 3. Children associated with 800 new on-post

  17. Preliminary analysis of turbochargers rotors dynamic behaviour

    Science.gov (United States)

    Monoranu, R.; Ştirbu, C.; Bujoreanu, C.

    2016-08-01

    Turbocharger rotors for spark and compression ignition engines are manufactured from heat-resistant steels in order to withstand exhaust gas temperatures exceeding 1200 K. In fact, the mechanical stress is not large, as the power consumption of these systems is up to 10 kW, but the operating speeds are high, ranging between 30000 ÷ 250000 rpm. Therefore, correct turbocharger functioning involves, even from the design stage, the accurate evaluation of the temperature effects, of the turbine torque due to the engine exhaust gases and of the vibration system behaviour caused by very high operating speeds. In addition, the turbocharger lubrication complicates the model, because the classical hydrodynamic theory cannot be applied to evaluate the floating bush bearings. The paper proposes a FEM study using the CATIA environment, both as modeling medium and as tool for the numerical analysis, in order to highlight the turbocharger's complex behaviour. An accurate design may prevent some major issues which can occur during its operation.

  18. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, R.D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
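
    A minimal sketch of this style of Monte Carlo reliability analysis, with invented distributions and permit limit (not the study's data): influent concentration and removal efficiency are sampled, and reliability is the fraction of realizations meeting the effluent limit.

```python
# Minimal sketch: Monte Carlo reliability of a single treatment process.
# Distributions, permit limit and sample size are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(7)
n = 100_000

influent = rng.lognormal(mean=np.log(50.0), sigma=0.35, size=n)        # mg/L
removal = np.clip(rng.normal(loc=0.95, scale=0.02, size=n), 0, 0.999)  # fraction removed
permit_limit = 5.0                                                      # mg/L

effluent = influent * (1.0 - removal)
reliability = np.mean(effluent <= permit_limit)
safety_factor = permit_limit / np.median(effluent)

print(f"median effluent: {np.median(effluent):.2f} mg/L")
print(f"implied safety factor: {safety_factor:.1f}")
print(f"reliability (fraction of realizations within permit): {reliability:.3f}")
```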

  19. Hazardous Glaciers In Switzerland: A Statistical Analysis of Inventory Data

    Science.gov (United States)

    Raymond, M.; Funk, M.; Wegmann, M.

    Because of the recent increase in both occupation and economic activities in high mountain areas, a systematic overview of potential glacier hazard zones is needed to avoid the construction of settlements and infrastructure in endangered areas in the future. Historical information about glacier disasters shows that catastrophic events can happen repeatedly from the same causes and with the same dramatic consequences. Past catastrophic events are not only useful to identify potentially dangerous glaciers, but represent an indication of the kind of glacier hazards to expect for any given glacier. An inventory containing all known events having caused damage in the past has been compiled for Switzerland. Three different types of glacier hazards are distinguished, e.g. ice avalanches, glacier floods and glacier length changes. Hazardous glaciers have been identified in the alpine cantons of Bern, Grison, Uri, Vaud and Valais so far. The inventory data were analysed in terms of the periodicity of different types of events as well as of the damage that occurred.

  20. Preliminary evaluation of PETC-coal conversion solid and hazardous wastes. Progress report, September 15, 1977--September 30, 1978

    Energy Technology Data Exchange (ETDEWEB)

    Neufeld, R.D.; Shapiro, M.; Chen, C.; Wallach, S.; Sain, S.

    1978-09-30

    This progress report reviews issues and local area practice relative to the disposal of small quantity laboratory solid and chemical wastes from the PETC site. Research efforts to date have been in two major directions, a) solid and hazardous waste problems relative to PETC, and b) solid and hazardous waste problems relative to coal gasification and liquefaction conversion processes. It is intended that bench scale coal conversion processes located at PETC be considered as small but typical models for residuals sample generation. A literature search activity has begun in order to develop a data bank of coal conversion residual characterizations, and identify other centers of hazardous waste handling research expertise.

  1. Preliminary Analysis of Helicopter Options to Support Tunisian Counterterrorism Operations

    Science.gov (United States)

    2016-04-27

    results of the current analysis and in Mouton et al., 2015, is the relative cost-effectiveness between the CH-47D and the Mi-17v5. In the previous...helicopters from Sikorsky to fulfill a number of roles in counterterrorism operations. Rising costs and delays in delivery raised the question of...whether other cost-effective options exist to meet Tunisia’s helicopter requirement. Approach Our team conducted a preliminary assessment of

  2. Estimating Source Recurrence Rates for Probabilistic Tsunami Hazard Analysis (PTHA)

    Science.gov (United States)

    Geist, E. L.; Parsons, T.

    2004-12-01

    A critical factor in probabilistic tsunami hazard analysis (PTHA) is estimating the average recurrence rate for tsunamigenic sources. Computational PTHA involves aggregating runup values derived from numerical simulations for many far-field and local sources, primarily earthquakes, each with a specified probability of occurrence. Computational PTHA is the primary method used in the ongoing FEMA pilot study at Seaside, Oregon. For a Poissonian arrival time model, the probability for a given source is dependent on a single parameter: the mean inter-event time of the source. In other probability models, parameters such as aperiodicity are also included. In this study, we focus on methods to determine the recurrence rates for large, shallow subduction zone earthquakes. For earthquakes below about M=8, recurrence rates can be obtained from modified Gutenberg-Richter distributions that are constrained by the tectonic moment rate for individual subduction zones. However, significant runup from far-field sources is commonly associated with the largest magnitude earthquakes, for which the recurrence rates are poorly constrained by the tail of empirical frequency-magnitude relationships. For these earthquakes, paleoseismic evidence of great earthquakes can be used to establish recurrence rates. Because the number of geologic horizons representing great earthquakes along a particular subduction zone is limited, special techniques are needed to account for open intervals before the first and after the last observed events. Uncertainty in age dates for the horizons also has to be included in estimating recurrence rates and aperiodicity. A Monte Carlo simulation is performed in which a random sample of earthquake times is drawn from a specified probability distribution with varying average recurrence rates and aperiodicities. A recurrence rate can be determined from the mean rate of all random samples that fit the observations, or a range of rates can be carried through the
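
    A minimal sketch of the Monte Carlo idea, with invented record length and event count (not the authors' code): trial recurrence rates are kept when a simulated Poisson count matches the observed number of paleoseismic horizons; a fuller treatment would sample inter-event times, include aperiodicity, and handle the open intervals explicitly.

```python
# Minimal sketch: accept trial mean recurrence rates whose simulated Poisson
# event count over the paleoseismic record matches the observed count.
# Record length and event count are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

record_length = 3500.0     # years of paleoseismic record (illustrative)
observed_events = 7        # horizons interpreted as great earthquakes

trial_rates = rng.uniform(1 / 2000, 1 / 100, size=200_000)   # events per year
counts = rng.poisson(trial_rates * record_length)
accepted = trial_rates[counts == observed_events]

print(f"accepted samples: {accepted.size}")
print(f"recurrence interval from mean accepted rate: {1 / accepted.mean():.0f} yr")
print(f"90% range: {1 / np.quantile(accepted, 0.95):.0f} - {1 / np.quantile(accepted, 0.05):.0f} yr")
```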

  3. Hazard analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, D.J.; Brehm, J.R.

    1994-01-25

    This hazard analysis (HA) has been prepared for the 300 Area N Reactor Fuel Fabrication and Storage Facility (Facility), in compliance with the requirements of Westinghouse Hanford Company (Westinghouse Hanford) controlled manual WHC-CM-4-46, Nonreactor Facility Safety Analysis Manual, and to the direction of WHC-IP-0690, Safety Analysis and Regulation Desk Instructions, (WHC 1992). An HA identifies potentially hazardous conditions in a facility and the associated potential accident scenarios. Unlike the Facility hazard classification documented in WHC-SD-NR-HC-004, Hazard Classification for 300 Area N Reactor Fuel Fabrication and Storage Facility, (Huang 1993), which is based on unmitigated consequences, credit is taken in an HA for administrative controls or engineered safety features planned or in place. The HA is the foundation for the accident analysis. The significant event scenarios identified by this HA will be further evaluated in a subsequent accident analysis.

  4. RHDM procedure for analysis of the potential specific risk due to a rockfall hazard

    Directory of Open Access Journals (Sweden)

    Blažo Đurović

    2005-06-01

    The theoretical basis and practical legislation (the Water Law and regulation acts) will in future allow the determination and classification of territorial zones endangered by various natural hazards, among them rock collapse and rockfall as forms of mass movement hazard. Interdisciplinary risk analysis, assessment and management of natural hazards are factors of harmonious spatial development in the future. Risk analysis in particular is an essential part of preventive mitigation actions and forms the basis for evaluation of spatial plans, programs and policies. In accordance with the basic principles of risk analysis, the Rockfall Hazard Determination Method (RHDM) for estimating the degree of potential specific risk due to a rockfall hazard along roadways and in the hinterland is introduced. The method is derived from the Rockfall Hazard Rating System (RHRS) and adjusted to a holistic concept of the risk analysis procedure. The outcomes of phenomenon simulation with a computer programme for rock mass movement analysis at the local scale are included, as well as newly introduced climate and seismic condition criteria, making this method more adequate for the specific geologic conditions of Slovenia.

  5. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-04-26

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... the proposed rule, ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  6. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-02-19

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  7. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-11-20

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... 3646), entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk- Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based...

  8. Development of hazard analysis by critical control points (HACCP) procedures to control organic chemical hazards in the agricultural production of raw food commodities.

    Science.gov (United States)

    Ropkins, Karl; Ferguson, Andrew; Beck, Angus J

    2003-01-01

    Hazard Analysis by Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards in the food chain. Effective HACCP requires the consideration of all chemical, microbiological, and physical hazards. However, current procedures focus primarily on microbiological and physical hazards, while chemical aspects of HACCP have received relatively little attention. In this article we discuss the application of HACCP to organic chemical contaminants and the problems that are likely to be encountered in agriculture. We also present generic templates for the development of organic chemical contaminant HACCP procedures for selected raw food commodities, that is, cereal crops, raw meats, and milk.

  9. Hazard Analysis and Critical Control Point Program for Foodservice Establishments.

    Science.gov (United States)

    Control Point (HACCP) inspections in foodservice operations throughout the state. The HACCP system, which first emerged in the late 1960s, is a rational...has been adopted for use in the foodservice industry. The HACCP system consists of three main components, which are: (1) assessment of the hazards...to monitor critical control points. This system has shown promise as a tool to reduce the frequency of foodborne disease outbreaks in foodservice

  10. Rockfall hazard analysis using LiDAR and spatial modeling

    Science.gov (United States)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.

  11. [Investigation and analysis on occupational hazards in a carbon enterprise].

    Science.gov (United States)

    Lu, C D; Ding, Q F; Wang, Z X; Shao, H; Sun, X C; Zhang, F

    2017-04-20

    Objective: To explore the occupational disease hazards in a carbon enterprise workplace and the occupational health examination of its personnel, providing a basis for occupational disease prevention and control in this industry. Methods: A field occupational health survey and testing methods were used to study the situation and degree of occupational disease hazards in the carbon enterprise from 2013 to 2015. Occupational health monitoring was used for workers; physical examinations, detection of occupational hazard factors, and physical examination results were analyzed comprehensively. Results: Dust, coal tar pitch volatiles, and noise in the carbon enterprise were more serious than other hazards. Among them, the rate of coal tar pitch volatiles exceeding the standard was 76.67%, the maximum point detection was 1.06 mg/m(3), and the maximum individual detection was 0.67 mg/m(3). There was no statistical difference among the 3 years (P>0.05). There were no significant differences in the rates of abnormal findings in the occupational health examinations (chest X-ray, skin examination, audiometry, blood routine, blood pressure, electrocardiogram) between the 3 years (P>0.05), although the abnormal rates for skin and audiometry were higher than 10% per year. Conclusion: Dust, coal tar pitch volatiles, and noise are the main occupational hazard factors of the carbon enterprise, and the corresponding protection should be strengthened.

  12. Solvent substitution: an analysis of comprehensive hazard screening indices.

    Science.gov (United States)

    Debia, M; Bégin, D; Gérin, M

    2011-06-01

    The air index (ψ(i)(air)) of the PARIS II software (Environmental Protection Agency), the Indiana Relative Chemical Hazard Score (IRCHS), and the Final Hazard Score (FHS) used in the P2OASys system (Toxics Use Reduction Institute) are comprehensive hazard screening indices that can be used in solvent substitution. The objective of this study was to evaluate these indices using a list of 67 commonly used or recommended solvents. The indices ψ(i)(air), IRCHS and FHS were calculated considering 9, 13, and 33 parameters, respectively, that summarized health and safety hazards, and environmental impacts. Correlation and sensitivity analyses were performed. The vapor hazard ratio (VHR) was used as a reference point. Two good correlations were found: (1) between VHR and ψ(i)(air) (ρ = 0.84), (2) and between IRCHS and FHS (ρ = 0.81). Values of sensitivity ratios above 0.2 were found with ψ(i)(air) (4 of 9 parameters) and IRCHS (3 of 13 parameters), but not with FHS. Overall, the three indices exhibited important differences in the way they integrate key substitution factors, such as volatility, occupational exposure limit, skin exposure, flammability, carcinogenicity, photochemical oxidation potential, atmospheric global effects, and environmental terrestrial and aquatic effects. These differences can result in different choices of alternatives between indices, including the VHR. IRCHS and FHS are the most comprehensive indices but are very tedious and complex to use and lack sensitivity to several solvent-specific parameters. The index ψ(i)(air) is simpler to calculate but does not cover some parameters important to solvents. There is presently no suitably comprehensive tool available for the substitution of solvents. A two-tier approach for the selection of solvents is recommended to avoid errors that could be made using only a global index or the consideration of the simple VHR. As a first tier, one would eliminate solvent candidates having crucial impacts. As a

  13. Hazard Analysis and Risk Assessment for an Automated Unmanned Protective Vehicle

    OpenAIRE

    Stolte, Torben; Bagschik, Gerrit; Reschka, Andreas; Maurer, and Markus

    2017-01-01

    For future application of automated vehicles in public traffic, ensuring functional safety is essential. In this context, a hazard analysis and risk assessment is an important input for designing functionally safe vehicle automation systems. In this contribution, we present a detailed hazard analysis and risk assessment (HARA) according to the ISO 26262 standard for a specific Level 4 application, namely an unmanned protective vehicle operated without human supervision for motorway hard shoulder r...

  14. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    Science.gov (United States)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific Risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake to the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
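
    A minimal worked sketch of the risk product described above, with invented numbers: expected annual loss is hazard (annual probability of the scenario shaking) times vulnerability (damage ratio) times exposure (replacement value).

```python
# Minimal worked sketch of Risk = Hazard x Vulnerability x Exposure for a small
# building portfolio under one shaking scenario. All numbers are invented.

# annual probability of the scenario ground motion at each site
hazard = {"site_1": 0.01, "site_2": 0.002}

# damage ratio expected for the construction class at that shaking level
vulnerability = {"site_1": 0.25, "site_2": 0.40}

# replacement value of exposed construction (USD)
exposure = {"site_1": 2_000_000, "site_2": 5_000_000}

expected_annual_loss = {
    site: hazard[site] * vulnerability[site] * exposure[site] for site in hazard
}

for site, loss in expected_annual_loss.items():
    print(f"{site}: expected annual loss = ${loss:,.0f}")
print(f"portfolio total: ${sum(expected_annual_loss.values()):,.0f}")
```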

  15. A simple tool for preliminary hazard identification and quick assessment in craftwork and small/medium enterprises (SME).

    Science.gov (United States)

    Colombini, Daniela; Occhipinti, E; Di Leone, G

    2012-01-01

    During the last Congress of the International Ergonomics Association (IEA), Beijing, August 2009, an international group was founded aimed at developing a “toolkit for MSD prevention” within the IEA and in collaboration with the World Health Organization (WHO). Possible users of toolkits are: members of health and safety committees, health and safety representatives, line supervisors, labor inspectors, health workers implementing basic occupational health services, and occupational health and safety specialists. According to the ISO 11228 standard series and the new Draft CD ISO 12259-2009 (application document guides for the potential user), a computer application (in Excel®) was created dealing with hazard “mapping” in craftwork. The proposed methodology, using specific key entries and quick assessment criteria, allows simple ergonomics hazard identification and risk estimation. It thus makes it possible to decide for which occupational hazards a more exhaustive risk assessment will be necessary and which professional consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.).

  16. Landslide hazard zonation assessment using GIS analysis at Golmakan Watershed, northeast of Iran

    Institute of Scientific and Technical Information of China (English)

    Mohammad Reza MANSOURI DANESHVAR; Ali BAGHERZADEH

    2011-01-01

    Landslide hazard is one of the major environmental hazards in geomorphic studies in mountainous areas. To help planners in the selection of suitable locations to implement development projects, a landslide hazard zonation map has been produced for the Golmakan Watershed as part of the Binaloud northern hillsides (northeast of Iran). For this purpose, after preparation of a landslide inventory of the study area, some 15 major parameters were examined for integrated analysis of landslide hazard in the region. The analyses of parameters were done by geo-referencing and lateral model making, satellite imaging of the study area, and spatial analyses using a geographical information system (GIS). The produced factor maps were weighted with the analytic hierarchy process (AHP) method and then classified. The study area was classified into four classes of relative landslide hazard: negligible, low, moderate, and high. The final map of landslide hazard zonation in the Golmakan Watershed revealed that: 1) the parameters of land slope and geologic formation have strong correlations (R2 = 0.79 and 0.83, respectively) with the dependent variable landslide hazard (p < 0.05); 2) about 18.8% of the study area has low and negligible hazards of future landslides, while 81.2% of the land area of the Golmakan Watershed falls into the high and moderate categories.
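
    A minimal sketch of the AHP weighting step, with an invented pairwise comparison matrix for three of the factor maps: weights come from the principal eigenvector, and the consistency ratio is checked.

```python
# Minimal sketch: derive AHP weights for three factor maps from a reciprocal
# pairwise comparison matrix. The judgments below are invented placeholders.

import numpy as np

factors = ["slope", "geology", "land use"]
# entry [i, j] = relative importance of factor i over factor j
A = np.array([
    [1.0, 2.0, 4.0],
    [1/2, 1.0, 3.0],
    [1/4, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# consistency check (random index RI = 0.58 for a 3x3 matrix)
CI = (eigvals.real[k] - len(A)) / (len(A) - 1)
CR = CI / 0.58

for f, w in zip(factors, weights):
    print(f"{f}: weight = {w:.3f}")
print(f"consistency ratio = {CR:.3f} (acceptable if < 0.1)")
```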

  17. Progresses in geology and hazards analysis of Tianchi Volcano

    Institute of Scientific and Technical Information of China (English)

    WEI Hai-quan; JIN Bo-lu; LIU Yong-shun

    2004-01-01

    A number of different lahars have been recognized from a systematic survey within a mapping project. The high setting temperature of the deposits indicates a relationship between the lahars and the Millennium eruption event of Tianchi Volcano. The lahars caused a dramatic disaster. Recognition of the huge avalanche scars and deposits around Tianchi Volcano implies another highly destructive hazard. Three types of avalanche deposits with different textures have been recognized. Magma mixing processes often occurred during the Millennium eruption of Tianchi Volcano, indicating a magma-mixing and co-eruption regime.

  18. Debris flow and landslide hazard mapping and risk analysis in China

    Institute of Scientific and Technical Information of China (English)

    Xilin LIU; Chengjun YU; Peijun SHI; Weihua FANG

    2012-01-01

    This paper assesses the hazardousness, vulnerability and risk of debris flow and landslide in China and compiles maps at a scale of 1:6,000,000, based on Geographical Information System (GIS) technology, a hazard regionalization map, and socioeconomic data from 2000. Integrated hazardousness of debris flow and landslide is equivalent to the sum of debris flow hazardousness and landslide hazardousness. Vulnerability is assessed by employing a simplified assessment model. Risk is calculated by the following formula: Risk = Hazardousness × Vulnerability. The analysis results of the assessment of hazardousness, vulnerability and risk show that there are extremely high risk regions of 104 km2, high risk regions of 283,008 km2, moderate risk regions of 3,161,815 km2, low risk regions of 3,299,604 km2, and extremely low risk regions of 2,681,709 km2. Exploitation activities should be prohibited in extremely high risk and high risk regions and restricted in moderate risk regions. The present study on risk analysis of debris flow and landslide not only sheds new light on future work in this direction but also provides a scientific basis for disaster prevention and mitigation policy making.
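
    The formula Risk = Hazardousness × Vulnerability is a cell-by-cell (map algebra) product; a minimal sketch with small hypothetical arrays standing in for the GIS rasters, followed by classification into risk classes with illustrative break values:

        import numpy as np

        hazardousness = np.array([[0.1, 0.4, 0.8],
                                  [0.2, 0.6, 0.9],
                                  [0.0, 0.3, 0.7]])
        vulnerability = np.array([[0.2, 0.5, 0.9],
                                  [0.1, 0.4, 0.8],
                                  [0.0, 0.2, 0.6]])

        risk = hazardousness * vulnerability          # Risk = Hazardousness x Vulnerability

        breaks = [0.05, 0.15, 0.35, 0.60]             # illustrative class breaks only
        classes = np.digitize(risk, breaks)           # 0 = extremely low ... 4 = extremely high
        print(classes)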

  19. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    Science.gov (United States)

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station includes various hazard factors, hazard assessment is needed and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed to be the weight of index. In contrast to the index weight of other methods, cloud weight is shown by cloud descriptors; hence, the randomness and fuzziness of cloud weight will make it effective to reflect the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD); the calculation algorithm of CCD is also worked out. By utilizing the CCD, the hazard assessment results are shown by some normal clouds, and the normal clouds are reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP based SPA, respectively. The comparison of assessment results illustrates that the CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA will make the assessment results more reasonable and scientific. PMID:28076440
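
    The "cloud descriptors" referred to above are, in the standard cloud model, the expectation Ex, entropy En and hyper-entropy He of a normal cloud; a minimal forward normal cloud generator (descriptor values hypothetical) looks like this:

        import numpy as np

        def normal_cloud(ex, en, he, n_drops=1000, seed=0):
            """Generate cloud drops and their certainty degrees for descriptors (Ex, En, He)."""
            rng = np.random.default_rng(seed)
            en_prime = np.abs(rng.normal(en, he, n_drops))        # randomized entropy
            x = rng.normal(ex, en_prime)                          # cloud drops
            mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))   # certainty degree of each drop
            return x, mu

        drops, certainty = normal_cloud(ex=3.0, en=0.5, he=0.05)
        print(drops[:5], certainty[:5])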

  1. NRT Rotor Structural / Aeroelastic Analysis for the Preliminary Design Review

    Energy Technology Data Exchange (ETDEWEB)

    Ennis, Brandon Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Paquette, Joshua A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    This document describes the initial structural design for the National Rotor Testbed blade as presented during the preliminary design review at Sandia National Laboratories on October 28-29, 2015. The document summarizes the structural and aeroelastic requirements placed on the NRT rotor for satisfactory deployment at the DOE/SNL SWiFT experimental facility to produce high-quality datasets for wind turbine model validation. The method and results of the NRT blade structural optimization are also presented within this report, along with an analysis of how well they satisfy the design requirements.

  2. Preliminary Analysis of ULPC Light Curves Using Fourier Decomposition Technique

    CERN Document Server

    Ngeow, Chow-Choong; Kanbur, Shashi; Barrett, Brittany; Lin, Bin

    2013-01-01

    Recent work on Ultra Long Period Cepheids (ULPCs) has suggested their usefulness as a distance indicator, but has not commented on their relationship to other types of variable stars. In this work, we use Fourier analysis to quantify the structure of ULPC light curves and compare them to Classical Cepheids and Mira variables. Our preliminary results suggest that the low-order Fourier parameters of ULPCs follow the continuous trend defined by Classical Cepheids beyond the resonance around 10 days. However, their Fourier parameters also overlap with those of Miras, which makes the classification of long-period variable stars difficult based on light curve information alone.
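
    The Fourier quantification used above can be sketched as a least-squares fit of a low-order Fourier series to the phased light curve, from which amplitude ratios and phase differences (e.g. R21, phi21) follow; the phased magnitudes below are synthetic.

        import numpy as np

        def fourier_decompose(phase, mag, order=4):
            """Fit mag(phase) = A0 + sum_k [a_k cos(2*pi*k*phase) + b_k sin(2*pi*k*phase)]."""
            cols = [np.ones_like(phase)]
            for k in range(1, order + 1):
                cols += [np.cos(2 * np.pi * k * phase), np.sin(2 * np.pi * k * phase)]
            coef, *_ = np.linalg.lstsq(np.column_stack(cols), mag, rcond=None)
            a, b = coef[1::2], coef[2::2]
            amp = np.hypot(a, b)                     # amplitudes A_k
            phi = np.arctan2(-b, a)                  # phases phi_k (one common convention)
            return amp, phi

        rng = np.random.default_rng(1)
        phase = rng.uniform(0, 1, 300)
        mag = 12.0 + 0.4 * np.cos(2 * np.pi * phase) + 0.1 * np.cos(4 * np.pi * phase + 0.8)
        amp, phi = fourier_decompose(phase, mag + rng.normal(0, 0.01, phase.size))
        print("R21 =", amp[1] / amp[0], " phi21 =", (phi[1] - 2 * phi[0]) % (2 * np.pi))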

  3. Determinants of Trade Credit: A Preliminary Analysis on Construction Sector

    Directory of Open Access Journals (Sweden)

    Nicoleta Barbuta-Misu

    2016-07-01

    This paper introduces a preliminary analysis of the correlations between trade credit and some selected measures of financial performance for a sample of 958 firms acting in the construction sector. The examined period covers 2004-2013. The sample, derived from the Amadeus database, contains firms that have sold and bought on credit. Results showed that larger firms offered and used more credit than their counterparties. Firms offered and used credit at the same time, but not at the same level. Firms with higher return on assets and profit margin used less credit from suppliers and offered less credit to clients. Moreover, more liquid firms made less use of trade payables.

  4. Comparison of Hazard Analysis Requirements for Instrumentation and Control System of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo [KAERI, Daejeon (Korea, Republic of); Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2014-08-15

    A hazard, in general, is defined as 'potential for harm.' In this paper, the scope of 'harm' is limited to the loss of a safety function in a Nuclear Power Plant (NPP). The Hazard Analysis (HA) of Instrumentation and Control (I and C) systems is intended to identify the relationship between the logical faults, errors, and failures of I and C systems and the physical harm to the nuclear power plant, and also to find the impact of external hazards, e.g., a tsunami, on the I and C systems. This paper includes a survey of the existing hazard analysis requirements in the nuclear industries. The purpose of the paper is to compare the HA requirements in various international standards in the nuclear domain, specifically the safety requirements and guidance for the instrumentation and control systems of nuclear power plants from IAEA, IEC, IEEE, and NRC.

  5. Two-dimensional hazard estimation for longevity analysis

    DEFF Research Database (Denmark)

    Fledelius, Peter; Guillen, M.; Nielsen, J.P.

    2004-01-01

    We investigate developments in Danish mortality based on data from 1974-1998 working in a two-dimensional model with chronological time and age as the two dimensions. The analyses are done with non-parametric kernel hazard estimation techniques. The only assumption is that the mortality surface...... the two-dimensional mortality surface. Furthermore we look at aggregated synthetic population metrics as 'population life expectancy' and 'population survival probability'. For Danish women these metrics indicate decreasing mortality with respect to chronological time. The metrics can not directly be used...... for prediction purposes. However, we suggest that life insurance companies use the estimation technique and the cross-validation for bandwidth selection when analyzing their portfolio mortality. The non-parametric approach may give valuable information prior to developing more sophisticated prediction models...
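
    A minimal sketch of two-dimensional kernel hazard estimation of the kind described above: the hazard at an (age, calendar year) point is a kernel-weighted ratio of occurrences (deaths) to exposures; the data, bandwidths and seed are synthetic stand-ins for a real portfolio.

        import numpy as np

        ages, years = np.arange(60, 100), np.arange(1974, 1999)
        rng = np.random.default_rng(2)
        exposure = rng.uniform(500, 1500, (ages.size, years.size))
        true_haz = 0.0005 * np.exp(0.09 * (ages[:, None] - 60)) * np.ones(years.size)
        deaths = rng.poisson(true_haz * exposure)

        def kernel_hazard(age0, year0, h_age=3.0, h_year=3.0):
            """Ratio of kernel-smoothed deaths to kernel-smoothed exposures at (age0, year0)."""
            w = (np.exp(-0.5 * ((ages[:, None] - age0) / h_age) ** 2)
                 * np.exp(-0.5 * ((years[None, :] - year0) / h_year) ** 2))
            return (w * deaths).sum() / (w * exposure).sum()

        print("estimated hazard at age 80 in 1990:", round(kernel_hazard(80, 1990), 4))
        print("true hazard used in the simulation:", round(0.0005 * np.exp(0.09 * 20), 4))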

  6. Towards increased reliability by objectification of Hazard Analysis and Risk Assessment (HARA) of automated automotive systems

    OpenAIRE

    Khastgir, Siddartha; Birrell, Stewart A.; Dhadyalla, Gunwant; Sivencrona, Håkan; Jennings, P. A. (Paul A.)

    2017-01-01

    Hazard Analysis and Risk Assessment (HARA) in various domains like automotive, aviation, process industry etc. suffer from the issues of validity and reliability. While there has been an increasing appreciation of this subject, there have been limited approaches to overcome these issues. In the automotive domain, HARA is influenced by the ISO 26262 international standard which details functional safety of road vehicles. While ISO 26262 was a major step towards analysing hazards and risks, lik...

  7. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which are often confused in their concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis). After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical-methodological differences perceived between them. A review of the literature was performed to support the conceptual and theoretical-methodological discussion. It could be verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  8. Preliminary results of the U.S. Nuclear Regulatory Commission collaborative research program to assess tsunami hazard for nuclear power plants on the Atlantic and gulf coasts

    Science.gov (United States)

    Kammerer, A.M.; ten Brink, Uri S.; Twitchell, David C.; Geist, Eric L.; Chaytor, Jason D.; Locat, J.; Lee, H.J.; Buczkowski, Brian J.; Sansoucy, M.

    2018-01-01

    In response to the 2004 Indian Ocean Tsunami, the United States Nuclear Regulatory Commission (US NRC) initiated a long-term research program to improve understanding of tsunami hazard levels for nuclear facilities in the United States. For this effort, the US NRC organized a collaborative research program with the United States Geological Survey (USGS) and other key researchers for the purpose of assessing tsunami hazard on the Atlantic and Gulf Coasts of the United States. The initial phase of this work consisted principally of the collection, interpretation, and analysis of available offshore data and information. Necessarily, the US NRC research program includes both seismic- and landslide-based tsunamigenic sources in both the near and the far fields. The inclusion of tsunamigenic landslides, an important category of sources that impacts tsunami hazard levels for the Atlantic and Gulf Coasts over the long time periods of interest to the US NRC, is a key difference between this program and most other tsunami hazard assessment programs. Although only a few years old, this program is already producing results that both support current US NRC activities and look toward the long-term goal of probabilistic tsunami hazard assessment. This paper provides a summary of results from several areas of current research. An overview of the broader US NRC research program is provided in a companion paper in this conference.

  9. Preliminary risk analysis applied to the handling of health-care waste

    Directory of Open Access Journals (Sweden)

    Carvalho S.M.L.

    2002-01-01

    Between 75% and 90% of the waste produced by health-care providers poses no risk or is "general" health-care waste, comparable to domestic waste. The remaining 10-25% of health-care waste is regarded as hazardous due to one or more of the following characteristics: it may contain infectious agents, sharps, toxic or hazardous chemicals, or it may be radioactive. Infectious health-care waste, particularly sharps, has been responsible for most of the accidents reported in the literature. In this work the preliminary risk analysis (PRA) technique was used to evaluate practices in the handling of infectious health-care waste. The PRA technique is currently used to identify and evaluate the potential hazards of the activities, products, and services of facilities and industries. The system studied was a health-care establishment with handling practices for infectious waste. Thirty-six procedures related to segregation, containment, internal collection, and storage operations were analyzed. The severity of the consequences of the failures (risks) that can occur from careless management of infectious health-care waste was classified into four categories: negligible, marginal, critical, and catastrophic. The results obtained in this study showed that events with critical consequences, about 80%, may occur during the containment operation, suggesting the need to prioritize this operation. As a result of the methodology applied in this work, a flowchart of the risk series was also obtained. The flowchart shows the events that can occur as a consequence of improper handling of infectious health-care waste, which can cause critical risks such as injuries from sharps and contamination (infection) by pathogenic microorganisms.

  10. Logic-tree Approach for Probabilistic Tsunami Hazard Analysis and its Applications to the Japanese Coasts

    Science.gov (United States)

    Annaka, Tadashi; Satake, Kenji; Sakakiyama, Tsutomu; Yanagisawa, Ken; Shuto, Nobuo

    2007-03-01

    For Probabilistic Tsunami Hazard Analysis (PTHA), we propose a logic-tree approach to construct tsunami hazard curves (relationship between tsunami height and probability of exceedance) and present some examples for Japan for the purpose of quantitative assessments of tsunami risk for important coastal facilities. A hazard curve is obtained by integration over the aleatory uncertainties, and numerous hazard curves are obtained for different branches of logic-tree representing epistemic uncertainty. A PTHA consists of a tsunami source model and coastal tsunami height estimation. We developed the logic-tree models for local tsunami sources around Japan and for distant tsunami sources along the South American subduction zones. Logic-trees were made for tsunami source zones, size and frequency of tsunamigenic earthquakes, fault models, and standard error of estimated tsunami heights. Numerical simulation rather than empirical relation was used for estimating the median tsunami heights. Weights of discrete branches that represent alternative hypotheses and interpretations were determined by the questionnaire survey for tsunami and earthquake experts, whereas those representing the error of estimated value were determined on the basis of historical data. Examples of tsunami hazard curves were illustrated for the coastal sites, and uncertainty in the tsunami hazard was displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves.
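
    The combination of logic-tree branches into mean and fractile hazard curves can be sketched as follows; the branch curves and weights are hypothetical, not the elicited weights of the study.

        import numpy as np

        heights = np.linspace(0.5, 10.0, 20)                  # tsunami height (m)
        # One annual-exceedance curve per logic-tree branch (hypothetical forms).
        branches = np.array([a * np.exp(-heights / b)
                             for a, b in [(0.02, 2.0), (0.01, 2.5), (0.03, 1.5), (0.015, 3.0)]])
        weights = np.array([0.4, 0.3, 0.2, 0.1])              # branch weights, summing to 1

        mean_curve = weights @ branches                       # weighted mean hazard curve

        def fractile(curves, w, q):
            """Weighted q-fractile across branches, evaluated height by height."""
            out = np.empty(curves.shape[1])
            for j in range(curves.shape[1]):
                order = np.argsort(curves[:, j])
                cum = np.cumsum(w[order])
                out[j] = curves[order, j][np.searchsorted(cum, q)]
            return out

        p16, p84 = fractile(branches, weights, 0.16), fractile(branches, weights, 0.84)
        print(mean_curve[:3], p16[:3], p84[:3])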

  11. Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilles Youinou; R. Sonat Sen

    2013-09-01

    The severe accident at the Fukushima Daiichi nuclear plants illustrates the need for continuous improvement through developing and implementing technologies that contribute to safe, reliable and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy-UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front end of the fuel cycle, on reactor operation and on the back end of the fuel cycle are succinctly described, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and could be updated as the designs converge on their respective final versions.

  12. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity, with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high-intensity rainstorms can drive torrential systems past a tipping point, resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is a rising interest in potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria, since precipitation is hypothesized to be the main forcing factor of torrential events; (ii) determining how thresholds vary in space and time; (iii) identifying which internal conditions are critical for susceptibility, since the effect of external triggers is strongly mediated by the internal disposition of catchments to respond; and (iv) assessing whether there is a change in magnitude or frequency in the recent past and what can be expected for the future. The 71 km2 catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on the magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-)conditions can be inferred from a dense station network. Changing bedload transport rates and

  13. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    Science.gov (United States)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  14. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    Science.gov (United States)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. The seismic hazard at Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2,475 years), according to the Taiwan Earthquake Model (TEM), which provides two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, as they provide detailed information about the location, value and vulnerability classification of the exposed elements. The results of this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
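
    The two hazard levels quoted above map to return periods under the usual Poisson occurrence assumption, p = 1 - exp(-t/T), so T = -t / ln(1 - p); a small check:

        import math

        def return_period(p_exceed, t_years=50.0):
            """Return period for probability of exceedance p_exceed in t_years (Poisson model)."""
            return -t_years / math.log(1.0 - p_exceed)

        for p in (0.10, 0.02):
            print(f"{p:.0%} in 50 yr -> return period ~ {return_period(p):.0f} yr")
        # prints ~475 yr and ~2475 yr, the two return periods cited in the record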

  15. Electronic Warfare M-on-N Digital Simulation Logging Requirements and HDF5: A Preliminary Analysis

    Science.gov (United States)

    2017-04-12

    [Report front-matter fragments: NRL Memorandum Report dated 12-04-2017, Advanced Techniques Branch, Tactical..., Donald E. Jarvis, "Electronic Warfare M-on-N Digital Simulation Logging Requirements and HDF5: A Preliminary Analysis". From the introduction: HDF5 technology [Folk] has been]

  16. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva with its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, and the dangers involved have thus necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  17. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    Science.gov (United States)

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively little attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights into measurement error effects are revealed, as opposed to the well-documented results for the Cox proportional hazards model. We propose a class of bias-correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite-sample performance of our methods.
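
    The regression calibration idea mentioned above can be sketched in its simplest classical-error form: the error-prone covariate W = X + U is replaced by the best linear predictor of X given W before the hazard model is fitted. The data and the error variance below are synthetic and assumed known.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 5000
        x = rng.normal(0.0, 1.0, n)        # true covariate (unobserved)
        w = x + rng.normal(0.0, 0.6, n)    # observed, error-prone surrogate

        var_u = 0.36                       # error variance, e.g. from replicate measurements
        lam = (w.var() - var_u) / w.var()  # reliability ratio sigma_x^2 / sigma_w^2
        x_calibrated = w.mean() + lam * (w - w.mean())

        # x_calibrated would then replace w as the covariate in the additive-hazards fit.
        print("estimated reliability ratio:", round(lam, 3))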

  18. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    Science.gov (United States)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.

  19. Preliminary phytochemical analysis of the crude extract of Nephrolepis pectinata leaves

    Directory of Open Access Journals (Sweden)

    Natally Marreiros Gomes

    2017-06-01

    Nephrolepis pectinata, popularly known as paulista fern, ladder-heaven, or cat tail, belongs to the family Davalliaceae. Because of the beauty of their leaf arrangements, ferns are widely commercialized in Brazil; however, studies on their pharmacological potential have not been described in the literature. Thus, the objective of this research was to analyze the phytochemical properties of the crude extract of the leaves of Nephrolepis pectinata. For the phytochemical analysis, the plant material was first collected, a voucher specimen prepared, and the material washed, dried, and ground. Extraction was then performed by the percolation method, followed by the phytochemical analysis. Preliminary phytochemical results showed that the crude extract of the leaves of Nephrolepis pectinata tested positive for reducing sugars, phenols/tannins (catechin tannins), and catechins.

  20. Some preliminary results of a worldwide seismicity estimation: a case study of seismic hazard evaluation in South America

    Directory of Open Access Journals (Sweden)

    C. V. Christova

    2000-06-01

    Global data have been widely used for seismicity and seismic hazard assessment by seismologists. In the present study we evaluate worldwide seismicity in terms of maps of maximum observed magnitude (Mmax), seismic moment (M0) and seismic moment rate. The data set used consists of a complete and homogeneous global catalogue of shallow (h ≤ 60 km) earthquakes of magnitude MS ≥ 5.5 for the time period 1894-1992. In order to construct maps of seismicity and seismic hazard, the parameters a and b derived from the magnitude-frequency relationship were estimated by both a) the least squares and b) the maximum likelihood methods. The values of a and b were determined considering circles centered at each grid point (of a 1° × 1° mesh) with a varying radius, which starts at 30 km and increases in steps of 10 km. Only a and b values which fulfill some predefined conditions were considered in the further procedure for evaluating the seismic hazard maps. The obtained worldwide Mmax distribution in general delineates the contours of the plate boundaries. The highest observed values of Mmax are along the circum-Pacific belt and in the Himalayan area. The subduction plate boundaries are characterized by the largest amount of M0, while areas of continental collision are next. The highest values of seismic moment rate (per 1 year and per equal area of 10,000 km2) are found in the Southern Himalayas. The western coasts of the U.S.A., northwestern Canada and Alaska, the Indian Ocean and the eastern rift of Africa are characterized by high values of M0, while most of the Pacific subduction zones have lower values of seismic moment rate. Finally we analyzed the seismic hazard in South America, comparing the convergence slip rate between the Nazca and South American plates predicted by the NUVEL1 model with the average slip rate due to earthquakes. This consideration allows for distinguishing between zones of high and low coupling along the studied convergence
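
    The a and b estimation described above (least squares and maximum likelihood on the magnitude-frequency relation log10 N(>=M) = a - bM) can be sketched with a synthetic catalogue:

        import numpy as np

        rng = np.random.default_rng(4)
        m_c, b_true = 5.5, 1.0                           # completeness magnitude, true b-value
        mags = m_c + rng.exponential(1.0 / (b_true * np.log(10)), 2000)

        # Maximum likelihood (Aki's formula); a half-bin correction to m_c would be
        # added for binned catalogue magnitudes.
        b_ml = np.log10(np.e) / (mags.mean() - m_c)

        # Least squares on the cumulative counts.
        dm = 0.1
        edges = np.arange(m_c, mags.max(), dm)
        n_cum = np.array([(mags >= m).sum() for m in edges])
        mask = n_cum > 0
        slope, a_ls = np.polyfit(edges[mask], np.log10(n_cum[mask]), 1)
        print("b (maximum likelihood):", round(b_ml, 2), " b (least squares):", round(-slope, 2))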

  1. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    [Fragments from the report's front matter and tables: HAZOP guide words for software or system interface analysis (e.g., "Subtle Incorrect" -- the output's value is wrong but cannot be detected), an example system-of-systems architecture table, and HAZOP planning steps (establish analysis goals, definitions, worksheets, schedule and process; divide the ...).]

  2. Preliminary Results of Bedrock Variations in the Tekirdag Region (NW Turkey) by Multidisciplinary Geophysical Methods for Earthquake Hazard Mitigation

    Science.gov (United States)

    Tuncer, M. K.; Arslan, M. S.; Ozel, A. O.; İşseven, T.; Genc, T.; Aksahin, B. B.

    2016-12-01

    As is well known, the North Anatolian Fault Zone is highly capable of producing destructive earthquakes. Hence, earthquake hazard mitigation studies are very important for urban areas that are close to major faults. From this point of view, multidisciplinary geophysical methods play an important role in the study of seismic hazard problems, including seismotectonic zoning. Our study area, the Tekirdag region, located at the western end of the North Anatolian Fault Zone, is quite close to the North Anatolian Fault, which is capable of producing a large earthquake. We carried out research on the determination of bedrock variations in the Tekirdag region using multidisciplinary geophysical methods. This research has been performed in the frame of a national project, which is a complementary project to the joint project between Turkey and Japan (JICA&JST) named "Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education". Microgravity and magnetic measurements were performed on seven profiles of 45 km to 60 km length. We attempt to map the variations in bedrock and its geologic structure along the profiles. According to the results, the bedrock deepens toward the north in the north-south direction and toward the west in the east-west direction. The final target is three-dimensional mapping of the bedrock in the area.

  3. Risk analysis for roadways subjected to multiple landslide-related hazards

    Science.gov (United States)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and risk assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the extra distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at a small (for example national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence (a combined example is sketched below). The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations for increase of two
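
    Once each process is expressed through its annual probability of occurrence, the hazards can be combined for a given road section; a minimal sketch with hypothetical probabilities and costs (independence between hazards is assumed here only for illustration):

        # Hypothetical annual probabilities for one road section.
        p_rockfall, p_debris_flow, p_wall_failure = 0.02, 0.01, 0.005

        # Annual probability that at least one damaging event affects the section.
        p_any = 1.0 - (1.0 - p_rockfall) * (1.0 - p_debris_flow) * (1.0 - p_wall_failure)

        direct_cost = 250_000     # repair of the roadway, damage to vehicles (per event)
        indirect_cost = 400_000   # diversion, extra distance and time, tolls (per event)
        expected_annual_loss = p_any * (direct_cost + indirect_cost)
        print(round(p_any, 4), round(expected_annual_loss))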

  4. Assessing sensitivity of Probabilistic Seismic Hazard Analysis (PSHA) to fault parameters: Sumatra case study

    Science.gov (United States)

    Omang, A.; Cummins, P. R.; Horspool, N.; Hidayati, S.

    2012-12-01

    Slip rate data and fault geometry are two important inputs in determining seismic hazard, because they are used to estimate earthquake recurrence intervals, which strongly influence the hazard level in an area. However, the uncertainties in slip rates and in fault geometry are rarely considered in probabilistic seismic hazard analysis (PSHA), which is surprising given that slip-rate estimates can vary significantly between data sources (e.g., geological vs. geodetic). We use the PSHA method to assess the sensitivity of seismic hazard to fault slip rates along the Great Sumatran Fault in Sumatra, Indonesia. We consider the epistemic uncertainty of fault slip rate by employing logic trees to include alternative slip rate models. The weighting of the logic tree is determined by the probability density function of the slip rate estimates, using the approach of Zechar and Frankel (2009). We examine how the PSHA result accounting for slip rate uncertainty differs from that for a specific slip rate by comparing hazard values as a function of return period and distance from the fault. We also consider the geometry of the fault, especially the top and bottom of the rupture area within a fault, to study the effect of different depths. Based on the results of this study, in some cases the uncertainties in fault slip rates, fault geometry and maximum magnitude have a significant effect on the hazard level and the area impacted by earthquakes, and should be considered in PSHA studies.
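
    Setting branch weights from the probability density of the slip-rate estimate, in the spirit of the Zechar and Frankel approach cited above, can be sketched as follows; the slip-rate mean and standard deviation are hypothetical.

        from scipy.stats import norm

        mean_rate, sd_rate = 15.0, 3.0           # mm/yr, hypothetical estimate and uncertainty
        branch_rates = [mean_rate - sd_rate, mean_rate, mean_rate + sd_rate]

        # Each branch receives the probability mass of the slip-rate interval it represents.
        edges = [-float("inf"), mean_rate - 0.5 * sd_rate, mean_rate + 0.5 * sd_rate, float("inf")]
        weights = [norm.cdf(edges[i + 1], mean_rate, sd_rate) - norm.cdf(edges[i], mean_rate, sd_rate)
                   for i in range(3)]

        for rate, w in zip(branch_rates, weights):
            print(f"slip rate {rate:4.1f} mm/yr  weight {w:.3f}")
        # weights of roughly 0.31 / 0.38 / 0.31, summing to 1.0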

  5. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    Science.gov (United States)

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-08

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes-closed and eyes-open conditions. We found that participants in the HAC group presented higher beta AP in the centroparietal region, as well as lower beta MF in the frontal and centroparietal regions in the eyes-closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanisms behind alcohol dependence and hazardous alcohol consumption. The similarities could be explained by considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition.

  6. Empirical Projection of Long-Term Coastal Erosion Hazards in Hawaii Under Rising Sea Levels: Preliminary Findings

    Science.gov (United States)

    Anderson, T. R.; Barbee, M.; Fletcher, C. H., II; Romine, B. M.; Lemmo, S.

    2014-12-01

    Chronic erosion dominates the sandy beaches of Hawaii, causing beach loss and narrowing and damaging homes, infrastructure, and critical habitat. Increased rates of sea level rise (SLR) will likely exacerbate these problems. Shoreline managers and other stakeholders need guidance to support long-range planning and adaptation efforts. Despite recent advances in sophisticated numerical models, there remains a need for simple approaches to estimating land areas that are threatened by erosion on decadal-to-century time scales due to SLR. While not as detailed as numerical models, empirical approaches can provide a first-order approximation of shoreline change that may be useful for coastal management and planning. Shoreline managers in Hawaii commonly work with historical data to provide information on coastal erosion. Simple linear regression methods have been especially attractive in Hawaii, where complex reef topography can cause high spatial variability in sediment transport patterns. Yet, facing projected future increases in the rate of SLR, extrapolating historical trends is insufficient. Predictions of shoreline change with SLR commonly employ controversial geometric models (e.g., the Bruun Model) that do not account for the sediment availability and alongshore variability captured in historical data. Furthermore, these two projections often produce conflicting results. We report here on the early results of mapping probability-based erosion hazard areas, determined by combining the extrapolated historical shoreline change model with a geometric model of shoreline response to accelerated SLR (Davidson-Arnott, 2005). A geographic information system is used to explore the intersection between potential erosion hazards, coastal geology, and development patterns. This approach is attractive because it is simple and utilizes existing datasets. Yet, its simplicity implies broad assumptions about the coastal system and leads to large uncertainty in projections. To
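
    The combination described above (extrapolated historical trend plus a geometric response to accelerated SLR) can be sketched as below; the classic Bruun rule is substituted here as a stand-in for the Davidson-Arnott (2005) model used in the record, and all parameter values are hypothetical.

        def projected_recession(years, hist_rate_m_per_yr, slr_m,
                                closure_depth_m, berm_height_m, profile_width_m):
            """Landward retreat (m): historical-trend extrapolation plus a Bruun-type term."""
            historical = hist_rate_m_per_yr * years
            bruun = slr_m * profile_width_m / (closure_depth_m + berm_height_m)
            return historical + bruun

        r = projected_recession(years=50, hist_rate_m_per_yr=0.15, slr_m=0.3,
                                closure_depth_m=8.0, berm_height_m=2.0, profile_width_m=400.0)
        print(f"projected shoreline recession over 50 yr: {r:.1f} m")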

  7. Solar Stirling power generation - Systems analysis and preliminary tests

    Science.gov (United States)

    Selcuk, M. K.; Wu, Y.-C.; Moynihan, P. I.; Day, F. D., III

    1977-01-01

    The feasibility of an electric power generation system utilizing a sun-tracking parabolic concentrator and a Stirling engine/linear alternator is being evaluated. Performance predictions and cost analysis of a proposed large distributed system are discussed. Design details and preliminary test results are presented for a 9.5 ft diameter parabolic dish at the Jet Propulsion Laboratory (Caltech) Table Mountain Test Facility. Low temperature calorimetric measurements were conducted to evaluate the concentrator performance, and a helium flow system is being used to test the solar receiver at anticipated working fluid temperatures (up to 650 or 1200 C) to evaluate the receiver thermal performance. The receiver body is designed to adapt to a free-piston Stirling engine which powers a linear alternator assembly for direct electric power generation. During the next phase of the program, experiments with an engine and receiver integrated into the concentrator assembly are planned.

  8. Primate phylogeny studied by comparative determinant analysis. A preliminary report.

    Science.gov (United States)

    Bauer, K

    1993-01-01

    In this preliminary report the divergence times for the major primate groups are given, calculated from a study by comparative determinant analysis of 69 proteins (equaling 0.1% of the whole genetic information). With an origin of the primate order set at 80 million years before present, the ages of the last common ancestors (LCAs) of man and the major primate groups obtained this way are as follows: Pan troglodytes 5.2; Gorilla gorilla 7.4; Pongo pygmaeus 19.2; Hylobates lar 20.3; Old World monkeys 31.4; Lagothrix lagotricha 46.0; Cebus albifrons 59.5; three lemur species 67.0, and Galago crassicaudatus 73.3 million years. The LCA results and the approach are shortly discussed. A full account of this extended investigation including results on nonprimate mammals and on the determinant structures and the immunologically derived evolutionary rates of the proteins analyzed will be published elsewhere.

  9. Preliminary phytochemical analysis of Actiniopteris radiata (Swartz) Link.

    Directory of Open Access Journals (Sweden)

    R. Manonmani

    2013-06-01

    The objective of the present study was to determine the presence of preliminary phytochemicals in six different solvent extracts of Actiniopteris radiata (Swartz) Link by qualitative screening methods. The solvents used for the extraction of leaf and rhizome powder were ethanol, petroleum ether, chloroform, acetone, DMSO and water. Secondary metabolites such as steroids, triterpenoids, reducing sugars, sugars, alkaloids, phenolic compounds, catechins, flavonoids, saponins, tannins, anthraquinones and amino acids were screened using standard methods. The phytochemical analysis of the ethanolic extracts of both leaf and rhizome revealed the presence of more active constituents than the other solvents. The ethanolic rhizome extracts of Actiniopteris radiata showed a higher amount of phytochemicals when compared with the ethanolic leaf extracts.

  10. A preliminary study of a wake vortex encounter hazard boundary for a B737-100 airplane

    Science.gov (United States)

    Reimer, Heidi M.; Vicroy, Dan D.

    1996-01-01

    A preliminary batch simulation study was conducted to define the wake decay required for a Boeing 737-100 airplane to safely encounter a Boeing 727 wake and land. The baseline six-degree-of-freedom B737 simulation was modified to include a wake model and the strip-theory calculation of the vortex-induced forces and moments. The guidance and control inputs for the airplane were provided by an autoland system. The wake strength and encounter altitude were varied to establish a safe encounter boundary. The wake was positioned such that the desired flight path traversed the core of the port vortex. Various safe landing criteria were evaluated for defining a safe encounter boundary. A sensitivity study was also conducted to assess the effects of encounter model inaccuracies.

  11. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    Science.gov (United States)

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    The concept of Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes, identification of all hazards that are likely to occur in the production establishment, the identification of critical points in the process at which these hazards may be introduced into product and therefore should be controlled, the establishment of critical limits for control at those points, the verification of these prescribed steps, and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  12. Implementation of the Hazard Analysis Critical Control Point (HACCP) System in the Tempe Chip Production Process

    Directory of Open Access Journals (Sweden)

    Rahmi Yuniarti

    2015-06-01

    Malang is one of the industrial centers of tempe chip production. To maintain quality and food safety, an analysis is required to identify the hazards during the production process. This study was conducted to identify the hazards during the production process of tempe chips and to provide recommendations for developing a HACCP system. The phases of the tempe chip production process are slicing the tempe, moving it to the kitchen, coating it with flour dough, frying it in the pan, draining, packaging, and storage. There are three types of potential hazards (biological, physical, and chemical) during the production process. The CCP identification shows that three processes have Critical Control Points: slicing the tempe, immersing the tempe in the flour mixture, and draining. Recommendations for the development of the HACCP system include recommendations related to employee hygiene, supporting equipment, 5-S analysis, and the production layout.

  13. CUSUM control charts based on likelihood ratio for preliminary analysis

    Institute of Scientific and Technical Information of China (English)

    Yi DAI; Zhao-jun WANG; Chang-liang ZOU

    2007-01-01

    To detect and estimate a shift in either the mean, the deviation, or both in preliminary analysis, the most popular statistical process control (SPC) tool is the control chart based on the likelihood ratio test (LRT). Sullivan and Woodall pointed out that the test statistic lrt(n1, n2) is approximately distributed as χ2(2) when the sample sizes n, n1 and n2 are very large, with n1 = 2, 3, ..., n − 2 and n2 = n − n1. So it is inevitable that n1 or n2 is not large. In this paper the limiting distribution of lrt(n1, n2) for fixed n1 or n2 is derived, and exact analytic formulae for evaluating the expectation and the variance of the limiting distribution are also obtained. In addition, the properties of the standardized likelihood ratio statistic slr(n1, n) are discussed. Although slr(n1, n) contains the most important information, slr(i, n) (i ≠ n1) also contains much information. The cumulative sum (CUSUM) control chart can exploit more of this information. We therefore propose two CUSUM control charts based on the likelihood ratio statistics for the preliminary analysis of individual observations. One focuses on detecting shifts in location in the historical data, and the other is more general, detecting a shift in the location, the scale, or both. Moreover, simulation results show that the proposed control charts are, respectively, superior to their competitors not only in the detection of sustained shifts but also in the detection of some other out-of-control situations considered in this paper.
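
    For orientation, a plain tabular CUSUM for a mean shift is sketched below; it illustrates the kind of sequential accumulation the likelihood-ratio-based charts above build on, but it is not the slr(n1, n) chart of the paper. Data are synthetic.

        import numpy as np

        rng = np.random.default_rng(5)
        x = np.concatenate([rng.normal(0, 1, 30), rng.normal(1.0, 1, 20)])  # shift after obs 30

        k, h = 0.5, 5.0              # reference value and decision interval (sigma units)
        c_plus = c_minus = 0.0
        for i, xi in enumerate(x, start=1):
            c_plus = max(0.0, c_plus + xi - k)
            c_minus = max(0.0, c_minus - xi - k)
            if c_plus > h or c_minus > h:
                print("out-of-control signal at observation", i)
                break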

  15. Liquefaction hazard analysis for infrastructure development in gulf of Jakarta

    Science.gov (United States)

    Dinata, Indra A.; Darlan, Yudi; Sadisun, Imam A.; Pindratno, Haris; Saryanto, Agus

    2016-05-01

    The Gulf of Jakarta is an area of active sedimentation, with a wide sediment deposition area on the north coast of Jakarta. Generally, these sediments have not been consolidated, so conditions in this area are an important factor in determining liquefaction. Liquefaction may occur because of earthquakes that cause loss of strength and stiffness in soils. The analysis of liquefaction potential is based on SPT data taken in the Gulf of Jakarta and includes the susceptibility rating and the triggering factors. Liquefaction analysis methods are compared with each other to obtain the factor of safety against liquefaction according to the characteristics of the soil. Liquefaction analysis at the surface uses the susceptibility rating factor (SRF). The SRF method is controlled by four factors: history, geology, composition, and groundwater. Each factor has parameters that determine the value of SRF. From the analysis, the Gulf of Jakarta has liquefaction susceptibility ratings with SRF values of 12-35. These values show that the Gulf of Jakarta is dominated by areas with medium to high susceptibility ratings. High liquefaction susceptibility ratings are concentrated in the coastal area.
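
    Since the individual rating tables behind the SRF are not given in the record, the scoring sketch below is purely hypothetical: ratings for the four controlling factors are summed and the total compared against illustrative class limits.

        # Hypothetical ratings for the four SRF factors named above.
        ratings = {
            "history":      3,    # e.g. no liquefaction reported nearby
            "geology":      9,    # young, unconsolidated deltaic deposits
            "composition":  8,    # loose, saturated silty sand from SPT logs
            "groundwater": 10,    # water table near the surface
        }
        srf = sum(ratings.values())

        # Illustrative class limits only; the study defines its own ranges.
        level = "high" if srf >= 25 else "medium" if srf >= 15 else "low"
        print("SRF =", srf, "->", level, "susceptibility")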

  16. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e., when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - aleatoric uncertainty, being a property of the system under study, cannot be reduced, although practical actions can be taken to circumvent the potentially dangerous effects of such variability; - epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (lab tests or in situ surveys), improving the measurement methods, evaluating the calculation procedure with model tests, or confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we review the major criticisms available in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e
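
    A small sketch of the possibility-distribution idea raised above: if expert information on the safety factor SF is only "around 1.3, certainly between 0.9 and 1.8", it can be encoded as a triangular possibility distribution, and the possibility of failure Pos(SF <= 1.0) read off as the supremum of that distribution below the threshold. All numbers are hypothetical.

        def triangular_possibility(x, low, mode, high):
            """Triangular possibility distribution: 0 outside [low, high], 1 at the mode."""
            if x <= low or x >= high:
                return 0.0
            return (x - low) / (mode - low) if x <= mode else (high - x) / (high - mode)

        def possibility_of_failure(threshold, low, mode, high, n=2001):
            """Pos(SF <= threshold): supremum of the distribution over values below the threshold."""
            if threshold <= low:
                return 0.0
            xs = [low + (threshold - low) * i / (n - 1) for i in range(n)]
            return max(triangular_possibility(x, low, mode, high) for x in xs)

        print(possibility_of_failure(1.0, low=0.9, mode=1.3, high=1.8))   # 0.25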

  17. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    Science.gov (United States)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) the LandScanTM global population distribution and (2) the frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. The analysis has yielded country-level scores and maps displaying the ranking of exposure scores (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).

  18. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    Full Text Available In the last decades, critical systems have increasingly been developed using computers and software, even in the space sector, where the project approach is usually very conservative. In projects for rockets, satellites and their facilities, such as ground support systems and simulators, among other critical operations for a space mission, a hazard analysis must be applied. The ELICERE process was created to perform hazard analysis mainly on critical computer systems, in order to define or evaluate their safety and dependability requirements, and is strongly based on the Hazard and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or to understand the potential hazards of existing systems, improving their functional and non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. At the beginning, the process was designed to be applied manually, in a gradual way. A software tool called PRO-ELICERE has now been developed to facilitate the analysis process and to store the results for reuse in the analysis of other systems. To illustrate how ELICERE and its tool work, a small space-related case study was carried out, based on a hypothetical rocket of the Cruzeiro do Sul family developed by the Instituto de Aeronáutica e Espaço in Brazil.
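
    Since ELICERE builds on FMEA-style reasoning, the sketch below shows a generic risk priority number (RPN) calculation of the kind such a tool might automate. The failure modes and scores are invented for illustration and are not part of the ELICERE/PRO-ELICERE process itself.

```python
# Generic FMEA-style risk priority numbers (RPN) for hypothetical failure modes.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int      # 1 (negligible) .. 10 (catastrophic)
    occurrence: int    # 1 (rare) .. 10 (frequent)
    detection: int     # 1 (easily detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("Loss of telemetry link", severity=8, occurrence=3, detection=4),
    FailureMode("Valve command issued twice", severity=6, occurrence=2, detection=7),
]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name}: RPN = {m.rpn}")
```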

  19. Damage functions for climate-related hazards: unification and uncertainty analysis

    Science.gov (United States)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.
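
    To make the idea of a damage function concrete, the sketch below uses a saturating curve that maps hazard magnitude (here, inundation depth) to relative damage, with a crude parameter band standing in for uncertainty. The functional form and parameter values are assumptions for illustration, not the unified model of the paper.

```python
# Generic damage function: relative damage in [0, 1] as a function of hazard magnitude.
import numpy as np

def damage_fraction(depth_m, scale=1.5, shape=2.0):
    """Saturating depth-damage curve; parameters are illustrative placeholders."""
    return 1.0 - np.exp(-(np.maximum(depth_m, 0.0) / scale) ** shape)

depths = np.linspace(0.0, 3.0, 7)
central = damage_fraction(depths)
low = damage_fraction(depths, scale=2.0)    # optimistic parameter choice
high = damage_fraction(depths, scale=1.0)   # pessimistic parameter choice
for d, lo, c, hi in zip(depths, low, central, high):
    print(f"depth {d:.1f} m: damage {lo:.2f}-{hi:.2f} (central {c:.2f})")
```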

  20. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    Science.gov (United States)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  1. Example process hazard analysis of a Department of Energy water chlorination process

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Section 29 Code of Federal Regulations CFR Part 1910 that added Section 1910.119, entitled ``Process Safety Management of Highly Hazardous Chemicals`` (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18--21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  2. An Earthquake Source Ontology for Seismic Hazard Analysis and Ground Motion Simulation

    Science.gov (United States)

    Zechar, J. D.; Jordan, T. H.; Gil, Y.; Ratnakar, V.

    2005-12-01

    Representation of the earthquake source is an important element in seismic hazard analysis and earthquake simulations. Source models span a range of conceptual complexity - from simple time-independent point sources to extended fault slip distributions. Further computational complexity arises because the seismological community has established so many source description formats and variations thereof, which means that conceptually equivalent source models are often expressed in different ways. Despite the resulting practical difficulties, there exists a rich semantic vocabulary for working with earthquake sources. For these reasons, we feel it is appropriate to create a semantic model of earthquake sources using an ontology, a computer science tool from the field of knowledge representation. Unlike the domains of most ontology work to date, earthquake sources can be described by a very precise mathematical framework. Another unique aspect of developing such an ontology is that earthquake sources are often used as computational objects. A seismologist generally wants more than to simply construct a source and have it be well-formed and properly described; additionally, the source will be used for performing calculations. Representation and manipulation of complex mathematical objects present a challenge to the ontology development community. In order to enable simulations involving many different types of source models, we have completed preliminary development of a seismic point source ontology. The use of an ontology to represent knowledge provides machine interpretability and the ability to validate logical consistency and completeness. Our ontology, encoded using the OWL Web Ontology Language - a standard from the World Wide Web Consortium - contains the conceptual definitions and relationships necessary for source translation services. For example, specification of strike, dip, rake, and seismic moment will automatically translate into a double
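
    The sketch below illustrates the kind of translation service such an ontology could drive: converting a (strike, dip, rake, scalar moment) description into double-couple moment-tensor components, using the standard Aki & Richards convention (x = north, y = east, z = down). It is a generic textbook conversion, not code taken from the ontology described in the abstract.

```python
# Strike/dip/rake + scalar moment -> double-couple moment tensor
# (Aki & Richards convention, x = north, y = east, z = down).
import numpy as np

def double_couple_tensor(strike_deg, dip_deg, rake_deg, m0):
    phi, delta, lam = np.radians([strike_deg, dip_deg, rake_deg])
    sd, cd = np.sin(delta), np.cos(delta)
    s2d, c2d = np.sin(2 * delta), np.cos(2 * delta)
    sl, cl = np.sin(lam), np.cos(lam)
    sp, cp = np.sin(phi), np.cos(phi)
    s2p, c2p = np.sin(2 * phi), np.cos(2 * phi)

    mxx = -m0 * (sd * cl * s2p + s2d * sl * sp ** 2)
    mxy =  m0 * (sd * cl * c2p + 0.5 * s2d * sl * s2p)
    mxz = -m0 * (cd * cl * cp + c2d * sl * sp)
    myy =  m0 * (sd * cl * s2p - s2d * sl * cp ** 2)
    myz = -m0 * (cd * cl * sp - c2d * sl * cp)
    mzz =  m0 * s2d * sl
    return np.array([[mxx, mxy, mxz], [mxy, myy, myz], [mxz, myz, mzz]])

print(double_couple_tensor(strike_deg=30.0, dip_deg=60.0, rake_deg=90.0, m0=1.0e18))
```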

  3. Reduction of uncertainties in probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-02-01

    An integrated research program for the reduction of conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas as follows; Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the region of Korea for Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of medium-size earthquakes in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  4. Time-dependent neo-deterministic seismic hazard scenarios: Preliminary report on the M6.2 Central Italy earthquake, 24th August 2016

    CERN Document Server

    Peresan, Antonella; Romashkova, Leontina; Magrin, Andrea; Soloviev, Alexander; Panza, Giuliano F

    2016-01-01

    A scenario-based Neo-Deterministic approach to Seismic Hazard Assessment (NDSHA) is available nowadays, which permits considering a wide range of possible seismic sources as the starting point for deriving scenarios by means of full waveforms modeling. The method does not make use of attenuation relations and naturally supplies realistic time series of ground shaking, including reliable estimates of ground displacement, readily applicable to complete engineering analysis. Based on the neo-deterministic approach, an operational integrated procedure for seismic hazard assessment has been developed that allows for the definition of time dependent scenarios of ground shaking, through the routine updating of earthquake predictions, performed by means of the algorithms CN and M8S. The integrated NDSHA procedure for seismic input definition, which is currently applied to the Italian territory, combines different pattern recognition techniques, designed for the space-time identification of strong earthquakes, with al...

  5. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    Science.gov (United States)

    Sung, Chih-Hsuan; Gao, Jia-Cian; Lee, Chyi-Tyi

    2015-04-01

    In modern probabilistic seismic hazard analysis (PSHA), a standard deviation (sigma) representing the total variability is considered in the integration for the seismic exceedance rate, and this leads to increased seismic hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and has nothing to do with either the temporal or the spatial variation of ground motions. It should therefore not be considered in the integration; instead, epistemic variability may be included in the logic tree. This study uses Taiwan data as an example, testing a case in Taipei. Results reveal that if only the aleatory variability is considered in the integration, the hazard level is reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is considered in the logic tree in addition to the aleatory variability considered in the integration, the hazard level is similar to that obtained using the total variability; it is only slightly smaller at long return periods. Much effort to reduce the hazard level to a reasonable value still remains to be made.
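
    The toy calculation below illustrates the effect being described: for a single hypothetical scenario with lognormally distributed ground motion, the conditional exceedance probability is computed once with a "total" sigma and once with a smaller "aleatory-only" sigma. The numbers are invented and do not reproduce the Taipei case study.

```python
# Effect of the ground-motion sigma used in the hazard integral on exceedance
# probability for one hypothetical scenario; values are illustrative only.
import numpy as np
from scipy.stats import norm

median_pga = 0.15          # g, hypothetical median from a GMPE
target = 0.30              # g, ground-motion level of interest

for label, sigma_ln in [("total sigma", 0.70), ("aleatory-only sigma", 0.55)]:
    p_exceed = norm.sf(np.log(target), loc=np.log(median_pga), scale=sigma_ln)
    print(f"{label}: P(PGA > {target} g | scenario) = {p_exceed:.3f}")
```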

  6. Spatial temporal analysis of urban heat hazard in Tangerang City

    Science.gov (United States)

    Wibowo, Adi; Kuswantoro; Ardiansyah; Rustanto, Andry; Putut Ash Shidiq, Iqbal

    2016-11-01

    Urban heat is a natural phenomenon which may be caused by human activities, represented here by various types of land use in urban and non-urban areas. The aim of this study is to identify urban heat behaviour in Tangerang City, as it may threaten the urban environment. This study used three types of remote sensing data, namely Landsat TM, Landsat ETM+ and Landsat OLI-TIRS, to capture urban heat behaviour and to analyse the urban heat signature of Tangerang City in 2001, 2012, 2013, 2014, 2015 and 2016. The results showed that the urban heat signature changes dynamically each month depending on solar radiation. The urban heat island covered only a small part of Tangerang City in 2001, but it increased significantly and reached 50% of the area in 2012. Based on the urban heat signature results, the threshold for a threatening condition is 30 °C, as recognized from land surface temperature (LST). The effective temperature (ET) index describes this condition as warm and uncomfortable, increasing stress due to sweating and blood flow and possibly causing cardiovascular disorders.
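
    A tiny sketch of the thresholding step mentioned above is shown below: pixels whose land surface temperature exceeds 30 °C are flagged as heat-hazard areas. The LST grid is synthetic, and the retrieval of LST from Landsat thermal bands is not reproduced here.

```python
# Flag pixels above the 30 °C LST threshold; the grid values are placeholders.
import numpy as np

lst_celsius = np.array([[27.5, 29.9, 31.2],
                        [30.4, 33.0, 28.1],
                        [26.7, 30.1, 32.8]])

hazard_mask = lst_celsius > 30.0
print(f"Share of area above the 30 °C threshold: {hazard_mask.mean() * 100:.1f}%")
```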

  7. The research of mine rock burst hazard identification based on fault tree analysis

    Institute of Scientific and Technical Information of China (English)

    LI Wen; JI Hong-guang; CHENG Jiu-long; CAI Si-jing

    2007-01-01

    In order to identify the rock burst hazard in coal mines and thus give a credible forecast, effect factors such as natural geological factors and mining technological conditions were first analysed, based on an investigation of more than one hundred mine rock burst cases. Fault tree analysis (FTA) was then applied to mine rock burst hazard identification for the first time, and twelve kinds of basic events were confirmed: large mining depth; burst-prone coal seams; solid roof and floor strata; proximity to faults with large throw; folds; changes of seam thickness; other regional tectonic deformation or stress strips; drilling, blasting and extraction operations; unscientific extraction methods; illogical extraction sequences; residual pillars; and too short a distance between the working face and residual areas or stopping lines. A fault tree of mine rock burst was then constructed. Finally, qualitative and quantitative analyses were performed and the rock burst hazard was forecast according to the geological structure and mining technology conditions of a mine in Shandong Province, China; the rock burst accidents that occurred in subsequent mining validated the feasibility and accuracy of adopting FTA to identify the mine rock burst hazard.
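
    The sketch below shows the basic fault-tree arithmetic involved: basic-event probabilities combined through OR and AND gates up to a top event. The grouping and probabilities are placeholders for illustration, not the twelve basic events identified in the paper.

```python
# Minimal fault-tree gate arithmetic for independent basic events.
def p_or(probs):
    """OR gate: 1 - prod(1 - p_i)."""
    out = 1.0
    for p in probs:
        out *= 1.0 - p
    return 1.0 - out

def p_and(probs):
    """AND gate: prod(p_i)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

geological = p_or([0.05, 0.02, 0.01])     # e.g. depth, burst-prone seam, stiff roof
operational = p_or([0.03, 0.01])          # e.g. blasting practice, mining sequence
p_top = p_and([geological, operational])  # both groups assumed necessary for a burst
print(f"Top-event probability: {p_top:.4f}")
```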

  8. In silico analysis of nanomaterials hazard and risk.

    Science.gov (United States)

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    false positive relative to false negative predictions and the reliability of toxicity data. To establish the environmental impact of ENMs identified as toxic, researchers will need to estimate the potential level of environmental exposure concentration of ENMs in the various media such as air, water, soil, and vegetation. When environmental monitoring data are not available, models of ENMs fate and transport (at various levels of complexity) serve as alternative approaches for estimating exposure concentrations. Risk management decisions regarding the manufacturing, use, and environmental regulations of ENMs would clearly benefit from both the assessment of potential ENMs exposure concentrations and suitable toxicity metrics. The decision process should consider the totality of available information: quantitative and qualitative data and the analysis of nanomaterials toxicity, and fate and transport behavior in the environment. Effective decision-making to address the potential impacts of nanomaterials will require considerations of the relevant environmental, ecological, technological, economic, and sociopolitical factors affecting the complete lifecycle of nanomaterials, while accounting for data and modeling uncertainties. Accordingly, researchers will need to establish standardized data management and analysis tools through nanoinformatics as a basis for the development of rational decision tools.

  9. Preliminary analysis of distributed in situ soil moisture measurements

    Directory of Open Access Journals (Sweden)

    L. Brocca

    2005-01-01

    Full Text Available Surface soil moisture content is highly variable in both space and time. Remote sensing can provide an effective methodology for mapping surface moisture content over large areas, but ground-based measurements are required to test its reliability and to calibrate retrieval algorithms. Recently, we had the opportunity to design and perform an experiment aimed at jointly acquiring measurements of surface soil water content at various locations and remotely sensed hyperspectral data. The area selected for the experiment is located in central Umbria and extends for 90 km2. For the area, detailed lithological and multi-temporal landslide inventory maps were available. We identified eight plots where measurements of soil water content were made using a Time Domain Reflectometer (TDR). The plots range in size from 100 m2 to 600 m2 and cover a variety of topographic and morphological settings. The TDR measurements were conducted during four days, on 5 April, 15 April, 2 May and 3 May 2004. On 3 May the NERC airborne CASI 2 acquired the hyperspectral data. Preliminary analyses concerning the matching between the landslides and the soil moisture are reported. Statistical and geostatistical analyses investigating the spatial-temporal soil moisture distribution were performed. These results will be compared with the surface temperature data obtained from the remotely sensed hyperspectral sensor.

  10. 78 FR 48636 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-08-09

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk-Based... Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food... period. These two proposals are related to the proposed rule ``Current Good Manufacturing Practice...

  11. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Science.gov (United States)

    2010-02-24

    ... Service 7 CFR Parts 210 and 220 RIN 0584-AD65 School Food Safety Program Based on Hazard Analysis and... rule entitled School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Program (SBP) to develop a school food safety program for the preparation and service of school meals...

  12. Postwildfire preliminary debris flow hazard assessment for the area burned by the 2011 Las Conchas Fire in north-central New Mexico

    Science.gov (United States)

    Tillery, Anne C.; Darr, Michael J.; Cannon, Susan H.; Michael, John A.

    2011-01-01

    The Las Conchas Fire during the summer of 2011 was the largest in recorded history for the state of New Mexico, burning 634 square kilometers in the Jemez Mountains of north-central New Mexico. The burned landscape is now at risk of damage from postwildfire erosion, such as that caused by debris flows and flash floods. This report presents a preliminary hazard assessment of the debris-flow potential from 321 basins burned by the Las Conchas Fire. A pair of empirical hazard-assessment models developed using data from recently burned basins throughout the intermountain western United States was used to estimate the probability of debris-flow occurrence and volume of debris flows at the outlets of selected drainage basins within the burned area. The models incorporate measures of burn severity, topography, soils, and storm rainfall to estimate the probability and volume of debris flows following the fire. In response to a design storm of 28.0 millimeters of rain in 30 minutes (10-year recurrence interval), the probabilities of debris flows estimated for basins burned by the Las Conchas Fire were greater than 80 percent for two-thirds (67 percent) of the modeled basins. Basins with a high (greater than 80 percent) probability of debris-flow occurrence were concentrated in tributaries to Santa Clara and Rio del Oso Canyons in the northeastern part of the burned area; some steep areas in the Valles Caldera National Preserve, Los Alamos, and Guaje Canyons in the east-central part of the burned area; tributaries to Peralta, Colle, Bland, and Cochiti canyons in the southwestern part of the burned area; and tributaries to Frijoles, Alamo, and Capulin Canyons in the southeastern part of the burned area (within Bandelier National Monument). Estimated debris-flow volumes ranged from 400 cubic meters to greater than 72,000 cubic meters. The largest volumes (greater than 40,000 cubic meters) were estimated for basins in Santa Clara, Los Alamos, and Water Canyons, and for two
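
    The models referred to above follow the general pattern sketched below: a logistic regression for debris-flow probability and a (log-)linear model for volume, driven by burn severity, topography, soils and storm rainfall. The coefficients here are hypothetical placeholders, not the calibrated values used in the assessment.

```python
# General form of empirical postwildfire debris-flow models; coefficients are
# hypothetical placeholders, not the calibrated USGS values.
import numpy as np

def debris_flow_probability(burn_fraction, gradient, clay_pct, storm_mm):
    x = -3.0 + 4.0 * burn_fraction + 2.5 * gradient + 0.05 * clay_pct + 0.04 * storm_mm
    return 1.0 / (1.0 + np.exp(-x))          # logistic link

def debris_flow_volume(basin_area_km2, relief_km, storm_mm):
    ln_v = 7.0 + 0.6 * np.log(basin_area_km2) + 0.5 * relief_km + 0.02 * storm_mm
    return np.exp(ln_v)                      # cubic meters

print(f"P(debris flow) = {debris_flow_probability(0.7, 0.45, 12.0, 28.0):.2f}")
print(f"Estimated volume = {debris_flow_volume(5.2, 0.8, 28.0):,.0f} m^3")
```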

  13. Preliminary design package for Sunair SEC-601 solar collector

    Energy Technology Data Exchange (ETDEWEB)

    1978-12-01

    This report presents the preliminary design of the Owens-Illinois model Sunair SEC-601 tubular air solar collector. Information in this package includes the Subsystem Design and Development Approaches, hazard analysis, and detailed drawings available as of the Preliminary Design Review.

  14. Current issues and related activities in seismic hazard analysis in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong-Moon [Korea Atomic Energy Research Inst., Taejon (Korea, Republic of); Lee, Jong-Rim; Chang, Chun-Joong

    1997-03-01

    This paper discusses some technical issues identified from the seismic hazard analyses for probabilistic safety assessment of the operating Korean nuclear power plants and the related activities to resolve these issues. Since there are no strong instrumental earthquake records in Korea, the seismic hazard analysis is mainly dependent on historical earthquake records. Results of past seismic hazard analyses show that there are many uncertainties in the attenuation function and intensity level and that there is a need to improve the statistical method. The identification of the activity of the Yangsan Fault, which is close to nuclear power plant sites, has been an important issue, but it has not been resolved yet in spite of much research work done. Recently, some capable faults were found in the offshore area of Gulupdo Island in the Yellow Sea. It is anticipated that the results of research on both the Yangsan Fault and the reduction of uncertainty in seismic hazard analysis will have a significant influence on the seismic design and safety assessment of nuclear power plants in the future. (author)

  15. Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)

    Science.gov (United States)

    Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.

    2016-10-01

    Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean and overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions, the return periods of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.

  16. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.
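
    The "representative event" idea described above is usually implemented through hazard deaggregation; the sketch below picks modal and mean magnitude-distance pairs from a matrix of binned contributions to the exceedance rate. The contribution values are invented for illustration.

```python
# Toy hazard deaggregation: modal and mean (M, R) from binned contributions.
import numpy as np

magnitudes = np.array([5.5, 6.0, 6.5, 7.0])
distances_km = np.array([10.0, 25.0, 50.0, 100.0])

# contribution[i, j]: share of the exceedance rate from bin (M_i, R_j); placeholder values
contribution = np.array([
    [0.02, 0.05, 0.03, 0.01],
    [0.06, 0.12, 0.07, 0.02],
    [0.10, 0.18, 0.09, 0.03],
    [0.05, 0.10, 0.05, 0.02],
])
contribution /= contribution.sum()

i, j = np.unravel_index(np.argmax(contribution), contribution.shape)
mean_m = float(contribution.sum(axis=1) @ magnitudes)
mean_r = float(contribution.sum(axis=0) @ distances_km)
print(f"Modal scenario: M {magnitudes[i]:.1f} at {distances_km[j]:.0f} km")
print(f"Mean scenario:  M {mean_m:.2f} at {mean_r:.0f} km")
```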

  17. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    Science.gov (United States)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly-generated EO-based data, and quantitatively characterised vulnerabilities. RASOR also adapts the newly-developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented in order to illustrate the platform's potential.

  18. Research on the spatial analysis method of seismic hazard for island

    Science.gov (United States)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering. Its results provide parameters for seismic design at the micro scale and are also requisite input for the earthquake and comprehensive disaster prevention components of island conservation planning at the macro scale, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared with respect to their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed on ArcGIS’s Model Builder platform.
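
    The sketch below shows the fuzzy comprehensive evaluation step that SAMSHI builds on: index weights combined with a membership matrix over hazard grades. Only three indices with invented weights and memberships are shown, whereas the method itself uses 11 indices derived from fault, seismicity, geology and gravity data.

```python
# Fuzzy comprehensive evaluation with placeholder weights and memberships.
import numpy as np

weights = np.array([0.5, 0.3, 0.2])            # relative importance of each index
# membership[i, k]: degree to which index i supports hazard grade k (low/medium/high)
membership = np.array([
    [0.1, 0.3, 0.6],   # e.g. distance to active fault
    [0.2, 0.5, 0.3],   # e.g. historical earthquake density
    [0.4, 0.4, 0.2],   # e.g. Bouguer gravity anomaly gradient
])

evaluation = weights @ membership              # weighted-average fuzzy operator
grades = ["low", "medium", "high"]
print(dict(zip(grades, np.round(evaluation, 3))))
print("Assigned grade:", grades[int(np.argmax(evaluation))])
```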

  19. Grid-connected ICES: preliminary feasibility analysis and evaluation. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-30

    The HEAL Complex in New Orleans will serve as a Demonstration Community for which the ICES Demonstration System will be designed. The complex is a group of hospitals, clinics, research facilities, and medical educational facilities. The five tasks reported on are: preliminary energy analysis; preliminary institutional assessment; conceptual design; firming-up of commitments; and detailed work management plan.

  20. Purification, crystallization and preliminary X-ray analysis of struthiocalcin 1 from ostrich (Struthio camelus) eggshell

    Energy Technology Data Exchange (ETDEWEB)

    Reyes-Grajeda, Juan Pablo [Unidad de Proteómica Médica, Instituto Nacional de Medicina Genómica, Mexico City (Mexico); Marín-García, Liliana [Instituto de Química, Universidad Nacional Autónoma de México (Mexico); Stojanoff, Vivian [Brookhaven National Laboratories, NSLS, Upton, New York (United States); Moreno, Abel, E-mail: carcamo@servidor.unam.mx [Instituto de Química, Universidad Nacional Autónoma de México (Mexico); Unidad de Proteómica Médica, Instituto Nacional de Medicina Genómica, Mexico City (Mexico)

    2007-11-01

    The purification, crystallization and preliminary X-ray diffraction data of the protein struthiocalcin 1 isolated from ostrich eggshell are reported. The purification, crystallization and preliminary X-ray analysis of struthiocalcin 1 (SCA-1), a protein obtained from the intramineral part of ostrich (Struthio camelus) eggshell, is reported.

  1. Investigation of Sorption and Diffusion Mechanisms, and Preliminary Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bhave, Ramesh R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jubin, Robert Thomas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Spencer, Barry B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nair, Sankar [Georgia Inst. of Technology, Atlanta, GA (United States)

    2017-02-01

    This report describes the synthesis and evaluation of molecular sieve zeolite membranes to separate and concentrate tritiated water (HTO) from dilute HTO-bearing aqueous streams. Several monovalent and divalent cation exchanged silico alumino phosphate (SAPO-34) molecular sieve zeolite membranes were synthesized on disk supports and characterized with gas and vapor permeation measurements. The pervaporation process performance was evaluated for the separation and concentration of tritiated water. Experiments were performed using tritiated water feed solution containing tritium at the high end of the range (1 mCi/mL) anticipated in a nuclear fuel processing system that includes both acid and water streams recycling. The tritium concentration was about 0.1 ppm. The permeate was recovered under vacuum. The HTO/H2O selectivity and separation factor calculated from the measured tritium concentrations ranged from 0.99 to 1.23, and 0.83-0.98, respectively. Although the membrane performance for HTO separation was lower than expected, several encouraging observations including molecular sieving and high vapor permeance are reported. Additionally, several new approaches are proposed, such as tuning the sorption and diffusion properties offered by small pore LTA zeolite materials, and cation exchanged aluminosilicates with high metal loading. It is hypothesized that substantially improved preferential transport of tritium (HTO) resulting in a more concentrated permeate can be achieved. Preliminary economic analysis for the membrane-based process to concentrate tritiated water is also discussed.

  2. Crystallization and preliminary crystallographic analysis of recombinant human galectin-1

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Stacy A. [Institute for Glycomics, Gold Coast Campus, Griffith University, Queensland 4222 (Australia); Scott, Ken [School of Biological Sciences, University of Auckland, Auckland (New Zealand); Blanchard, Helen, E-mail: h.blanchard@griffith.edu.au [Institute for Glycomics, Gold Coast Campus, Griffith University, Queensland 4222 (Australia)

    2007-11-01

    Human galectin-1 has been cloned, expressed in E. coli, purified and crystallized in the presence of both lactose (ligand) and β-mercaptoethanol under six different conditions. The X-ray diffraction data obtained have enabled the assignment of unit-cell parameters for two novel crystal forms of human galectin-1. Galectin-1 is considered to be a regulator protein as it is ubiquitously expressed throughout the adult body and is responsible for a broad range of cellular regulatory functions. Interest in galectin-1 from a drug-design perspective is founded on evidence of its overexpression by many cancers and its immunomodulatory properties. The development of galectin-1-specific inhibitors is a rational approach to the fight against cancer because although galectin-1 induces a plethora of effects, null mice appear normal. X-ray crystallographic structure determination will aid the structure-based design of galectin-1 inhibitors. Here, the crystallization and preliminary diffraction analysis of human galectin-1 crystals generated under six different conditions is reported. X-ray diffraction data enabled the assignment of unit-cell parameters for crystals grown under two conditions; one belongs to a tetragonal crystal system and the other was determined as monoclinic P2₁, representing two new crystal forms of human galectin-1.

  3. Preliminary radiation criteria and nuclear analysis for ETF

    Energy Technology Data Exchange (ETDEWEB)

    Engholm, B.A.

    1980-09-01

    Preliminary biological and materials radiation dose criteria for the Engineering Test Facility are described and tabulated. In keeping with the ETF Mission Statement, a key biological dose criterion is a 24-hour shutdown dose rate of 2 mrem/hr on the surface of the outboard bulk shield. Materials dose criteria, which primarily govern the inboard shield design, include 10^9 rads exposure limit to epoxy insulation, 3 x 10^-4 dpa damage to the TF coil copper stabilizer, and a total nuclear heating rate of 5 kW in the inboard TF coils. Nuclear analysis performed during FY 80 was directed primarily at the inboard and outboard bulk shielding, and at radiation streaming in the neutral beam drift ducts. Inboard and outboard shield thicknesses to achieve the biological and materials radiation criteria are 75 cm inboard and 125 cm outboard, the configuration consisting of alternating layers of stainless steel and borated water. The outboard shield also includes a 5 cm layer of lead. NBI duct streaming analyses performed by ORNL and LASL will play a key role in the design of the duct and NBI shielding in FY 81. The NBI aluminum cryopanel nuclear heating rate during the heating cycle is about 1 milliwatt/cm^3, which is far less than the permissible limit.

  4. Preliminary analysis of aerial hyperspectral data on shallow lacustrine waters

    Science.gov (United States)

    Bianchi, Remo; Castagnoli, A.; Cavalli, Rosa M.; Marino, Carlo M.; Pignatti, Stefano; Zilioli, Eugenio

    1995-11-01

    The availability of MIVIS hyperspectral data, deriving from an aerial survey recently performed over a test site on Lake Garda, Italy, provided the opportunity for a preliminary new insight into specific applications of remote sensing to shallow water analysis. The spectroradiometers in the visible and in the thermal infrared were explored in particular, giving access to helpful information for the detection of bio-physical indicators of water quality, related either to the surface/sub-surface of the waters or to the bottom of the lake, since the study area presents very shallow waters, never exceeding a 6-meter depth. The primary interest was the detection of man-induced activities along the margins, such as sewage effects, sedimentary structures on the bottom, or algal blooms. Secondly, a correlation between absorptivity coefficients in the visible bands and bathymetric contour lines in the proximity of the marginal zone of the lake was accomplished by means of two indicative spectroradiometric transects.

  5. Implementation of the Hazard Analysis Critical Control Point (HACCP) system to UF white cheese production line

    Directory of Open Access Journals (Sweden)

    Mahmoud El-Hofi

    2010-09-01

    Full Text Available Background. HACCP, or the Hazard Analysis and Critical Control Point system, has been recognised as an effective and rational means of assuring food safety from primary production through to final consumption, using a “farm to table” methodology. The application of this prevention-oriented approach gives the food producer better control over operations, better manufacturing practices and greater efficiency, including reduced waste. Material and methods. The steps taken to put HACCP in place are described and the process was monitored to assess its impact. Assessment of the hygiene quality of the UF white cheese production line before and after HACCP showed an improvement in quality and an overall improvement in conditions at the company. Results. HACCP was introduced for the UF white cheese line at Misr Milk and Food, Mansoura, Egypt, to ensure safe and good quality food products. All necessary quality control procedures were verified for completeness and to determine whether they were being implemented to the required standards. A hazard analysis was conducted to identify hazards that may occur in the product cycle, and Critical Control Points (CCPs) were determined to control the identified hazards. CCP signs were then posted on the factory floor. Critical limits were established at each CCP, and corrective actions to be taken when monitoring indicates deviation or loss of control were established. Verification procedures were established to confirm that the HACCP system is working effectively. Documentation concerning all procedures and records was established, and HACCP was integrated with ISO 9000 under one management system. Conclusions. The HACCP system developed in this study for the UF white cheese line was built step by step on the twelve steps mentioned in the literature review. A prerequisite programme was provided to deal with some hazards before production in order to simplify the HACCP plan.

  6. Conversion Preliminary Safety Analysis Report for the NIST Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Diamond, D. J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Baek, J. S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Hanson, A. L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Cheng, L-Y [Brookhaven National Lab. (BNL), Upton, NY (United States); Brown, N. [Brookhaven National Lab. (BNL), Upton, NY (United States); Cuadra, A. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-01-30

    The NIST Center for Neutron Research (NCNR) is a reactor-laboratory complex providing the National Institute of Standards and Technology (NIST) and the nation with a world-class facility for the performance of neutron-based research. The heart of this facility is the NIST research reactor (aka NBSR); a heavy water moderated and cooled reactor operating at 20 MW. It is fueled with high-enriched uranium (HEU) fuel elements. A Global Threat Reduction Initiative (GTRI) program is underway to convert the reactor to low-enriched uranium (LEU) fuel. This program includes the qualification of the proposed fuel, uranium and molybdenum alloy foil clad in an aluminum alloy, and the development of the fabrication techniques. This report is a preliminary version of the Safety Analysis Report (SAR) that would be submitted to the U.S. Nuclear Regulatory Commission (NRC) for approval prior to conversion. The report follows the recommended format and content from the NRC codified in NUREG-1537, “Guidelines for Preparing and Reviewing Applications for the Licensing of Non-power Reactors,” Chapter 18, “Highly Enriched to Low-Enriched Uranium Conversions.” The emphasis in any conversion SAR is to explain the differences between the LEU and HEU cores and to show the acceptability of the new design; there is no need to repeat information regarding the current reactor that will not change upon conversion. Hence, as seen in the report, the bulk of the SAR is devoted to Chapter 4, Reactor Description, and Chapter 13, Safety Analysis.

  7. Environmental justice implications of industrial hazardous waste generation in India: a national scale analysis

    Science.gov (United States)

    Basu, Pratyusha; Chakraborty, Jayajit

    2016-12-01

    While rising air and water pollution have become issues of widespread public concern in India, the relationship between spatial distribution of environmental pollution and social disadvantage has received less attention. This lack of attention becomes particularly relevant in the context of industrial pollution, as India continues to pursue industrial development policies without sufficient regard to its adverse social impacts. This letter examines industrial pollution in India from an environmental justice (EJ) perspective by presenting a national scale study of social inequities in the distribution of industrial hazardous waste generation. Our analysis connects district-level data from the 2009 National Inventory of Hazardous Waste Generating Industries with variables representing urbanization, social disadvantage, and socioeconomic status from the 2011 Census of India. Our results indicate that more urbanized and densely populated districts with a higher proportion of socially and economically disadvantaged residents are significantly more likely to generate hazardous waste. The quantity of hazardous waste generated is significantly higher in more urbanized but sparsely populated districts with a higher proportion of economically disadvantaged households, after accounting for other relevant explanatory factors such as literacy and social disadvantage. These findings underscore the growing need to incorporate EJ considerations in future industrial development and waste management in India.

  8. Geological Hazards analysis in Urban Tunneling by EPB Machine (Case study: Tehran subway line 7 tunnel)

    Directory of Open Access Journals (Sweden)

    Hassan Bakhshandeh Amnieh

    2016-06-01

    Full Text Available Technological progress in tunneling has led to modern and efficient tunneling methods in vast underground spaces even under inappropriate geological conditions. Identification and access to appropriate and sufficient geological hazard data are key elements to successful construction of underground structures. Choice of the method, excavation machine, and prediction of suitable solutions to overcome undesirable conditions depend on geological studies and hazard analysis. Identifying and investigating the ground hazards in excavating urban tunnels by an EPB machine could augment the strategy for improving soil conditions during excavation operations. In this paper, challenges such as geological hazards, abrasion of the machine cutting tools, clogging around these tools and inside the chamber, diverse work front, severe water level fluctuations, existence of water, and fine-grained particles in the route were recognized in a study of Tehran subway line 7, for which solutions such as low speed boring, regular cutter head checks, application of soil improving agents, and appropriate grouting were presented and discussed. Due to the presence of fine particles in the route, foam employment was suggested as the optimum strategy where no filler is needed.

  9. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    Energy Technology Data Exchange (ETDEWEB)

    Adelman, D.D. [Water Resources Engineer, Lincoln, NE (United States); Stansbury, J. [Univ. of Nebraska-Lincoln, Omaha, NE (United States)

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes on the national level. Key to this waste management is guidance on the design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (Leaf Area Index) conditions. The computer model HELP3, which has the capability to simulate the double bottom liner systems called for in hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume was to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to Leaf Area Index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.
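
    One plausible way to organize the 54-run design mentioned above (for example, 3 climates x 6 curve numbers x 3 leaf area indices) is sketched below; the run_help3() function is only a stand-in, since the actual HELP3 model is not reproduced here.

```python
# Enumerate a hypothetical 54-run factorial design for a leachate sensitivity study.
import itertools

climates = ["humid", "temperate", "arid"]
curve_numbers = [60, 70, 75, 80, 85, 90]
leaf_area_indices = [1.0, 2.0, 3.5]

def run_help3(climate, curve_number, lai):
    """Placeholder for a HELP3 simulation returning annual leachate volume."""
    raise NotImplementedError("hook this up to the real HELP3 model")

runs = list(itertools.product(climates, curve_numbers, leaf_area_indices))
print(f"{len(runs)} parameter combinations")   # 54
```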

  10. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    Science.gov (United States)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing effects of a M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.
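
    The sketch below illustrates how scenario ground motions and their uncertainty might be tabulated for the water-system sites, using a generic attenuation form ln(PGA) = a + b*M - c*ln(R + h). The coefficients, sigma and site distances are invented and do not correspond to any of the published relationships used in the study.

```python
# Generic ground-motion sketch for hypothetical facility sites; all numbers invented.
import numpy as np

def ln_pga(magnitude, distance_km, a=-3.5, b=0.9, c=1.2, h=10.0):
    return a + b * magnitude - c * np.log(distance_km + h)

sites_km = {"treatment plant": 12.0, "main reservoir": 25.0, "pump station": 40.0}
sigma_ln = 0.6   # assumed aleatory standard deviation in ln units
for name, r in sites_km.items():
    mu = ln_pga(magnitude=7.0, distance_km=r)
    print(f"{name}: median PGA {np.exp(mu):.2f} g, "
          f"84th percentile {np.exp(mu + sigma_ln):.2f} g")
```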

  11. Probabilistic Earthquake-Tsunami Multi-Hazard Analysis: Application to the Tohoku Region, Japan.

    Directory of Open Access Journals (Sweden)

    Raffaele De Risi

    2016-10-01

    Full Text Available This study develops a novel simulation-based procedure for the estimation of the likelihood that seismic intensity (in terms of spectral acceleration) and tsunami inundation (in terms of wave height), at a particular location, will exceed given hazard levels. The procedure accounts for a common physical rupture process for shaking and tsunami. Numerous realizations of stochastic slip distributions of earthquakes having different magnitudes are generated using scaling relationships of source parameters for subduction zones and then using a stochastic synthesis method of earthquake slip distribution. Probabilistic characterization of earthquake and tsunami intensity parameters is carried out by evaluating spatially correlated strong motion intensity through the adoption of ground motion prediction equations, as a function of magnitude and shortest distance from the rupture plane, and by solving nonlinear shallow water equations for tsunami wave propagation and inundation. The minimum number of simulations required to obtain stable estimates of seismic and tsunami intensity measures is investigated through a statistical bootstrap analysis. The main output of the proposed procedure is the earthquake-tsunami hazard curves representing, for each mean annual rate of occurrence, the corresponding seismic and inundation tsunami intensity measures. This simulation-based procedure facilitates earthquake-tsunami hazard deaggregation with respect to magnitude and distance. Results are particularly useful for multi-hazard mapping purposes, and the developed framework can be further extended to probabilistic earthquake-tsunami risk assessment.
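
    A stripped-down version of the simulation-to-hazard-curve step is sketched below: synthetic intensity realizations are turned into empirical exceedance rates, and a bootstrap gauges their stability. The synthetic wave heights and the assumed event rate stand in for the stochastic earthquake-tsunami realizations of the paper.

```python
# Empirical exceedance rates from simulated intensities, with a bootstrap check.
import numpy as np

rng = np.random.default_rng(0)
annual_rate = 0.05                        # assumed rate of the modeled event class
sim_wave_height = rng.lognormal(mean=0.2, sigma=0.8, size=2000)   # m, synthetic

levels = np.array([0.5, 1.0, 2.0, 4.0])   # inundation heights of interest (m)
exceed_rate = annual_rate * np.array([(sim_wave_height > h).mean() for h in levels])

boot = annual_rate * np.array([
    [(rng.choice(sim_wave_height, sim_wave_height.size) > h).mean() for h in levels]
    for _ in range(200)
])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
for h, r, l, u in zip(levels, exceed_rate, lo, hi):
    print(f"h > {h:.1f} m: rate {r:.4f} /yr (95% interval {l:.4f}-{u:.4f})")
```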

  12. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of those natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems for public sector decision makers arise. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other hand: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed for risk analysis, it is possible to survey the data in time and space, providing an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  13. Preliminary Analysis of Remote Monitoring & Robotic Concepts for Performance Confirmation

    Energy Technology Data Exchange (ETDEWEB)

    D.A. McAffee

    1997-02-18

    ) Identify and discuss the main Performance Confirmation monitoring needs and requirements during the post-emplacement preclosure period. This includes radiological, non-radiological, host rock, and infrastructure performance monitoring needs. It also includes monitoring for possible off-normal events. (Presented in Section 7.3). (3) Identify general approaches and methods for obtaining performance information from within the emplacement drifts for Performance Confirmation. (Presented in Section 7.4) (4)Review and discuss available technologies and design strategies that may permit the use of remotely operated systems within the hostile thermal and radiation environment expected within the emplacement drifts. (Presented in Section 7.5). (5) Based on Performance Confirmation monitoring needs and available technologies, identify potential application areas for remote systems and robotics for post-emplacement preclosure Performance Confirmation activities (Presented in Section 7.6). (6) Develop preliminary remote monitoring and robotic concepts for post-emplacement, preclosure Performance Confirmation activities. (Presented in Section 7.7) This analysis is being performed very early in the systems engineering cycle, even as issues related to the Performance Confirmation program planning phase are being formulated and while the associated needs, constraints and objectives are yet to be fully determined and defined. This analysis is part of an issue formulation effort and is primarily concerned with identification and description of key issues related to remotely monitoring repository performance for Performance Confirmation. One of the purposes of this analysis is to provide an early investigation of potential design challenges that may have a high impact on future design concepts. This analysis can be used to guide future concept development and help access what is feasible and achievable by application of remote systems technology. Future design and systems engineering

  14. Hazard Evaluation for the Saltwell Chempump and a Saltwell Centrifugal Pump Design using Service Water for Lubrication and Cooling

    Energy Technology Data Exchange (ETDEWEB)

    GRAMS, W.H.

    2000-11-16

    This report documents results of a preliminary hazard analysis (PHA) covering the existing Crane Chempump and the new salt well pumping design. Three hazardous conditions were identified for the Chempump and ten hazardous conditions were identified for the new salt well pump design. This report also presents the results of the control decision/allocation process. A backflow preventer and associated limiting condition for operation were assigned to one hazardous condition with the new design.

  15. A Preliminary Tsunami vulnerability analysis for Bakirkoy district in Istanbul

    Science.gov (United States)

    Tufekci, Duygu; Lutfi Suzen, M.; Cevdet Yalciner, Ahmet; Zaytsev, Andrey

    2016-04-01

    Resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disasters. Vulnerability assessment of coastal areas under extreme events has major importance for preparedness and for the development of mitigation strategies. The Sea of Marmara has experienced numerous earthquakes as well as associated tsunamis. There is a variety of coastal facilities such as ports, small craft harbors, terminals for maritime transportation, waterfront roads and business centers, mainly on the north coast of the Marmara Sea in the megacity of Istanbul. A detailed vulnerability analysis for the Yenikapi region and a detailed resilience analysis for Haydarpasa port in Istanbul have been studied previously by Cankaya et al. (2015) and Aytore et al. (2015) in the SATREPS project. In this study, the methodology of vulnerability analysis under tsunami attack given in Cankaya et al. (2015) is modified and applied to the Bakirkoy district of Istanbul. Bakirkoy district is located in the western part of Istanbul and faces the north coast of the Marmara Sea from 28.77°E to 28.89°E. The high resolution spatial dataset of Istanbul Metropolitan Municipality (IMM) is used and analyzed. The bathymetry and topography database and the spatial dataset containing all buildings/structures/infrastructures in the district are collated and utilized for tsunami numerical modeling and the subsequent vulnerability analysis. The tsunami parameters from deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability assessment parameters in the district, according to vulnerability and resilience, are defined and scored by implementing a GIS-based TVA with appropriate MCDA methods. The risk level is computed using tsunami intensity (level of flow depth from simulations) and TVA results at every location in Bakirkoy district. The preliminary results are presented and discussed

  16. Application of Hazard Analysis and Critical Control Points in Cherry Juice Processing Enterprises

    OpenAIRE

    Peilong Xu; Na Na

    2015-01-01

    Qingdao is one of the homelands of cherry in China, and in recent years the deep-processing industry for cherry has been developing rapidly. In this study, the Hazard Analysis and Critical Control Points (HACCP) quality control system is introduced into the production process of cherry juice, which has effectively controlled food safety risks in the production process. The practice has proved that application of the HACCP system effectively reduced the probability of contamination in the cherry juice production process. ...

  17. Mapping the hazard of extreme rainfall by peaks-over-threshold extreme value analysis and spatial regression techniques

    NARCIS (Netherlands)

    Beguería, S.; Vicente-Serrano, S.M.

    2006-01-01

    The occurrence of rainfalls of high magnitude constitutes a primary natural hazard in many parts of the world, and the elaboration of maps showing the hazard of extreme rainfalls has great theoretical and practical interest. In this work a procedure based on extreme value analysis and spatial

  18. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    Energy Technology Data Exchange (ETDEWEB)

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  19. Uncertainty treatment and sensitivity analysis of the European Probabilistic Seismic Hazard Assessment

    Science.gov (United States)

    Woessner, J.; Danciu, L.; Giardini, D.

    2013-12-01

    Probabilistic seismic hazard assessment (PSHA) aims to characterize the best available knowledge on the seismic hazard of a study area, ideally taking into account all sources of uncertainty. The EC-FP7 funded project Seismic Hazard Harmonization in Europe (SHARE) generated a time-independent community-based hazard model for the European region for ground motion parameters spanning spectral ordinates from PGA to 10 s and annual exceedance probabilities from one-in-ten to one-in-ten-thousand. The results will serve as a reference to define engineering applications within Eurocode 8 and provide homogeneous input for state-of-the-art seismic safety assessment of critical infrastructure. The SHARE model accounts for uncertainties, whether aleatory or epistemic, via a logic tree. Epistemic uncertainties within the seismic source model are represented by three source models: a traditional area source model, a model that characterizes fault sources, and an approach that uses kernel smoothing for seismicity and fault source moment release. Activity rates and maximum magnitudes in the source models are treated as aleatory uncertainties. For practical implementation and computational purposes, some of the epistemic uncertainties in the source model (i.e., dip and strike angles) are treated as aleatory, and a mean seismicity model is considered. Epistemic uncertainties for ground motions are considered by multiple Ground Motion Prediction Equations as a function of tectonic setting and are treated as being correlated. The final results contain the full distribution of ground motion variability. We show how we used the logic-tree approach to consider the alternative models and how, based on the degree of belief in the models, we defined the weights of the single branches. This contribution features results and sensitivity analysis of the entire European hazard model and selected sites.
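
    As an illustration of the logic-tree combination described in this abstract, the sketch below computes a weighted-mean hazard curve and a weighted median from per-branch annual exceedance rates. The branch rates, weights, and ground-motion levels are invented placeholders, not SHARE model output.

        import numpy as np

        # Illustrative ground-motion levels (PGA in g) and per-branch annual
        # exceedance rates; values are placeholders, not SHARE results.
        pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
        branch_rates = np.array([
            [2.0e-2, 8.0e-3, 2.0e-3, 4.0e-4, 5.0e-5],   # e.g. area-source branch
            [3.0e-2, 1.0e-2, 3.0e-3, 6.0e-4, 8.0e-5],   # e.g. fault-source branch
            [2.5e-2, 9.0e-3, 2.5e-3, 5.0e-4, 6.0e-5],   # e.g. kernel-smoothed branch
        ])
        weights = np.array([0.5, 0.3, 0.2])   # degree-of-belief weights, sum to 1

        # Weighted-mean hazard curve over the logic-tree branches
        mean_curve = weights @ branch_rates

        def weighted_fractile(values, w, q):
            # Smallest branch value whose cumulative weight reaches q
            order = np.argsort(values)
            cw = np.cumsum(w[order])
            return values[order][np.searchsorted(cw, q)]

        median_curve = [weighted_fractile(branch_rates[:, j], weights, 0.5)
                        for j in range(len(pga))]

        for a, m, med in zip(pga, mean_curve, median_curve):
            print(f"PGA {a:4.2f} g: mean rate {m:.2e}, median rate {med:.2e}")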

  20. Preliminary Core Analysis of a Micro Modular Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Chang Keun; Chang, Jongwa [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Venneri, Francesco [Ultra Safe Nuclear Corporation, Los Alamos (United States); Hawari, Ayman [NC State Univ., Raleigh (United States)

    2014-05-15

    The Micro Modular Reactor (MMR) will be 'melt-down proof' (MDP) under all circumstances, including the complete loss of coolant, will be easily transportable and retrievable, and will be suitable for use with very little site preparation and Balance of Plant (BOP) requirements for a variety of applications, from power generation and process heat applications in remote areas to grid-unattached locations, including ship propulsion. The Micro Modular Reactor design proposed in this paper has a 3-meter-diameter core (2-meter active core), which is suitable for factory manufacture, and a service life of a few tens of years for remote deployment. We confirmed the feasibility of a long service life by a preliminary neutronic analysis in terms of the excess reactivity, the temperature feedback coefficient, and the control margins. We are able to achieve a reasonably long core lifetime of 5 to 10 years under the typical thermal-hydraulic conditions of a helium-cooled reactor. However, in situations where a longer service period and safety are important, we can reduce the power density to the level of a typical pebble bed reactor. In this case we can design a 10 MWt MMR with this core diameter for a 10 to 40 year core lifetime without much loss in the economics. Several burnable poisons were studied, and it was found that erbia mixed in the compact matrix is a reasonably good poison. The temperature feedback coefficients remained negative during the lifetime. Drum-type control rods in the reflector region and a few control rods inside the core region are sufficient to control the reactivity during operation and to achieve a safe cold shutdown state.

  1. Preliminary results of an oilspill risk analysis for the Bombay High Region

    Digital Repository Service at National Institute of Oceanography (India)

    Mascarenhas, A.A.M.Q.; Gouveia, A.D.; Sitaraman, R.

    An oilspill risk analysis was conducted to determine the relative environmental hazards of developing oil in different regions of the Bombay High, Maharashtra, India. The likely paths of oilslicks, and locations of resources vulnerable to spilled...

  2. Preliminary evaluation of diabatic heating distribution from FGGE level 3b analysis data

    Science.gov (United States)

    Kasahara, A.; Mizzi, A. P.

    1985-01-01

    A method is presented for calculating the global distribution of the diabatic heating rate. Preliminary results for the global heating rate evaluated from the European Centre for Medium-Range Weather Forecasts (ECMWF) Level IIIb analysis data are also presented.

  3. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and to apply, respectively, a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  4. The Organic Food Market and Marketing Initiatives in Europe: a Preliminary Analysis

    DEFF Research Database (Denmark)

    Kristensen, Niels Heine; Nielsen, Thorkild; Bruselius-Jensen, Maria Louisa

    2003-01-01

    Kristensen NH, Nielsen T, Bruselius-Jensen M, Scheperlen-Bøgh P, Beckie M, Foster C, Midmore P, Padel S (2002): The Organic Food Market and Marketing Initiatives in Europe: a Preliminary Analysis. Final Report to the EU Commission.

  5. A Preliminary Tsunami Vulnerability Analysis for Yenikapi Region in Istanbul

    Science.gov (United States)

    Ceren Cankaya, Zeynep; Suzen, Lutfi; Cevdet Yalciner, Ahmet; Kolat, Cagil; Aytore, Betul; Zaytsev, Andrey

    2015-04-01

    One of the main requirements during post-disaster recovery operations is to maintain proper transportation and fluent communication in the disaster areas. Ports and harbors are the main transportation hubs, which must work with proper performance at all times, especially after disasters. Resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disasters. Istanbul is a mega city with its various coastal utilities located on the north coast of the Sea of Marmara. In the Yenikapi region of Istanbul, there are critical coastal utilities and vulnerable coastal structures, and critical activities occur daily. Fishery ports, commercial ports, small craft harbors, passenger terminals of intercity maritime transportation, and waterfront commercial and/or recreational structures are some examples of coastal utilization which are vulnerable to marine disasters. Therefore, the vulnerability of the Yenikapi region of Istanbul to tsunami or any other marine hazard is an important issue. In this study, a methodology of vulnerability analysis under tsunami attack is proposed, with application to the Yenikapi region. In the study, the high resolution (1 m) GIS database of Istanbul Metropolitan Municipality (IMM) is used and analyzed by GIS implementation. The bathymetry and topography database and the vector dataset containing all buildings/structures/infrastructure in the study area are obtained for tsunami numerical modeling of the study area. GIS-based tsunami vulnerability assessment is conducted by applying Multi-criteria Decision Making Analysis (MCDA). The tsunami parameters from deterministically defined worst-case scenarios are computed from the simulations using the tsunami numerical model NAMI DANCE. The vulnerability parameters in the region due to two different classifications, i) vulnerability of buildings/structures and ii) vulnerability of (human) evacuation

  6. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette Jackson [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short-duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  7. Hazard analysis of EUCLIDIAN: an image-guided robotic brachytherapy system.

    Science.gov (United States)

    Hu, Yida; Podder, Tarun; Buzurovic, Ivan; Yan, Kaiguo; Ng, Wan Sing; Yu, Yan

    2007-01-01

    Robotic assistance can help clinicians to improve the flexibility of needle insertion and the accuracy of seed deposition. However, the robotic platform is a safety-critical system because of its automated operational mode. Thus, it is important to perform Hazard Identification & Safety Insurance Control (HISIC) to secure the safety of a medical robotic system. In this paper, we have performed HISIC for our robotic platform, called Endo-Uro Computer Lattice for Intratumoral Delivery, Implementation, and Ablation with Nanosensing (EUCLIDIAN). The definition and requirements of the system are described by the Unified Modeling Language (UML). Failure Mode and Effect Analysis (FMEA) is executed for the principles of HISIC, such as hazard identification, safety insurance control, safety critical limits, monitoring, and control. FMEA combined with UML can also be implemented to ensure the reliability of the human operation. On the basis of a safety control index and fuzzy mathematics, a safety effectiveness value is outlined to assess the validity of safety insurance control for the robotic system. The above principles and methods are feasible and effective for hazard analysis during the development of the robotic system.

  8. Application of hazard analysis critical control point (HACCP) as a possible control measure for Opisthorchis viverrini infection in cultured carp (Puntius gonionotus).

    Science.gov (United States)

    Khamboonruang, C; Keawvichit, R; Wongworapat, K; Suwanrangsi, S; Hongpromyart, M; Sukhawat, K; Tonguthai, K; Lima dos Santos, C A

    1997-01-01

    Opisthorchiasis due to Opisthorchis viverrini, transmitted through infected freshwater cyprinoid fish (carps), affects more than 8 million people in Thailand, the Lao People's Democratic Republic, and Vietnam. The Hazard Analysis Critical Control Point (HACCP) concept has been recommended by FAO and WHO for inclusion in programs to control foodborne trematode infections (FBT). HACCP is a multifactorial approach to controlling food hazards through surveillance of diseases, foods, and operations, and through education. This study describes the first attempt to apply HACCP to the prevention and control of Opisthorchis viverrini in pond-cultured carp (Puntius gonionotus). The experiment was designed and carried out by a multidisciplinary "HACCP team" including experts in the fields of public health, parasitology, epidemiology, aquaculture, fisheries extension, and fish inspection. The investigation was performed in two fish ponds in the District of Sun Pa Tong, Chiang Mai, Thailand. In the experimental pond, fish were cultured according to HACCP principles and compared with the control pond, which followed conventional aquaculture practices. The water supply to the pond, fish fry, fish feed, and pond conditions during the growing period were identified as critical control points (CCPs). Hazards were identified and analyzed, and control measures, critical limits, monitoring procedures, corrective actions, and record keeping were developed for each of the above CCPs. Complete pond preparation, particularly aiming to eliminate contamination of the pond water with O. viverrini eggs, fish infected with parasite metacercariae, and the first intermediate host (Bithynia spp), was conducted. After the pond was filled with water, O. viverrini metacercaria-free fry were released into the pond. The preliminary results obtained indicate that HACCP-based principles applied to carp pond culture could be used as a strategy to prevent and control O. viverrini. Further studies should be undertaken aiming

  9. Ergonomics hazards analysis of linemen's power line fixing work in China.

    Science.gov (United States)

    Yu, Ming; Sun, Linyan; Du, Jianhua; Wu, Fengge

    2009-01-01

    This study used qualitative and quantitative methods, such as OWAS (Ovako Working Posture Analysis System) and behavior observation, to analyze the musculoskeletal disorder (MSD) risk factors of power line fixing work in China. Video-based sampling was used to record and analyze the frequency and posture of on-pole activities. These key subtasks showed the ergonomics characteristics of on-pole fixing tasks. Insulator-fixing was the longest subtask (33% of total working time). Bar-installing was the second longest (26% of total working time). It was evident that bar-installing and insulator-fixing involved substantial hazard. The action categories of these two subtasks were higher than those of the other subtasks. The two subtasks were also time-consuming, difficult, and likely to induce MSDs. Assistant linemen faced more hazardous factors than chief linemen.

  10. Probabilistic hazard analysis of Citlaltépetl (Pico de Orizaba) Volcano, eastern Mexican Volcanic Belt

    Science.gov (United States)

    De la Cruz-Reyna, Servando; Carrasco-Núñez, Gerardo

    2002-03-01

    Citlaltépetl, or Pico de Orizaba, is the highest active volcano on the North American continent. Although Citlaltépetl is at present in repose, its eruptive history reveals repetitive explosive eruptions in the past. Its relatively low eruption rate has favored significant population growth in areas that may be affected by potential eruptive activity. The need for criteria for hazard assessment and land-use planning has motivated the use of statistical methods to estimate the time and space distribution of volcanic hazards around this volcano. The analysis of past activity, from the late Pleistocene to historic times, and the extent of some well-identified deposits are used to calculate the recurrence probabilities of eruptions of various sizes during time periods useful for land-use planning.

  11. Rockfall Hazard Analysis From Discrete Fracture Network Modelling with Finite Persistence Discontinuities

    Science.gov (United States)

    Lambert, Cédric; Thoeni, Klaus; Giacomini, Anna; Casagrande, Davide; Sloan, Scott

    2012-09-01

    Developing an accurate representation of the rock mass fabric is a key element in rock fall hazard analysis. The orientation, persistence and density of fractures control the volume and shape of unstable blocks or compartments. In this study, the discrete fracture modelling technique and digital photogrammetry were used to accurately depict the fabric. A volume distribution of unstable blocks was derived combining polyhedral modelling and kinematic analyses. For each block size, probabilities of failure and probabilities of propagation were calculated. A complete energy distribution was obtained by considering, for each block size, its occurrence in the rock mass, its probability of falling, its probability to reach a given location, and the resulting distribution of energies at each location. This distribution was then used with an energy-frequency diagram to assess the hazard.
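
    As a rough illustration of how the per-block-size probabilities described above can be combined into an energy-exceedance estimate at one location, the sketch below multiplies, for each block-volume class, an assumed onset frequency, probability of failure, and probability of reaching the location, and accumulates the rate at which an energy threshold is exceeded. All numbers are hypothetical placeholders, not values from this study.

        import numpy as np

        # Hypothetical block-volume classes (m^3) with assumed annual onset
        # frequency, probability of failure, probability of reaching the target
        # location, and a representative kinetic energy (kJ) at that location.
        volumes    = np.array([0.1,  1.0,   10.0,   100.0])
        onset_freq = np.array([2.0,  0.5,   0.05,   0.005])   # events/year
        p_failure  = np.array([0.30, 0.20,  0.10,   0.05])
        p_reach    = np.array([0.40, 0.55,  0.70,   0.85])
        energy_kj  = np.array([15.0, 180.0, 2500.0, 3.0e4])

        threshold_kj = 300.0  # energy level of interest at the target location

        # Annual rate of blocks of each class arriving at the location
        arrival_rate = onset_freq * p_failure * p_reach

        # Annual rate of arrivals exceeding the energy threshold
        exceed_rate = arrival_rate[energy_kj > threshold_kj].sum()

        # Annual exceedance probability assuming Poisson arrivals
        p_exceed = 1.0 - np.exp(-exceed_rate)
        print(f"Annual rate > {threshold_kj} kJ: {exceed_rate:.4f} /yr "
              f"(probability {p_exceed:.4f})")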

  12. Preliminary study on washability and composition analysis of high-sulfur coal in some mining areas in Guizhou

    Institute of Scientific and Technical Information of China (English)

    QIU Yue-qin; MAO Song; ZHANG Qin; TIAN Ye; LIU Zhi-hong

    2011-01-01

    Preliminary sink-float experiments on high-sulfur coal from some mining areas were carried out, together with elemental analysis, proximate (industrial) analysis, and ash-content analysis. Through the experiments, the middlings and gangue were defined, and a phase analysis of sulfur was carried out, by which a good understanding of the sulfur characteristics in the raw coal was achieved.

  13. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    Science.gov (United States)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities of exceeding different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions no longer hold, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses a cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) perform high resolution numerical simulations only for these representative scenarios and for a subset of near-field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
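
    Step ii) above can be illustrated with a generic cluster-based scenario reduction: each scenario is described by a few features, the scenarios are grouped with k-means, and one representative per cluster is retained while carrying the summed annual rates of its cluster members. The feature construction and all numbers below are placeholders, not the actual SPTHA filtering procedure.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        n_scen = 2000
        # Placeholder scenario features (e.g. offshore tsunami amplitudes at a
        # few points in front of the target site) plus each scenario's rate.
        features = np.abs(rng.normal(0.0, 1.0, size=(n_scen, 4))).cumsum(axis=1)
        rates = 10.0 ** rng.uniform(-7, -4, size=n_scen)

        k = 25                                  # number of representative scenarios
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(features)

        reps, rep_rates = [], []
        for c in range(k):
            idx = np.where(km.labels_ == c)[0]
            # Representative: member closest to the cluster centroid
            d = np.linalg.norm(features[idx] - km.cluster_centers_[c], axis=1)
            reps.append(idx[np.argmin(d)])
            rep_rates.append(rates[idx].sum())  # carry the cluster's total rate

        print(f"Reduced {n_scen} scenarios to {len(reps)} representatives")
        print("Total rate preserved:", np.isclose(sum(rep_rates), rates.sum()))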

  14. Seismic Hazard Analysis Using the Adaptive Kernel Density Estimation Technique for Chennai City

    Science.gov (United States)

    Ramanna, C. K.; Dodagoudar, G. R.

    2012-01-01

    The conventional method of probabilistic seismic hazard analysis (PSHA) using the Cornell-McGuire approach requires identification of homogeneous source zones as the first step. This criterion brings along many issues and, hence, several alternative methods of hazard estimation have come up in the last few years. Methods such as zoneless or zone-free methods and modelling of the earth's crust using numerical methods with finite element analysis have been proposed. Delineating a homogeneous source zone in regions of distributed and/or diffused seismicity is rather a difficult task. In this study, the zone-free method using the adaptive kernel technique for hazard estimation is explored for regions having distributed and diffused seismicity. Chennai city lies in such a region of low to moderate seismicity, so it has been used as a case study. The adaptive kernel technique is statistically superior to the fixed kernel technique primarily because the bandwidth of the kernel is varied spatially depending on the clustering or sparseness of the epicentres. Although the fixed kernel technique has proven to work well in general density estimation cases, it fails to perform in the case of multimodal and long-tailed distributions. In such situations, the adaptive kernel technique serves the purpose and is more relevant in earthquake engineering, as the activity rate probability density surface is multimodal in nature. The peak ground acceleration (PGA) obtained from all three approaches (i.e., the Cornell-McGuire approach, the fixed kernel technique, and the adaptive kernel technique) for 10% probability of exceedance in 50 years is around 0.087 g. Uniform hazard spectra (UHS) are also provided for different structural periods.
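
    To make the fixed-versus-adaptive distinction concrete, the sketch below implements a simple two-stage adaptive Gaussian kernel density estimate of epicentre density, in which bandwidths shrink where epicentres cluster and widen where they are sparse. The epicentre coordinates are random placeholders, not a Chennai-region catalogue, and the estimator is a generic Abramson-style construction rather than the exact formulation of the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        # Placeholder epicentres (x, y in km): two clusters plus scattered events
        epi = np.vstack([rng.normal([0, 0], 5, (60, 2)),
                         rng.normal([40, 25], 8, (30, 2)),
                         rng.uniform(-20, 60, (10, 2))])

        def gauss2(dx, dy, h):
            # Bivariate isotropic Gaussian kernel with bandwidth h
            return np.exp(-(dx**2 + dy**2) / (2 * h**2)) / (2 * np.pi * h**2)

        def adaptive_kde(points, grid_xy, h0=10.0):
            """Two-stage adaptive KDE: a fixed-bandwidth pilot estimate, then
            per-point bandwidths h_i = h0 * (pilot_i / geometric_mean)**-0.5."""
            dx = points[:, 0][:, None] - points[:, 0][None, :]
            dy = points[:, 1][:, None] - points[:, 1][None, :]
            pilot = gauss2(dx, dy, h0).mean(axis=1)      # pilot density at points
            lam = (pilot / np.exp(np.mean(np.log(pilot)))) ** -0.5
            h_i = h0 * lam                                # adaptive bandwidths
            gx = grid_xy[:, 0][:, None] - points[:, 0][None, :]
            gy = grid_xy[:, 1][:, None] - points[:, 1][None, :]
            return gauss2(gx, gy, h_i[None, :]).mean(axis=1)

        grid = np.array([[0.0, 0.0], [40.0, 25.0], [-15.0, 55.0]])
        for (x, y), d in zip(grid, adaptive_kde(epi, grid)):
            print(f"density at ({x:5.1f}, {y:5.1f}) km: {d:.5f} per km^2")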

  15. Cusum charts for preliminary analysis of individual observations

    NARCIS (Netherlands)

    A.J. Koning (Alex); R.J.M.M. Does (Ronald)

    1997-01-01

    A preliminary Cusum chart based on individual observations is developed from the uniformly most powerful test for the detection of linear trends. This Cusum chart is compared with several of its competitors which are based on the likelihood ratio test and on transformations of standardiz
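
    For orientation, a minimal tabular CUSUM for individual observations is sketched below, with reference value k and decision interval h expressed in multiples of the process standard deviation. This illustrates the general chart type only; it is not the uniformly most powerful construction developed in the paper.

        import numpy as np

        def cusum(x, target, sigma, k=0.5, h=5.0):
            """Tabular CUSUM for individual observations.
            k and h are in multiples of sigma (common defaults 0.5 and 5)."""
            z = (np.asarray(x, float) - target) / sigma
            c_plus = c_minus = 0.0
            signals = []
            for i, zi in enumerate(z):
                c_plus = max(0.0, c_plus + zi - k)
                c_minus = max(0.0, c_minus - zi - k)
                if c_plus > h or c_minus > h:
                    signals.append(i)
            return signals

        rng = np.random.default_rng(1)
        # In-control observations followed by a small upward trend (placeholder data)
        data = np.concatenate([rng.normal(10.0, 1.0, 20),
                               10.0 + 0.3 * np.arange(20) + rng.normal(0, 1.0, 20)])
        print("First signal at index:", cusum(data, target=10.0, sigma=1.0)[:1])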

  16. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada); Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)

    2016-09-15

    Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectra (UHS) obtained from the PSHA are characterized by high-frequency content which differs from the original plant design basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high-frequency effects of the UHS can be mitigated through site response analysis with site-specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building in terms of the High Confidence of Low Probability of Failure (HCLPF) capacity can be improved. (author)
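
    For reference, the HCLPF capacity quoted in fragility studies of this kind is commonly obtained from the median seismic capacity and the logarithmic standard deviations for randomness and uncertainty through the standard relation (a general fragility formula, not a result of this paper):

        \mathrm{HCLPF} = A_m \, e^{-1.645\,(\beta_R + \beta_U)}

    where A_m is the median ground-motion capacity, \beta_R and \beta_U are the logarithmic standard deviations for aleatory randomness and epistemic uncertainty, and 1.645 corresponds to the 95% confidence, 5% probability-of-failure point of the standard normal distribution.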

  17. Probabilistic Rockfall Hazard Analysis in the area affect by the Christchurch Earthquakes, New Zealand

    Science.gov (United States)

    Frattini, P.; Lari, S.; Agliardi, F.; Crosta, G. B.; Salzmann, H.

    2012-04-01

    To limit damage to human lives and property in case of natural disasters, land planning and zonation, as well as the design of countermeasures, are fundamental tools, requiring however a rigorous quantitative risk analysis. As a consequence of the 3rd September 2010 (Mw 7.1) Darfield Earthquake, and the 22nd February 2011 (Mw 6.2), the 16th April 2011 (Mw 5.3) and the 13th June 2011 (Mw 6.2) aftershock events, about 6000 rockfalls were triggered in the Port Hills of Christchurch, New Zealand. Five people were killed by falling rocks in the area, and several hundred homes were damaged or evacuated. In this work, we present a probabilistic rockfall hazard analysis for a small area located on the south-eastern slope of Richmond Hill (0.6 km2, Sumner, Christchurch, NZ). For the analysis, we adopted a new methodology (Probabilistic Rockfall Hazard Analysis, PRHA), which allows quantification of the exceedance probability that a given slope location is affected by a rockfall event with a specific level of kinetic energy, integrating the contribution of different rockfall magnitude (volume) scenarios. The methodology requires the calculation of onset annual frequency, rockfall runout, and spatially varying kinetic energy. Onset annual frequencies for different magnitude scenarios were derived from a frequency-magnitude relationship adapted from the literature. The probability distribution of kinetic energy for a given slope location and volume scenario was obtained by rockfall runout modeling of non-interacting blocks through the 3D Hy-Stone simulation code. The reference simulation was calibrated by back-analysis of rockfall events that occurred during the earthquakes. For each rockfall magnitude scenario, 20 rockfall trajectories were simulated for each source cell using stochastically variable values of restitution parameters. Finally, a probabilistic analysis integrating over six rockfall magnitude scenarios (ranging from 0.001 m3 to 1000 m3) was carried out to produce
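
    A minimal sketch of the integration over magnitude (volume) scenarios described above: onset annual frequencies are taken from an assumed power-law frequency-volume relation, combined with simulated probabilities that a block of each class reaches a given cell with kinetic energy above a threshold, and summed into a cell-level exceedance rate. The power-law coefficients and reach probabilities are invented placeholders, not results of the Richmond Hill study.

        import numpy as np

        # Rockfall volume scenarios (m^3), spanning the range used in the abstract
        volumes = np.array([0.001, 0.01, 0.1, 1.0, 10.0, 100.0, 1000.0])

        # Assumed power-law onset frequency N(>V) = a * V**-b, converted to
        # incremental rates per volume class (a and b are hypothetical values)
        a, b = 0.5, 0.7
        cum = a * volumes ** -b
        onset = np.append(cum[:-1] - cum[1:], cum[-1])   # events/year per class

        # From trajectory simulations: probability that a block of each class
        # reaches the cell of interest with kinetic energy above the threshold
        p_reach_above_E = np.array([0.0, 0.0, 0.02, 0.10, 0.35, 0.60, 0.80])

        rate_E = np.sum(onset * p_reach_above_E)   # exceedance rate at the cell
        print(f"Annual exceedance rate at cell: {rate_E:.3f} /yr "
              f"(return period ~{1 / rate_E:.0f} yr)")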

  18. Flooding hazards from sea extremes and subsidence

    DEFF Research Database (Denmark)

    Sørensen, Carlo; Vognsen, Karsten; Broge, Niels

    2015-01-01

    If we do not understand the effects of climate change and sea level rise (SLR) we cannot live in low-lying coastal areas in the future. Permanent inundation may become a prevalent issue, but more often floods related to extreme events have the largest damage potential, and the management of flooding hazards needs to integrate the water loading from various sources. Furthermore, local subsidence must be accounted for in order to evaluate current and future flooding hazards and management options. We present the methodology and preliminary results from the research project “Coastal Flooding Hazards due to Storm Surges and Subsidence” (2014-2017), with the objective to develop and test a practice-oriented methodology for combining extreme water level statistics and land movement in coastal flooding hazard mapping and in climate change adaptation schemes in Denmark. From extreme value analysis
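
    A minimal sketch of the kind of combination the project aims at: fit an extreme-value distribution to annual-maximum water levels and then shift the effective design level by the land movement accumulated over a planning horizon. The synthetic water levels, the Gumbel fit, and the subsidence rate are placeholder assumptions, not project data or the project's chosen statistical model.

        from scipy import stats

        # Placeholder annual-maximum sea levels (m above datum) at a coastal site
        annual_max = stats.gumbel_r.rvs(loc=1.2, scale=0.25, size=60, random_state=2)

        # Fit a Gumbel distribution and compute the 100-year return level
        loc, scale = stats.gumbel_r.fit(annual_max)
        level_100 = stats.gumbel_r.ppf(1.0 - 1.0 / 100.0, loc=loc, scale=scale)

        # Local subsidence lowers the land relative to sea level, so the effective
        # water level relative to the (sinking) ground increases over time.
        subsidence_mm_per_yr = 3.0   # hypothetical rate from levelling or InSAR
        horizon_yr = 50
        effective_level = level_100 + subsidence_mm_per_yr * 1e-3 * horizon_yr

        print(f"100-year still-water level today : {level_100:.2f} m")
        print(f"Relative to ground in {horizon_yr} years: {effective_level:.2f} m")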

  19. Pasteurised milk and implementation of HACCP (Hazard Analysis Critical Control Point)

    Directory of Open Access Journals (Sweden)

    T.B Murdiati

    2004-10-01

    The purpose of pasteurisation is to destroy pathogenic bacteria without affecting the taste, flavor, and nutritional value of milk. A study on the implementation of HACCP (Hazard Analysis Critical Control Point) in producing pasteurised milk was carried out in four processing units of pasteurised milk: one in Jakarta, two in Bandung, and one in Bogor. The critical control points in the production line were identified. Milk samples were collected from the critical points and were analysed for the total number of microbes. Antibiotic residues were detected in raw milk. The study indicated that one unit in Bandung and one unit in Jakarta produced pasteurised milk with a lower number of microbes than the other units, due to better management and control applied along the chain of production. Penicillin residues were detected in the raw milk used by the unit in Bogor. Six critical points, and the hazards that might arise at those points, were identified, as well as how to prevent the hazards. A quality assurance system such as HACCP would be able to ensure high quality and safety of pasteurised milk, and should be implemented gradually.

  20. Spatial hazard analysis and prediction on rainfall-induced landslide using GIS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The application of a landslide hazard model coupled with GIS provides an effective means for spatial hazard analysis and prediction of rainfall-induced landslides. A modified SINMAP model is established based upon a systematic investigation of previous GIS-based landslide analysis models. By integrating the landslide deterministic model with the hydrological distribution model based on DEM, this model studies in depth the effect of the underground water distribution due to rainfall on slope stability and landslide occurrence, including the effect of the dynamic water pressure resulting from the down-slope seepage process as well as that of the static water pressure. Its applicability has been tested in the Xiaojiang watershed, an area of Southeast China where rainfall-induced landslides are widespread. Detailed discussion was carried out on the spatial distribution characteristics of landslide hazard and its extending trend, as well as the quantitative relationship between landslide hazard and precipitation, slope angle, and specific catchment area in the Xiaojiang watershed. The precipitation threshold for landslide occurrence was also estimated. These analytical results are proved useful for geohazard control and engineering decision-making in the Xiaojiang watershed.
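
    For reference, a minimal sketch of the infinite-slope and wetness-index calculation that the standard SINMAP formulation builds on is given below; the paper's modified model additionally accounts for dynamic seepage pressure, which is not reproduced here, and all parameter values are illustrative placeholders.

        import numpy as np

        def sinmap_fs(slope_rad, contrib_area, transmissivity, recharge,
                      cohesion_dimless, phi_rad, water_soil_density_ratio=0.5):
            """Factor of safety from the infinite-slope model with a
            TOPMODEL-style wetness index, as used in standard SINMAP.
            All parameter values here are illustrative placeholders."""
            # Relative wetness: fraction of the soil profile that is saturated
            wetness = np.minimum(recharge * contrib_area /
                                 (transmissivity * np.sin(slope_rad)), 1.0)
            # Infinite-slope factor of safety
            return (cohesion_dimless +
                    np.cos(slope_rad) * (1.0 - wetness * water_soil_density_ratio) *
                    np.tan(phi_rad)) / np.sin(slope_rad)

        # Example cell: 30 degree slope, 500 m^2/m specific catchment area
        fs = sinmap_fs(slope_rad=np.radians(30.0), contrib_area=500.0,
                       transmissivity=50.0, recharge=0.05,
                       cohesion_dimless=0.1, phi_rad=np.radians(35.0))
        print(f"Factor of safety: {fs:.2f}")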

  1. Laser hazard analysis for airborne AURA (Big Sky variant) Proteus platform.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2004-02-01

    A laser safety and hazard analysis was performed for the airborne AURA (Big Sky Laser Technology) lidar system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for the Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for the Safe Use of Lasers Outdoors. The AURA lidar system is installed in the instrument pod of a Proteus airframe and is used to perform laser interaction experiments and tests at various national test sites. The targets are located at various distances or ranges from the airborne platform. In order to protect personnel who may be in the target area and may be subjected to exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength, calculate the Nominal Ocular Hazard Distance (NOHD), and determine the maximum 'eye-safe' dwell times for various operational altitudes and conditions. It was also necessary to calculate the appropriate minimum Optical Density (ODmin) of the laser safety eyewear used by authorized personnel who may receive hazardous exposures during ground-based operations of the airborne AURA laser system (system alignment and calibration).
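
    As a rough illustration of the type of calculation involved (not the report's actual inputs or results), the sketch below evaluates a nominal ocular hazard distance for a circular beam using the common small-source far-field relation, and the minimum optical density needed to bring an exposure below the MPE. Every numerical value is a hypothetical placeholder, and MPE values must be taken from the applicable ANSI Z136.1 tables.

        import math

        def nohd_m(power_w, mpe_w_cm2, beam_div_rad, aperture_cm):
            """Nominal Ocular Hazard Distance (m) for a circular beam, using the
            small-source relation NOHD = (sqrt(4*P/(pi*MPE)) - a) / theta."""
            d_cm = math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2)) - aperture_cm
            return max(d_cm, 0.0) / beam_div_rad / 100.0  # cm -> m

        def od_min(exposure_w_cm2, mpe_w_cm2):
            """Minimum optical density so the transmitted exposure is below the MPE."""
            return max(math.log10(exposure_w_cm2 / mpe_w_cm2), 0.0)

        # Hypothetical CW example: 1 W output, 1 mrad divergence, 1 cm exit
        # aperture, assumed MPE of 2.5e-3 W/cm^2 (placeholder only).
        print(f"NOHD  : {nohd_m(1.0, 2.5e-3, 1e-3, 1.0):.0f} m")
        print(f"OD_min: {od_min(0.5, 2.5e-3):.1f}")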

  2. Seismic hazard analysis of Sinop province, Turkey using probabilistic and statistical methods

    Indian Academy of Sciences (India)

    Recai Feyiz Kartal; Günay Beyhan; Ayhan Keskinsezer

    2014-04-01

    Using earthquakes of magnitude 4.0 and greater which occurred between 1 January 1900 and 31 December 2008 in the Sinop province of Turkey, this study presents a seismic hazard analysis based on probabilistic and statistical methods. According to the earthquake zonation map, Sinop is divided into first-, second-, third- and fourth-degree earthquake regions. Our study area covered the coordinates between 40.66°–42.82°N and 32.20°–36.55°E. The different magnitudes of the earthquakes during the last 108 years, recorded on varied scales, were converted to a common scale (Mw). The earthquake catalog was then recompiled to evaluate the potential seismic sources in the aforesaid province. Using the attenuation relationships given by Boore et al. (1997) and Kalkan and Gülkan (2004), the largest ground accelerations corresponding to a recurrence period of 475 years are found to be 0.14 g for bedrock at the central district. Comparing the seismic hazard curves, we show the spatial variations of seismic hazard potential in this province for a recurrence period of 475 years.
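
    The 475-year recurrence period quoted above is the standard design value corresponding to a 10% probability of exceedance in 50 years under a Poisson occurrence model; a short check of that equivalence:

        import math

        # Poisson model: P = 1 - exp(-t/T)  =>  T = -t / ln(1 - P)
        t_years, p_exceed = 50.0, 0.10
        return_period = -t_years / math.log(1.0 - p_exceed)
        print(f"Return period: {return_period:.0f} years")  # approximately 475 years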

  3. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    Science.gov (United States)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2017-01-01

    Ground motions are affected by directivity effects in near-fault regions, which result in low-frequency cycle pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to a significant increase in the risk of earthquake-induced damage to engineering structures. Ordinary probabilistic seismic hazard analysis (PSHA) does not take such effects into account; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop seismic hazard maps of Tehran City according to the near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region from which to derive purely empirically based pulse prediction models. The results show that directivity effects can significantly affect the estimate of regional seismic hazard.

  4. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, North-Western Italy

    Science.gov (United States)

    Turconi, L.; Tropeano, D.; Savio, G.; De, S. Kumar; Mason, P. J.

    2015-04-01

    A Civil Protection Plan has been drafted for a 600 km2 mountainous region in NW Italy consisting of the Orco and Soana valleys. It is part of the oldest natural park in Italy and attracts several thousand tourists every year. The work is concerned with the analysis of the relevant physiographic characteristics of this Alpine landscape, which has extremely variable geomorphology and a long history of instability. Thousands of records as well as digital maps (involving overlay and comparison of up to 90 GIS layers) have been analyzed and cross-correlated to find out the details of the events. The study area has experienced different types of natural hazards, typical of the whole Alpine environment. Thus, the present area has been selected for such multi-hazard research, in which several natural processes have been investigated concerning their damaging effects over the land. Due to 36 different severe hazardous events, at least 250 deaths have been recorded in the area since the 18th century, on the occasion of

  5. Criticality analysis for hazardous materials transportation; Classificacao da criticidade das rotas do transporte rodoviario de produtos perigosos da BRASKEM

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Katia; Brady, Mariana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Diniz, Americo [BRASKEM S.A., Sao Paulo, SP (Brazil)

    2008-07-01

    The poor condition of Brazilian roads drives companies to be more demanding about the transportation of hazardous materials, in order to avoid accidents or material releases, and to have actions in place to contain releases that could affect communities and water sources. To address this situation, DNV and BRASKEM developed a risk analysis methodology called Criticality Analysis for Hazardous Materials Transportation. The objective of this methodology is to identify the most critical points of the routes so that actions can be taken to avoid accidents. (author)

  6. Critical load analysis in hazard assessment of metals using a Unit World Model.

    Science.gov (United States)

    Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L

    2011-09-01

    A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), and the chemicals have a singular ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries for a UWM. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (K(d)) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations. Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic

  7. The hazard analysis and critical control point system in food safety.

    Science.gov (United States)

    Herrera, Anavella Gaitan

    2004-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series.

  8. Preliminary environmental analysis of a geopressured-geothermal test well in Brazoria County, Texas

    Energy Technology Data Exchange (ETDEWEB)

    White, W.A.; McGraw, M.; Gustavson, T.C.; Meriwether, J.

    1977-11-16

    Preliminary environmental data, including current land use, substrate lithology, soils, natural hazards, water resources, biological assemblages, meteorological data, and regulatory considerations, have been collected and analyzed for approximately 150 km2 of land near Chocolate Bayou, Brazoria County, Texas, in which a geopressured-geothermal test well is to be drilled in the fall of 1977. The study was designed to establish an environmental data base and to determine, within spatial constraints set by subsurface reservoir conditions, environmentally suitable sites for the proposed well. Preliminary analyses of the data revealed the need for focusing on the following areas: potential for subsidence and fault activation, susceptibility of the test well and support facilities to fresh- and salt-water flooding, possible effects of produced saline waters on biological assemblages and groundwater resources, distribution of expansive soils, and the effect of drilling and associated support activities on known archeological-cultural resources.

  9. Remote sensing and GIS-based landslide hazard analysis and cross-validation using multivariate logistic regression model on three test areas in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet

    2010-05-01

    This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis in the Penang, Cameron, and Selangor areas in Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map landcover and vegetation index, respectively. Maps of topography, soil type, lineaments, and land cover were constructed from the spatial datasets. Ten factors which influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, landcover, rainfall precipitation, and normalized difference vegetation index (ndvi), were extracted from the spatial database, and the logistic regression coefficient of each factor was computed. The landslide hazard was then analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also using the logistic regression coefficients calculated from each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification of the model, the results of the analyses were compared with the field-verified landslide locations. Among the three cases of applying the logistic regression coefficients in the same study area, the case of Selangor based on the Selangor logistic regression coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest accuracy (86%). Similarly, among the six cases from the cross-application of logistic regression coefficients in the other two areas, the case of Selangor based on the logistic coefficients of Cameron showed the highest prediction accuracy (90%), whereas the case of Penang based on the Selangor logistic regression coefficients showed the lowest accuracy (79%). Qualitatively, the cross
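
    A minimal sketch of the cross-area validation idea: fit a logistic regression on factor values from one area and score its predictions on another area. The data here are synthetic placeholders standing in for GIS-derived factor grids (categorical factors such as lithology, soil type, and landcover would additionally need encoding, which is omitted); the factor names and coefficients are not those of the paper.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        FACTORS = ["slope", "curvature", "dist_drainage", "dist_lineament",
                   "rainfall", "ndvi"]  # illustrative subset of the ten factors

        def synthetic_area(n, rng, shift=0.0):
            """Placeholder stand-in for the factor grid of one study area:
            continuous factor values plus a 0/1 landslide label."""
            X = rng.normal(shift, 1.0, size=(n, len(FACTORS)))
            logit = 1.5 * X[:, 0] - 1.0 * X[:, 2] + 0.8 * X[:, 4] - 0.5
            y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))
            return X, y.astype(int)

        rng = np.random.default_rng(4)
        X_train, y_train = synthetic_area(5000, rng)            # e.g. "Selangor"
        X_test, y_test = synthetic_area(5000, rng, shift=0.3)   # e.g. "Penang"

        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

        # Cross-apply the coefficients fitted in one area to the other area
        prob = model.predict_proba(X_test)[:, 1]
        print("Cross-area AUC:", round(roc_auc_score(y_test, prob), 3))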

  10. Current Mooring Design in Partner WECs and Candidates for Preliminary Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Ferri, Francesco; Kofoed, Jens Peter

    This report is the combined report of Commercial Milestone "CM1: Design and Cost of Current Mooring Solutions of Partner WECs" and Milestone "M3: Mooring Solutions for Preliminary Analysis" of the EUDP project "Mooring Solutions for Large Wave Energy Converters". The report covers a description of the current mooring design of the partner Wave Energy Converter (WEC) developers in the project, together with a preliminary cost estimate of the systems.

  11. Asymptotics on Semiparametric Analysis of Multivariate Failure Time Data Under the Additive Hazards Model

    Institute of Scientific and Technical Information of China (English)

    Huan-bin Liu; Liu-quan Sun; Li-xing Zhu

    2005-01-01

    Many survival studies record the times to two or more distinct failures on each subject. The failures may be events of different natures or may be repetitions of the same kind of event. In this article, we consider the regression analysis of such multivariate failure time data under the additive hazards model. Simple weighted estimating functions for the regression parameters are proposed, and the asymptotic distribution theory of the resulting estimators is derived. In addition, a class of generalized Wald and generalized score statistics for hypothesis testing and model selection is presented, and the asymptotic properties of these statistics are examined.
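
    For context, the additive hazards model referred to here specifies the conditional hazard as an unspecified baseline plus a linear term in the covariates; a minimal statement of the model for the k-th failure type on a subject (the paper's weighted estimating functions are not reproduced here) is

        \lambda_k(t \mid Z_k) = \lambda_{0k}(t) + \beta^{\top} Z_k(t), \qquad k = 1, \ldots, K,

    where \lambda_{0k}(t) is an unspecified baseline hazard function and \beta is the common vector of regression parameters estimated from the multivariate failure time data.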

  12. [Incorporation of the Hazard Analysis and Critical Control Point system (HACCP) in food legislation].

    Science.gov (United States)

    Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R

    2004-01-01

    The Hazard Analysis and Critical Control Point system (HACCP), recommended by different international organizations such as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE), and the International Plant Protection Convention (IPPC), among others, contributes to ensuring the safety of food along the agro-alimentary chain and requires Good Manufacturing Practices (GMP) for its implementation, which are legislated in most countries. Since 1997, Colombia has set rules and legislation for the application of the HACCP system in agreement with international standards. This paper discusses the potential and difficulties of enforcing the legislation and suggests some policy implications for food safety.

  13. Application of Hazard Analysis and Critical Control Points in Cherry Juice Processing Enterprises

    Directory of Open Access Journals (Sweden)

    Peilong Xu

    2015-09-01

    Qingdao is one of the homelands of cherry in China, and in recent years the deep-processing industry for cherry has been developing rapidly. In this study, the Hazard Analysis and Critical Control Points (HACCP) quality control system is introduced into the production process of cherry juice, which has effectively controlled food safety risks in the production process. The practice has proved that application of the HACCP system effectively reduced the probability of contamination in the cherry juice production process. The application of this risk control system in cherry juice production provides benefits for the standardization of the production process and helps food safety supervision in the production process.

  14. Preliminary Dynamic Soil-Structure-Interaction Analysis for the Waste Handling Building

    Energy Technology Data Exchange (ETDEWEB)

    G. Wagenblast

    2000-05-01

    The objective of this analysis package is to document a preliminary dynamic seismic evaluation of a simplified design concept of the Waste Handling Building (WHB). Preliminary seismic ground motions and soil data will be used. Loading criteria of the WHB System Design Description will be used. Detailed design of structural members will not be performed. The results of the analysis will be used to determine preliminary sizes of structural concrete and steel members and to determine whether the seismic response of the structure is within an acceptable level for future License Application design of safety-related facilities. In order to complete this preliminary dynamic evaluation to meet the Site Recommendation (SR) schedule, the building configuration was "frozen in time" as the conceptual design existed in October 1999. Modular design features and dry or wet waste storage features were intentionally excluded from this preliminary dynamic seismic evaluation. The document was prepared in accordance with the Development Plan for the "Preliminary Dynamic Soil Structure Interaction Analysis for the Waste Handling Building" (CRWMS M&O 2000b), which was completed in accordance with AP-2.13Q, "Technical Product Development Planning".

  15. Landslide and debris-flow hazard analysis and prediction using GIS in Minamata Hougawachi area, Japan

    Science.gov (United States)

    Wang, Chunxiang; Esaki, Tetsuro; Xie, Mowen; Qiu, Cheng

    2006-10-01

    On July 20, 2003, following a short duration of heavy rainfall, a debris-flow disaster occurred in the Minamata Hougawachi area, Kumamoto Prefecture, Japan. This disaster was triggered by a landslide. In order to assess the landslide and debris-flow hazard potential of this mountainous region, the study of historic landslides is critical. The objective of the study is to couple 3D slope-stability analysis models and a 2D numerical simulation of debris flow within a geographical information system in order to identify the potential landslide-hazard area. Based on field observations, the failure mechanism of the past landslide is analyzed and the mechanical parameters for 3D slope-stability analysis are calculated from the historic landslide. Then, to locate potential new landslides, the studied area is divided into slope units. Based on 3D slope-stability analysis models and on Monte Carlo simulation, the spots of potential landslides are identified. Finally, we propose a depth-averaged 2D numerical model, in which the debris and water mixture is assumed to be a uniform, continuous, incompressible, unsteady Newtonian fluid. The method accurately models the historic debris flow. According to the 2D numerical simulation, the results of the debris-flow model, including the potentially inundated areas, are analyzed, and potentially affected houses, the river, and the road are mapped.

  16. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steve [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ma, Zhegang [Idaho National Lab. (INL), Idaho Falls, ID (United States); Spears, Bob [Idaho National Lab. (INL), Idaho Falls, ID (United States); Szilard, Ronaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kosbab, Ben [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-07-26

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC) program, Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE code PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing a risk-informed external hazards safety analysis.

  17. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is an essential step also in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting the limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e., Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  18. Thermal Hydraulic Analysis of K-DEMO Single Blanket Module for Preliminary Accident Analysis using MELCOR

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Sung Bo; Bang, In Cheol [UNIST, Ulsan (Korea, Republic of)

    2016-05-15

    To develop a Korean fusion commercial reactor, a preliminary design concept for K-DEMO (Korean fusion demonstration reactor) has been announced by NFRI (National Fusion Research Institute). This pre-conceptual study of K-DEMO was introduced to identify the technical details of a fusion power plant for the future commercialization of fusion reactors in Korea. Before K-DEMO can be built, accident analysis is essential. Since the Fukushima accident, a severe accident triggered by an unexpected disaster, the safety analysis of nuclear power plants has become increasingly important. The safety analysis of both fission and fusion reactors is deemed crucial in demonstrating the low radiological effect of these reactors on the environment during severe accidents. A risk analysis of K-DEMO should be performed as a prerequisite for the construction of a fusion reactor. In this research, a thermal-hydraulic analysis of a single blanket module of K-DEMO is conducted as a preliminary accident analysis for K-DEMO, and the effect of the flow distributor is studied further. The normal K-DEMO operating condition is applied as the boundary condition and simulated to verify the material temperature limit using MELCOR. MELCOR is a fully integrated, relatively fast-running code developed by Sandia National Laboratories. MELCOR had been used for Light Water Reactors, and a fusion-reactor version of MELCOR was developed for ITER accident analysis. This study presents the thermal-hydraulic simulation of a single blanket module with MELCOR, a severe accident code used for nuclear fusion safety analysis. The difference in mass flow rate for each coolant channel with and without the flow distributor is presented. With the flow distributor, the advantages of a broader temperature gradient in the K-DEMO blanket module and increased mass flow toward the first wall are obtained, which can enhance the safety of the K-DEMO blanket module. A maximum temperature difference of 13 °C in the blanket module is obtained.

  19. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    Science.gov (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the Kramer and Mayfield procedure to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located
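    The performance-based combination at the core of the Kramer and Mayfield approach can be sketched as a double sum over the PGA-magnitude bins of a PSHA disaggregation: the annual rate of each bin multiplied by the conditional probability of liquefaction for that bin. In the sketch below, the disaggregated rates and the logistic conditional-probability model are illustrative placeholders, not the Cetin et al. (2004) or Boulanger and Idriss relationships used in the software described above.

```python
# Sketch of a performance-based liquefaction calculation: annual rate of liquefaction
# = sum over (PGA, M) bins of [annual rate of bin] * [P(liquefaction | PGA, M)].
import numpy as np

pga_bins = np.array([0.05, 0.10, 0.20, 0.40])   # g
mag_bins = np.array([5.0, 6.0, 7.0])
# Annual rates of (PGA, M) pairs from a disaggregation (rows: PGA, cols: M); made up
rate = np.array([[2e-2, 8e-3, 1e-3],
                 [6e-3, 3e-3, 6e-4],
                 [1e-3, 8e-4, 2e-4],
                 [1e-4, 9e-5, 4e-5]])

def p_liq_given(pga, mag):
    """Placeholder conditional probability of liquefaction (logistic in ln PGA and M)."""
    z = 3.0 * np.log(pga / 0.15) + 0.8 * (mag - 6.0)
    return 1.0 / (1.0 + np.exp(-z))

annual_rate_liq = sum(rate[i, j] * p_liq_given(pga_bins[i], mag_bins[j])
                      for i in range(pga_bins.size) for j in range(mag_bins.size))
print(f"annual rate of liquefaction ~ {annual_rate_liq:.2e}")
print(f"return period of liquefaction ~ {1.0 / annual_rate_liq:.0f} years")
```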

  20. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    Directory of Open Access Journals (Sweden)

    H. Apel

    2015-08-01

    Full Text Available Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. All hazards – fluvial, pluvial and combined – were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median
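    The pluvial part of such an analysis rests on a peak-over-threshold (POT) frequency estimation, which can be sketched in a few lines: fit a Generalized Pareto distribution to rainfall exceedances above a high threshold and invert it for a return level. The synthetic daily rainfall series below stands in for the Can Tho rain gauge data, and the threshold choice and fit are purely illustrative.

```python
# Sketch of a peak-over-threshold (POT) rainfall frequency analysis with a
# Generalized Pareto fit to excesses above the 98th-percentile threshold.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
years = 30
daily_rain = rng.gamma(shape=0.4, scale=12.0, size=365 * years)   # mm/day, synthetic

threshold = np.quantile(daily_rain, 0.98)
excess = daily_rain[daily_rain > threshold] - threshold
rate_per_year = excess.size / years                 # mean number of exceedances per year

c, loc, scale = genpareto.fit(excess, floc=0.0)     # location fixed at 0 for excesses

def return_level(T_years):
    """Threshold plus the GPD quantile matching a T-year exceedance on average."""
    p = 1.0 - 1.0 / (rate_per_year * T_years)       # per-event non-exceedance probability
    return threshold + genpareto.ppf(p, c, loc=0.0, scale=scale)

print(f"100-year daily rainfall ~ {return_level(100):.1f} mm")
```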

  1. Analysis of Flood Hazards for the Materials and Fuels Complex at the Idaho National Laboratory Site

    Energy Technology Data Exchange (ETDEWEB)

    Skaggs, Richard; Breithaupt, Stephen A.; Waichler, Scott R.; Kim, Taeyun; Ward, Duane L.

    2010-11-01

    Researchers at Pacific Northwest National Laboratory conducted a flood hazard analysis for the Materials and Fuels Complex (MFC) site located at the Idaho National Laboratory (INL) site in southeastern Idaho. The general approach for the analysis was to determine the maximum water elevation levels associated with the design-basis flood (DBFL) and compare them to the floor elevations at critical building locations. Two DBFLs for the MFC site were developed using different precipitation inputs: probable maximum precipitation (PMP) and 10,000 year recurrence interval precipitation. Both precipitation inputs were used to drive a watershed runoff model for the surrounding upland basins and the MFC site. Outflows modeled with the Hydrologic Engineering Centers Hydrologic Modeling System were input to the Hydrologic Engineering Centers River Analysis System hydrodynamic flood routing model.

  3. Strong Ground-Motion Prediction in Seismic Hazard Analysis: PEGASOS and Beyond

    Science.gov (United States)

    Scherbaum, F.; Bommer, J. J.; Cotton, F.; Bungum, H.; Sabetta, F.

    2005-12-01

    The SSHAC Level 4 approach to probabilistic seismic hazard analysis (PSHA), which could be considered to define the state of the art in PSHA using multiple expert opinions, has been fully applied only twice: first in the multi-year Yucca Mountain study and subsequently (2002-2004) in the PEGASOS project. The authors of this paper participated as ground-motion experts in this latter project, the objective of which was a comprehensive seismic hazard analysis for four nuclear power plant sites in Switzerland, considering annual exceedance frequencies down to 1/10,000,000. Following the SSHAC procedure, particular emphasis was put on capturing both the aleatory and epistemic uncertainties. As a consequence, ground motion prediction was performed by combining several empirical ground motion models within a logic tree framework, with the weights on each logic tree branch expressing the personal degree of belief of each ground-motion expert. In the present paper, we critically review the current state of ground motion prediction methodology in PSHA, in particular for regions of low seismicity. One of the toughest lessons from PEGASOS was that systematically and rigorously applying the laws of uncertainty propagation to all of the required conversions and adjustments of ground motion models exacts a huge price in the form of ever-growing aleatory variability. Once this path has been followed, these large sigma values will drive the hazard, particularly for low annual frequencies of exceedance. Therefore, from a post-PEGASOS perspective, the key issues in the context of ground-motion prediction for PSHA in the near future are to better understand the aleatory variability of ground motion and to develop suites of ground-motion prediction equations that employ the same parameter definitions. The latter is a global rather than a regional challenge, which might be a desirable long-term goal for projects similar to the PEER NGA (Pacific Earthquake Engineering Research Center, Next

  4. Geophysical techniques in the historical center of Venice (Italy): preliminary results from HVSR and multichannel analysis of surface waves

    Science.gov (United States)

    Trevisani, Sebastiano; Rocca, Michele; Boaga, Jacopo

    2014-05-01

    This presentation outlines the preliminary findings of an extensive seismic survey conducted in the historical center of Venice, Italy. The survey was conducted via noninvasive and low-cost seismic techniques based on surface wave analysis and microtremor methods, mainly using single-station horizontal-to-vertical spectral ratio techniques (HVSR) and multichannel analysis of surface waves in passive (ReMi) and active (MASW) configurations. The importance and fragility of the cultural heritage of Venice, coupled with its peculiar geological and geotechnical characteristics, stress the importance of a good knowledge of its geological architecture and seismic characteristics as an opportunity to improve restoration and conservation planning. Even though Venice is located in a relatively low seismic hazard zone, a local characterization of soil resonance frequencies and surficial shear-wave velocities could improve the planning of engineering interventions, furnishing important information on possible local effects related to seismic amplification and possible coupling between building and soil resonance frequencies. Specifically, we collected more than 50 HVSR single-station noise measurements and several passive and active multichannel surface wave surveys located in the historical center. In this work we report the characteristics of the seismic surveys (instrumentation, sampling geometry, etc.) and the preliminary findings of our analysis. Moreover, we briefly discuss the practical issues, mainly of a logistic nature, of conducting this kind of survey in a peculiar and crowded historic center such as Venice. Acknowledgments: Instrumentation acquired in relation to the project co-financed by Regione Veneto, POR-CRO, FESR, 2007-2013, action 1.1.1. "Supporto ad attività di ricerca, processi e reti di innovazione e alla creazione di imprese in settori a elevato contenuto tecnologico"
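    A bare-bones version of the single-station HVSR processing can be sketched as the ratio of the averaged horizontal amplitude spectra to the vertical amplitude spectrum; the resonance frequency is read from the peak of that ratio. The synthetic three-component noise below (with a 1.5 Hz signal injected into the horizontals) stands in for a real recording, and real processing would add windowing, spectral smoothing and averaging over many windows.

```python
# Sketch of an HVSR estimate: H/V amplitude-spectral ratio of three-component noise.
import numpy as np

fs = 100.0                       # sampling rate (Hz)
t = np.arange(0, 600, 1 / fs)    # 10 minutes of synthetic ambient noise
rng = np.random.default_rng(1)

vert = rng.normal(size=t.size)
north = rng.normal(size=t.size) + 2.0 * np.sin(2 * np.pi * 1.5 * t)        # 1.5 Hz peak
east = rng.normal(size=t.size) + 2.0 * np.sin(2 * np.pi * 1.5 * t + 0.7)

def amp_spectrum(x):
    """Amplitude spectrum of a Hann-tapered trace."""
    return np.abs(np.fft.rfft(x * np.hanning(x.size)))

freqs = np.fft.rfftfreq(t.size, d=1 / fs)
H = np.sqrt(0.5 * (amp_spectrum(north) ** 2 + amp_spectrum(east) ** 2))
V = amp_spectrum(vert)
hvsr = H / V

band = (freqs > 0.5) & (freqs < 10.0)
f0 = freqs[band][np.argmax(hvsr[band])]
print(f"estimated resonance frequency f0 ~ {f0:.2f} Hz")
```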

  5. Cusum charts for preliminary analysis of individual observations

    OpenAIRE

    1997-01-01

    A preliminary Cusum chart based on individual observations is developed from the uniformly most powerful test for the detection of linear trends. This Cusum chart is compared with several of its competitors which are based on the likelihood ratio test and on transformations of standardized recursive residuals, on which for instance the Q-chart methodology is based. It turns out that the new proposed Cusum chart is not only superior in the detection of linear trend out-of-control co...
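    The basic mechanics of a one-sided Cusum on individual observations are easy to sketch: accumulate deviations from the target beyond an allowance k and signal when the cumulative sum crosses a decision interval h. The chart below uses textbook-style settings on a synthetic series with a linear drift; it is a generic illustration, not the uniformly most powerful chart derived in the paper.

```python
# Sketch of an upper one-sided CUSUM for individual observations with a linear drift.
import numpy as np

rng = np.random.default_rng(7)
x = np.concatenate([rng.normal(10.0, 1.0, 25),                               # in control
                    10.0 + 0.3 * np.arange(25) + rng.normal(0.0, 1.0, 25)])  # drifting

target, sigma = 10.0, 1.0
k, h = 0.5 * sigma, 5.0 * sigma        # allowance and decision interval (common choices)

c_plus, signal_at = 0.0, None
for i, xi in enumerate(x):
    c_plus = max(0.0, c_plus + (xi - target - k))   # accumulate positive deviations
    if c_plus > h and signal_at is None:
        signal_at = i
print(f"upward CUSUM signals at observation {signal_at}")
```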

  6. Grid-connected ICES preliminary feasibility analysis and evaluation. Final report. Volume I. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-30

    A group of hospitals, clinics, research facilities, and medical education facilities, known as the HEAL Complex, was chosen as the site (in New Orleans) for the demonstration of a Grid-Connected Integrated Community Energy System (ICES). The contract work included a preliminary energy supply/demand assessment of the Demonstration Community, a preliminary feasibility analysis and conceptual design of a candidate Demonstration System, preliminary assessment of institutional factors, preparation of a detailed work management plan for subsequent phases of the demonstration program, firming-up of commitments from participating parties, and reporting thereon. This Phase I study has indicated that a central ICES plant producing steam, chilled water, and by-product electricity to serve the HEAL Complex is technically and economically feasible to the extent that Phase II, Detailed Feasibility and Preliminary Design, should be implemented. (MCW)

  7. Fire Hazard Analysis for the Cold Vacuum Drying facility (CVD) Facility

    CERN Document Server

    Singh, G

    2000-01-01

    The CVDF is a nonreactor nuclear facility that will process the Spent Nuclear Fuels (SNF) presently stored in the 105-KE and 105-KW SNF storage basins. Multi-canister overpacks (MCOs) will be loaded (filled) with K Basin fuel transported to the CVDF. The MCOs will be processed at the CVDF to remove free water from the fuel cells (packages). Following processing at the CVDF, the MCOs will be transported to the CSB for interim storage until a long-term storage solution can be implemented. This operation is expected to start in November 2000. A Fire Hazard Analysis (FHA) is required for all new facilities and all nonreactor nuclear facilities, in accordance with U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection. This FHA has been prepared in accordance with DOE 5480.7A and HNF-PRO-350, Fire Hazard Analysis Requirements. Additionally, requirements or criteria contained in DOE, Richland Operations Office (RL) RL Implementing Directive (RLID) 5480.7, Fire Protection, or other DOE documentation are cite...

  8. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    Science.gov (United States)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short
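    Once a synthetic catalog is available, turning it into the inputs of a hazard calculation is straightforward to sketch: count annual exceedance rates by magnitude and estimate a b-value. The exponentially distributed magnitudes below are a stand-in for an RSQSIM-type physics-based catalog, and the maximum-likelihood b-value estimator (Aki, 1965) is used for illustration.

```python
# Sketch of a frequency-magnitude summary of a simulated induced-seismicity catalog,
# extended down to M2 as discussed in the record.
import numpy as np

rng = np.random.default_rng(3)
duration_years = 10.0
m_min = 2.0
b_true = 1.1
mags = m_min + rng.exponential(scale=1.0 / (b_true * np.log(10.0)), size=5000)

# Maximum-likelihood b-value (Aki, 1965)
b_hat = np.log10(np.e) / (mags.mean() - m_min)

def annual_exceedance_rate(m):
    """Annual rate of events with magnitude >= m, taken straight from the catalog."""
    return np.count_nonzero(mags >= m) / duration_years

for m in (2.0, 3.0, 4.0):
    print(f"M>={m}: {annual_exceedance_rate(m):.1f} events/yr")
print(f"estimated b-value ~ {b_hat:.2f}")
```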

  9. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Science.gov (United States)

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. Other than microbial hazards, the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points. These steps are the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Since this product is not cooked during processing, finally, educating the consumer, by providing information on the label such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.

  10. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    OpenAIRE

    2015-01-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and doc...

  11. Turbine Fuels from Tar Sands Bitumen and Heavy Oil. Phase I. Preliminary Process Analysis.

    Science.gov (United States)

    1985-04-09

    Process Analysis. A. F. Talbot, V. Elanchenny, L. H. Finkel, A. Macris and J. P. Schwedock, Sun Tech, Inc., A Subsidiary of Sun Co., P.O. Box 1135, Marcus Hook...investigation be carried out in three discrete phases, as described below: Phase I - Preliminary process analysis includes an evaluation of the potential of

  12. A Preliminary Study on Gender Differences in Studying Systems Analysis and Design

    Science.gov (United States)

    Lee, Fion S. L.; Wong, Kelvin C. K.

    2017-01-01

    Systems analysis and design is a crucial task in system development and is included in a typical information systems programme as a core course. This paper presented a preliminary study on gender differences in studying a systems analysis and design course of an undergraduate programme. Results indicated that male students outperformed female…

  13. Incorporating Climate Change Projections into a Hydrologic Hazard Analysis for Friant Dam

    Science.gov (United States)

    Holman, K. D.; Novembre, N.; Sankovich-Bahls, V.; England, J. F.

    2015-12-01

    The Bureau of Reclamation's Dam Safety Office has initiated a series of pilot studies focused on exploring potential impacts of climate change on hydrologic hazards at specific dam locations across the Western US. Friant Dam, located in Fresno, California, was chosen for study because the site had recently undergone a high-level hydrologic hazard analysis using the Stochastic Event Flood Model (SEFM). SEFM is a deterministic flood-event model that treats input parameters as variables, rather than fixed values. Monte Carlo sampling allows the hydrometeorological input parameters to vary according to observed relationships. In this study, we explore the potential impacts of climate change on the hydrologic hazard at Friant Dam using historical and climate-adjusted hydrometeorological inputs to the SEFM. Historical magnitude-frequency relationships of peak inflow and reservoir elevation were developed at Friant Dam for the baseline study using observed temperature and precipitation data between 1966 and 2011. Historical air temperatures, antecedent precipitation, mean annual precipitation, and the precipitation-frequency curve were adjusted for the climate change study using the delta method to create climate-adjusted hydrometeorological inputs. Historical and future climate projections are based on the Bias-Corrected Spatially-Disaggregated CMIP5 dataset (BCSD-CMIP5). The SEFM model was run thousands of times to produce magnitude-frequency relationships of peak reservoir inflow, inflow volume, and reservoir elevation, based on historical and climate-adjusted inputs. Results suggest that peak reservoir inflow and peak reservoir elevation increase (decrease) for all return periods under mean increases (decreases) in precipitation, independently of changes in surface air temperature.
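    The delta-method adjustment referred to above amounts to scaling or shifting the historical inputs by change factors taken from climate projections before re-running the stochastic flood model. The sketch below applies an assumed precipitation change factor and temperature shift to an invented precipitation-frequency curve; the change factors are illustrative, not BCSD-CMIP5 values, and the SEFM itself is not reproduced here.

```python
# Sketch of a delta-method climate adjustment of SEFM-style hydrometeorological inputs.
import numpy as np

return_periods = np.array([2, 10, 100, 1000, 10000])       # years
hist_precip_freq = np.array([60, 110, 180, 260, 350])      # mm, historical frequency curve

precip_change_factor = 1.08     # assumed +8% change in mean precipitation
temperature_delta_c = 2.1       # assumed degC shift applied to historical temperatures

adjusted_precip_freq = hist_precip_freq * precip_change_factor
hist_temps_c = np.array([4.0, 8.5, 14.2])                   # example seasonal temperatures
adjusted_temps_c = hist_temps_c + temperature_delta_c

for rp, hist, adj in zip(return_periods, hist_precip_freq, adjusted_precip_freq):
    print(f"{rp:>6}-yr precipitation: {hist:6.1f} mm -> {adj:6.1f} mm")
print("adjusted temperatures (degC):", adjusted_temps_c)
```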

  14. Microbiological quality of food in relation to hazard analysis systems and food hygiene training in UK catering and retail premises.

    Science.gov (United States)

    Little, C L; Lock, D; Barnes, J; Mitchell, R T

    2003-09-01

    A meta-analysis of eight UK food studies was carried out to determine the microbiological quality of food and its relationship with the presence in food businesses of hazard analysis systems and food hygiene training. Of the 19,022 premises visited to collect food samples in these studies between 1997 and 2002, two thirds (66%) were catering premises and one third (34%) were retail premises. Comparison with PHLS Microbiological Guidelines revealed that significantly more ready-to-eat food samples from catering premises (20%; 2,511/12,703) were of unsatisfactory or unacceptable microbiological quality compared to samples from retail premises (12%; 1,039/8,462). Significantly more unsatisfactory or unacceptable samples came from premises where the manager had received no food hygiene training than from premises where the manager had received food hygiene training (11% retail, 19% catering), and from premises where there was no hazard analysis system in place than from premises that had a documented hazard analysis system in place (10% retail, 18% catering). The poorer microbiological quality of samples from catering premises compared with those collected from retail premises may reflect differences in management food hygiene training and the presence of a hazard analysis system. The importance of adequate training for food handlers and their managers as a pre-requisite for effective hazard analysis and critical control point (HACCP) based controls is therefore emphasised.

  15. Regional analysis assessment of landslide hazard and zoning map for transmission line route selection using GIS

    Science.gov (United States)

    Baharuddin, I. N. Z.; Omar, R. C.; Usman, F.; Mejan, M. A.; Abd Halim, M. K.; Zainol, M. A.; Zulkarnain, M. S.

    2013-06-01

    The stability of the ground as a foundation for infrastructure development is always associated with geological and geomorphological aspects. Failure to carefully analyse these aspects may induce ground instability such as subsidence and landslides, which can eventually cause catastrophic damage to the infrastructure, e.g. instability of transmission towers. However, in some cases, such as the study area, this is unavoidable. A GIS-based route analysis was favoured to perform optimal route selection by incorporating multiple influence factors, in particular a Landslide Hazard Map (LHM) produced on the basis of the slope map, aspect map, land use map and geological map with the help of ArcGIS using the weighted overlay method. Based on the LHM, it is safe to conclude that the proposed route for the Ulu Jelai-Neggiri-Lebir-LILO transmission line has very low risk in terms of landslides.
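    The weighted overlay step itself reduces to a cell-by-cell weighted sum of reclassified factor rasters. The tiny grids and weights below are illustrative only; the study's actual layers, weights and ArcGIS workflow are not reproduced.

```python
# Sketch of a weighted-overlay landslide hazard map from reclassified 1-5 factor scores.
import numpy as np

slope    = np.array([[1, 2, 4, 5], [1, 2, 3, 5], [2, 3, 3, 4], [1, 1, 2, 3]])
aspect   = np.array([[2, 2, 3, 3], [1, 2, 3, 4], [2, 2, 3, 3], [1, 2, 2, 3]])
land_use = np.array([[1, 1, 2, 4], [1, 2, 2, 4], [2, 3, 3, 5], [1, 1, 2, 2]])
geology  = np.array([[3, 3, 4, 4], [2, 3, 4, 5], [2, 2, 3, 4], [1, 2, 2, 3]])

weights = {"slope": 0.4, "aspect": 0.1, "land_use": 0.2, "geology": 0.3}   # sum to 1.0

hazard = (weights["slope"] * slope + weights["aspect"] * aspect
          + weights["land_use"] * land_use + weights["geology"] * geology)

print(np.round(hazard, 2))
print("low-hazard cells (score < 2):", int((hazard < 2).sum()))
```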

  16. Application of a Data Mining Model and It's Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; land use from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. The landslide hazard indices were then calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using the artificial neural network model derived not only from the data for each area itself but also using the weights for each parameter calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.

  17. Evaluation of the Potential of NASA Multi-satellite Precipitation Analysis in Global Landslide Hazard Assessment

    Science.gov (United States)

    Hong, Yang; Adler, Robert F.; Huffman, George J.

    2007-01-01

    Landslides are one of the most widespread natural hazards on Earth, responsible for thousands of deaths and billions of dollars in property damage every year. In the U.S. alone, landslides occur in every state, causing an estimated $2 billion in damage and 25-50 deaths each year. The annual average loss of life from landslide hazards in Japan is 170. The situation is much worse in developing countries and remote mountainous regions due to lack of financial resources and inadequate disaster management ability. Recently, a landslide buried an entire village on the Philippine island of Leyte on Feb 17, 2006, with at least 1800 reported deaths and only 3 houses left standing of the original 300. Intense storms with high-intensity, long-duration rainfall have great potential to trigger rapidly moving landslides, resulting in casualties and property damage across the world. In recent years, through the availability of remotely sensed datasets, it has become possible to conduct global-scale landslide hazard assessment. This paper evaluates the potential of the real-time NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA) system to advance our understanding of and predictive ability for rainfall-triggered landslides. Early results show that landslide occurrences are closely associated with the spatial patterns and temporal distribution of rainfall characteristics. In particular, the number of landslide occurrences and the relative importance of rainfall in triggering landslides rely on the influence of rainfall attributes (e.g. rainfall climatology, antecedent rainfall accumulation, and intensity-duration of rainstorms). TMPA precipitation data are available in both real-time and post-real-time versions, which are useful to assess the location and timing of rainfall-triggered landslide hazards by monitoring landslide-prone areas while receiving heavy rainfall. For the purpose of identifying rainfall-triggered landslides, an empirical global rainfall intensity
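    A common way to use such rainfall data for landslide alerting is a rainfall intensity-duration (I-D) threshold of the form I = alpha * D^(-beta): storms plotting above the curve are flagged. The alpha and beta values and the storm list in the sketch below are assumed for illustration and are not the TMPA-calibrated global threshold described in the record.

```python
# Sketch of an intensity-duration threshold check for landslide-triggering rainfall.
alpha, beta = 12.0, 0.6          # assumed threshold parameters (mm/h and dimensionless)

storms = [                        # (duration in hours, mean intensity in mm/h)
    (3, 18.0),
    (12, 6.5),
    (24, 2.0),
    (48, 3.5),
]

for duration_h, intensity_mmh in storms:
    threshold = alpha * duration_h ** (-beta)
    flag = "ALERT" if intensity_mmh >= threshold else "ok"
    print(f"D={duration_h:>3} h, I={intensity_mmh:4.1f} mm/h, "
          f"threshold={threshold:4.1f} mm/h -> {flag}")
```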

  18. Approach of fuzzy logic in the preliminary risk analysis of the upstream and downstream lines of an offshore petroleum production unit

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Claudio B. [PETROBRAS Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Pinho, Edson [Universidade Federal Rural do Rio de Janeiro (UFRRJ), Seropedica, RJ (Brazil); Maia Neto, Luiz

    2009-07-01

    This work applies a qualitative risk assessment model based on fuzzy logic to judge the criticality of the accident scenarios identified through the technique of preliminary hazard analysis in the upstream and downstream lines of an offshore oil production unit already in operation. The fuzzy logic model acts as a substitute for the traditional risk matrix, which uses subjective concepts for the categories of expected severity and frequency of accidents. The structure of the employed model consists of 7 input variables, an internal variable and an output variable, all linked in accordance with the analysis modules for each type of accident. The knowledge base that completes the expert system consists of membership functions developed for each of the variables and a set of 219 inference rules distributed across the 7 different modules. The knowledge base, which incorporates the logical reasoning of specialists, efficiently assists and guides the teams that carry out the preliminary hazard analyses, using a computer program with previously inserted routines. The model incorporates into the program's knowledge base the concepts present in the frequency and severity categories, in the form of membership functions of the linguistic variables and the set of rules. With this, scales subdivided into ranges, defined on the basis of the guidance present in the existing risk matrices, are used to define the actions to be taken for the analyzed accident scenarios. (author)
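    The fuzzy alternative to a risk matrix can be sketched with triangular membership functions, a couple of Mamdani-style rules and a centroid defuzzification. The two inputs, two rules and membership functions below are a toy illustration; the model described above uses 7 inputs and 219 rules, which are not reproduced here.

```python
# Sketch of fuzzy criticality assessment: fuzzify two inputs, fire two rules,
# aggregate the output sets and defuzzify by centroid.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b, evaluated by interpolation."""
    return np.interp(x, [a, b, c], [0.0, 1.0, 0.0])

freq, sev = 0.7, 0.8             # normalized frequency and severity inputs in [0, 1]

# "low" peaks at 0 and "high" peaks at 1, overlapping across the whole range
freq_low, freq_high = tri(freq, -1.0, 0.0, 1.0), tri(freq, 0.0, 1.0, 2.0)
sev_low,  sev_high  = tri(sev,  -1.0, 0.0, 1.0), tri(sev,  0.0, 1.0, 2.0)

# Rule 1: IF frequency is high AND severity is high THEN criticality is high
# Rule 2: IF frequency is low  AND severity is low  THEN criticality is low
w_high = min(freq_high, sev_high)
w_low  = min(freq_low, sev_low)

crit = np.linspace(0.0, 1.0, 101)
agg = np.maximum(np.minimum(w_high, tri(crit, 0.0, 1.0, 2.0)),
                 np.minimum(w_low,  tri(crit, -1.0, 0.0, 1.0)))
criticality = float((agg * crit).sum() / agg.sum())    # centroid defuzzification
print(f"defuzzified criticality ~ {criticality:.2f}")
```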

  19. Health care system hazard vulnerability analysis: an assessment of all public hospitals in Abu Dhabi.

    Science.gov (United States)

    Fares, Saleh; Femino, Meg; Sayah, Assaad; Weiner, Debra L; Yim, Eugene Sun; Douthwright, Sheila; Molloy, Michael Sean; Irfan, Furqan B; Karkoukli, Mohamed Ali; Lipton, Robert; Burstein, Jonathan L; Mazrouei, Mariam Al; Ciottone, Gregory

    2014-04-01

    Hazard vulnerability analysis (HVA) is used to risk-stratify potential threats, measure the probability of those threats, and guide disaster preparedness. The primary objective of this project was to analyse the level of disaster preparedness in public hospitals in the Emirate of Abu Dhabi, utilising the HVA tool in collaboration with the Disaster Medicine Section at Harvard Medical School. The secondary objective was to review each facility's disaster plan and make recommendations based on the HVA findings. Based on the review, this article makes eight observations, including on the need for more accurate data; better hazard assessment capabilities; enhanced decontamination capacities; and the development of hospital-specific emergency management programmes, a hospital incident command system, and a centralised, dedicated regional disaster coordination centre. With this project, HVAs were conducted successfully for the first time in health care facilities in Abu Dhabi. This study thus serves as another successful example of multidisciplinary emergency preparedness processes. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.

  20. Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment

    Science.gov (United States)

    Catelli, J.; Nong, S.

    2014-12-01

    Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the United States Gulf of Mexico and East Coast are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic US landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations were performed using spatial analysis in a geographic information system (GIS) at both the SLOSH basin and the synthetic event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to utilize extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach that moves raster data into SciPy/NumPy array structures using the Python programming language. The data are then connected through a Python library to an outside statistical package such as R to fit cell values to extreme value theory distributions and return values for specified recurrence intervals. While this is not a new process, the value behind this work is the ability to keep this process in a single geospatial environment and to be able to easily replicate it for other natural hazard applications and extreme event modeling.
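    The per-cell extreme value step can be sketched directly in Python: take the stack of maximum surge elevations across synthetic events for one cell, fit a GEV distribution, and read off the 1% and 0.2% annual-probability levels. The synthetic surge values below stand in for the SLOSH-derived event grids, and scipy is used here in place of the external call to R described in the record.

```python
# Sketch of the per-cell extreme value fit and 100-yr / 500-yr return levels.
import numpy as np
from scipy.stats import genextreme

# Annualized maximum surge elevations (m) for one grid cell, synthetic
cell_annual_maxima = genextreme.rvs(c=-0.1, loc=1.2, scale=0.4, size=200, random_state=42)

c, loc, scale = genextreme.fit(cell_annual_maxima)

for annual_prob in (0.01, 0.002):                  # 100-yr (1%) and 500-yr (0.2%) events
    level = genextreme.ppf(1.0 - annual_prob, c, loc=loc, scale=scale)
    print(f"{1 / annual_prob:>5.0f}-yr surge elevation ~ {level:.2f} m")
```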

  1. Analysis of aerosol emission and hazard evaluation of electrical discharge machining (EDM) process.

    Science.gov (United States)

    Jose, Mathew; Sivapirakasam, S P; Surianarayanan, M

    2010-01-01

    The safety and environmental aspects of a manufacturing process are important due to increased environmental regulations and life quality. In this paper, the concentration of aerosols in the breathing zone of the operator of Electrical Discharge Machining (EDM), a commonly used non-traditional manufacturing process, is presented. The pattern of aerosol emissions from this process with varying process parameters such as peak current, pulse duration, dielectric flushing pressure and the level of dielectric was evaluated. Further, the HAZOP technique was employed to identify the inherent safety aspects and fire risk of the EDM process under different working conditions. The analysis of aerosol exposure showed that the concentration of aerosol increased with increasing peak current, pulse duration and dielectric level, and decreased with increasing flushing pressure. It was also found that at higher values of peak current (7 A) and pulse duration (520 µs), the concentration of aerosols in the breathing zone of the operator was above the permissible exposure limit value for respirable particulates (5 mg/m³). The HAZOP study of the EDM process showed that this process is vulnerable to fire and explosion hazards. A detailed discussion on preventing the fire and explosion hazard is presented in this paper. The emission and risk of fire of the EDM process can be minimized by selecting proper process parameters and employing an appropriate control strategy.

  2. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    Science.gov (United States)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who already have mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multichannel Analysis of Surface Waves (MASW) at

  3. Off-Road Terrain Traversability Analysis and Hazard Avoidance for UGVs

    Science.gov (United States)

    2011-01-01

    vehicle to perform hazard detection and avoidance at speeds of up to 10 mph (4.5 m/s), as long as the hazards can be detected at sufficient ranges. The ... ranges of hazard detection in this data set are provided in Table I (maximum detection range per hazard feature, e.g. steep slope, steep hill); Figure 10 is a Google sky-view image of the off-road course. ... FUTURE WORK: We have noticed that even in off-road environments, there is usually some

  4. Grand Junction projects office mixed-waste treatment program, VAC*TRAX mobile treatment unit process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bloom, R.R.

    1996-04-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  5. A regional analysis of elements at risk exposed to mountain hazards in the Eastern European Alps

    Science.gov (United States)

    Fuchs, Sven; Zischg, Andreas

    2014-05-01

    We present a method to quantify the number and value of buildings exposed to torrents and snow avalanches in the Austrian Alps, as well as the number of exposed people. Based on a unique population and building register dataset, a relational SQL database was developed that, in combination with GIS data, allows a rule-based nation-wide automated analysis. Furthermore, possibilities and challenges are discussed with respect to the use of such data in vulnerability assessment and with respect to resilience measures. We comprehensively address the challenge of data accuracy, scale and uncertainties. Of the total of approximately 2.4 million buildings with a clearly attributable geographical location, around 120,000 are exposed to torrent processes (5 %) and snow avalanches (0.4 %); exposure was defined here as being located within the digitally available hazard maps of the Austrian Torrent and Avalanche Control Service. Around 5 % of the population (360,000 out of 8.5 million inhabitants), based on those people compulsorily listed in the population register, are located in these areas. The analysis according to building category resulted in 2.05 million residential buildings in Austria (85 %), 93,000 of which (4.5 %) are exposed to these hazards. In contrast, 37,300 buildings (1.6 %) throughout the country belong to the category of accommodation facilities, 5,600 of which are exposed (15 %). Out of the 140,500 commercial buildings, 8,000 (5 %) are exposed. A considerable spatial variation was detectable within the communities and Federal States. In general, an above-average exposure of buildings to torrent processes and snow avalanches was detectable in communities located in the Federal States of Salzburg, Styria and Vorarlberg (torrents), and Tyrol and Vorarlberg (snow avalanches). In the alpine part of Austria, the share of exposed accommodation buildings was two times (Salzburg) and three times (Vorarlberg) higher than the regional average of exposed buildings

  6. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    Science.gov (United States)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in the planning of risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These could be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS is developed based on open source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular: the various parts of the system are loosely coupled, extensible, use standards for interoperability, and are flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  7. Thick Concrete Specimen Construction, Testing, and Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, Dwight A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoegh, Kyle [Univ. of Minnesota, Minneapolis, MN (United States); Khazanovich, Lev [Univ. of Minnesota, Minneapolis, MN (United States)

    2015-03-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations. A preliminary report detailed some of the challenges associated with thick reinforced concrete sections and prioritized conceptual designs of specimens that could be fabricated to represent NPP concrete structures for using in NDE evaluation comparisons. This led to the construction of the concrete specimen presented in this report, which has sufficient reinforcement density and cross-sectional size to represent an NPP containment wall. Details on how a suitably thick concrete specimen was constructed are presented, including the construction materials, final nominal design schematic, as well as formwork and rigging required to safely meet the desired dimensions of the concrete structure. The report also details the type and methods of forming the concrete specimen as well as information on how the rebar and simulated defects were embedded. Details on how the resulting specimen was transported, safely anchored, and marked to allow access for systematic comparative NDE testing of defects in a representative NPP containment wall concrete specimen are also given. Data collection using the MIRA Ultrasonic NDE equipment and

  8. Analysis and evaluation of "noise" of occupational hazards in pumped storage power station

    Science.gov (United States)

    Zhao, Xin; Yang, Hongjian; Zhang, Huafei; Chen, Tao

    2017-05-01

    To assess the influence of occupational noise hazards on the physical health of workers, the noise intensity in the working areas of a pumped storage hydropower station in China was comprehensively evaluated. Under power generation conditions, noise measurements were taken along the operators' main patrol areas, and the noise samples from different regions were analysed using single-factor analysis of variance. The results show that noise intensity differs significantly between working areas: the overall noise level of the turbine layer is the highest and exceeds the national standard, so protection measures need to be strengthened, while the noise intensity in the other areas is normal.
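    The single-factor analysis of variance used here is the standard one-way ANOVA, which can be sketched in a few lines; the dB(A) readings below are invented, not the station's measured data.

```python
# Sketch of a one-way ANOVA comparing noise samples from different plant areas.
from scipy.stats import f_oneway

turbine_layer = [96.2, 97.8, 95.4, 98.1, 96.9]     # dB(A), illustrative readings
generator_layer = [88.5, 87.9, 89.2, 88.1, 88.8]
control_room = [62.3, 61.8, 63.0, 62.5, 61.9]

stat, p_value = f_oneway(turbine_layer, generator_layer, control_room)
print(f"F = {stat:.1f}, p = {p_value:.3g}")
if p_value < 0.05:
    print("noise intensity differs significantly between areas")
```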

  9. Expressed breast milk on a neonatal unit: a hazard analysis and critical control points approach.

    Science.gov (United States)

    Cossey, Veerle; Jeurissen, Axel; Thelissen, Marie-José; Vanhole, Chris; Schuermans, Annette

    2011-12-01

    With the increasing use of human milk and growing evidence of the benefits of mother's milk for preterm and ill newborns, guidelines to ensure its quality and safety are an important part of daily practice in neonatal intensive care units. Operating procedures based on hazard analysis and critical control points can standardize the handling of mother's expressed milk, thereby improving nutrition and minimizing the risk of breast milk-induced infection in susceptible newborns. Because breast milk is not sterile, microorganisms can multiply when the milk is not handled properly. Additional exogenous contamination should be prevented. Strict hygiene and careful temperature and time control are important during the expression, collection, transport, storage, and feeding of maternal milk. In contrast to formula milk, no legal standards exist for the use of expressed maternal milk. The need for additional measures, such as bacteriological screening or heat treatment, remains unresolved.

  10. Developing Sustainable Modeling Software and Necessary Data Repository for Volcanic Hazard Analysis -- Some Lessons Learnt

    Science.gov (United States)

    Patra, A. K.; Connor, C.; Webley, P.; Jones, M.; Charbonnier, S. J.; Connor, L.; Gallo, S.; Bursik, M. I.; Valentine, G.; Hughes, C. G.; Aghakhani, H.; Renschler, C. S.; Kosar, T.

    2014-12-01

    We report here on an effort to improve the sustainability, robustness and usability of the core modeling and simulation tools housed in the collaboratory VHub.org and used in the study of complex volcanic behavior. In particular, we focus on tools that support large-scale mass flows (TITAN2D), ash deposition/transport and dispersal (Tephra2 and PUFF), and lava flows (Lava2). These tools have become very popular in the community, especially due to the availability of an online usage modality. The redevelopment of the tools to take advantage of new hardware and software advances was a primary thrust for the effort. However, as work started we reoriented the effort to also take advantage of significant new opportunities for supporting the complex workflows and use of distributed data resources that will enable effective and efficient hazard analysis.

  11. Assessment of hygiene standards and Hazard Analysis Critical Control Points implementation on passenger ships.

    Science.gov (United States)

    Mouchtouri, Varavara; Malissiova, Eleni; Zisis, Panagiotis; Paparizou, Evina; Hadjichristodoulou, Christos

    2013-01-01

    The level of hygiene on ferries can have an impact on travellers' health. The aim of this study was to assess the hygiene standards of ferries in Greece and to investigate whether Hazard Analysis Critical Control Points (HACCP) implementation contributes to the hygiene status, and particularly food safety, aboard passenger ships. Hygiene inspections on 17 ferries in Greece were performed using a standardized inspection form with a 135-point scale. Thirty-four water and 17 food samples were collected and analysed. About 65% (11/17) of the ferries scored >100 points. Ferries with HACCP received higher scores during inspection than those without HACCP, suggesting that HACCP implementation contributes to hygiene aboard passenger ships.

  12. [The Hazard Analysis Critical Control Point approach (HACCP) in meat production].

    Science.gov (United States)

    Berends, B R; Snijders, J M

    1994-06-15

    The Hazard Analysis Critical Control Point (HACCP) approach is a method that could transform the current system of safety and quality assurance of meat into a really effective and flexible integrated control system. This article discusses the origin and the basic principles of the HACCP approach. It also discusses why the implementation of the approach is not as widespread as might be expected. It is concluded that a future implementation of the approach in the entire chain of meat production, i.e. from conception to consumption, is possible. Prerequisites are, however, that scientifically validated risk analyses become available, that future legislation forms a framework that actively supports the approach, and that all parties involved in meat production not only become convinced of the advantages, but also are trained to implement the HACCP approach with insight.

  13. Pathogen Reduction and Hazard Analysis and Critical Control Point (HACCP) systems for meat and poultry. USDA.

    Science.gov (United States)

    Hogue, A T; White, P L; Heminover, J A

    1998-03-01

    The United States Department of Agriculture (USDA) Food Safety Inspection Service (FSIS) adopted Hazard Analysis and Critical Control Point Systems and established finished product standards for Salmonella in slaughter plants to improve food safety for meat and poultry. In order to make significant improvements in food safety, measures must be taken at all points in the farm-to-table chain including production, transportation, slaughter, processing, storage, retail, and food preparation. Since pathogens can be introduced or multiplied anywhere along the continuum, success depends on consideration and comparison of intervention measures throughout the continuum. Food animal and public health veterinarians can create the necessary preventative environment that mitigates risks for food borne pathogen contamination.

  14. [Monitoring of a HACCP (Hazard Analysis Critical Control Point) plan for Listeria monocytogenes control].

    Science.gov (United States)

    Mengoni, G B; Apraiz, P M

    2003-01-01

    The monitoring of a HACCP (Hazard Analysis Critical Control Point) plan for Listeria monocytogenes control in the cooked and frozen meat section of a thermo-processing meat plant was evaluated. Seventy "non-product-contact" surface samples and fourteen finished product samples were examined. Thirty-eight sites positive for the presence of Listeria sp. were obtained. Twenty-two isolates were identified as L. monocytogenes, two as L. seeligeri and fourteen as L. innocua. No isolates were obtained from finished product samples. The detection of L. monocytogenes in the cooked and frozen meat section environment showed the need for the HACCP plan to eliminate or prevent product contamination in the post-thermal step.

  15. Validation of acid washes as critical control points in hazard analysis and critical control point systems.

    Science.gov (United States)

    Dormedy, E S; Brashears, M M; Cutter, C N; Burson, D E

    2000-12-01

    A 2% lactic acid wash used in a large meat-processing facility was validated as an effective critical control point (CCP) in a hazard analysis and critical control point (HACCP) plan. We examined the microbial profiles of beef carcasses before the acid wash, beef carcasses immediately after the acid wash, beef carcasses 24 h after the acid wash, beef subprimal cuts from the acid-washed carcasses, and ground beef made from acid-washed carcasses. Total mesophilic counts, psychrotrophic counts, coliforms, generic Escherichia coli, lactic acid bacteria, pseudomonads, and acid-tolerant microorganisms were enumerated on all samples. The presence of Salmonella spp. was also determined. Acid washing significantly reduced all counts except for pseudomonads, which were present at very low numbers before acid washing. All other counts remained significantly lower through further processing, indicating that acid washes can serve as effective CCPs in HACCP plans and can significantly reduce the total number of microorganisms present on the carcass and during further processing.
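
    The following minimal Python sketch illustrates the kind of before/after comparison on which a CCP validation of this type rests: log10 reductions computed from plate counts taken before and after an acid wash. The counts and organism groups are hypothetical placeholders, not data from the study.

        # Illustrative log10-reduction calculation from hypothetical plate counts.
        import math

        before_cfu_per_cm2 = {"aerobic plate count": 3.2e3, "coliforms": 4.0e1, "generic E. coli": 2.5e1}
        after_cfu_per_cm2  = {"aerobic plate count": 1.0e2, "coliforms": 2.0e0, "generic E. coli": 1.0e0}

        for organism in before_cfu_per_cm2:
            reduction = math.log10(before_cfu_per_cm2[organism] / after_cfu_per_cm2[organism])
            print(f"{organism}: {reduction:.1f} log10 reduction")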

  16. 230Th/U ages Supporting Hanford Site‐Wide Probabilistic Seismic Hazard Analysis

    Science.gov (United States)

    Paces, James B.

    2014-01-01

    This product represents a USGS Administrative Report that discusses the samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different middle to late Pleistocene surfaces in the Hanford area. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.
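
    For readers unfamiliar with the dating method named in the title, the sketch below numerically solves a standard form of the 230Th/U age equation for an age from measured activity ratios. The equation form, decay constants and example inputs are assumptions drawn from the general U-series literature, not values taken from this USGS report.

        # Hedged sketch: solving a standard 230Th/U age equation by root-finding.
        # Decay constants and example ratios are assumed literature-style values,
        # not data from the report.
        import math
        from scipy.optimize import brentq

        LAM_230 = 9.1577e-6   # 230Th decay constant, 1/yr (assumed)
        LAM_234 = 2.8263e-6   # 234U decay constant, 1/yr (assumed)

        def age_equation(t, th230_u238, d234u_measured):
            """Residual of the age equation at trial age t (years)."""
            rhs = (1.0 - math.exp(-LAM_230 * t)
                   + (d234u_measured / 1000.0)
                   * (LAM_230 / (LAM_230 - LAM_234))
                   * (1.0 - math.exp(-(LAM_230 - LAM_234) * t)))
            return rhs - th230_u238

        def th_u_age_ka(th230_u238, d234u_measured):
            """Return the age in ka by bracketing the root between 1 yr and 600 ka."""
            t = brentq(age_equation, 1.0, 600_000.0, args=(th230_u238, d234u_measured))
            return t / 1000.0

        # Hypothetical measured activity ratio and delta-234U value:
        print(round(th_u_age_ka(0.15, 150.0), 1), "ka")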

  17. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Paces, James B. [U.S. Geological Survey

    2014-08-31

    This product represents a USGS Administrative Report that discusses the samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different middle to late Pleistocene surfaces in the Hanford area. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  18. IMPORTANCE OF APPLICATION OF HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP IN MONTENEGRO TOURISM

    Directory of Open Access Journals (Sweden)

    Vesna Vujacic

    2014-01-01

    Tourism is the leading economic sector in Montenegro, and a culinary product, food, is an important element of the tourist offer. With the development of tourism in Montenegro there is a need to provide quality as well as safe and healthy food according to international standards. This paper presents the concept of HACCP and the importance of its application in the tourism and hospitality industry. HACCP is a food safety management system based on the analysis and control of biological, chemical and physical hazards in the entire process, from raw material production, procurement and handling to manufacturing, distribution and consumption of the finished product. With its preventive principles, HACCP presents the most effective solution for providing safe and healthy food. The aim of this paper is to present the importance of applying the HACCP concept in the tourism of Montenegro as a recognizable and accepted international standard.

  19. Prediction of gas pressurization and hydrogen generation for shipping hazard analysis: six unstabilized PuO2 samples

    Energy Technology Data Exchange (ETDEWEB)

    Moody, E. W. (Eddie W.); Veirs, D. K. (Douglas Kirk); Lyman, J. L. (John L.)

    2001-01-01

    Radiolysis of water to form hydrogen gas is a safety concern for the safe storage and transport of plutonium-bearing materials. Hydrogen gas is considered a safety hazard if its concentration in the container exceeds five percent by volume (DOE Docket No. 00-11-9965). Unfortunately, water cannot be entirely avoided in a processing environment, and these samples inherently contain a range of water contents. Thermodynamic, chemical, and radiolysis modeling was used to predict gas generation and changes in gas composition as a function of time within sealed containers holding plutonium-bearing materials. The results are used in support of the safety analysis for shipping six unstabilized (i.e., uncalcined) samples from the Rocky Flats Environmental Technology Site (RFETS) to the Material Identification and Surveillance (MIS) program at Los Alamos National Laboratory (LANL). The intent of this work is to establish a time window in which safe shipping can occur.
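
    As a rough illustration of the gas-generation problem described above (and not the thermodynamic/radiolysis model actually used), the sketch below makes a zeroth-order ideal-gas estimate of hydrogen pressurization from an assumed radiolytic G-value and decay power. All numerical inputs are invented placeholders.

        # Hedged sketch: zeroth-order estimate of H2 partial pressure in a sealed
        # container from radiolysis of adsorbed water. G-value, decay power, fill
        # fraction and free volume are illustrative assumptions only.
        AVOGADRO = 6.022e23          # molecules/mol
        R = 8.314                    # J/(mol K)
        EV_PER_J = 6.242e18          # eV per joule

        def h2_pressure_kpa(days, decay_power_w=2.0, frac_absorbed_by_water=0.01,
                            g_h2=1.3, free_volume_l=2.0, temp_k=300.0):
            """Very rough ideal-gas estimate of the H2 partial pressure (kPa)."""
            seconds = days * 86400.0
            energy_to_water_ev = decay_power_w * frac_absorbed_by_water * seconds * EV_PER_J
            molecules_h2 = g_h2 * energy_to_water_ev / 100.0   # G-value is per 100 eV
            moles_h2 = molecules_h2 / AVOGADRO
            volume_m3 = free_volume_l / 1000.0
            return moles_h2 * R * temp_k / volume_m3 / 1000.0

        print(round(h2_pressure_kpa(365), 1), "kPa of H2 after one year (illustrative)")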

  20. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  1. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  2. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    Science.gov (United States)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are one of the key factors in a country's economic growth. Inadequate infrastructure networks can be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can also trigger other natural hazards such as landslides and debris flows. Disruptions caused by landslides are similar to those of floods and increase maintenance costs considerably. The effect of natural disasters on society is likely to increase under a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least-cost-path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historic incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.

  3. GIS-supported geomorphological landslide hazard analysis in the Lainbach catchment, Upper Bavaria

    Science.gov (United States)

    Trau, J.; Ergenzinger, P.

    2003-04-01

    The Lainbach basin is located at the fringe of the Northern Limestone Alps. Predominant mass movements such as translational and rotational slides as well as debris flows are mainly linked to glacial deposits (Pleistocene valley fill) and Flysch series covering approximately 50% of the basin. The pre-Pleistocene relief is buried beneath up to 170 m of till and glacio-limnic and glacio-fluvial sediments. The spatial and temporal distributions of mass movements are coupled with different stages of fluvial incision. Recent fluvial processes are mainly bedrock controlled in the lower reaches. A special geomorphological map at a scale of 1:10,000 illustrates the relief evolution. In addition, the map focuses on past and recent process forms related to mass movements, so that areas of active and inactive mass movements can be easily distinguished. Zones of activity and the hazard potential can be deduced from the map. Hazard assessment is supported by GIS modelling, DEM analysis, multi-temporal time series analysis and aerial photo interpretation. Geophysical soundings are important for detailed site-specific information such as shear planes and sediment thickness. A GIS model based on the parameters geology, topography (slope angle, curvature), thickness of loosely consolidated material, vegetation and hydrology (proximity to the receiving stream) was developed. Calculation of failure rates allows a specific value to be assigned to each parameter class, indicating its role in the mass movement process. About 90% of the mapped mass movements were correctly classified by the model. Although the overall match seems to be quite good, there are some localities where the modelled and the mapped results differ significantly. In the future, the mapped results should be considered together with further “expert knowledge” to improve the GIS model.

  4. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state of the art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state of the art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation on state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.
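
    To make the hazard-integral idea underlying PSHA concrete, the sketch below computes the annual rate of exceeding a ground-motion level for a single source by combining a truncated Gutenberg-Richter magnitude distribution with a placeholder lognormal ground-motion model. All coefficients and rates are illustrative assumptions and are not part of the SSHAC guidance itself.

        # Minimal single-source PSHA sketch (illustrative only).
        import numpy as np
        from scipy.stats import norm

        def gr_pdf(m, b=1.0, m_min=5.0, m_max=7.5):
            """Truncated exponential (Gutenberg-Richter) magnitude density."""
            beta = b * np.log(10.0)
            c = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
            return np.where((m >= m_min) & (m <= m_max), c, 0.0)

        def prob_exceed(pga_g, m, r_km, sigma_ln=0.6):
            """Placeholder GMPE: ln(PGA[g]) = -3.5 + 0.8*M - 1.2*ln(R+10), lognormal scatter."""
            ln_median = -3.5 + 0.8 * m - 1.2 * np.log(r_km + 10.0)
            return 1.0 - norm.cdf((np.log(pga_g) - ln_median) / sigma_ln)

        def annual_exceedance_rate(pga_g, nu=0.05, r_km=25.0):
            """nu is the annual rate of M >= m_min events; integrate over magnitude."""
            m = np.linspace(5.0, 7.5, 500)
            dm = m[1] - m[0]
            return nu * float(np.sum(gr_pdf(m) * prob_exceed(pga_g, m, r_km)) * dm)

        for pga in (0.1, 0.2, 0.4):
            print(f"PGA > {pga:.1f} g: {annual_exceedance_rate(pga):.2e} per year")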

  5. Rapid, reliable geodetic data analysis for hazard response: Results from the Advanced Rapid Imaging and Analysis (ARIA) project

    Science.gov (United States)

    Owen, S. E.; Simons, M.; Hua, H.; Yun, S.; Cruz, J.; Webb, F.; Rosen, P. A.; Fielding, E. J.; Moore, A. W.; Polet, J.; Liu, Z.; Agram, P. S.; Lundgren, P.

    2013-12-01

    ARIA is a joint JPL/Caltech coordinated project to automate InSAR and GPS imaging capabilities for scientific understanding, hazard response, and societal benefit. Geodetic imaging's unique ability to capture surface deformation at high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with earthquakes in high spatial and temporal detail. In certain cases, it can be complementary to seismic data, providing constraints on location, geometry, or magnitude that are difficult to determine with seismic data alone. In addition, remote sensing with SAR provides change detection and damage assessment capabilities for earthquakes, floods and other disasters, and can image even at night or through clouds. We have built an end-to-end prototype geodetic imaging data system that forms the foundation for a hazard response and science analysis capability that integrates InSAR, high-rate GPS, seismology, and modeling to deliver monitoring, science, and situational awareness products. This prototype incorporates state-of-the-art InSAR and GPS analysis algorithms from technologists and scientists. The products have been designed, and a feasibility study conducted, in collaboration with USGS scientists in the earthquake and volcano science programs. We will present results that show the capabilities of this data system in terms of latency, data processing capacity, quality of automated products, and feasibility of use for analysis of large SAR and GPS data sets and for earthquake response activities.

  6. Preliminary Analysis of Species Partitioning in the DWPF Melter

    Energy Technology Data Exchange (ETDEWEB)

    Choi, A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Kesterson, M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Johnson, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-07-15

    The work described in this report is preliminary in nature since its goal was to demonstrate the feasibility of estimating the off-gas entrainment rates from the Defense Waste Processing Facility (DWPF) melter based on a simple mass balance using measured feed and glass pour stream compositions and time-averaged melter operating data over the duration of one canister-filling cycle. The only case considered in this study involved the SB6 pour stream sample taken while Canister #3472 was being filled over a 20-hour period on 12/20/2010, approximately three months after the bubblers were installed. The analytical results for that pour stream sample provided the necessary glass composition data for the mass balance calculations. To estimate the “matching” feed composition, which is not necessarily the same as that of the Melter Feed Tank (MFT) batch being fed at the time of pour stream sampling, a mixing model was developed involving three preceding MFT batches as well as the one being fed at that time, based on the assumption of perfect mixing in the glass pool but with an induction period to account for the process delays involved in the calcination/fusion step in the cold cap and in the melter turnover.

  7. Preliminary Coupling of MATRA Code for Multi-physics Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seongjin; Choi, Jinyoung; Yang, Yongsik; Kwon, Hyouk; Hwang, Daehyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    The boundary conditions, such as the inlet temperature, mass flux, averaged heat flux, power distributions of the rods, and core geometry, are given as constant values or functions of time. These conditions are separately calculated and provided to the MATRA code by other codes, such as neutronics or system codes. This work focuses on the coupling of several codes from different physics fields. In this study, multi-physics coupling methods were developed for a subchannel code (MATRA) with neutronics codes (MASTER, DeCART) and a fuel performance code (FRAPCON-3). Preliminary evaluation results for representative sample cases are presented. The MASTER and DeCART codes provide the power distribution of the rods in the core to the MATRA code. In the case of the FRAPCON-3 code, the variation of the rod diameter induced by thermal expansion is calculated and provided. The MATRA code transfers the thermal-hydraulic conditions that each code needs. Moreover, the coupling method with each code is described.
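
    The kind of code-to-code data exchange described above can be pictured as a fixed-point (Picard) iteration between a neutronics solver and a thermal-hydraulics solver. The sketch below uses two stand-in functions to show only the loop structure; it does not reproduce the MATRA, MASTER, DeCART or FRAPCON-3 interfaces.

        # Conceptual multi-physics coupling loop (stand-in solvers, not real codes).
        import numpy as np

        def neutronics_power(fuel_temp_k):
            """Stand-in: rod power falls slightly as fuel temperature rises (Doppler-like feedback)."""
            base = np.array([1.10, 1.05, 1.00, 0.95, 0.90])      # relative rod powers
            return base * (1.0 - 2.0e-5 * (fuel_temp_k - 900.0))

        def thermal_hydraulics(rel_power, inlet_temp_k=565.0):
            """Stand-in: fuel temperature scales linearly with local relative power."""
            return inlet_temp_k + 400.0 * rel_power

        fuel_temp = np.full(5, 900.0)
        for it in range(50):
            power = neutronics_power(fuel_temp)          # neutronics -> T/H: power map
            new_temp = thermal_hydraulics(power)         # T/H -> neutronics: temperatures
            if np.max(np.abs(new_temp - fuel_temp)) < 0.01:   # convergence check, K
                break
            fuel_temp = new_temp

        print(f"converged in {it + 1} iterations; rod powers = {np.round(power, 4)}")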

  8. Laboratory Investigations on Estuary Salinity Mixing: Preliminary Analysis

    Directory of Open Access Journals (Sweden)

    F. H. Nuryazmeen

    2014-05-01

    Estuaries are bodies of water along the coasts that are formed when fresh water from rivers flows into and mixes with salt water from the ocean. Estuaries serve as a habitat for aquatic life, including mangroves. Human-induced activities such as dredging of shipping lanes along the estuarine bottom, disposal of industrial wastes into the water system, and shoreline development influence estuarine dynamics, including the mixing process. These activities may contribute to salinity changes and further adversely affect the estuarine ecosystem. In order to study the characteristics of the mixing between salt water (estuary) and fresh water (river), a preliminary investigation was carried out in the laboratory. Fresh water was released from one end of the flume and overflowed at a weir at the other end. Salt water, represented by a red dye tracer, was released through a weir and intruded upstream as a gravity current. Isohalines were plotted to show the salinity patterns, and graphs were prepared to examine the spatial and temporal salinity profiles along the flume. The results show changes in salinity level along the flume due to mixing between fresh water and salt water, displaying typical salt-wedge estuary characteristics.

  9. An OSHA based approach to safety analysis for nonradiological hazardous materials

    Energy Technology Data Exchange (ETDEWEB)

    Yurconic, M.

    1992-08-01

    The PNL method for chemical hazard classification defines major hazards by means of a list of hazardous substances (or chemical groups) with associated trigger quantities. In addition, the functional characteristics of the facility being classified are also factored into the classification. In this way, installations defined as major hazards will be only those which have the potential for causing very serious incidents both on and off site. Because of the diversity of operations involving chemicals, it may not be possible to restrict major hazard facilities to certain types of operations. However, this hazard classification method recognizes that in the industrial sector major hazards are most commonly associated with activities involving very large quantities of chemicals and inherently energetic processes. These include operations like petrochemical plants, chemical production, LPG storage, explosives manufacturing, and facilities which use chlorine, ammonia, or other highly toxic gases in bulk quantities. The basis for this methodology is derived from concepts used by OSHA in its proposed chemical process safety standard, the Dow Fire and Explosion Index Hazard Classification Guide, and the International Labor Office's program on chemical safety. For the purpose of identifying major hazard facilities, this method uses two sorting criteria, (1) facility function and processes and (2) quantity of substances, to identify facilities requiring classification. Then, a measure of chemical energy potential (material factor) is used to identify high hazard class facilities.

  10. An OSHA based approach to safety analysis for nonradiological hazardous materials

    Energy Technology Data Exchange (ETDEWEB)

    Yurconic, M.

    1992-08-01

    The PNL method for chemical hazard classification defines major hazards by means of a list of hazardous substances (or chemical groups) with associated trigger quantities. In addition, the functional characteristics of the facility being classified are also factored into the classification. In this way, installations defined as major hazards will be only those which have the potential for causing very serious incidents both on and off site. Because of the diversity of operations involving chemicals, it may not be possible to restrict major hazard facilities to certain types of operations. However, this hazard classification method recognizes that in the industrial sector major hazards are most commonly associated with activities involving very large quantities of chemicals and inherently energetic processes. These include operations like petrochemical plants, chemical production, LPG storage, explosives manufacturing, and facilities which use chlorine, ammonia, or other highly toxic gases in bulk quantities. The basis for this methodology is derived from concepts used by OSHA in its proposed chemical process safety standard, the Dow Fire and Explosion Index Hazard Classification Guide, and the International Labor Office's program on chemical safety. For the purpose of identifying major hazard facilities, this method uses two sorting criteria, (1) facility function and processes and (2) quantity of substances, to identify facilities requiring classification. Then, a measure of chemical energy potential (material factor) is used to identify high hazard class facilities.

  11. Hazardous Waste

    Science.gov (United States)

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  12. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
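
    A typical building block of such an uncertainty analysis is the sampling of uncertain input parameters, for example by Latin hypercube sampling as sketched below. The parameter names, distributions and ranges are illustrative assumptions, not the parameters actually selected in the SNL scoping study.

        # Hedged sketch: Latin hypercube sampling of a few hypothetical
        # core-melt-progression parameters for an uncertainty study.
        from scipy.stats import qmc

        params = {
            "clad_failure_temp_K":            (2100.0, 2540.0),   # uniform range (assumed)
            "debris_porosity":                (0.30, 0.50),
            "molten_clad_drain_rate_kg_m_s":  (0.1, 2.0),
            "dc_battery_life_hr":             (4.0, 12.0),
        }

        sampler = qmc.LatinHypercube(d=len(params), seed=42)
        unit_samples = sampler.random(n=20)                       # 20 realizations
        lows = [lo for lo, hi in params.values()]
        highs = [hi for lo, hi in params.values()]
        scaled = qmc.scale(unit_samples, lows, highs)

        for i, row in enumerate(scaled[:3]):                      # show the first 3 input sets
            print(f"realization {i}: " + ", ".join(
                f"{name}={val:.3g}" for name, val in zip(params, row)))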

  13. Large landslides in the Pyrenees: preliminary tasks carried out for a harmonized cross-border risk analysis

    Science.gov (United States)

    Moya, José; Grandjean, Gilles; Copons, Ramon; Vaunat, Jean; Buxó, Pere; Colas, Bastien; Darrozes, José; Gasc, Muriel; Guinau, Marta; Gutiérrez, Francisco; García, Juan Carlos; Virely, Didier; Crosetto, Michele; Mas, Raül

    2017-04-01

    Large landslides are recognised as one of the main erosional agents in mountain ranges, having a significant influence on landscape evolution. However, few efforts have been made to assess their geomorphological impact from a regional perspective. Regional-scale investigations are also necessary for the reliable evaluation of the associated risks (i.e. for land-use planning). Large landslides are common in the Pyrenees but: 1) their geographic distribution on a regional scale is not well known; 2) their geological and geomorphological controlling factors have been studied only in a preliminary way; and 3) their state of activity and stability conditions are unknown in most cases. Regional analyses of large landslides, such as those carried out by Crosta et al. (2013) in the Alps, are rare worldwide. Jarman et al. (2014) conducted a very preliminary analysis in a sector of the Pyrenees. The construction of a cartographic inventory constitutes the basis for this type of study, which is typically hindered by the lack of cross-border landslide databases and methodologies. The aim of this contribution is to present the preliminary work carried out to construct a harmonized inventory of large landslides in the Pyrenees, involving for the first time both sides of the cordillera and the main groups working on landslide risk in France, Spain and Andorra. Methods used for landslide hazard and risk analysis have been compiled and compared, showing significant divergence, even as regards terminology. A preliminary cross-border inventory sheet on the risk of large landslides has been prepared. It includes specific fields for the assessment of landslide activity (using complementary methods such as morpho-stratigraphy, morphometric analysis and remote sensing techniques) and indirect potential costs (which typically exceed direct costs), both of which are usually neglected in existing databases. Crosta, G.B., Frattini, P. and Agliardi, F., 2013. Deep seated gravitational

  14. Preliminary Analysis of a Novel SAR Based Emergency System for Earth Orbit Satellites using Galileo

    NARCIS (Netherlands)

    Gill, E.K.A.; Helderweirt, A.

    2010-01-01

    This paper presents a preliminary analysis of a novel Search and Rescue (SAR) based emergency system for Low Earth Orbit (LEO) satellites using the Galileo Global Navigation Satellite System (GNSS). It starts with a description of the space user SAR system including a concept description, mission ar

  15. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  16. Preliminary Analysis on Matric Suction for Barren Soil

    Science.gov (United States)

    Azhar, A. T. S.; Fazlina, M. I. S.; Aziman, M.; Fairus, Y. M.; Azman, K.; Hazreek, Z. A. M.

    2016-11-01

    Most slope failures investigated in previous research can broadly be attributed to the convergence of three factors: rainfall, steepness of slope, and soil geological profile. The mechanism of failure is mainly the loss of matric suction of soils by rainwater. When rainwater infiltrates into a slope, it starts to saturate the soil, i.e., to reduce the matric suction. A good understanding of landslide mechanisms and of the characteristics of unsaturated soil and rock in tropical areas is crucial in landslide hazard formulation. Most of the slope failures in unsaturated tropical residual soil in Malaysia are due mainly to infiltration, especially during intense and prolonged rainfall, which reduces the soil matric suction and hence decreases the stability of the slope. Therefore, the aim of this research was to determine the matric suction for barren soil and to model an unsaturated slope under natural rainfall to evaluate the effects of rainfall intensity on matric suction. A field test was carried out using the Watermark Soil Moisture Sensor to determine the matric suction. The sensor was connected to a program called SpecWare 9 Basic, which also used a Watermark 1120 data-logging rain gauge to measure the intensity and duration of rainfall. This study was conducted at the Research Centre for Soft Soil, a new Research and Development (R & D) initiative of Universiti Tun Hussein Onn Malaysia, Parit Raja. Field observation showed that the highest daily suction was recorded at noon, while the lowest suction was obtained at night and in the early morning. The highest matric suction for the loose condition was 31.0 kPa, while the highest matric suction for the compacted condition was 32.4 kPa. The results imply that the field suction variation was governed not only by the rainfall but also by the cyclic evaporation process. The findings clearly indicate that changes in soil suction distribution patterns occurred due to different weather conditions.

  17. Preliminary Analysis of the Oklahoma Wavefields Demonstration Dataset

    Science.gov (United States)

    Anderson, K. R.; Sweet, J. R.; Woodward, R.; Karplus, M. S.; DeShon, H. R.; Magnani, M. B.; Hayward, C.; Langston, C. A.

    2016-12-01

    In June 2016, a field crew of 50 students, faculty, industry personnel and IRIS staff deployed a total of 390 stations as part of a community seismic experiment above an active seismic lineament in north-central Oklahoma. The goals of the experiment were to test new instrumentation and deployment strategies that record the full wavefield, and to advance understanding of earthquake source processes and regional lithospheric structure. The crew deployed 363 3C 4.5Hz Generation 2 Fairfield Z-Land nodes along three seismic lines and in a seven-layer nested gradiometer array. The seismic lines spanned a region 13 km long by 5 km wide. The nested gradiometer was designed to measure the full seismic wavefield using standard frequency-wavenumber techniques and spatial wave gradients. A broadband, 18 station "Golay 3x6" array was deployed around the gradiometer and seismic lines with an aperture of approximately 5 km to collect waveform data from local and regional events. In addition, 9 infrasound stations were deployed in order to capture and identify acoustic events that might be recorded by the seismic arrays and to quantify the wind acoustic noise effect on co-located broadband stations. The variety of instrumentation used in this deployment was chosen to capture the full seismic wavefield generated by the local and regional seismicity beneath the array and the surrounding region. We present preliminary results from the data collected during the experiment. We analyze the level of signal coherence observed across the nested gradiometer and Golay array as well as array design fidelity. We report on data quality, including completeness and noise levels, for the various types of instrumentation. We also examine the performance of co-located surface and buried nodes to determine the benefits of each installation type. Finally, we present performance comparisons between co-located nodes and broadband stations and compare these results to prior wavefield/large-N deployments

  18. Preliminary analysis techniques for ring and stringer stiffened cylindrical shells

    Science.gov (United States)

    Graham, J.

    1993-03-01

    This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.

  19. Preliminary Evaluation of Occupational Hazards on Passenger Vehicle Assembly Lines

    Institute of Scientific and Technical Information of China (English)

    王多多

    2012-01-01

    Objective: To investigate the occupational hazards in two passenger vehicle enterprises and to provide a reference for developing preventive measures against occupational diseases. Methods: The preliminary evaluation was conducted in accordance with the Technical Guidelines for Pre-assessment of Occupational Hazards in Construction Projects (GBZ/T 196-2007). Results: Manganese and its inorganic compounds, xylene, welding fume, noise, and high temperature with heat radiation may be present in the passenger vehicle production process. Conclusions: A variety of occupational hazards, including several highly toxic substances, exist in the passenger vehicle production process, so the project is classified as one with serious occupational hazards. The critical control points are noise in the stamping and welding workshops, welding fume and manganese dioxide in the welding workshop, and organic solvents such as benzene-series compounds in the paint workshop. Because a wide range of workers come into contact with these hazards, individual protection must also be strengthened even where the occupational hazard protective measures are in place.

  20. Vulnerability analysis of Landslide hazard area: Case study of South Korea

    Science.gov (United States)

    Oh, Chaeyeon; Jun, Kyewon; Kim, Younghwan

    2017-04-01

    Recently, sedimentation disasters in mountainous areas, such as landslides and debris flows, have been occurring more frequently due to climate change. A scientific analysis of landslide risk areas, along with the collection and analysis of a variety of spatial information, is critical for minimizing damage in the event of mountainous disasters such as landslides and debris flows. We carried out a case study of selected areas at Inje, Gangwon province, which suffered serious landslides due to flash floods caused by Typhoon Ewiniar in 2006. Landslide and debris flow locations were identified in the study area from interpretation of airborne imagery and field surveys. We used GIS to construct a spatial information database integrating the data required for a comprehensive analysis of landslide risk areas, including geography, hydrology, pedology, and forestry. Furthermore, this study evaluates the slope stability of the affected areas using SINMAP (Stability Index Mapping) and analyzes the spatial data that are highly correlated with the mapped landslide areas using the likelihood ratio. By applying the weight-of-evidence technique, weight values (W+ and W-) were calculated for each evidence class. We then analyzed the spatial data that were significantly correlated with landslide occurrence, predicted the mountainous areas with elevated landslide risk that are vulnerable to disasters, and generated the hazard map using GIS. Acknowledgments: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (No. NRF-2014R1A1A3050495).
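
    The weight-of-evidence values (W+ and W-) mentioned above can be computed from simple cell counts, as in the hedged sketch below. The counts and the epsilon smoothing are illustrative assumptions, not the Inje study tabulations.

        # Minimal weight-of-evidence sketch for one evidence class (e.g. a slope-angle bin).
        import math

        def weights(n_class_and_slide, n_class, n_slide, n_total, eps=0.5):
            """Return (W_plus, W_minus); eps avoids log(0) for empty cells."""
            a = n_class_and_slide + eps                     # class present, landslide present
            b = n_class - n_class_and_slide + eps           # class present, no landslide
            c = n_slide - n_class_and_slide + eps           # class absent, landslide present
            d = n_total - n_class - n_slide + n_class_and_slide + eps
            w_plus = math.log((a / (a + c)) / (b / (b + d)))    # ln[P(B|D) / P(B|not D)]
            w_minus = math.log((c / (a + c)) / (d / (b + d)))   # ln[P(not B|D) / P(not B|not D)]
            return w_plus, w_minus

        # Hypothetical counts: 10,000 cells, 400 landslide cells, a slope bin covering
        # 1,500 cells of which 180 host landslides.
        wp, wm = weights(n_class_and_slide=180, n_class=1500, n_slide=400, n_total=10000)
        print(f"W+ = {wp:.2f}, W- = {wm:.2f}, contrast C = {wp - wm:.2f}")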

  1. The value of integrating information from multiple hazards for flood risk analysis and management

    Science.gov (United States)

    Castillo-Rodríguez, J. T.; Escuder-Bueno, I.; Altarejos-García, L.; Serrano-Lombillo, A.

    2014-02-01

    This article presents a methodology for estimating flood risk in urban areas, integrating pluvial flooding, river flooding and the failure of both small and large dams. The first part includes a review of basic concepts of flood risk analysis, evaluation and management. Flood risk analyses may be developed at the local, regional and national levels; however, a general methodology for performing a quantitative flood risk analysis that includes different flood hazards is still required. The second part describes the proposed methodology, which presents an integrated approach combining pluvial flooding, river flooding and flooding from dam failure, as applied to a case study: an urban area located downstream of a dam under construction. The methodology enhances the approach developed within the SUFRI project ("Sustainable Strategies of Urban Flood Risk Management to cope with the residual risk", 2009-2011). This article also shows how the outcomes of flood risk analysis provide better and more complete information to inform authorities, local entities and the stakeholders involved in decision-making with regard to flood risk management.
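
    As a toy illustration of why integrating several flood mechanisms matters (and not the SUFRI methodology itself), the sketch below combines the annual probabilities of independent flood mechanisms at one location and forms an expected annual damage. The probabilities and damage figures are invented.

        # Illustrative combination of independent flood mechanisms at one location.
        mechanisms = {
            "pluvial":     {"annual_prob": 1 / 25,    "damage_meur": 2.0},
            "river":       {"annual_prob": 1 / 100,   "damage_meur": 12.0},
            "dam_failure": {"annual_prob": 1 / 10000, "damage_meur": 250.0},
        }

        p_none = 1.0
        expected_annual_damage = 0.0
        for name, m in mechanisms.items():
            p_none *= (1.0 - m["annual_prob"])
            expected_annual_damage += m["annual_prob"] * m["damage_meur"]

        print(f"annual probability of any flood: {1.0 - p_none:.4f}")
        print(f"expected annual damage: {expected_annual_damage:.3f} MEUR")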

  2. Use of hazard analysis critical control point and alternative treatments in the production of apple cider.

    Science.gov (United States)

    Senkel, I A; Henderson, R A; Jolbitado, B; Meng, J

    1999-07-01

    The purpose of this study was to evaluate the practices of Maryland cider producers and determine whether implementing hazard analysis critical control point (HACCP) would reduce the microbial contamination of cider. Cider producers (n = 11) were surveyed to determine existing manufacturing practices and sanitation. A training program was then conducted to inform operators of safety issues, including contamination with Escherichia coli O157:H7, and teach HACCP concepts and principles, sanitation procedures, and good manufacturing practice (GMP). Although all operators used a control strategy from one of the model HACCP plans provided, only one developed a written HACCP plan. None developed specific GMP, sanitation standard operating procedures, or sanitation monitoring records. Six operators changed or added production controls, including the exclusion of windfall apples, sanitizing apples chemically and by hot dip, and cider treatment with UV light or pasteurization. Facility inspections indicated improved sanitation and hazard control but identified ongoing problems. Microbiological evaluation of bottled cider before and after training, in-line apples, pomace, cider, and inoculated apples was conducted. E. coli O157:H7, Salmonella, or Staphylococcus aureus were not found in samples of in-line apple, pomace, and cider, or bottled cider. Generic E. coli was not isolated on in-coming apples but was found in 4 of 32 (13%) in-line samples and 3 of 17 (18%) bottled fresh cider samples, suggesting that E. coli was introduced during in-plant processing. To produce pathogen-free cider, operators must strictly conform to GMP and sanitation procedures in addition to HACCP controls. Controls aimed at preventing or eliminating pathogens on source apples are critical but alone may not be sufficient for product safety.

  3. The ARIA project: Advanced Rapid Imaging and Analysis for Natural Hazard Monitoring and Response

    Science.gov (United States)

    Owen, S. E.; Webb, F.; Simons, M.; Rosen, P. A.; Cruz, J.; Yun, S.; Fielding, E. J.; Moore, A. W.; Hua, H.; Agram, P.; Lundgren, P.

    2012-12-01

    ARIA is a joint JPL/Caltech coordinated effort to automate geodetic imaging capabilities for hazard response and societal benefit. Over the past decade, space-based geodetic measurements such as InSAR and GPS have provided new assessment capabilities and situational awareness on the size and location of earthquakes following seismic disasters and on volcanic eruptions following magmatic events. Geodetic imaging's unique ability to capture surface deformation at high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with any given earthquake in correspondingly high spatial and temporal detail. In addition, remote sensing with radar provides change detection and damage assessment capabilities for earthquakes, floods and other disasters, and can image even at night or through clouds. These data sets are still essentially hand-crafted, and thus are not generated rapidly and reliably enough to inform decision-making agencies and the public following an earthquake. We are building an end-to-end prototype geodetic imaging data system that would form the foundation for an envisioned operational hazard response center integrating InSAR, GPS, seismology, and modeling to deliver monitoring, actionable science, and situational awareness products. This prototype exploits state-of-the-art analysis algorithms from technologists and scientists. These algorithms enable the delivery of actionable products from larger data sets with enhanced modeling and interpretation, and the development of next-generation techniques. We are collaborating with USGS scientists in both the earthquake and volcano science programs for our initial data product infusion. We present our progress to date on the development of the prototype data system and demonstration data products, along with example responses we have run, such as generating products for the 2011 M9.0 Tohoku-oki and M6.3 Christchurch earthquakes, the 2011 M7.1 Van earthquake, and several simulated

  4. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating subduction and background (crustal) earthquakes separately allows for optimal use of available information and avoids significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, in different proportions that depend on source-target position and tsunami intensity; and (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
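
    Step (4), ensemble modelling, can be illustrated by weighting alternative hazard curves and extracting ensemble statistics at each intensity level, as in the sketch below. The curves, weights and percentile choices are synthetic placeholders, not outputs of the Ionian Sea case study.

        # Hedged ensemble-modelling sketch: weighted mean and percentiles of
        # alternative hazard curves (epistemic uncertainty).
        import numpy as np

        intensities = np.array([0.5, 1.0, 2.0, 4.0])         # tsunami height thresholds, m
        # Each row: annual exceedance rates from one alternative model formulation.
        curves = np.array([
            [1e-2, 3e-3, 6e-4, 8e-5],
            [2e-2, 5e-3, 1e-3, 2e-4],
            [8e-3, 2e-3, 3e-4, 3e-5],
        ])
        weights = np.array([0.5, 0.3, 0.2])                   # epistemic weights, sum to 1

        mean_curve = weights @ curves

        def weighted_percentile(values, w, q):
            """Simple weighted percentile via the cumulative weight distribution."""
            order = np.argsort(values)
            cw = np.cumsum(w[order]) / np.sum(w)
            return np.interp(q / 100.0, cw, values[order])

        for j, x in enumerate(intensities):
            p16, p50, p84 = (weighted_percentile(curves[:, j], weights, q) for q in (16, 50, 84))
            print(f"h > {x} m: mean {mean_curve[j]:.2e}, p16 {p16:.2e}, p50 {p50:.2e}, p84 {p84:.2e}")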

  5. The dilemma in prioritizing chemicals for environmental analysis: known versus unknown hazards.

    Science.gov (United States)

    Anna, Sobek; Sofia, Bejgarn; Christina, Rudén; Magnus, Breitholtz

    2016-08-10

    A major challenge for society is to manage the risks posed by the many chemicals continuously emitted to the environment. All chemicals in production and use cannot be monitored and science-based strategies for prioritization are essential. In this study we review available data to investigate which substances are included in environmental monitoring programs and published research studies reporting analyses of chemicals in Baltic Sea fish between 2000 and 2012. Our aim is to contribute to the discussion of priority settings in environmental chemical monitoring and research, which is closely linked to chemical management. In total, 105 different substances or substance groups were analyzed in Baltic Sea fish. Polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans (PCDD/Fs) and polychlorinated biphenyls (PCBs) were the most studied substances or substance groups. The majority, 87%, of all analyses comprised 20% of the substances or substance groups, whereas 46 substance groups (44%) were analyzed only once. Almost three quarters of all analyses regarded a POP-substance (persistent organic pollutant). These results demonstrate that the majority of analyses on environmental contaminants in Baltic Sea fish concern a small number of already regulated chemicals. Legacy pollutants such as POPs pose a high risk to the Baltic Sea due to their hazardous properties. Yet, there may be a risk that prioritizations for chemical analyses are biased based on the knowns of the past. Such biases may lead to society failing in identifying risks posed by yet unknown hazardous chemicals. Alternative and complementary ways to identify priority chemicals are needed. More transparent communication between risk assessments performed as part of the risk assessment process within REACH and monitoring programs, and information on chemicals contained in consumer articles, would offer ways to identify chemicals for environmental analysis.

  6. Preliminary assessment of hazardous-waste pretreatment as an air-pollution-control technique. Final report, 25 July 1983-31 July 1984

    Energy Technology Data Exchange (ETDEWEB)

    Spivey, J.J.; Allen, C.C.; Green, D.A.; Wood, J.P.; Stallings, R.L.

    1986-03-01

    The report evaluates twelve commercially available treatment techniques for their use in removing volatile constituents from hazardous and potentially hazardous waste streams. A case study of the cost of waste treatment is also provided for each technique. The results show that air stripping or evaporation coupled with carbon adsorption of the off gases; steam stripping; and batch distillation are the most widely applicable pretreatment techniques. The cost-effectiveness of pretreatment varies widely with waste-stream characteristics and type of pretreatment, with typical values being between $55 and $1,800 per megagram of volatile removed.

  7. Safety and Hazard Analysis for the Coherent/Acculite Laser Based Sandia Remote Sensing System (Trailer B70).

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2005-09-01

    A laser safety and hazard analysis is presented for the Coherent(r)-driven Acculite(r) laser central to the Sandia Remote Sensing System (SRSS). The analysis is based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, Safe Use of Lasers, and the 2000 version of ANSI Standard Z136.6, Safe Use of Lasers Outdoors. The trailer (B70) based SRSS laser system is a mobile platform which is used to perform laser interaction experiments and tests at various national test sites. The trailer-based SRSS laser system is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band) and calculate the appropriate minimum Optical Density (ODmin) necessary for the laser safety eyewear used by authorized personnel. The Nominal Ocular Hazard Distance (NOHD) and the Extended Ocular Hazard Distance (EOHD) are also calculated in order to protect unauthorized personnel who may have violated the boundaries of the control area and might enter the laser's NHZ during testing outside the trailer.
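
    The two quantities named in the abstract can be illustrated with the widely published small-source, continuous-wave formulas assumed below: a minimum optical density OD_min = log10(H0/MPE) and a nominal ocular hazard distance for a diverging circular beam. The example wavelength, power, divergence and MPE value are invented and are not taken from the SNL analysis.

        # Hedged laser-safety sketch using commonly published CW formulas (assumed).
        import math

        def od_min(worst_case_exposure, mpe):
            """Minimum protective eyewear optical density at a given wavelength."""
            return math.log10(worst_case_exposure / mpe)

        def nohd_m(power_w, mpe_w_cm2, beam_div_rad, aperture_cm):
            """Nominal ocular hazard distance (m) for a diverging circular CW beam."""
            d_cm = (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2)) - aperture_cm) / beam_div_rad
            return d_cm / 100.0

        # Hypothetical 1064-nm CW example: 5 W, MPE 5e-3 W/cm^2 (assumed), 1 mrad, 0.5 cm exit aperture.
        print(f"OD_min = {od_min(worst_case_exposure=25.0, mpe=5e-3):.1f}")
        print(f"NOHD   = {nohd_m(5.0, 5e-3, 1.0e-3, 0.5):.0f} m")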

  8. Smartphones for post-event analysis: a low-cost and easily accessible approach for mapping natural hazards

    Science.gov (United States)

    Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo

    2015-04-01

    A real opportunity and challenge for hazard mapping is offered by the use of smartphones and a low-cost, flexible photogrammetric technique ('Structure-from-Motion', SfM). Unlike other traditional photogrammetric methods, SfM allows three-dimensional geometries (Digital Surface Models, DSMs) to be reconstructed from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphone built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanning, TLS, and airborne lidar) (Tarolli, 2014). Through fast, simple and repeated field surveys, anyone with a smartphone can take a large number of pictures of the same study area. In this way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can facilitate quantifying the volumes of material eroded by landslides and recognizing the major critical issues that usually arise during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered different case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also considered in the analysis as a benchmark for comparison with the SfM data. Digital Surface Models derived from SfM at centimetre grid-cell resolution proved effective for automatically recognizing areas subject to surface instabilities and for quantitatively estimating erosion and deposition volumes. Morphometric indexes such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique with smartphones offers a fast, simple and affordable alternative to lidar
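
    The erosion and deposition volumes mentioned above are commonly obtained by differencing two co-registered DSMs and summing the changes that exceed a detection threshold; the sketch below shows that calculation on synthetic grids. The grids, cell size and threshold are assumptions for illustration, not data from the case studies.

        # Illustrative DSM-of-difference (DoD) volume calculation on synthetic grids.
        import numpy as np

        cell_size = 0.05                                      # m (5 cm grid)
        rng = np.random.default_rng(0)
        dsm_before = rng.normal(100.0, 0.02, size=(200, 200))
        dsm_after = dsm_before.copy()
        dsm_after[50:80, 60:120] -= 0.15                      # synthetic erosion patch
        dsm_after[120:140, 30:70] += 0.10                     # synthetic deposition patch

        dod = dsm_after - dsm_before                          # DEM of difference
        threshold = 0.05                                      # m, ignore sub-threshold noise
        cell_area = cell_size ** 2

        erosion_volume = -dod[dod < -threshold].sum() * cell_area
        deposition_volume = dod[dod > threshold].sum() * cell_area
        print(f"erosion ~ {erosion_volume:.2f} m^3, deposition ~ {deposition_volume:.2f} m^3")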

  9. Preliminary Design and Analysis of ITER In-Wall Shielding

    Institute of Scientific and Technical Information of China (English)

    LIU Changle; YU Jie; WU Songtao; CAI Yingxiang; PAN Wanjiang

    2007-01-01

    The ITER in-wall shielding (IIS) is situated between the double shells of the ITER Vacuum Vessel (IVV). Its main functions are neutron and gamma-ray shielding and toroidal field ripple reduction. The structure of the IIS has been modelled according to the IVV design criteria, which have been updated by the ITER team (IT). Static analysis and thermal expansion analysis were performed for the structure. Thermal-hydraulic analysis verified the heat removal capability and the resulting temperature, pressure, and velocity changes in the coolant flow. Consequently, our design work may serve as a reference for the IT's updated or final design in its next step.

  10. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).

  11. First fungal genome sequence from Africa: A preliminary analysis

    Directory of Open Access Journals (Sweden)

    Rene Sutherland

    2012-01-01

    Some of the most significant breakthroughs in the biological sciences this century will emerge from the development of next-generation sequencing technologies. The ready availability of DNA sequence made possible through these new technologies has given researchers opportunities to study organisms in a manner that was not possible with Sanger sequencing. Scientists will, therefore, need to embrace genomics, as well as develop and nurture the human capacity to sequence genomes and utilise the 'tsunami' of data that emerges from genome sequencing. In response to these challenges, we sequenced the genome of Fusarium circinatum, a fungal pathogen of pine that causes pitch canker, a disease of great concern to the South African forestry industry. The sequencing work was conducted in South Africa, making F. circinatum the first eukaryotic organism for which the complete genome has been sequenced locally. Here we report on the process that was followed to sequence, assemble and perform a preliminary characterisation of the genome. Furthermore, details of the computer annotation and manual curation of this genome are presented. The F. circinatum genome was found to be nearly 44 million bases in size, which is similar to that of four other Fusarium genomes that have been sequenced elsewhere. The genome contains just over 15 000 open reading frames, which is fewer than that of the related species Fusarium oxysporum, but more than that of Fusarium verticillioides. Amongst the various putative gene clusters identified in F. circinatum, those encoding the secondary metabolites fumonisin and fusarin appeared to harbour evidence of gene translocation. It is anticipated that similar comparisons of other loci will provide insights into the genetic basis for pathogenicity of the pitch canker pathogen. Perhaps more importantly, this project has engaged a relatively large group of scientists

  12. Performance analysis tool (PATO): Development and preliminary validation

    National Research Council Canada - National Science Library

    Fernando Martins; Filipe Clemente; Frutuoso Silva

    2017-01-01

    .... The Performance Analysis Tool (PATO) software was built with the aim of quickly codifying relationships between players and building the adjacency matrices that can be used to test the network measures...
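
    The core data structure described, an adjacency (pass) matrix between players, can be codified from an event list as in the minimal sketch below. Player names and passes are invented; this is not the PATO code itself.

        # Minimal adjacency (pass) matrix built from a hypothetical event list.
        import numpy as np

        players = ["GK", "DF1", "DF2", "MF1", "MF2", "FW"]
        passes = [("GK", "DF1"), ("DF1", "MF1"), ("MF1", "MF2"),
                  ("MF2", "FW"), ("DF1", "MF1"), ("MF1", "FW")]

        index = {p: i for i, p in enumerate(players)}
        adjacency = np.zeros((len(players), len(players)), dtype=int)
        for sender, receiver in passes:
            adjacency[index[sender], index[receiver]] += 1

        # A simple network measure: passes made by each player (out-degree).
        out_degree = adjacency.sum(axis=1)
        for p in players:
            print(f"{p}: {out_degree[index[p]]} passes")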

  13. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.
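
    The LCOE metric at the centre of the analysis can be illustrated with a simple fixed-charge-rate formulation propagated by Monte Carlo, as sketched below. The distributions, cost figures and formula simplifications are assumptions for illustration and do not reproduce the PERI/NREL spreadsheet tool.

        # Hedged Monte Carlo sketch of a simplified fixed-charge-rate LCOE.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000

        capex_per_kw = rng.triangular(3000, 4500, 7000, n)    # $/kW installed (assumed)
        capacity_factor = rng.uniform(0.80, 0.95, n)
        fixed_charge_rate = 0.10                               # 1/yr (assumed)
        om_per_kwh = rng.normal(0.015, 0.003, n)               # $/kWh O&M (assumed)

        hours_per_year = 8760.0
        lcoe = (capex_per_kw * fixed_charge_rate) / (capacity_factor * hours_per_year) + om_per_kwh

        p10, p50, p90 = np.percentile(lcoe, [10, 50, 90])
        print(f"LCOE $/kWh: P10 {p10:.3f}, median {p50:.3f}, P90 {p90:.3f}")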

  14. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xueqin [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); School of Social Development and Public Policy, Beijing Normal University, Beijing 100875 (China); Li, Ning [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); Yuan, Shuai, E-mail: syuan@nmemc.org.cn [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); Xu, Ning; Shi, Wenqin; Chen, Weibin [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China)

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism. Comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered when analysing and calculating multiple factors. Considering the importance and the deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. The main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, while the three-dimensional Frank copula function showed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. We therefore suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays a foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • The 2D function gives better fitting results at the lower tail of the hazard factors. • The three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to the reality
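
    The sketch below illustrates, under stated assumptions, how a two-dimensional Frank copula yields joint ("OR" and "AND") return periods; the copula parameter, marginal non-exceedance probabilities, and mean inter-event time are hypothetical, whereas the study fits these quantities to the dust storm data.

    ```python
    # Minimal sketch of joint return periods from a 2D Frank copula.
    # theta, the marginals u and v, and the mean inter-event time mu are hypothetical.
    import numpy as np

    def frank_copula(u, v, theta):
        """C(u, v) for the Frank copula (theta != 0)."""
        num = np.expm1(-theta * u) * np.expm1(-theta * v)
        return -1.0 / theta * np.log1p(num / np.expm1(-theta))

    theta = 5.0          # dependence parameter (fitted to data in practice)
    mu = 1.0 / 3.8       # mean inter-event time in years (hypothetical)
    u, v = 0.95, 0.90    # marginal non-exceedance probabilities of two hazard factors

    C = frank_copula(u, v, theta)
    T_or = mu / (1.0 - C)                 # at least one factor exceeded
    T_and = mu / (1.0 - u - v + C)        # both factors exceeded jointly
    print(f"C(u,v) = {C:.4f}, T_or = {T_or:.1f} yr, T_and = {T_and:.1f} yr")
    ```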

  15. Integrated transcriptome and methylome analysis in youth at high risk for bipolar disorder: a preliminary analysis.

    Science.gov (United States)

    Fries, G R; Quevedo, J; Zeni, C P; Kazimi, I F; Zunta-Soares, G; Spiker, D E; Bowden, C L; Walss-Bass, C; Soares, J C

    2017-03-14

    First-degree relatives of patients with bipolar disorder (BD), particularly their offspring, have a higher risk of developing BD and other mental illnesses than the general population. However, the biological mechanisms underlying this increased risk are still unknown, particularly because most of the studies so far have been conducted in chronically ill adults and not in unaffected youth at high risk. In this preliminary study we analyzed genome-wide expression and methylation levels in peripheral blood mononuclear cells from children and adolescents from three matched groups: BD patients, unaffected offspring of bipolar parents (high risk) and controls (low risk). By integrating gene expression and DNA methylation and comparing the lists of differentially expressed genes and differentially methylated probes between groups, we were able to identify 43 risk genes that discriminate patients and high-risk youth from controls. Pathway analysis showed an enrichment of the glucocorticoid receptor (GR) pathway with the genes MED1, HSPA1L, GTF2A1 and TAF15, which might underlie the previously reported role of stress response in the risk for BD in vulnerable populations. Cell-based assays indicate a GR hyporesponsiveness in cells from adult BD patients compared to controls and suggest that these GR-related genes can be modulated by DNA methylation, which poses the theoretical possibility of manipulating their expression as a means to counteract the familial risk presented by those subjects. Although preliminary, our results suggest the utility of peripheral measures in the identification of biomarkers of risk in high-risk populations and further emphasize the potential role of stress and DNA methylation in the risk for BD in youth.

  16. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    Science.gov (United States)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Particia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rock-mass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis for natural seismicity assume the seismic marked point process to be a stationary Poisson process whose marks (magnitudes) follow the exponential distribution implied by the Gutenberg-Richter relation. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely carried over to IIS hazard problems. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity, IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution implied by the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready
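
    A minimal sketch of the two checks posed above, run on a synthetic catalogue, is given below: an Aki maximum-likelihood b-value, a Kolmogorov-Smirnov test of the exponential magnitude model, and a crude dispersion check of a Poisson occurrence assumption. The magnitudes, completeness level, and event times are simulated, not SHEER data.

    ```python
    # Minimal sketch of the two questions above, run on a synthetic catalogue.
    # Magnitudes, completeness magnitude Mc, and event times are all simulated.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    Mc = 0.8
    mags = Mc + rng.exponential(scale=0.45, size=500)         # synthetic magnitudes
    times = np.cumsum(rng.exponential(scale=2.0, size=500))   # synthetic event times (days)

    # Aki maximum-likelihood b-value for the Gutenberg-Richter relation
    b = np.log10(np.e) / (mags.mean() - Mc)
    print(f"b-value: {b:.2f}")

    # KS test of an exponential magnitude model (scale estimated from the data,
    # so the p-value is only indicative)
    ks = stats.kstest(mags - Mc, "expon", args=(0.0, np.mean(mags - Mc)))
    print(f"KS p-value, exponential magnitudes: {ks.pvalue:.3f}")

    # Crude Poisson check: counts in equal windows should have variance ~ mean
    counts, _ = np.histogram(times, bins=50)
    print(f"dispersion index (var/mean): {counts.var() / counts.mean():.2f}")
    ```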

  17. Hazard, Vulnerability and Capacity Mapping for Landslides Risk Analysis using Geographic Information System (GIS)

    Science.gov (United States)

    Sari, D. A. P.; Innaqa, S.; Safrilah

    2017-06-01

    This research analyzed the levels of disaster risk in the Citeureup sub-District, Bogor Regency, West Java, based on its potential hazard, vulnerability and capacity, using maps to represent the results; Miles and Huberman analytical techniques were then used to analyze the qualitative interviews. The analysis conducted in this study is based on the concept of disaster risk by Wisner. The results show that the Citeureup sub-District has a medium-low risk of landslides. Of the 14 villages, three villages have a moderate risk level, namely Hambalang, Tajur, and Tangkil, or 49.58% of the total land area. Eleven villages have a low level of risk, namely Pasir Mukti, Sanja, Tarikolot, Gunung Sari, Puspasari, East Karang Asem, Citeureup, Leuwinutug, Sukahati, West Karang Asem and Puspanegara, or 48.68% of the total land area; high-risk areas account for only around 1.74%, which is part of Hambalang village. The analysis using a Geographic Information System (GIS) proves that areas with high hazard potential do not necessarily have a high level of risk. The capacity of the community plays an important role in minimizing the risk of a region. The disaster risk reduction strategy is to create safe conditions, which strengthens the disaster risk reduction effort.
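
    A minimal sketch of the risk = hazard x vulnerability / capacity arithmetic underlying such assessments is given below; the normalized village scores are hypothetical, while the study derives its values from GIS layers and interviews.

    ```python
    # Minimal sketch of the risk = hazard x vulnerability / capacity concept,
    # with hypothetical normalized scores per village.
    villages = {
        # name: (hazard, vulnerability, capacity), each scaled 0-1
        "Hambalang": (0.8, 0.7, 0.5),
        "Tajur":     (0.6, 0.6, 0.6),
        "Sanja":     (0.3, 0.4, 0.7),
    }

    def risk_class(r):
        return "high" if r > 0.8 else "moderate" if r > 0.4 else "low"

    for name, (h, v, c) in villages.items():
        r = h * v / c
        print(f"{name}: risk index {r:.2f} ({risk_class(r)})")
    ```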

  18. Analysis on Topological Properties of Dalian Hazardous Materials Road Transportation Network

    Directory of Open Access Journals (Sweden)

    Pengyun Chong

    2015-01-01

    Full Text Available To analyze the topological properties of a hazardous materials road transportation network (HMRTN), this paper proposed two different ways to construct the cyberspace of an HMRTN and built the corresponding complex network models. One was the physical network model of the HMRTN, based on the primal approach, and the other was the service network model, based on neighboring nodes. The two complex network models were built using the case of the Dalian HMRTN. The physical network model contained 154 nodes and 238 edges, and the statistical analysis showed that (1) the cumulative node degree of the physical network followed an exponential distribution, a property of random networks, and (2) the HMRTN had a small characteristic path length and a large clustering coefficient, making it a typical small-world network. The service network model contained 569 nodes and 1318 edges, and the statistical analysis showed that (1) the cumulative node degree of the service network followed a power-law distribution, a property of scale-free networks, and (2) the relationships between node strength and its descending rank order, and between node strength and cumulative node strength, both followed power-law distributions, also indicating a scale-free network.
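
    The sketch below shows how the reported network statistics (clustering coefficient, characteristic path length, cumulative degree distribution, and an exponential versus power-law tail comparison) can be computed with networkx; a random small-world graph stands in for the Dalian HMRTN, which is not reproduced here.

    ```python
    # Minimal sketch: clustering coefficient, characteristic path length, and
    # cumulative degree distribution for a stand-in graph (not the Dalian HMRTN).
    import numpy as np
    import networkx as nx

    G = nx.connected_watts_strogatz_graph(154, 4, 0.1, seed=1)  # small-world stand-in

    print("clustering coefficient:", nx.average_clustering(G))
    print("characteristic path length:", nx.average_shortest_path_length(G))

    # Cumulative degree distribution P(K >= k)
    degrees = np.array([d for _, d in G.degree()])
    ks = np.arange(degrees.min(), degrees.max() + 1)
    ccdf = np.array([(degrees >= k).mean() for k in ks])

    # A straight line in semi-log space suggests an exponential tail,
    # a straight line in log-log space suggests a power-law tail.
    semi_log_slope = np.polyfit(ks, np.log(ccdf), 1)[0]
    log_log_slope = np.polyfit(np.log(ks), np.log(ccdf), 1)[0]
    print("semi-log slope:", semi_log_slope, "log-log slope:", log_log_slope)
    ```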

  19. Site specific seismic hazard analysis and determination of response spectra of Kolkata for maximum considered earthquake

    Science.gov (United States)

    Shiuly, Amit; Sahu, R. B.; Mandal, Saroj

    2017-06-01

    This paper presents a site-specific seismic hazard analysis of Kolkata city, the former capital of India and the present capital of the state of West Bengal, situated on the world’s largest delta island, the Bengal basin. For this purpose, the peak ground acceleration (PGA) for a maximum considered earthquake (MCE) at bedrock level has been estimated using an artificial neural network (ANN) based attenuation relationship developed on the basis of synthetic ground motion data for the region. Using the PGA corresponding to the MCE, a spectrum compatible acceleration time history at bedrock level has been generated by using a wavelet based computer program, WAVEGEN. This spectrum compatible time history at bedrock level has been converted to the same at surface level using SHAKE2000 for 144 borehole locations in the study region. Using the predicted values of PGA and PGV at the surface, corresponding contours for the region have been drawn. For the MCE, the PGA at bedrock level of Kolkata city has been obtained as 0.184 g, while that at the surface level varies from 0.22 g to 0.37 g. Finally, Kolkata has been subdivided into eight seismic subzones, and for each subzone a response spectrum equation has been derived using polynomial regression analysis. This will be very helpful for structural and geotechnical engineers in designing safe and economical earthquake-resistant structures.
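
    A minimal sketch of deriving a subzone response-spectrum equation by polynomial regression is given below; the (period, spectral acceleration) pairs are hypothetical, whereas the study obtains them from site response analyses at 144 boreholes.

    ```python
    # Minimal sketch: fit a polynomial response-spectrum equation for one subzone.
    # The (period, spectral acceleration) pairs below are hypothetical.
    import numpy as np

    period = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.75, 1.0, 1.5, 2.0])          # s
    sa     = np.array([0.45, 0.78, 0.92, 0.85, 0.60, 0.42, 0.33, 0.21, 0.15])   # g

    coeffs = np.polyfit(period, sa, deg=4)       # 4th-order polynomial regression
    spectrum = np.poly1d(coeffs)

    for T in (0.1, 0.4, 1.0):
        print(f"Sa(T={T:.1f} s) ~ {spectrum(T):.2f} g")
    ```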

  20. Bioelectrical impedance analysis for bovine milk: Preliminary results

    Science.gov (United States)

    Bertemes-Filho, P.; Valicheski, R.; Pereira, R. M.; Paterno, A. S.

    2010-04-01

    This work reports the investigation and analysis of bovine milk quality using biological impedance measurements obtained by electrical impedance spectroscopy (EIS). The samples were first characterised by chemical analysis using Fourier transform mid-infrared spectroscopy (FTIR) and flow cytometry. A set of milk samples (100 ml each) obtained from 17 different cows in lactation, with and without mastitis, was analyzed with the proposed EIS technique. The samples were adulterated by adding distilled water and hydrogen peroxide in a controlled manner. FTIR spectroscopy and flow cytometry were performed, and impedance measurements were made over a frequency range from 500 Hz up to 1 MHz with an implemented EIS system. The system's phase shift was compensated by measuring saline solutions. The results show that the bioelectrical impedance analysis (BIA) technique may detect changes in the milk caused by mastitis and by the presence of water and hydrogen peroxide in the bovine milk.

  1. A hazard analysis via an improved timed colored petri net with time-space coupling safety constraint

    Institute of Scientific and Technical Information of China (English)

    Li Zelin; Wang Shihai; Zhao Tingdi; Liu Bin

    2016-01-01

    Petri nets are graphical and mathematical tools that are applicable to many systems for modeling, simulation, and analysis. With the emergence of the concept of partitioning in time and space domains proposed in avionics application standard software interface (ARINC 653), it has become difficult to analyze time–space coupling hazards resulting from resource partitioning using classical or advanced Petri nets. In this paper, we propose a time–space coupling safety constraint and an improved timed colored Petri net with imposed time–space coupling safety constraints (TCCP-NET) to fill this requirement gap. Time–space coupling hazard analysis is conducted in three steps: specification modeling, simulation execution, and results analysis. A TCCP-NET is employed to model and analyze integrated modular avionics (IMA), a real-time, safety-critical system. The analysis results are used to verify whether there exist time–space coupling hazards at runtime. The method we propose demonstrates superior modeling of safety-critical real-time systems as it can specify resource allocations in both time and space domains. TCCP-NETs can effectively detect underlying time–space coupling hazards.
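
    For readers unfamiliar with the underlying formalism, the sketch below implements a plain place/transition Petri net token game on a toy partition model; the timing, colour, and time-space coupling constraints that define a TCCP-NET are beyond this sketch, and the net itself is hypothetical.

    ```python
    # Minimal sketch of a plain place/transition Petri net token game.
    # The TCCP-NET extensions (time, colour, coupling constraints) are not modelled.
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)      # place -> token count
            self.transitions = {}             # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

        def fire(self, name):
            if not self.enabled(name):
                raise RuntimeError(f"{name} is not enabled")
            inputs, outputs = self.transitions[name]
            for p, n in inputs.items():
                self.marking[p] -= n
            for p, n in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + n

    # Toy partition model: a CPU window token is needed for a task to run.
    net = PetriNet({"cpu_window": 1, "task_ready": 2, "task_done": 0})
    net.add_transition("run_task",
                       {"cpu_window": 1, "task_ready": 1},
                       {"cpu_window": 1, "task_done": 1})
    while net.enabled("run_task"):
        net.fire("run_task")
    print(net.marking)   # {'cpu_window': 1, 'task_ready': 0, 'task_done': 2}
    ```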

  2. A hazard analysis via an improved timed colored petri net with time–space coupling safety constraint

    Directory of Open Access Journals (Sweden)

    Li Zelin

    2016-08-01

    Full Text Available Petri nets are graphical and mathematical tools that are applicable to many systems for modeling, simulation, and analysis. With the emergence of the concept of partitioning in time and space domains proposed in avionics application standard software interface (ARINC 653), it has become difficult to analyze time–space coupling hazards resulting from resource partitioning using classical or advanced Petri nets. In this paper, we propose a time–space coupling safety constraint and an improved timed colored Petri net with imposed time–space coupling safety constraints (TCCP-NET) to fill this requirement gap. Time–space coupling hazard analysis is conducted in three steps: specification modeling, simulation execution, and results analysis. A TCCP-NET is employed to model and analyze integrated modular avionics (IMA), a real-time, safety-critical system. The analysis results are used to verify whether there exist time–space coupling hazards at runtime. The method we propose demonstrates superior modeling of safety-critical real-time systems as it can specify resource allocations in both time and space domains. TCCP-NETs can effectively detect underlying time–space coupling hazards.

  3. Combined fluvial and pluvial urban flood hazard analysis: concept development and application to Can Tho city, Mekong Delta, Vietnam

    Science.gov (United States)

    Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen

    2016-04-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either a fluvial or pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are hardly available. Thus this study aims to analyse a fluvial and a pluvial flood hazard individually, but also to develop a method for the analysis of a combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon triggered floods of the Mekong River, which can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of
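
    The sketch below illustrates, in a deliberately crude form, the independence assumption used to attach probabilities to combined scenarios; the return periods and the flood-season fraction are hypothetical, and the study's actual derivation adds stochastic rainstorms to simulated fluvial events rather than multiplying scalar probabilities.

    ```python
    # Minimal sketch of the independence assumption behind combined
    # fluvial-pluvial scenario probabilities. All numbers are hypothetical.
    T_fluvial = 50.0       # yr, return period of the fluvial flood scenario
    T_pluvial = 20.0       # yr, return period of the rainstorm scenario
    season_fraction = 0.4  # crude stand-in for the seasonal overlap of the two processes

    p_fluvial = 1.0 / T_fluvial
    p_pluvial = 1.0 / T_pluvial

    # Assuming independence, and folding in seasonality as a simple factor:
    p_combined = p_fluvial * p_pluvial * season_fraction
    print(f"combined annual probability: {p_combined:.5f}"
          f"  (return period ~ {1.0 / p_combined:.0f} yr)")
    ```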

  4. A Preliminary Analysis of a Behavioral Classrooms Needs Assessment

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McCray, Cynthia; Lamkins, Carol; Taubman, Mitchell; McEachin, John; Cihon, Joseph H.

    2016-01-01

    Today many special education classrooms implement procedures based upon the principles of Applied Behavior Analysis (ABA) to establish educationally relevant skills and decrease aberrant behaviors. However, it is difficult for school staff and consultants to evaluate the implementation of various components of ABA and general classroom set up. In…

  5. A Preliminary MANPRINT Evaluation of the All Source Analysis System (ASAS)

    Science.gov (United States)

    1988-11-01

    [Scanned-report extraction residue; only fragments are recoverable: table-of-contents entries for the CEWI FSIC and the CEWI Tactical Control and Analysis Element (TCAE) AIM(6) module, a table of ratings of understanding of tasks required at the completion of training, and a body fragment noting that the AIM module of the TCAE consists of a VAX 750R computer used for transmission to the sensors and jammers.]

  6. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    Directory of Open Access Journals (Sweden)

    Yu-Ting Hung

    2015-09-01

    Full Text Available To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving the production management.

  7. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability functions (CPDF) and probability functions (PDF) of the annual exceedance have been investigated to analyse the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared with the results from different input parameter spaces.
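
    A minimal sketch relating an annual exceedance probability curve to the exceedance probability over a design life is given below; the hazard-curve shape is hypothetical, and only the ten hazard levels (0.1 g to 0.99 g) are taken from the abstract.

    ```python
    # Minimal sketch: annual exceedance probability vs design-life exceedance.
    # The hazard-curve parameters are hypothetical.
    import numpy as np

    pga = np.linspace(0.1, 0.99, 10)              # g, the ten hazard levels of interest
    annual_p = 1e-3 * np.exp(-4.0 * (pga - 0.1))  # hypothetical annual exceedance curve

    life = 50                                     # design life in years
    p_life = 1.0 - (1.0 - annual_p) ** life       # exceedance probability over the life

    for a, p1, pL in zip(pga, annual_p, p_life):
        print(f"PGA {a:.2f} g: annual P = {p1:.2e}, {life}-yr P = {pL:.3f}")
    ```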

  8. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA): towards PTHA assessment for the coasts of Italy

    Science.gov (United States)

    Selva, Jacopo; Tonini, Roberto; Molinari, Irene; Tiberti, Mara M.; Romano, Fabrizio; Grezio, Anita; Melini, Daniele; Piatanesi, Alessio; Basili, Roberto; Lorito, Stefano

    2016-04-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes. Differently from classical approaches that commonly adopt the hazard integral and logic tree, we use an event tree approach and ensemble modelling. The procedure was developed in the framework of the EC projects ASTARTE and STREST, of the Italian National Flagship project RITMARE, and of the agreement between Italian Civil Protection and INGV. A total of about 2 × 107 different potential seismic sources covering the entire Mediterranean Sea, and more than 1 × 105 alternative model implementations have been considered to quantify both the aleatory variability and the epistemic uncertainty. A set of hazard curves is obtained along the coasts of the entire Italian territory. They are the prototype of the first homogeneous Italian national SPTHA map.

  9. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Harper, Bryan [Oregon State University (United States); Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan [Pacific Northwest National Laboratory (United States); Tang, Kaizhi [Intelligent Automation, Inc. (United States); Heredia-Langner, Alejandro [Pacific Northwest National Laboratory (United States); Lins, Roberto [CPqAM, Oswaldo Cruz Foundation, FIOCRUZ-PE (Brazil); Harper, Stacey, E-mail: stacey.harper@oregonstate.edu [Oregon State University (United States)

    2015-06-15

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships.
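
    The sketch below shows a hierarchical clustering of toxicity score vectors of the general kind used for hazard ranking; the score matrix is randomly generated for two synthetic groups and is not the EZ Metric data for the 68 nanomaterials.

    ```python
    # Minimal sketch: hierarchical clustering of toxicity score vectors.
    # Rows are (synthetic) nanomaterials, columns are (synthetic) endpoint scores.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(7)
    scores = np.vstack([
        rng.normal(0.2, 0.05, (4, 6)),   # low-toxicity group
        rng.normal(0.7, 0.05, (4, 6)),   # high-toxicity group
    ])
    labels = [f"NM{i + 1}" for i in range(scores.shape[0])]

    Z = linkage(scores, method="ward")
    clusters = fcluster(Z, t=2, criterion="maxclust")
    for name, c in zip(labels, clusters):
        print(name, "-> cluster", c)
    ```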

  10. Preliminary shielding analysis for the CSNS target station monolith

    Institute of Scientific and Technical Information of China (English)

    张斌; 陈义学; 杨寿海; 吴军; 殷雯; 梁天骄; 贾学军

    2010-01-01

    The construction of the China Spallation Neutron Source (CSNS) has been initiated at Dongguan, Guangdong, China. In spallation neutron sources the target station monolith is contaminated by a large number of fast neutrons whose energies can be as large as those of the protons of the proton beam directed towards the tungsten target. A detailed radiation transport analysis of the target station monolith is important for the construction of the CSNS. The analysis is performed using the coupled Monte Carlo and multi-dimensional discrete ordinates method. Successful elimination of the primary ray effects via the two-dimensional uncollided flux and first collision source methodology is also illustrated. The dose at the edge of the monolith is calculated. The results demonstrate that the doses received by the hall staff members are below the required standard limit.

  11. Preliminary analysis of productivity of fruiting fungi on Strzeleckie meadows

    Directory of Open Access Journals (Sweden)

    Barbara Sadowska

    2014-11-01

    Full Text Available Analysis demonstrated that the fresh and dry weight, as well as the ash content, of fungal fruit bodies collected on a forest-surrounded unmown meadow (Stellario-Deschampsietum Freitag 1957 and Caricetum elatae W. Koch 1926) were lower than the same values for a plot of exploited mown meadow and higher than on an exploited unmown meadow (Arrhenatheretum medioeuropaeum (Br.-Bl.) Oberd. 1952).

  12. Preliminary analysis of knee stress in Full Extension Landing

    Directory of Open Access Journals (Sweden)

    Majid Davoodi Makinejad

    2013-09-01

    Full Text Available OBJECTIVE: This study provides an experimental and finite element analysis of knee-joint structure during extended-knee landing based on the extracted impact force, and it numerically identifies the contact pressure, stress distribution and possibility of bone-to-bone contact when a subject lands from a safe height. METHODS: The impact time and loads were measured via inverse dynamic analysis of free landing without knee flexion from three different heights (25, 50 and 75 cm), using five subjects with an average body mass index of 18.8. Three-dimensional data were developed from computed tomography scans and were reprocessed with modeling software before being imported and analyzed by finite element analysis software. The whole leg was considered to be a fixed middle-hinged structure, while impact loads were applied to the femur in an upward direction. RESULTS: Straight landing exerted an enormous amount of pressure on the knee joint as a result of the body's inability to utilize the lower extremity muscles, thereby maximizing the threat of injury when the load exceeds the height-safety threshold. CONCLUSIONS: The researchers conclude that extended-knee landing results in serious deformation of the meniscus and cartilage and increases the risk of bone-to-bone contact and serious knee injury when the load exceeds the threshold safety height. This risk is considerably greater than the risk of injury associated with walking downhill or flexion landing activities.

  13. Preliminary analysis of the mitochondrial genome evolutionary pattern in primates

    Institute of Scientific and Technical Information of China (English)

    Liang ZHAO; Xingtao ZHANG; Xingkui TAO; Weiwei WANG; Ming LI

    2012-01-01

    Since the birth of molecular evolutionary analysis, primates have been a central focus of study and mitochondrial DNA is well suited to these endeavors because of its unique features. Surprisingly, to date no comprehensive evaluation of the nucleotide substitution patterns has been conducted on the mitochondrial genome of primates. Here, we analyzed the evolutionary patterns and evaluated selection and recombination in the mitochondrial genomes of 44 Primates species downloaded from GenBank. The results revealed that a strong rate heterogeneity occurred among sites and genes in all comparisons. Likewise, an obvious decline in primate nucleotide diversity was noted in the subunit rRNAs and tRNAs as compared to the protein-coding genes. Within 13 protein-coding genes, the pattern of nonsynonymous divergence was similar to that of overall nucleotide divergence, while synonymous changes differed only for individual genes, indicating that the rate heterogeneity may result from the rate of change at nonsynonymous sites. Codon usage analysis revealed that there was intermediate codon usage bias in primate protein-coding genes, and supported the idea that GC mutation pressure might determine codon usage and that positive selection is not the driving force for the codon usage bias. Neutrality tests using site-specific positive selection from a Bayesian framework indicated no sites were under positive selection for any gene, consistent with near neutrality. Recombination tests based on the pairwise homoplasy test statistic supported complete linkage even for much older divergent primate species. Thus, with the exception of rate heterogeneity among mitochondrial genes, evaluating the validity of assumed complete linkage and selective neutrality in primates prior to phylogenetic or phylogeographic analysis seems unnecessary.

  14. Preliminary analysis of the mitochondrial genome evolutionary pattern in primates.

    Science.gov (United States)

    Zhao, Liang; Zhang, Xingtao; Tao, Xingkui; Wang, Weiwei; Li, Ming

    2012-08-01

    Since the birth of molecular evolutionary analysis, primates have been a central focus of study and mitochondrial DNA is well suited to these endeavors because of its unique features. Surprisingly, to date no comprehensive evaluation of the nucleotide substitution patterns has been conducted on the mitochondrial genome of primates. Here, we analyzed the evolutionary patterns and evaluated selection and recombination in the mitochondrial genomes of 44 Primates species downloaded from GenBank. The results revealed that a strong rate heterogeneity occurred among sites and genes in all comparisons. Likewise, an obvious decline in primate nucleotide diversity was noted in the subunit rRNAs and tRNAs as compared to the protein-coding genes. Within 13 protein-coding genes, the pattern of nonsynonymous divergence was similar to that of overall nucleotide divergence, while synonymous changes differed only for individual genes, indicating that the rate heterogeneity may result from the rate of change at nonsynonymous sites. Codon usage analysis revealed that there was intermediate codon usage bias in primate protein-coding genes, and supported the idea that GC mutation pressure might determine codon usage and that positive selection is not the driving force for the codon usage bias. Neutrality tests using site-specific positive selection from a Bayesian framework indicated no sites were under positive selection for any gene, consistent with near neutrality. Recombination tests based on the pairwise homoplasy test statistic supported complete linkage even for much older divergent primate species. Thus, with the exception of rate heterogeneity among mitochondrial genes, evaluating the validity of assumed complete linkage and selective neutrality in primates prior to phylogenetic or phylogeographic analysis seems unnecessary.

  15. Preliminary safety analysis for key design features of KALIMER-600

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Y. B.; Chang, W. P.; Suk, S. D.; Ha, K. S.; Jeong, H. Y.; Heo, S

    2004-03-01

    KAERI is developing the conceptual design of a Liquid Metal Reactor, KALIMER-600 (Korea Advanced LIquid MEtal Reactor), under the Long-term Nuclear R and D Program. KALIMER-600 addresses key issues regarding future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, key safety design features are described and safety analysis results for typical ATWS accidents in the KALIMER design with a breakeven core are presented. First, the basic approach to achieve the safety goal is introduced in Chapter 1, and the event categorization and acceptance criteria for the KALIMER-600 safety analysis are described in Chapter 2. In Chapter 3, results of inherent safety evaluations for the KALIMER-600 conceptual design are presented. The KALIMER-600 core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated Anticipated Transient Without Scram (ATWS) have been performed using the SSC-K code to investigate the KALIMER-600 system response to the events. They are categorized as Bounding Events (BEs) because of their low probability of occurrence. Chapter 4 presents the analysis of flow blockage for KALIMER-600 with the MATRA-LMR-FB code, which has been developed for internal flow blockage in an LMR subassembly. The cases with blockages of 6 subchannels, 24 subchannels, and 54 subchannels are analyzed. The performance analysis of the KALIMER-600 containment and some evaluations of the behavior during an HCDA will be performed later.

  16. Preliminary safety analysis for key design features of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, D. H.; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, S. O.; Lee, Y. B.; Jeong, K. S

    2000-07-01

    KAERI is currently developing the conceptual design of a liquid metal reactor, KALIMER (Korea Advanced Liquid Metal Reactor), under the long-term nuclear R and D program. In this report, descriptions of the KALIMER safety design features and safety analysis results for selected ATWS accidents are presented. First, the basic approach to achieve the safety goal is introduced in chapter 1, and the safety evaluation procedure for the KALIMER design is described in chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure design performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. They are categorized as bounding events (BEs) because of their low probability of occurrence. In chapter 4, the design of the KALIMER containment dome and the results of its performance analysis are presented. The designs of the existing LMR containment and the KALIMER containment dome have been compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced to investigate the core kinetics and hydraulic behavior during an HCDA in chapter 5. Mathematical formulations have been developed in the framework of the modified Bethe-Tait method, and scoping analyses have been performed for the KALIMER core behavior during super-prompt critical excursions.

  17. Preliminary RAMI analysis of DFLL TBS for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Dagui [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); University of Science and Technology of China, Hefei, Anhui, 230031 (China); Yuan, Run [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); Wang, Jiaqun, E-mail: jiaqun.wang@fds.org.cn [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); Wang, Fang; Wang, Jin [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China)

    2016-11-15

    Highlights: • We performed the functional analysis of the DFLL TBS. • We performed a failure mode analysis of the DFLL TBS. • We estimated the reliability and availability of the DFLL TBS. • The ITER RAMI approach was applied to the DFLL TBS for technical risk control in the design phase. - Abstract: ITER is the first fusion machine fully designed to prove the physics and technological basis for the next fusion power plants. Among the main technical objectives of ITER is to test and validate design concepts of tritium breeding blankets relevant to the fusion power plants. To achieve this goal, China has proposed the dual functional lithium-lead test blanket module (DFLL TBM) concept design. The DFLL TBM and its associated ancillary system are together called the DFLL TBS. The DFLL TBS plays a key role in the next fusion reactor. In order to ensure the reliability and availability of the DFLL TBS, a risk control project for the DFLL TBS has been put on the schedule. As part of the ITER technical risk control policy, the RAMI (Reliability, Availability, Maintainability, Inspectability) approach was used to control the technical risk of ITER. In this paper, the RAMI approach was applied to the conceptual design of the DFLL TBS. A functional breakdown was prepared for the DFLL TBS, and the system was divided into 3 main functions and 72 basic functions. Based on the result of the functional breakdown of the DFLL TBS, reliability block diagrams were prepared to estimate the reliability and availability of each function under the stipulated operating conditions. The inherent availability of the DFLL TBS expected after implementation of mitigation actions was calculated to be 98.57% over 2 years based on the ITER reliability database. A Failure Modes Effects and Criticality Analysis (FMECA) was performed with criticality charts highlighting the risk level of the different failure modes with regard to their probability of occurrence and their effects on the availability.
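
    A minimal sketch of the inherent-availability arithmetic behind a reliability block diagram is given below; the block names, MTBF, and MTTR values are hypothetical, whereas the RAMI analysis draws on the ITER reliability database.

    ```python
    # Minimal sketch of inherent availability from a reliability block diagram.
    # Block names and MTBF/MTTR values (hours) are hypothetical.
    def availability(mtbf, mttr):
        return mtbf / (mtbf + mttr)

    # Blocks in series: the function fails if any block fails.
    blocks = {
        "TBM box":        (40_000, 200),
        "PbLi loop":      (20_000, 120),
        "He cooling":     (15_000, 80),
        "Tritium system": (30_000, 150),
    }

    a_series = 1.0
    for name, (mtbf, mttr) in blocks.items():
        a = availability(mtbf, mttr)
        a_series *= a
        print(f"{name}: A = {a:.4f}")
    print(f"series availability: {a_series:.4f}")

    # Two redundant (parallel) trains of the same pump: fails only if both fail.
    a_pump = availability(10_000, 60)
    a_parallel = 1.0 - (1.0 - a_pump) ** 2
    print(f"redundant pump pair: A = {a_parallel:.6f}")
    ```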

  18. Macroalgae as a Biomass Feedstock: A Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Roesijadi, Guritno; Jones, Susanne B.; Snowden-Swan, Lesley J.; Zhu, Yunhua

    2010-09-26

    A thorough analysis of macroalgae as a biofuels feedstock is warranted due to the size of this biomass resource and the need to consider all potential feedstock sources to meet current biomass production goals. Understanding how to harness this untapped biomass resource will require additional research and development. A detailed assessment of environmental resources, cultivation and harvesting technology, conversion to fuels, connectivity with existing energy supply chains, and the associated economic and life cycle analyses will facilitate evaluation of this potentially important biomass resource.

  19. Statistical Analysis of Time Series Data (STATS). Users Manual (Preliminary)

    Science.gov (United States)

    1987-05-01

    [Scanned users-manual extraction residue; only fragments are recoverable: analysis periods of 15, 30, 60, 90, 120, and 183 days are presently used; record field definitions include NPRDS (actual number of periods for the event, following on IN records until the next ID, BF, or LI record), JEND (order number of the last period in the time series to select for analysis; if blank, the last period is assumed), and JPPF (a plotting-related field whose description is truncated); the IN record holds the time series data.]

  20. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    Energy Technology Data Exchange (ETDEWEB)

    (NOEMAIL), R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order, DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map Hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) and a regional specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004) respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL (1993) and EPRI (1988) PSHAs. The
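
    The sketch below shows the weighted combination of alternative GMAM hazard curves using the 0.6/0.3/0.1 weights quoted above; the hazard curves themselves are hypothetical placeholders, not the SRS results.

    ```python
    # Minimal sketch: weighted combination of alternative GMAM hazard curves.
    # The weights come from the abstract; the curves are hypothetical placeholders.
    import numpy as np

    pga = np.linspace(0.05, 1.0, 20)  # g
    curves = {
        "EPRI (2004)":         (0.60, 2e-3 * np.exp(-5.0 * pga)),
        "USGS (2002)":         (0.30, 3e-3 * np.exp(-6.0 * pga)),
        "Silva et al. (2004)": (0.10, 1e-3 * np.exp(-4.0 * pga)),
    }

    combined = np.zeros_like(pga)
    for name, (w, annual_p) in curves.items():
        combined += w * annual_p

    # Weighted-mean annual exceedance probability at a few ground-motion levels
    for a in (0.1, 0.3, 0.5):
        p = np.interp(a, pga, combined)
        print(f"PGA {a:.1f} g: weighted annual exceedance ~ {p:.2e}")
    ```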

  1. A Preliminary Genetic Analysis of Complement 3 Gene and Schizophrenia.

    Directory of Open Access Journals (Sweden)

    Jianliang Ni

    Full Text Available Complement pathway activation was found to occur frequently in schizophrenia, and complement 3 (C3) plays a major role in this process. Previous studies have provided evidence for the possible role of C3 in the development of schizophrenia. In this study, we hypothesized that the gene encoding C3 (C3) may confer susceptibility to schizophrenia in Han Chinese. We analyzed 7 common single nucleotide polymorphisms (SNPs) of C3 in 647 schizophrenia patients and 687 healthy controls. Peripheral C3 mRNA expression level was measured in 23 drug-naïve patients with schizophrenia and 24 controls. Two SNPs (rs1047286 and rs2250656) that deviated from Hardy-Weinberg equilibrium were excluded from further analysis. Among the remaining 5 SNPs, there was no significant difference in allele and genotype frequencies between the patient and control groups. Logistic regression analysis showed no significant SNP-gender interaction in either dominant model or recessive model. There was no significant difference in the level of peripheral C3 expression between the drug-naïve schizophrenia patients and healthy controls. In conclusion, the results of this study do not support C3 as a major genetic susceptibility factor in schizophrenia. Other factors in AP may have critical roles in schizophrenia and be worthy of further investigation.

  2. Cadmium and lead residue control in a hazard analysis and critical control point (HACCP) environment.

    Science.gov (United States)

    Pagan-Rodríguez, Doritza; O'Keefe, Margaret; Deyrup, Cindy; Zervos, Penny; Walker, Harry; Thaler, Alice

    2007-02-21

    In 2003-2004, the U.S. Department of Agriculture Food Safety and Inspection Service (FSIS) conducted an exploratory assessment to determine the occurrence and levels of cadmium and lead in randomly collected samples of kidney, liver, and muscle tissues of mature chickens, boars/stags, dairy cows, and heifers. The data generated in the study were qualitatively compared to data that FSIS gathered in a 1985-1986 study in order to identify trends in the levels of cadmium and lead in meat and poultry products. The exploratory assessment was necessary to verify that Hazard Analysis and Critical Control Point plans and efforts to control exposure to these heavy metals are effective and result in products that meet U.S. export requirements. A comparison of data from the two FSIS studies suggests that the incidence and levels of cadmium and lead in different slaughter classes have remained stable since the first study was conducted in 1985-1986. This study was conducted to fulfill FSIS mandate to ensure that meat, poultry, and egg products entering commerce in the United States are free of adulterants, including elevated levels of environmental contaminants such as cadmium and lead.

  3. Hazard analysis and possibilities for preventing botulism originating from meat products

    Directory of Open Access Journals (Sweden)

    Vasilev Dragan

    2008-01-01

    Full Text Available The paper presents the most important data on the bacterium Clostridium botulinum, the appearance of botulism, hazard analysis and the possibilities for preventing botulism. Proteolytic strains of C. botulinum Group I, whose spores are resistant to heat, create toxins predominantly in cans containing slightly sour food items, in the event that the spores are not inactivated in the course of sterilization. Non-proteolytic strains of Group II are more sensitive to high temperatures, but they have the ability to grow and create toxins at low temperatures. Type E most often creates a toxin in vacuum-packed smoked fish, and the non-proteolytic strain type B in dried hams and certain pasteurized meat products. The following play an important role in the prevention of botulism: reducing meat contamination with spores of clostridia to a minimum, implementing good hygiene measures and production practice during the slaughter of animals, the inactivation of spores of C. botulinum during sterilization (F > 3), and, in dried hams and pasteurized products, the prevention of bacterial growth and toxin formation by maintaining low temperatures in the course of production and storage, as well as the correct use of substances that inhibit the multiplication of bacteria and the production of toxins (nitrites, table salt, etc.).

  4. The influence of Alpine soil properties on shallow movement hazards, investigated through factor analysis

    Directory of Open Access Journals (Sweden)

    S. Stanchi

    2012-06-01

    shallow soil movements involving the upper soil horizons. We assessed a great number of soil properties that are known to be related to vulnerability to the main hazards present in the area. These properties were evaluated at the two depths and a factor analysis was performed to simplify the dataset interpretation, and to hypothesise the most decisive parameters that were potentially related to vulnerability. The factors (soil structure, aggregation, consistency, texture and parent material, cation exchange complex and other chemical properties) were a first step towards identifying soil quality indexes in the studied environment.
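
    A minimal sketch of a factor analysis on a soil-property table is given below; the data matrix is a random stand-in, so the loadings are meaningful only as a demonstration of the call, not of the Alpine soils studied.

    ```python
    # Minimal sketch: factor analysis of a soil-property table.
    # The data matrix is random; rows would be soil samples, columns properties
    # such as aggregate stability, clay fraction, or cation exchange capacity.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(3)
    n_samples, n_properties = 60, 8
    X = rng.normal(size=(n_samples, n_properties))

    fa = FactorAnalysis(n_components=3, random_state=0)
    fa.fit(X)

    # Loadings show which soil properties dominate each factor
    print("loadings shape:", fa.components_.shape)   # (3 factors, 8 properties)
    print(np.round(fa.components_, 2))
    ```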

  5. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    Science.gov (United States)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has observed the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami that caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption that resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  6. Preliminary Rock Physics Analysis on Lodgepole Formation in Manitoba, Canada

    Science.gov (United States)

    Kim, N.; Keehm, Y.

    2012-12-01

    We present rock physics analysis results of the Lodgepole Formation, a carbonate reservoir in Daly Field, Manitoba, Canada. We confirmed that the Lodgepole Formation can be divided into six units in the study area: Basal Limestone, Cromer Shale, Cruickshank Crinoidal, Cruickshank Shale, Daly member and Flossie Lake member from the bottom, using data from eight well logs and previous works. We then performed rock physics analyses on four carbonate units (Basal Limestone, Cruickshank Crinoidal, Daly and Flossie Lake), such as Vp-porosity, AI-porosity, DEM (differential effective medium) modeling, and fluid substitution analysis. In the Vp-porosity domain, the top unit, Flossie Lake member, has lower porosity and higher velocity, while the other units show similar porosity and velocity. We think that this results from the diagenesis of the Flossie Lake member since it is bounded by an unconformity. However, the four units show very similar trends in the Vp-porosity domain, and we can report one Vp-porosity relation for all carbonate units of the Lodgepole Formation. We also found from the AI-porosity analysis that the acoustic impedance varies by more than 10% from the low porosity zone (3-6%) to the high porosity zone (9-12%). Thus one can delineate the high porosity zone from seismic impedance data. DEM modeling showed that Flossie Lake would have a relatively lower pore aspect ratio than the others, which implies that the top unit has been influenced by diagenesis. To determine the fluid sensitivity of the carbonate units, we conducted fluid substitution on the four units from 100% water to 100% oil. The top unit, Flossie Lake, showed a slight increase of Vp, which seems to be a density effect. The others showed a small decrease of Vp, but not significant. If we observe Vp/Vs rather than Vp, the sensitivity increases. However, fluid discrimination would be difficult because of the high stiffness of the rock frame. In summary, three lower carbonate units of the Lodgepole Formation would be prospective and high porosity zone can be delineated
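
    The sketch below runs a Gassmann fluid substitution (water to oil) of the kind referred to above; all moduli, densities, and porosity are hypothetical, and the small resulting Vp change echoes the point about stiff carbonate frames.

    ```python
    # Minimal sketch of Gassmann fluid substitution (water -> oil).
    # All moduli (GPa), densities (g/cc), and porosity are hypothetical.
    import numpy as np

    def gassmann_ksat(k_dry, k_min, k_fl, phi):
        """Saturated bulk modulus from the Gassmann equation."""
        num = (1.0 - k_dry / k_min) ** 2
        den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
        return k_dry + num / den

    k_min, mu = 70.0, 28.0      # mineral bulk modulus; shear modulus (fluid-independent)
    k_dry, phi = 45.0, 0.10     # dry-frame bulk modulus, porosity
    rho_min = 2.71
    fluids = {"water": (2.6, 1.0), "oil": (1.0, 0.8)}   # (K_fl in GPa, rho in g/cc)

    for name, (k_fl, rho_fl) in fluids.items():
        k_sat = gassmann_ksat(k_dry, k_min, k_fl, phi)
        rho = rho_min * (1 - phi) + rho_fl * phi
        vp = np.sqrt((k_sat + 4.0 * mu / 3.0) / rho)    # km/s, since GPa/(g/cc) = (km/s)^2
        print(f"{name}: Ksat = {k_sat:.1f} GPa, Vp = {vp:.2f} km/s")
    ```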

  7. Brain hemisphere dominance and vocational preference: a preliminary analysis.

    Science.gov (United States)

    Szirony, Gary Michael; Pearson, L Carolyn; Burgin, John S; Murray, Gerald C; Elrod, Lisa Marie

    2007-01-01

    Recent developments in split-brain theory add support to the concept of specialization within brain hemispheres. Holland's vocational personality theory may overlap with Human Information Processing (HIP) characteristics. Holland's six RIASEC codes were developed to identify vocational personality characteristics, and HIP scales were designed to measure hemispheric laterality. Relationships between the two scales were evaluated through canonical correlation with some significant results; however, not all Holland scale scores correlated with left, right, or integrated hemispheric preference. Additional findings related to participants' self-perception of music and math ability were also correlated. Findings from this additional analysis revealed a high correlation between perception of musical ability and right-brain function, but not between perceived mathematical ability and left-brain function alone. Implications regarding vocational choice and work are discussed.
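
    A minimal sketch of a canonical correlation between two score sets of this kind is given below; both matrices are random stand-ins for the six RIASEC scores and the HIP laterality scores, so the reported correlations are not substantive.

    ```python
    # Minimal sketch: canonical correlation between two score sets.
    # Both matrices are random stand-ins (rows: respondents).
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(11)
    riasec = rng.normal(size=(120, 6))   # stand-in for six RIASEC scores
    hip = rng.normal(size=(120, 3))      # stand-in for HIP laterality scores

    cca = CCA(n_components=2)
    U, V = cca.fit_transform(riasec, hip)

    # Canonical correlations of the first two variate pairs
    for i in range(2):
        r = np.corrcoef(U[:, i], V[:, i])[0, 1]
        print(f"canonical correlation {i + 1}: {r:.2f}")
    ```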

  8. City of Hoboken Energy Surety Analysis: Preliminary Design Summary

    Energy Technology Data Exchange (ETDEWEB)

    Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Baca, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Munoz-Ramos, Karina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Schenkman, Benjamin L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Readiness and Sustainment Technology Dept.; Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Readiness and Sustainment Technology Dept.; Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electric Power Systems Research Dept.; Henry, Jordan M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Critical Infrastructure Systems Dept.; Jensen, Richard Pearson [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geomechanics Dept.

    2014-09-01

    In 2012, Hurricane Sandy devastated much of the U.S. northeast coastal areas. Among those hardest hit was the small community of Hoboken, New Jersey, located on the banks of the Hudson River across from Manhattan. This report describes a city-wide electrical infrastructure design that uses microgrids and other infrastructure to ensure the city retains functionality should such an event occur in the future. The designs ensure that up to 55 critical buildings will retain power during blackout or flooded conditions and include analysis for microgrid architectures, performance parameters, system control, renewable energy integration, and financial opportunities (while grid connected). The results presented here are not binding and are subject to change based on input from the Hoboken stakeholders, the integrator selected to manage and implement the microgrid, or other subject matter experts during the detailed (final) phase of the design effort.

  9. Analysis of organochlorine pesticides in human milk: preliminary results.

    Science.gov (United States)

    Campoy, C; Jiménez, M; Olea-Serrano, M F; Moreno-Frías, M; Cañabate, F; Olea, N; Bayés, R; Molina-Font, J A

    2001-11-01

    In the face of evidence of human milk contamination by organochlorine pesticides, an analysis was performed on samples of milk obtained from healthy lactating women in the provinces of Granada and Almeria in Southern Spain. The samples were obtained by the Neonate Section of the Department of Pediatrics of Granada University Hospital (Neonatology Division) and by the Neonatal Service of Poniente Hospital in El Ejido, Almería. A liquid-liquid extraction procedure was performed. The cleaning of the sample before gas chromatography-mass spectrometry (GC-MS) used silica Sep-Pak. Among other pesticides, aldrin, dieldrin, DDT and its metabolites, lindane, methoxychlor and endosulfan were identified. The presence of these products was confirmed by mass spectrometry. The identification and quantification of these organochlorine molecules is important because they have estrogenic effects.

  10. Preliminary Analysis of a Fully Solid State Magnetocaloric Refrigeration

    Energy Technology Data Exchange (ETDEWEB)

    Abdelaziz, Omar [ORNL

    2016-01-01

    Magnetocaloric refrigeration is an alternative refrigeration technology with significant potential energy savings compared to conventional vapor compression refrigeration technology. Most of the reported active magnetic regenerator (AMR) systems that operate based on the magnetocaloric effect use a heat transfer fluid to exchange heat, which results in complicated mechanical subsystems and components such as rotating valves and hydraulic pumps. In this paper, we propose an alternative mechanism for heat transfer between the AMR and the heat source/sink. High-conductivity moving rods/sheets (e.g. copper, brass, iron, graphite, aluminum, or composite structures of these) are utilized instead of a heat transfer fluid, significantly enhancing the heat transfer rate and hence the cooling/heating capacity. A one-dimensional model is developed to study the solid state AMR. In this model, the heat exchange between the solid-solid interfaces is modeled via a contact conductance, which depends on the interface apparent pressure, material hardness, thermal conductivity, surface roughness, surface slope between the interfaces, and the material filling the gap between the interfaces. Because of the strong influence of this heat exchange on AMR cycle performance, a sensitivity analysis is conducted using a response surface method, in which the apparent pressure, effective surface roughness, and grease thermal conductivity are the uncertainty factors. COP and refrigeration capacity are taken as the responses in the sensitivity analysis to reveal the factors most important to the fully solid state AMR and to optimize its efficiency. The performance of the fully solid state AMR is also compared with that of a traditional AMR in the present work. The results of this study will provide general guidelines for designing high performance solid state AMR systems.

  11. Preliminary analysis of cerebrospinal fluid proteome in patients with neurocysticercosis

    Institute of Scientific and Technical Information of China (English)

    TIAN Xiao-jun; LI Jing-yi; HUANG Yong; XUE Yan-ping

    2009-01-01

    Background Neurocysticercosis is the infection of the nervous system by the larvae of Taenia solium (T. solium). Despite continuous effort, the experimental diagnosis of neurocysticercosis remains unresolved. Since the cerebrospinal fluid (CSF) is in contact with the brain, dynamic information about pathological processes of the brain is likely to be reflected in CSF. Therefore, CSF may serve as a rich source of putative biomarkers related to neurocysticercosis. Comparative proteomic analysis of CSF of neurocysticercosis patients and control subjects may reveal differentially expressed proteins. Methods Two-dimensional difference in gel electrophoresis (2D-DIGE) was used to investigate differentially expressed proteins in CSF of patients with neurocysticercosis by comparing the protein profile of CSF from neurocysticercosis patients with that from control subjects. The differentially expressed spots/proteins were identified with matrix-assisted laser desorption/ionization-time of flight-time of flight (MALDI-TOF-TOF) mass spectrometry. Results Forty-four enzyme-digested peptides were obtained from 4 neurocysticercotic patients. Twenty-three were identified through a search of the NCBI protein database with Mascot software, of which 19 were up-expressed and 4 down-expressed. Of these proteins, the 26S proteasome, related to ATP- and ubiquitin-dependent degradation of proteins, and lipocalin-type prostaglandin D synthase, involved in PGD2 synthesis and extracellular transporter activities, were up-expressed, while transferrin, related to iron metabolism within the brain, was down-expressed. Conclusions This study established the proteomic profile of pooled CSF from 4 patients with neurocysticercosis, suggesting the potential value of proteomic analysis for the study of candidate biomarkers involved in the diagnosis or pathogenesis of neurocysticercosis.

  12. Preliminary Design and Analysis of the GIFTS Instrument Pointing System

    Science.gov (United States)

    Zomkowski, Paul P.

    2003-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Instrument is the next generation spectrometer for remote sensing weather satellites. The GIFTS instrument will be used to perform scans of the Earth's atmosphere by assembling a series of fields of view (FOVs) into a larger pattern. Realization of this process is achieved by step scanning the instrument FOV in a contiguous fashion across any desired portion of the visible Earth. A 2.3 arc second pointing stability, with respect to the scanning instrument, must be maintained for the duration of the FOV scan. A star tracker producing attitude data at a 100 Hz rate will be used by the autonomous pointing algorithm to precisely track target FOVs on the surface of the Earth. The main objective is to validate the pointing algorithm in the presence of spacecraft disturbances and determine acceptable disturbance limits from expected noise sources. Proof of concept validation of the pointing system algorithm is carried out with a full system simulation developed using Matlab Simulink. Models for the following components function within the full system simulation: inertial reference unit (IRU), attitude control system (ACS), reaction wheels, star tracker, and mirror controller. With the spacecraft orbital position and attitude maintained to within specified limits, the pointing algorithm receives quaternion, ephemeris, and initialization data that are used to construct the required mirror pointing commands at a 100 Hz rate. This comprehensive simulation will also aid in obtaining a thorough understanding of spacecraft disturbances and other sources of pointing system errors. Parameter sensitivity studies and disturbance analysis will be used to obtain limits of operability for the GIFTS instrument. The culmination of this simulation development and analysis will be used to validate the specified performance requirements outlined for this instrument.

  13. Parameter estimation in Probabilistic Seismic Hazard Analysis: current problems and some solutions

    Science.gov (United States)

    Vermeulen, Petrus

    2017-04-01

    A typical Probabilistic Seismic Hazard Analysis (PSHA) comprises identification of seismic source zones, determination of hazard parameters for these zones, selection of an appropriate ground motion prediction equation (GMPE), and integration over probabilities according to the Cornell-McGuire procedure. Determination of hazard parameters often does not receive the attention it deserves, and, therefore, problems therein are often overlooked. Here, many of these problems are identified, and some of them addressed. The parameters that need to be identified are those associated with the frequency-magnitude law, those associated with the earthquake recurrence law in time, and the parameters controlling the GMPE. This study is concerned with the frequency-magnitude law and the temporal distribution of earthquakes, not with GMPEs. The Gutenberg-Richter law is usually adopted for the frequency-magnitude relation, and a Poisson process for earthquake recurrence in time. Accordingly, the parameters that need to be determined are the slope parameter of the Gutenberg-Richter frequency-magnitude law, i.e. the b-value, the maximum magnitude at which the Gutenberg-Richter law applies, mmax, and the mean recurrence frequency, λ, of earthquakes. If, instead of the Cornell-McGuire, the "Parametric-Historic procedure" is used, these parameters do not have to be known before the PSHA computations; they are estimated directly during the computation. The resulting relation for the frequency of ground motion vibration parameters has a functional form analogous to the frequency-magnitude law, described by a parameter γ (analogous to the b-value of the Gutenberg-Richter law) and the maximum possible ground motion amax (analogous to mmax). Originally, the approach could be applied only to simple GMPEs; recently, however, the method was extended to incorporate more complex forms of GMPEs. With regard to the parameter mmax, there are numerous methods of estimation
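
    As an illustration of the parameter-estimation step discussed above, the sketch below (Python) computes the Gutenberg-Richter b-value and its standard error using the Aki (1965) maximum-likelihood estimator with Utsu's binning correction. This is one common estimator rather than the specific method advocated in the record, and the catalogue values, completeness magnitude, and catalogue duration are assumptions for illustration.

    import numpy as np

    def aki_b_value(magnitudes, m_c, dm=0.1):
        # Maximum-likelihood b-value (Aki, 1965) with Utsu's correction for
        # magnitudes binned to width dm; m_c is the completeness magnitude.
        m = np.asarray(magnitudes, dtype=float)
        m = m[m >= m_c]
        b = np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
        # Shi and Bolt (1982) standard error of the b-value
        se = 2.30 * b**2 * np.sqrt(np.sum((m - m.mean())**2) / (len(m) * (len(m) - 1)))
        return b, se

    # Hypothetical usage: magnitudes above an assumed completeness m_c = 4.0,
    # plus the mean annual rate of events with M >= m_c for the Poisson model.
    mags = np.array([4.1, 4.3, 4.0, 5.2, 4.6, 4.2, 4.8, 4.1, 4.4, 5.0])
    b, se = aki_b_value(mags, m_c=4.0)
    lam = len(mags) / 50.0  # 10 events in an assumed 50-year catalogue
    print(round(b, 2), round(se, 2), lam)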

  14. Hazard analysis in active landslide areas in the State of Veracruz, Mexico

    Science.gov (United States)

    Wilde, Martina; Morales Barrera, Wendy V.; Rodriguez Elizarrarás, Sergio R.; Solleiro Rebolledo, Elizabeth; Sedov, Sergey; Terhorst, Birgit

    2016-04-01

    mass movements are analyzed in order to reconstruct the complex interrelations of the causes and effects of landslide events. One of the major objectives of this research is to evaluate the potential hazard of active landslide areas. Detailed field analyses were performed to investigate the situation and dynamics of the slope movements. To this end, geomorphological mapping, sediment characterization, and geophysical methods are applied. On the one hand, a detailed sediment characterization aims to identify the type of material (e.g. geotechnical attributes); on the other hand, sediments can provide information on different activity phases and movement processes in slide masses. Furthermore, the focus is placed on the determination of landslide-relevant parameters and thresholds. Digital elevation models, which were generated before the onset of slope movements, are integrated in the geomorphological analysis. The poster presents the specific study sites in Veracruz and the situation of endangered slopes before and after the landslide events. It is planned to use this knowledge to model susceptibility maps for the region in the future. Moreover, field data will be used as basic information for further monitoring plans. Resulting susceptibility maps will be provided to the responsible authorities in order to support sustainable planning of settlements and infrastructure in hazardous regions.

  15. Hazardous waste status of discarded electronic cigarettes

    Energy Technology Data Exchange (ETDEWEB)

    Krause, Max J.; Townsend, Timothy G., E-mail: ttown@ufl.edu

    2015-05-15

    Highlights: • Electronic cigarettes were tested using TCLP and WET. • Several electronic cigarette products leached lead at hazardous waste levels. • Lead was the only element that exceeded hazardous waste concentration thresholds. • Nicotine solution may cause hazardous waste classification when discarded unused. - Abstract: The potential for disposable electronic cigarettes (e-cigarettes) to be classified as hazardous waste was investigated. The Toxicity Characteristic Leaching Procedure (TCLP) was performed on 23 disposable e-cigarettes in a preliminary survey of metal leaching. Based on these results, four e-cigarette products were selected for replicate analysis by TCLP and the California Waste Extraction Test (WET). Lead was measured in leachate at concentrations as high as 50 mg/L by WET and 40 mg/L by TCLP. Regulatory thresholds were exceeded by two of the 15 products tested in total. Therefore, some e-cigarettes would be toxicity characteristic (TC) hazardous waste but a majority would not. When disposed of in the unused form, e-cigarettes containing nicotine juice would be commercial chemical products (CCP) and would, in the United States (US), be considered a listed hazardous waste (P075). While household waste is exempt from hazardous waste regulation, there are many instances in which such waste would be subject to regulation. Manufacturers and retailers with unused or expired e-cigarettes or nicotine juice solution would be required to manage these as hazardous waste upon disposal. Current regulations and policies regarding the availability of nicotine-containing e-cigarettes worldwide were reviewed. Despite their small size, disposable e-cigarettes are consumed and discarded much more quickly than typical electronics, which may become a growing concern for waste managers.

  16. Advanced Rapid Imaging & Analysis for Monitoring Hazards (ARIA-MH) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a service-oriented hazard/disaster monitoring data system enabling both science and decision-support communities to monitor ground motion in areas of...

  17. Preliminary analysis of the use of smartwatches for longitudinal health monitoring.

    Science.gov (United States)

    Jovanov, Emil

    2015-08-01

    New generations of smartwatches feature continuous measurement of physiological parameters, such as heart rate, galvanic skin resistance (GSR), and temperature. In this paper we present the results of a preliminary analysis of the use of the Basis Peak smartwatch for longitudinal health monitoring during a 4 month period. Physiological measurements during sleep are validated using a Zephyr Bioharness 3 monitor and the SOMNOscreen+ polysomnographic monitoring system from SOMNOmedics. The average duration of sequences with no missed data was 49.9 minutes, with a maximum length of 17 hours, and such sequences represent 88.88% of recording time. The average duration of a charging event was 221.9 min, and the average time between charges was 54 hours, with a maximum charging event duration of 16.3 hours. Preliminary results indicate that existing smartwatches provide sufficient performance for longitudinal monitoring of health status and for analysis of health and wellness trends.
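
    A minimal sketch (Python) of the kind of data-completeness analysis summarized above: it measures the durations of contiguous gap-free segments in a timestamped sample stream. The one-minute nominal sampling interval and the gap tolerance are assumptions for illustration, not values taken from the paper.

    import numpy as np

    def gapfree_segments(timestamps_s, expected_dt_s=60.0, tol=1.5):
        # Durations (minutes) of contiguous segments with no missed samples.
        # A gap is any step between samples larger than tol * expected_dt_s.
        t = np.asarray(timestamps_s, dtype=float)
        gap_idx = np.where(np.diff(t) > tol * expected_dt_s)[0]
        starts = np.concatenate(([0], gap_idx + 1))
        ends = np.concatenate((gap_idx, [len(t) - 1]))
        return (t[ends] - t[starts]) / 60.0

    # Example with synthetic one-minute samples and a two-hour dropout:
    t = np.concatenate((np.arange(0, 6 * 3600, 60), np.arange(8 * 3600, 12 * 3600, 60)))
    seg = gapfree_segments(t)
    print(seg.mean(), seg.sum() / ((t[-1] - t[0]) / 60.0))  # mean length, coverage fraction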

  18. Relative risk analysis in regulating the use of radiation-emitting medical devices. A preliminary application

    Energy Technology Data Exchange (ETDEWEB)

    Jones, E.D.; Banks, W.W.; Altenbach, T.J.; Fischer, L.E. [Lawrence Livermore National Lab., CA (United States)

    1995-09-01

    This report describes a preliminary application of an analysis approach for assessing relative risks in the use of radiation- emitting medical devices. Results are presented on human-initiated actions and failure modes that are most likely to occur in the use of the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step in a US Nuclear Regulatory Commission (NRC) plan to evaluate the potential role of risk analysis in regulating the use of nuclear medical devices. For this preliminary application of risk assessment, the focus was to develop a basic process using existing techniques for identifying the most likely risk contributors and their relative importance. The approach taken developed relative risk rankings and profiles that incorporated the type and quality of data available and could present results in an easily understood form. This work was performed by the Lawrence Livermore National Laboratory for the NRC.

  19. SCEC Community Modeling Environment (SCEC/CME) - Seismic Hazard Analysis Applications and Infrastructure

    Science.gov (United States)

    Maechling, P. J.; Jordan, T. H.; Kesselman, C.; Moore, R.; Minster, B.; SCEC ITR Collaboration

    2003-12-01

    The Southern California Earthquake Center (SCEC) has formed a Geoscience/IT partnership to develop an advanced information infrastructure for system-level earthquake science in Southern California. This SCEC/ITR partnership comprises SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. This collaboration recently completed the second year in a five-year National Science Foundation (NSF) funded ITR project called the SCEC Community Modeling Environment (SCEC/CME). The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed by project collaborators include a Probabilistic Seismic Hazard Analysis system called OpenSHA [Field et al., this meeting]. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. A Rupture Dynamic Model (RDM) has also been developed that couples a rupture dynamics simulation into an anelastic wave model. The collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of SHA programs. To support computationally expensive simulations, we have constructed a grid-based system utilizing Globus software [Kesselman et al., this meeting]. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC, NPACI and Teragrid High Performance Computing Centers. We have

  20. Analysis of Risks in Hainan Island Typhoon Hazard Factor Based on GIS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The aim of this paper was to analyze the risks in the typhoon hazard factors in Hainan Island. [Method] Taking the theory and method of natural disaster evaluation as the starting point and support, and selecting Hainan Province, where typhoon disasters have been relatively severe, as the research target, based on typhoon data during 1958-2008, with the occurrence frequency of typhoon hazard-formative factors, maximum rainfall, and the potentially devastating effects of typhoon winds as evaluatio...

  1. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    Science.gov (United States)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions. Hence, relations between process intensities and the extent of losses, derived from the analysis of historic hazard events and object-specific restoration values, are used. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are measured quantitatively and spatially resolved by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  2. A comparative analysis of hazard models for predicting debris flows in Madison County, VA

    Science.gov (United States)

    Morrissey, Meghan M.; Wieczorek, Gerald F.; Morgan, Benjamin A.

    2001-01-01

    During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a sixteen-hour period, initiating floods and over 600 debris flows in a small area (130 km2) of Madison County, Virginia. Field studies showed that the majority (70%) of these debris flows initiated with a thickness of 0.5 to 3.0 m in colluvium on slopes from 17° to 41° (Wieczorek et al., 2000). This paper evaluated and compared the approaches of SINMAP, LISA, and Iverson's (2000) transient response model for slope stability analysis by applying each model to the landslide data from Madison County. Of these three stability models, only Iverson's transient response model evaluated stability conditions as a function of time and depth. Iverson's model would therefore be the preferred method of the three to evaluate landslide hazards on a regional scale in areas prone to rain-induced landslides, as it considers both the transient and spatial response of pore pressure in its calculation of slope stability. The stability calculation used in SINMAP and LISA is similar and utilizes probability distribution functions for certain parameters. Unlike SINMAP, which only considers soil cohesion, internal friction angle and rainfall-rate distributions, LISA allows the use of distributed data for all parameters, so it is the preferred model of the two to evaluate slope stability. Results from all three models suggested similar soil and hydrologic properties for triggering the landslides that occurred during the 1995 storm in Madison County, Virginia. The colluvium probably had cohesion of less than 2 kPa. The root-soil system is above the failure plane, and consequently root strength and tree surcharge had negligible effect on slope stability. The result that the final location of the water table was near the ground surface is supported by the water budget analysis of the rainstorm conducted by Smith et al. (1996).
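
    All three models compared above build on the infinite-slope limit-equilibrium formulation, so a minimal sketch (Python) of that factor-of-safety calculation is given below. The unit weights and the example parameter values are illustrative assumptions, not results from the Madison County study.

    import math

    def infinite_slope_fs(c_kpa, phi_deg, slope_deg, z_m, hw_m,
                          gamma_s=19.0, gamma_w=9.81):
        # Factor of safety for the infinite-slope model:
        #   c_kpa     effective cohesion [kPa]
        #   phi_deg   effective friction angle [deg]
        #   slope_deg slope angle [deg]
        #   z_m       depth of the potential failure plane [m]
        #   hw_m      water-table height above the failure plane [m]
        #   gamma_s/w soil and water unit weights [kN/m^3] (assumed values)
        theta, phi = math.radians(slope_deg), math.radians(phi_deg)
        driving = gamma_s * z_m * math.sin(theta) * math.cos(theta)
        resisting = c_kpa + (gamma_s * z_m - gamma_w * hw_m) * math.cos(theta)**2 * math.tan(phi)
        return resisting / driving

    # A 1.5 m thick colluvium on a 30 degree slope with c' = 2 kPa and phi' = 35 degrees
    # is stable when dry (FS ~ 1.4) but fails when fully saturated (FS ~ 0.7):
    print(round(infinite_slope_fs(2.0, 35.0, 30.0, 1.5, 0.0), 2),
          round(infinite_slope_fs(2.0, 35.0, 30.0, 1.5, 1.5), 2))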

  3. Readiness to implement Hazard Analysis and Critical Control Point (HACCP) systems in Iowa schools.

    Science.gov (United States)

    Henroid, Daniel; Sneed, Jeannie

    2004-02-01

    To evaluate current food-handling practices, food safety prerequisite programs, and employee knowledge and food safety attitudes and provide baseline data for implementing Hazard Analysis and Critical Control Point (HACCP) systems in school foodservice. One member of the research team visited each school to observe food-handling practices and assess prerequisite programs using a structured observation form. A questionnaire was used to determine employees' attitudes, knowledge, and demographic information. A convenience sample of 40 Iowa schools was recruited with input from the Iowa Department of Education. Descriptive statistics were used to summarize data. One-way analysis of variance was used to assess differences in attitudes and food safety knowledge among managers, cooks, and other foodservice employees. Multiple linear regression assessed the relationship between manager and school district demographics and the food safety practice score. Proper food-handling practices were not being followed in many schools and prerequisite food safety programs for HACCP were found to be inadequate for many school foodservice operations. School foodservice employees were found to have a significant amount of food safety knowledge (15.9+/-2.4 out of 20 possible points). School districts with managers (P=.019) and employees (P=.030) who had a food handler certificate were found to have higher food safety practice scores. Emphasis on implementing prerequisite programs in preparation for HACCP is needed in school foodservice. Training programs, both basic food safety such as ServSafe and HACCP, will support improvement of food-handling practices and implementation of prerequisite programs and HACCP.

  4. Sammon mapping for preliminary analysis in Hyperspectral Imagery

    Directory of Open Access Journals (Sweden)

    Nicolae APOSTOLESCU

    2016-03-01

    The main goal of this paper is to present the implementation of the Sammon algorithm developed for finding N points in a lower m-dimensional subspace, where the original points are from a high n-dimensional space. The mapping is done so that interpoint Euclidean distances in the m-space correspond to the distances measured in the n-dimensional space. This method, known as a non-linear projection method or multidimensional scaling (MDS), aims to preserve the global properties of the points. The method is based on the idea of transforming the original, n-dimensional input space into a reduced, m-dimensional one, where m < n. Principal Component Analysis (PCA) may be applied as a pre-processing procedure in order to obtain the starting N points in the lower subspace. The algorithm was tested on hyperspectral data with spectra of various lengths. Depending on the size of the input data (number of points), the number of learning iterations and the computational facilities available, Sammon mapping might be computationally expensive.
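
    A gradient-descent sketch (Python) of the Sammon stress minimization described above. The original algorithm uses a Newton-like update and, as noted in the abstract, can start from a PCA projection; this simplified version starts from small random coordinates, and the learning rate may need tuning for a given data set.

    import numpy as np

    def sammon(X, m=2, n_iter=300, lr=0.1, seed=0):
        # Map N points from n-dimensional X to m dimensions by minimizing Sammon stress.
        rng = np.random.default_rng(seed)
        N = X.shape[0]
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # original distances
        np.fill_diagonal(D, 1.0)            # dummy value; diagonal terms are never used
        c = D[np.triu_indices(N, 1)].sum()  # normalization constant of the stress
        Y = rng.normal(scale=1e-2, size=(N, m))
        for _ in range(n_iter):
            d = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
            np.fill_diagonal(d, 1.0)
            w = (D - d) / (D * d)           # pairwise weights of the stress gradient
            np.fill_diagonal(w, 0.0)
            diff = Y[:, None, :] - Y[None, :, :]
            grad = -2.0 / c * (w[:, :, None] * diff).sum(axis=1)
            Y -= lr * grad                  # plain gradient descent step
        return Y

    # e.g. project 100 random 50-band "spectra" onto the plane:
    Y = sammon(np.random.default_rng(1).random((100, 50)))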

  5. A simplified procedure of linear regression in a preliminary analysis

    Directory of Open Access Journals (Sweden)

    Silvia Facchinetti

    2013-05-01

    The analysis of a large statistical data-set can be guided by the study of a particularly interesting variable Y – the regressand – and an explicative variable X, chosen among the remaining variables, jointly observed. The study gives a simplified procedure to obtain the functional link between the variables, y = y(x), by a partition of the data-set into m subsets, in which the observations are synthesized by location indices (mean or median) of X and Y. Polynomial models for y(x) of order r are considered to verify the characteristics of the given procedure; in particular we assume r = 1 and 2. The distributions of the parameter estimators are obtained by simulation, when the fitting is done for m = r + 1. Comparisons of the results, in terms of distribution and efficiency, are made with the results obtained by the ordinary least squares method. The study also gives some considerations on the consistency of the estimated parameters obtained by the given procedure.
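
    A sketch (Python) of the procedure described above, under the assumption that the m subsets are formed from quantiles of X (the paper's exact partitioning rule is not reproduced here): each subset is summarized by the median of X and of Y, and a degree-r polynomial is fitted through the m = r + 1 synthetic points, for comparison with ordinary least squares.

    import numpy as np

    def grouped_fit(x, y, r=1, use_median=True):
        # Fit a degree-r polynomial through m = r + 1 group location indices.
        m = r + 1
        edges = np.quantile(x, np.linspace(0.0, 1.0, m + 1))
        loc = np.median if use_median else np.mean
        xs, ys = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (x >= lo) & (x <= hi)
            xs.append(loc(x[mask]))
            ys.append(loc(y[mask]))
        return np.polyfit(xs, ys, r)  # coefficients, highest degree first

    # Simulated comparison with ordinary least squares for r = 1:
    rng = np.random.default_rng(1)
    x = rng.uniform(0.0, 10.0, 1000)
    y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, x.size)
    print(grouped_fit(x, y, r=1), np.polyfit(x, y, 1))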

  6. Preliminary Analysis of Slope Stability in Kuok and Surrounding Areas

    Directory of Open Access Journals (Sweden)

    Dewandra Bagus Eka Putra

    2016-12-01

    The steepness of a slope is influenced by the condition of the rocks beneath the surface. On steep slopes, the amount of surface runoff and its transport energy are also enlarged, because the gravitational driving force grows as the surface tilts away from the horizontal plane. In other words, more and more topsoil is eroded; when a slope becomes twice as steep, the amount of erosion per unit area becomes 2.0-2.5 times greater. Kuok and the surrounding area lie on the road access between West Sumatra and Riau, which plays an important role in the economies of both provinces. The purpose of this study is to map the locations that have fairly steep slopes and the potential modes of landslides. Based on the SRTM data obtained, the roads in the Kuok area have a minimum elevation of +33 m and a maximum of +217.329 m. Rugged road sections with slopes ranging from 24.08° to 44.68° make this area prone to frequent landslides. The slope stability analysis of a slope near the Koto Panjang Water Power Plant indicated that the active failure mode is toppling failure or rock fall and that the potential failure zone is in the central part of the slope.

  7. Social network analysis in identifying influential webloggers: A preliminary study

    Science.gov (United States)

    Hasmuni, Noraini; Sulaiman, Nor Intan Saniah; Zaibidi, Nerda Zura

    2014-12-01

    In recent years, second-generation internet-based services such as weblogs have become an effective communication tool for publishing information on the Web. Weblogs have unique characteristics that deserve users' attention. Some webloggers see weblogs as an appropriate medium to initiate and expand a business. These webloggers, also known as direct profit-oriented webloggers (DPOWs), communicate and share knowledge with each other through social interaction. However, survivability is the main issue among DPOWs, and frequent communication with influential webloggers is one way to survive as a DPOW. This paper aims to understand the network structure and identify influential webloggers within the network. Proper understanding of the network structure can assist us in knowing how information is exchanged among members and can enhance survivability among DPOWs. Thirty DPOWs were involved in this study. Degree centrality and betweenness centrality measurements in Social Network Analysis (SNA) were used to examine the strength of relations and to identify influential webloggers within the network. Webloggers with the highest values of these measurements are considered the most influential webloggers in the network.
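
    A minimal sketch (Python) of the two SNA measures used in the study, computed with the networkx library on a hypothetical interaction network; the edge list below is illustrative and is not the 30-weblogger data set analysed in the paper.

    import networkx as nx

    # Hypothetical interaction data: each tuple links two webloggers who communicate.
    edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("C", "E")]
    G = nx.Graph(edges)

    degree = nx.degree_centrality(G)            # share of possible direct ties each weblogger holds
    betweenness = nx.betweenness_centrality(G)  # how often a weblogger lies on shortest paths

    # The most "influential" webloggers under each measure:
    print(max(degree, key=degree.get), max(betweenness, key=betweenness.get))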

  8. Seismic hazard analysis with PSHA method in four cities in Java.

    Science.gov (United States)

    Elistyawati, Y.; Palupi, I. R.; Suharsono

    2016-11-01

    In this study, tectonic earthquakes were analyzed in terms of peak ground acceleration using the PSHA method, dividing the region into earthquake source zones. The study applied earthquake data from 1965-2015 that had been checked for catalogue completeness; the study area covered the whole of Java, with emphasis on four large cities prone to earthquakes. The results are hazard maps for return periods of 500 years and 2500 years, and hazard curves for four major cities (Jakarta, Bandung, Yogyakarta, and Banyuwangi). The 500-year PGA hazard map of Java shows peak ground accelerations ranging from 0 g to ≥ 0.5 g, while the 2500-year return period gives values from 0 to ≥ 0.8 g. The PGA hazard curves indicate the most influential earthquake source for each city: a background source associated with the Cimandiri fault for the first city, fault background sources for the city of Bandung, the background source of the Opak fault for the city of Yogyakarta, and the Java and Sumba megathrust sources for the Banyuwangi earthquake hazard.

  9. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear or non-nuclear facilities shall be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach which depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described, and compared to those criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, a summary of important application issues is given, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for design of DOE facilities.

  10. Flood hazard zoning in Yasooj region, Iran, using GIS and multi-criteria decision analysis

    Directory of Open Access Journals (Sweden)

    Omid Rahmati

    2016-05-01

    Flood is considered to be the most common natural disaster worldwide during the last decades. Flood hazard potential mapping is required for the management and mitigation of flood. The present research aimed to assess the efficiency of the analytical hierarchy process (AHP) in identifying potential flood hazard zones by comparing the results with those of a hydraulic model. Initially, four parameters, namely distance to river, land use, elevation and land slope, were used in part of the Yasooj River, Iran. In order to determine the weight of each effective factor, questionnaires of comparison ratings on Saaty's scale were prepared and distributed to eight experts. The normalized weights of the criteria/parameters were determined based on Saaty's nine-point scale and their importance in specifying flood hazard potential zones, using the AHP and eigenvector methods. The set of criteria was integrated by the weighted linear combination method using ArcGIS 10.2 software to generate the flood hazard prediction map. The inundation simulation (extent and depth of flood) was conducted using the hydrodynamic program HEC-RAS for 50- and 100-year interval floods. The validation of the flood hazard prediction map was conducted based on flood extent and depth maps. The results showed that the AHP technique is promising for making accurate and reliable predictions of flood extent. Therefore, the AHP and geographic information system (GIS) techniques are suggested for assessment of the flood hazard potential, specifically in no-data regions.
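
    A sketch (Python) of the eigenvector weighting and consistency check underlying the AHP step described above. The pairwise comparison matrix is a hypothetical example on Saaty's 1-9 scale for the four criteria in an assumed order (distance to river, land use, elevation, land slope), not the values elicited from the eight experts.

    import numpy as np

    A = np.array([
        [1.0,   3.0,   5.0, 4.0],
        [1/3.0, 1.0,   3.0, 2.0],
        [1/5.0, 1/3.0, 1.0, 0.5],
        [1/4.0, 1/2.0, 2.0, 1.0],
    ])

    # Principal eigenvector of the comparison matrix gives the criterion weights.
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()

    # Consistency ratio (random index RI = 0.90 for n = 4) should stay below 0.1.
    n = A.shape[0]
    CI = (vals.real[k] - n) / (n - 1)
    CR = CI / 0.90
    print(np.round(w, 3), round(CR, 3))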

  11. Preliminary Failure Modes and Effects Analysis of the US Massive Gas Injection Disruption Mitigation System Design

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2013-10-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a candidate design for the ITER Disruption Mitigation System. This candidate is the Massive Gas Injection System that provides machine protection in a plasma disruption event. The FMEA was quantified with “generic” component failure rate data as well as some data calculated from operating facilities, and the failure events were ranked for their criticality to system operation.

  12. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2007-08-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with “generic” component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance.

  13. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2010-06-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with “generic” component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance.

  14. PRELIMINARY PHYTOCHEMICAL ANALYSIS AND ACUTE ORAL TOXICITY STUDY OF CLITORIA TERNATEA LINN. ROOTS IN ALBINO MICE

    Directory of Open Access Journals (Sweden)

    Deka Manalisha

    2011-12-01

    Clitoria ternatea has been used since ancient times for its medicinal value. Almost all parts of the plant have medicinal properties. The root of the plant is reported to have antidiarrheal, antihistaminic and cholinergic activity, among others. Traditionally the root has been used for the treatment of many conditions such as leucorrhoea, diarrhea, urinary problems, impotency and stomach trouble, and as a diuretic. The present study was designed to carry out a preliminary phytochemical analysis and an acute oral toxicity study of the root of the plant. The shade-dried material was ground and used in the study. The preliminary phytochemical analysis was done following standard protocols. For the acute oral toxicity study, a methanolic extract of the root was used; the extract was prepared by a standard protocol. The preliminary phytochemical analysis showed the presence of proteins, carbohydrates, glycosides, resins, saponins, flavonoids, alkaloids, steroids and phenols. The acute oral toxicity study showed no mortality up to a dose of 3000 mg per kg body weight. The phytochemicals present indicate the medicinal value of the plant, and its non-toxic character supports its use as a medicine. Thus we can conclude that the root of the plant can be used as a safe drug against many diseases.

  15. PRELIMINARY PHYTOCHEMICAL ANALYSIS AND ACUTE ORAL TOXICITY STUDY OF MUCUNA PRURIENS LINN. IN ALBINO MICE

    Directory of Open Access Journals (Sweden)

    Deka Manalisha

    2012-02-01

    Mucuna pruriens Linn. is an annual climbing shrub that has held an important place among aphrodisiac herbs in India since ancient times. The plant has been used traditionally for many medicinal purposes, for example in infertility, Parkinson's disease and loss of libido, and as an antioxidant, antivenom and antimicrobial agent. The present study was carried out to perform a preliminary phytochemical analysis and an acute oral toxicity study of the seeds of M. pruriens in albino mice. Mature seeds of M. pruriens were shade-dried and ground in a mechanical grinder. The preliminary phytochemical analysis was done following standard protocols. For the acute oral toxicity study, a methanolic extract of the seeds was used; the extract was prepared in a Soxhlet apparatus. The preliminary phytochemical analysis showed the presence of proteins, carbohydrates, glycosides, alkaloids, steroids, flavonoids, phenols and tannins. The acute oral toxicity study showed no mortality up to a dose of 4000 mg per kg body weight. The phytochemicals present indicate the medicinal value of the plant, and its non-toxic character supports its use as a medicine. Thus, we can conclude that the seed of the plant can be used as a safe drug against many diseases.

  16. Potential biological hazard of importance for HACCP plans in fresh fish processing

    Directory of Open Access Journals (Sweden)

    Baltić Milan Ž.

    2009-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is scientifically based and focused on problem prevention in order to assure that the food products produced are safe to consume. Prerequisite programs such as GMP (Good Manufacturing Practices) and GHP (Good Hygienic Practices) are an essential foundation for the development and implementation of successful HACCP plans. One of the preliminary tasks in the development of a HACCP plan is to conduct a hazard analysis. The process of conducting a hazard analysis involves two stages. The first is hazard identification, and the second is the HACCP team's decision as to which potential hazards must be addressed in the HACCP plan. By definition, the HACCP concept covers all types of potential food safety hazards: biological, chemical and physical, whether they are naturally occurring in the food, contributed by the environment or generated by a mistake in the manufacturing process. In raw fish processing, potentially significant biological hazards which are reasonably likely to cause illness in humans are parasites (Trematoda, Nematoda, Cestoda), bacteria (Salmonella, E. coli, Vibrio parahaemolyticus, Vibrio vulnificus, Listeria monocytogenes, Clostridium botulinum, Staphylococcus aureus), viruses (Norwalk virus, enteroviruses, hepatitis A virus, rotavirus) and biotoxins. Upon completion of the hazard analysis, any measure(s) used to control the hazard(s) should be described.

  17. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    Science.gov (United States)

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism, and the comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the multi-factor analysis and calculation. In view of the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors and that the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails of the hazard factors. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method may be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters.
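
    A simplified two-dimensional sketch (Python) of the joint return period calculation described above, using the Frank copula. The dependence parameter, the marginal non-exceedance probabilities, and the mean interarrival time in the example are assumed values; fitting the copula to the hazard-factor data (e.g. via Kendall's tau) is not shown.

    import math

    def frank_copula(u, v, theta):
        # Bivariate Frank copula C(u, v); theta != 0 controls the dependence strength.
        num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
        den = math.exp(-theta) - 1.0
        return -math.log(1.0 + num / den) / theta

    def joint_return_periods(u, v, theta, mu_years):
        # "OR" and "AND" joint return periods for marginal non-exceedance probabilities u, v,
        # where mu_years is the mean interarrival time of the hazard events.
        C = frank_copula(u, v, theta)
        t_or = mu_years / (1.0 - C)            # either factor exceeds its threshold
        t_and = mu_years / (1.0 - u - v + C)   # both factors exceed their thresholds
        return t_or, t_and

    # Example with assumed marginals (90% non-exceedance) and dependence parameter:
    print(joint_return_periods(u=0.9, v=0.9, theta=5.0, mu_years=0.6))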

  18. Hazard Evaluation for a Salt Well Centrifugal Pump Design Using Service Water for Lubrication and Cooling

    Energy Technology Data Exchange (ETDEWEB)

    GRAMS, W.H.

    2000-10-09

    This report documents the results of a preliminary hazard analysis (PHA) covering the new salt well pump design. The PHA identified ten hazardous conditions mapped to four analyzed accidents: flammable gas deflagrations, fire in contaminated area, tank failure due to excessive loads, and waste transfer leaks. This document also presents the results of the control decision/allocation process. A backflow preventer and associated limiting condition were assigned.

  19. Preliminary CFD Analysis for HVAC System Design of a Containment Building

    Energy Technology Data Exchange (ETDEWEB)

    Son, Sung Man; Choi, Choengryul [ELSOLTEC, Yongin (Korea, Republic of); Choo, Jae Ho; Hong, Moonpyo; Kim, Hyungseok [KEPCO Engineering and Construction, Gimcheon (Korea, Republic of)

    2016-10-15

    HVAC (Heating, Ventilation, Air Conditioning) systems have mainly been designed based on overall heat balance and averaging concepts, which are simple and useful for designing the overall system. However, such a method has the disadvantage that it cannot predict the local flow and temperature distributions in a containment building. In this study, a preliminary CFD (Computational Fluid Dynamics) analysis is carried out to obtain detailed flow and temperature distributions in a containment building and to confirm that such information can be obtained via CFD analysis. This approach can also be useful for analyzing hydrogen behavior in an accident in which hydrogen is released into a containment building. The preliminary analysis provided detailed information on the reactor containment building, and we confirmed that CFD analysis can offer sufficiently detailed information about flow patterns and the temperature field, and that the CFD technique is a useful tool for the HVAC design of nuclear power plants.

  20. Hazardous waste status of discarded electronic cigarettes.

    Science.gov (United States)

    Krause, Max J; Townsend, Timothy G

    2015-05-01

    The potential for disposable electronic cigarettes (e-cigarettes) to be classified as hazardous waste was investigated. The Toxicity Characteristic Leaching Procedure (TCLP) was performed on 23 disposable e-cigarettes in a preliminary survey of metal leaching. Based on these results, four e-cigarette products were selected for replicate analysis by TCLP and the California Waste Extraction Test (WET). Lead was measured in leachate at concentrations as high as 50 mg/L by WET and 40 mg/L by TCLP. Regulatory thresholds were exceeded by two of the 15 products tested in total. Therefore, some e-cigarettes would be toxicity characteristic (TC) hazardous waste but a majority would not. When disposed of in the unused form, e-cigarettes containing nicotine juice would be commercial chemical products (CCP) and would, in the United States (US), be considered a listed hazardous waste (P075). While household waste is exempt from hazardous waste regulation, there are many instances in which such waste would be subject to regulation. Manufacturers and retailers with unused or expired e-cigarettes or nicotine juice solution would be required to manage these as hazardous waste upon disposal. Current regulations and policies regarding the availability of nicotine-containing e-cigarettes worldwide were reviewed. Despite their small size, disposable e-cigarettes are consumed and discarded much more quickly than typical electronics, which may become a growing concern for waste managers.

  1. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
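
    A discrete-time sketch (Python) of the HFA quantities discussed above: given a (possibly time-varying) annual exceedance probability series for a design event, it returns the reliability, the discrete hazard, and an approximate expected failure time (average return period). The linearly increasing probability series in the example is an assumption for illustration, not a result from the paper.

    import numpy as np

    def failure_time_stats(p_t):
        # p_t[t-1] is the exceedance probability of the design event in year t;
        # under stationarity all entries are equal and E[T] reduces to 1/p.
        p_t = np.asarray(p_t, dtype=float)
        reliability = np.cumprod(1.0 - p_t)  # P(no exceedance through year t)
        hazard = p_t                         # P(failure in year t | survival to year t)
        # E[T] = sum over t >= 0 of P(T > t); truncating at the horizon gives a lower bound
        expected_failure_time = 1.0 + reliability.sum()
        return reliability, hazard, expected_failure_time

    # Exceedance probability drifting from 1% to 3% over a 100-year planning horizon:
    rel, hz, avg_T = failure_time_stats(np.linspace(0.01, 0.03, 100))
    print(round(avg_T, 1), round(rel[-1], 3))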

  2. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    Science.gov (United States)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the

  3. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations, that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  4. AschFlow - A dynamic landslide run-out model for medium scale hazard analysis.

    Science.gov (United States)

    Luna, Byron Quan; Blahut, Jan; van Asch, Theo; van Westen, Cees; Kappes, Melanie

    2015-04-01

    Landslide and debris flow hazard assessments require a scale-dependent analysis in order to mitigate damage and other negative consequences at the respective scales of occurrence. Medium or large scale landslide run-out modelling for many possible landslide initiation areas has been a cumbersome task in the past. This arises from the difficulty of precisely defining the location and volume of the released mass and from the inability of the run-out models to compute the displacement for a large number of individual initiation areas (computationally exhaustive). Most of the existing physically based run-out models have complications in handling such situations, and therefore empirical methods have been used as a practical means to predict landslide mobility at a medium scale (1:10,000 to 1:50,000). In this context, a simple medium scale numerical model for rapid mass movements in urban and mountainous areas was developed. The deterministic nature of the approach makes it possible to calculate the velocity, height and increase in mass by erosion, resulting in the estimation of various forms of impacts exerted by debris flows at the medium scale. The established and implemented model ("AschFlow") is a 2-D one-phase continuum model that simulates the entrainment, spreading and deposition processes of a landslide or debris flow at a medium scale. The flow is thus treated as a single-phase material, whose behavior is controlled by rheology (e.g. Voellmy or Bingham). The developed regional model "AschFlow" was applied and evaluated in well-documented areas with known past debris flow events.
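
    A minimal sketch (Python) of the Voellmy basal resistance term mentioned above, which controls the flow behavior in single-phase continuum models of this kind. The friction parameters and the example flow state are assumed typical debris-flow values, not AschFlow calibration results.

    import math

    def voellmy_resistance(h, u, slope_deg, mu=0.1, xi=500.0, rho=1800.0, g=9.81):
        # Basal flow resistance per unit area [Pa] under the Voellmy rheology:
        #   h, u    flow depth [m] and depth-averaged velocity [m/s]
        #   mu, xi  dry-friction coefficient [-] and turbulent coefficient [m/s^2]
        #           (assumed values; calibrated per event in practice)
        theta = math.radians(slope_deg)
        normal_stress = rho * g * h * math.cos(theta)
        return mu * normal_stress + rho * g * u**2 / xi

    # Resistance of a 2 m deep flow moving at 6 m/s in a 15 degree channel:
    print(round(voellmy_resistance(2.0, 6.0, 15.0), 1), "Pa")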

  5. Simulating Social and Political Influences on Hazard Analysis through a Classroom Role Playing Exercise

    Science.gov (United States)

    Hales, T. C.; Cashman, K. V.

    2006-12-01

    Geological hazard mitigation is a complicated process that involves both detailed scientific research and negotiations between community members with competing interests in the solution. Geological hazards classes based around traditional lecture methods have difficulty conveying the decision-making processes that go into these negotiations. To address this deficiency, we have spent five years developing and testing a role- playing exercise based on mitigation of a dam outburst hazard on Ruapehu volcano, New Zealand. In our exercise, students are asked to undertake one of five different roles and decide the best way to mitigate the hazard. Over the course of their discussion students are challenged to reach a consensus decision despite the presence of strongly opposed positions. Key to the success of the exercise are (1) the presence of a facilitator and recorder for each meeting, (2) the provision of unique information for each interested party, and (3) the division of the class into multiple meeting groups, such that everyone is required to participate and individual groups can evolve to different conclusions. The exercise can be completed in a single hour and twenty minute classroom session that is divided into four parts: an introduction, a meeting between members of the same interested party to discuss strategy, a meeting between different interested parties, and a debriefing session. This framework can be readily translated to any classroom hazard problem. In our experience, students have responded positively to the use of role-playing to supplement lectures.

  6. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of, and minimize the uncertainty in, estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.

  7. GC-MS analysis, preliminary phytochemical screening, physicochemical analysis and anti-diabetic activity of ethanol extract of Jasminum cuspidatum leaves

    National Research Council Canada - National Science Library

    Singumsetty Vinay; Shaik Karimulla; Devarajan Saravanan

    2014-01-01

    The purpose of the present study was to investigate the GC-MS analysis, preliminary phytochemical screening, physicochemical analysis and anti-diabetic activity of an ethanol extract of the leaves of Jasminum cuspidatum...

  8. ANSI/ASHRAE/IESNA Standard 90.1-2010 Preliminary Determination Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Liu, Bing; Rosenberg, Michael I.

    2010-11-01

    The United States (U.S.) Department of Energy (DOE) conducted a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2010 (ASHRAE Standard 90.1-2010, Standard 90.1-2010, or 2010 edition) would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2007 (ASHRAE Standard 90.1-2007, Standard 90.1-2007, or 2007 edition). The preliminary analysis considered each of the 109 addenda to ASHRAE Standard 90.1-2007 that were included in ASHRAE Standard 90.1-2010. All 109 addenda processed by ASHRAE in the creation of Standard 90.1-2010 from Standard 90.1-2007 were reviewed by DOE, and their combined impact on a suite of 16 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s preliminary determination. However, out of the 109 addenda, 34 were preliminarily determined to have measurable and quantifiable impact.

  9. Regional-scale analysis of lake outburst hazards in the southwestern Pamir, Tajikistan, based on remote sensing and GIS

    Directory of Open Access Journals (Sweden)

    M. Mergili

    2011-05-01

    This paper presents an analysis of the hazards emanating from the sudden drainage of alpine lakes in South-Western Tajik Pamir. In the last 40 yr, several new lakes have formed in the front of retreating glacier tongues, and existing lakes have grown. Other lakes are dammed by landslide deposits or older moraines. In 2002, sudden drainage of a glacial lake in the area triggered a catastrophic debris flow. Building on existing approaches, a rating scheme was devised allowing quick, regional-scale identification of potentially hazardous lakes and possible impact areas. This approach relies on GIS, remote sensing and empirical modelling, largely based on medium-resolution international datasets. Out of the 428 lakes mapped in the area, 6 were rated very hazardous and 34 hazardous. This classification was used for the selection of lakes requiring in-depth investigation. Selected cases are presented and discussed in order to understand the potentials and limitations of the approach used. Such an understanding is essential for the appropriate application of the methodology for risk mitigation purposes.
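    The rating scheme described above combines lake and catchment attributes derived from remote sensing and GIS into hazard classes. The sketch below is only a generic illustration of such a scoring approach; the indicators, weights and class breaks are invented and are not those of the published scheme.

```python
def rate_lake(area_km2, dam_type, glacier_distance_m, downstream_slope_deg):
    """Toy regional-scale rating of lake outburst susceptibility.
    Indicators, weights and class breaks are illustrative only."""
    score = 0
    score += 2 if area_km2 > 0.1 else 1 if area_km2 > 0.01 else 0
    score += {"ice": 3, "landslide": 2, "moraine": 2, "bedrock": 0}.get(dam_type, 1)
    score += 2 if glacier_distance_m < 500 else 1 if glacier_distance_m < 2000 else 0
    score += 1 if downstream_slope_deg > 10 else 0
    if score >= 6:
        return "very hazardous"
    if score >= 4:
        return "hazardous"
    return "low"

if __name__ == "__main__":
    print(rate_lake(0.15, "ice", 200, 15))       # -> "very hazardous"
    print(rate_lake(0.005, "bedrock", 5000, 3))  # -> "low"
```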

  10. Preliminary phytochemical analysis and DPPH free radical scavenging activity of Trewia nudiflora Linn. roots and leaves.

    Science.gov (United States)

    Balakrishnan, N; Srivastava, Mayank; Tiwari, Pallavi

    2013-11-01

    Oxidative stress is one of the major causative factors of many chronic and degenerative diseases. Plants have been used in traditional medicine in different parts of the world for thousands of years and continue to provide new remedies for humankind. The present study was carried out to investigate the preliminary phytochemical profile of various extracts of roots and leaves of Trewia nudiflora (Euphorbiaceae) and their antioxidant activity by the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging method. The preliminary phytochemical screening showed the presence of several phytochemicals, including alkaloids, glycosides, flavonoids, steroids, phenolic compounds and tannins. The ethanol and aqueous extracts of roots and leaves of Trewia nudiflora showed significant antioxidant activity compared to the standard drug ascorbic acid.

  11. GIS-Based Spatial Analysis and Modeling for Landslide Hazard Assessment: A Case Study in Upper Minjiang River Basin

    Institute of Scientific and Technical Information of China (English)

    FENG Wenlan; ZHOU Qigang; ZHANG Baolei; ZHOU Wancun; LI Ainong; ZHANG Haizhen; XIAN Wei

    2006-01-01

    By analyzing the topographic features of past landslides since the 1980s and the main land-cover types (including change information) in the landslide-prone area, this paper modelled the spatial distribution of landslide hazard in the upper Minjiang River Basin using GIS-based spatial analysis. The results showed that landslide occurrence in this region is closely related to topographic features. Most areas with a high hazard probability were deeply incised gorges. Most of the investigated landslides occurred in areas with elevations lower than 3,000 m, owing to fragile topographic conditions and intensive human disturbance. Land-cover type, including its change information, is likely an important environmental factor in triggering landslides. Destruction of vegetation, driven by population growth and its demands, increased the probability of landslides on steep slopes.

  12. CONSTRUCTION OF THE CHINESE LEARNERS' PARALLEL CORPUS OF JAPANESE AND ITS PRELIMINARY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Masatake Dantsuji

    2004-01-01

    This study aims to introduce the project to construct the Chinese learners' corpus (LC) of Japanese at Dalian University of Technology (DUT), and details the LC construction, the development of the DUT Corpus Linguistics Tools, and the contribution to the education of Japanese as a second language. The outstanding characteristic of the LC is its parallel form, with learners' Japanese texts and their Chinese translations, which enables comprehensive analysis of the influence of Chinese (L1) on Japanese (L2). We have made a preliminary analysis of the errors contained.

  13. Preliminary Design and Analysis of the ARES Atmospheric Flight Vehicle Thermal Control System

    Science.gov (United States)

    Gasbarre, J. F.; Dillman, R. A.

    2003-01-01

    The Aerial Regional-scale Environmental Survey (ARES) is a proposed 2007 Mars Scout Mission that will be the first mission to deploy an atmospheric flight vehicle (AFV) on another planet. This paper will describe the preliminary design and analysis of the AFV thermal control system for its flight through the Martian atmosphere and also present other analyses broadening the scope of that design to include other phases of the ARES mission. Initial analyses are discussed and results of trade studies are presented which detail the design process for AFV thermal control. Finally, results of the most recent AFV thermal analysis are shown and the plans for future work are discussed.

  14. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling: which elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including
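    One practical element of representing an active fault source, touched on above, is deciding how ruptures of each magnitude are sized and positioned ("floated") on the fault surface. The sketch below illustrates that idea in plain Python and is not the OpenQuake implementation; the area-magnitude coefficients follow a Wells and Coppersmith (1994)-style regression, while the aspect ratio and stepping distance are arbitrary assumptions.

```python
import math

def rupture_dimensions(mag, aspect_ratio=1.5):
    """Rupture area (km^2) from a Wells & Coppersmith (1994)-style scaling,
    log10 A = -3.49 + 0.91*M (all slip types), split into length and width
    using an assumed aspect ratio."""
    area = 10.0 ** (-3.49 + 0.91 * mag)
    width = math.sqrt(area / aspect_ratio)
    return area, aspect_ratio * width, width   # area, length, width

def count_floating_ruptures(fault_length_km, fault_width_km, mag, step_km=2.0):
    """Number of distinct positions ('floating ruptures') a rupture of the
    given magnitude can occupy on a rectangular fault plane, stepping the
    rupture along strike and down dip."""
    _, rl, rw = rupture_dimensions(mag)
    rl, rw = min(rl, fault_length_km), min(rw, fault_width_km)
    n_strike = max(1, int((fault_length_km - rl) / step_km) + 1)
    n_dip = max(1, int((fault_width_km - rw) / step_km) + 1)
    return n_strike * n_dip

if __name__ == "__main__":
    # Hypothetical 80 km long, 15 km wide fault plane
    for m in (5.5, 6.0, 6.5, 7.0):
        print(f"M{m}: {count_floating_ruptures(80.0, 15.0, m)} rupture positions")
```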

  15. Ground landslide hazard potency using geoelectrical resistivity analysis and VS30, case study at geophysical station, Lembang, Bandung

    Science.gov (United States)

    Rohadi, Supriyanto; Sakya, Andi Eka; Masturyono, Murjaya, Jaya; Sunardi, Bambang; Rasmid, Ngadmanto, Drajat; Susilanto, Pupung; Nugraha, Jimmi; Pakpahan, Suliyanti

    2017-07-01

    We have conducted a geoelectrical resistivity and shear-wave velocity (Vs30) study to identify the landslide hazard potential around the Lembang Geophysical Station, Bandung (107.617° E, 6.825° S). The geoelectrical analysis used a dipole-dipole resistivity configuration, while the shear-wave velocity analysis was performed using the Multichannel Analysis of Surface Waves (MASW). The results indicate that the soil or clay depth inferred from the electrical resistivity survey is in accordance with the depth confirmed by the MASW investigation. These conditions, together with the steep slopes in the area, indicate a high landslide potential.
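    Vs30, used above as a site characterization parameter, is the time-averaged shear-wave velocity over the uppermost 30 m. Independently of this study's data, the sketch below shows how Vs30 is commonly computed from a layered shear-wave velocity profile such as one inverted from MASW dispersion curves; the example profile is made up.

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.
    `layers` is a list of (thickness_m, vs_m_per_s) tuples from the surface
    down; the deepest layer is extended to 30 m depth if needed."""
    depth, travel_time = 0.0, 0.0
    for thickness, vs in layers:
        if depth >= 30.0:
            break
        used = min(thickness, 30.0 - depth)
        travel_time += used / vs
        depth += used
    if depth < 30.0:                      # extend the deepest layer
        travel_time += (30.0 - depth) / layers[-1][1]
    return 30.0 / travel_time

if __name__ == "__main__":
    # Hypothetical MASW-derived profile: soft soil over stiffer clay over rock
    profile = [(5.0, 180.0), (10.0, 300.0), (20.0, 600.0)]
    print(f"Vs30 = {vs30(profile):.0f} m/s")
```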

  16. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Gürpinar, Aybars, E-mail: aybarsgurpinar2007@yahoo.com [Nuclear & Risk Consultancy, Anisgasse 4, 1221 Vienna (Austria); Serva, Leonello, E-mail: lserva@alice.it [Independent Consultant, Via dei Dauni 1, 00185 Rome (Italy); Livio, Franz, E-mail: franz.livio@uninsubria.it [Dipartimento di Scienza ed Alta Tecnologia, Università degli Studi dell’Insubria, Via Velleggio, 11, 22100 Como (Italy); Rizzo, Paul C., E-mail: paul.rizzo@rizzoasoc.com [RIZZO Associates, 500 Penn Center Blvd., Suite 100, Pittsburgh, PA 15235 (United States)

    2017-01-15

    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  17. International collaboration towards a global analysis of volcanic hazards and risk

    Science.gov (United States)

    Loughlin, Susan; Duncan, Melanie; Volcano Model Network, Global

    2017-04-01

    Approximately 800 million people live within 100km of an active volcano and such environments are often subject to multiple natural hazards. Volcanic eruptions and related volcanic hazards are less frequent than many other natural hazards but when they occur they can have immediate and long-lived impacts so it is important that they are not overlooked in a multi-risk assessment. Based on experiences to date, it's clear that natural hazards communities need to address a series of challenges in order to move to a multi-hazard approach to risk assessment. Firstly, the need to further develop synergies and coordination within our own communities at local to global scales. Secondly, we must collaborate and identify opportunities for harmonisation across natural hazards communities: for instance, by ensuring our databases are accessible and meet certain standards, a variety of users will be then able to contribute and access data. Thirdly, identifying the scale and breadth of multi-risk assessments needs to be co-defined with decision-makers, which will constrain the relevant potential cascading/compounding hazards to consider. Fourthly, and related to all previous points, multi-risk assessments require multi-risk knowledge, requiring interdisciplinary perspectives, as well as discipline specific expertise. The Global Volcano Model network (GVM) is a growing international network of (public and private) institutions and organisations, which have the collective aim of identifying and reducing volcanic risks. GVM's values embody collaboration, scientific excellence, open-access (wherever possible) and, above all, public good. GVM highlights and builds on the best research available within the volcanological community, drawing on the work of IAVCEI Commissions and other research initiatives. It also builds on the local knowledge of volcano observatories and collaborating scientists, ensuring that global efforts are underpinned by local evidence. Some of GVM's most

  18. Flood hazards analysis based on changes of hydrodynamic processes in fluvial systems of Sao Paulo, Brazil.

    Science.gov (United States)

    Simas, Iury; Rodrigues, Cleide

    2016-04-01

    The metropolis of Sao Paulo, with its 7,940 km² and over 20 million inhabitants, is increasingly being consolidated with disregard for the dynamics of its fluvial systems and the natural limitations imposed by fluvial terraces, floodplains and slopes. Events such as floods and flash floods have become particularly persistent, mainly in socially and environmentally vulnerable areas. The Aricanduva River basin was selected as the ideal area for the development of the flood hazard analysis, since it presents the main geological and geomorphological features found in the urban site. According to studies carried out under the Anthropic Geomorphology approach in São Paulo, studying this phenomenon requires taking into account the original hydromorphological systems and their functional conditions, as well as the extent to which anthropic factors change the balance between the main variables of surface processes. Considering those principles, an alternative geographical data model was proposed, which made it possible to identify the role of different driving forces in the spatial conditioning of certain flood events. Spatial relationships between different variables, such as anthropogenic and original morphology, were analyzed for that purpose, in addition to climate data. The surface hydrodynamic tendency spatial model conceived for this study takes as key variables: (1) the land use at the observed date, combined with the predominant lithological group, represented by a value ranging from 0 to 100 based on indexes of the National Soil Conservation Service (NSCS-USA) and the Hydraulic Technology Center Foundation (FCTH-Brazil), to determine the resulting runoff/infiltration balance; (2) the original slope (in percent), with thresholds above which a greater tendency for runoff can be assumed; and (3) the minimal relief features, combining the plan and profile curvature of the surface. These three key variables were combined in a Geographic Information System in a series of

  19. Use of remote sensing and seismotectonic parameters for seismic hazard analysis of Bangalore

    Directory of Open Access Journals (Sweden)

    T. G. Sitharam

    2006-01-01

    Deterministic Seismic Hazard Analysis (DSHA) for Bangalore, India has been carried out by considering past earthquakes, assumed subsurface fault rupture lengths and a point-source synthetic ground motion model. The sources have been identified using satellite remote sensing images, the seismotectonic atlas map of India and relevant field studies. The Maximum Credible Earthquake (MCE) has been determined by considering the regional seismotectonic activity within about a 350 km radius around Bangalore. The seismotectonic map has been prepared by considering the faults, lineaments and shear zones in the area and more than 470 past moderate earthquakes with moment magnitudes of 3.5 and above. In addition, about 1,300 earthquake tremors with moment magnitudes of less than 3.5 were considered for the study. The shortest distance from Bangalore to the different sources was measured, and the Peak Horizontal Acceleration (PHA) was then calculated for the different sources and event moment magnitudes using a regional attenuation relation for peninsular India. Based on the Wells and Coppersmith (1994) relationship, a subsurface fault rupture length of about 3.8% of the total fault length was shown to match past earthquake events in the area. To simulate synthetic ground motions, the Boore (1983, 2003) SMSIM programs were used and the PHA for the different locations was evaluated. From the above approaches, a PHA of 0.15 g was established. This value was obtained for a maximum credible earthquake with a moment magnitude of 5.1 on the Mandya-Channapatna-Bangalore lineament, which has been identified as a vulnerable source for Bangalore. From this study, it is clear that the Bangalore area can be described as a seismically moderately active region. It is also recommended that the southern part of Karnataka, in particular Bangalore, Mandya and Kolar, be upgraded from the current Indian Seismic Zone II to Seismic Zone III
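    In a deterministic analysis of this kind, each source is assigned its maximum credible earthquake and the controlling source is the one producing the largest ground motion at the site. The sketch below illustrates only that bookkeeping, using an invented attenuation relation of the common form ln(PHA) = c1 + c2*M - c3*ln(R) - c4*R and an invented source inventory; neither the coefficients nor the sources are those used in the study.

```python
import math

def pha_g(mag, dist_km, c1=-5.0, c2=1.1, c3=1.0, c4=0.004):
    """Median peak horizontal acceleration (in g) from a hypothetical
    attenuation relation ln PHA = c1 + c2*M - c3*ln(R) - c4*R."""
    return math.exp(c1 + c2 * mag - c3 * math.log(dist_km) - c4 * dist_km)

def controlling_source(sources):
    """Return (name, PHA) of the source giving the largest median PHA.
    `sources` maps a source name to (max_credible_magnitude, distance_km)."""
    name, (mag, dist) = max(sources.items(), key=lambda kv: pha_g(*kv[1]))
    return name, pha_g(mag, dist)

if __name__ == "__main__":
    # Entirely hypothetical source inventory for illustration
    sources = {
        "lineament A": (5.1, 12.0),
        "fault B": (6.0, 80.0),
        "shear zone C": (5.5, 45.0),
    }
    name, pha = controlling_source(sources)
    print(f"controlling source: {name}, median PHA ~ {pha:.2f} g")
```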

  20. Hazard analysis of application software development for nuclear power plant DCS safety system

    Institute of Scientific and Technical Information of China (English)

    艾九斤; 李运坚; 李相建

    2012-01-01

    In order to reduce or avoid the adverse consequences of reduced nuclear power plant safety caused by control system software, a hazard analysis activity for the development process of safety-class application software in the nuclear power plant digital control system is proposed. A verification and validation approach, combined with the safety protection layer model, preliminary hazard analysis (PHA), fault tree analysis and other methods, is used to analyze the hazards of the application software development process during the system design, software design and software implementation phases. Engineering practice in the CPR1000 project shows that the verification and validation approach can effectively reduce the hazards of the software development process and improve the safety of the application software, thereby ultimately enhancing the safety of the nuclear power plant.

  1. Radiation dose assessment methodology and preliminary dose estimates to support US Department of Energy radiation control criteria for regulated treatment and disposal of hazardous wastes and materials

    Energy Technology Data Exchange (ETDEWEB)

    Aaberg, R.L.; Baker, D.A.; Rhoads, K.; Jarvis, M.F.; Kennedy, W.E. Jr.

    1995-07-01

    This report provides unit dose to concentration levels that may be used to develop control criteria for radionuclide activity in hazardous waste; if implemented, these criteria would be developed to provide an adequate level of public and worker health protection for wastes regulated under U.S. Environmental Protection Agency (EPA) requirements (as derived from the Resource Conservation and Recovery Act [RCRA] and/or the Toxic Substances Control Act [TSCA]). Thus, DOE and the US Nuclear Regulatory Commission can fulfill their obligation to protect the public from radiation by ensuring that such wastes are appropriately managed, while simultaneously reducing the current level of dual regulation. In terms of health protection, dual regulation of very small quantities of radionuclides provides no benefit.

  2. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    Science.gov (United States)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  3. Volcano Hazard Tracking and Disaster Risk Mitigation: A Detailed Gap Analysis from Data-Collection to User Implementation

    Science.gov (United States)

    Faied, D.; Sanchez, A.

    2009-04-01

    While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University's Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. This gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. In the effort to address the gaps, a framework evolved at system level. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. VIDA also offers substantial educational potential: the framework includes a centralized clearinghouse for volcanology data which could support education at a variety of levels. Basic geophysical data, satellite maps, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource naturally lends itself to research efforts in the subject but also research in operational tools, system architecture, and human/machine interaction in civil protection or emergency scenarios.

  4. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Science.gov (United States)

    2010-01-01

    ... fully loaded propellant storage tanks or pressurized motor segments. (vii) Worst case combustion or... of each accident experienced by the launch operator involving the release of a toxic propellant; and..., including the launch operator's ground safety plan, hazard area surveillance and clearance plan,...

  5. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation.

    Science.gov (United States)

    Liu, Xiang; Saat, Mohd Rapik; Barkan, Christopher P L

    2014-07-15

    Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. By contrast to truck accidents, it is possible that a train accident may involve multiple hazardous materials cars derailing and releasing contents with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train and tank car safety design. The effect of train speed, tank car safety design and tank car positions in a train were evaluated regarding the number of cars that release their contents in a derailment. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrences. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail.
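    The model described above estimates a probability distribution for the number of tank cars releasing in a derailment. As a highly simplified stand-in (not the authors' model), the sketch below treats each derailed tank car as releasing independently with a speed-dependent conditional probability and derives the resulting binomial distribution; all parameter values are illustrative.

```python
from math import comb, exp

def p_release_given_derailed(speed_mph, k=0.02):
    """Illustrative conditional release probability for a derailed tank car,
    increasing with derailment speed (placeholder functional form)."""
    return 1.0 - exp(-k * speed_mph)

def release_count_distribution(n_tank_cars_derailed, speed_mph):
    """Binomial distribution of the number of releasing tank cars, assuming
    independent release behaviour across derailed cars."""
    p = p_release_given_derailed(speed_mph)
    return [comb(n_tank_cars_derailed, r) * p**r * (1 - p)**(n_tank_cars_derailed - r)
            for r in range(n_tank_cars_derailed + 1)]

if __name__ == "__main__":
    dist = release_count_distribution(n_tank_cars_derailed=8, speed_mph=40.0)
    for r, pr in enumerate(dist):
        print(f"P({r} cars release) = {pr:.3f}")
    print("expected releases:", sum(r * pr for r, pr in enumerate(dist)))
```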

  6. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2010-07-14

    ... procedures to fruit and vegetable juice processing. HACCP is a preventative system of hazard control that can be used by all food processors to ensure the safety of their products to consumers. A HACCP system of... and recordkeeping are essential parts of any HACCP system. The information collection requirements...

  7. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2013-11-20

    ... principles to the processing of fruit and vegetable juices. HACCP is a preventive system of hazard control... regulations for the efficient enforcement of that act. The rationale in establishing an HACCP system of... development and recordkeeping are essential parts of any HACCP system. The information collection...

  8. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiang, E-mail: liu94@illinois.edu; Saat, Mohd Rapik, E-mail: mohdsaat@illinois.edu; Barkan, Christopher P.L., E-mail: cbarkan@illinois.edu

    2014-07-15

    Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. By contrast to truck accidents, it is possible that a train accident may involve multiple hazardous materials cars derailing and releasing contents with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train and tank car safety design. The effect of train speed, tank car safety design and tank car positions in a train were evaluated regarding the number of cars that release their contents in a derailment. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrences. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail.

  9. Waste Feed Delivery System Phase 1 Preliminary RAM Analysis [SEC 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    DYKES, A.A.

    2000-10-11

    This report presents the updated results of the preliminary reliability, availability, and maintainability (RAM) analysis of selected waste feed delivery (WFD) operations to be performed by the Tank Farm Contractor (TFC) during Phase I activities in support of the Waste Treatment and Immobilization Plant (WTP). For planning purposes, waste feed tanks are being divided into five classes in accordance with the type of waste in each tank and the activities required to retrieve, qualify, and transfer waste feed. This report reflects the baseline design and operating concept, as of the beginning of Fiscal Year 2000, for the delivery of feed from three of these classes, represented by source tanks 241-AN-102, 241-AZ-101 and 241-AN-105. The preliminary RAM analysis quantifies the potential schedule delay associated with operations and maintenance (O&M) field activities needed to accomplish these operations. The RAM analysis is preliminary because the system design, process definition, and activity planning are in a state of evolution. The results are being used to support the continuing development of an O&M Concept tailored to the unique requirements of the WFD Program, which is being documented in various volumes of the Waste Feed Delivery Technical Basis (Carlson 1999, Rasmussen 1999, and Orme 2000). The waste feed provided to the WTP must: (1) meet limits for chemical and radioactive constituents based on pre-established compositional envelopes (i.e., feed quality); (2) be in acceptable quantities within a prescribed sequence to meet feed quantities; and (3) meet schedule requirements (i.e., feed timing). In the absence of new criteria related to acceptable schedule performance due to the termination of the TWRS Privatization Contract, the original criteria from the Tank Waste Remediation System (TWRS) Privatization Contract (DOE 1998) will continue to be used for this analysis.

  10. On the predictive information criteria for model determination in seismic hazard analysis

    Science.gov (United States)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given the parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed a Bayesian analysis of Italian data sets using four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate their performance as evaluated by the Bayes factor, predictive information criteria and retrospective predictive analysis.
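    For readers unfamiliar with the predictive criteria mentioned (for example Watanabe's widely applicable information criterion), the sketch below shows one common way WAIC is computed from a matrix of pointwise log-likelihood values evaluated over posterior samples; it is a generic illustration, not the computation used in the study.

```python
import numpy as np

def waic(log_lik):
    """Widely applicable information criterion from an (S, N) array of
    pointwise log-likelihoods: S posterior draws, N observations.
    Returns (waic, effective number of parameters p_waic).
    For real problems a log-sum-exp should replace the naive exp/mean
    below for numerical stability."""
    lppd = np.sum(np.log(np.mean(np.exp(log_lik), axis=0)))
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic), p_waic

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake example: 1000 posterior draws, 50 observations
    fake_log_lik = rng.normal(loc=-1.0, scale=0.1, size=(1000, 50))
    print(waic(fake_log_lik))
```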

  11. Risk prediction of Critical Infrastructures against extreme natural hazards: local and regional scale analysis

    Science.gov (United States)

    Rosato, Vittorio; Hounjet, Micheline; Burzel, Andreas; Di Pietro, Antonio; Tofani, Alberto; Pollino, Maurizio; Giovinazzi, Sonia

    2016-04-01

    Natural hazard events can induce severe impacts on the built environment; they can hit wide and densely populated areas containing large numbers of (inter)dependent technological systems, whose damage could cause the failure or malfunctioning of further services and spread the impacts over wider geographical areas. The EU project CIPRNet (Critical Infrastructures Preparedness and Resilience Research Network) is realizing an unprecedented Decision Support System (DSS) which enables risk prediction on Critical Infrastructures (CI) to be performed operationally, by predicting the occurrence of natural events (from long-term weather forecasts to short-term nowcasts), correlating the intrinsic vulnerabilities of CI elements with the strengths of the different events' manifestations, and analysing the resulting Damage Scenario. The Damage Scenario is then transformed into an Impact Scenario, where damage to individual CI elements is translated into micro-scale (local area) or meso-scale (regional) service outages. At the smaller scale, the DSS simulates detailed city models (where CI dependencies are explicitly accounted for) that provide important input for crisis management organizations, whereas at the regional scale, using an approximate system-of-systems model describing systemic interactions, the focus is on raising awareness. The DSS has made it possible to develop a novel simulation framework for predicting shake maps originating from a given seismic event, considering shock-wave propagation in inhomogeneous media and the resulting damage, estimated from building vulnerabilities on the basis of a phenomenological model [1, 2]. Moreover, for areas containing river basins, when abundant precipitation is expected, the DSS solves 1D/2D hydrodynamic models of the river basins to predict runoff and the corresponding flood dynamics. This calculation allows the estimation of the Damage Scenario and triggers the evaluation of the Impact Scenario

  12. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Torrential processes like flooding, heavy bedload transport or debris flows in steep mountain channels emerge during intense, highly localized rainfall events. They pose a serious risk to the densely populated Alpine region. Hydrogeomorphic hazards are profoundly nonlinear, threshold-mediated phenomena that frequently cause costly damage to infrastructure and people. Thus, in the context of climate change, there is an ever-rising interest in whether the sediment cascades of small alpine catchments react to changing precipitation patterns and how the climate signal is propagated through the fluvial system. We intend to answer the following research questions: (i) What are the critical meteorological characteristics triggering torrential events in the Eastern Alps of Austria? (ii) The effect of external triggers is strongly mediated by the internal disposition of catchments to respond. Which factors control this internal susceptibility? (iii) Do torrential processes show an increase in magnitude and frequency, or a shift in seasonality, in the recent past? (iv) Which future changes can be expected under different climate scenarios? Quantifications of bedload transport in small alpine catchments are rare and often associated with high uncertainties. Detailed knowledge exists, however, for the Schöttlbach catchment, a 71 km² study area in Styria in the Eastern Alps. The torrent has been monitored since a heavy precipitation event resulted in a disastrous flood in July 2011. Sediment mobilisation from slopes as well as within-channel storage and fluxes are regularly measured by photogrammetric methods and sediment impact sensors (SIS). The associated hydro-meteorological conditions are known from a dense station network. Changing states of connectivity can thus be related to precipitation and internal dynamics (sediment availability, cut-and-fill cycles). The site-specific insights are then conceptualized for application to a broader scale. Therefore, a Styria-wide database of torrential

  13. Debris-flow susceptibility and hazard assessment at a regional scale from GIS analysis

    Science.gov (United States)

    Bertrand, M.; Liébault, F.; Piégay, H.

    2012-12-01

    Small torrents of the Southern French Alps are prone to extreme events. Depending on the rainfall conditions, the sediment supply from hillslopes, and the gravitational energy, these events can occur under different forms, from floods to debris-flows. Debris-flows are recognized as the most dangerous phenomena and may have dramatic consequences for exposed people and infrastructures. As a first step of hazard assessment, we evaluated the debris-flow susceptibility, i.e. the likelihood that an event occurs in an area under particular physical conditions, not including the temporal dimension. The susceptibility is determined by (i) the morphometric controls of small upland catchments for debris-flows triggering and propagation, and by (ii) sediment supply conditions, i.e. erosion patterns feeding the channels. The morphometric controls are evaluated with indicators calculated from basic topographic variables. The sediment supply is evaluated by considering the cumulated surface of erosion area connected to the hydrographic network. We developed a statistical model to predict the geomorphic responses of the catchments (fluvial vs. debris-flow) and we apply this model within a GIS for regional-scale prediction. The model is based on two morphometric indicators, i.e. fan / channel slope and the Melton ruggedness index, and is based on a wide set of data including the Southern French Alps. We developed a GIS procedure to extract the indicators automatically using a 25m DEM and the hydrographic network as raw data. This model and its application have been validated with historical data. Sediment sources feeding debris-flow prone torrents are identified by first automatically mapping the erosion patches from the infrared orthophotos analysis then identifying the ones connected to the stream network. A classification method has been developed (segmentation into homogeneous objects classified with a neural network algorithm) and validated with expert interpretation on the
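    The statistical model mentioned above discriminates debris-flow from fluvial responses using two morphometric indicators, the Melton ruggedness index and the fan or channel slope. The sketch below computes the Melton index and applies an illustrative threshold-based classification; the threshold values are placeholders, not those fitted in the study.

```python
import math

def melton_index(relief_m, area_km2):
    """Melton ruggedness index: basin relief divided by the square root of
    basin area (relief converted to km so the index is dimensionless)."""
    return (relief_m / 1000.0) / math.sqrt(area_km2)

def classify_response(relief_m, area_km2, fan_slope_deg,
                      melton_threshold=0.5, slope_threshold_deg=4.0):
    """Illustrative two-indicator classification of the expected dominant
    geomorphic response of a small torrent catchment; thresholds are
    placeholders, not the fitted values of the cited model."""
    m = melton_index(relief_m, area_km2)
    if m > melton_threshold and fan_slope_deg > slope_threshold_deg:
        return "debris-flow prone", m
    return "fluvial (bedload/flood) response", m

if __name__ == "__main__":
    print(classify_response(relief_m=1500.0, area_km2=4.0, fan_slope_deg=8.0))
    print(classify_response(relief_m=600.0, area_km2=25.0, fan_slope_deg=2.0))
```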

  14. Hazard Identification and Risk Assessment of Health and Safety Approach JSA (Job Safety Analysis) in Plantation Company

    Science.gov (United States)

    Sugarindra, Muchamad; Ragil Suryoputro, Muhammad; Tiya Novitasari, Adi

    2017-06-01

    The plantation company needed to identify hazards and perform a risk assessment of occupational health and safety using the JSA (Job Safety Analysis) approach. The identification was aimed at recognizing the potential hazards that might lead to workplace accidents, so that preventive action could be taken to minimize them. The data were collected by direct observation of the workers concerned, and the results were recorded on a Job Safety Analysis form. The jobs assessed were forklift operator, macerator worker, creeper worker, shredder worker, workshop worker, mechanical line worker, trolley cleaning worker and crepe decline worker. The results showed that the shredder job scored a risk value of 30, placing it at the extreme risk level (risk values above 20). To minimize accidents, the company could provide appropriate Personal Protective Equipment (PPE) and health and safety information, supervise workers' activities, and reward workers who obey the rules that apply in the plantation.
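    JSA-style risk assessments typically rate each job by multiplying a likelihood score by a severity score and mapping the product onto risk bands, which is consistent with the risk value of 30 and the "above 20 is extreme" band mentioned above. The sketch below shows such a generic likelihood-times-severity scoring; the extreme cut-off follows the abstract, while the rest of the scale is an assumption.

```python
def risk_value(likelihood, severity):
    """Simple JSA-style risk score: likelihood (1-5) times severity (1-6)."""
    return likelihood * severity

def risk_band(value):
    """Map a risk value onto qualitative bands.  The 'extreme' cut-off (>20)
    follows the abstract; the other boundaries are illustrative."""
    if value > 20:
        return "extreme"
    if value > 12:
        return "high"
    if value > 6:
        return "moderate"
    return "low"

if __name__ == "__main__":
    jobs = {
        "shredder worker": (5, 6),       # frequent exposure, major injury potential
        "forklift operator": (3, 4),
        "trolley cleaning worker": (2, 2),
    }
    for job, (lik, sev) in jobs.items():
        v = risk_value(lik, sev)
        print(f"{job}: risk value {v} -> {risk_band(v)}")
```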

  15. The impact of overlapping processes on rockfall hazard analysis - the Bolonia Bay study (southern Spain)

    Science.gov (United States)

    Fernandez-Steeger, T.; Grützner, C.; Reicherter, K.; Braun, A.; Höbig, N.

    2009-04-01

    from the described investigation show that on a screening and planning level the results of the empirical methods are quite good. Especially for numerical simulation, where back analysis is common to parameterize the models, the identification of "ideal" rockfalls is essential for a good simulation performance and subsequently for an appropriate planning of protection measures.

  16. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant.

    Science.gov (United States)

    Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen

    2015-09-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving the production management.

  17. Preliminary analysis of Alvito-Odivelas reservoir system operation under climate change scenarios

    OpenAIRE

    2008-01-01

    The present study provides a preliminary analysis of the impact of climate change on a water resources system in the Alentejo region in the south of Portugal. The regional climate model HadRM3P, forced by the global circulation model HadAM3P A2 of the Hadley Centre, is used to derive temperature and precipitation data, which in turn are used as input to a hydrological model (SHETRAN) for the simulation of future streamflow. Dynamic programming-based models are used for the operation of the reservoir system in order ...

  18. Stock assessment of Haliporoides triarthrus (Fam. Solenoceridae) off Mozambique: a preliminary analysis

    OpenAIRE

    Torstensen, E.; Pacule, H.

    1992-01-01

    The pink shrimp, Haliporoides triarthrus, is an important species in the deep-water shrimp fishery in Mozambique. Total catches are in the range of 1,500 to 2,700 tons, with the pink shrimp accounting for 70-90%. Estimates of growth parameters and of natural mortality are used for a preliminary assessment of the fishery, based on length-structured virtual population analysis and yield-per-recruit analyses. With an arbitrarily chosen terminal fishing mortality F, the results indicate a situati...

  19. Preliminary Analysis of Liquid Metal MHD Pressure Drop in the Blanket for the FDS

    Institute of Scientific and Technical Information of China (English)

    王红艳; 吴宜灿; 何晓雄

    2002-01-01

    Preliminary analysis and calculation of the liquid metal Li17Pb83 magnetohydrodynamic (MHD) pressure drop in the blanket for the FDS have been presented to evaluate the significance of MHD effects on the thermal-hydraulic design of the blanket. To decrease the liquid metal MHD pressure drop, Al2O3 is applied as an electrically insulating coating on the inner surface of the ducts. The requirement for the insulating coating to reduce the additional leakage pressure drop caused by coating imperfections has been analyzed. Finally, the total liquid metal MHD pressure drop and magnetic pump power in the FDS blanket have been given.
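    As context for the magnitude of such pressure drops, the sketch below evaluates a widely used first-order, textbook-style estimate for fully developed liquid-metal flow in a duct with thin electrically conducting walls, dp = sigma_f*u*B^2*L*c/(1+c) with wall conductance ratio c = sigma_w*t_w/(sigma_f*a). It is not the calculation performed in the paper, and all property values are illustrative assumptions rather than FDS design data.

```python
def mhd_pressure_drop(sigma_f, sigma_w, t_w, a, u, B, L):
    """First-order MHD pressure drop (Pa) for fully developed liquid-metal
    flow in a duct with thin electrically conducting walls:
        dp = sigma_f * u * B^2 * L * c / (1 + c),  c = sigma_w*t_w/(sigma_f*a)
    sigma_f, sigma_w : fluid / wall electrical conductivities (S/m)
    t_w, a           : wall thickness and duct half-width along B (m)
    u, B, L          : mean velocity (m/s), magnetic field (T), duct length (m)."""
    c = (sigma_w * t_w) / (sigma_f * a)
    return sigma_f * u * B**2 * L * c / (1.0 + c)

if __name__ == "__main__":
    # Illustrative Li17Pb83 / steel-wall values (assumptions, not FDS data)
    dp = mhd_pressure_drop(sigma_f=7.6e5, sigma_w=1.0e6, t_w=0.003,
                           a=0.05, u=0.1, B=5.0, L=1.0)
    print(f"MHD pressure drop ~ {dp / 1e6:.2f} MPa")
    # An insulating coating suppresses the wall current path (c -> 0),
    # which is why even a thin Al2O3 layer strongly reduces the pressure drop.
```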

  20. Preliminary performance analysis of a transverse flow spectrally selective two-slab packed bed volumetric receiver

    CSIR Research Space (South Africa)

    Roos, TH

    2016-05-01

    Abstract of a paper presented at the 21st SolarPACES International Conference (SolarPACES 2015), 13-16 October 2015: "Preliminary Performance Analysis of a Transverse Flow Spectrally Selective Two-slab Packed Bed Volumetric Receiver", Thomas H. Roos and Thomas M. Harms.

  1. Preliminary Report: Analysis of the baseline study on the prevalence of Salmonella in laying hen flocks of Gallus gallus

    DEFF Research Database (Denmark)

    Hald, Tine

    This is a preliminary report on the analysis of the Community-wide baseline study to estimate the prevalence of Salmonella in laying hen flocks. It is being published pending the full analysis of the entire dataset from the baseline study. The report contains the elements necessary for the establ...

  2. Seismic hazard analysis of nuclear installations in France. Current practice and research

    Energy Technology Data Exchange (ETDEWEB)

    Mohammadioun, B. [CEA Centre d`Etudes de Fontenay-aux-Roses, 92 (France). Inst. de Protection et de Surete Nucleaire

    1997-03-01

    The methodology put into practice in France for the evaluation of seismic hazard on the sites of nuclear facilities is founded on data assembled country-wide over the past 15 years, in geology, geophysics and seismology. It is appropriate to the regional seismotectonic context (interplate), characterized notably by diffuse seismicity. Extensive use is made of information drawn from historical seismicity. The regulatory practice described in the RFS I.2.c is reexamined periodically and is subject to up-dating so as to take advantage of new earthquake data and of the results gained from research work. Acquisition of the basic data, such as the identification of active faults and the quantification of site effect, which will be needed to achieve improved preparedness versus severe earthquake hazard in the 21st century, will necessarily be the fruit of close international cooperation and collaboration, which should accordingly be actively promoted. (J.P.N.)

  3. Analysis of root causes of major hazard precursors (hydrocarbon leaks) in the Norwegian offshore petroleum industry

    Energy Technology Data Exchange (ETDEWEB)

    Vinnem, Jan Erik, E-mail: jev@preventor.n [Preventor AS/University of Stavanger, Rennebergstien 30, 4021 Stavanger (Norway); Hestad, Jon Andreas [Safetec Nordic AS, Bergen (Norway); Kvaloy, Jan Terje [Department of Mathematics and Natural Sciences, University of Stavanger (Norway); Skogdalen, Jon Espen [Department of Industrial Economics, Risk Management and Planning, University of Stavanger (Norway)

    2010-11-15

    The offshore petroleum industry in Norway reports major hazard precursors to the authorities, and data are available for the period 1996 through 2009. Barrier data have been reported since 2002, as have data from an extensive questionnaire survey covering working environment, organizational culture and perceived risk among all employees on offshore installations. Several attempts have been made to analyse different data sources in order to discover relations that may cast some light on possible root causes of major hazard precursors. These previous attempts were inconclusive. The study presented in this paper is the most extensive study performed so far. The data were analysed using linear regression. The conclusion is that there are significant correlations between number of leaks and safety climate indicators. The discussion points to possible root causes of major accidents.

  4. Non-parametric seismic hazard analysis in the presence of incomplete data

    Science.gov (United States)

    Yazdani, Azad; Mirzaei, Sajjad; Dadkhah, Koroush

    2017-01-01

    The distribution of earthquake magnitudes plays a crucial role in the estimation of seismic hazard parameters. Due to the complexity of earthquake magnitude distribution, non-parametric approaches are recommended over classical parametric methods. The main deficiency of the non-parametric approach is the lack of complete magnitude data in almost all cases. This study aims to introduce an imputation procedure for completing earthquake catalog data that will allow the catalog to be used for non-parametric density estimation. Using a Monte Carlo simulation, the efficiency of the introduced approach is investigated. This study indicates that when a magnitude catalog is incomplete, the imputation procedure can provide an appropriate tool for seismic hazard assessment. As an illustration, the imputation procedure was applied to estimate the earthquake magnitude distribution in Tehran, the capital city of Iran.
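    As a schematic of the idea only, and not the authors' specific imputation scheme, the sketch below fills in magnitudes missing below an assumed completeness threshold by sampling from an exponential (Gutenberg-Richter) model fitted to the complete part of the catalog, and then forms a non-parametric kernel density estimate of the completed magnitude distribution; the catalog and the number of missing events are synthetic placeholders.

```python
import numpy as np
from scipy import stats

def impute_and_estimate(mags, m_complete=4.0, n_missing=200, seed=0):
    """Impute magnitudes missing below a completeness threshold and return a
    Gaussian KDE of the completed catalog.

    mags      : observed magnitudes (assumed complete only above m_complete)
    n_missing : assumed number of unrecorded events below the threshold
                (in practice estimated; here a placeholder)."""
    rng = np.random.default_rng(seed)
    complete = mags[mags >= m_complete]
    # Aki/Utsu maximum-likelihood b-value from the complete part (beta = b*ln 10)
    beta = 1.0 / (complete.mean() - m_complete)
    # Sample missing magnitudes from a truncated exponential below m_complete
    m_min = 2.0                                   # assumed lower magnitude bound
    u = rng.random(n_missing)
    span = m_complete - m_min
    imputed = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * span))) / beta
    full = np.concatenate([complete, imputed])
    return stats.gaussian_kde(full), full

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    observed = 4.0 + rng.exponential(scale=1.0 / 2.3, size=500)  # synthetic catalog
    kde, catalog = impute_and_estimate(observed)
    grid = np.linspace(2.0, 7.0, 6)
    print(np.round(kde(grid), 3))
```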

  5. On adjustment for auxiliary covariates in additive hazard models for the analysis of randomized experiments

    DEFF Research Database (Denmark)

    Vansteelandt, S.; Martinussen, Torben; Tchetgen, E. J Tchetgen

    2014-01-01

    We consider additive hazard models (Aalen, 1989) for the effect of a randomized treatment on a survival outcome, adjusting for auxiliary baseline covariates. We demonstrate that the Aalen least-squares estimator of the treatment effect parameter is asymptotically unbiased, even when the hazard's dependence on time or on the auxiliary covariates is misspecified, and even away from the null hypothesis of no treatment effect. We furthermore show that adjustment for auxiliary baseline covariates does not change the asymptotic variance of the estimator of the effect of a randomized treatment. We conclude that, in view of its robustness against model misspecification, Aalen least-squares estimation is attractive for evaluating treatment effects on a survival outcome in randomized experiments, and the primary reasons to consider baseline covariate adjustment in such settings could be interest in subgroup

  6. Hazardous Waste Management in South African Mining ; A CGE Analysis of the Economic Impacts

    OpenAIRE

    Wiebelt, Manfred

    1999-01-01

    There is no doubt that an improved hazardous waste management in mining and mineral processing will reduce environmental and health risks in South Africa. However, skeptics fear that waste reduction, appropriate treatment and disposal are not affordable within the current economic circumstances, neither from an economic nor from a social point of view. This paper mainly deals with the first aspect and touches upon the second. It investigates the short-run and long-run sectoral impacts of an e...

  7. DOWNFLOW code and LIDAR technology for lava flow analysis and hazard assessment at Mount Etna

    OpenAIRE

    Alessandro Fornaciai; Simone Tarquini; Massimiliano Favalli

    2011-01-01

    The use of a lava-flow simulation (DOWNFLOW) probabilistic code and airborne light detection and ranging (LIDAR) technology are combined to analyze the emplacement of compound lava flow fields at Mount Etna (Sicily, Italy). The goal was to assess the hazard posed by lava flows. The LIDAR-derived time series acquired during the 2006 Mount Etna eruption records the changing topography of an active lava-flow field. These short-time-interval, high-resolution topographic surveys provide a detailed...

  8. Safety, Health and Environmental Hazards Associated with Composites: A Complete Analysis

    Science.gov (United States)

    1992-11-01


  9. A framework for the assessment and analysis of multi-hazards induced risk resulting from space vehicles operations

    Science.gov (United States)

    Sala-Diakanda, Serge N.

    2007-12-01

    With the foreseeable increase in traffic frequency to and from orbit, the safe operation of current and future space vehicles at designated spaceports has become a serious concern. Due to their high explosive energy potential, operating those launch vehicles presents a real risk to: (1) the spaceport infrastructure and personnel, (2) the communities surrounding the spaceport and (3) the flying aircrafts whose routes could be relatively close to spaceport launch and reentry routes. Several computer models aimed at modeling the effects of the different hazards generated by the breakup of such vehicles (e.g., fragmentation of debris, release of toxic gases, propagation of blast waves, etc.) have been developed, and are used to assist in Go-No Go launch decisions. They can simulate a total failure scenario of the vehicle and, estimate a number of casualties to be expected as a result of such failure. However, as all of these models---which can be very elaborate and complex---consider only one specific explosion hazard in their simulations, the decision of whether or not a launch should occur is currently based on the evaluation of several estimates of an expected number of casualties. As such, current practices ignore the complex, nonlinear interactions between the different hazards as well as the interdependencies between the estimates. In this study, we developed a new framework which makes use of information fusion theory, hazards' dispersion modeling and, geographical statistical analysis and visualization capabilities of geographical information systems to assess the risk generated by the operation of space launch vehicles. A new risk metric, which effectively addresses the lack of a common risk metric with current methods, is also proposed. A case study, based on a proposed spaceport in the state of Oklahoma showed that the estimates we generate through our framework consistently outperform estimates provided by any individual hazard, or by the independent

  10. Medieval monastic mortality: hazard analysis of mortality differences between monastic and nonmonastic cemeteries in England.

    Science.gov (United States)

    DeWitte, Sharon N; Boulware, Jessica C; Redfern, Rebecca C

    2013-11-01

    Scholarship on life in medieval European monasteries has revealed a variety of factors that potentially affected mortality in these communities. Though there is some evidence based on age-at-death distributions from England that monastic males lived longer than members of the general public, what is missing from the literature is an explicit examination of how the risks of mortality within medieval monastic settings differed from those within contemporaneous lay populations. This study examines differences in the hazard of mortality for adult males between monastic cemeteries (n = 528) and non-monastic cemeteries (n = 368) from London, all of which date to between AD 1050 and 1540. Age-at-death data from all cemeteries are pooled to estimate the Gompertz hazard of mortality, and "monastic" (i.e., buried in a monastic cemetery) is modeled as a covariate affecting this baseline hazard. The estimated effect of the monastic covariate is negative, suggesting that individuals in the monastic communities faced reduced risks of dying compared to their peers in the lay communities. These results suggest better diets, the positive health benefits of religious behavior, better living conditions in general in monasteries, or selective recruitment of healthy or higher socioeconomic status individuals.
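    The analysis above models age-at-death with a Gompertz baseline hazard and a binary "monastic" covariate acting on that hazard. The sketch below shows a minimal maximum-likelihood fit of such a model with the covariate entered proportionally on the hazard; it is a generic illustration using simulated data with the stated sample sizes, not the authors' estimation code.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, ages, monastic):
    """Negative log-likelihood of a Gompertz proportional-hazards model,
    h(t | x) = a * exp(b*t) * exp(g*x), all individuals observed to death.
    Parameters are (log a, log b, g) so that a and b stay positive."""
    log_a, log_b, g = params
    a, b = np.exp(log_a), np.exp(log_b)
    lin = g * monastic
    log_h = log_a + b * ages + lin                            # log hazard at death
    cum_h = (a / b) * (np.exp(b * ages) - 1.0) * np.exp(lin)  # integrated hazard
    return -np.sum(log_h - cum_h)

def simulate_ages(a, b, g, monastic, rng):
    """Draw Gompertz times-to-death (years past adulthood) by inverting S(t)."""
    u = rng.random(monastic.size)
    scale = np.exp(g * monastic)
    return np.log(1.0 - b * np.log(u) / (a * scale)) / b

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    monastic = np.repeat([0, 1], [368, 528])   # lay vs monastic sample sizes
    t = simulate_ages(a=0.02, b=0.07, g=-0.3, monastic=monastic, rng=rng)
    fit = minimize(neg_log_lik, x0=[-4.0, np.log(0.05), 0.0],
                   args=(t, monastic), method="Nelder-Mead")
    log_a, log_b, g = fit.x
    print(f"a = {np.exp(log_a):.3f}, b = {np.exp(log_b):.3f}, monastic effect g = {g:.2f}")
```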

  11. Using Monte Carlo techniques and parallel processing for debris hazard analysis of rocket systems

    Energy Technology Data Exchange (ETDEWEB)

    LaFarge, R.A.