WorldWideScience

Sample records for earthquake hazard analysis

  1. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia lie on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analysis of seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In applying them, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas can be readily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, add value and strength to remote sensing as a method for assessing earthquake risk and damage. Furthermore, this technique is expected to be considered in designing disaster-management policies and can help reduce the risk of natural disasters such as earthquakes in Indonesia.

  2. Probabilistic earthquake hazard analysis for Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic-tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated on a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the eastern zone's districts (e.g., El Nozha) and the lowest values in the northern and western zone's districts (e.g., El Sharabiya and El Khalifa).
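
    As a rough illustration of the logic-tree treatment of epistemic uncertainty described above, the sketch below combines annual-exceedance curves from alternative (hypothetical) branches into a weighted-mean hazard curve and reads off the ground motion for one of the return periods used in the study. The branch weights, the power-law hazard_curve helper and all numerical values are illustrative assumptions, not the authors' models.

```python
# Minimal sketch (not the authors' code): combining logic-tree branches in PSHA.
# Each branch pairs one seismotectonic model with one ground-motion prediction
# equation (GMPE); the branch weights are hypothetical and must sum to 1.
import numpy as np

pga = np.logspace(-2, 0, 50)                      # PGA levels in g

def hazard_curve(a, k):
    """Toy annual exceedance rate lambda(PGA) = a * PGA**(-k), illustrative only."""
    return a * pga ** (-k)

branches = [                                      # (weight, hazard curve)
    (0.40, hazard_curve(4e-5, 1.8)),              # source model A + GMPE 1
    (0.35, hazard_curve(3e-5, 2.0)),              # source model A + GMPE 2
    (0.25, hazard_curve(2e-5, 1.6)),              # source model B + GMPE 1
]
mean_curve = sum(w * lam for w, lam in branches)  # weighted-mean hazard curve

# Ground motion for one of the return periods used in the study (615 years)
target_rate = 1.0 / 615.0
pga_615 = np.interp(target_rate, mean_curve[::-1], pga[::-1])
print(f"Weighted-mean hazard PGA for a 615-yr return period: {pga_615:.2f} g")
```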

  3. Updated earthquake catalogue for seismic hazard analysis in Pakistan

    Science.gov (United States)

    Khan, Sarfraz; Waseem, Muhammad; Khan, Muhammad Asif; Ahmed, Waqas

    2018-03-01

    A reliable and homogenized earthquake catalogue is essential for seismic hazard assessment in any area. This article describes the compilation and processing of an updated earthquake catalogue for Pakistan. The earthquake catalogue compiled in this study for the region (a quadrangle bounded by the geographical limits 40-83° E and 20-40° N) includes 36,563 earthquake events, reported in the moment-magnitude (Mw) range 4.0-8.3 and spanning from 25 AD to 2016. Relationships are developed between moment magnitude and the body-wave and surface-wave magnitude scales to unify the catalogue in terms of Mw. The catalogue includes earthquakes from Pakistan and neighbouring countries to minimize the effects of geopolitical boundaries in seismic hazard assessment studies. Earthquakes reported by local and international agencies as well as individual catalogues are included. The catalogue is further used to obtain the magnitude of completeness after removal of dependent events using four different algorithms. Finally, seismicity parameters of the seismic sources are reported, and recommendations are made for seismic hazard assessment studies in Pakistan.
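
    Two of the processing steps mentioned above, estimating the magnitude of completeness and fitting Gutenberg-Richter seismicity parameters, are commonly done as in the sketch below. The sketch runs on synthetic magnitudes, not the actual Pakistan catalogue, and uses the maximum-curvature estimate of Mc together with the Aki/Utsu maximum-likelihood b-value; all numbers are illustrative.

```python
# Minimal sketch on synthetic data (not the actual Pakistan catalogue): estimate the
# magnitude of completeness Mc with the maximum-curvature method and the
# Gutenberg-Richter b-value with the Aki/Utsu maximum-likelihood estimator.
import numpy as np

rng = np.random.default_rng(42)
b_true, dm = 1.0, 0.1
# Continuous magnitudes above (Mc - dm/2), then binned to 0.1 units, mimicking a catalogue
mags = 3.95 + rng.exponential(1.0 / (b_true * np.log(10)), size=20000)
mags_binned = np.round(mags / dm).astype(int)          # integer bins of 0.1 magnitude

# Maximum-curvature Mc: the most populated magnitude bin
bins, counts = np.unique(mags_binned, return_counts=True)
mc = bins[np.argmax(counts)] * dm

# Aki/Utsu maximum-likelihood b-value above Mc (dm/2 corrects for binning)
m_sel = mags_binned[mags_binned * dm >= mc] * dm
b_ml = np.log10(np.e) / (m_sel.mean() - (mc - dm / 2.0))
print(f"Mc = {mc:.1f}, b = {b_ml:.2f} (true value {b_true})")
```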

  4. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs

  5. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    International Nuclear Information System (INIS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-01-01

    The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004 an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, and triggered a tsunami. These events caused many casualties and heavy material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of the research include a literature study, collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the values of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented as maps for a return period of 2475 years together with earthquake hazard curves for the cities of Medan and Banda Aceh. (paper)
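
    The core PSHA calculation referred to above combines an earthquake recurrence model with a ground-motion prediction equation (GMPE) to obtain the annual rate at which a given PGA is exceeded, from which the 2475-year value can be read. The sketch below is a minimal single-source version with an invented truncated Gutenberg-Richter source, a toy GMPE and a single set of distances; none of these numbers come from the study.

```python
# Minimal single-source PSHA sketch (illustrative parameters, not the study's model):
# lambda(PGA > x) = nu * sum_M sum_R P(M) P(R) P(PGA > x | M, R)
import numpy as np
from math import erf, sqrt, log

def gmpe_ln_pga(m, r_km):
    """Toy GMPE: mean ln(PGA in g). Coefficients are made up for illustration."""
    return -3.5 + 0.9 * m - 1.1 * np.log(r_km + 10.0)

def p_exceed(x, m, r_km, sigma=0.6):
    """P(PGA > x | M, R) with lognormal aleatory scatter."""
    z = (log(x) - gmpe_ln_pga(m, r_km)) / sigma
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Truncated Gutenberg-Richter source: nu events/yr with M >= 5, b = 1, Mmax = 9
nu, b, m_lo, m_hi = 0.5, 1.0, 5.0, 9.0
mags = np.linspace(m_lo, m_hi, 41)
beta = b * np.log(10)
pdf_m = beta * np.exp(-beta * (mags - m_lo)) / (1 - np.exp(-beta * (m_hi - m_lo)))
w_m = pdf_m / pdf_m.sum()                      # discretised magnitude weights

dists = np.linspace(20.0, 200.0, 19)           # site-to-source distances, equal weights
w_r = np.full(dists.size, 1.0 / dists.size)

pga_levels = np.logspace(-2, 0.3, 60)
rate = np.array([
    nu * sum(wm * wr * p_exceed(x, m, r)
             for m, wm in zip(mags, w_m) for r, wr in zip(dists, w_r))
    for x in pga_levels
])

pga_2475 = np.interp(1.0 / 2475.0, rate[::-1], pga_levels[::-1])
print(f"PGA with 2% in 50 yr (2475-yr return period): {pga_2475:.2f} g")
```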

  6. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    Science.gov (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of the earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located
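
    A minimal sketch of the performance-based combination described above is given below: the annual rate of liquefaction is the sum, over joint (PGA, magnitude) bins, of the conditional probability of liquefaction times the incremental annual rate of that bin, and its reciprocal is the return period of liquefaction. The hazard increments and the logistic conditional-probability model are illustrative stand-ins, not the Cetin et al. (2004) or Boulanger and Idriss (2014) relationships used in the study's software.

```python
# Minimal sketch of the performance-based combination: annual rate of liquefaction =
# sum over (PGA, M) bins of P(liquefaction | PGA, M) * incremental annual rate of the
# bin. Both the hazard increments and the conditional-probability model below are
# illustrative, not the relationships used in the study's software.
import numpy as np

pga_bins = np.array([0.05, 0.10, 0.20, 0.40])     # g
mag_bins = np.array([5.5, 6.5, 7.5])
d_lambda = np.array([                             # hypothetical annual rate per bin,
    [4e-2, 1e-2, 2e-3],                           # from PSHA disaggregation
    [8e-3, 4e-3, 1e-3],
    [1e-3, 8e-4, 3e-4],
    [1e-4, 1e-4, 8e-5],
])

def p_liq(pga, mag):
    """Toy conditional probability of liquefaction (logistic in ln PGA and M)."""
    return 1.0 / (1.0 + np.exp(-(3.0 * np.log(pga) + mag - 1.5)))

rate_liq = sum(
    d_lambda[i, j] * p_liq(pga_bins[i], mag_bins[j])
    for i in range(pga_bins.size) for j in range(mag_bins.size)
)
print(f"Annual rate of liquefaction: {rate_liq:.2e} -> return period ~ {1.0 / rate_liq:.0f} yr")
```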

  7. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    Science.gov (United States)

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  8. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
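
    To make the sensitivity discussed above concrete, the sketch below recomputes an annual exceedance rate for two lower-bound magnitudes, 3.75 and 5.0, the values quoted in the abstract for the LLNL and EPRI studies. The source model, GMPE and ground-motion level are toy assumptions chosen only to show the direction and rough size of the effect.

```python
# Toy sensitivity of a computed exceedance rate to the lower-bound magnitude (LBM).
# Mmin = 3.75 and 5.0 echo the LLNL and EPRI choices quoted above; the source model,
# GMPE and ground-motion level are hypothetical.
import numpy as np
from math import erf, sqrt

def p_exceed(pga, m, r_km=25.0, sigma=0.7):
    """P(PGA > pga | M, R) from a toy GMPE with lognormal scatter."""
    ln_med = -4.0 + 1.0 * m - 1.2 * np.log(r_km + 10.0)
    return 0.5 * (1.0 - erf((np.log(pga) - ln_med) / (sigma * sqrt(2.0))))

def exceedance_rate(pga, m_min, m_max=7.5, b=1.0, rate_ge_3=1.0):
    """Annual rate of PGA > pga from a truncated G-R source (rate_ge_3 = events/yr, M >= 3)."""
    beta = b * np.log(10)
    nu = rate_ge_3 * np.exp(-beta * (m_min - 3.0))        # annual rate of M >= m_min
    mags = np.linspace(m_min, m_max, 60)
    pdf = beta * np.exp(-beta * (mags - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))
    w = pdf / pdf.sum()                                   # discretised magnitude weights
    return nu * sum(wi * p_exceed(pga, mi) for mi, wi in zip(mags, w))

for m_min in (3.75, 5.0):
    print(f"Mmin = {m_min:4.2f}: lambda(PGA > 0.1 g) = {exceedance_rate(0.1, m_min):.2e} /yr")
# Small events add exceedances of modest ground motions, so the computed hazard is
# sensitive to where the magnitude integration is truncated.
```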

  9. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    Science.gov (United States)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10⁵ m³), produced major damage to buildings and infrastructure and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km², mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10⁵ m³) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km³ and 12 km³ produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with volumes of less than 1×10³ m³. The present work aims to define the relationship between the earthquake intensities described above and the size and areal distribution of the induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provided useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  10. A procedure for the determination of scenario earthquakes for seismic design based on probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Hirose, Jiro; Muramatsu, Ken

    2002-03-01

    This report presents a study on procedures for the determination of scenario earthquakes for the seismic design of nuclear power plants (NPPs) based on probabilistic seismic hazard analysis (PSHA). In recent years, the use of PSHA, which is a part of seismic probabilistic safety assessment (PSA), to determine the design basis earthquake motions for NPPs has been proposed. The identified earthquakes are called probability-based scenario earthquakes (PBSEs). The concept of PBSEs originates both from a study by the US NRC and from Ishikawa and Kameda. The assessment of PBSEs is composed of seismic hazard analysis and identification of dominant earthquakes. The objectives of this study are to formulate the concept of PBSEs and to examine the procedures for determining PBSEs for a domestic NPP site. This report consists of three parts, namely, procedures to compile analytical conditions for PBSEs, an assessment to identify PBSEs for a model site using Ishikawa's concept, and an examination of the uncertainties involved in the analytical conditions. The results obtained from the examination of PBSEs using Ishikawa's concept are as follows. (a) Since PBSEs are expressed by a hazard-consistent magnitude and distance in terms of a prescribed reference probability, it is easy to obtain a concrete image of the earthquakes that determine the ground response spectrum to be considered in the design of NPPs. (b) Source contribution factors provide information on the importance of the earthquake source regions and/or active faults, and allow the selection of a couple of PBSEs based on their importance to the site. (c) Since the analytical conditions involve uncertainty, sensitivity analyses on uncertainties that would affect the seismic hazard curves and the identification of PBSEs were performed on various aspects and provided useful insights for the assessment of PBSEs. One result from this sensitivity analysis was that, although the difference in selection of attenuation equations led to a
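
    The hazard-consistent magnitude and distance mentioned in point (a), and the source contribution factors in point (b), are typically obtained from a disaggregation of the hazard at the reference probability, roughly as in the sketch below. The disaggregation matrix here is hypothetical, not taken from the report.

```python
# Minimal sketch (illustrative numbers, not the study's data): hazard-consistent
# magnitude and distance as the contribution-weighted means of a disaggregation
# matrix computed at the reference exceedance probability.
import numpy as np

mags = np.array([5.5, 6.0, 6.5, 7.0, 7.5])
dists = np.array([10.0, 30.0, 60.0, 100.0])            # km

# Hypothetical contributions of each (M, R) bin to the hazard at the reference level
contrib = np.array([
    [0.02, 0.05, 0.03, 0.01],
    [0.04, 0.10, 0.06, 0.02],
    [0.05, 0.14, 0.09, 0.03],
    [0.03, 0.10, 0.08, 0.04],
    [0.01, 0.04, 0.04, 0.02],
])
contrib = contrib / contrib.sum()                      # normalise to unit total

m_bar = (contrib.sum(axis=1) * mags).sum()             # hazard-consistent magnitude
r_bar = (contrib.sum(axis=0) * dists).sum()            # hazard-consistent distance

# Source contribution factors: share of the hazard carried by each (M, R) bin,
# useful for ranking candidate scenario earthquakes
i, j = np.unravel_index(contrib.argmax(), contrib.shape)
print(f"M-bar = {m_bar:.2f}, R-bar = {r_bar:.0f} km; "
      f"dominant bin: M {mags[i]}, R {dists[j]:.0f} km ({contrib[i, j]:.0%})")
```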

  11. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  12. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Gürpinar, Aybars, E-mail: aybarsgurpinar2007@yahoo.com [Nuclear & Risk Consultancy, Anisgasse 4, 1221 Vienna (Austria); Serva, Leonello, E-mail: lserva@alice.it [Independent Consultant, Via dei Dauni 1, 00185 Rome (Italy); Livio, Franz, E-mail: franz.livio@uninsubria.it [Dipartimento di Scienza ed Alta Tecnologia, Università degli Studi dell’Insubria, Via Velleggio, 11, 22100 Como (Italy); Rizzo, Paul C., E-mail: paul.rizzo@rizzoasoc.com [RIZZO Associates, 500 Penn Center Blvd., Suite 100, Pittsburgh, PA 15235 (United States)

    2017-01-15

    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  13. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Gürpinar, Aybars; Serva, Leonello; Livio, Franz; Rizzo, Paul C.

    2017-01-01

    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  14. Seismic Hazard Analysis based on Earthquake Vulnerability and Peak Ground Acceleration using Microseismic Method at Universitas Negeri Semarang

    Science.gov (United States)

    Sulistiawan, H.; Supriyadi; Yulianti, I.

    2017-02-01

    Microseismic noise is a harmonic vibration of the ground that occurs continuously at low frequency. The characteristics of microseismic noise represent the characteristics of the soil layers through the value of their natural frequency. This paper presents an analysis of seismic hazard at Universitas Negeri Semarang using the microseismic method. Data acquisition was carried out at 20 points, with a spacing of 300 m between points, using a three-component seismometer. The data were processed using the Horizontal to Vertical Spectral Ratio (HVSR) method to obtain the natural frequency and amplification values. The natural frequency and amplification values were then used to determine the earthquake vulnerability index and the peak ground acceleration (PGA). The results show that the earthquake vulnerability values range from 0.2 to 7.5, while the average peak ground acceleration (PGA) is in the range of 10-24 gal. Therefore, the average peak ground acceleration corresponds to an earthquake intensity of IV on the MMI scale.
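
    A minimal version of the HVSR processing chain described above is sketched below on synthetic ambient noise: average the horizontal and vertical amplitude spectra over windows, form the H/V ratio, read the natural frequency f0 and amplification A at its peak, and compute a Nakamura-type vulnerability index Kg = A²/f0. The synthetic signals and all parameter values are illustrative; this is not the authors' processing code, and the PGA step is omitted.

```python
# Minimal HVSR sketch on synthetic ambient noise (not the authors' processing code).
import numpy as np

fs, minutes = 100.0, 30                               # sampling rate (Hz), record length
n = int(fs * 60 * minutes)
rng = np.random.default_rng(1)
t = np.arange(n) / fs

# Synthetic three-component microtremor: white noise everywhere, plus a weak 1.2 Hz
# site resonance on the horizontal components only (purely illustrative)
resonance = 0.1 * np.sin(2 * np.pi * 1.2 * t + rng.uniform(0, 2 * np.pi))
ew = rng.normal(size=n) + resonance
ns = rng.normal(size=n) + resonance
ud = rng.normal(size=n)

def mean_amp_spectrum(x, win=4096):
    """Average amplitude spectrum over non-overlapping Hann-tapered windows."""
    taper = np.hanning(win)
    segs = [x[i:i + win] * taper for i in range(0, x.size - win, win)]
    return np.mean([np.abs(np.fft.rfft(s)) for s in segs], axis=0)

freqs = np.fft.rfftfreq(4096, d=1.0 / fs)
h = np.sqrt(mean_amp_spectrum(ew) * mean_amp_spectrum(ns))   # geometric-mean horizontal
hvsr = h / mean_amp_spectrum(ud)

band = (freqs > 0.5) & (freqs < 20.0)                 # search band for the HVSR peak
f0 = freqs[band][np.argmax(hvsr[band])]
amp = hvsr[band].max()
kg = amp ** 2 / f0                                    # Nakamura-type vulnerability index
print(f"f0 = {f0:.2f} Hz, A = {amp:.1f}, Kg = {kg:.1f}")
```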

  15. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  16. Earthquake hazard analysis for the different regions in and around Ağrı

    Energy Technology Data Exchange (ETDEWEB)

    Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    We investigated earthquake hazard parameters for the eastern part of Turkey by determining the a and b parameters of the Gutenberg–Richter magnitude–frequency relationship. For this purpose, the study area was divided into seven source zones based on their tectonic and seismotectonic regimes. The database used in this work was taken from different sources and catalogues, such as TURKNET, the International Seismological Centre (ISC), the Incorporated Research Institutions for Seismology (IRIS) and The Scientific and Technological Research Council of Turkey (TUBITAK), for the instrumental period. We calculated the a value and the b value, the slope of the Gutenberg–Richter frequency–magnitude relationship, using the maximum likelihood method (ML). We also estimated the mean return periods, the most probable maximum magnitude in a time period of t years, and the probability of occurrence of an earthquake with magnitude ≥ M during a time span of t years. We used the Zmap software to calculate these parameters. The lowest b value was calculated in Region 1, which covers the Cobandede Fault Zone. We obtained the highest a value in Region 2, which covers the Kagizman Fault Zone. This conclusion is strongly supported by the probability value, which is largest (87%) for an earthquake with magnitude greater than or equal to 6.0. The mean return period for such a magnitude is lowest in this region (49 years). The most probable magnitude in the next 100 years was calculated, and the highest value was determined around the Cobandede Fault Zone. According to these parameters, Region 1, which covers the Cobandede Fault Zone, is the most dangerous area in the eastern part of Turkey.
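
    The quoted return period and occurrence probability are consistent under the usual Poisson recurrence assumption if the 87% figure refers to a 100-year window, as the short check below shows; the only input is the 49-year mean return period reported in the abstract.

```python
# Worked check of the quoted values under a Poisson recurrence model:
# P(at least one M >= 6.0 event in t years) = 1 - exp(-t / T), with T = 49 years.
import math

T = 49.0                                    # mean return period for M >= 6.0 (years)
for t in (30, 50, 100):
    print(f"P(M >= 6.0 within {t:3d} yr) = {1.0 - math.exp(-t / T):.0%}")
# -> about 46%, 64% and 87%; the 100-year figure reproduces the 87% quoted above.
```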

  17. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  18. The analysis of historical seismograms: an important tool for seismic hazard assessment. Case histories from French and Italian earthquakes

    International Nuclear Information System (INIS)

    Pino, N.A.

    2011-01-01

    Seismic hazard assessment relies on knowledge of the source characteristics of past earthquakes. Unfortunately, seismic waveform analysis, representing the most powerful tool for the investigation of earthquake source parameters, is only possible for events that occurred in the last 100-120 years, i.e., since seismographs with known response functions were developed. Nevertheless, during this time significant earthquakes have been recorded by such instruments, and today, also thanks to technological progress, these data can be recovered and analysed by means of modern techniques. In this paper, aiming to give a general sketch of the possible analyses and attainable results in historical seismogram studies, I briefly describe the major difficulties in processing the original waveforms and present a review of the results that I obtained from previous seismogram analyses of selected significant historical earthquakes that occurred during the first decades of the 20th century, including (A) the December 28, 1908, Messina straits (southern Italy), (B) the June 11, 1909, Lambesc (southern France) - both of which are the strongest ever recorded instrumentally in their respective countries - and (C) the July 13, 1930, Irpinia (southern Italy) events. For these earthquakes, the major achievements are the assessment of the seismic moment (A, B, C), the geometry and kinematics of faulting (B, C), and the fault length and an approximate slip distribution (A, C). The source characteristics of the studied events have also been interpreted in the frame of the tectonic environment active in the respective regions of interest. In spite of the difficulties inherent in the investigation of old seismic data, these results demonstrate the invaluable and irreplaceable role of historical seismogram analysis in defining the local seismogenic potential and, ultimately, in assessing the seismic hazard. The retrieved information is crucial in areas where important civil engineering works

  19. Analysis on Two Typical Landslide Hazard Phenomena in The Wenchuan Earthquake by Field Investigations and Shaking Table Tests

    Directory of Open Access Journals (Sweden)

    Changwei Yang

    2015-08-01

    Based on our field investigations of landslide hazards in the Wenchuan earthquake, some findings can be reported: (1) multi-aspect terrain with free faces, such as isolated mountains and thin ridges, reacted intensely to the earthquake and was seriously damaged; (2) the slope angles of most landslides were larger than 45°. Considering the above disaster phenomena, the reasons are analyzed based on shaking table tests of one-sided, two-sided and four-sided slopes. The analysis results show that: (1) the amplification of the peak accelerations of four-sided slopes is stronger than that of two-sided slopes, while that of one-sided slopes is the weakest, which can indirectly explain why the damage to such terrain is most serious; (2) the amplification of the peak accelerations gradually increases as the slope angle increases, with two inflection points at slope angles of 45° and 50°, respectively, which can explain the seismic phenomenon whereby landslide hazards mainly occur on slopes whose angle is larger than 45°. The amplification along the slope strike direction is basically consistent, and its variation is smooth.

  20. Investigation of tectonics and statistical analysis of earthquake hazard in Tange Sorkh dam

    OpenAIRE

    ZOLFAGHARI, Sayyed Yaghoub; RAFIEE, A.; HADI, S. M.R.; TAHERMANESH, R.

    2015-01-01

    Abstract. Today, the importance of earthquake risk is widely understood, given the intensification of the country's development, rising urbanization, the concentration of population and of material and intellectual capital, and the increased vulnerability of that capital in the seismic zones of Iran. Iran, one of the most seismically active countries in the world, has in recent years witnessed devastating earthquakes, for example the earthquakes of Rudbar - Manjil, Bojnoord, Zir Kouh Ghaena...

  1. Earthquake Hazard for Aswan High Dam Area

    Science.gov (United States)

    Ismail, Awad

    2016-04-01

    Earthquake activity and seismic hazard analysis are important components of the seismic evaluation of critical structures such as major dams. The Aswan High Dam (AHD) created the second man-made reservoir in the world (Lake Nasser); constructed near urban areas, it poses a high-risk potential for downstream life and property. The dam area is one of the seismically active regions in Egypt and is occupied by several cross faults, dominantly oriented east-west and north-south. Epicenters were found to cluster around active faults in the northern part of the lake and the AHD location. The space-time distribution of the seismicity and its relation to the lake water level fluctuations were studied. The Aswan seismicity separates into shallow and deep seismic zones, between 0 and 14 km and between 14 and 30 km, respectively. These two seismic zones behave differently over time, as indicated by the seismicity rate, lateral extent, b-value, and spatial clustering. The activity is characterized by earthquake swarm sequences showing activation of clustered events over time and space. The effect of the North African drought (1982 to present) is clearly seen in the reservoir water level. As the water level decreased and left the most active fault segments uncovered, the shallow activity was found to be more sensitive to rapid discharge than to filling. This study indicates that geology, topography, lineations in seismicity, offsets in the faults, changes in fault trends, and focal mechanisms are closely related. No relation was found between earthquake activity and either ground-water table fluctuations or water temperatures measured in wells located around the Kalabsha area. The peak ground acceleration at the dam site is estimated based on strong ground motion simulation. These seismic hazard analyses indicate that the AHD is stable under the present seismicity. Recent earthquake epicenters are located approximately 5 km west of the AHD structure. This suggests that the AHD dam must be

  2. ASSESSMENT OF EARTHQUAKE HAZARDS ON WASTE LANDFILLS

    DEFF Research Database (Denmark)

    Zania, Varvara; Tsompanakis, Yiannis; Psarropoulos, Prodromos

    Earthquake hazards may arise as a result of: (a) transient ground deformation, which is induced by seismic wave propagation, and (b) permanent ground deformation, which is caused by abrupt fault dislocation. Since the adequate performance of waste landfills after an earthquake is of utmost importance, the current study examines the impact of both types of earthquake hazards by performing efficient finite-element analyses. These also took into account the potential slip displacement development along the geosynthetic interfaces of the composite base liner. At first, the development of permanent...

  3. Lower bound earthquake magnitude for probabilistic seismic hazard evaluation

    International Nuclear Information System (INIS)

    McCann, M.W. Jr.; Reed, J.W.

    1990-01-01

    This paper presents the results of a study that develops an engineering and seismological basis for selecting a lower-bound magnitude (LBM) for use in seismic hazard assessment. As part of a seismic hazard analysis the range of earthquake magnitudes that are included in the assessment of the probability of exceedance of ground motion must be defined. The upper-bound magnitude is established by earth science experts based on their interpretation of the maximum size of earthquakes that can be generated by a seismic source. The lower-bound or smallest earthquake that is considered in the analysis must also be specified. The LBM limits the earthquakes that are considered in assessing the probability that specified ground motion levels are exceeded. In the past there has not been a direct consideration of the appropriate LBM value that should be used in a seismic hazard assessment. This study specifically looks at the selection of a LBM for use in seismic hazard analyses that are input to the evaluation/design of nuclear power plants (NPPs). Topics addressed in the evaluation of a LBM are earthquake experience data at heavy industrial facilities, engineering characteristics of ground motions associated with small-magnitude earthquakes, probabilistic seismic risk assessments (seismic PRAs), and seismic margin evaluations. The results of this study and the recommendations concerning a LBM for use in seismic hazard assessments are discussed. (orig.)

  4. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective seismic-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages in implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full tsunami waveform computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
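
    The Green's-function summation described above can be sketched as follows: waveforms precomputed for unit slip on each subfault are scaled by a scenario's slip values and summed, so any slip distribution can be synthesized from the same stored waveforms. The "precomputed" waveforms below are synthetic placeholders, not hydrodynamic simulations, and all numbers are illustrative.

```python
# Minimal sketch of the Green's-function summation described above: tsunami waveforms
# precomputed for unit slip on each subfault are scaled by a scenario's slip values and
# summed to get the scenario waveform at a coastal site. The "precomputed" waveforms
# here are synthetic placeholders, not hydrodynamic simulations.
import numpy as np

t = np.linspace(0, 3 * 3600, 2000)                    # 3 hours, seconds
n_subfaults = 4

def unit_waveform(arrival_s, period_s, amp_m):
    """Placeholder unit-slip (1 m) tsunami waveform at the site for one subfault."""
    env = np.exp(-((t - arrival_s) / (2 * period_s)) ** 2)
    return amp_m * env * np.sin(2 * np.pi * (t - arrival_s) / period_s)

# One precomputed waveform per subfault (rows), for 1 m of slip
greens = np.vstack([
    unit_waveform(arrival_s=1800 + 300 * k, period_s=900, amp_m=0.12 - 0.02 * k)
    for k in range(n_subfaults)
])

# A heterogeneous-slip scenario (m of slip on each subfault); the summation is linear,
# so any slip distribution can be synthesised from the same Green's functions
slip = np.array([2.0, 6.5, 3.0, 1.0])
eta = slip @ greens                                   # sea-surface elevation time series
print(f"Peak coastal amplitude for this scenario: {eta.max():.2f} m")
```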

  5. Assessment of seismic hazard for NPP sites in France: analysis of several aftershocks of the November 8, 1983, Liege earthquake

    International Nuclear Information System (INIS)

    Mohammadioun, B.; Mohammadioun, G.; Bresson, A.

    1984-03-01

    Current French practice for assessing seismic hazard on the sites of nuclear facilities is outlined. The procedure calls for as rich and varied an assortment of actual earthquake recordings as can be procured, including earthquakes in France itself and in nearby countries, recorded by the CEA/IPSN's own staff. Following the November 8, 1983, Liege earthquake, suitably equipped, temporary recording stations were set up in the epicentral area in order to record its aftershocks. Ground motion time histories and response spectra were computed for several of these, and a quality factor Q was derived from these data for the most superficial sedimentary layers of the area. The values obtained show reasonable agreement with ones found for similar materials in other regions

  6. Playing against nature: improving earthquake hazard mitigation

    Science.gov (United States)

    Stein, S. A.; Stein, J.

    2012-12-01

    The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Hence, if and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's is too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus, in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Thus society needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation costs. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the

  7. Fragility analysis of flood protection structures in earthquake and flood prone areas around Cologne, Germany for multi-hazard risk assessment

    Science.gov (United States)

    Tyagunov, Sergey; Vorogushyn, Sergiy; Munoz Jimenez, Cristina; Parolai, Stefano; Fleming, Kevin; Merz, Bruno; Zschau, Jochen

    2013-04-01

    The work presents a methodology for fragility analyses of fluvial earthen dikes in earthquake and flood prone areas. Fragility estimates are being integrated into the multi-hazard (earthquake-flood) risk analysis being undertaken within the framework of the EU FP7 project MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) for the city of Cologne, Germany. Scenarios of probable cascading events due to the earthquake-triggered failure of flood protection dikes and the subsequent inundation of surroundings are analyzed for the area between the gauges Andernach and Düsseldorf along the Rhine River. Along this river stretch, urban areas are partly protected by earthen dikes, which may be prone to failure during exceptional floods and/or earthquakes. The seismic fragility of the dikes is considered in terms of liquefaction potential (factor of safety), estimated by the use of the simplified procedure of Seed and Idriss. It is assumed that initiation of liquefaction at any point throughout the earthen dikes' body corresponds to the failure of the dike and, therefore, this should be taken into account for the flood risk calculations. The estimated damage potential of such structures is presented as a two-dimensional surface (as a function of seismic hazard and water level). Uncertainties in geometrical and geotechnical dike parameters are considered within the framework of Monte Carlo simulations. Taking into consideration the spatial configuration of the existing flood protection system within the area under consideration, seismic hazard curves (in terms of PGA) are calculated for sites along the river segment of interest at intervals of 1 km. The obtained estimates are used to calculate the flood risk when considering the temporal coincidence of seismic and flood events. Changes in flood risk for the considered hazard cascade scenarios are quantified and compared to the single-hazard scenarios.
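
    A minimal sketch in the spirit of the fragility analysis described above is given below: the probability of liquefaction triggering (factor of safety below 1) in a foundation layer is estimated as a function of PGA using the simplified Seed-Idriss cyclic stress ratio and Monte Carlo sampling of uncertain inputs. The soil parameters, the scenario magnitude and the lognormal CRR stand-in (used here instead of an SPT-based resistance) are hypothetical, not the Cologne-area values.

```python
# Minimal fragility sketch: probability of liquefaction triggering (FS < 1) in a dike
# foundation layer as a function of PGA, using the simplified Seed-Idriss cyclic stress
# ratio and Monte Carlo sampling of uncertain inputs. Soil parameters and the CRR
# distribution are hypothetical, not the Cologne-area values.
import numpy as np

rng = np.random.default_rng(7)
n_sim = 20000
depth = 4.0                                            # m, critical layer below dike toe
gamma = rng.normal(19.0, 1.0, n_sim)                   # unit weight, kN/m3
gw_depth = rng.uniform(1.0, 2.5, n_sim)                # water table depth, m
crr_75 = rng.lognormal(mean=np.log(0.18), sigma=0.25, size=n_sim)  # CRR for M = 7.5

sigma_v = gamma * depth                                # total vertical stress, kPa
sigma_v_eff = sigma_v - 9.81 * np.maximum(depth - gw_depth, 0.0)   # effective stress
rd = 1.0 - 0.00765 * depth                             # stress reduction factor (z < 9.15 m)

def p_liquefaction(pga_g, magnitude=6.5):
    csr = 0.65 * pga_g * (sigma_v / sigma_v_eff) * rd  # cyclic stress ratio
    msf = 10 ** 2.24 / magnitude ** 2.56               # magnitude scaling factor
    fs = crr_75 * msf / csr                            # factor of safety
    return np.mean(fs < 1.0)

for pga in (0.10, 0.20, 0.30, 0.40):
    print(f"PGA = {pga:.2f} g  ->  P(FS < 1) = {p_liquefaction(pga):.2f}")
```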

  8. Probabilistic Tsunami Hazard Analysis of the Pacific Coast of Mexico: Case Study Based on the 1995 Colima Earthquake Tsunami

    Directory of Open Access Journals (Sweden)

    Nobuhito Mori

    2017-06-01

    This study develops a novel computational framework to carry out probabilistic tsunami hazard assessment for the Pacific coast of Mexico. The new approach enables the consideration of stochastic tsunami source scenarios having variable fault geometry and heterogeneous slip that are constrained by an extensive database of rupture models for historical earthquakes around the world. The assessment focuses upon the 1995 Jalisco–Colima Earthquake Tsunami from a retrospective viewpoint. Numerous source scenarios of large subduction earthquakes are generated to assess the sensitivity and variability of tsunami inundation characteristics of the target region. Analyses of nine slip models along the Mexican Pacific coast are performed, and statistical characteristics of the slips (e.g., coherent structures of the slip spectra) are estimated. The source variability allows exploring a wide range of tsunami scenarios for a moment magnitude (Mw) 8 subduction earthquake in the Mexican Pacific region to conduct thorough sensitivity analyses and to quantify the tsunami height variability. The numerical results indicate a strong sensitivity of maximum tsunami height to major slip locations in the source and indicate major uncertainty at the first peak of the tsunami waves.

  9. National Earthquake Hazards Program at a Crossroads

    Science.gov (United States)

    Showstack, Randy

    The U.S. National Earthquake Hazards Reduction Program, which turns 25 years old on 1 October 2003, is passing through two major transitions, which experts said either could weaken or strengthen the program. On 1 March, a federal government reorganization placed NEHRP's lead agency, the Federal Emergency Management Agency (FEMA), within the new Department of Homeland Security (DHS). A number of earthquake scientists and engineers expressed concern that NEHRP, which already faces budgetary and organizational challenges, and lacks visibility, could end up being marginalized in the bureaucratic shuffle. Some experts, though, as well as agency officials, said they hope DHS will recognize synergies between dealing with earthquakes and terrorist attacks.

  10. Earthquake Hazard and Risk in Alaska

    Science.gov (United States)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model for Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including: extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increases the number of crustal faults from ten in 2007 to 91 in the 2015 model. This includes the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth. In this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), and large magnitude 8+ events had a low recurrence rate (characteristic model) and therefore did not contribute as highly to the overall risk. We will review these recurrence rates, and will present the results and impact for Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact the model changes have on Alaska earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the

  11. Aftershock Duration of the 1976 Ms 7.8 Tangshan Earthquake: Implication for the Seismic Hazard Model with a Sensitivity Analysis

    Science.gov (United States)

    Zhong, Q.; Shi, B.

    2011-12-01

    described as a function of background seismicity rate. From this approach, a two-dimensional spatial distribution pattern of the aftershock duration can be obtained, and the resulting time length of the aftershock sequence is about 130-160 years, which is much larger than that given by the previous two approaches. The conclusion is that the current earthquakes occurring in the Tangshan region after the 1976 main event could still be aftershocks of the 1976 main event. Based on the distributed seismic hazard approach, a sensitivity analysis has been carried out in the PSHA calculation using different earthquake catalogs. The calculated results show that the seismic hazard level in Tangshan based on the seismic data before the 1976 main event is much smaller than that derived from the earthquake catalog before 2006. However, when the Tangshan mainshock and the two other largest aftershocks are added to the earthquake catalog before the 1976 main event, the seismic hazard level in the Tangshan area does not change significantly. Therefore, we conclude that a single or individual earthquake makes almost no significant contribution to the change in seismic hazard level, whereas the aftershock sequences included in the earthquake catalogs could cause the regional seismic hazard level to be overestimated.

  12. Up-to-date Probabilistic Earthquake Hazard Maps for Egypt

    Science.gov (United States)

    Gaber, Hanan; El-Hadidy, Mahmoud; Badawy, Ahmed

    2018-04-01

    An up-to-date earthquake hazard analysis has been performed for Egypt using a probabilistic seismic hazard approach. In the current study, we use a complete and homogeneous earthquake catalog covering the time period between 2200 BC and 2015 AD. Three seismotectonic models representing the seismic activity in and around Egypt are used. A logic-tree framework is applied to allow for the epistemic uncertainty in the declustering parameters, minimum magnitude, seismotectonic setting and ground-motion prediction equations. The hazard analysis is performed on a grid of 0.5° × 0.5° for rock site conditions, for the peak ground acceleration (PGA) and spectral accelerations at 0.2-, 0.5-, 1.0- and 2.0-s periods. The hazard is estimated for three return periods (72, 475 and 2475 years) corresponding to 50, 10 and 2% probability of exceedance in 50 years. The uniform hazard spectra for the cities of Cairo, Alexandria, Aswan and Nuwbia are constructed. The hazard maps show that the highest ground acceleration values are expected in the northeastern part of Egypt around the Gulf of Aqaba (PGA up to 0.4 g for a return period of 475 years) and in southern Egypt around the city of Aswan (PGA up to 0.2 g for a return period of 475 years). The Western Desert of Egypt is characterized by the lowest level of hazard (PGA lower than 0.1 g for a return period of 475 years).
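
    The correspondence between the three return periods and the quoted probabilities of exceedance follows from the usual Poisson assumption, T = -t / ln(1 - P), as the short check below confirms.

```python
# Worked check of the return periods quoted above under the usual Poisson assumption:
# a probability of exceedance P in t years corresponds to T = -t / ln(1 - P).
import math

t = 50.0
for p in (0.50, 0.10, 0.02):
    T = -t / math.log(1.0 - p)
    print(f"{p:.0%} in {t:.0f} yr  ->  return period ~ {T:.0f} yr")
# -> roughly 72, 475 and 2475 years, matching the three hazard levels in the abstract.
```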

  13. Earthquake Hazard and Risk in New Zealand

    Science.gov (United States)

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

    To quantify risk in New Zealand we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), will update several key source parameters. These updates include: implementation of a new set of crustal faults including multi-segment ruptures, updating the subduction zone geometry and recurrence rate, and implementing new background rates and a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 from the 2002 model to the 2012 model, which now includes over 500 individual fault sources. This includes the addition of many offshore faults in the northern, east-central, and southwestern regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard changes in our updated model with those from the previous version. Changes between the two maps are discussed, as well as the drivers for these changes. We examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city with the majority of the risk exposure in the country (Auckland) lies in the region of lowest hazard, where we don't have a lot of information about the location of faults and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. Thus small changes to the background rates

  14. Insights into earthquake hazard map performance from shaking history simulations

    Science.gov (United States)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher
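
    A stripped-down version of the simulation experiment described above is sketched below: many finite shaking histories are drawn from an assumed source model, and the fraction of histories whose maximum shaking exceeds the "mapped" 10%-in-50-year value is counted. The rate model, toy GMPE, fixed source distance and window length are invented for illustration and are far simpler than the study's setup.

```python
# Stripped-down map-performance experiment: simulate 50-year shaking histories from an
# assumed source model and count how often the maximum observed shaking exceeds the
# "mapped" 10%-in-50-yr value derived from the same model. All parameters are invented.
import numpy as np

rng = np.random.default_rng(3)
rate_per_yr, window_yr, n_histories = 0.2, 50, 5000    # M >= 5 events near the site
b, m_min, m_max, sigma = 1.0, 5.0, 7.5, 0.6

def sample_magnitudes(n):
    """Truncated Gutenberg-Richter magnitudes (inverse-CDF sampling)."""
    beta = b * np.log(10)
    c = 1 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1 - rng.random(n) * c) / beta

def pga(m, r_km=30.0):
    """Toy GMPE median with lognormal aleatory variability, single fixed distance."""
    return np.exp(-4.0 + m - 1.2 * np.log(r_km + 10.0) + rng.normal(0, sigma, m.size))

# "Mapped" value: PGA exceeded at an annual rate of 1/475 (10% in 50 yr), obtained by
# brute force from a very long synthetic catalogue of the same model
years_long = 2_000_000
pga_long = pga(sample_magnitudes(rng.poisson(rate_per_yr * years_long)))
grid = np.linspace(0.01, 2.0, 400)
rates = np.array([np.count_nonzero(pga_long > x) / years_long for x in grid])
mapped = grid[np.argmin(np.abs(rates - 1.0 / 475.0))]

# Finite 50-year histories: how often does the observed maximum exceed the mapped value?
exceed = 0
for _ in range(n_histories):
    n_ev = rng.poisson(rate_per_yr * window_yr)
    if n_ev and pga(sample_magnitudes(n_ev)).max() > mapped:
        exceed += 1
print(f"Mapped 10%-in-50-yr PGA ~ {mapped:.2f} g; "
      f"fraction of 50-yr histories with an exceedance: {exceed / n_histories:.2f}")
# Expected fraction is ~0.10 by construction; individual histories scatter widely.
```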

  15. Tiechanshan-Tunghsiao anticline earthquake analysis: Implications for northwestern Taiwan potential carbon dioxide storage site seismic hazard

    Directory of Open Access Journals (Sweden)

    Ruey-Juin Rau

    2017-01-01

    Full Text Available We analyze the seismicity and earthquake focal mechanisms beneath the Tiechanshan-Tunghsiao (TCS-TH) anticline over the last two decades for the seismic hazard evaluation of a potential carbon dioxide storage site in northwestern Taiwan. Seismicity in the TCS-TH anticline indicates both spatial and temporal clustering at a depth range of 7 - 12 km. Thirteen 3.0 ≤ ML ≤ 5.2 earthquake focal mechanisms show a combination of thrust, strike-slip, and normal faulting mechanisms under the TCS-TH anticline. A 1992 ML 5.2 earthquake with a focal depth of ~10 km, the largest event ever recorded beneath the TCS-TH anticline during the last two decades, has a normal fault mechanism with the T-axis trending NNE-SSW and nodal planes oriented NNW-SSE, dipping either gently to the NNE or steeply to the SSW. Thrust fault mechanisms occurred with mostly E-W or NWW-SEE striking P-axes, and strike-slip faulting events indicate NWW-SEE striking P-axes and NNE-SSW trending T-axes, consistent with the regional plate convergence direction. For the strike-slip faulting events, if we take the N-S or NNW-SSE striking nodal planes as the fault planes, the strike-slip faults show sinistral motion and correspond to the Tapingting fault, a strike-slip fault reactivated from an inherited normal fault that intersects the Tiechanshan and Tunghsiao anticlines.

  16. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

    demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations, and (c) a more ethically responsible control over how seismic hazard and seismic risk are implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to the available geological, geomorphologic, seismic, and tectonic evidence and data combined with deterministic pattern recognition methodologies, specifically when the intention is to PREDICT the PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. It proves that Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such information in advance of extreme catastrophes, which are LOW PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate shifting the minds of the community from pessimistic disbelief to the optimistic challenge of neo-deterministic Hazard Predictability.

  17. Earthquake Hazard Assessment: an Independent Review

    Science.gov (United States)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH) including short-term earthquake forecast/prediction (StEF), is not an easy task; it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, are evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making with regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
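    As a concrete illustration of the error-diagram bookkeeping described above (a minimal sketch on assumed synthetic data, not Molchan's or the author's implementation), the snippet below computes, for a threshold-type alarm on a gridded hazard map, the miss rate (nu) and the alerted-space fraction (tau), and compares nu with the 1 - tau expected from random guessing, i.e. the Seismic Roulette null hypothesis. Because the synthetic events here are placed uniformly at random, nu stays close to 1 - tau; a map with real skill would give points well below that diagonal.

```python
# One point of an error diagram for a threshold-type alarm on a gridded hazard map:
# the alerted-space fraction tau and the rate of failures-to-predict nu, compared with
# the random-guess expectation nu ~ 1 - tau.  Hazard values and event locations are
# synthetic; events are placed uniformly, so no skill should (and does not) appear.
import numpy as np

rng = np.random.default_rng(1)

n_cells = 1000
mapped_hazard = rng.lognormal(mean=0.0, sigma=1.0, size=n_cells)   # arbitrary hazard proxy
event_cells = rng.integers(0, n_cells, size=50)                    # cells hit by target events

def error_diagram_point(threshold):
    alarmed = mapped_hazard >= threshold
    tau = alarmed.mean()                         # fraction of space alarmed
    nu = np.mean(~alarmed[event_cells])          # fraction of target events missed
    return tau, nu

for thr in np.quantile(mapped_hazard, [0.5, 0.8, 0.95]):
    tau, nu = error_diagram_point(thr)
    print(f"threshold {thr:5.2f}:  tau = {tau:.2f}  nu = {nu:.2f}  random-guess nu ~ {1.0 - tau:.2f}")
```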

  18. Global Positioning System data collection, processing, and analysis conducted by the U.S. Geological Survey Earthquake Hazards Program

    Science.gov (United States)

    Murray, Jessica R.; Svarc, Jerry L.

    2017-01-01

    The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
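    To make the final time-series-analysis step concrete, here is a minimal sketch with synthetic data and white noise only; it is our own illustration, not the USGS GIPSY processing chain, which additionally uses a temporally correlated noise model. A single-component position series is fit by least squares for an interseismic velocity, a coseismic offset, and a logarithmic postseismic term; the earthquake time and decay constant are assumed values.

```python
# Least-squares fit of a single-component GPS position series with an interseismic
# velocity, a coseismic offset, and a logarithmic postseismic term.  Synthetic data,
# white noise only, assumed earthquake time and decay constant.
import numpy as np

rng = np.random.default_rng(0)

t = np.arange(0.0, 10.0, 1.0 / 365.0)        # time in years, roughly daily samples
t_eq, tau = 6.2, 0.1                         # earthquake time (yr) and decay time (yr), assumed

heav = (t >= t_eq).astype(float)                               # Heaviside step at the earthquake
post = heav * np.log1p(np.maximum(t - t_eq, 0.0) / tau)        # logarithmic postseismic shape

# Synthetic "truth": 12 mm/yr velocity, 40 mm coseismic offset, 15 mm postseismic amplitude.
truth = 12.0 * t + 40.0 * heav + 15.0 * post
obs = truth + rng.normal(0.0, 2.0, t.size)   # 2 mm white noise (real series also have colored noise)

# Design matrix: intercept, velocity, coseismic step, postseismic log term.
G = np.column_stack([np.ones_like(t), t, heav, post])
m, *_ = np.linalg.lstsq(G, obs, rcond=None)

for name, value in zip(["intercept (mm)", "velocity (mm/yr)",
                        "coseismic offset (mm)", "postseismic amplitude (mm)"], m):
    print(f"{name:25s} {value:8.2f}")
```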

  19. Decision making based on analysis of benefit versus costs of preventive retrofit versus costs of repair after earthquake hazards

    Science.gov (United States)

    Bostenaru Dan, M.

    2012-04-01

    This presentation considers interventions on seismically vulnerable early reinforced concrete skeleton buildings from the interwar period, at different performance levels ranging from collapse prevention up to assuring immediate post-earthquake functionality. Between these two poles there are degrees of damage depending on the performance aim that is set, and the costs of retrofit and post-earthquake repair differ depending on the targeted performance. Not only the earthquake has an impact on a heritage building, but also the retrofit measure, for example on its appearance or its functional layout. In this way the criteria of the structural engineer, the investor, the architect/conservator/urban planner, and the owner/inhabitants of the neighbourhood are considered in taking a benefit-cost decision. A decision based on benefit-cost analysis is one element of a risk management process. A solution must be found on how much change to accept for the retrofit and how much repairable damage to take into account. There are two impact studies. Numerical simulation was run for the considered building typology for successive earthquakes, selected in a deterministic way (1977, 1986 and two 1991 events from Vrancea, Romania, and the 1978 Thessaloniki, Greece, earthquake), also considering the case in which retrofit is done between two earthquakes. The building typology itself was studied not only for Greece and Romania, but for numerous European countries, including Italy. The typology was compared to earlier reinforced concrete buildings with the Hennebique system, in order to see to what extent these can be considered structural heritage and to shape the criteria of the architect/conservator. Based on the typology study two model buildings were designed, and for one of these different retrofit measures (side walls, structural walls, steel braces, steel jacketing) were considered, while for the other one of these retrofit techniques (diagonal braces, which permits adding also active measures such as energy
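    A toy benefit-cost comparison of the kind the abstract describes might look like the sketch below. Every number (event probability, damage ratios, costs) is invented for illustration, and discounting and the non-monetary criteria discussed above (appearance, functional layout, heritage value) are deliberately left out.

```python
# Toy, undiscounted benefit-cost comparison of a preventive retrofit versus expected
# post-earthquake repair costs.  All monetary values, probabilities, and damage ratios
# are invented for illustration.
annual_event_probability = 0.02        # chance per year of a damaging Vrancea-type event (assumed)
remaining_life_years = 50
building_value = 1_000_000             # EUR, assumed

# Assumed mean damage ratios (fraction of building value needing repair per event).
damage_ratio_unretrofitted = 0.45
damage_ratio_retrofitted = 0.10
retrofit_cost = 120_000                # EUR, assumed

def expected_repair_cost(damage_ratio):
    return annual_event_probability * remaining_life_years * damage_ratio * building_value

benefit = expected_repair_cost(damage_ratio_unretrofitted) - expected_repair_cost(damage_ratio_retrofitted)
bcr = benefit / retrofit_cost
print(f"expected avoided repair cost: {benefit:,.0f} EUR, benefit-cost ratio: {bcr:.2f}")
print("retrofit justified on this criterion" if bcr > 1.0 else "retrofit not justified on this criterion")
```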

  20. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    Science.gov (United States)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in the temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalence relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards over different terms can be estimated. This paper mainly deals with medium- to short-term precursors observed in Japan and China.
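    For readers unfamiliar with the fuzzy-mathematics vocabulary, the sketch below shows the two simplest ingredients in hedged, illustrative form: a membership function that turns an annual seismicity index into a degree of "high activity", and a min/max fuzzy similarity between two activity patterns. The paper's actual fuzzy-netting, fuzzy-equivalence-relation, and fuzzy pattern recognition methods are more elaborate and are not reproduced here; the counts and thresholds are invented.

```python
# Illustrative fuzzy-set building blocks: a triangular-ramp membership function for
# "high seismic activity" and a min/max fuzzy similarity between two annual series.
import numpy as np

def membership_high_activity(x, low=5.0, high=20.0):
    """Degree (0..1) to which an annual event count x counts as 'high activity' (assumed thresholds)."""
    return np.clip((np.asarray(x, dtype=float) - low) / (high - low), 0.0, 1.0)

def fuzzy_similarity(a, b):
    """Ratio of intersection to union of two membership vectors (1 = identical patterns)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.minimum(a, b).sum() / np.maximum(a, b).sum()

# Two hypothetical 10-year sequences of annual event counts in a region.
counts_period_1 = [3, 6, 9, 14, 18, 22, 17, 12, 8, 5]
counts_period_2 = [4, 5, 10, 13, 19, 21, 15, 11, 7, 6]

mu1 = membership_high_activity(counts_period_1)
mu2 = membership_high_activity(counts_period_2)
print("similarity of the two activity patterns:", round(float(fuzzy_similarity(mu1, mu2)), 3))
```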

  1. Analysis of earthquake parameters to generate hazard maps by integrating AHP and GIS for Küçükçekmece region

    Directory of Open Access Journals (Sweden)

    T. Erden

    2012-02-01

    Full Text Available The definition of an earthquake includes parameters with respect to the region of interest. Each of these parameters has a different weight on the earthquake ground motion and its effects. This study examines the weights of common parameters that influence the effects of earthquakes. The Analytic Hierarchy Process (AHP) is used for factor weighting of each parameter and Geographic Information Systems (GIS) are used for simulating the results of the AHP in a spatial environment. The aim of this study is to generate a hierarchical structure of the model for the simulation of an earthquake hazard map (EHM). The parameters of the EHM, which are selected by the criterion of non-correlated factors, are: topography, distance to epicenter, soil classification, liquefaction, and fault/focal mechanism. As a result of the study, the weights of the parameters that affect the earthquake ground motion in the study area are determined and compared with a selected attenuation relation map.
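    The AHP weighting step can be illustrated with a short sketch. The pairwise comparison matrix below is entirely hypothetical and does not reproduce the paper's expert judgements; the factor names follow the abstract. The code extracts the priority vector from the principal eigenvector and checks Saaty's consistency ratio.

```python
# AHP factor weighting: pairwise comparison matrix -> priority vector (principal
# eigenvector) -> consistency ratio.  The judgements in A are illustrative only.
import numpy as np

factors = ["topography", "distance to epicenter", "soil classification",
           "liquefaction", "fault/focal mechanism"]

# A[i, j] = relative importance of factor i over factor j (Saaty 1-9 scale), A[j, i] = 1/A[i, j].
A = np.array([
    [1.0, 1/3, 1/2, 1/2, 1/4],
    [3.0, 1.0, 2.0, 2.0, 1/2],
    [2.0, 1/2, 1.0, 1.0, 1/3],
    [2.0, 1/2, 1.0, 1.0, 1/3],
    [4.0, 2.0, 3.0, 3.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalized priority vector

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI, with RI = 1.12 for n = 5.
n = A.shape[0]
lambda_max = eigvals[k].real
CR = ((lambda_max - n) / (n - 1)) / 1.12

for f, w in zip(factors, weights):
    print(f"{f:24s} weight = {w:.3f}")
print(f"consistency ratio CR = {CR:.3f}  (CR < 0.1 is conventionally acceptable)")
```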

  2. Geotechnical approach for occupational safety risk analysis of critical slope in open pit mining as implication for earthquake hazard

    Science.gov (United States)

    Munirwansyah; Irsyam, Masyhur; Munirwan, Reza P.; Yunita, Halida; Zulfan Usrina, M.

    2018-05-01

    Occupational safety and health (OSH) is a planned effort to prevent accidents and diseases caused by work. Mining activities often involve work accidents caused by unsafe field conditions. In open-pit mine areas, slumps frequently occur due to unstable slopes, which can disrupt the activities and productivity of mining companies. Research on the stability of open-pit slopes conducted by Febrianti [8] indicated unsafe slope conditions at the Meureubo coal mine located in the Aceh Barat district; the present study continues that research from an OSH perspective on landslides, aiming to understand the stability of the excavation slope and the shape of the slope collapse. Plaxis software was used for this research. After analyzing the slope stability and the effect of landslides on OSH with the Job Safety Analysis (JSA) method to identify hazards to work safety, a risk management analysis is conducted to classify the hazard level and its handling technique. The aim of this research is to determine the level of work accident risk at the company and the corresponding prevention efforts. The risk analysis yields a very high risk value (> 350), meaning the activity must be stopped until the risk can be reduced below the allowed or accepted limit of < 20.
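    The risk-scoring logic implied by the stated thresholds can be sketched as follows. The probability x exposure x consequence product and the intermediate band label are our assumptions; only the > 350 ("stop the activity") and < 20 ("acceptable") limits come from the abstract.

```python
# Sketch of a risk-value classification for one job step, using only the two thresholds
# quoted in the abstract; the scoring product and the middle band label are assumptions.
def risk_value(probability: float, exposure: float, consequence: float) -> float:
    """Composite risk score for one job step."""
    return probability * exposure * consequence

def classify(risk: float) -> str:
    if risk > 350:
        return "very high - stop the activity until risk is reduced"
    if risk < 20:
        return "acceptable"
    return "intermediate - risk reduction and controls required (assumed band)"

# Hypothetical job step on a critical open-pit slope.
r = risk_value(probability=6.0, exposure=10.0, consequence=7.0)
print(f"risk value = {r:.0f} -> {classify(r)}")
```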

  3. A global probabilistic tsunami hazard assessment from earthquake sources

    Science.gov (United States)

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  4. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    Science.gov (United States)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and perform a statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in northeastern Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical, and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for a region. In this study, the earthquake hazard evaluation procedure developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
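    For orientation, the classical Cornell-type PSHA integration mentioned above can be written compactly as in the sketch below. The source geometry, recurrence parameters, and ground-motion model are illustrative assumptions unrelated to the Constantine study; the sketch only shows the structure of the calculation: integrate the exceedance probability of a ground-motion level over magnitude and distance, weighted by their densities, and scale by the activity rate.

```python
# Structure of a Cornell-type PSHA integration for one areal source: annual rate of
# exceeding each PGA level = activity rate x double integral over magnitude and
# distance of P(PGA > level | m, r) weighted by the magnitude and distance densities.
# All parameter values and the toy ground-motion model are illustrative assumptions.
import numpy as np
from scipy.stats import norm

nu = 0.5                                   # annual rate of M >= m_min events in the source
m_min, m_max, b = 4.5, 7.0, 0.9
beta = b * np.log(10.0)

mags = np.linspace(m_min, m_max, 60)
dists = np.linspace(10.0, 120.0, 60)       # km, site-to-source distances

# Truncated-exponential magnitude density and an assumed uniform distance density.
f_m = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
f_r = np.full(dists.size, 1.0 / (dists[-1] - dists[0]))

def gmpe_ln_mean(m, r):
    """Toy GMPE: mean ln PGA[g] as a function of magnitude and distance (assumed coefficients)."""
    return -3.5 + 1.0 * m - 1.2 * np.log(r) - 0.004 * r

sigma_ln = 0.6
pga_levels = np.logspace(-2, 0, 40)        # 0.01 g to 1 g

M, R = np.meshgrid(mags, dists, indexing="ij")
weights = np.outer(f_m, f_r) * np.gradient(mags)[:, None] * np.gradient(dists)[None, :]
ln_mean = gmpe_ln_mean(M, R)

annual_rate = np.array([
    nu * np.sum(weights * norm.sf((np.log(a) - ln_mean) / sigma_ln))
    for a in pga_levels
])

for a, lam in zip(pga_levels[::8], annual_rate[::8]):
    p50 = 1.0 - np.exp(-lam * 50.0)        # Poisson probability of exceedance in 50 years
    print(f"PGA {a:6.3f} g : annual exceedance rate {lam:.2e}, P(exceedance in 50 yr) = {p50:.1%}")
```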

  5. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    Science.gov (United States)

    McNamara, D. E.; Yeck, W. L.; Barnhart, W. D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, A.; Hough, S. E.; Benz, H. M.; Earle, P. S.

    2017-09-01

    The Gorkha earthquake on April 25th, 2015 was a long-anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a 150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10-15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered high hazard for future damaging earthquakes.

  6. Geotechnical hazards from large earthquakes and heavy rainfalls

    CERN Document Server

    Kazama, Motoki; Lee, Wei

    2017-01-01

    This book is a collection of papers presented at the International Workshop on Geotechnical Natural Hazards held July 12–15, 2014, in Kitakyushu, Japan. The workshop was the sixth in the series of Japan–Taiwan Joint Workshops on Geotechnical Hazards from Large Earthquakes and Heavy Rainfalls, held under the auspices of the Asian Technical Committee No. 3 on Geotechnology for Natural Hazards of the International Society for Soil Mechanics and Geotechnical Engineering. It was co-organized by the Japanese Geotechnical Society and the Taiwanese Geotechnical Society. The contents of this book focus on geotechnical and natural hazard-related issues in Asia such as earthquakes, tsunami, rainfall-induced debris flows, slope failures, and landslides. The book contains the latest information and mitigation technology on earthquake- and rainfall-induced geotechnical natural hazards. By dissemination of the latest state-of-the-art research in the area, the information contained in this book will help researchers, des...

  7. Seismic hazard maps for earthquake-resistant construction designs

    International Nuclear Information System (INIS)

    Ohkawa, Izuru

    2004-01-01

    Based on the idea that seismic phenomena in Japan, which vary among localities, should be reflected in the design of specific nuclear facilities at specific sites, the present research program was started to produce seismic hazard maps representing the geographical distribution of seismic load factors. First, recent research data on historical earthquakes and materials on active faults in Japan have been documented. Differences in character due to different localities are expressed as dynamic loads in consideration of specific building properties. Next, a hazard evaluation corresponding to the seismic-resistance factor is given as a response index (spectrum) of an adequately selected building, for example a nuclear power station, with the help of the results of statistical analysis. (S. Ohno)

  8. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    Science.gov (United States)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with the instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records extending back more than 1000 years and an updated, homogenized, and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude M max, are estimated using a maximum likelihood algorithm that accounts for the incompleteness of the catalog. To compute the hazard values, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years assuming bedrock conditions. The resulting PGA and SA maps show a significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of the historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
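    One of the seismicity parameters mentioned above, the Gutenberg-Richter b value, is commonly estimated with the Aki-Utsu maximum-likelihood formula; the sketch below applies it to a synthetic, binned catalog. The handling of incomplete historical records described in the abstract is a separate extension that this simple sketch does not attempt.

```python
# Aki (1965) maximum-likelihood b-value with Utsu's half-bin correction and the usual
# b/sqrt(N) uncertainty, demonstrated on a synthetic, binned catalog.
import numpy as np

rng = np.random.default_rng(7)

# Synthetic catalog: continuous exponential magnitudes above completeness, binned to 0.1 units.
true_b, m_c, dm = 1.0, 4.0, 0.1
beta = true_b * np.log(10.0)
mags = (m_c - dm / 2.0) + rng.exponential(scale=1.0 / beta, size=2000)
mags = np.round(mags / dm) * dm                        # magnitudes reported in dm bins (lowest bin = m_c)

def b_value_mle(m, m_c, dm):
    """Aki maximum-likelihood b-value for binned magnitudes >= m_c."""
    m = np.asarray(m, dtype=float)
    m = m[m >= m_c]
    mean_excess = m.mean() - (m_c - dm / 2.0)          # Utsu half-bin correction
    b = 1.0 / (np.log(10.0) * mean_excess)
    return b, b / np.sqrt(m.size)                      # (b, approximate 1-sigma uncertainty)

b, db = b_value_mle(mags, m_c, dm)
print(f"estimated b-value = {b:.2f} +/- {db:.2f} (true value used to generate data: {true_b})")
```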

  9. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    Science.gov (United States)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). Prospect

  10. Cascading hazards: Understanding triggering relations between wet tropical cyclones, landslides, and earthquakes

    Science.gov (United States)

    Wdowinski, S.; Peng, Z.; Ferrier, K.; Lin, C. H.; Hsu, Y. J.; Shyu, J. B. H.

    2017-12-01

    Earthquakes, landslides, and tropical cyclones are extreme hazards that pose significant threats to human life and property. Some of the couplings between these hazards are well known. For example, sudden, widespread landsliding can be triggered by large earthquakes and by extreme rainfall events like tropical cyclones. Recent studies have also shown that earthquakes can be triggered by erosional unloading over 100-year timescales. In a NASA-supported project titled "Cascading hazards: Understanding triggering relations between wet tropical cyclones, landslides, and earthquakes", we study triggering relations between these hazard types. The project focuses on such triggering relations in Taiwan, which is subjected to very wet tropical storms, landslides, and earthquakes. One example of such triggering relations is the 2009 Morakot typhoon, which was the wettest recorded typhoon in Taiwan (2850 mm of rain in 100 hours). The typhoon caused widespread flooding and triggered more than 20,000 landslides, including the devastating Hsiaolin landslide. Six months later, the same area was hit by the 2010 M=6.4 Jiashian earthquake near Kaohsiung city, which added to the infrastructure damage induced by the typhoon and the landslides. Preliminary analysis of temporal relations between main-shock earthquakes and the six wettest typhoons in Taiwan's past 50 years reveals similar temporal relations between M≥5 events and wet typhoons. Future work in the project will include remote sensing analysis of landsliding, seismic and geodetic monitoring of landslides, detection of microseismicity and tremor activities, and mechanical modeling of crustal stress changes due to surface unloading.

  11. How to eliminate non-damaging earthquakes from the results of a probabilistic seismic hazard analysis (PSHA)-A comprehensive procedure with site-specific application

    International Nuclear Information System (INIS)

    Kluegel, Jens-Uwe

    2009-01-01

    The results of probabilistic seismic hazard analyses are frequently presented in terms of uniform hazard spectra or hazard curves with spectral accelerations as the output parameter. The calculation process is based on the evaluation of the probability of exceedance of specified acceleration levels without consideration of the damaging effects of the causative earthquakes. The same applies to the empirical attenuation equations for spectral accelerations used in PSHA models. This makes interpreting and using the results in engineering or risk applications difficult. Uniform hazard spectra and the associated hazard curves may contain a significant amount of contributions of weak, low-energy earthquakes not able to damage the seismically designed structures of nuclear power plants. For the development of realistic engineering designs and for realistic seismic probabilistic risk assessments (seismic PRA) it is necessary to remove the contribution of non-damaging earthquakes from the results of a PSHA. A detailed procedure for the elimination of non-damaging earthquakes based on the CAV (Cumulative Absolute Velocity)-filtering approach was developed and applied to the results of the large-scale PEGASOS probabilistic seismic hazard study for the site of the Goesgen nuclear power plant. The procedure considers the full scope of epistemic uncertainty and aleatory variability present in the PEGASOS study. It involves the development of a set of empirical correlations for CAV and the subsequent development of a composite distribution for the probability of exceedance of the damaging threshold of 0.16 g-s. Additionally, a method was developed to measure the difference in the damaging effects of earthquakes of different strengths by the ratio of a power function of ARIAS-intensity or, in the ideal case, by the ratio of the square roots of the associated strong motion durations. The procedure was applied for the update of the Goesgen seismic PRA and for the confirmation of a
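    For readers unfamiliar with the CAV quantity used as the filter variable, the sketch below computes it for a synthetic accelerogram and compares it with the 0.16 g-s threshold. The plain definition used here (the integral of the absolute acceleration over the whole record) is a simplification; standardized CAV in engineering practice only accumulates over 1-second windows whose peak exceeds 0.025 g, and neither the synthetic record nor this code is part of the referenced study.

```python
# Plain (non-standardized) CAV of a synthetic accelerogram, compared against the
# 0.16 g-s damage threshold mentioned in the abstract.
import numpy as np

rng = np.random.default_rng(3)

dt = 0.01                                     # s, sample interval
t = np.arange(0.0, 20.0, dt)
# Toy accelerogram in units of g: noise modulated by a Gaussian envelope.
envelope = np.exp(-0.5 * ((t - 5.0) / 2.0) ** 2)
accel_g = 0.05 * envelope * rng.standard_normal(t.size)

def cav(accel_g, dt):
    """Cumulative absolute velocity in g-s: integral of |a(t)| dt."""
    return np.sum(np.abs(accel_g)) * dt

value = cav(accel_g, dt)
status = "at or above" if value >= 0.16 else "below"
print(f"CAV = {value:.3f} g-s ({status} the 0.16 g-s damage threshold)")
```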

  12. Roaming earthquakes in China highlight midcontinental hazards

    Science.gov (United States)

    Liu, Mian; Wang, Hui

    2012-11-01

    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  13. Earthquake hazard zonation using peak ground acceleration (PGA) approach

    International Nuclear Information System (INIS)

    Irwansyah, E; Winarko, E; Rasjid, Z E; Bekti, R D

    2013-01-01

    The objective of this research is to develop seismic hazard zones for the building infrastructure of Banda Aceh City, Indonesia, using peak ground acceleration (PGA) computed from global and local attenuation functions. PGA is calculated using an attenuation function that describes the relation between local ground motion intensity, earthquake magnitude, and distance from the earthquake's epicentre. The data used come from the earthquake damage catalogue available from the Indonesian meteorology, climatology and geophysics agency (BMKG), covering the years 1973 - 2011. The research methodology consists of six steps: developing the grid, calculating the distance from the epicentre to the centroid of each grid cell, calculating the PGA values, developing the computer application, plotting the PGA values at the grid centroids, and delineating the earthquake hazard zones using a kriging algorithm. The conclusion of this research is that the global attenuation function developed by [20] can be applied to calculate the PGA values in the city of Banda Aceh. At the micro scale, Banda Aceh city can be divided into three hazard zones: a low hazard zone with PGA values of 0.8767 up to 0.8780 gals, a medium hazard zone with PGA values of 0.8781 up to 0.8793 gals, and a high hazard zone with PGA values of 0.8794 up to 0.8806 gals.
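    The grid/PGA part of this workflow can be sketched as below. The attenuation coefficients are placeholders (the function of reference [20] is not reproduced), the catalog entries and grid are synthetic, and the final kriging step is only indicated in a comment.

```python
# Schematic grid-PGA workflow: for each grid centroid, compute the PGA produced by each
# catalog event via a placeholder attenuation function and keep the maximum value.
import numpy as np

# Hypothetical earthquake catalog entries: (lat, lon, magnitude).
catalog = np.array([
    (4.90, 95.20, 6.1),
    (5.40, 94.80, 5.6),
    (5.10, 95.60, 6.4),
])

# Grid of cell centroids over a small area (roughly a city-scale latitude/longitude span).
lats = np.linspace(5.50, 5.60, 11)
lons = np.linspace(95.28, 95.38, 11)
grid_lat, grid_lon = np.meshgrid(lats, lons, indexing="ij")

def epicentral_distance_km(lat1, lon1, lat2, lon2):
    """Equirectangular distance approximation in kilometres (adequate at city scale)."""
    mean_lat = np.radians((lat1 + lat2) / 2.0)
    dx = np.radians(lon2 - lon1) * np.cos(mean_lat) * 6371.0
    dy = np.radians(lat2 - lat1) * 6371.0
    return np.hypot(dx, dy)

def pga_gal(magnitude, dist_km):
    """Placeholder attenuation: log10(PGA[gal]) = c0 + c1*M - c2*log10(R + 10)."""
    return 10.0 ** (0.1 + 0.3 * magnitude - 1.0 * np.log10(dist_km + 10.0))

# For each centroid keep the largest PGA produced by any catalog event.
pga_grid = np.zeros_like(grid_lat)
for ev_lat, ev_lon, mag in catalog:
    r = epicentral_distance_km(ev_lat, ev_lon, grid_lat, grid_lon)
    pga_grid = np.maximum(pga_grid, pga_gal(mag, r))

print("max PGA on grid: %.3f gal, min: %.3f gal" % (pga_grid.max(), pga_grid.min()))
# A subsequent kriging step (e.g., with pykrige) would interpolate these centroid values
# into continuous hazard zones.
```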

  14. Deterministic Earthquake Hazard Assessment by Public Agencies in California

    Science.gov (United States)

    Mualchin, L.

    2005-12-01

    Even in its short recorded history, California has experienced a number of damaging earthquakes that have resulted in new codes and other legislation for public safety. In particular, the 1971 San Fernando earthquake produced some of the most lasting results such as the Hospital Safety Act, the Strong Motion Instrumentation Program, the Alquist-Priolo Special Studies Zone Act, and the California Department of Transportation (Caltrans') fault-based deterministic seismic hazard (DSH) map. The latter product provides values for earthquake ground motions based on Maximum Credible Earthquakes (MCEs), defined as the largest earthquakes that can reasonably be expected on faults in the current tectonic regime. For surface fault rupture displacement hazards, detailed study of the same faults apply. Originally, hospital, dam, and other critical facilities used seismic design criteria based on deterministic seismic hazard analyses (DSHA). However, probabilistic methods grew and took hold by introducing earthquake design criteria based on time factors and quantifying "uncertainties", by procedures such as logic trees. These probabilistic seismic hazard analyses (PSHA) ignored the DSH approach. Some agencies were influenced to adopt only the PSHA method. However, deficiencies in the PSHA method are becoming recognized, and the use of the method is now becoming a focus of strong debate. Caltrans is in the process of producing the fourth edition of its DSH map. The reason for preferring the DSH method is that Caltrans believes it is more realistic than the probabilistic method for assessing earthquake hazards that may affect critical facilities, and is the best available method for insuring public safety. Its time-invariant values help to produce robust design criteria that are soundly based on physical evidence. And it is the method for which there is the least opportunity for unwelcome surprises.

  15. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  16. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Science.gov (United States)

    2010-08-17

    ... accommodate Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov... of Technology, 365 Innovation Drive, Memphis, TN 38152-3115. Please note admittance instructions...: Trends and developments in the science and engineering of earthquake hazards reduction; The effectiveness...

  17. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    Science.gov (United States)

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near‐surface shear‐wave velocity model in a 1D equivalent‐linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm‐rock‐site condition, the new probabilistic seismic‐hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess‐covered till and drift deposits) show up to twice the ground‐motion values for peak ground acceleration (PGA), and similar ground‐motion values for 1.0 s spectral acceleration (SA). Probabilistic ground‐motion levels for lowland alluvial floodplain sites (generally the 20–40‐m‐thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground‐motion levels for PGA, and up to three times the ground‐motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%‐in‐50‐year probabilistic ground‐shaking model. The liquefaction hazard ranges from low in the uplands to high in the lowlands (60% of the area expected to liquefy). Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated

  18. The 2016 Kumamoto Earthquakes: Cascading Geological Hazards and Compounding Risks

    Directory of Open Access Journals (Sweden)

    Katsuichiro Goda

    2016-08-01

    Full Text Available A sequence of two strike-slip earthquakes occurred on 14 and 16 April 2016 in the intraplate region of Kyushu Island, Japan, away from subduction zones, and caused significant damage and disruption to the Kumamoto region. Analyses of the regional seismic catalog and available strong-motion recordings reveal striking characteristics of the events, such as migrating seismicity, earthquake surface rupture, and major foreshock-mainshock earthquake sequences. To gain valuable lessons from the events, a UK Earthquake Engineering Field Investigation Team (EEFIT) was dispatched to Kumamoto, and earthquake damage surveys were conducted to relate observed earthquake characteristics to the building and infrastructure damage caused by the earthquakes. The lessons learnt from the reconnaissance mission have important implications for current seismic design practice regarding the required seismic resistance of structures under multiple shocks and the seismic design of infrastructure subject to large ground deformation. The observations also highlight the consequences of cascading geological hazards on community resilience. To share the gathered damage data widely, geo-tagged photos are organized using Google Earth and the kmz file is made publicly available.

  19. Earthquake induced landslide hazard field observatory in the Avcilar peninsula

    Science.gov (United States)

    Bigarre, Pascal; Coccia, Stella; Theoleyre, Fiona; Ergintav, Semih; Özel, Oguz; Yalçinkaya, Esref; Lenti, Luca; Martino, Salvatore; Gamba, Paolo; Zucca, Francesco; Moro, Marco

    2015-04-01

    Earthquake-triggered landslides have an increasingly disastrous impact in seismic regions due to fast-growing urbanization and infrastructure. Considering only disasters from the last fifteen years, among them the 1999 Chi-Chi earthquake, the 2008 Wenchuan earthquake, and the 2011 Tohoku earthquake, these events generated tens of thousands of coseismic landslides. Those resulted in staggering death tolls and considerable damage, affecting the regional landscape including its main hydrological features. Despite a strong impetus in research during past decades, knowledge of these geohazards is still fragmentary, and databases of high-quality observational data are lacking. These phenomena call for further collaborative research aiming eventually to enhance preparedness and crisis management. The MARSITE project gathers research groups in a comprehensive monitoring activity developed in the Sea of Marmara Region, one of the most densely populated parts of Europe, rated at a high seismic risk level since the devastating 1999 Izmit and Duzce earthquakes. Besides the seismic threat, landslides in Turkey and in this region constitute an important source of loss. The 6th Work Package of the MARSITE project gathers 9 research groups to study earthquake-induced landslides, focusing on two sub-regional areas of high interest, among which is the Cekmece-Avcilar peninsula, located westward of Istanbul: a highly urbanized, landslide-prone area showing high susceptibility to rainfall while being affected by very significant seismic site effects. A multidisciplinary research program based on pre-existing studies has been designed, with objectives and tasks intended to constrain and progressively tackle challenging issues related to data integration, modeling, monitoring, and mapping technologies. Since the start of the project, progress has been made on several important points, as follows. The photogeological interpretation and analysis of ENVISAT-ERS DIn

  20. Job Hazard Analysis

    National Research Council Canada - National Science Library

    1998-01-01

    .... Establishing proper job procedures is one of the benefits of conducting a job hazard analysis carefully studying and recording each step of a job, identifying existing or potential job hazards...

  1. Monitoring Geologic Hazards and Vegetation Recovery in the Wenchuan Earthquake Region Using Aerial Photography

    Directory of Open Access Journals (Sweden)

    Zhenwang Li

    2014-03-01

    Full Text Available On 12 May 2008, the 8.0-magnitude Wenchuan earthquake occurred in Sichuan Province, China, triggering thousands of landslides, debris flows, and barrier lakes, leading to a substantial loss of life and damage to the local environment and infrastructure. This study aimed to monitor the status of geologic hazards and vegetation recovery in the post-earthquake disaster area using high-resolution aerial photography from 2008 to 2011, acquired from the Center for Earth Observation and Digital Earth (CEODE), Chinese Academy of Sciences. The distribution and extent of hazards were identified in 15 large, representative geologic hazard areas triggered by the Wenchuan earthquake. After conducting an overlay analysis, the variations of these hazards between successive years were analyzed to reflect the geologic hazard development and vegetation recovery. The results showed that in the first year after the Wenchuan earthquake, debris flows occurred frequently and with high intensity. Subsequently, as the source material became less available and the slope structure stabilized, the intensity and frequency of debris flows gradually decreased with time. The development rate of debris flows between 2008 and 2011 was 3% per year. The lithology played a dominant role in the formation of debris flows, and the topography and hazard size in the earthquake-affected area also influenced the debris flow development process. Meanwhile, the overall geologic hazard area decreased at 12% per year, and the vegetation recovery on the landslide masses was 15% to 20% per year between 2008 and 2011. The outcomes of this study provide supporting data for ecological recovery as well as for debris flow control and prevention projects in hazard-prone areas.
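    The year-to-year overlay analysis described above reduces, in its simplest form, to polygon intersection and differencing; the sketch below illustrates it with two invented hazard-area polygons using shapely. The coordinates and the annual-rate arithmetic are purely illustrative.

```python
# Overlay analysis between two mapped extents of one hazard area: persistent, recovered,
# and newly developed areas, plus an average annual rate of change.  Polygons are invented.
from shapely.geometry import Polygon

# Hypothetical mapped extents of one debris-flow hazard area (coordinates in km).
hazard_2008 = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])
hazard_2011 = Polygon([(0.5, 0.3), (3.4, 0.3), (3.4, 2.6), (0.5, 2.6)])

overlap = hazard_2008.intersection(hazard_2011)
recovered = hazard_2008.difference(hazard_2011)          # area no longer mapped as hazard
new_hazard = hazard_2011.difference(hazard_2008)         # newly developed hazard area

years = 2011 - 2008
annual_change = (hazard_2011.area - hazard_2008.area) / hazard_2008.area / years

print(f"2008 area: {hazard_2008.area:.1f} km^2, 2011 area: {hazard_2011.area:.1f} km^2")
print(f"persistent: {overlap.area:.1f} km^2, recovered: {recovered.area:.1f} km^2, new: {new_hazard.area:.1f} km^2")
print(f"average change in hazard area: {annual_change:+.1%} per year")
```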

  2. Original earthquake design basis in light of recent seismic hazard studies

    International Nuclear Information System (INIS)

    Petrovski, D.

    1993-01-01

    For the purpose of outlining the framework within which efforts have been made in the eastern countries to construct earthquake-resistant nuclear power plants, a review of the development and application of the seismic zoning map of the USSR is given. The normative values of seismic intensity and acceleration are discussed in light of recent probabilistic seismic hazard studies. To that end, the methodology of probabilistic seismic hazard analysis is briefly presented in this paper. (author)

  3. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3,4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  4. Integrating Real-time Earthquakes into Natural Hazard Courses

    Science.gov (United States)

    Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.

    2001-12-01

    Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made incorporating such real-time activities into the classroom problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response are lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits the tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). These students are responsible for reacting to the alarm

  5. Awareness and understanding of earthquake hazards at school

    Science.gov (United States)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of education activities we performed during the last years are presented here. We describe our experience with primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to kids, and we also illustrate some teaching interventions for high school students. During the past years we lectured to classes, led laboratory and field activities, and organized summer internships for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers, and experts in safety procedures. To combine the objective of disseminating an earthquake culture, also through knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab using cheap tools and instrumentation. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze the recorded data. Within the same project we are going to train

  6. Hazard-consistent response spectra in the Region of Murcia (Southeast Spain): comparison to earthquake-resistant provisions

    OpenAIRE

    Gaspar Escribano, Jorge M.; Benito Oterino, Belen; Garcia Mayordomo, Julian

    2008-01-01

    Hazard-consistent ground-motion characterisations of three representative sites located in the Region of Murcia (southeast Spain) are presented. This is the area where the last three damaging events in Spain occurred and there is a significant amount of data for comparing them with seismic hazard estimates and earthquake-resistant provisions. Results of a probabilistic seismic hazard analysis are used to derive uniform hazard spectra (UHS) for the 475-year return period, on rock and soil cond...

  7. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  8. K Basin Hazard Analysis

    International Nuclear Information System (INIS)

    PECH, S.H.

    2000-01-01

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  9. K Basin Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  10. K Basins Hazard Analysis

    International Nuclear Information System (INIS)

    WEBB, R.H.

    1999-01-01

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  11. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    Science.gov (United States)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, but without specifying when or within what time intervals the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  12. Recent research in earth structure, earthquake and mine seismology, and seismic hazard evaluation in South Africa

    CSIR Research Space (South Africa)

    Wright, C

    2003-07-01

    Full Text Available A review of earthquakes, earthquake hazard, and earth structure in South Africa was prepared for the centennial handbook of the International Association of Seismology and the Physics of the Earth's Interior (IASPEI). References to theses completed in the last four...

  13. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  14. How Can Museum Exhibits Enhance Earthquake and Tsunami Hazard Resiliency?

    Science.gov (United States)

    Olds, S. E.

    2015-12-01

    Creating a natural disaster-ready community requires interoperating scientific, technical, and social systems. In addition to the technical elements that need to be in place, communities and individuals need to be prepared to react when a natural hazard event occurs. Natural hazard awareness and preparedness training and education often take place through informal learning at science centers and formal K-12 education programs, as well as through awareness-raising via strategically placed informational tsunami warning signs and placards. Museums and science centers are influential in raising science literacy within a community; however, can science centers enhance earthquake and tsunami resiliency by providing hazard science content and preparedness exhibits? Museum docents and informal educators are uniquely situated within the community. They are transmitters and translators of science information to broad audiences. Through interaction with the public, docents are well positioned to be informants of the knowledge, beliefs, and feelings of science center visitors. They are themselves life-long learners, constantly learning from the museum content around them and sharing it with visitors, and they are also members of the communities where they live. In-depth interviews with museum informal educators and docents were conducted at a science center in the coastal Pacific Northwest. This region has the potential to be struck by a great Mw 9+ earthquake and subsequent tsunami. During the interviews, docents described how they applied learning from natural hazard exhibits at a science visitor center to their daily lives. During the individual interviews, the museum docents described their awareness (knowledge, attitudes, and behaviors) of natural hazards where they live and work, the feelings evoked as they learned about their hazard vulnerability, the extent to which they applied this learning and awareness to their lives, such as creating an evacuation plan, whether

  15. Development of seismic hazard analysis in Japan

    International Nuclear Information System (INIS)

    Itoh, T.; Ishii, K.; Ishikawa, Y.; Okumura, T.

    1987-01-01

    In recent years, seismic risk assessments of nuclear power plants have been conducted increasingly in various countries, particularly in the United States, to probabilistically evaluate the safety of existing plants under earthquake loading. The first step of a seismic risk assessment is the seismic hazard analysis, in which the relationship between the maximum earthquake ground motions at the plant site and their annual probability of exceedance, i.e. the seismic hazard curve, is estimated. In this paper, seismic hazard curves are evaluated and examined for several different sites in Japan based on a historical earthquake record model in which seismic sources are modeled as area sources. A new evaluation method is also proposed to compute the response spectra of the earthquake ground motions in connection with estimating the probabilistic structural response. Finally, the numerical results of a probabilistic risk assessment for a base-isolated three-story RC structure, in which the frequency of seismically induced structural failure is evaluated in combination with the seismic hazard analysis, are described briefly.
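
    The seismic hazard curve mentioned in this record relates a ground motion level to its annual probability of exceedance. The Python sketch below assembles such a curve for an idealized area source under a Poisson assumption, with a truncated Gutenberg-Richter recurrence model, a generic attenuation relation, and lognormal scatter; all parameters are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

# Hedged area-source hazard-curve sketch: annual rate of exceeding several PGA levels,
# combining a truncated Gutenberg-Richter recurrence model, uniformly distributed distances,
# a generic attenuation relation, and lognormal ground-motion scatter. All parameters are
# illustrative, not the paper's model for Japanese sites.
rate_m5 = 0.2                           # annual rate of M >= 5.0 in the area source
b_value = 1.0
mags = np.arange(5.0, 8.0, 0.1)         # magnitude bins of width 0.1
dists = np.linspace(10.0, 100.0, 30)    # site-to-source distances (km), equally likely
sigma_ln = 0.6                          # ln-scatter of the attenuation relation
pga_levels = np.array([0.05, 0.1, 0.2, 0.3, 0.5])   # g

def median_pga(m, r):                   # illustrative attenuation relation
    return 10 ** (-1.8 + 0.35 * m - 1.2 * np.log10(r + 10.0))

# incremental annual rate of each magnitude bin from the Gutenberg-Richter law
bin_rate = rate_m5 * (10 ** (-b_value * (mags - 5.0)) - 10 ** (-b_value * (mags + 0.1 - 5.0)))

rate_exceed = np.zeros_like(pga_levels)
for m, rm in zip(mags, bin_rate):
    for r in dists:
        p_exc = 1.0 - norm.cdf(np.log(pga_levels), np.log(median_pga(m, r)), sigma_ln)
        rate_exceed += (rm / len(dists)) * p_exc

annual_prob = 1.0 - np.exp(-rate_exceed)   # Poisson annual probability of exceedance
for lvl, p in zip(pga_levels, annual_prob):
    print(f"PGA >= {lvl:.2f} g: annual exceedance probability ~ {p:.4f}")
```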

  16. Dynamic evaluation of seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    Science.gov (United States)

    Kossobokov, V. G.; Nekrasova, A.

    2016-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A + B•(6 - M) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L, A characterizes the average annual rate of strong (M = 6) earthquakes, B determines the balance between magnitude ranges, and C estimates the fractal dimension of the seismic locus in projection onto the Earth's surface. The parameters A, B, and C of USLE are used to assess, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground shaking parameters. After rigorous testing against the available seismic evidence from the past (e.g., the historically reported macro-seismic intensity or paleo data), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure. The hazard maps for a given territory change dramatically when the methodology is applied to a moving time window of a certain size, e.g. about a decade long for an intermediate-term regional assessment or exponentially increasing intervals for daily local strong aftershock forecasting. The methodology of dynamical seismic hazard and risk assessment is illustrated by applications to the territory of the Greater Caucasus and Crimea and to the two-year series of aftershocks of the 11 October 2008 Kurchaloy, Chechnya earthquake, whose case history appears encouraging for further systematic testing as a potential short-term forecasting tool.
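
    The USLE relation quoted above can be evaluated directly. A minimal Python sketch follows, with made-up coefficient values (not estimates for any particular region), showing how the expected annual number of earthquakes scales with magnitude and with the linear dimension L of the cell considered.

```python
import math

# Hedged sketch: direct evaluation of the USLE relation quoted above,
# log10 N(M, L) = A + B*(6 - M) + C*log10(L). The coefficient values are made up for
# illustration and are not estimates for any particular region.
def usle_annual_number(M, L_km, A=-2.5, B=0.9, C=1.2):
    return 10 ** (A + B * (6.0 - M) + C * math.log10(L_km))

for M in (5.0, 6.0, 7.0):
    n = usle_annual_number(M, L_km=50.0)
    print(f"M = {M}: ~{n:.3f} expected events per year in a 50 km cell")
```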

  17. Disaggregated seismic hazard and the elastic input energy spectrum: An approach to design earthquake selection

    Science.gov (United States)

    Chapman, Martin Colby

    1998-12-01

    The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions for the analysis may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity (V_ea) and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon NEHRP site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression
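
    As a hedged illustration of the regression described above (fitting identical functional forms to two ground-motion parameters), the sketch below fits the same attenuation-style model to synthetic stand-ins for V_ea and PSV by ordinary least squares; the functional form and the data are assumptions for illustration, not the study's dataset or model.

```python
import numpy as np

# Hedged sketch: fit the same attenuation-style functional form to two ground-motion
# parameters (stand-ins for V_ea and PSV). The form log10(Y) = c0 + c1*M + c2*log10(R) + c3*S
# and the synthetic data are illustrative assumptions, not the study's dataset or model.
rng = np.random.default_rng(0)
n = 200
M = rng.uniform(5.0, 7.5, n)               # magnitudes
R = rng.uniform(5.0, 150.0, n)             # distances (km)
S = rng.integers(0, 2, n).astype(float)    # crude site flag (0 = rock, 1 = soil)

def synth(c, sigma):                       # synthetic observations of log10(Y)
    return c[0] + c[1] * M + c[2] * np.log10(R) + c[3] * S + rng.normal(0.0, sigma, n)

log_vea = synth((-2.0, 0.55, -1.0, 0.2), 0.25)
log_psv = synth((-2.2, 0.60, -1.0, 0.2), 0.30)

X = np.column_stack([np.ones(n), M, np.log10(R), S])   # identical design matrix for both
for name, y in [("V_ea", log_vea), ("PSV", log_psv)]:
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(name, "coefficients:", np.round(coef, 3),
          "residual std:", round(float(np.std(y - X @ coef)), 3))
```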

  18. Basic earthquake engineering from seismology to analysis and design

    CERN Document Server

    Sucuoğlu, Halûk

    2014-01-01

    This book provides senior undergraduate students, master students and structural engineers who do not have a background in the field with core knowledge of structural earthquake engineering that will be invaluable in their professional lives. The basics of seismotectonics, including the causes, magnitude, and intensity of earthquakes, are first explained. Then the book introduces basic elements of seismic hazard analysis and presents the concept of a seismic hazard map for use in seismic design. Subsequent chapters cover key aspects of the response analysis of simple systems and building structures to earthquake ground motions, design spectrum, the adoption of seismic analysis procedures in seismic design codes, seismic design principles and seismic design of reinforced concrete structures. Helpful worked examples on seismic analysis of linear, nonlinear and base isolated buildings, earthquake-resistant design of frame and frame-shear wall systems are included, most of which can be solved using a hand calcu...

  19. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Directory of Open Access Journals (Sweden)

    C. H. Nelson

    2012-11-01

    Full Text Available We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are in (Cascadia) or very close (San Andreas) to the early window for another great earthquake.

    On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins
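
    The H method described in this record reduces to simple arithmetic: the time between two successive earthquake-triggered turbidites is the thickness of hemipelagic sediment deposited between them divided by the hemipelagic sedimentation rate. A minimal sketch with made-up values follows.

```python
# Minimal sketch of the hemipelagic (H) method: recurrence time between successive
# earthquake-triggered turbidites = hemipelagic thickness between them / sedimentation rate.
# The thicknesses and rate below are made-up illustration values, not data from the paper.
hemipelagic_thickness_cm = [5.5, 2.1, 6.0, 4.3]   # between successive turbidites in one core
sed_rate_cm_per_kyr = 10.0                        # hemipelagic sedimentation rate

recurrence_yr = [t / sed_rate_cm_per_kyr * 1000.0 for t in hemipelagic_thickness_cm]
print("recurrence times (yr):", recurrence_yr)
print("mean recurrence (yr):", sum(recurrence_yr) / len(recurrence_yr))
```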

  20. Crustal structure and Seismic Hazard studies in Nigeria from ambient noise and earthquakes

    Science.gov (United States)

    Kadiri, U. A.

    2016-12-01

    Studies of the crust, upper mantle, and seismic hazard have been carried out in Nigeria using ambient noise and earthquake data. The data were acquired from stations in Nigeria and from international agencies. Firstly, known depths of sediments in the Lower Benue Trough (LBT) were collected from wells; resonance frequencies (Fo) and average shear-wave velocities (Vs) were then computed using Matlab. Secondly, average velocities were estimated from noise cross-correlation between seismic stations. Thirdly, the Moho depths beneath the Ife, Kaduna, and Nsukka stations were estimated, as well as the Vp/Vs ratio, using a 2009 earthquake with an epicenter in Nigeria. Finally, statistical analysis and Probabilistic Seismic Hazard Assessment (PSHA) were used to compute seismic hazard parameters for Nigeria and its surroundings. The results showed that soils on the LBT, with an average shear-wave velocity of about 5684 m/s, would experience more amplification in the case of an earthquake compared to the basement complex in Nigeria. The Vs beneath the seismic stations in Nigeria were also estimated as 288 m/s, 1019 m/s, 940.6 m/s, and 255.02 m/s at Ife, Nsukka, Awka, and Abakaliki, respectively. The average velocity along the station paths was 4.5 km/s, and Vp and Vs for the 100-500 km depth profile in parts of southwest Nigeria increased from about 5.83 to 6.42 km/s and 3.48 to 6.31 km/s, respectively, with the Vp/Vs ratio decreasing from 1.68 to 1.02. Statistical analysis revealed a trend of increasing earthquake occurrence along the Mid-Atlantic Ridge and extending toward the West African region. The PSHA shows the likelihood of earthquakes with different magnitudes occurring in Nigeria and other parts of West Africa in the future. This work is aimed at addressing critical issues regarding site-effect characterization, improved earthquake location, and robust seismic hazard assessment for planning the choice of sites for critical facilities in Nigeria. Keywords: Sediment thickness, Resonance Frequency, Average Velocity, Seismic Hazard, Nigeria
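
    The record above computes resonance frequencies (Fo) and average shear-wave velocities (Vs) from known sediment depths; a common way to relate these quantities is the quarter-wavelength approximation Fo = Vs / (4H). Whether this is the exact relation used in the study is an assumption, and the sketch below uses made-up well values.

```python
# Hedged sketch: the quarter-wavelength approximation Fo = Vs / (4 * H) is one common way
# to relate sediment thickness (H), average shear-wave velocity (Vs), and the fundamental
# resonance frequency (Fo). The relation is assumed here, and the well values are made up.
wells = {"well_1": (350.0, 600.0),   # (sediment thickness H in m, average Vs in m/s)
         "well_2": (120.0, 450.0)}

for name, (H, Vs) in wells.items():
    Fo = Vs / (4.0 * H)
    print(f"{name}: H = {H:.0f} m, Vs = {Vs:.0f} m/s, Fo = {Fo:.2f} Hz")
```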

  1. Geological and Seismological Analysis of the 13 February 2001 Mw 6.6 El Salvador Earthquake: Evidence for Surface Rupture and Implications for Seismic Hazard

    OpenAIRE

    Canora Catalán, Carolina; Martínez Díaz, José J.; Villamor Pérez, María Pilar; Berryman, K.R.; Álvarez Gómez, José Antonio; Pullinger, Carlos; Capote del Villar, Ramón

    2010-01-01

    The El Salvador earthquake of 13 February 2001 (Mw 6.6) caused tectonic rupture on the El Salvador fault zone (ESFZ). Right-lateral strike-slip surface rupture of the east–west trending fault zone had a maximum surface displacement of 0.60 m. No vertical component was observed. The earthquake resulted in widespread landslides in the epicentral area, where bedrock is composed of volcanic sediments, tephra, and weak ignimbrites. In the aftermath of the earthquake, widespread dama...

  2. Application of High Performance Computing to Earthquake Hazard and Disaster Estimation in Urban Area

    Directory of Open Access Journals (Sweden)

    Muneo Hori

    2018-02-01

    Full Text Available Integrated earthquake simulation (IES) is a seamless simulation that analyzes all processes of earthquake hazard and disaster. There are two difficulties in carrying out IES, namely the requirement of large-scale computation and the requirement of numerous analysis models for structures in an urban area; they are resolved by taking advantage of high performance computing (HPC) and by developing a system of automated model construction. HPC is a key element in developing IES, as IES needs to analyze wave propagation and amplification processes in an underground structure; a high-fidelity model of the underground structure exceeds 100 billion degrees of freedom. Examples of IES for the Tokyo Metropolis are presented; the numerical computation is made using the K computer, the supercomputer of Japan. The estimation of earthquake hazard and disaster for a given earthquake scenario is made by the ground motion simulation and the urban area seismic response simulation, respectively, for a target area of 10,000 m × 10,000 m.

  3. Coseismic and postseismic deformation associated with the 2016 Mw 7.8 Kaikoura earthquake, New Zealand: fault movement investigation and seismic hazard analysis

    Science.gov (United States)

    Jiang, Zhongshan; Huang, Dingfa; Yuan, Linguo; Hassan, Abubakr; Zhang, Lupeng; Yang, Zhongrong

    2018-04-01

    The 2016 moment magnitude (Mw) 7.8 Kaikoura earthquake demonstrated that multiple fault segments can undergo rupture during a single seismic event. Here, we employ Global Positioning System (GPS) observations and geodetic modeling methods to create detailed images of coseismic slip and postseismic afterslip associated with the Kaikoura earthquake. Our optimal geodetic coseismic model suggests that rupture not only occurred on shallow crustal faults but also, to some extent, at the Hikurangi subduction interface. The GPS-inverted moment release during the earthquake is equivalent to a Mw 7.9 event. The near-field postseismic deformation is mainly derived from right-lateral strike-slip motions on shallow crustal faults. The afterslip not only extended significantly northeastward on the Needles fault but also appeared at the plate interface, slowly releasing energy over the past 6 months, equivalent to a Mw 7.3 earthquake. Coulomb stress changes induced by coseismic deformation exhibit complex patterns and diversity at different depths, undoubtedly reflecting the multi-fault rupture complexity associated with the earthquake. The Coulomb stress change can reach several MPa during coseismic deformation, which can explain the triggering mechanisms of afterslip in two high-slip regions and the majority of aftershocks. Based on the deformation characteristics of the Kaikoura earthquake, interseismic plate coupling, and historical earthquakes, we conclude that Wellington is under higher seismic threat after the earthquake and that great attention should be paid to potential large earthquake disasters in the near future.
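
    The Coulomb stress changes discussed above are conventionally computed as dCFS = d_tau + mu_eff * d_sigma_n, resolved on receiver faults. The sketch below evaluates this expression for a couple of hypothetical receiver patches; the friction coefficient and stress values are illustrative, not the paper's results.

```python
# Hedged sketch of the Coulomb failure stress change referred to above:
#   dCFS = d_tau + mu_eff * d_sigma_n
# where d_tau is the shear stress change resolved in the slip direction of the receiver
# fault, d_sigma_n is the normal stress change (positive = unclamping), and mu_eff is the
# effective friction coefficient. The numbers below are illustrative, not the paper's results.
def coulomb_stress_change(d_tau_mpa, d_sigma_n_mpa, mu_eff=0.4):
    return d_tau_mpa + mu_eff * d_sigma_n_mpa

receivers = {
    "afterslip patch A": (1.5, 0.8),    # (d_tau, d_sigma_n) in MPa, made-up values
    "afterslip patch B": (0.6, -0.2),
}
for name, (dt, dsn) in receivers.items():
    print(name, "dCFS =", coulomb_stress_change(dt, dsn), "MPa")
```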

  4. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    Science.gov (United States)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers, and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science is responsible for not coping with the challenging changes of Exposure and Vulnerability inflicted by growing population, its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe Society for this lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A - B•(M-6) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of USLE are used to estimate, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground shaking parameters, including macro-seismic intensity. After rigorous testing against the available seismic evidence from the past (e.g., the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on USLE is illustrated by application to the seismic region of the Greater Caucasus.

  5. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  6. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Science.gov (United States)

    Nelson, C. H.; Gutiérrez Pastor, J.; Goldfinger, C.; Escutia, C.

    2012-11-01

    We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are in (Cascadia) or very close (San Andreas) to the early window for another great earthquake. On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins cause seismic strengthening of the sediment, which

  7. Echo-sounding method aids earthquake hazard studies

    Science.gov (United States)

    ,

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasizes the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  8. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    Science.gov (United States)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), 1934 Bihar-Nepal earthquake (Mw 8.2), 1950 Assam earthquake (Mw 8.4), 2005 Kashmir (Mw 7.6), and 2015 Gorkha (Mw 7.8) earthquakes, is testimony to the ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return periods. Some of the large magnitude earthquakes produced surface rupture, while some remained blind. Furthermore, due to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, there is certainly a difficulty in precisely determining the extent of surface rupture of these earthquakes, and also of those events that occurred during historic times. In this paper, we have compiled the paleoseismological data and recalibrated the radiocarbon ages from the trenches excavated by previous workers along the entire Himalaya and compared the resulting earthquake scenario with the past record. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting Mw to 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. It has been identified that the large magnitude Himalayan earthquakes, such as 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam, occurred within a time frame of 45 years. Now, if these events are dated, there is a high possibility that, within a range of ±50 years, they may be considered as the remnants of one giant earthquake rupturing the entire Himalayan arc, therefore leading to an overestimation of the seismic hazard scenario in the Himalaya.

  9. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    Science.gov (United States)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes is then followed by quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence made in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and it can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large earthquake cluster, an increasing hazard associated with the next large earthquake cluster, and a constant hazard representing the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals has been calculated for illustrative purposes.
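
    The three-part hazard function described in this abstract can be illustrated with a toy composite rate: a decaying term for the last cluster, a growing term for the next, and a constant background. The functional forms and parameters below are made up for illustration; the paper's actual model is not reproduced.

```python
import math

# Hedged sketch of a three-part, time-dependent hazard rate of the kind described above:
# a decaying contribution from the last large-earthquake cluster, an increasing contribution
# as the next cluster approaches, and a constant background rate for small-to-moderate
# events. The exponential/quadratic forms and all parameters are illustrative assumptions.
def hazard_rate(t_yr, last=0.02, decay=0.01, next_scale=2e-7, background=0.005):
    decreasing = last * math.exp(-decay * t_yr)   # fades after the last cluster
    increasing = next_scale * t_yr ** 2           # grows toward the next cluster
    return decreasing + increasing + background

for t in (0, 50, 100, 200, 500):
    print(f"t = {t:4d} yr: hazard rate ~ {hazard_rate(t):.4f} per yr")
```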

  10. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    Full Text Available A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. Distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground motion-prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen’s seismic hazard are the events from the West Arabian Shield seismic zone.
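
    The 475-year return period quoted above follows from the usual Poisson conversion between a probability of exceedance p over an exposure time t and the return period T = -t / ln(1 - p); the short sketch below reproduces that arithmetic.

```python
import math

# Under a Poisson assumption, a probability of exceedance p over an exposure time t
# corresponds to a return period T = -t / ln(1 - p). This reproduces the 475-year figure
# quoted above for 10% in 50 years (and ~72 years for 50% in 50 years).
def return_period(p, t_years):
    return -t_years / math.log(1.0 - p)

print(return_period(0.10, 50))   # ~475 yr
print(return_period(0.50, 50))   # ~72 yr
```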

  11. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    Energy Technology Data Exchange (ETDEWEB)

    Wong, I.G.; Green, R.K.; Sun, J.I. [Woodward-Clyde Federal Services, Oakland, CA (United States); Pezzopane, S.K. [Geological Survey, Denver, CO (United States); Abrahamson, N.A. [Abrahamson (Norm A.), Piedmont, CA (United States); Quittmeyer, R.C. [Woodward-Clyde Federal Services, Las Vegas, NV (United States)

    1996-12-31

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (Mw) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to Mw 6 1/4 ± 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to Mw 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and the path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.

  12. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    International Nuclear Information System (INIS)

    Wong, I.G.; Green, R.K.; Sun, J.I.; Pezzopane, S.K.; Abrahamson, N.A.; Quittmeyer, R.C.

    1996-01-01

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (Mw) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to Mw 6 1/4 ± 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to Mw 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and the path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.

  13. Tsunami Hazard Assessment of Coastal South Africa Based on Mega-Earthquakes of Remote Subduction Zones

    Science.gov (United States)

    Kijko, Andrzej; Smit, Ansie; Papadopoulos, Gerassimos A.; Novikova, Tatyana

    2017-11-01

    After the mega-earthquakes and concomitant devastating tsunamis in Sumatra (2004) and Japan (2011), we launched an investigation into the potential risk of tsunami hazard to the coastal cities of South Africa. This paper presents the analysis of the seismic hazard of seismogenic sources that could potentially generate tsunamis, as well as the analysis of the tsunami hazard to coastal areas of South Africa. The subduction zones of Makran, South Sandwich Island, Sumatra, and the Andaman Islands were identified as possible sources of mega-earthquakes and tsunamis that could affect the African coast. Numerical tsunami simulations were used to investigate the realistic and worst-case scenarios that could be generated by these subduction zones. The simulated tsunami amplitudes and run-up heights calculated for the coastal cities of Cape Town, Durban, and Port Elizabeth are relatively small and therefore pose no real risk to the South African coast. However, only distant tsunamigenic sources were considered and the results should therefore be viewed as preliminary.

  14. Tsunami Hazard Assessment of Coastal South Africa Based on Mega-Earthquakes of Remote Subduction Zones

    Science.gov (United States)

    Kijko, Andrzej; Smit, Ansie; Papadopoulos, Gerassimos A.; Novikova, Tatyana

    2018-04-01

    After the mega-earthquakes and concomitant devastating tsunamis in Sumatra (2004) and Japan (2011), we launched an investigation into the potential risk of tsunami hazard to the coastal cities of South Africa. This paper presents the analysis of the seismic hazard of seismogenic sources that could potentially generate tsunamis, as well as the analysis of the tsunami hazard to coastal areas of South Africa. The subduction zones of Makran, South Sandwich Island, Sumatra, and the Andaman Islands were identified as possible sources of mega-earthquakes and tsunamis that could affect the African coast. Numerical tsunami simulations were used to investigate the realistic and worst-case scenarios that could be generated by these subduction zones. The simulated tsunami amplitudes and run-up heights calculated for the coastal cities of Cape Town, Durban, and Port Elizabeth are relatively small and therefore pose no real risk to the South African coast. However, only distant tsunamigenic sources were considered and the results should therefore be viewed as preliminary.

  15. Assessment of earthquake-induced tsunami hazard at a power plant site

    International Nuclear Information System (INIS)

    Ghosh, A.K.

    2008-01-01

    This paper presents a study of the tsunami hazard due to submarine earthquakes at a power plant site on the east coast of India. The paper considers various sources of earthquakes from the tectonic information and records of past earthquakes and tsunamis. A magnitude-frequency relationship for the earthquake occurrence rate and a simplified model for tsunami run-up height, as a function of earthquake magnitude and the distance between the source and the site, have been developed. Finally, considering an equal likelihood of earthquake generation anywhere on each of the faults, the tsunami hazard has been evaluated and presented as a relationship between tsunami height and its mean recurrence interval (MRI). The probability of exceedance of a certain wave height in a given period of time is also presented. These studies will be helpful in making an estimate of the tsunami-induced flooding potential at the site.

  16. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    Science.gov (United States)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis, ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently ongoing initiatives such as a suite of tools for building PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.

  17. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, have shown that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating the ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are by far unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility of efficiently computing synthetic seismograms in complex, laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined, at both national and local scales, the latter considering the 2D and 3D heterogeneities of the medium traversed by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. At the scenario scale, quick parametric studies can be easily

  18. Baseline geophysical data for hazard management in coastal areas in relation to earthquakes and tsunamis

    Digital Repository Service at National Institute of Oceanography (India)

    Murthy, K.S.R.

    is another factor for some of the intraplate earthquakes in the South Indian Shield, which includes the Eastern and Western Continental Margins of India. Baseline geophysical data for hazard management in coastal areas in relation to earthquakes... surge. Keywords: hazard management, marine geophysical data, geomorphology and tsunami surge, coastal seismicity. Date received: 7 August 2015; accepted: 15 October 2015. CSIR – National Institute of Oceanography, Visakhapatnam, India.

  19. Tectonic styles of future earthquakes in Italy as input data for seismic hazard

    Science.gov (United States)

    Pondrelli, S.; Meletti, C.; Rovida, A.; Visini, F.; D'Amico, V.; Pace, B.

    2017-12-01

    In a recent elaboration of a new seismogenic zonation and hazard model for Italy, we tried to understand how many indications we have on the tectonic style of future earthquakes/ruptures. Using all available or recomputed seismic moment tensors for relevant seismic events (Mw starting from 4.5) of the last 100 yrs, first-arrival focal mechanisms for less recent earthquakes, and also geological data on past activated faults, we compiled a database gathering thousands of data points covering the Italian peninsula and the regions around it. After several summations of seismic moment tensors, over regular grids of different dimensions and different thicknesses of the seismogenic layer, we applied the same procedure to each of the 50 area sources that were designed in the seismogenic zonation. The results for several seismic zones are very stable, e.g. along the southern Apennines we expect future earthquakes to be mostly extensional, although in the outer part of the chain strike-slip events are possible. In the northern part of the Apennines we also expect different, opposite tectonic styles for different hypocentral depths. In several zones characterized by a low seismic moment release, defined for the study region using 1000 yrs of catalogue data, the possible tectonic style of future earthquakes is less clear. It is worth noting that for some zones the largest possible earthquake may not be represented in the available observations. We also add to our analysis the computation of the seismic release rate, computed using a distributed completeness identified for single great events of the historical seismic catalogue for Italy. All these information layers, overlapped and compared, may be used to characterize each new seismogenic zone.

  20. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan)

    Directory of Open Access Journals (Sweden)

    G. Babayev

    2010-12-01

    Full Text Available A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides), significant underground water level fluctuations, and the lack of public awareness regarding seismic hazard contribute to the increase of vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess the earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information to identify the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown. The PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of urban population, exposure, and the pattern of peak ground acceleration contribute to the seismic risk, while the vulnerability factors play a more prominent role for all earthquake scenarios. Our results can support the elaboration of strategic countermeasure plans for earthquake risk mitigation in the city of Baku.
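
    The risk "convolution" described above combines hazard, vulnerability, and exposure layers cell by cell over the city. The sketch below uses a simple multiplicative combination of three small normalized example grids; both the combination rule and the numbers are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Hedged sketch of a risk combination of the kind described above: a cell-by-cell product of
# normalized hazard, vulnerability, and exposure layers on a city grid. The multiplicative
# rule and the 3x3 example layers are illustrative assumptions, not the paper's data.
hazard        = np.array([[0.2, 0.5, 0.8],
                          [0.3, 0.6, 0.9],
                          [0.1, 0.4, 0.7]])   # e.g. normalized PGA
vulnerability = np.array([[0.9, 0.5, 0.4],
                          [0.7, 0.6, 0.3],
                          [0.8, 0.5, 0.2]])   # e.g. building fragility / population features
exposure      = np.array([[0.4, 0.9, 0.6],
                          [0.5, 1.0, 0.7],
                          [0.2, 0.8, 0.5]])   # e.g. infrastructure and critical facilities

risk = hazard * vulnerability * exposure      # element-wise combination per grid cell
print(np.round(risk, 2))
```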

  1. Composite Earthquake Catalog of the Yellow Sea for Seismic Hazard Studies

    Science.gov (United States)

    Kang, S. Y.; Kim, K. H.; LI, Z.; Hao, T.

    2017-12-01

    The Yellow Sea (a.k.a. the West Sea in Korea) is an epicontinental and semi-closed sea located between Korea and China. Recent earthquakes in the Yellow Sea, including, but not limited to, the Seogyuckryulbi-do (1 April 2014, magnitude 5.1), Heuksan-do (21 April 2013, magnitude 4.9), and Baekryung-do (18 May 2013, magnitude 4.9) earthquakes, and the earthquake swarm in the Boryung offshore region in 2013, remind us of the seismic hazards affecting East Asia. This series of earthquakes in the Yellow Sea raised numerous questions. Unfortunately, both governments have trouble monitoring seismicity in the Yellow Sea because earthquakes occur beyond their seismic networks. For example, the epicenters of the magnitude 5.1 earthquake in the Seogyuckryulbi-do region in 2014 reported by the Korea Meteorological Administration and the China Earthquake Administration differed by approximately 20 km. This illustrates the difficulty of seismic monitoring and earthquake location in the region, despite the huge effort made by both governments. A joint effort is required not only to overcome the limits posed by political boundaries and geographical location but also to study the seismicity and the underground structures responsible. Although the well-established and developing seismic networks in Korea and China have provided an unprecedented amount and quality of seismic data, the high-quality catalogue is limited to the most recent few decades, which is far shorter than a major earthquake cycle. It is also noted that the earthquake catalogue from either country is biased toward its own territory and cannot provide a complete picture of seismicity in the Yellow Sea. In order to understand the seismic hazard and tectonics of the Yellow Sea, a composite earthquake catalogue has been developed. We gathered earthquake information from the last 5,000 years from various sources. There are good reasons to believe that some listings refer to the same earthquake but with different source parameters. We established criteria in order to provide consistent

  2. Long Aftershock Sequences within Continents and Implications for Earthquake Hazard Assessment

    Science.gov (United States)

    Stein, S. A.; Liu, M.

    2014-12-01

    Recent seismicity in the Tangshan region in North China has prompted concern about a repetition of the 1976 M7.8 earthquake that destroyed the city, killing more than 242,000 people. However, the decay of seismicity there implies that the recent earthquakes are probably aftershocks of the 1976 event. This 37-year sequence is an example of the phenomenon that aftershock sequences within continents are often significantly longer than the typical 10 years at plate boundaries. The long duration of aftershock sequences in continents is consistent with a simple friction-based model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Hence slowly deforming continents tend to have aftershock sequences significantly longer than those at rapidly loaded plate boundaries. This effect has two consequences for hazard assessment. First, within the heavily populated continents, which are typically plate interiors, assessments of earthquake hazards rely significantly on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. This assumption would lead to overestimation of the hazard in presently active areas and underestimation elsewhere if some of these small events are aftershocks. Second, successful attempts to remove aftershocks from catalogs used for hazard assessment would underestimate the hazard, because much of the hazard is due to the aftershocks, and the declustering algorithms implicitly assume short aftershock sequences and thus do not remove long-duration ones.

  3. Kinematics, mechanics, and potential earthquake hazards for faults in Pottawatomie County, Kansas, USA

    Science.gov (United States)

    Ohlmacher, G.C.; Berendsen, P.

    2005-01-01

    Many stable continental regions have subregions with poorly defined earthquake hazards. Analysis of minor structures (folds and faults) in these subregions can improve our understanding of the tectonics and earthquake hazards. Detailed structural mapping in Pottawatomie County has revealed a suite consisting of two uplifted blocks aligned along a northeast trend and surrounded by faults. The first uplift is located southwest of the second. The northwest and southeast sides of these uplifts are bounded by northeast-trending right-lateral faults. To the east, both uplifts are bounded by north-trending reverse faults, and the first uplift is bounded by a north-trending high-angle fault to the west. The structural suite occurs above a basement fault that is part of a series of north-northeast-trending faults that delineate the Humboldt Fault Zone of eastern Kansas, an integral part of the Midcontinent Rift System. The favored kinematic model is a contractional stepover (push-up) between echelon strike-slip faults. Mechanical modeling using the boundary element method supports the interpretation of the uplifts as contractional stepovers and indicates that an approximately east-northeast maximum compressive stress trajectory is responsible for the formation of the structural suite. This stress trajectory suggests potential activity during the Laramide Orogeny, which agrees with the age of kimberlite emplacement in adjacent Riley County. The current stress field in Kansas has a N85°W maximum compressive stress trajectory that could potentially produce earthquakes along the basement faults. Several epicenters of seismic events (

  4. The earthquake of January 13, 1915 and the seismic hazard of the area

    International Nuclear Information System (INIS)

    Scarascia Mugnozza, Gabriele; Hailemikael, Salomon; Martini, Guido

    2015-01-01

    The January 13, 1915, magnitude 7.0 Marsica earthquake devastated the Fucino basin and its surroundings, causing about 30,000 casualties and entirely destroying several towns, among them the major municipality of the area, the town of Avezzano. In this paper, we briefly review the main characteristics of the earthquake and its effects on the environment. Furthermore, based on the Italian building code and ongoing seismic microzonation investigations, we describe the seismic hazard of the area struck by the earthquake in terms of both probabilistic seismic hazard assessment and the contribution of site effects to the seismic hazard estimate. All the studies confirm the very high level of seismic hazard of the Fucino territory [it

  5. Earthquake hazard in Northeast India – A seismic microzonation ...

    Indian Academy of Sciences (India)

    microzonation approach with typical case studies from .... the other hand, Guwahati city represents a case of well-formed basin with ... earthquake prone regions towards developing its ... tonic network and the observed seismicity has been.

  6. The Wenchuan, China M8.0 Earthquake: A Lesson and Implication for Seismic Hazard Mitigation

    Science.gov (United States)

    Wang, Z.

    2008-12-01

    The Wenchuan, China M8.0 earthquake caused great damage and huge casualties: 69,197 people were killed, 374,176 people were injured, and 18,341 people are still missing. The estimated direct economic loss is about 126 billion U.S. dollars. The Wenchuan earthquake again demonstrated that earthquakes do not kill people; the built environment and induced hazards, landslides in particular, do. Therefore, it is critical to strengthen the built environment, such as buildings and bridges, and to mitigate the induced hazards in order to avoid such disasters. As a part of the so-called North-South Seismic Zone in China, the Wenchuan earthquake occurred along the Longmen Shan thrust belt, which forms a boundary between the Qinghai-Tibet Plateau and the Sichuan basin, and there is a long history (~4,000 years) of seismicity in the area. The historical records show that the area experienced high intensity (i.e., greater than IX) in the past several thousand years. In other words, the area is well known to have high seismic hazard because of its tectonic setting and seismicity. However, only intensity VII (0.1 to 0.15 g PGA) has been considered for the seismic design of the built environment in the area. This was one of the main reasons that so many buildings collapsed, particularly school buildings, during the Wenchuan earthquake. It is clear that the seismic design (i.e., the design ground motion or intensity) was not adequate in the area struck by the Wenchuan earthquake. Lessons can be learned from the Wenchuan earthquake on seismic hazard and risk assessment, as well as on seismic hazard mitigation and seismic risk reduction.

  7. Tsunami hazard assessments with consideration of uncertain earthquakes characteristics

    Science.gov (United States)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, the assessment must adopt an uncertainty propagation method that determines tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics: the slip distribution and the location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen-Loève (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: we study tsunamis generated at the site of the 2014 Chilean earthquake, generating earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates earthquake samples consistent with the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for
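
    As a hedged sketch of the Karhunen-Loève approach mentioned above, the code below draws a correlated slip sample on a 1-D along-strike fault grid from the eigenpairs of an assumed exponential covariance and then translates the Gaussian field to a lognormal marginal; the kernel, correlation length, truncation order, and mean slip are all illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Hedged sketch of drawing correlated earthquake-slip samples with a Karhunen-Loeve (K-L)
# expansion on a 1-D along-strike fault grid, followed by a simple "translation" to a
# lognormal marginal. All kernel and scaling choices are illustrative assumptions.
n = 50                                      # subfaults along strike
x = np.linspace(0.0, 100.0, n)              # subfault positions (km)
corr_len = 20.0                             # assumed correlation length (km)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

eigval, eigvec = np.linalg.eigh(cov)        # K-L modes are eigenpairs of the covariance
order = np.argsort(eigval)[::-1]
eigval = np.clip(eigval[order], 0.0, None)  # guard against tiny negative round-off values
eigvec = eigvec[:, order]
k = 10                                      # keep the k leading modes

rng = np.random.default_rng(1)
xi = rng.standard_normal(k)                 # independent standard normal K-L coefficients
gaussian_field = eigvec[:, :k] @ (np.sqrt(eigval[:k]) * xi)

# Translate the (approximately unit-variance) Gaussian field to a lognormal slip field with
# a nominal mean of about 5 m; the -0.125 term offsets the lognormal mean shift.
slip = 5.0 * np.exp(0.5 * gaussian_field - 0.125)
print("sample slip (m), first 5 subfaults:", np.round(slip[:5], 2))
```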

  8. Retrospective analysis of the Spitak earthquake

    Directory of Open Access Journals (Sweden)

    A. K. Tovmassian

    1995-06-01

    Full Text Available Based on the retrospective analysis of numerous data and studies of the Spitak earthquake, the present work attempts to shed light on different aspects of that catastrophic seismic event which occurred in Northern Armenia on December 7, 1988. The authors follow a chronological order of presentation, namely: changes in geosphere, atmosphere, biosphere during the preparation of the Spitak earthquake; foreshocks, main shock, aftershocks, focal mechanisms, historical seismicity; seismotectonic position of the source, strong motion records, site effects; the macroseismic effect, collapse of buildings and structures; rescue activities; earthquake consequences; and the lessons of the Spitak earthquake.

  9. Seismicity and seismic hazard in Sabah, East Malaysia from earthquake and geodetic data

    Science.gov (United States)

    Gilligan, A.; Rawlinson, N.; Tongkul, F.; Stephenson, R.

    2017-12-01

    While the levels of seismicity are low in most of Malaysia, the state of Sabah in northern Borneo has moderate levels of seismicity. Notable earthquakes in the region include the 1976 M6.2 Lahad Datu earthquake and the 2015 M6 Ranau earthquake. The recent Ranau earthquake resulted in the deaths of 18 people on Mt Kinabalu, an estimated 100 million RM (about US$23 million) of damage to buildings, roads, and infrastructure from shaking, and flooding, reduced water quality, and damage to farms from landslides. Over the last 40 years the population of Sabah has increased to over four times what it was in 1976, yet seismic hazard in Sabah remains poorly understood. Using seismic and geodetic data we hope to better quantify the hazards posed by earthquakes in Sabah, and thus help to minimize risk. In order to do this we need to know the locations of earthquakes, the types of earthquakes that occur, and the faults that are generating them. We use data from 15 MetMalaysia seismic stations currently operating in Sabah to develop a region-specific velocity model from receiver functions and a pre-existing surface wave model. We use this new velocity model to (re)locate earthquakes that occurred in Sabah from 2005-2016, including a large number of aftershocks from the 2015 Ranau earthquake. We use a probabilistic nonlinear earthquake location program to locate the earthquakes and then refine their relative locations using a double difference method. The recorded waveforms are further used to obtain moment tensor solutions for these earthquakes. Earthquake locations and moment tensor solutions are then compared with the locations of faults throughout Sabah. Faults are identified from high-resolution IFSAR images and subsequent fieldwork, with a particular focus on the Lahad Datu and Ranau areas. Used together, these seismic and geodetic data can help us to develop a new seismic hazard model for Sabah, as well as aiding in the delivery of outreach activities regarding seismic hazard

  10. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008

    Science.gov (United States)

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.

    2009-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: http://earthquake.usgs.gov/regional/ceus/urban_map/st_louis/index.php. This progress report summarizes the

  11. Aftereffects of Subduction-Zone Earthquakes: Potential Tsunami Hazards along the Japan Sea Coast.

    Science.gov (United States)

    Minoura, Koji; Sugawara, Daisuke; Yamanoi, Tohru; Yamada, Tsutomu

    2015-10-01

    The 2011 Tohoku-Oki Earthquake is a typical subduction-zone earthquake and is the 4th largest earthquake since the beginning of instrumental observation of earthquakes in the 19th century. In fact, the 2011 Tohoku-Oki Earthquake displaced the northeast Japan island arc horizontally and vertically. The displacement largely changed the tectonic situation of the arc from compressive to tensile. The 9th century in Japan was a period of natural hazards caused by frequent large-scale earthquakes. The aseismic tsunamis that inflicted damage on the Japan Sea coast in the 11th century were related to the occurrence of massive earthquakes that represented the final stage of a period of high seismic activity. Anti-compressive tectonics triggered by the subduction-zone earthquakes induced gravitational instability, which resulted in the generation of tsunamis caused by slope failure at the arc-back-arc boundary. The crustal displacement after the 2011 earthquake implies an increased risk of unexpected local tsunami flooding in the Japan Sea coastal areas.

  12. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system (GIS) for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, anomalous damage distributions of the earthquake are identified, and their relationship with tectonics, site conditions, and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. Finally, the implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction, and earthquake emergency response are discussed.

  13. Near real-time aftershock hazard maps for earthquakes

    Science.gov (United States)

    McCloskey, J.; Nalbant, S. S.

    2009-04-01

    Stress interaction modelling is routinely used to explain the spatial relationships between earthquakes and their aftershocks. On 28 October 2008 a M6.4 earthquake occurred near the Pakistan-Afghanistan border, killing several hundred people and causing widespread devastation. A second M6.4 event occurred 12 hours later, 20 km to the south east. By making some well-supported assumptions concerning the source event and the geometry of any likely triggered event, it was possible to map those areas most likely to experience further activity. Using Google Earth, it would further have been possible to identify particular settlements in the source area which were particularly at risk and to publish their locations globally within about 3 hours of the first earthquake. Such actions could have significantly focused the initial emergency response management. We argue for routine prospective testing of such forecasts and for dialogue between social and physical scientists and emergency response professionals around the practical application of these techniques.
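
    The record above does not give the authors' code, but the core quantity in this kind of stress-interaction modelling is the Coulomb failure stress change on a receiver fault. A minimal sketch, with purely illustrative numbers and an assumed effective friction coefficient, is:

```python
# Coulomb failure stress change: dCFS = d_tau + mu_eff * d_sigma_n, where d_tau is
# the shear stress change in the slip direction, d_sigma_n the normal stress change
# (unclamping positive), and mu_eff an assumed effective friction coefficient.
def coulomb_stress_change(d_tau_mpa, d_sigma_n_mpa, mu_eff=0.4):
    """Return the Coulomb failure stress change (MPa) on a receiver fault."""
    return d_tau_mpa + mu_eff * d_sigma_n_mpa

# Example: +0.05 MPa of shear loading and +0.02 MPa of unclamping brings the
# receiver fault ~0.058 MPa closer to failure (positive dCFS promotes triggering).
print(coulomb_stress_change(0.05, 0.02))
```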

  14. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    Science.gov (United States)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  15. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    Science.gov (United States)

    Boyd, Oliver Salz; Magistrale, Harold

    2011-01-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011 Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  16. The wicked problem of earthquake hazard in developing countries: the example of Bangladesh

    Science.gov (United States)

    Steckler, M. S.; Akhter, S. H.; Stein, S.; Seeber, L.

    2017-12-01

    Many developing nations in earthquake-prone areas confront a tough problem: how much of their limited resources to use mitigating earthquake hazards? This decision is difficult because it is unclear when an infrequent major earthquake may happen, how big it could be, and how much harm it may cause. This issue faces nations with profound immediate needs and ongoing rapid urbanization. Earthquake hazard mitigation in Bangladesh is a wicked problem. It is the world's most densely populated nation, with 160 million people in an area the size of Iowa. Complex geology and sparse data make assessing a possibly-large earthquake hazard difficult. Hence it is hard to decide how much of the limited resources available should be used for earthquake hazard mitigation, given other more immediate needs. Per capita GDP is $1200, so Bangladesh is committed to economic growth and resources are needed to address many critical challenges and hazards. In their subtropical environment, rural Bangladeshis traditionally relied on modest mud or bamboo homes. Their rapidly growing, crowded capital, Dhaka, is filled with multistory concrete buildings likely to be vulnerable to earthquakes. The risk is compounded by the potential collapse of services and accessibility after a major temblor. However, extensive construction as the population shifts from rural to urban provides opportunity for earthquake-risk reduction. While this situation seems daunting, it is not hopeless. Robust risk management is practical, even for developing nations. It involves recognizing uncertainties and developing policies that should give a reasonable outcome for a range of the possible hazard and loss scenarios. Over decades, Bangladesh has achieved a thousandfold reduction in risk from tropical cyclones by building shelters and setting up a warning system. Similar efforts are underway for earthquakes. Smart investments can be very effective, even if modest. Hence, we suggest strategies consistent with high

  17. Earthquake induced landslide hazard: a multidisciplinary field observatory in the Marmara SUPERSITE

    Science.gov (United States)

    Bigarré, Pascal

    2014-05-01

    Earthquake-triggered landslides have an increasingly disastrous impact in seismic regions due to fast-growing urbanization and infrastructure. Considering only disasters from the last fifteen years, among which the 1999 Chi-Chi earthquake, the 2008 Wenchuan earthquake, and the 2011 Tohoku earthquake, these events generated tens of thousands of coseismic landslides. Those resulted in staggering death tolls and considerable damage, affecting the regional landscape including its main hydrological features. Despite a strong impetus in research during the past decades, knowledge of these geohazards is still fragmentary, and databases of high-quality observational data are lacking. These phenomena call for further collaborative research aiming eventually to enhance preparedness and crisis management. As one of the three SUPERSITE-concept FP7 projects dealing with long-term, high-level monitoring of major natural hazards at the European level, the MARSITE project gathers research groups in a comprehensive monitoring activity developed in the Sea of Marmara Region, one of the most densely populated parts of Europe, rated at high seismic risk level since the 1999 Izmit and Duzce devastating earthquakes. Besides the seismic threat, landslides in Turkey and in this region constitute an important source of loss. The 1999 earthquake caused extensive landslides, while tsunami effects were observed during the post-event surveys in several places along the coasts of Izmit Bay. The 6th Work Package of the MARSITE project gathers 9 research groups to study earthquake-induced landslides, focusing on two sub-regional areas of high interest. First, the Cekmece-Avcilar peninsula, located west of Istanbul, is a highly urbanized, landslide-prone area showing high susceptibility to rainfall-triggered landslides and affected by very significant seismic site effects. Second, the off-shore entrance of the Izmit Gulf, close to the termination of the surface rupture of the 1999 earthquake

  18. MGR External Events Hazards Analysis

    International Nuclear Information System (INIS)

    Booth, L.

    1999-01-01

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design; Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  19. Seismic hazard analysis for the NTS spent reactor fuel test site

    International Nuclear Information System (INIS)

    Campbell, K.W.

    1980-01-01

    An experiment is being directed at the Nevada Test Site to test the feasibility of storing spent fuel from nuclear reactors in geologic media. As part of this project, an analysis of the earthquake hazard was prepared. This report presents the results of this seismic hazard assessment. Two distinct components of the seismic hazard were addressed: vibratory ground motion and surface displacement.

  20. Earthquake Prediction Research In Iceland, Applications For Hazard Assessments and Warnings

    Science.gov (United States)

    Stefansson, R.

    Earthquake prediction research in Iceland, applications for hazard assessments and warnings. The first multinational earthquake prediction research project in Iceland was the European Council encouraged SIL project of the Nordic countries, 1988-1995. The path selected for this research was to study the physics of crustal processes leading to earthquakes. It was considered that small earthquakes, down to magnitude zero, were the most significant for this purpose, because of the detailed information which they provide both in time and space. The test area for the project was the earthquake-prone region of the South Iceland seismic zone (SISZ). The PRENLAB and PRENLAB-2 projects, 1996-2000, supported by the European Union, were a direct continuation of the SIL project, but with a more multidisciplinary approach. PRENLAB stands for "Earthquake prediction research in a natural laboratory". The basic objective was to advance our understanding in general of where, when and how dangerous earthquake motion might strike. Methods were developed to study crustal processes and conditions, by microearthquake information, by continuous GPS, InSAR, theoretical modelling, fault mapping and paleoseismology. New algorithms were developed for short term warnings. A very useful short term warning was issued twice in the year 2000, one for the sudden start of an eruption of the Hekla volcano on February 26, and the other 25 hours before the second (in a sequence of two) magnitude 6.6 (Ms) earthquake in the South Iceland seismic zone on June 21, with the correct location and approximate size. A formal short term warning, although not going to the public, was also issued before a magnitude 5 earthquake in November 1998. In the presentation it will be briefly described what these warnings were based on. A general hazard assessment was presented in scientific journals 10-15 years ago assessing within a few kilometers the location of the faults of the two 2000 earthquakes and suggesting

  1. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    Science.gov (United States)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  2. An Arduino project to record ground motion and to learn on earthquake hazard at high school

    Science.gov (United States)

    Saraò, Angela; Barnaba, Carla; Clocchiatti, Marco; Zuliani, David

    2015-04-01

    Through a multidisciplinary work that integrates technology education with Earth sciences, we implemented an educational program to raise students' awareness of seismic hazard and to disseminate good practices of earthquake safety. Using free software and low-cost open hardware, the students of a senior class of the high school Liceo Paschini in Tolmezzo (NE Italy) implemented a seismograph using the Arduino open-source electronics platform and ADXL345 sensors to emulate a low-cost seismometer (e.g. the O-NAVI sensor of the Quake-Catcher Network, http://qcn.stanford.edu). To accomplish their task the students were directed to use web resources for technical support and troubleshooting. Shell scripts, running on local computers under Linux OS, controlled the process of recording and displaying data. The main part of the experiment was documented in DokuWiki style. Some propaedeutic lessons in computer science and electronics were needed to build up the necessary skills of the students and to fill the gaps in their background knowledge. In addition, lectures by seismologists and laboratory activity allowed the class to explore different aspects of the physics of earthquakes, and particularly of seismic waves, and to become familiar with the topics of seismic hazard through inquiry-based learning. The resulting Arduino seismograph can be used for educational purposes and can display tremors on the local network of the school. It can certainly record the ground motion due to a seismic event occurring in the area, but further improvements are necessary for a quantitative analysis of the recorded signals.
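
    As a sketch of the kind of quantitative analysis of the recorded signals mentioned at the end of this abstract, the following fragment applies a simple STA/LTA (short-term average over long-term average) trigger to an acceleration trace such as the one logged by the classroom seismograph. The sampling rate, window lengths, threshold, and the synthetic trace are hypothetical; this is not the class's own recording software.

```python
# Illustrative STA/LTA event detector for a logged acceleration trace.
import numpy as np

def sta_lta(trace, fs, sta_s=1.0, lta_s=20.0):
    """Return the short-term / long-term average ratio of the demeaned amplitude."""
    env = np.abs(trace - np.mean(trace))
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(env, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(env, np.ones(lta_n) / lta_n, mode="same")
    return sta / np.maximum(lta, 1e-12)

fs = 50.0                                   # assumed sampling rate (Hz)
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(1)
trace = 0.002 * rng.standard_normal(t.size)             # background noise (g)
onset = t > 30.0                                         # synthetic "event" after 30 s
trace[onset] += 0.02 * np.sin(2 * np.pi * 2 * (t[onset] - 30.0)) * np.exp(-(t[onset] - 30.0) / 5.0)

ratio = sta_lta(trace, fs)
triggered = np.where(ratio > 4.0)[0]        # simple fixed threshold
if triggered.size:
    print(f"possible event detected at t = {t[triggered[0]]:.1f} s")
```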

  3. SHEAT: a computer code for probabilistic seismic hazard analysis, user's manual

    International Nuclear Information System (INIS)

    Ebisawa, Katsumi; Kondo, Masaaki; Abe, Kiyoharu; Tanaka, Toshiaki; Takani, Michio.

    1994-08-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. Seismic hazard is defined as an annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site. With the SHEAT code, seismic hazard is calculated by the following two steps: (1) Modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modelled based on the historical earthquake records, active fault data and expert judgement. (2) Calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model, taking into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT code. It includes: (1) Outlines of the code, including the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) Functions of subprograms and the analytical models in them, (3) Guidance on input and output data, and (4) Sample run results. The code has been widely used at JAERI to analyze seismic hazard at various nuclear power plant sites in Japan. (author)
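
    The two-step calculation described in the abstract (a source model giving postulated earthquakes and their annual frequencies, then summation of exceedance frequencies using an attenuation model with a standard deviation) can be illustrated with a toy hazard-curve computation. The source list and attenuation coefficients below are invented for illustration and are not taken from the SHEAT code.

```python
# Toy two-step probabilistic seismic hazard calculation (not the SHEAT code).
import numpy as np
from scipy.stats import norm

# Step 1: postulated earthquakes as (magnitude, distance in km, annual frequency)
sources = [(5.5, 20.0, 0.05), (6.5, 40.0, 0.01), (7.0, 60.0, 0.002)]

# Step 2: illustrative attenuation model, ln(PGA in g) = a + b*M + c*ln(R), with sigma
a, b, c, sigma = -4.0, 1.0, -1.3, 0.6

def annual_exceedance(pga_level):
    """Sum over all postulated earthquakes the annual frequency of exceeding pga_level."""
    lam = 0.0
    for mag, dist, rate in sources:
        mean_ln = a + b * mag + c * np.log(dist)
        p_exceed = 1.0 - norm.cdf(np.log(pga_level), loc=mean_ln, scale=sigma)
        lam += rate * p_exceed
    return lam

for pga in (0.05, 0.1, 0.2, 0.4):
    print(f"annual frequency of PGA > {pga:.2f} g : {annual_exceedance(pga):.2e}")
```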

  4. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
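
    A minimal sketch of the fitting-and-testing step, using scipy in place of Easyfit/Matlab and synthetic recurrence data in place of the real catalogue, might look as follows. The distribution choices mirror the three candidates named above; note that K-S p-values are only approximate when the parameters are estimated from the same data.

```python
# Fit three candidate distributions to (synthetic) inter-event times and apply a K-S test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
inter_event_yr = stats.weibull_min.rvs(1.5, scale=12.0, size=40, random_state=rng)

candidates = {
    "two-parameter Weibull": (stats.weibull_min, dict(floc=0)),   # location fixed at 0
    "three-parameter Weibull": (stats.weibull_min, {}),           # location also fitted
    "Frechet (inverse Weibull)": (stats.invweibull, dict(floc=0)),
}

for name, (dist, fit_kwargs) in candidates.items():
    params = dist.fit(inter_event_yr, **fit_kwargs)
    ks_stat, p_value = stats.kstest(inter_event_yr, dist.cdf, args=params)
    print(f"{name:27s} K-S D = {ks_stat:.3f}, approx. p = {p_value:.3f}")
```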

  5. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    Science.gov (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. Such studies are appropriate for evaluation and mitigation plan development in landslide-prone areas. Several techniques are available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinion and represent hazard levels in descriptive terms. Quantitative methods, on the other hand, are objective and are commonly used because of the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and newer heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis to assess landslide hazard over large regions. However, they depend on the qualitative and quantitative data, scale, types of movement and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application was developed for El Salvador, a country in Central America where earthquake-triggered landslides are a common phenomenon. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 2001 January 13th earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. The results of the landslide susceptibility analysis are checked using landslide
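
    A minimal sketch of the model comparison, using scikit-learn and synthetic stand-ins for the real conditioning factors and landslide inventory, is given below. The feature set, the synthetic relationship, and the network size are assumptions for illustration only.

```python
# Compare logistic regression with a small feed-forward ANN for (synthetic)
# landslide susceptibility data. Placeholder features: slope, elevation, rainfall.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(0, 45, n),        # slope gradient (deg)
    rng.uniform(200, 2000, n),    # elevation (m)
    rng.uniform(800, 2500, n),    # mean annual precipitation (mm)
])
# Synthetic "truth": steeper, wetter cells fail more often
logit = 0.12 * X[:, 0] + 0.002 * X[:, 2] - 7.0
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

models = {
    "Logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "ANN (MLP)": make_pipeline(StandardScaler(),
                               MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=1)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name:20s} test AUC = {auc:.3f}")
```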

  6. Pattern recognition methodologies and deterministic evaluation of seismic hazard: A strategy to increase earthquake preparedness

    International Nuclear Information System (INIS)

    Peresan, Antonella; Panza, Giuliano F.; Gorshkov, Alexander I.; Aoudia, Abdelkrim

    2001-05-01

    Several algorithms, structured according to a general pattern-recognition scheme, have been developed for the space-time identification of strong events. Currently, two such algorithms are applied to the Italian territory: one for the recognition of earthquake-prone areas and the other, namely the CN algorithm, for earthquake prediction purposes. These procedures can be viewed as independent experts, hence they can be combined to better constrain the alerted seismogenic area. We examine here the possibility of integrating CN intermediate-term medium-range earthquake predictions, pattern recognition of earthquake-prone areas and deterministic hazard maps, in order to associate CN Times of Increased Probability (TIPs) with a set of appropriate scenarios of ground motion. The advantage of this procedure mainly consists in the time information provided by the predictions, which is useful to increase the preparedness of safety measures and to indicate a priority for detailed seismic risk studies to be performed at a local scale. (author)

  7. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  8. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  9. Earthquake induced liquefaction hazard, probability and risk assessment in the city of Kolkata, India: its historical perspective and deterministic scenario

    Science.gov (United States)

    Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.

    2018-01-01

    Liquefaction-induced ground failure is among the leading causes of infrastructure damage due to the impact of large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Brahmaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has moderately shallow groundwater conditions, especially in the post-monsoon seasons. In view of the burgeoning population, there has been unplanned expansion of settlements under hazardous geological, geomorphological, and hydrological conditions, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, reportedly induced Modified Mercalli Intensities of IV-V and VI-VII, respectively, in the city, triggering widespread to sporadic liquefaction with surface manifestations of sand boils, lateral spreading, ground subsidence, etc., thus posing a strong case for liquefaction potential analysis in the terrain. With the motivation of assessing the seismic hazard, vulnerability, and risk of the city of Kolkata through consorted federal funding stipulated for all the metros and upstart urban centers in India located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility of Kolkata under earthquake loading employing modern multivariate techniques, and also to predict a deterministic liquefaction scenario for the city in the event of a probabilistic seismic hazard condition with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km2 area. The stochastically

  10. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    Science.gov (United States)

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes, which have occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large-earthquake rates would be grossly underestimated by extrapolating small-earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude
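
    The point about the kinked magnitude-frequency distribution can be made numerically: with b of about 1.0 below M 5.0 and about 0.6 above, straight extrapolation of the small-earthquake slope underestimates large-earthquake rates by a substantial factor. The a-value in the sketch below is arbitrary; only the ratio between the two models matters.

```python
# Kinked vs straight Gutenberg-Richter extrapolation (illustrative a-value).
a_value = 4.0                      # log10(annual rate of M >= 0), arbitrary
b_small, b_large, m_kink = 1.0, 0.6, 5.0

def log10_rate_kinked(m):
    """log10 annual rate of events >= m, with a shallower slope above the kink."""
    if m <= m_kink:
        return a_value - b_small * m
    return a_value - b_small * m_kink - b_large * (m - m_kink)

def log10_rate_extrapolated(m):
    """Straight extrapolation of the small-magnitude slope."""
    return a_value - b_small * m

for m in (6.0, 7.0, 7.5):
    kinked = 10 ** log10_rate_kinked(m)
    extrap = 10 ** log10_rate_extrapolated(m)
    print(f"M>={m}: kinked {kinked:.3g}/yr vs extrapolated {extrap:.3g}/yr (factor {kinked/extrap:.1f})")
```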

  11. Earthquake and volcano hazard notices: An economic evaluation of changes in risk perceptions

    Science.gov (United States)

    Bernknopf, R.L.; Brookshire, D.S.; Thayer, M.A.

    1990-01-01

    Earthquake and volcano hazard notices were issued for the Mammoth Lakes, California area by the U.S. Geological Survey under the authority granted by the Disaster Relief Act of 1974. The effects on investment, recreation visitation, and risk perceptions are explored. The hazard notices did not affect recreation visitation, although investment was affected. A perceived loss in the market value of homes was documented. Risk perceptions were altered for property owners. Communication of the probability of an event over time would enhance hazard notices as a policy instrument and would mitigate unnecessary market perturbations.

  12. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    International Nuclear Information System (INIS)

    Longwell, R.; Keifer, J.; Goodin, S.

    2001-01-01

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) the occurrence of a fire-related event; (2) a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that would threaten the health and safety of employees; (3) vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; (4) property losses from a fire and related events exceeding limits established by DOE; and (5) critical process controls and safety class systems being damaged as a result of a fire and related events.

  13. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    Science.gov (United States)

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  14. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  15. Mathematical models for estimating earthquake casualties and damage cost through regression analysis using matrices

    International Nuclear Information System (INIS)

    Urrutia, J D; Bautista, L A; Baccay, E B

    2014-01-01

    The aim of this study was to develop mathematical models for estimating earthquake casualties such as deaths, the number of injured persons, and affected families, and the total cost of damage. Regression models were developed to quantify the direct damage from earthquakes to human beings and property given the magnitude, intensity, depth of focus, location of epicentre and time duration. The researchers formulated the models through regression analysis using matrices, with α = 0.01. The study considered thirty destructive earthquakes that hit the Philippines in the years 1968 to 2012 inclusive. Relevant data about these earthquakes were obtained from the Philippine Institute of Volcanology and Seismology. Data on damage and casualties were gathered from the records of the National Disaster Risk Reduction and Management Council. This study will be of great value in emergency planning and in initiating and updating programs for earthquake hazard reduction in the Philippines, which is an earthquake-prone country.
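
    The "regression analysis using matrices" referred to above is ordinary least squares via the normal equations. A hedged sketch with invented toy values (the real study used magnitude, intensity, focal depth, epicentre location, and duration against recorded casualties and damage costs) is:

```python
# Multiple linear regression via the normal equations: beta = (X^T X)^{-1} X^T y.
import numpy as np

# Toy design matrix: [intercept, magnitude, focal depth (km), duration (s)]
X = np.array([
    [1.0, 6.5, 15.0, 30.0],
    [1.0, 7.2, 25.0, 45.0],
    [1.0, 5.8, 10.0, 20.0],
    [1.0, 7.8, 33.0, 60.0],
    [1.0, 6.1, 18.0, 25.0],
])
y = np.array([120.0, 800.0, 40.0, 2400.0, 90.0])   # e.g. reported deaths (invented)

# Solving the normal equations directly; np.linalg.lstsq is the numerically safer route.
beta = np.linalg.solve(X.T @ X, X.T @ y)
print("coefficients:", np.round(beta, 2))
print("fitted values:", np.round(X @ beta, 1))
```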

  16. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, journalists. This initiative, built up in a few weeks, received very large feedback, also due to the media highlighting the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction vs. probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take a day off and leave the town or stay in public parks), we contributed to reducing this feeling and therefore the social cost of this strange Roman day. Moreover, another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach laboratory, the Graphics Laboratory and SissaMedialab. P.S. no large earthquake happened

  17. 14 CFR 437.29 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    14 CFR 437.29 (Documentation - Hazard analysis): (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  18. Seismic hazards: New trends in analysis using geologic data

    International Nuclear Information System (INIS)

    Schwartz, D.P.; Coppersmith, K.J.

    1986-01-01

    In the late 1960s and early 1970s, largely in response to the expansion of nuclear power plant siting and the issuance of a code of federal regulations by the Nuclear Regulatory Commission referred to as Appendix A-10CFR100, the need to characterize the earthquake potential of individual faults for seismic design took on greater importance. Appendix A established deterministic procedures for assessing the seismic hazard at nuclear power plant sites. Bonilla and Buchanan, using data from historical surface-faulting earthquakes, developed a set of statistical correlations relating earthquake magnitude to surface rupture length and to surface displacement. These relationships have been refined and updated and, together with relationships between fault area and magnitude and between seismic moment and moment magnitude, have served as the basis for selecting maximum earthquakes in a wide variety of design situations. In the paper presented, the authors discuss new trends in seismic hazard analysis using geologic data, with special emphasis on fault-zone segmentation and recurrence models and the way in which they provide a basis for evaluating long-term earthquake potential
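
    The Bonilla-Buchanan style correlations mentioned above are log-linear regressions of magnitude on rupture parameters. The sketch below fits a relation of the same form, M = a + b*log10(L), to invented data; the coefficients it produces are illustrative and should not be mistaken for the published regression values.

```python
# Fit an illustrative magnitude vs. surface-rupture-length regression, M = a + b*log10(L).
import numpy as np

L_km = np.array([10.0, 25.0, 60.0, 120.0, 300.0])   # invented rupture lengths (km)
M_obs = np.array([6.1, 6.6, 7.0, 7.4, 7.9])         # invented magnitudes

A = np.column_stack([np.ones_like(L_km), np.log10(L_km)])
(a_fit, b_fit), *_ = np.linalg.lstsq(A, M_obs, rcond=None)
print(f"M = {a_fit:.2f} + {b_fit:.2f} * log10(L)")

# Using the fit to pick a maximum earthquake for a hypothetical 80-km mapped segment
print("estimated M for an 80 km rupture:", round(a_fit + b_fit * np.log10(80.0), 1))
```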

  19. Earthquake Hazard in the New Madrid Seismic Zone Remains a Concern

    Science.gov (United States)

    Frankel, A.D.; Applegate, D.; Tuttle, M.P.; Williams, R.A.

    2009-01-01

    There is broad agreement in the scientific community that a continuing concern exists for a major destructive earthquake in the New Madrid seismic zone. Many structures in Memphis, Tenn., St. Louis, Mo., and other communities in the central Mississippi River Valley region are vulnerable and at risk from severe ground shaking. This assessment is based on decades of research on New Madrid earthquakes and related phenomena by dozens of Federal, university, State, and consulting earth scientists. Considerable interest has developed recently from media reports that the New Madrid seismic zone may be shutting down. These reports stem from published research using global positioning system (GPS) instruments with results of geodetic measurements of strain in the Earth's crust. Because of a lack of measurable strain at the surface in some areas of the seismic zone over the past 14 years, arguments have been advanced that there is no buildup of stress at depth within the New Madrid seismic zone and that the zone may no longer pose a significant hazard. As part of the consensus-building process used to develop the national seismic hazard maps, the U.S. Geological Survey (USGS) convened a workshop of experts in 2006 to evaluate the latest findings in earthquake hazards in the Eastern United States. These experts considered the GPS data from New Madrid available at that time that also showed little to no ground movement at the surface. The experts did not find the GPS data to be a convincing reason to lower the assessment of earthquake hazard in the New Madrid region, especially in light of the many other types of data that are used to construct the hazard assessment, several of which are described here.

  20. Seismic hazard analysis for Jayapura city, Papua

    International Nuclear Information System (INIS)

    Robiana, R.; Cipta, A.

    2015-01-01

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model from the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors estimated using geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds

  1. Seismic hazard analysis for Jayapura city, Papua

    Energy Technology Data Exchange (ETDEWEB)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A. [Geological Agency, Diponegoro Road No.57, Bandung, 40122 (Indonesia)

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model from the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors estimated using geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  2. Earthquake Scenarios Based Upon the Data and Methodologies of the U.S. Geological Survey's National Seismic Hazard Mapping Project

    Science.gov (United States)

    Rukstales, K. S.; Petersen, M. D.; Frankel, A. D.; Harmsen, S. C.; Wald, D. J.; Quitoriano, V. R.; Haller, K. M.

    2011-12-01

    The U.S. Geological Survey's (USGS) National Seismic Hazard Mapping Project (NSHMP) utilizes a database of over 500 faults across the conterminous United States to constrain earthquake source models for probabilistic seismic hazard maps. Additionally, the fault database is now being used to produce a suite of deterministic ground motions for earthquake scenarios that are based on the same fault source parameters and empirical ground motion prediction equations used for the probabilistic hazard maps. Unlike the calculated hazard map ground motions, local soil amplification is applied to the scenario calculations based on the best available Vs30 (average shear-wave velocity down to 30 meters) mapping, or in some cases using topographic slope as a proxy. Systematic outputs include all standard USGS ShakeMap products, including GIS, KML, XML, and HAZUS input files. These data are available from the ShakeMap web pages with a searchable archive. The scenarios are being produced within the framework of a geographic information system (GIS) so that alternative scenarios can readily be produced by altering fault source parameters, Vs30 soil amplification, as well as the weighting of ground motion prediction equations used in the calculations. The alternative scenarios can then be used for sensitivity analysis studies to better characterize uncertainty in the source model and convey this information to decision makers. By providing a comprehensive collection of earthquake scenarios based upon the established data and methods of the USGS NSHMP, we hope to provide a well-documented source of data which can be used for visualization, planning, mitigation, loss estimation, and research purposes.

  3. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    Science.gov (United States)

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    to the NSHM scenario were developed for the Hilton Creek and Hartley Springs Faults to account for different opinions in how far these two faults extend into Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice: the deterministic seismic hazard analysis program developed by Art Frankel of USGS and three Next Generation Ground Motion Attenuation (NGA) models. Ground motion calculations incorporated the potential amplification of seismic shaking by near-surface soils defined by a map of the average shear wave velocity in the uppermost 30 m (VS30) developed by CGS.In addition to ground shaking and shaking-related ground failure such as liquefaction and earthquake induced landslides, earthquakes cause surface rupture displacement, which can lead to severe damage of buildings and lifelines. For each earthquake scenario, potential surface fault displacements are estimated using deterministic and probabilistic approaches. Liquefaction occurs when saturated sediments lose their strength because of ground shaking. Zones of potential liquefaction are mapped by incorporating areas where loose sandy sediments, shallow groundwater, and strong earthquake shaking coincide in the earthquake scenario. The process for defining zones of potential landslide and rockfall incorporates rock strength, surface slope, and existing landslides, with ground motions caused by the scenario earthquake.Each scenario is illustrated with maps of seismic shaking potential and fault displacement, liquefaction, and landslide potential. Seismic shaking is depicted by the distribution of shaking intensity, peak ground acceleration, and 1.0-second spectral acceleration. One-second spectral acceleration correlates well with structural damage to surface facilities. Acceleration greater than 0.2 g is often associated with strong ground shaking and may cause moderate to heavy damage. The extent of strong shaking is influenced by subsurface fault dip and near

  4. Development of uniform hazard response spectra for rock sites considering line and point sources of earthquakes

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Kushwaha, H.S.

    2001-12-01

    Traditionally, the seismic design basis ground motion has been specified by normalised response spectral shapes and peak ground acceleration (PGA). The mean recurrence interval (MRI) used to be computed for the PGA only. It is shown that the MRI associated with such response spectra is not the same at all frequencies. The present work develops uniform hazard response spectra, i.e. spectra having the same MRI at all frequencies, for line and point sources of earthquakes by using a large number of strong motion accelerograms recorded on rock sites. The sensitivity of the results to changes in various parameters has also been presented. This work is an extension of an earlier work for areal sources of earthquakes. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)
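
    A uniform hazard spectrum can be read directly off a family of hazard curves: at each spectral period, pick the amplitude whose annual exceedance rate equals the reciprocal of the target MRI. The sketch below assumes the hazard curves have already been computed (the arrays here are placeholders) and simply interpolates them in log-log space.

        import numpy as np

        def uniform_hazard_spectrum(sa_levels, hazard_curves, target_mri_years):
            """Pick, for each period, the SA whose exceedance rate equals 1/target_mri_years."""
            target_rate = 1.0 / target_mri_years
            uhs = []
            for rates in hazard_curves:            # one exceedance-rate curve per period
                # rates decrease with SA, so reverse to get increasing x for interpolation
                log_sa = np.interp(np.log(target_rate),
                                   np.log(rates[::-1]), np.log(sa_levels[::-1]))
                uhs.append(np.exp(log_sa))
            return np.array(uhs)

        # Placeholder hazard curves for two periods at the same set of SA levels (g)
        sa_levels = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
        hazard_curves = [np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5]),   # e.g. PGA
                         np.array([3e-2, 1e-2, 3e-3, 6e-4, 8e-5])]   # e.g. 1 s SA
        print(uniform_hazard_spectrum(sa_levels, hazard_curves, target_mri_years=500))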

  5. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  6. Active tectonics of the Seattle fault and central Puget sound, Washington - Implications for earthquake hazards

    Science.gov (United States)

    Johnson, S.Y.; Dadisman, S.V.; Childs, J. R.; Stanley, W.D.

    1999-01-01

    We use an extensive network of marine high-resolution and conventional industry seismic-reflection data to constrain the location, shallow structure, and displacement rates of the Seattle fault zone and crosscutting high-angle faults in the Puget Lowland of western Washington. Analysis of seismic profiles extending 50 km across the Puget Lowland from Lake Washington to Hood Canal indicates that the west-trending Seattle fault comprises a broad (4-6 km) zone of three or more south-dipping reverse faults. Quaternary sediment has been folded and faulted along all faults in the zone, but deformation is clearly most pronounced along fault A, the northernmost fault, which forms the boundary between the Seattle uplift and Seattle basin. Analysis of growth strata deposited across fault A indicates minimum Quaternary slip rates of about 0.6 mm/yr. Slip rates across the entire zone are estimated to be 0.7-1.1 mm/yr. The Seattle fault is cut into two main segments by an active, north-trending, high-angle, strike-slip fault zone with cumulative dextral displacement of about 2.4 km. Faults in this zone truncate and warp reflections in Tertiary and Quaternary strata and locally coincide with bathymetric lineaments. Cumulative slip rates on these faults may exceed 0.2 mm/yr. Assuming no other crosscutting faults, this north-trending fault zone divides the Seattle fault into 30-40-km-long western and eastern segments. Although this geometry could limit the area ruptured in some Seattle fault earthquakes, a large event ca. A.D. 900 appears to have involved both segments. Regional seismic-hazard assessments must (1) incorporate new information on fault length, geometry, and displacement rates on the Seattle fault, and (2) consider the hazard presented by the previously unrecognized, north-trending fault zone.

  7. Protection of the human race against natural hazards (asteroids, comets, volcanoes, earthquakes)

    Science.gov (United States)

    Smith, Joseph V.

    1985-10-01

    Although we justifiably worry about the danger of nuclear war to civilization, and perhaps even to survival of the human race, we tend to consider natural hazards (e.g., comets, asteroids, volcanoes, earthquakes) as unavoidable acts of God. In any human lifetime, a truly catastrophic natural event is very unlikely, but ultimately one will occur. For the first time in human history we have sufficient technical skills to begin protection of Earth from some natural hazards. We could decide collectively throughout the world to reassign resources: in particular, reduction of nuclear and conventional weapons to a less dangerous level would allow concomitant increase of international programs for detection and prevention of natural hazards. Worldwide cooperation to mitigate natural hazards might help psychologically to lead us away from the divisive bickering that triggers wars. Future generations could hail us as pioneers of peace and safety rather than curse us as agents of death and destruction.

  8. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    Science.gov (United States)

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 per cent (1 S.D.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock, from less than 1 per cent at M ≥ 3 to 6.5 ± 2.5 per cent (1 S.D.) at M ≥ 5. The main shock will most likely occur in the first hour after the foreshock, and the probability of a main shock occurring decreases with elapsed time from the possible foreshock approximately as the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California increases the earthquake hazard within a small space-time window by several orders of magnitude above the normal background level.
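
    A back-of-the-envelope way to use these statistics is to spread the roughly 6% probability that a larger event follows within five days over time with a 1/t density, reflecting the inverse-time decay described above. The sketch below does exactly that; the short-time cutoff t_min and the five-day window are assumptions of the illustration, not values from the paper.

        import math

        def mainshock_prob_in_window(t1_hours, t2_hours, total_prob=0.06,
                                     t_min=0.1, t_max=120.0):
            """Probability that the main shock falls between t1 and t2 hours after a
            candidate foreshock, assuming a 1/t density normalized over
            (t_min, t_max) that carries the total five-day probability."""
            t1, t2 = max(t1_hours, t_min), min(t2_hours, t_max)
            if t2 <= t1:
                return 0.0
            return total_prob * math.log(t2 / t1) / math.log(t_max / t_min)

        print(mainshock_prob_in_window(0.0, 1.0))    # bulk of the risk is in the first hour
        print(mainshock_prob_in_window(24.0, 48.0))  # much smaller a day later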

  9. Pakistan’s Earthquake and Tsunami Hazards Potential Impact on Infrastructure

    Directory of Open Access Journals (Sweden)

    GEORGE PARARAS-CARAYANNIS

    2011-06-01

    Full Text Available Interaction of the Indian, Arabian and Eurasian tectonic plates has resulted in the formation of major active fault systems in South Asia. Compression along the tectonic boundaries results in thrust or reverse faulting and in zones of crustal deformation characterized by high seismic activity and continuing orogenesis. The more intense seismic activity occurs near regions of thrust faulting that are developing at the Himalayan foothills. In northern Pakistan, the Hindu Kush Mountains converge with the Karakoram Range to form a part of the Himalayan mountain system. Northern, western and southern Pakistan, Kashmir, northern India and Afghanistan lie along such zones of high seismic activity. In Pakistan, most of the earthquakes occur in the northern and western regions along the boundary of the Indian tectonic plate with the Iranian and Afghan micro-plates. The active zone extends from the Makran region in the southwest to the Hazara-Kashmir syntaxial bend in the north. Southwest Pakistan is vulnerable to both earthquake and tsunami hazards. In 2005, earthquakes devastated northern Pakistan and Kashmir and severely affected the cities of Muzaffarabad, Islamabad and Rawalpindi, causing severe destruction to the infrastructure of the northern region. A major earthquake along an extensive transform fault system in 1935 destroyed the city of Quetta and the adjoining region. A major earthquake along the northern Arabian Sea in 1945 generated a very destructive tsunami along the coasts of the Baluchistan and Sindh Provinces. The region near Karachi is vulnerable as it is located near four major faults where destructive earthquakes and tsunamis have occurred in the past. Given Pakistan’s vulnerability and extensive infrastructure development in recent years, the present study reviews briefly the earthquake and tsunami risk factors and assesses the impact that such disasters can have on the country’s critical infrastructure - which includes

  10. Reducing Vulnerability of Ports and Harbors to Earthquake and Tsunami Hazards

    Science.gov (United States)

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.

    2002-01-01

    Recent scientific research suggests the Pacific Northwest could experience catastrophic earthquakes in the near future, both from distant and local sources, posing a significant threat to coastal communities. Damage could result from numerous earthquake-related hazards, such as severe ground shaking, soil liquefaction, landslides, land subsidence/uplift, and tsunami inundation. Because of their geographic location, ports and harbors are especially vulnerable to these hazards. Ports and harbors, however, are important components of many coastal communities, supporting numerous activities critical to the local and regional economy and possibly serving as vital post-event, response-recovery transportation links. A collaborative, multi-year initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to earthquake and tsunami hazards, involving Oregon Sea Grant (OSG), Washington Sea Grant (WSG), the National Oceanic and Atmospheric Administration Coastal Services Center (CSC), and the U.S. Geological Survey Center for Science Policy (CSP). Specific products of this research, planning, and outreach initiative include a regional stakeholder issues and needs assessment, a community-based mitigation planning process, a Geographic Information System (GIS)-based vulnerability assessment methodology, an educational website, and a regional data archive. This paper summarizes these efforts, including results of two pilot port-harbor community projects, one in Yaquina Bay, Oregon, and the other in Sinclair Inlet, Washington. Finally, plans are outlined for outreach to other port and harbor communities in the Pacific Northwest and beyond, using "getting started" workshops and a web-based tutorial.

  11. Earthquake shaking hazard estimates and exposure changes in the conterminous United States

    Science.gov (United States)

    Jaiswal, Kishor S.; Petersen, Mark D.; Rukstales, Kenneth S.; Leith, William S.

    2015-01-01

    A large portion of the population of the United States lives in areas vulnerable to earthquake hazards. This investigation aims to quantify population and infrastructure exposure within the conterminous U.S. that are subjected to varying levels of earthquake ground motions by systematically analyzing the last four cycles of the U.S. Geological Survey's (USGS) National Seismic Hazard Models (published in 1996, 2002, 2008 and 2014). Using the 2013 LandScan data, we estimate the numbers of people who are exposed to potentially damaging ground motions (peak ground accelerations at or above 0.1g). At least 28 million (~9% of the total population) may experience 0.1g level of shaking at relatively frequent intervals (annual rate of 1 in 72 years or 50% probability of exceedance (PE) in 50 years), 57 million (~18% of the total population) may experience this level of shaking at moderately frequent intervals (annual rate of 1 in 475 years or 10% PE in 50 years), and 143 million (~46% of the total population) may experience such shaking at relatively infrequent intervals (annual rate of 1 in 2,475 years or 2% PE in 50 years). We also show that there is a significant number of critical infrastructure facilities located in high earthquake-hazard areas (Modified Mercalli Intensity ≥ VII with moderately frequent recurrence interval).
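
    The three shaking levels quoted above are tied together by the standard Poisson conversion between return period and probability of exceedance over an exposure time. A minimal check of those numbers:

        import math

        def return_period_from_pe(pe, exposure_years=50.0):
            """Return period (years) implied by a probability of exceedance `pe` in `exposure_years`."""
            return -exposure_years / math.log(1.0 - pe)

        def pe_from_return_period(rp_years, exposure_years=50.0):
            """Probability of exceedance in `exposure_years` for a given return period."""
            return 1.0 - math.exp(-exposure_years / rp_years)

        for pe in (0.50, 0.10, 0.02):
            print(f"{pe:.0%} in 50 yr -> return period ~ {return_period_from_pe(pe):.0f} yr")
        # prints roughly 72, 475 and 2475 years, matching the rates quoted in the abstract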

  12. Geophysical surveying in the Sacramento Delta for earthquake hazard assessment and measurement of peat thickness

    Science.gov (United States)

    Craig, M. S.; Kundariya, N.; Hayashi, K.; Srinivas, A.; Burnham, M.; Oikawa, P.

    2017-12-01

    Near-surface geophysical surveys were conducted in the Sacramento-San Joaquin Delta for earthquake hazard assessment and to provide estimates of peat thickness for use in carbon models. Delta islands have experienced 3-8 meters of subsidence during the past century due to oxidation and compaction of peat. Projected sea level rise over the next century will contribute to an ongoing landward shift of the freshwater-saltwater interface and increase the risk of flooding due to levee failure or overtopping. Seismic shear wave velocity (VS) was measured in the upper 30 meters to determine the Uniform Building Code (UBC)/National Earthquake Hazards Reduction Program (NEHRP) site class. Both seismic and ground penetrating radar (GPR) methods were employed to estimate peat thickness. Seismic surface wave surveys were conducted at eight sites on three islands and GPR surveys were conducted at two of the sites. Combined with sites surveyed in 2015, the new work brings the total number of sites surveyed in the Delta to twenty. Soil boreholes were made at several locations using a hand auger, and peat thickness ranged from 2.1 to 5.5 meters. Seismic surveys were conducted using the multichannel analysis of surface waves (MASW) method and the microtremor array method (MAM). On Bouldin Island, VS of the surficial peat layer was 32 m/s at a site with pure peat and 63 m/s at a site with peat of higher clay and silt content. Velocities at these sites reached a similar value, about 125 m/s, at a depth of 10 m. GPR surveys were performed at two sites on Sherman Island using 100 MHz antennas and indicated the base of the peat layer at a depth of about 4 meters, consistent with nearby auger holes. The results of this work include VS depth profiles and UBC/NEHRP site classifications. Seismic and GPR methods may be used in a complementary fashion to estimate peat thickness. The seismic surface wave method is relatively robust and more effective than GPR in many areas with high clay
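
    VS30 and the UBC/NEHRP site class follow directly from a layered shear-wave velocity profile: VS30 is the travel-time average over the top 30 m, and the class boundaries are the standard 180/360/760/1500 m/s thresholds. The profile used in the example below is only loosely inspired by the Bouldin Island values quoted above and is not from the survey data.

        def vs30_from_profile(thicknesses_m, velocities_ms):
            """Travel-time-averaged shear-wave velocity over the top 30 m
            (assumes the profile reaches at least 30 m depth)."""
            depth, travel_time = 0.0, 0.0
            for h, v in zip(thicknesses_m, velocities_ms):
                use = min(h, 30.0 - depth)
                if use <= 0.0:
                    break
                travel_time += use / v
                depth += use
            return 30.0 / travel_time

        def nehrp_site_class(vs30):
            """Standard NEHRP site-class boundaries in m/s."""
            if vs30 > 1500.0: return "A"
            if vs30 > 760.0:  return "B"
            if vs30 > 360.0:  return "C"
            if vs30 > 180.0:  return "D"
            return "E"

        layers_m = [5.0, 5.0, 20.0]      # assumed layer thicknesses (m)
        vs_ms    = [32.0, 63.0, 125.0]   # assumed shear-wave velocities (m/s)
        vs30 = vs30_from_profile(layers_m, vs_ms)
        print(f"VS30 ~ {vs30:.0f} m/s, site class {nehrp_site_class(vs30)}")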

  13. Approaches that use seismic hazard results to address topics of nuclear power plant seismic safety, with application to the Charleston earthquake issue

    International Nuclear Information System (INIS)

    Sewell, R.T.; McGuire, R.K.; Toro, G.R.; Stepp, J.C.; Cornell, C.A.

    1990-01-01

    Plant seismic safety indicators include seismic hazard at the SSE (safe shut-down earthquake) acceleration, seismic margin, reliability against core damage, and reliability against offsite consequences. This work examines the key role of hazard analysis in evaluating these indicators and in making rational decisions regarding plant safety. The paper outlines approaches that use seismic hazard results as a basis for plant seismic safety evaluation and applies one of these approaches to the Charleston earthquake issue. This approach compares seismic hazard results that account for the Charleston tectonic interpretation, using the EPRI-Seismicity Owners Group (SOG) methodology, with hazard results that are consistent with historical tectonic interpretations accepted in regulation. Based on hazard results for a set of 21 eastern U.S. nuclear power plant sites, the comparison shows that no systematic 'plant-to-plant' increase in hazard accompanies the Charleston hypothesis; differences in mean hazards for the two interpretations are generally insignificant relative to current uncertainties in seismic hazard. (orig.)

  14. Effects of Strike-Slip Fault Segmentation on Earthquake Energy and Seismic Hazard

    Science.gov (United States)

    Madden, E. H.; Cooke, M. L.; Savage, H. M.; McBeck, J.

    2014-12-01

    Many major strike-slip faults are segmented along strike, including those along plate boundaries in California and Turkey. Failure of distinct fault segments at depth may be the source of multiple pulses of seismic radiation observed for single earthquakes. However, how and when segmentation affects fault behavior and energy release is the basis of many outstanding questions related to the physics of faulting and seismic hazard. These include the probability for a single earthquake to rupture multiple fault segments and the effects of segmentation on earthquake magnitude, radiated seismic energy, and ground motions. Using numerical models, we quantify components of the earthquake energy budget, including the tectonic work acting externally on the system, the energy of internal rock strain, the energy required to overcome fault strength and initiate slip, the energy required to overcome frictional resistance during slip, and the radiated seismic energy. We compare the energy budgets of systems of two en echelon fault segments with various spacing that include both releasing and restraining steps. First, we allow the fault segments to fail simultaneously and capture the effects of segmentation geometry on the earthquake energy budget and on the efficiency with which applied displacement is accommodated. Assuming that higher efficiency correlates with higher probability for a single, larger earthquake, this approach has utility for assessing the seismic hazard of segmented faults. Second, we nucleate slip along a weak portion of one fault segment and let the quasi-static rupture propagate across the system. Allowing fractures to form near faults in these models shows that damage develops within releasing steps and promotes slip along the second fault, while damage develops outside of restraining steps and can prohibit slip along the second fault. Work is consumed in both the propagation of and frictional slip along these new fractures, impacting the energy available

  15. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    Science.gov (United States)

    Bostenaru Dan, M.

    2009-04-01

    This special issue includes selected papers on the topic of earthquake impact from the sessions held in 2004 in Nice, France, and in 2005 in Vienna, Austria, at the first and second European Geosciences Union General Assemblies, respectively. Since its start in 1999 in The Hague, Netherlands, the earthquake hazard has been the most popular topic of the session. The call in 2004 read: Nature's forces, including earthquakes, floods, landslides, high winds and volcanic eruptions, can inflict losses on urban settlements and man-made structures such as infrastructure. In Europe, recent years have seen significant losses from earthquakes in south and south-eastern Europe, floods in central Europe, and wind storms in western Europe. Meanwhile, significant progress has been made in understanding disasters. Several scientific fields contribute to a holistic approach in the evaluation of capacities, vulnerabilities and hazards, the main factors in mitigating urban disasters due to natural hazards. An important part of the session is devoted to the assessment of earthquake shaking and loss scenarios, including both physical damage and human casualties. Early warning and rapid damage evaluation are of utmost importance for addressing the safety of many essential facilities, for emergency management of events and for disaster response. When an earthquake occurs, strong motion networks, data processing and interpretation lead to preliminary estimates (scenarios) of the geographical distribution of damage. Factual information on inflicted damage, such as that obtained from shaking maps or aerial imagery, permits comparison with simulated damage maps in order to define a more accurate picture of the overall losses. Most recent developments towards quantitative and qualitative simulation of natural hazard impacts on urban areas, which provide decision-making support for urban disaster management, and success stories of and lessons learned from disaster

  16. Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3

    Science.gov (United States)

    Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.

    2017-12-01

    Measuring and predicting ground motion parameters, including seismic intensities, for earthquakes is crucial and is the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it in the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED, and Fourier, power and response spectra. GMPEs are configurable. Supporting C++ and Python plug-ins, standard and customized GMPEs, including those from the OpenQuake Hazard Library, can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong motion seismology.
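
    Several of the parameters listed above are simple integrals of the recorded acceleration trace. As a rough sketch (not SIGMA's implementation), PGA, a crude PGV, and Arias intensity can be computed from a record like this:

        import numpy as np

        G = 9.81  # gravitational acceleration, m/s^2

        def strong_motion_params(acc_ms2, dt):
            """PGA, a crude PGV, and Arias intensity from an acceleration record;
            real processing would filter and baseline-correct the trace first."""
            acc = np.asarray(acc_ms2, dtype=float)
            pga = float(np.max(np.abs(acc)))
            vel = np.cumsum(acc) * dt                             # rough time integration
            pgv = float(np.max(np.abs(vel)))
            arias = float(np.pi / (2.0 * G) * np.sum(acc ** 2) * dt)
            return pga, pgv, arias

        # Synthetic example record: a decaying 2 Hz oscillation sampled at 100 Hz
        t = np.arange(0.0, 20.0, 0.01)
        acc = 1.5 * np.exp(-0.3 * t) * np.sin(2.0 * np.pi * 2.0 * t)
        print(strong_motion_params(acc, dt=0.01))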

  17. 2017 One‐year seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes

    Science.gov (United States)

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert; Llenos, Andrea L.; Ellsworth, William L.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2017-01-01

    We produce a one‐year 2017 seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one‐year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic‐hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma–Kansas, the Raton basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 moment magnitude (M) ≥4 and 3 M≥5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma–Kansas focus area, two earthquakes with M≥4 occurred near Trinidad, Colorado (in the Raton basin focus area), but no earthquakes with M≥2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground‐shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared with 2015, which may be related to decreased wastewater injection caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.

  18. The analysis of historical seismograms: an important tool for seismic hazard assessment. Case histories from French and Italian earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Pino, N.A. [Istituto Nazionale di Geofisica e Vulcanologia, Osservatorio Vesuviano, Via Diocleziano 328, 80124 Napoli (Italy)

    2011-06-15

    Seismic hazard assessment relies on knowledge of the source characteristics of past earthquakes. Unfortunately, seismic waveform analysis, the most powerful tool for the investigation of earthquake source parameters, is only possible for events that occurred in the last 100-120 years, i.e., since seismographs with known response functions were developed. Nevertheless, during this time significant earthquakes have been recorded by such instruments, and today, thanks in part to technological progress, these data can be recovered and analysed by means of modern techniques. In this paper, aiming to give a general sketch of the possible analyses and attainable results in historical seismogram studies, I briefly describe the major difficulties in processing the original waveforms and present a review of the results that I obtained from previous seismogram analyses of selected significant historical earthquakes that occurred during the first decades of the 20th century, including (A) the December 28, 1908, Messina Straits (southern Italy), (B) the June 11, 1909, Lambesc (southern France) - both of which are the strongest ever recorded instrumentally in their respective countries - and (C) the July 13, 1930, Irpinia (southern Italy) events. For these earthquakes, the major achievements are the assessment of the seismic moment (A, B, C), the geometry and kinematics of faulting (B, C), and the fault length and an approximate slip distribution (A, C). The source characteristics of the studied events have also been interpreted in the frame of the tectonic environment active in the respective region of interest. In spite of the difficulties inherent in the investigation of old seismic data, these results demonstrate the invaluable and irreplaceable role of historical seismogram analysis in defining the local seismogenic potential and, ultimately, in assessing the seismic hazard. The retrieved information is crucial in areas where important civil engineering works

  19. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  20. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau

    Science.gov (United States)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail

    2018-02-01

    Sabah state in eastern Malaysia, unlike most other Malaysian states, is characterized by common seismological activity; an earthquake of moderate magnitude is generally experienced roughly every 20 years, originating mainly from two major sources, either a local source (e.g. Ranau and Lahad Datu) or a regional source (e.g. the Kalimantan and South Philippines subductions). The seismicity map of Sabah shows the presence of two zones of distinctive seismicity; these zones are near Ranau (near Kota Kinabalu) and Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Centre), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area. Hence, the seismicity of the area is modeled as line sources referring to these faults. Two main fault systems are believed to be the source of such activity, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become a very important and needed study for the extensive development projects in Sabah, especially given the presence of earthquake activity. A probabilistic seismic hazard assessment is adopted for the present work since it can provide the probability of various ground motion levels expected from future large earthquakes. The output results are presented in terms of spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first time that a complete hazard study has been done for the area, the output will be a base and standard for any future strategic plans in the area.

  1. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    Science.gov (United States)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-09-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures has unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a physics-based approach is ground-motion synthesis derived from physics and an understanding of the earthquake process. This is an overview paper, and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction

  2. Semi-automated landform classification for hazard mapping of soil liquefaction by earthquake

    Science.gov (United States)

    Nakano, Takayuki

    2018-05-01

    Soil liquefaction damage was caused by huge earthquakes in Japan, and similar damage is a concern for future huge earthquakes. On the other hand, the preparation of soil liquefaction risk maps (soil liquefaction hazard maps) is impeded by the difficulty of evaluating soil liquefaction risk. Generally, relative soil liquefaction risk should be able to be evaluated from landform classification data by using empirical rules based on the relationship between the extent of soil liquefaction damage and landform classification items associated with past earthquakes. Therefore, I rearranged the relationship between landform classification items and soil liquefaction risk intelligibly in order to enable the evaluation of soil liquefaction risk based on landform classification data appropriately and efficiently. I also developed a new method of generating landform classification data of 50-m grid size from existing landform classification data of 250-m grid size by using digital elevation model (DEM) data and multi-band satellite image data in order to evaluate soil liquefaction risk in spatial detail. It is expected that the products of this study will contribute to the efficient production of soil liquefaction hazard maps by local governments.

  3. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    Science.gov (United States)

    Nyst, M.

    2012-12-01

    The recent Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch, New Zealand, earthquakes and the following aftershock activity completely changed the existing view on earthquake hazard of the Christchurch area. Not only have several faults been added to the New Zealand fault database, the main shocks were also followed by significant increases in seismicity due to high aftershock activity throughout the Christchurch region that is still ongoing. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set, the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to look at their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return-period exceedance benchmark, typically between the 100- and 250-year return-period loss level, and then reserve the amount of money needed to cover that return-period loss level, their so-called capacity. This component of risk management is not too sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because the secondary uncertainties are taken into account when calculating the exceedance probability, even the longer return-period losses can still experience significant impact from the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate for all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting the annual premiums. By annualizing the expected losses
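
    The AAL and exceedance-curve definitions in the abstract map directly onto a small event-loss-table calculation. The sketch below assumes a toy table of event losses and annual rates (the numbers are placeholders, not output of any New Zealand model):

        import numpy as np

        def aal_and_ep_curve(losses, annual_rates):
            """Average annual loss and a simple annual exceedance-probability curve
            from an event loss table (one modeled loss and one annual rate per event)."""
            losses = np.asarray(losses, dtype=float)
            rates = np.asarray(annual_rates, dtype=float)
            aal = float(np.sum(losses * rates))            # sum of loss x annual rate
            order = np.argsort(losses)[::-1]               # events sorted by loss, descending
            exceed_rate = np.cumsum(rates[order])          # annual rate of exceeding each loss
            exceed_prob = 1.0 - np.exp(-exceed_rate)       # Poisson annual exceedance probability
            return aal, losses[order], exceed_prob

        losses = [5e6, 2e7, 1e8, 5e8]        # placeholder event losses
        rates = [1e-2, 4e-3, 1e-3, 2e-4]     # placeholder annual rates
        aal, loss_levels, ep = aal_and_ep_curve(losses, rates)
        print(f"AAL ~ {aal:,.0f}")
        for loss, p in zip(loss_levels, ep):
            print(f"loss >= {loss:,.0f} with annual probability {p:.4f}")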

  4. Earthquake hazard assessment in the Zagros Orogenic Belt of Iran using a fuzzy rule-based model

    Science.gov (United States)

    Farahi Ghasre Aboonasr, Sedigheh; Zamani, Ahmad; Razavipour, Fatemeh; Boostani, Reza

    2017-08-01

    Producing accurate seismic hazard maps and predicting hazardous areas is necessary for risk mitigation strategies. In this paper, a fuzzy logic inference system is utilized to estimate the earthquake potential and seismic zoning of the Zagros Orogenic Belt. In addition to their interpretability, fuzzy predictors can capture both nonlinearity and chaotic behavior of data where the number of data points is limited. In this paper, the earthquake pattern in the Zagros has been assessed for intervals of 10 and 50 years using a fuzzy rule-based model. The Molchan statistical procedure has been used to show that our forecasting model is reliable. The earthquake hazard maps for this area reveal some remarkable features that cannot be observed on conventional maps. Regarding our achievements, some areas in the southern (Bandar Abbas), southwestern (Bandar Kangan) and western (Kermanshah) parts of Iran display high earthquake severity even though they are geographically far apart.

  5. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution as it approaches different distributions.

  6. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    Science.gov (United States)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The almost-real time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 Project entitled "Network of Research Infra-structures for European Seismology, NERIES". This project consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters on the basis of rapid inversion of data from on-line regional broadband stations. It also includes an estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. By using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships.

  7. Space-time behavior of continental intraplate earthquakes and implications for hazard assessment in China and the Central U.S.

    Science.gov (United States)

    Stein, Seth; Liu, Mian; Luo, Gang; Wang, Hui

    2014-05-01

    much faster than it accumulates today, suggesting that they result from recent fault activation that releases prestored strain energy in the crust. If so, this earthquake sequence is similar to aftershocks in that the rates of energy release should decay with time and the sequence of earthquakes will eventually end. We use simple physical analysis and numerical simulations to show that the current New Madrid earthquake sequence is likely ending or has ended. Recognizing that mid-continental earthquakes have long aftershock sequences and complex spatiotemporal occurrences is critical to improve hazard assessments

  8. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a version for large mainframe computers. A personal computer version was provided in 2001 to improve the operating efficiency and generality of the code. Earthquake hazard analysis, display, and print functions can be performed through the graphical user interface. With the SHEAT for PC code, seismic hazard, which is defined as the annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site, is calculated by the following two steps, as is done with the large-computer version. One is the modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on historical earthquake records, active fault data and expert judgment. The other is the calculation of the probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model that takes into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions from all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, including the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) the functions of the subprograms and the analytical models in them, (3) guidance on input and output data, (4) a sample run result, and (5) an operational manual. (author)
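
    The two-step logic described above (an earthquake recurrence model plus an attenuation relation with scatter, summed over all postulated events) can be sketched in a few lines. The recurrence and attenuation coefficients below are illustrative placeholders, not SHEAT inputs:

        import math

        def gr_bin_rates(mmin, mmax, a_value, b_value, dm=0.1):
            """Annual rates of earthquakes in magnitude bins from a truncated
            Gutenberg-Richter relation log10 N(>=M) = a - b*M."""
            mags, rates = [], []
            m = mmin
            while m < mmax - 1e-9:
                n_lo = 10.0 ** (a_value - b_value * m)
                n_hi = 10.0 ** (a_value - b_value * min(m + dm, mmax))
                mags.append(m + dm / 2.0)
                rates.append(n_lo - n_hi)
                m += dm
            return mags, rates

        def median_ln_pga(mag, dist_km):
            """Toy attenuation relation (illustrative coefficients only)."""
            return -1.0 + 0.6 * mag - 1.2 * math.log(dist_km + 10.0)

        def hazard_curve(pga_levels_g, dist_km, mmin=5.0, mmax=7.5,
                         a_value=3.0, b_value=1.0, sigma=0.6):
            """Annual frequency of exceeding each PGA level from one point source,
            treating ground motion as lognormal about the attenuation median."""
            mags, rates = gr_bin_rates(mmin, mmax, a_value, b_value)
            curve = []
            for level in pga_levels_g:
                freq = 0.0
                for m, r in zip(mags, rates):
                    z = (math.log(level) - median_ln_pga(m, dist_km)) / (sigma * math.sqrt(2.0))
                    freq += r * 0.5 * math.erfc(z)   # P(ground motion exceeds this level)
                curve.append(freq)
            return curve

        print(hazard_curve([0.05, 0.1, 0.2, 0.4], dist_km=30.0))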

  9. 14 CFR 437.55 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  10. Geological Deformations and Potential Hazards Triggered by the 01-12-2010 Haiti Earthquake: Insights from Google Earth Imagery

    Science.gov (United States)

    Doblas, M.; Benito, B.; Torres, Y.; Belizaire, D.; Dorfeuille, J.; Aretxabala, A.

    2013-05-01

    In this study we compare the different Google Earth imagery (GEI) available before and after the 01-12-2010 earthquake of Haiti and carry out a detailed analysis of the superficial seismic-related geological deformations in the following sites: 1) the capital Port-Au-Prince and other cities (Carrefour and Gresslier); 2) the mountainous area of the Massif de la Selle, which is transected by the "Enriquillo-Plaintain-Garden" (EPG) interplate boundary fault (which supposedly triggered the earthquake); 3) some of the most important river channels and their corresponding deltas (Momanche, Grise and Frorse). The initial results of our research were published in March 2010 in a special web page created by the scientific community to try to mitigate the devastating effects of this catastrophe (http://supersites.earthobservations.org/haiti.php). Six types of superficial geological deformations triggered by the seismic event have been identified with the GEI: liquefaction structures, chaotic rupture zones, coastal and domal uplifts, river-delta turnovers, faults/ruptures, and landslides. Potential geological hazards triggered by the Haiti earthquake include landslides, inundations, reactivation of active tectonic elements (e.g., fractures), river-delta turnovers, etc. We analyzed the GEI again after the rainy period and, as expected, most of the geological deformations that we initially identified had been erased and/or modified by water washout or buried by sediments. In this sense the GEI constitutes an invaluable instrument in the analysis of seismic geological hazards: we still have the possibility of comparing all the images before and after the earthquake that are recorded in its useful "time tool". These are in fact the only witnesses of most of the geological deformations triggered by the Haiti earthquake that remain stored in the virtual archives of the GEI. In fact, a field trip to the area today would be useless as most of these structures have disappeared. We will show

  11. Development of direct multi-hazard susceptibility assessment method for post-earthquake reconstruction planning in Nepal

    Science.gov (United States)

    Mavrouli, Olga; Rana, Sohel; van Westen, Cees; Zhang, Jianqiang

    2017-04-01

    After the devastating 2015 Gorkha earthquake in Nepal, reconstruction activities have been delayed considerably, due to many reasons of a political, organizational and technical nature. Due to the widespread occurrence of co-seismic landslides, and the expectation that these may be aggravated or re-activated in future years during the intense monsoon periods, there is a need to evaluate for thousands of sites whether they are suited for reconstruction. In this evaluation multi-hazards, such as rockfall, landslides, debris flows, and flash floods, should be taken into account. The application of indirect knowledge-based, data-driven or physically-based approaches is not suitable for several reasons. Physically-based models generally require a large number of parameters for which data are not available. Data-driven, statistical methods depend on historical information, which is less useful after the occurrence of a major event such as an earthquake. Besides, they would lead to unacceptable levels of generalization, as the analysis is done based on rather general causal factor maps. The same holds for indirect knowledge-driven methods. However, location-specific hazard analysis is required using a simple method that can be used by many people at the local level. In this research, a direct scientific method was developed in which local-level technical people can easily and quickly assess the post-earthquake multi-hazards following a decision-tree approach, using an app on a smartphone or tablet. The method assumes that a central organization, such as the Department of Soil Conservation and Watershed Management, generates spatial information beforehand that is used in the direct assessment at a certain location. Pre-earthquake, co-seismic and post-seismic landslide inventories are generated through the interpretation of Google Earth multi-temporal images, using anaglyph methods. Spatial data, such as digital elevation models, land cover maps, and geological maps are

  12. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  13. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    International Nuclear Information System (INIS)

    Garrett, R.J.

    2005-01-01

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified

  14. Assessing the Utility of and Improving USGS Earthquake Hazards Program Products

    Science.gov (United States)

    Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.

    2010-12-01

    A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products and information, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities as well as participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and information is widely used by media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey more clearly the impact of earthquakes, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.

  15. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2005-12-01

    The first stage of the development of a program to compute probabilistic seismic hazard has been completed, based on a graphical user interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The first part has been developed and the others are under development in this term. The probabilistic seismic hazard analysis needs various input data which represent attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedures of previous programs, based on a text interface, take much time to prepare the data. In existing methods the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial errors as far as possible

  16. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard has been completed, based on a graphical user interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The probabilistic seismic hazard analysis needs various input data which represent attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedures of previous programs, based on a text interface, take much time to prepare the data. In existing methods the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial errors as far as possible

  17. Network similarity and statistical analysis of earthquake seismic data

    OpenAIRE

    Deyasi, Krishanu; Chakraborty, Abhijit; Banerjee, Anirban

    2016-01-01

    We study the structural similarity of earthquake networks constructed from seismic catalogs of different geographical regions. A hierarchical clustering of underlying undirected earthquake networks is shown using Jensen-Shannon divergence in graph spectra. The directed nature of links indicates that each earthquake network is strongly connected, which motivates us to study the directed version statistically. Our statistical analysis of each earthquake region identifies the hub regions. We cal...
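
    The comparison of earthquake networks rests on a standard construction: turn each network's spectrum into a probability distribution and measure the Jensen-Shannon divergence between distributions. A minimal sketch of that step is given below, using the normalized adjacency spectrum; the paper's exact construction may differ.

        import numpy as np

        def spectral_distribution(adj, bins=50):
            """Histogram of normalized-adjacency eigenvalues as a probability distribution."""
            adj = np.asarray(adj, dtype=float)
            deg = adj.sum(axis=1)
            d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
            eigvals = np.linalg.eigvalsh(d_inv_sqrt @ adj @ d_inv_sqrt)
            hist, _ = np.histogram(eigvals, bins=bins, range=(-1.0, 1.0))
            return hist / max(hist.sum(), 1)

        def js_divergence(p, q, eps=1e-12):
            """Jensen-Shannon divergence between two discrete distributions."""
            p = (np.asarray(p, dtype=float) + eps)
            q = (np.asarray(q, dtype=float) + eps)
            p, q = p / p.sum(), q / q.sum()
            m = 0.5 * (p + q)
            kl = lambda x, y: float(np.sum(x * np.log(x / y)))
            return 0.5 * kl(p, m) + 0.5 * kl(q, m)

        # Two small placeholder adjacency matrices standing in for regional earthquake networks
        net_a = np.array([[0, 1, 1, 0], [1, 0, 1, 1], [1, 1, 0, 0], [0, 1, 0, 0]])
        net_b = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]])
        print(js_divergence(spectral_distribution(net_a), spectral_distribution(net_b)))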

  18. 21 CFR 120.7 - Hazard analysis.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food...

  19. Earthquake risk reduction in the United States: An assessment of selected user needs and recommendations for the National Earthquake Hazards Reduction Program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This Assessment was conducted to improve the National Earthquake Hazards Reduction Program (NEHRP) by providing NEHRP agencies with information that supports their user-oriented setting of crosscutting priorities in the NEHRP strategic planning process. The primary objective of this Assessment was to take a "snapshot" evaluation of the needs of selected users throughout the major program elements of NEHRP. Secondary objectives were to conduct an assessment of the knowledge that exists (or is being developed by NEHRP) to support earthquake risk reduction, and to begin a process of evaluating how NEHRP is meeting user needs. An identification of NEHRP's strengths also resulted from the effort, since those strengths demonstrate successful methods that may be useful to NEHRP in the future. These strengths are identified in the text, and many of them represent important achievements since the Earthquake Hazards Reduction Act was passed in 1977.

  20. Estimate of airborne release of plutonium from Babcock and Wilcox plant as a result of severe wind hazard and earthquake

    International Nuclear Information System (INIS)

    Mishima, J.; Schwendiman, L.C.; Ayer, J.E.

    1978-10-01

    As part of an interdisciplinary study to evaluate the potential radiological consequences of wind hazard and earthquake upon existing commercial mixed oxide fuel fabrication plants, the potential mass airborne releases of plutonium (source terms) from such events are estimated. The estimated source terms are based upon the fraction of enclosures damaged to three levels of severity (crush, puncture/penetrate, and loss of external filter, in order of decreasing severity), called the damage ratio, and the airborne release if all enclosures suffered that level of damage. The discussion of damage scenarios and source terms is divided into wind hazard and earthquake scenarios in order of increasing severity. The largest airborne releases from the building were for cases involving the catastrophic collapse of the roof over the major production areas--wind hazard at 110 mph and earthquakes with peak ground accelerations of 0.20 to 0.29 g. Wind hazards at higher air velocities and earthquakes with higher ground acceleration do not result in significantly greater source terms. The source terms were calculated as additional mass of respirable particles released with time up to 4 days; and, under these assumptions, approximately 98% of the mass of material of concern is made airborne from 2 h to 4 days after the event. The overall building source terms from the damage scenarios evaluated are shown in a table. The contribution of individual areas to the overall building source term is presented in order of increasing severity for wind hazard and earthquake

  1. Coulomb Failure Stress Accumulation in Nepal After the 2015 Mw 7.8 Gorkha Earthquake: Testing Earthquake Triggering Hypothesis and Evaluating Seismic Hazards

    Science.gov (United States)

    Xiong, N.; Niu, F.

    2017-12-01

    A Mw 7.8 earthquake struck Gorkha, Nepal, on April 25, 2015, causing more than 8000 deaths and leaving 3.5 million people homeless. The earthquake initiated 70 km west of Kathmandu and propagated eastward, rupturing an area of approximately 150 km by 60 km. However, the earthquake failed to fully rupture the locked fault beneath the Himalaya, suggesting that the regions south of Kathmandu and west of the current rupture are still locked and a much more powerful earthquake might occur in the future. Therefore, the seismic hazard of the unruptured region is of great concern. In this study, we investigated the Coulomb failure stress (CFS) accumulation on the unruptured fault transferred by the Gorkha earthquake and some nearby historical great earthquakes. First, we calculated the co-seismic CFS changes of the Gorkha earthquake on the nodal planes of 16 large aftershocks to examine quantitatively whether they were brought closer to failure by the mainshock. At least 12 of the 16 aftershocks were encouraged by an increase in CFS of 0.1-3 MPa. The correspondence between the distribution of off-fault aftershocks and the pattern of increased CFS also validates the applicability of the earthquake triggering hypothesis in the thrust regime of Nepal. With this validation in hand, we calculated the co-seismic CFS change on the locked region imparted by the Gorkha earthquake and historical great earthquakes. A newly proposed ramp-flat-ramp-flat fault geometry model was employed, and the source parameters of historical earthquakes were computed with an empirical scaling relationship. A broad region south of Kathmandu and west of the current rupture was shown to be positively stressed, with CFS changes roughly ranging between 0.01 and 0.5 MPa. The maximum CFS increase (>1 MPa) was found in the updip segment south of the current rupture, implying a high seismic hazard. Since the locked region may be additionally stressed by the post-seismic relaxation of the lower
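
    The quantity at the centre of this record is the static Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn, i.e. the shear-stress change resolved in the receiver fault's slip direction plus an effective friction coefficient times the normal-stress change (unclamping positive). A minimal sketch of that bookkeeping follows; the friction value, stress values and 0.01 MPa triggering threshold are common illustrative choices, not numbers taken from the study.

    ```python
    import numpy as np

    def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
        """
        Static Coulomb failure stress change (MPa).
        d_tau     : shear-stress change resolved onto the receiver plane,
                    positive in the slip direction (MPa)
        d_sigma_n : normal-stress change, positive = unclamping (MPa)
        mu_eff    : effective friction coefficient (0.4 is a common assumption)
        """
        return d_tau + mu_eff * d_sigma_n

    # Illustrative stress changes resolved on the nodal planes of four aftershocks
    d_tau = np.array([0.8, -0.1, 1.5, 0.3])       # MPa
    d_sigma_n = np.array([0.5, 0.2, -0.4, 0.1])   # MPa
    dcfs = coulomb_stress_change(d_tau, d_sigma_n)
    print(dcfs, dcfs >= 0.01)   # 0.01 MPa is a commonly cited triggering threshold
    ```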

  2. Summary of November 2010 meeting to evaluate turbidite data for constraining the recurrence parameters of great Cascadia earthquakes for the update of national seismic hazard maps

    Science.gov (United States)

    Frankel, Arthur D.

    2011-01-01

    , 1996), which were the basis for seismic provisions in the 2000 International Building Code. These hazard maps used the paleoseismic studies to constrain the recurrence rate of great CSZ earthquakes. Goldfinger and his colleagues have since collected many more deep ocean cores and done extensive analysis on the turbidite deposits that they identified in the cores (Goldfinger and others, 2003, 2008, in press; Goldfinger, 2011). Using their dating of the sediments and correlation of features in the logs of density and magnetic susceptibility between cores, they developed a detailed chronology of great earthquakes along the CSZ for the past 10,000 years (Goldfinger and others, in press). These correlations consist of attempting to match the peaks and valleys in logs of density and magnetic susceptibility between cores separated, in some cases, by hundreds of kilometers. Based on this work, Goldfinger and others (2003, 2008, in press) proposed that the turbidite evidence indicated the occurrence of great earthquakes (Mw 8) that only ruptured the southern portion of the CSZ, as well as earthquakes with about Mw 9 that ruptured the entire length of the CSZ. For the southernmost portion of the CSZ, Goldfinger and others (in press) proposed a recurrence time of Mw 8 or larger earthquakes of about 230 years. This proposed recurrence time was shorter than the 500 year time that was incorporated in one scenario in the NSHM’s. It is important to note that the hazard maps of 1996 and later also included a scenario or set of scenarios with a shorter recurrence time for Mw 8 earthquakes, using rupture zones that are distributed along the length of the CSZ (Frankel and others, 1996; Petersen and others, 2008). Originally, this scenario was meant to correspond to the idea that some of the 500-year averaged ruptures seen in the paleoseismic evidence could have been a series of Mw 8 earthquakes that occurred over a short period of time (a few decades), rather than Mw 9 earthquakes

  3. Seismological and geological investigation for earthquake hazard in the Greater Accra Metropolitan Area

    International Nuclear Information System (INIS)

    Doku, M. S.

    2013-07-01

    A seismological and geological investigation for earthquake hazard in the Greater Accra Metropolitan Area (GAMA) was undertaken. The research aimed at employing a mathematical model to estimate the seismic stress for the study area by generating a complete, unified and harmonized earthquake catalogue spanning 1615 to 2012. Seismic events were sourced from Leydecker and Amponsah (1986), Ambraseys and Adams (1986), Amponsah (2008), the Geological Survey Department, Accra, Ghana, Amponsah (2002), the National Earthquake Information Service, United States Geological Survey, Denver, Colorado 80225, USA, the International Seismological Centre and the National Data Centre of the Ghana Atomic Energy Commission. Events occurring in the study area were used to create an epicentral intensity map and a seismicity map of the study area after interpolation of missing seismic magnitudes. The least squares method and the maximum likelihood estimation method were employed to evaluate b-values of 0.6 and 0.9, respectively, for the study area. A thematic map of epicentral intensity superimposed on the geology of the study area was also developed to help understand the relationship between the fractured, jointed and sheared geology and the seismic events. The results indicate that the stress level of GAMA has a telling effect on its seismicity and that events are prevalent in fractured, jointed and sheared zones. (au)
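
    The two b-values quoted above (0.6 from least squares, 0.9 from maximum likelihood) are estimates of the slope in the Gutenberg-Richter relation log10 N = a - bM. A minimal sketch of both estimators is given below, assuming a homogenized magnitude list and a completeness magnitude Mc; the synthetic catalogue and the 0.1-unit binning are illustrative assumptions, not the Accra data.

    ```python
    import numpy as np

    def gr_least_squares(mags, mc, dm=0.1):
        """Fit log10(N >= M) = a - b*M by least squares above completeness Mc."""
        m = np.asarray(mags)
        m = m[m >= mc]
        bins = np.arange(mc, m.max() + dm, dm)
        n_cum = np.array([(m >= b).sum() for b in bins])
        keep = n_cum > 0
        slope, intercept = np.polyfit(bins[keep], np.log10(n_cum[keep]), 1)
        return -slope, intercept          # b, a

    def gr_max_likelihood(mags, mc, dm=0.1):
        """Aki (1965) / Utsu maximum-likelihood b-value with binning correction."""
        m = np.asarray(mags)
        m = m[m >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    # Synthetic catalogue: Gutenberg-Richter with b ~ 1, magnitudes reported in 0.1 bins
    rng = np.random.default_rng(0)
    mags = np.round(3.95 + rng.exponential(scale=1.0 / np.log(10), size=5000), 1)
    b_ls, a_ls = gr_least_squares(mags, mc=4.0)
    b_ml = gr_max_likelihood(mags, mc=4.0)
    print(f"b (least squares) = {b_ls:.2f}, b (max. likelihood) = {b_ml:.2f}")
    ```

    The least-squares fit to the cumulative curve is known to be biased by the correlated counts, which is one reason the two estimators in the record disagree; the maximum-likelihood value is usually preferred.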

  4. Canister storage building hazard analysis report

    International Nuclear Information System (INIS)

    POWERS, T.B.

    1999-01-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports", and meets the intent of HNF-PRO-704, "Hazard and Accident Analysis Process". This hazard analysis implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports".

  5. EARTHQUAKE INDUCED LIQUEFACTION ANALYSIS OF

    African Journals Online (AJOL)

    liquefaction analysis of Tendaho earth-fill dam, which is part ... sugar cane plantation in an area of 60,000 hectares. The project .... The model is prepared using the QUAKE/W program for the ..... Geo-slope International, Ltd., Canada. Dynamic ...

  6. Evaluating earthquake hazards in the Los Angeles region; an earth-science perspective

    Science.gov (United States)

    Ziony, Joseph I.

    1985-01-01

    Potentially destructive earthquakes are inevitable in the Los Angeles region of California, but hazards prediction can provide a basis for reducing damage and loss. This volume identifies the principal geologically controlled earthquake hazards of the region (surface faulting, strong shaking, ground failure, and tsunamis), summarizes methods for characterizing their extent and severity, and suggests opportunities for their reduction. Two systems of active faults generate earthquakes in the Los Angeles region: northwest-trending, chiefly horizontal-slip faults, such as the San Andreas, and west-trending, chiefly vertical-slip faults, such as those of the Transverse Ranges. Faults in these two systems have produced more than 40 damaging earthquakes since 1800. Ninety-five faults have slipped in late Quaternary time (approximately the past 750,000 yr) and are judged capable of generating future moderate to large earthquakes and displacing the ground surface. Average rates of late Quaternary slip or separation along these faults provide an index of their relative activity. The San Andreas and San Jacinto faults have slip rates measured in tens of millimeters per year, but most other faults have rates of about 1 mm/yr or less. Intermediate rates of as much as 6 mm/yr characterize a belt of Transverse Ranges faults that extends from near Santa Barbara to near San Bernardino. The dimensions of late Quaternary faults provide a basis for estimating the maximum sizes of likely future earthquakes in the Los Angeles region: moment magnitude (M) 8 for the San Andreas, M 7 for the other northwest-trending elements of that fault system, and M 7.5 for the Transverse Ranges faults. Geologic and seismologic evidence along these faults, however, suggests that, for planning and designing noncritical facilities, appropriate sizes would be M 8 for the San Andreas, M 7 for the San Jacinto, M 6.5 for other northwest-trending faults, and M 6.5 to 7 for the Transverse Ranges faults. The

  7. The 2012 Ferrara seismic sequence: Regional crustal structure, earthquake sources, and seismic hazard

    Science.gov (United States)

    Malagnini, Luca; Herrmann, Robert B.; Munafò, Irene; Buttinelli, Mauro; Anselmi, Mario; Akinci, Aybige; Boschi, E.

    2012-10-01

    Inadequate seismic design codes can be dangerous, particularly when they underestimate the true hazard. In this study we use data from a sequence of moderate-sized earthquakes in northeast Italy to validate and test a regional wave propagation model which, in turn, is used to understand some weaknesses of the current design spectra. Our velocity model, while regionalized and somewhat ad hoc, is consistent with geophysical observations and the local geology. In the 0.02-0.1 Hz band, this model is validated by using it to calculate moment tensor solutions of 20 earthquakes (5.6 ≥ MW ≥ 3.2) in the 2012 Ferrara, Italy, seismic sequence. The seismic spectra observed for the relatively small main shock significantly exceeded the design spectra to be used in the area for critical structures. Observations and synthetics reveal that the ground motions are dominated by long-duration surface waves, which, apparently, the design codes do not adequately anticipate. In light of our results, the present seismic hazard assessment in the entire Pianura Padana, including the city of Milan, needs to be re-evaluated.

  8. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    Science.gov (United States)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
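
    The logic tree described above weights alternative source zonings and ground-motion models and combines their hazard curves. A minimal sketch of that combination step follows, assuming every branch has already produced an annual exceedance-rate curve on a common PGA grid; the weights and curve values are placeholders, not the Murcia results.

    ```python
    import numpy as np

    # Common PGA grid (g) at which every logic-tree branch evaluates the hazard
    pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5])

    # Annual exceedance-rate curves for each branch (placeholder values)
    branch_curves = np.array([
        [2e-2, 8e-3, 2e-3, 8e-4, 2e-4],
        [3e-2, 1e-2, 3e-3, 1e-3, 3e-4],
        [1e-2, 5e-3, 1e-3, 5e-4, 1e-4],
    ])
    branch_weights = np.array([0.4, 0.4, 0.2])   # branch weights must sum to 1

    mean_rate = branch_weights @ branch_curves   # weighted-mean hazard curve
    poe_475 = 1.0 - np.exp(-mean_rate * 475.0)   # probability of exceedance in 475 yr
    # the 475-year return period corresponds to ~10% probability of exceedance in 50 years
    print(np.c_[pga, mean_rate, poe_475])
    ```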

  9. Sensitivity Analysis of Evacuation Speed in Hypothetical NPP Accident by Earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Effective emergency response in a nuclear power plant (NPP) emergency can change the consequences significantly, so it is regarded as important when establishing an emergency response plan and assessing the risk of a hypothetical NPP accident. The emergency response situation can change completely when the NPP accident is caused by an earthquake or tsunami, owing to the failure of roads and buildings in the disaster. Among the various factors involved, this study focuses on evacuation speed and investigates reasonable evacuation speeds for an earthquake scenario. A sensitivity analysis of evacuation speed in a hypothetical NPP accident caused by an earthquake has then been performed. Various references were investigated, and an earthquake evacuation model has been developed in which evacuees may switch from vehicles to walking when intense traffic jams or the failure of buildings and roads make driving difficult. The population dose within 5 km / 30 km was found to increase in the earthquake situation because of the decreased evacuation speed, rising by a factor of 1.5-2 in the severest earthquake evacuation scenario set up in this study. It is therefore not appropriate to use the same emergency response model as for normal evacuation situations when performing a Level 3 probabilistic safety assessment for earthquake and tsunami events. The investigation of data and the sensitivity analysis needed to construct a differentiated emergency response model for seismic hazard events have been carried out in this study.

  10. Sensitivity Analysis of Evacuation Speed in Hypothetical NPP Accident by Earthquake

    International Nuclear Information System (INIS)

    Kim, Sung-yeop; Lim, Ho-Gon

    2016-01-01

    Effective emergency response in a nuclear power plant (NPP) emergency can change the consequences significantly, so it is regarded as important when establishing an emergency response plan and assessing the risk of a hypothetical NPP accident. The emergency response situation can change completely when the NPP accident is caused by an earthquake or tsunami, owing to the failure of roads and buildings in the disaster. Among the various factors involved, this study focuses on evacuation speed and investigates reasonable evacuation speeds for an earthquake scenario. A sensitivity analysis of evacuation speed in a hypothetical NPP accident caused by an earthquake has then been performed. Various references were investigated, and an earthquake evacuation model has been developed in which evacuees may switch from vehicles to walking when intense traffic jams or the failure of buildings and roads make driving difficult. The population dose within 5 km / 30 km was found to increase in the earthquake situation because of the decreased evacuation speed, rising by a factor of 1.5-2 in the severest earthquake evacuation scenario set up in this study. It is therefore not appropriate to use the same emergency response model as for normal evacuation situations when performing a Level 3 probabilistic safety assessment for earthquake and tsunami events. The investigation of data and the sensitivity analysis needed to construct a differentiated emergency response model for seismic hazard events have been carried out in this study.

  11. The smart cluster method. Adaptive earthquake cluster identification and analysis in strong seismic regions

    Science.gov (United States)

    Schaefer, Andreas M.; Daniell, James E.; Wenzel, Friedemann

    2017-07-01

    Earthquake clustering is an essential part of almost any statistical analysis of spatial and temporal properties of seismic activity. The nature of earthquake clusters and subsequent declustering of earthquake catalogues plays a crucial role in determining the magnitude-dependent earthquake return period and its respective spatial variation for probabilistic seismic hazard assessment. This study introduces the Smart Cluster Method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal cluster identification. It utilises the magnitude-dependent spatio-temporal earthquake density to adjust the search properties, subsequently analyses the identified clusters to determine directional variation and adjusts its search space with respect to directional properties. In the case of rapid subsequent ruptures like the 1992 Landers sequence or the 2010-2011 Darfield-Christchurch sequence, a reclassification procedure is applied to disassemble subsequent ruptures using near-field searches, nearest neighbour classification and temporal splitting. The method is capable of identifying and classifying earthquake clusters in space and time. It has been tested and validated using earthquake data from California and New Zealand. A total of more than 1500 clusters have been found in both regions since 1980 with Mmin = 2.0. Utilising the knowledge of cluster classification, the method has been adjusted to provide an earthquake declustering algorithm, which has been compared to existing methods. Its performance is comparable to established methodologies. The analysis of earthquake clustering statistics leads to various new and updated correlation functions, e.g. for ratios between mainshock and strongest aftershock and general aftershock activity metrics.
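
    The SCM itself is adaptive and direction-aware; as a much simpler point of comparison, the sketch below implements the classic fixed-window style of declustering that such methods are usually benchmarked against. The time-window function and the small synthetic catalogue are illustrative assumptions, not the published SCM or Gardner-Knopoff coefficients.

    ```python
    import numpy as np

    def decluster_window(times_days, lats, lons, mags,
                         t_win=lambda m: 10 ** (0.5 * m - 0.5),        # days, illustrative
                         r_win=lambda m: 10 ** (0.1238 * m + 0.983)):  # km, Gardner-Knopoff-like
        """
        Flag aftershocks with fixed magnitude-dependent space-time windows.
        Returns a boolean mask that is True for events kept as mainshocks.
        """
        order = np.argsort(mags)[::-1]              # treat the largest events first
        is_main = np.ones(len(mags), dtype=bool)
        for i in order:
            if not is_main[i]:
                continue
            dt = times_days - times_days[i]
            # small-angle great-circle distance in km (adequate for regional windows)
            dx = (lons - lons[i]) * 111.32 * np.cos(np.radians(lats[i]))
            dy = (lats - lats[i]) * 111.32
            r = np.hypot(dx, dy)
            in_window = (dt > 0) & (dt <= t_win(mags[i])) & (r <= r_win(mags[i]))
            is_main[in_window & (mags < mags[i])] = False
        return is_main

    # Tiny synthetic example: an M6 mainshock followed by two smaller nearby events
    t = np.array([0.0, 1.0, 5.0, 400.0])
    la = np.array([38.0, 38.05, 38.1, 39.5])
    lo = np.array([-117.0, -117.05, -117.0, -118.0])
    m = np.array([6.0, 4.0, 4.5, 5.0])
    print(decluster_window(t, la, lo, m))   # -> [ True False False  True]
    ```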

  12. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States

    Science.gov (United States)

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a one‐year (2016) probabilistic seismic‐hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes that are constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short‐term seismic activity rates (primarily 2014–2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground‐motion models. These alternatives represent uncertainties on how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. The alternative submodel hazard maps both depict high hazard and these are combined in the final model. Results depict several ground‐shaking measures as well as intensity and include maps showing a high‐hazard level (1% probability of exceedance in 1 year or greater). Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north‐central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in
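
    The hazard-map threshold quoted above (1% probability of exceedance in 1 year) follows from the stationary Poisson assumption applied to the short-term rates. A minimal sketch of the rate-to-probability conversion, with illustrative numbers:

    ```python
    import numpy as np

    def prob_exceedance(annual_rate, years=1.0):
        """Poisson probability of at least one exceedance in `years`."""
        return 1.0 - np.exp(-annual_rate * years)

    def rate_for_prob(p, years=1.0):
        """Annual exceedance rate that yields probability p in `years`."""
        return -np.log(1.0 - p) / years

    print(rate_for_prob(0.01))        # ~0.01005 per year corresponds to 1% in 1 year
    print(prob_exceedance(0.01005))   # back-check: ~0.01
    ```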

  13. Input parameters for the statistical seismic hazard assessment in central part of Romania territory using crustal earthquakes

    International Nuclear Information System (INIS)

    Moldovan, A.I.; Bazacliu, O.; Popescu, E.

    2004-01-01

    Seismic hazard assessment in densely populated geographical regions, and subsequently the design of strategic objectives (dams, nuclear power plants, etc.), is based on knowledge of the seismicity parameters of the seismic sources which can generate ground-motion amplitudes above the minimum level considered risky at the specific site, and of the way the seismic waves propagate between the focus and the site. The purpose of this paper is to provide the set of information required for a probabilistic assessment of the seismic hazard in the central Romanian territory relative to the following seismic sources: the Fagaras zone (FC), the Campulung zone (CP), and the Transilvania zone (TD), all of them in the crustal domain. Extremely vulnerable objectives are present in the central part of Romania, including the cities of Pitesti and Sibiu and the 'Vidraru' dam. The analysis that we propose involves: (1) geometrical definition of the seismic sources, (2) estimation of the maximum possible magnitude, (3) estimation of the frequency-magnitude relationship and (4) estimation of the attenuation laws. As an example, the obtained input parameters are used to evaluate the seismic hazard distribution due to crustal earthquakes by applying McGuire's procedure (1976). These preliminary results are in good agreement with previous research based on a deterministic approach (Radulian et al., 2000). (authors)
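
    The McGuire-type calculation mentioned at the end of this record combines, for each source, an activity rate, a truncated Gutenberg-Richter magnitude distribution and an attenuation law into an annual rate of exceeding a given ground-motion level. The discretized sketch below illustrates that integral for a single area source; the activity rate, magnitude bounds, distance distribution and lognormal attenuation coefficients are all placeholders, not the parameters derived for the Romanian zones.

    ```python
    import numpy as np
    from scipy.stats import norm

    def gr_pdf(m, b, m_min, m_max):
        """Truncated Gutenberg-Richter magnitude density."""
        beta = b * np.log(10.0)
        return beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

    def ln_pga_median(m, r_km, c=(-3.5, 0.9, -1.2)):
        """Placeholder attenuation law: ln PGA[g] = c0 + c1*M + c2*ln(R + 10)."""
        return c[0] + c[1] * m + c[2] * np.log(r_km + 10.0)

    def hazard_curve(pga_levels, nu=0.05, b=0.9, m_min=4.5, m_max=7.0,
                     r_grid=np.linspace(5, 150, 60), sigma_ln=0.6):
        """Annual exceedance rates for one area source, discretized over (M, R)."""
        m_grid = np.linspace(m_min, m_max, 60)
        dm = m_grid[1] - m_grid[0]
        f_r = np.full(r_grid.size, 1.0 / r_grid.size)   # uniform distance weights
        rates = np.zeros(len(pga_levels))
        for i, a in enumerate(pga_levels):
            # P[PGA > a | m, r] under a lognormal ground-motion model
            p_exc = norm.sf((np.log(a) - ln_pga_median(m_grid[:, None], r_grid[None, :]))
                            / sigma_ln)
            rates[i] = nu * np.sum(gr_pdf(m_grid, b, m_min, m_max)[:, None] * dm
                                   * f_r[None, :] * p_exc)
        return rates

    pga = np.array([0.05, 0.1, 0.2, 0.4])
    print(np.c_[pga, hazard_curve(pga)])
    ```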

  14. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    Science.gov (United States)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) Continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET), 2) Establishing three North Anatolian fault crossing profiles (10 sites/profile) at locations that experienced major surface-fault earthquakes at different times in the past to examine strain accumulation as a function of time in the earthquake cycle (2004), 3) Repeat observations of selected sites in the fault-crossing profiles (2005), 4) Repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation, 5) Refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area, 6) Continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us with estimates of gravity change during the period 2003 - 2005.

  15. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    Science.gov (United States)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  16. CE-PA: A user's manual for determination of controlling earthquakes and development of seismic hazard information data base for the central and eastern United States

    International Nuclear Information System (INIS)

    Short, C.

    1995-05-01

    The CE-PA, Controlling Earthquake(s) through Probabilistic Analysis, software package developed at Lawrence Livermore National Laboratory (LLNL) is a research program used as part of a study performed for the US Office of Nuclear Regulatory Research Division Engineering project on Geosciences Issues in the revision of geological siting criteria. The objectives of this study were to explore ways on how to use results from probabilistic seismic hazard characterization (PSHC) to determine hazard-consistent scenario earthquakes and to develop design ground motion. The purpose of this document is to describe the CE-PA software to users. The software includes two operating system and process controllers plus several fortran routines and input decks. This manual gives an overview of the methodology to estimate controlling earthquakes in Section I. A descriptive overview of the procedures and the organization of the program modules used in CE-PA is provided in Section II. Section III contains four example executions with comments and a graphical display of each execution path, plus an overview of the directory/file structure. Section IV provides some general observations regarding the model

  17. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    Science.gov (United States)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground

  18. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-12

    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: • For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release. • For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. The set of “critical controls” was identified for these scenarios (see Section 4) which prevent the occurrence or mitigate the effects of the release of events with significant consequences.

  19. A consideration of hazards, earthquakes, aircraft crashes, explosions and fires in the safety of laboratories and plants

    International Nuclear Information System (INIS)

    Doumenc, A.; Faure, J.; Mohammadioun, B.; Jacquet, P.

    1987-03-01

    Although laboratories and plants differ from nuclear reactors both in their characteristics and sitings, safety measures developed for the hazards of earthquakes, aircraft crashes, explosions and fires are very similar. These measures provide a satisfactory level of safety for these installations [fr

  20. Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Eisses, A.; Kell, A.; Kent, G. [UNR; Driscoll, N. [UCSD; Karlin, R.; Baskin, R. [USGS; Louie, J. [UNR; Pullammanappallil, S. [Optim

    2016-08-01

    Amy Eisses, Annie M. Kell, Graham Kent, Neal W. Driscoll, Robert E. Karlin, Robert L. Baskin, John N. Louie, Kenneth D. Smith, Sathish Pullammanappallil, 2011, Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada: presented at American Geophysical Union Fall Meeting, San Francisco, Dec. 5-9, abstract NS14A-08.

  1. Cold Vacuum Drying Facility hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  2. Canister storage building hazard analysis report

    International Nuclear Information System (INIS)

    Krahn, D.E.; Garvin, L.J.

    1997-01-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  3. Cold Vacuum Drying Facility hazard analysis report

    International Nuclear Information System (INIS)

    Krahn, D.E.

    1998-01-01

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports

  4. Probabilistic Safety Assessment (PSA) of Natural External Hazards Including Earthquakes. Workshop Proceedings, Prague, Czech Republic, 17-20 June 2013

    International Nuclear Information System (INIS)

    2014-01-01

    The Fukushima Dai-ichi accident triggered discussions about the significance of external hazards and their treatment in safety analyses. In addition, stress test results have shown vulnerabilities and potential cliff-edge effects in plant responses to external hazards and have identified possibilities and priorities for improvements and safety measures' implementation at specific sites and designs. In order to address these issues and provide relevant conclusions and recommendations to CSNI and CNRA, the CSNI Working Group on Risk Assessment (WGRISK) directed, in cooperation with the CSNI Working Group on Integrity and Ageing of Components and Structures (WGIAGE), a workshop hosted by UJV Rez. The key objectives of the workshop were to collect information from the OECD member states on methods and approaches being used, and experience gained in probabilistic safety assessment of natural external hazards, as well as to support the fulfillment of the CSNI task on 'PSA of natural external hazards including earthquakes'. These objectives are described in more detail in the introduction in Chapter 1 of this report. The WGRISK activities preceding the workshop and leading to the decision to organize it are described in Chapter 2 of this report. The focus of the workshop was on external events PSA for nuclear power plants, including all modes of operation. The workshop scope was generally limited to external, natural hazards, including those hazards where the distinction between natural and man-made hazards is not sharp. The detailed information about the presentations, discussions, and results of the workshop is presented in Chapter 3 of this report. Some general conclusions were agreed on during the workshop, which are presented in the following paragraphs. - The lessons learned from the Fukushima Dai-ichi reactor accidents and related actions at the national, regional, and global level have emphasized the importance of assessing risks associated with (authors)

  5. Historical analysis of US pipeline accidents triggered by natural hazards

    Science.gov (United States)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA record. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  6. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    Energy Technology Data Exchange (ETDEWEB)

    Türker, Tuğba, E-mail: tturker@ktu.edu.tr [Karadeniz Technical University, Department of Geophysics, Trabzon/Turkey (Turkey); Bayrak, Yusuf, E-mail: ybayrak@agri.edu.tr [Ağrı İbrahim Çeçen University, Ağrı/Turkey (Turkey)

    2016-04-18

    The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and lies in a region of very high seismic activity. The NAFZ has experienced very large earthquakes from the past to the present. In this study, the key parameters of the Gutenberg-Richter relationship (a and b values) are estimated and, taking these parameters into account, earthquakes between 1900 and 2015 are examined for 10 different seismic source regions in the NAFZ. Occurrence probabilities and return periods of earthquakes in the fault zone in the coming years are then estimated, and the earthquake hazard of the NAFZ is assessed with the Poisson method. Region 2 experienced its largest earthquakes only in the historical period, and no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, M_S=7.3 and 1897, M_S=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 different seismic source regions, the a and b parameters are estimated from the cumulative number-magnitude relationship LogN=a-bM of Gutenberg-Richter. A homogeneous earthquake catalog for M_S magnitudes equal to or larger than 4.0 is used for the time period between 1900 and 2015. The catalog used in the study was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI); earthquake data from 1900 to 1974 were obtained from KOERI and ISC, and from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 different seismic source regions. The highest earthquake occurrence probability among the 10 seismic source regions in the coming years is estimated for the Tokat-Erzincan region (Region 9), at 99%

  7. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    International Nuclear Information System (INIS)

    Türker, Tuğba; Bayrak, Yusuf

    2016-01-01

    The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and lies in a region of very high seismic activity. The NAFZ has experienced very large earthquakes from the past to the present. In this study, the key parameters of the Gutenberg-Richter relationship (a and b values) are estimated and, taking these parameters into account, earthquakes between 1900 and 2015 are examined for 10 different seismic source regions in the NAFZ. Occurrence probabilities and return periods of earthquakes in the fault zone in the coming years are then estimated, and the earthquake hazard of the NAFZ is assessed with the Poisson method. Region 2 experienced its largest earthquakes only in the historical period, and no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, M_S=7.3 and 1897, M_S=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 different seismic source regions, the a and b parameters are estimated from the cumulative number-magnitude relationship LogN=a-bM of Gutenberg-Richter. A homogeneous earthquake catalog for M_S magnitudes equal to or larger than 4.0 is used for the time period between 1900 and 2015. The catalog used in the study was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI); earthquake data from 1900 to 1974 were obtained from KOERI and ISC, and from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 different seismic source regions. The highest earthquake occurrence probability among the 10 seismic source regions in the coming years is estimated for the Tokat-Erzincan region (Region 9), at 99% with an earthquake
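
    The arithmetic behind the occurrence probabilities and return periods reported in these two records is the stationary Poisson model driven by Gutenberg-Richter rates. A minimal sketch follows; the a, b and magnitude values are purely illustrative, not the NAFZ estimates.

    ```python
    import numpy as np

    def annual_rate(a, b, m):
        """Cumulative annual rate of events with magnitude >= m from log10 N = a - b*M,
        assuming the a-value is already normalised to a one-year period."""
        return 10.0 ** (a - b * m)

    def poisson_probability(a, b, m, t_years):
        """Probability of at least one M >= m event in t_years (Poisson model)."""
        return 1.0 - np.exp(-annual_rate(a, b, m) * t_years)

    def return_period(a, b, m):
        return 1.0 / annual_rate(a, b, m)

    # Illustrative parameters for one source region (not the NAFZ values)
    a, b, m = 4.2, 0.8, 7.0
    for t in (10, 20, 50, 100):
        print(f"T = {t:3d} yr : P(M>={m}) = {poisson_probability(a, b, m, t):.2f}")
    print(f"return period = {return_period(a, b, m):.0f} yr")
    ```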

  8. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    International Nuclear Information System (INIS)

    Payne, S. M.; Gorman, V. W.; Jensen, S. A.; Nitzel, M. E.; Russell, M. J.; Smith, R. P.

    2000-01-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process

  9. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Provisions § 123.6 Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard... fish or fishery product being processed in the absence of those controls. (b) The HACCP plan. Every...

  10. Hazard analysis in uranium hexafluoride production facility

    International Nuclear Information System (INIS)

    Marin, Maristhela Passoni de Araujo

    1999-01-01

    The present work provides a method for the preliminary hazard analysis of nuclear fuel cycle facilities. The proposed method identifies both chemical and radiological hazards, as well as the consequences associated with accident scenarios. To illustrate the application of the method, a uranium hexafluoride production facility was selected. The main hazards are identified and the potential consequences are quantified. It was found that, although the facility handles radioactive material, the main hazards are associated with releases of toxic chemical substances such as hydrogen fluoride, anhydrous ammonia and nitric acid. It was shown that a containment bund can effectively reduce the consequences of an atmospheric release of toxic materials. (author)

  11. Y-12 site-specific earthquake response analysis and soil liquefaction assessment

    International Nuclear Information System (INIS)

    Ahmed, S.B.; Hunt, R.J.; Manrod, W.E. III.

    1995-01-01

    A site-specific earthquake response analysis and soil liquefaction assessment were performed for the Oak Ridge Y-12 Plant. The main purpose of these studies was to use the results of the analyses to evaluate the safety of the performance category (PC) -1, -2, and -3 facilities against natural phenomena seismic hazards. Earthquake response was determined for seven (7) one-dimensional soil columns (Fig. 12) using two horizontal components of the PC-3 design basis 2000-year seismic event. The computer program SHAKE 91 (Ref. 7) was used to calculate the absolute response accelerations on top of ground (soil/weathered shale) and rock outcrop. The SHAKE program has been validated for horizontal response calculations at periods less than 2.0 seconds at several sites and consequently is widely accepted in the geotechnical earthquake engineering area for site response analysis.
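
    SHAKE performs a full equivalent-linear, multi-layer analysis; as a rough illustration of the kind of amplification such an analysis quantifies, the sketch below evaluates the standard closed-form transfer function for a single uniform damped soil layer over rigid rock, |F(ω)| ≈ 1/√(cos²(ωH/Vs) + (ξωH/Vs)²). The layer thickness, shear-wave velocity and damping ratio are placeholders, not Y-12 site values.

    ```python
    import numpy as np

    def layer_amplification(freq_hz, h_m=30.0, vs_mps=250.0, damping=0.05):
        """
        Approximate |transfer function| for a uniform damped soil layer on rigid rock:
            |F(w)| ~ 1 / sqrt(cos^2(w*H/Vs) + (xi*w*H/Vs)^2)
        Placeholder layer properties; a real site response uses layered profiles.
        """
        w = 2.0 * np.pi * np.asarray(freq_hz)
        k_h = w * h_m / vs_mps
        return 1.0 / np.sqrt(np.cos(k_h) ** 2 + (damping * k_h) ** 2)

    freqs = np.linspace(0.1, 10.0, 200)
    amp = layer_amplification(freqs)
    f0 = 250.0 / (4.0 * 30.0)   # fundamental site frequency Vs/(4H), here ~2.1 Hz
    print(f"peak amplification ~ {amp.max():.1f} near f0 ~ {f0:.1f} Hz")
    ```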

  12. GPS Imaging of Time-Variable Earthquake Hazard: The Hilton Creek Fault, Long Valley California

    Science.gov (United States)

    Hammond, W. C.; Blewitt, G.

    2016-12-01

    The Hilton Creek Fault, in Long Valley, California is a down-to-the-east normal fault that bounds the eastern edge of the Sierra Nevada/Great Valley microplate, and lies half inside and half outside the magmatically active caldera. Despite the dense coverage with GPS networks, the rapid and time-variable surface deformation attributable to sporadic magmatic inflation beneath the resurgent dome makes it difficult to use traditional geodetic methods to estimate the slip rate of the fault. While geologic studies identify cumulative offset, constrain timing of past earthquakes, and constrain a Quaternary slip rate to within 1-5 mm/yr, it is not currently possible to use geologic data to evaluate how the potential for slip correlates with transient caldera inflation. To estimate time-variable seismic hazard of the fault we estimate its instantaneous slip rate from GPS data using a new set of algorithms for robust estimation of velocity and strain rate fields and fault slip rates. From the GPS time series, we use the robust MIDAS algorithm to obtain time series of velocity that are highly insensitive to the effects of seasonality, outliers and steps in the data. We then use robust imaging of the velocity field to estimate a gridded time variable velocity field. Then we estimate fault slip rate at each time using a new technique that forms ad-hoc block representations that honor fault geometries, network complexity, connectivity, but does not require labor-intensive drawing of block boundaries. The results are compared to other slip rate estimates that have implications for hazard over different time scales. Time invariant long term seismic hazard is proportional to the long term slip rate accessible from geologic data. Contemporary time-invariant hazard, however, may differ from the long term rate, and is estimated from the geodetic velocity field that has been corrected for the effects of magmatic inflation in the caldera using a published model of a dipping ellipsoidal

  13. Subduction zone and crustal dynamics of western Washington; a tectonic model for earthquake hazards evaluation

    Science.gov (United States)

    Stanley, Dal; Villaseñor, Antonio; Benz, Harley

    1999-01-01

    The Cascadia subduction zone is extremely complex in the western Washington region, involving local deformation of the subducting Juan de Fuca plate and complicated block structures in the crust. It has been postulated that the Cascadia subduction zone could be the source for a large thrust earthquake, possibly as large as M9.0. Large intraplate earthquakes from within the subducting Juan de Fuca plate beneath the Puget Sound region have accounted for most of the energy release in this century and future such large earthquakes are expected. Added to these possible hazards is clear evidence for strong crustal deformation events in the Puget Sound region near faults such as the Seattle fault, which passes through the southern Seattle metropolitan area. In order to understand the nature of these individual earthquake sources and their possible interrelationship, we have conducted an extensive seismotectonic study of the region. We have employed P-wave velocity models developed using local earthquake tomography as a key tool in this research. Other information utilized includes geological, paleoseismic, gravity, magnetic, magnetotelluric, deformation, seismicity, focal mechanism and geodetic data. Neotectonic concepts were tested and augmented through use of anelastic (creep) deformation models based on thin-plate, finite-element techniques developed by Peter Bird, UCLA. These programs model anelastic strain rate, stress, and velocity fields for given rheological parameters, variable crust and lithosphere thicknesses, heat flow, and elevation. Known faults in western Washington and the main Cascadia subduction thrust were incorporated in the modeling process. Significant results from the velocity models include delineation of a previously studied arch in the subducting Juan de Fuca plate. The axis of the arch is oriented in the direction of current subduction and asymmetrically deformed due to the effects of a northern buttress mapped in the velocity models. This

  14. Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    Science.gov (United States)

    Necmioglu, Ocal; Meral Ozel, Nurcan

    2015-04-01

    Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes a priori accurate assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a "maximum earthquake" predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° x 0.5° size bin for 0-40 km depth (total of 310 bins) and for 40-100 km depth (total of 92 bins) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input parameters for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with a coarse (2 arc-min) grid resolution have been simulated at EC-JRC premises for the Black Sea and Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source defined, using the shallow water finite-difference SWAN code (Mader, 2004), for the magnitude range of 6.5 - Mwmax defined for that bin with a Mw increment of 0.1. Results show that not only the earthquakes resembling the

  15. On the long-term seismic hazard analysis in the Zhangjiakou Penglai seismotectonic zone, China

    Science.gov (United States)

    Fu, Zhengxiang; Liu, Jie; Liu, Guiping

    2004-10-01

    The Zhangjiakou-Penglai seismotectonic zone (ZPSZ) lies in the northern part of North China and extends along the Zhangjiakou-Beijing-Tianjin-Bohai Bay-Penglai-Yellow Sea. It is about 900 km long and some 250 km wide in a northwest direction. The great Sanhe-Pinggu (MS=8.0) earthquake of September 1679 and the Tangshan (MS=7.8) earthquake of July 1976 caused serious economic losses and loss of life. According to some differences in crustal structure and regional tectonic stress field, the ZPSZ is divided into a western and an eastern segment by the 117°E line for the study of long-term seismic hazard analysis. An analysis of the Gutenberg-Richter empirical earthquake-frequency relation and of the time process of historical and recent earthquakes along the eastern and western segments shows that the earthquake activity obeys a Poisson process, and these calculations indicate that the earthquake occurrence probability of MS=6.0-6.9 is 0.77-0.83 in the eastern segment and the earthquake occurrence probability of MS=7.0-7.9 is 0.78-0.80 in the western segment of the ZPSZ during the period from 2005 to 2015.

  16. Research on the spatial analysis method of seismic hazard for island

    International Nuclear Information System (INIS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-01-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: microscopically, its results provide parameters for seismic design, and macroscopically it is prerequisite work for the earthquake and comprehensive disaster prevention planning of island conservation plans, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed on ArcGIS's Model Builder platform. (paper)

  17. Research on the spatial analysis method of seismic hazard for island

    Science.gov (United States)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: at the micro scale its results provide parameters for seismic design, and at the macro scale it is prerequisite work for earthquake and comprehensive disaster prevention planning in island conservation, during the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then proposed to support further earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated with a spatial analysis model constructed on ArcGIS's Model Builder platform.

  18. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented in the April 2013 report as its basis. The conditions were screened and grouped to determine whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and to establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  19. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    Science.gov (United States)

    Wyss, Max

    2013-04-01

    An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven experts who had participated in an emergency meeting on March 30, assessing the probability of a major event following the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of job, loss of retirement pension, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take its usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone who has gross misunderstandings about seismology. They must be carefully prepared by experts. The more significant lesson is that the approach of calming the population and the standard probabilistic hazard and risk assessment, as practiced by GSHAP, are misleading. The latter has been criticized as

  20. Fire hazard analysis of the radioactive mixed waste trenches

    International Nuclear Information System (INIS)

    McDonald, K.M.

    1995-01-01

    This Fire Hazards Analysis (FHA) is intended to comprehensively assess the risk from fire associated with the disposal of low-level radioactive mixed waste in trenches within the lined landfills provided by Project W-025, designated Trenches 31 and 34 of Burial Ground 218-W-5. Elements within the FHA make recommendations for minimizing risk to workers, the public, and the environment from fire during the course of the operational activity. Transient flammables and combustibles present in support of the operational activity are considered and included in the analysis. The graded FHA contains the following elements: description of construction; protection of essential safety-class equipment; fire protection features; description of fire hazards; life safety considerations; critical process equipment; high-value property; damage potential--maximum credible fire loss (MCFL) and maximum possible fire loss (MPFL); fire department/brigade response; recovery potential; potential for a toxic, biological and/or radiation incident due to a fire; emergency planning; security considerations related to fire protection; natural hazards (earthquake, flood, wind) impact on fire safety; and exposure fire potential, including the potential for fire spread between fire areas. Recommendations for limiting risk are made in the text of this report and printed in bold type. All recommendations are repeated in a list in Section 18.0

  1. Deterministic Tectonic Origin Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    Science.gov (United States)

    Necmioglu, O.; Meral Ozel, N.

    2014-12-01

    Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes accurate a priori assumptions about earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and the lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a "maximum earthquake" predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as for more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° x 0.5° bin for 0-40 km depth (310 bins in total) and for 40-100 km depth (92 bins in total) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with coarse (2 arc-min) and medium (1 arc-min) grid resolutions have been performed at the EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source defined, using the shallow-water finite-difference SWAN code (Mader, 2004), for the magnitude range of 6.5 - Mwmax defined for that bin with a Mw increment of 0.1. Results show that not only the
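
    The size of the scenario set described above (one simulation per source bin for every magnitude from Mw 6.5 to the bin's Mwmax in 0.1 steps) is easy to enumerate. The short Python sketch below only builds and counts that scenario list; the two example bins and their parameters are hypothetical, not values from the study's database.

    def scenario_magnitudes(mw_max, mw_min=6.5, step=0.1):
        """Magnitudes 6.5 .. Mwmax (inclusive) in 0.1 increments for one source bin."""
        n = int(round((mw_max - mw_min) / step)) + 1
        return [round(mw_min + i * step, 1) for i in range(n)]

    # Hypothetical bins: (lon, lat, depth class, strike, dip, rake, Mwmax).
    bins = [
        (26.0, 35.5, "0-40 km", 290.0, 35.0, 90.0, 8.0),
        (28.5, 36.5, "0-40 km", 250.0, 40.0, 95.0, 7.4),
    ]

    total = 0
    for lon, lat, depth, strike, dip, rake, mw_max in bins:
        mags = scenario_magnitudes(mw_max)
        total += len(mags)
        print(f"bin ({lon}, {lat}, {depth}): {len(mags)} scenarios, Mw {mags[0]}-{mags[-1]}")
    print("total simulations required:", total)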

  2. Modeling and Hazard Analysis Using STPA

    Science.gov (United States)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness for software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as for FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis

  3. Coseismic Strain Steps of the 2008 Wenchuan Earthquake Indicate EW Extension of Tibetan Plateau and Increased Hazard South to Epicenter

    Science.gov (United States)

    Fu, G.; Shen, X.; Tang, J.; Fukuda, Y.

    2008-12-01

    The 2008 Wenchuan earthquake (Ms8.0) occurred at the eastern edge of the Tibetan Plateau. It is the biggest seismic disaster in China since the 1976 Tangshan earthquake. To determine the effects of the earthquake on the deformation field of the Tibetan Plateau, we collect and analyze continuous strain data from three stations in the Tibetan Plateau before and after the earthquake, observed by capacitance-type borehole strainmeters (Chi, 1985). We collect strain data in the NS, EW, NE-SW and NW-SE directions at each borehole. We then deduce the co-seismic strain steps at 14:28 on May 12, 2008 (the origin time of the earthquake) from the data before and after the earthquake using the least squares method. Our observations show that significant co-seismic strain steps in the Tibetan Plateau accompanied the 2008 Wenchuan earthquake. Extension in the EW direction is observed in the interior and northern Tibetan Plateau, which indicates a rapid EW extension of the whole Plateau. Field investigation shows that the 2008 Wenchuan earthquake is a manifestation of the eastward growth of the Tibetan Plateau (Dong et al., 2008). Eastward growth of the Tibetan Plateau naturally results in EW extension of the Plateau. Our co-seismic strain observations agree well with the conclusion from the surface rupture investigation. The magnitude of the co-seismic strain step equals about five times the average yearly extensional strain rate throughout the plateau interior. Shortening in the SE-NW direction is observed at the eastern edge of the Plateau. This hints that the eastward extension of the Tibetan Plateau is resisted by the rigid Sichuan Basin, which increases the potential earthquake hazard around the observation station, consistent with co-seismic stress change calculations (Parsons et al., 2008). Our observed co-seismic strain steps are overall larger than theoretical calculations from dislocation theory, which indicates that the magnitude of the great earthquake should be bigger than 7.9. Due

  4. Earthquake response analysis of a base isolated building

    International Nuclear Information System (INIS)

    Mazda, T.; Shiojiri, H.; Sawada, Y.; Harada, O.; Kawai, N.; Ontsuka, S.

    1989-01-01

    Recently, seismic isolation has become one of the popular methods in the design of important structures and equipment against earthquakes. However, it is desirable to accumulate demonstration data on the reliability of seismically isolated structures and to establish analysis methods for those structures. Based on the above recognition, vibration tests of a base-isolated building were carried out in Tsukuba Science City. Since then, many earthquake records have been obtained at the building. In order to examine the validity of the numerical models, earthquake response analyses have been executed using both a lumped-mass model and a finite element model

  5. Safety analysis of nuclear containment vessels subjected to strong earthquakes and subsequent tsunamis

    Directory of Open Access Journals (Sweden)

    Feng Lin

    2017-08-01

    Full Text Available Nuclear power plants under expansion and under construction in China are mostly located in coastal areas, which means they are at risk of suffering strong earthquakes and subsequent tsunamis. This paper presents a safety analysis for a new reinforced concrete containment vessel in such events. A finite element method-based model was built, verified, and first used to understand the seismic performance of the containment vessel under earthquakes with increased intensities. Then, the model was used to assess the safety performance of the containment vessel subject to an earthquake with peak ground acceleration (PGA) of 0.56g and subsequent tsunamis with increased inundation depths, similar to the 2011 Great East earthquake and tsunami in Japan. Results indicated that the containment vessel reached Limit State I (concrete cracking) and Limit State II (concrete crushing) when the PGAs were in a range of 0.8–1.1g and 1.2–1.7g, respectively. The containment vessel reached Limit State I with a tsunami inundation depth of 10 m after suffering an earthquake with a PGA of 0.56g. A site-specific hazard assessment was conducted to consider the likelihood of tsunami sources.

  6. Safety analysis of nuclear containment vessels subjected to strong earthquakes and subsequent tsunamis

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Feng; Li, Hong Zhi [Dept. Structural Engineering, Tongji University, Shanghai (China)

    2017-08-15

    Nuclear power plants under expansion and under construction in China are mostly located in coastal areas, which means they are at risk of suffering strong earthquakes and subsequent tsunamis. This paper presents a safety analysis for a new reinforced concrete containment vessel in such events. A finite element method-based model was built, verified, and first used to understand the seismic performance of the containment vessel under earthquakes with increased intensities. Then, the model was used to assess the safety performance of the containment vessel subject to an earthquake with peak ground acceleration (PGA) of 0.56g and subsequent tsunamis with increased inundation depths, similar to the 2011 Great East earthquake and tsunami in Japan. Results indicated that the containment vessel reached Limit State I (concrete cracking) and Limit State II (concrete crushing) when the PGAs were in a range of 0.8–1.1g and 1.2–1.7g, respectively. The containment vessel reached Limit State I with a tsunami inundation depth of 10 m after suffering an earthquake with a PGA of 0.56g. A site-specific hazard assessment was conducted to consider the likelihood of tsunami sources.

  7. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    International Nuclear Information System (INIS)

    Logan, Richard C.

    2002-01-01

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events

  8. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    International Nuclear Information System (INIS)

    Kubicek, J. L.

    2001-01-01

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events

  9. Simulation analysis of earthquake response of nuclear power plant to the 2003 Miyagi-Oki earthquake

    International Nuclear Information System (INIS)

    Yoshihiro Ogata; Kiyoshi Hirotani; Masayuki Higuchi; Shingo Nakayama

    2005-01-01

    On May 26, 2003, an earthquake of magnitude 7.1 (Japan Meteorological Agency) occurred just offshore of Miyagi Prefecture. This was the largest earthquake experienced by the nuclear power plant of Tohoku Electric Power Co. in Onagawa (hereafter the Onagawa Nuclear Power Plant) during the 19 years since it started operations in 1984. In this report, we review the vibration characteristics of the reactor building of Onagawa Nuclear Power Plant Unit 1 based on acceleration records observed at the building, and give an account of a simulation analysis of the earthquake response carried out to ascertain the appropriateness of the design procedure and the seismic safety of the building. (authors)

  10. Seismic hazard analysis of the NPP Kozloduy site

    International Nuclear Information System (INIS)

    Petrovski, D.; Stamatovska, S.; Arsovski, M.; Hadzievski, D.; Sokerova, D.; Solakov, D.; Vaptzarov, I.; Satchanski, S.

    1993-01-01

    The principal objective of this study is to define the seismic hazard for the NPP Kozloduy site. Seismic hazard is, as a rule, defined by the probability distribution function of the peak value of a chosen ground motion parameter in a defined time interval. The overall study methodology consists of reviewing the existing geological, seismological and tectonic information, formulating this information into a mathematical model of the seismic activity of the region, and using this model to assess earthquake ground motion in terms of probability. Detailed regional and local seismological investigations have been performed. The regional investigations encompass the area within a radius of 320 km from the NPP Kozloduy site. The results of these investigations include all seismological parameters that are necessary for determination of the mathematical model of the seismicity of the region needed for the seismic hazard analysis. Regional geological and neotectonic investigations were also performed for the wider area, including almost the whole territory of Bulgaria, a large part of Serbia, part of Macedonia and almost the whole southern part of Romania

  11. Vulnerability assessment of archaeological sites to earthquake hazard: An indicator based method integrating spatial and temporal aspects

    Directory of Open Access Journals (Sweden)

    Despina Minos-Minopoulos

    2017-07-01

    Full Text Available Across the world, numerous sites of cultural heritage value are at risk from a variety of human-induced and natural hazards such as war and earthquakes. Here we present and test a novel indicator-based method for assessing the vulnerability of archaeological sites to earthquakes. Vulnerability is approached as a dynamic element assessed through a combination of spatial and temporal parameters. The spatial parameters examine the susceptibility of the sites to the secondary Earthquake Environmental Effects of ground liquefaction, landslides and tsunami and are expressed through the Spatial Susceptibility Index (SSi). Parameters of physical vulnerability, economic importance and visitors density examine the temporal vulnerability of the sites expressed through the Temporal Vulnerability Index (TVi). The equally weighted sum of the spatial and temporal indexes represents the total Archaeological Site Vulnerability Index (A.S.V.I.). The A.S.V.I. method is applied at 16 archaeological sites across Greece, allowing an assessment of their vulnerability. This then allows the establishment of a regional and national priority list for considering future risk mitigation. Results indicate that (i) the majority of the sites have low to moderate vulnerability to earthquake hazard, (ii) Neratzia Fortress on Kos and Heraion on Samos are characterised as highly vulnerable and should be prioritised for further studies and mitigation measures, and (iii) the majority of the sites are susceptible to at least one Earthquake Environmental Effect and present relatively high physical vulnerability attributed to the existing limited conservation works. This approach highlights the necessity for an effective vulnerability assessment methodology within the existing framework of disaster risk management for cultural heritage.
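
    The index arithmetic described above (an equally weighted sum of the spatial and temporal components) can be written out directly. The sub-indicator scores below are hypothetical and serve only to illustrate the aggregation; the real method derives them from susceptibility maps and site data.

    def asvi(spatial_scores, temporal_scores):
        """A.S.V.I. as the equally weighted sum of the Spatial Susceptibility Index (SSi)
        and the Temporal Vulnerability Index (TVi); each index is taken here as the mean
        of its normalised sub-indicators (an assumed simplification)."""
        ssi = sum(spatial_scores) / len(spatial_scores)
        tvi = sum(temporal_scores) / len(temporal_scores)
        return 0.5 * ssi + 0.5 * tvi, ssi, tvi

    # Hypothetical normalised scores in [0, 1]:
    # spatial: liquefaction, landslide, tsunami susceptibility
    # temporal: physical vulnerability, economic importance, visitor density
    total, ssi, tvi = asvi([0.6, 0.2, 0.8], [0.7, 0.5, 0.4])
    print(f"SSi = {ssi:.2f}, TVi = {tvi:.2f}, A.S.V.I. = {total:.2f}")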

  12. Gambling score in earthquake prediction analysis

    Science.gov (United States)

    Molchan, G.; Romashkova, L.

    2011-03-01

    The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has been recently suggested to use a new characteristic to evaluate the forecaster's skill, the gambling score (GS), which incorporates the difficulty of guessing each target event by using different weights for different alarms. We expand parametrization of the GS and use the M8 prediction algorithm to illustrate difficulties of the new approach in the analysis of the prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤M < 8.5 events because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance taking into account the uncertainty of the spatial rate measure.

  13. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    Science.gov (United States)

    Gori, Paula L.

    1993-01-01

    INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation, likelihood of occurrence, location, and severity of potential hazards, and the three elements needed for effective transfer, delivery, assistance, and encouragement, are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and

  14. Romanian earthquakes analysis using BURAR seismic array

    International Nuclear Information System (INIS)

    Borleanu, Felix; Rogozea, Maria; Nica, Daniela; Popescu, Emilia; Popa, Mihaela; Radulian, Mircea

    2008-01-01

    Bucovina seismic array (BURAR) is a medium-aperture array, installed in 2002 in the northern part of Romania (47.61480 N latitude, 25.21680 E longitude, 1150 m altitude), as a result of the cooperation between the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics, Romania. The array consists of ten elements, located in boreholes and distributed over a 5 x 5 km² area; nine with short-period vertical sensors and one with a broadband three-component sensor. Since the new station began operating, earthquake monitoring of Romania's territory has been significantly improved. Data recorded by BURAR during the 01.01.2005 - 12.31.2005 time interval are first processed and analyzed in order to establish the array's detection capability for local earthquakes occurring in different Romanian seismic zones. Subsequently, a spectral ratio technique was applied in order to determine the calibration relationships for magnitude, using only the information gathered by the BURAR station. The spectral ratios are computed relative to a reference event, considered representative for each seismic zone. This method has the advantage of eliminating path effects. The new calibration procedure is tested for the case of Vrancea intermediate-depth earthquakes and proved to be very efficient in constraining the size of these earthquakes. (authors)
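
    A minimal sketch of the relative-magnitude idea behind a spectral ratio calibration, assuming the common approximation that the magnitude difference between an event and a reference event recorded at the same station scales with the log10 of their low-frequency spectral amplitude ratio (which cancels the shared path term). The spectra and the proportionality are illustrative assumptions, not the paper's calibration relationships.

    import numpy as np

    def relative_magnitude(spec_event, spec_reference, m_reference):
        """Estimate magnitude from the spectral ratio to a reference event at the same
        station, assuming M - M_ref ≈ log10(A / A_ref) over a flat low-frequency band."""
        ratio = np.median(spec_event / spec_reference)
        return m_reference + np.log10(ratio)

    # Hypothetical low-frequency spectral amplitudes (arbitrary units, 10 samples).
    rng = np.random.default_rng(0)
    spec_ref = 1.0e3 * (1.0 + 0.05 * rng.standard_normal(10))
    spec_evt = 3.2e3 * (1.0 + 0.05 * rng.standard_normal(10))
    print(f"estimated magnitude: {relative_magnitude(spec_evt, spec_ref, m_reference=4.0):.2f}")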

  15. Repository Subsurface Preliminary Fire Hazard Analysis

    International Nuclear Information System (INIS)

    Logan, Richard C.

    2001-01-01

    This fire hazard analysis identifies preliminary design and operations features and fire and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during the development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M and O 2001c), which was prepared in accordance with AP-2.21Q, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities''; Attachment 4 of AP-ESH-008, ''Hazards Analysis System''; and AP-3.11Q, ''Technical Reports''. The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents

  16. Meeting of the Central and Eastern U.S. (CEUS) Earthquake Hazards Program October 28–29, 2009

    Science.gov (United States)

    Tuttle, Martitia; Boyd, Oliver; McCallister, Natasha

    2013-01-01

    On October 28th and 29th, 2009, the U.S. Geological Survey Earthquake Hazards Program held a meeting of Central and Eastern United States investigators and interested parties in Memphis, Tennessee. The purpose of the meeting was to bring together the Central and Eastern United States earthquake-hazards community to present and discuss recent research results, to promote communication and collaboration, to garner input regarding future research priorities, to inform the community about research opportunities afforded by the 2010–2012 arrival of EarthScope/USArray in the central United States, and to discuss plans for the upcoming bicentennial of the 1811–1812 New Madrid earthquakes. The two-day meeting included several keynote speakers, oral and poster presentations by attendees, and breakout sessions. The meeting is summarized in this report and can be subdivided into four primary sections: (1) summaries of breakout discussion groups; (2) list of meeting participants; (3) submitted abstracts; and (4) slide presentations. The abstracts and slides are included “as submitted” by the meeting participants and have not been subject to any formal peer review process; information contained in these sections reflects the opinions of the presenter at the time of the meeting and does not constitute endorsement by the U.S. Geological Survey.

  17. Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile

    Science.gov (United States)

    Pasten, D.

    2017-12-01

    Earthquake complex networks have shown that they are able to find specific features in seismic data sets. In space, these networks have shown a scale-free behavior for the probability distribution of connectivity in directed networks, and they have shown a small-world behavior for the undirected networks. In this work, we present an earthquake complex network analysis for the large earthquake Mw 8.2 in the north of Chile (near Iquique) in April, 2014. An earthquake complex network is made by dividing the three-dimensional space into cubic cells; if one of these cells contains a hypocenter, we name this cell a node. The connections between nodes are generated in time: we follow the time sequence of seismic events and make the connections between nodes. We then have two different networks: a directed and an undirected network. The directed network takes into consideration the time direction of the connections, which is very important for the connectivity of the network: we are considering the connectivity, ki, of the i-th node as the number of connections going out of node i plus the self-connections (if two seismic events occurred successively in time in the same cubic cell, we have a self-connection). The undirected network is made by removing the direction of the connections and the self-connections from the directed network. For undirected networks, we consider only whether two nodes are connected or not. We have built a directed complex network and an undirected complex network, before and after the large earthquake in Iquique. We have used magnitudes greater than Mw = 1.0 and Mw = 3.0. We found that this method can recognize the influence of these small seismic events on the behavior of the network, and we found that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake in this complex system. This method also shows a difference in the values of the critical exponent γ (for the probability
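
    The bookkeeping described above (cubic cells as nodes, time-ordered links between the cells of successive hypocenters, self-connections kept in the directed network) takes only a few lines. The Python sketch below uses a random synthetic catalogue and an arbitrary cell size; it is an illustration of the construction, not the authors' code.

    import numpy as np
    from collections import defaultdict

    def build_directed_network(hypocenters, cell_km=5.0):
        """Map each hypocenter (x, y, z in km) to a cubic cell and link the cells of
        consecutive events in time; self-connections are counted, as in the directed
        network described above."""
        cells = [tuple(np.floor(h / cell_km).astype(int)) for h in hypocenters]
        connectivity = defaultdict(int)   # k_i: outgoing links plus self-connections
        edges = set()
        for a, b in zip(cells[:-1], cells[1:]):
            connectivity[a] += 1
            edges.add((a, b))
        return connectivity, edges

    # Hypothetical time-ordered catalogue: 500 events in a 50 x 50 x 30 km volume.
    rng = np.random.default_rng(1)
    catalogue = rng.uniform([0.0, 0.0, 0.0], [50.0, 50.0, 30.0], size=(500, 3))
    k, edges = build_directed_network(catalogue)
    print("nodes:", len(k), "| unique directed links:", len(edges),
          "| max connectivity:", max(k.values()))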

  18. GIS risk analysis of hazardous materials transport

    International Nuclear Information System (INIS)

    Anders, C.; Olsten, J.

    1991-01-01

    The Geographic Information System (GIS) was used to assess the risks and vulnerability of transporting hazardous materials and wastes (such as gasoline, explosives, poisons, etc) on the Arizona highway system. This paper discusses the methodology that was utilized, and the application of GIS systems to risk analysis problems

  19. Keeping pace with the science: Seismic hazard analysis in the western United States

    International Nuclear Information System (INIS)

    Youngs, R.R.; Coppersmith, K.J.

    1989-01-01

    Recent years have witnessed rapid advances in the understanding of the earthquake generation process in the western US, with particular emphasis on geologic studies of fault behavior and seismologic studies of the rupture process. The authors discuss how probabilistic seismic hazard analysis (PSHA) methodologies have been refined to keep pace with scientific understanding. Identified active faults are modeled as three-dimensional surfaces with the rupture shape and distribution of nucleation points estimated from physical constraints and seismicity. Active blind thrust ramps at depth and sources associated with subduction zones such as the Cascadia zone off Oregon and Washington can also be modeled. Maximum magnitudes are typically estimated from evaluations of possible rupture dimensions and empirical relations between these dimensions and earthquake magnitude. A rapidly evolving technique for estimating the length of future ruptures on a fault is termed segmentation, and incorporates behavior and geometric fault characteristics. To extend the short historical record, fault slip rate is now commonly used to constrain earthquake recurrence. Paleoseismic studies of fault behavior have led to the characteristic earthquake recurrence model specifying the relative frequency of earthquakes of various sizes. Recent studies have indicated the importance of faulting style and crustal structure on earthquake ground motions. For site-specific applications, empirical estimation techniques are being supplemented with numerical modeling approaches
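
    One standard way slip rate constrains recurrence, as mentioned above, is moment balancing: the seismic moment of the characteristic earthquake divided by the annual moment accumulation on the fault gives a mean recurrence interval. The sketch below uses the Hanks and Kanamori (1979) moment-magnitude relation; the fault dimensions, slip rate and rigidity are hypothetical illustration values, and this is only one of several recurrence models used in practice.

    def moment_from_magnitude(mw):
        """Seismic moment in N·m from moment magnitude (Hanks and Kanamori, 1979)."""
        return 10.0 ** (1.5 * mw + 9.1)

    def mean_recurrence_years(mw_char, fault_area_km2, slip_rate_mm_yr, mu_pa=3.0e10):
        """Moment-balance recurrence: characteristic moment / annual moment accumulation."""
        moment_rate = mu_pa * (fault_area_km2 * 1.0e6) * (slip_rate_mm_yr * 1.0e-3)
        return moment_from_magnitude(mw_char) / moment_rate

    # Hypothetical fault: 60 km x 15 km, 2 mm/yr slip rate, characteristic Mw 7.0.
    print(f"mean recurrence ~ {mean_recurrence_years(7.0, 60.0 * 15.0, 2.0):,.0f} years")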

  20. The hazard map of ML6.6 0206 Meinong earthquake near Guanmiao and its Neotectonic implication

    Science.gov (United States)

    Chung, L. H.; Shyu, J. B. H.; Huang, M. H.; Yang, K. M.; Le Beon, M.; Lee, Y. H.; Chuang, R.; Yi, D.

    2016-12-01

    Serious damage occurred in SW Taiwan due to the ML 6.6 0206 Meinong earthquake. Based on the InSAR result, a 10 cm oval-shaped uplift of the surface is located 15 km away from the epicenter, and two obvious N-S-trending sharp phase changes appear near the Guanmiao area. Our field investigation shows that building damage and surface fractures are highly correlated with the two sharp phase changes. Here, we constrain the detailed shallow underground geometry by using reflection seismic data, geologic data, and field hazard investigation. This N-S-trending surface deformation may be induced by local shallow folding, while the large uplift west of Guanmiao may be related to pure shear deformation of the thick clayey Gutingkeng (GTK) Formation. Our results imply not only that a moderate lower-crustal earthquake can trigger active structures at shallower depth, but also that these minor shallow active structures can cause serious damage and surface deformation.

  1. Seismic Hazard Analysis on a Complex, Interconnected Fault Network

    Science.gov (United States)

    Page, M. T.; Field, E. H.; Milner, K. R.

    2017-12-01

    In California, seismic hazard models have evolved from simple, segmented prescriptive models to much more complex representations of multi-fault and multi-segment earthquakes on an interconnected fault network. During the development of the 3rd Uniform California Earthquake Rupture Forecast (UCERF3), the prevalence of multi-fault ruptures in the modeling was controversial. Yet recent earthquakes, for example the Kaikōura earthquake, as well as new research on the potential of multi-fault ruptures (e.g., Nissen et al., 2016; Sahakian et al., 2017), have validated this approach. For large crustal earthquakes, multi-fault ruptures may be the norm rather than the exception. As datasets improve and we can view the rupture process at a finer scale, the interconnected, fractal nature of faults is revealed even by individual earthquakes. What is the proper way to model earthquakes on a fractal fault network? We show multiple lines of evidence that connectivity even in modern models such as UCERF3 may be underestimated, although clustering in UCERF3 mitigates some modeling simplifications. We need a methodology that can be applied equally well where the fault network is well-mapped and where it is not - an extendable methodology that allows us to "fill in" gaps in the fault network and in our knowledge.

  2. The need for the geologic hazard analysis

    International Nuclear Information System (INIS)

    Mingarro, E.

    1984-01-01

    The parameters considered in the hazard analysis associated with movements of the Earth's crust are discussed. These movements are classified as fast (seismic) movements, which are produced at a given geologic moment at a speed measured in cm/s, and slow (secular) movements, which take place over a long span of time at a speed measured in cm/year. The space/time relations are established following the probabilistic models of Poisson and Gumbel. Their application is analyzed according to the structural gradient fields, which fall within Matheron's geostatistical studies. These statistical criteria should be analyzed or checked in each geotectonic environment. This allows the definition of neotectonic and seismogenic zones, because it is only in these zones that probabilistic or deterministic criteria can be applied to evaluate the hazard and vulnerability, that is, to know the geologic hazard of every ''uniform'' piece of the Earth's crust. (author)

  3. Global volcanic earthquake swarm database and preliminary analysis of volcanic earthquake swarm duration

    Directory of Open Access Journals (Sweden)

    S. R. McNutt

    1996-06-01

    Full Text Available Global data from 1979 to 1989 pertaining to volcanic earthquake swarms have been compiled into a custom-designed relational database. The database is composed of three sections: (1) a section containing general information on volcanoes, (2) a section containing earthquake swarm data (such as dates of swarm occurrence and durations), and (3) a section containing eruption information. The most abundant and reliable parameter, duration of volcanic earthquake swarms, was chosen for preliminary analysis. The distribution of all swarm durations was found to have a geometric mean of 5.5 days. Precursory swarms were then separated from those not associated with eruptions. The geometric mean precursory swarm duration was 8 days whereas the geometric mean duration of swarms not associated with eruptive activity was 3.5 days. Two groups of precursory swarms are apparent when duration is compared with the eruption repose time. Swarms with durations shorter than 4 months showed no clear relationship with the eruption repose time. However, the second group, lasting longer than 4 months, showed a significant positive correlation with the log10 of the eruption repose period. The two groups suggest that different suites of physical processes are involved in the generation of volcanic earthquake swarms.
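
    The duration statistic used above, a geometric mean computed separately for precursory and non-precursory swarms, is straightforward to reproduce; the duration lists below are invented for illustration and are not the database values.

    import math

    def geometric_mean(values):
        """Geometric mean of a list of positive durations."""
        return math.exp(sum(math.log(v) for v in values) / len(values))

    # Hypothetical swarm durations in days.
    precursory = [2.0, 6.0, 15.0, 30.0, 8.0]
    non_precursory = [1.0, 2.5, 4.0, 7.0, 3.0]

    print(f"precursory swarms:     geometric mean = {geometric_mean(precursory):.1f} days")
    print(f"non-precursory swarms: geometric mean = {geometric_mean(non_precursory):.1f} days")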

  4. Statistical analysis of earthquake ground motion parameters

    International Nuclear Information System (INIS)

    1979-12-01

    Several earthquake ground response parameters that define the strength, duration, and frequency content of the motions are investigated using regression analysis techniques; these techniques incorporate statistical significance testing to establish the terms in the regression equations. The parameters investigated are the peak acceleration, velocity, and displacement; Arias intensity; spectrum intensity; bracketed duration; Trifunac-Brady duration; and response spectral amplitudes. The study provides insight into how these parameters are affected by magnitude, epicentral distance, local site conditions, direction of motion (i.e., whether horizontal or vertical), and earthquake event type. The results are presented in a form that facilitates their use in the development of seismic input criteria for nuclear plants and other major structures. They are also compared with results from prior investigations that have been used in the criteria development for such facilities
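
    A minimal sketch of the kind of regression described above: one ground-motion parameter regressed on magnitude and distance by least squares, here with the simple assumed form ln Y = c1 + c2 M + c3 ln R. The synthetic data and coefficients are illustrative only, and the statistical significance testing of terms mentioned in the abstract is omitted.

    import numpy as np

    # Synthetic "observations": magnitude, epicentral distance (km), and ln(PGA in g).
    rng = np.random.default_rng(2)
    mag = rng.uniform(4.5, 7.5, 200)
    dist = rng.uniform(5.0, 150.0, 200)
    ln_pga = -4.0 + 1.0 * mag - 1.3 * np.log(dist) + 0.5 * rng.standard_normal(200)

    # Least-squares fit of ln(PGA) = c1 + c2*M + c3*ln(R).
    X = np.column_stack([np.ones_like(mag), mag, np.log(dist)])
    coef, *_ = np.linalg.lstsq(X, ln_pga, rcond=None)
    print("fitted coefficients (c1, c2, c3):", np.round(coef, 2))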

  5. Time-dependent earthquake hazard evaluation in seismogenic systems using mixed Markov Chains: An application to the Japan area

    Science.gov (United States)

    Herrera, C.; Nava, F. A.; Lomnitz, C.

    2006-08-01

    A previous work introduced a new method for seismic hazard evaluation in a system (a geographic area with distinct, but related, seismogenic regions) based on modeling the transition probabilities of states (patterns of presence or absence of seismicity, with magnitude greater than or equal to a threshold magnitude Mr, in the regions of the system, during a time interval Δt) as a Markov chain. Application of this direct method to the Japan area gave very good results. Given that the most important limitation of the direct method is the relative scarcity of large magnitude events, we decided to explore the possibility that seismicity with magnitude M ≥ Mmr contains information about the future occurrence of earthquakes with M ≥ MMr, where MMr > Mmr. This mixed Markov chain method estimates the probabilities of occurrence of a system state for M ≥ MMr on the basis of the observed state for M ≥ Mmr in the previous Δt. Application of the mixed method to the area of Japan gives better hazard estimations than the direct method, in particular for large earthquakes. As part of this study, the problem of performance evaluation of hazard estimation methods is addressed, leading to the use of grading functions.
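
    The core of both the direct and the mixed method is an empirical transition matrix between system states observed in successive Δt intervals. A minimal Python sketch of that estimation step follows, using a made-up state sequence for a two-region system; the state encoding and the data are hypothetical.

    from collections import defaultdict

    def transition_probabilities(states):
        """Estimate P(next state | current state) from an observed state sequence.
        Each state is a tuple of 0/1 flags: seismicity at or above the threshold
        magnitude in each region during one time interval."""
        counts = defaultdict(lambda: defaultdict(int))
        for current, nxt in zip(states[:-1], states[1:]):
            counts[current][nxt] += 1
        return {s: {t: c / sum(row.values()) for t, c in row.items()}
                for s, row in counts.items()}

    # Hypothetical observed sequence for a two-region system, one tuple per Δt.
    sequence = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0), (1, 0), (1, 1), (1, 1), (0, 0)]
    for state, row in transition_probabilities(sequence).items():
        print(state, "->", {t: round(p, 2) for t, p in row.items()})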

  6. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment

  7. Probabilistic tsunami hazard assessment based on the long-term evaluation of subduction-zone earthquakes along the Sagami Trough, Japan

    Science.gov (United States)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Maeda, T.; Matsuyama, H.; Toyama, N.; Kito, T.; Murata, Y.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.; Hakamata, T.

    2017-12-01

    For the forthcoming large earthquakes along the Sagami Trough, where the Philippine Sea Plate is subducting beneath the northeast Japan arc, the Earthquake Research Committee (ERC) / Headquarters for Earthquake Research Promotion, Japanese government (2014a) assessed that M7 and M8 class earthquakes will occur there and defined the possible extent of the earthquake source areas. They assessed occurrence probabilities of 70% and 0%-5% within the next 30 years (from Jan. 1, 2014), respectively, for the M7 and M8 class earthquakes. First, we set 10 possible earthquake source areas (ESAs) and 920 ESAs, respectively, for M8 and M7 class earthquakes. Next, we constructed 125 characterized earthquake fault models (CEFMs) and 938 CEFMs, respectively, for M8 and M7 class earthquakes, based on the "tsunami recipe" of ERC (2017) (Kitoh et al., 2016, JpGU). All the CEFMs are allowed to have a large slip area to express fault slip heterogeneity. For all the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation using FDM, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. Finally, we re-distributed the occurrence probability over all CEFMs (Abe et al., 2014, JpGU) and gathered the exceedance probabilities for variable tsunami heights, calculated from all the CEFMs, at every observation point along the Pacific coast to obtain the PTHA. We incorporated aleatory uncertainties inherent in the tsunami calculation and in the earthquake fault slip heterogeneity. We considered two kinds of probabilistic hazard models: one is a "present-time hazard model", under the assumption that earthquake occurrence basically follows a renewal process based on the BPT distribution if the latest faulting time is known; the other is a "long-time averaged hazard model", under the assumption that earthquake occurrence follows a stationary Poisson process. We fixed our viewpoint, for example, on the probability that the tsunami height will exceed 3 meters at coastal points in next
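
    The two hazard-model variants described above, a stationary Poisson model and a renewal model with a BPT recurrence distribution conditioned on the elapsed time since the last event, can be compared with a short script. The mean recurrence, aperiodicity and elapsed time below are hypothetical, and the BPT is represented through scipy's inverse Gaussian under the usual mean/aperiodicity mapping.

    from math import exp
    from scipy.stats import invgauss

    def poisson_prob(mean_recurrence, window):
        """P(at least one event in `window` years) for a stationary Poisson process."""
        return 1.0 - exp(-window / mean_recurrence)

    def bpt_conditional_prob(mean_recurrence, aperiodicity, elapsed, window):
        """P(event within `window` | quiet for `elapsed` years) for a BPT renewal model.
        BPT with mean m and aperiodicity a is mapped to invgauss(mu=a**2, scale=m/a**2)."""
        dist = invgauss(mu=aperiodicity**2, scale=mean_recurrence / aperiodicity**2)
        return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

    # Hypothetical source: 200-year mean recurrence, aperiodicity 0.3, 95 years elapsed.
    print(f"Poisson model, 30 yr:      {poisson_prob(200.0, 30.0):.2f}")
    print(f"BPT renewal model, 30 yr:  {bpt_conditional_prob(200.0, 0.3, 95.0, 30.0):.2f}")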

  8. Java Programs for Using Newmark's Method and Simplified Decoupled Analysis to Model Slope Performance During Earthquakes

    Science.gov (United States)

    Jibson, Randall W.; Jibson, Matthew W.

    2003-01-01

    Landslides typically cause a large proportion of earthquake damage, and the ability to predict slope performance during earthquakes is important for many types of seismic-hazard analysis and for the design of engineered slopes. Newmark's method for modeling a landslide as a rigid-plastic block sliding on an inclined plane provides a useful method for predicting approximate landslide displacements. Newmark's method estimates the displacement of a potential landslide block as it is subjected to earthquake shaking from a specific strong-motion record (earthquake acceleration-time history). A modification of Newmark's method, decoupled analysis, allows modeling landslides that are not assumed to be rigid blocks. This open-file report is available on CD-ROM and contains Java programs intended to facilitate performing both rigorous and simplified Newmark sliding-block analysis and a simplified model of decoupled analysis. For rigorous analysis, 2160 strong-motion records from 29 earthquakes are included along with a search interface for selecting records based on a wide variety of record properties. Utilities are available that allow users to add their own records to the program and use them for conducting Newmark analyses. Also included is a document containing detailed information about how to use Newmark's method to model dynamic slope performance. This program will run on any platform that supports the Java Runtime Environment (JRE) version 1.3, including Windows, Mac OSX, Linux, Solaris, etc. A minimum of 64 MB of available RAM is needed, and the fully installed program requires 400 MB of disk space.
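
    Conceptually, the rigorous rigid-block analysis integrates the relative velocity of the block whenever the ground acceleration exceeds the critical (yield) acceleration, and integrates again to obtain displacement. The Python sketch below, which is not the USGS Java code, shows that double-integration loop for one-directional sliding on a synthetic acceleration history; the critical acceleration and the input motion are hypothetical.

    import numpy as np

    def newmark_displacement(accel_g, dt, ac_g, g=9.81):
        """Rigid-block Newmark displacement (m) for a ground acceleration history (in g),
        time step dt (s) and critical acceleration ac (in g). Simplified one-directional
        sliding: the block accelerates when a(t) > ac and decelerates until it stops."""
        vel, disp = 0.0, 0.0
        for a in accel_g:
            rel_acc = (a - ac_g) * g if (a > ac_g or vel > 0.0) else 0.0
            vel = max(vel + rel_acc * dt, 0.0)   # no backward sliding in this sketch
            disp += vel * dt
        return disp

    # Synthetic strong-motion record: 0.4 g, 2 Hz decaying sinusoid, 20 s at 100 samples/s.
    dt = 0.01
    t = np.arange(0.0, 20.0, dt)
    accel = 0.4 * np.sin(2.0 * np.pi * 2.0 * t) * np.exp(-0.15 * t)
    print(f"Newmark displacement for ac = 0.1 g: {newmark_displacement(accel, dt, 0.1):.3f} m")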

  9. Preliminary Hazards Analysis of K-Basin Fuel Encapsulation and Storage

    International Nuclear Information System (INIS)

    Strickland, G.C.

    1994-01-01

    This Preliminary Hazards Analysis (PHA) systematically examines the K-Basin facilities and their supporting systems for hazards created by abnormal operating conditions and external events (e.g., earthquakes) which have the potential for causing undesirable consequences to the facility worker, the onsite individual, or the public. The operational activities examined are fuel encapsulation, fuel storage and cooling. Encapsulation of sludges in the basins is not examined. A team of individuals from Westinghouse produced a set of Hazards and Operability (HAZOP) tables documenting their examination of abnormal process conditions in the systems and activities examined in K-Basins. The purpose of this report is to reevaluate and update the HAZOP in the original Preliminary Hazard Analysis of K-Basin Fuel Encapsulation and Storage originally developed in 1991

  10. Reliability analysis of service water system under earthquake

    International Nuclear Information System (INIS)

    Yu Yu; Qian Xiaoming; Lu Xuefeng; Wang Shengfei; Niu Fenglei

    2013-01-01

    The service water system is one of the important safety systems in a nuclear power plant, and its failure probability is usually obtained by system reliability analysis. The probability of equipment failure under an earthquake is a function of the peak acceleration of the earthquake motion, while the occurrence of earthquakes is random; thus the traditional fault tree method used in current probabilistic safety assessment is not powerful enough to deal with this kind of conditional probability problem. An analysis framework for system reliability evaluation under seismic conditions is put forward in this paper, in which Monte Carlo simulation is used to deal with the conditional probability problem. The annual failure probability of the service water system was calculated, and a failure probability of 1.46×10⁻⁴ per year was obtained. The analysis result is in accordance with the data indicating the seismic resistance capability of the equipment, and the rationality of the model is validated. (authors)
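
    The conditional-probability treatment described above, combining random earthquake occurrence and size with acceleration-dependent component fragility, lends itself to Monte Carlo sampling. The sketch below is a generic illustration under assumed hazard and fragility parameters; it is not the authors' model of the service water system, and the resulting number will differ from theirs.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    n_years = 1_000_000   # simulated plant-years

    # Assumed hazard: damaging earthquakes as a Poisson process (0.02 per year); given an
    # event, peak ground acceleration (g) is lognormal with median 0.15 g and beta 0.6.
    n_events = rng.poisson(0.02, n_years)

    # Assumed lognormal fragility of the system: median capacity 0.5 g, beta 0.4.
    def p_fail(pga_g):
        return norm.cdf(np.log(pga_g / 0.5) / 0.4)

    failed_years = 0
    for n in n_events[n_events > 0]:
        pga = rng.lognormal(np.log(0.15), 0.6, size=n)
        failed_years += bool(np.any(rng.uniform(size=n) < p_fail(pga)))

    print(f"estimated annual failure probability ~ {failed_years / n_years:.1e}")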

  11. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    Science.gov (United States)

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda, as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top ranked disaster hazards in Uganda are: 1) Epidemics of infectious diseases, 2) Drought/famine, and 3) Conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) Conflicts, 2) Epidemics, 3) Drought/famine and 4) Environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.

  12. Widespread seismicity excitation following the 2011 M=9.0 Tohoku, Japan, earthquake and its implications for seismic hazard

    Science.gov (United States)

    Toda, S.; Stein, R. S.; Lin, J.

    2011-12-01

    trench slope normal faults, the Kanto fragment beneath Tokyo, the Itoigawa-Shizuoka Tectonic Line, and several other major faults were brought significantly closer to failure. Elevated seismicity in these areas is evident and has remained higher than normal during the 4.5 months after the Tohoku earthquake. Since several faults are overdue and closer to their next failure, an urgent update of the probabilistic seismic hazard map incorporating the impact of the great Tohoku earthquake is required.

  13. Fracture analysis of concrete gravity dam under earthquake induced ...

    African Journals Online (AJOL)

    Michael Horsfall

    Fracture analysis of concrete gravity dam under earthquake induced loads. 1. ABBAS MANSOURI; 2. ... 1. Civil Engineering, Islamic Azad University (South Branch of Tehran), Tehran, Iran ... parameter has on the results of numerical calculations. In this analysis ... with the help of Abaqus software (Abaqus theory manual ...

  14. Seismic hazard analysis. Application of methodology, results, and sensitivity studies

    International Nuclear Information System (INIS)

    Bernreuter, D.L.

    1981-10-01

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. (author)

  15. Fractal analysis of the spatial distribution of earthquakes along the Hellenic Subduction Zone

    Science.gov (United States)

    Papadakis, Giorgos; Vallianatos, Filippos; Sammonds, Peter

    2014-05-01

    slope of the recurrence curve to forecast earthquakes in Colombia. Earth Sci. Res. J., 8, 3-9. Makropoulos, K., Kaviris, G., Kouskouna, V., 2012. An updated and extended earthquake catalogue for Greece and adjacent areas since 1900. Nat. Hazards Earth Syst. Sci., 12, 1425-1430. Papadakis, G., Vallianatos, F., Sammonds, P., 2013. Evidence of non extensive statistical physics behavior of the Hellenic Subduction Zone seismicity. Tectonophysics, 608, 1037-1048. Papaioannou, C.A., Papazachos, B.C., 2000. Time-independent and time-dependent seismic hazard in Greece based on seismogenic sources. Bull. Seismol. Soc. Am., 90, 22-33. Robertson, M.C., Sammis, C.G., Sahimi, M., Martin, A.J., 1995. Fractal analysis of three-dimensional spatial distributions of earthquakes with a percolation interpretation. J. Geophys. Res., 100, 609-620. Turcotte, D.L., 1997. Fractals and chaos in geology and geophysics. Second Edition, Cambridge University Press. Vallianatos, F., Michas, G., Papadakis, G., Sammonds, P., 2012. A non-extensive statistical physics view to the spatiotemporal properties of the June 1995, Aigion earthquake (M6.2) aftershock sequence (West Corinth rift, Greece). Acta Geophys., 60, 758-768.

  16. Using Earthquake Analysis to Expand the Oklahoma Fault Database

    Science.gov (United States)

    Chang, J. C.; Evans, S. C.; Walter, J. I.

    2017-12-01

    The Oklahoma Geological Survey (OGS) is compiling a comprehensive Oklahoma Fault Database (OFD), which includes faults mapped in OGS publications, university thesis maps, and industry-contributed shapefiles. The OFD includes nearly 20,000 fault segments, but the work is far from complete. The OGS plans on incorporating other sources of data into the OFD, such as new faults from earthquake sequence analyses, geologic field mapping, active-source seismic surveys, and potential fields modeling. A comparison of Oklahoma seismicity and the OFD reveals that earthquakes in the state appear to nucleate on mostly unmapped or unknown faults. Here, we present faults derived from earthquake sequence analyses. From 2015 to present, there has been a five-fold increase in real-time seismic stations in Oklahoma, which has greatly expanded and densified the state's seismic network. The current seismic network not only improves our threshold for locating weaker earthquakes, but also allows us to better constrain focal plane solutions (FPS) from first-motion analyses. Using nodal planes from the FPS, HypoDD relocation, and historic seismic data, we can elucidate these previously unmapped seismogenic faults. As the OFD is a primary resource for various scientific investigations, the inclusion of seismogenic faults improves further derivative studies, particularly with respect to seismic hazards. Our primary focus is on four areas of interest, which have had M5+ earthquakes in recent Oklahoma history: Pawnee (M5.8), Prague (M5.7), Fairview (M5.1), and Cushing (M5.0). Subsequent areas of interest will include seismically active, data-rich areas, such as the central and north-central parts of the state.

  17. Promise and problems in using stress triggering models for time-dependent earthquake hazard assessment

    Science.gov (United States)

    Cocco, M.

    2001-12-01

    Earthquake stress changes can promote failures on favorably oriented faults and modify the seismicity pattern over broad regions around the causative faults. Because the induced stress perturbations modify the rate of production of earthquakes, they alter the probability of seismic events in a specified time window. Comparing the Coulomb stress changes with the seismicity rate changes and aftershock patterns can statistically test the role of stress transfer in earthquake occurrence. The interaction probability may represent a further tool to test the stress trigger or shadow model. The probability model, which incorporates stress transfer, has the main advantage of including the contributions of the induced stress perturbation (a static step in its present formulation), the loading rate, and the fault constitutive properties. Because the mechanical conditions of the secondary faults at the time of application of the induced load are largely unknown, stress triggering can only be tested on fault populations and not on single earthquake pairs with a specified time delay. The interaction probability can represent the most suitable tool to test the interaction between large-magnitude earthquakes. Despite these important implications and the stimulating perspectives, there are problems in understanding earthquake interaction that should motivate future research but at the same time limit its immediate social applications. One major limitation is that we are unable to predict how and if the induced stress perturbations modify the ratio between small versus large magnitude earthquakes. In other words, we cannot distinguish between a change in this ratio in favor of small events or of large magnitude earthquakes, because the interaction probability is independent of magnitude. Another problem concerns the reconstruction of the stressing history. The interaction probability model is based on the response to a static step; however, we know that other processes contribute to

  18. Keeping pace with the science: Seismic hazard analysis in the central and eastern United States

    International Nuclear Information System (INIS)

    Coppersmith, K.J.; Youngs, R.R.

    1989-01-01

    Our evolving tectonic understanding of the causes and locations of earthquakes in the central and eastern US (CEUS) has been a challenge to probabilistic seismic hazard analyses (PSHA) methodologies. The authors summarize some of the more significant advances being made in characterizing the location, maximum earthquake size, recurrence, and ground motions associated with CEUS earthquakes

  19. Seismic hazard analysis with PSHA method in four cities in Java

    International Nuclear Information System (INIS)

    Elistyawati, Y.; Palupi, I. R.; Suharsono

    2016-01-01

    In this study, tectonic earthquakes were characterized in terms of peak ground acceleration using the PSHA method, with the earthquake sources divided into zones. The study applied earthquake data from 1965-2015 whose completeness had been analyzed; the study area was the whole of Java, with emphasis on four large earthquake-prone cities. The results are hazard maps for return periods of 500 years and 2500 years, together with hazard curves for the four major cities (Jakarta, Bandung, Yogyakarta, and Banyuwangi). The 500-year PGA hazard map of Java shows peak ground accelerations ranging from 0 g to ≥ 0.5 g, while the 2500-year return period gives values from 0 to ≥ 0.8 g. In the PGA hazard curves, the most influential earthquake sources are background fault sources such as the Cimandiri fault; for the city of Bandung, the controlling contribution likewise comes from a background fault source; for Yogyakarta, the most influential source is the background seismicity of the Opak fault; and the hazard curve of Banyuwangi is dominated by the Java and Sumba megathrust sources. (paper)
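
    The PSHA workflow summarized above (zoned sources, return-period hazard maps, and per-city hazard curves) follows the standard total-probability formulation. The Python sketch below, which is not taken from the paper, only illustrates the structure of that calculation for a single source zone; the recurrence parameters, the source-to-site distance, and the attenuation relation are hypothetical placeholders.

      import numpy as np
      from scipy.special import erfc

      def truncated_gr_pdf(m, b=1.0, m_min=5.0, m_max=8.0):
          """Doubly truncated Gutenberg-Richter magnitude density."""
          beta = b * np.log(10.0)
          return beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

      def prob_exceed(m, r_km, pga_g, sigma_ln=0.6):
          """P(PGA > pga_g | m, r) from a simple, hypothetical attenuation relation."""
          ln_median = -1.0 + 0.9 * m - 1.2 * np.log(r_km + 10.0)   # ln(PGA in g), illustrative
          z = (np.log(pga_g) - ln_median) / sigma_ln
          return 0.5 * erfc(z / np.sqrt(2.0))

      def hazard_curve(pga_levels, nu=0.2, r_km=40.0, m_min=5.0, m_max=8.0, b=1.0):
          """Annual exceedance rate at each PGA level for one zone with rate nu of M >= m_min."""
          m = np.linspace(m_min, m_max, 200)
          return np.array([nu * np.trapz(truncated_gr_pdf(m, b, m_min, m_max) *
                                         prob_exceed(m, r_km, a), m)
                           for a in pga_levels])

      pga = np.logspace(-2, 0, 25)            # 0.01 g to 1 g
      lam = hazard_curve(pga)                 # annual exceedance rates
      p_500yr = 1.0 - np.exp(-lam * 500.0)    # exceedance probability in a 500-year window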

  20. Performance Analysis: Control of Hazardous Energy

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, Connie E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Freeman, Jeff W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kerr, Christine E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-10-06

    LLNL experienced 26 occurrences related to the control of hazardous energy from January 1, 2008 through August 2010. These occurrences were 17% of the total number of reported occurrences during this 32-month period. The Performance Analysis and Reporting Section of the Contractor Assurance Office (CAO) routinely analyzes reported occurrences and issues looking for patterns that may indicate changes in LLNL’s performance and early indications of performance trends. It became apparent through these analyses that LLNL might have experienced a change in the control of hazardous energy and that these occurrences should be analyzed in more detail to determine if the perceived change in performance was real, whether that change is significant and if the causes of the occurrences are similar. This report documents the results of this more detailed analysis.

  1. Analysis of Earthquake Source Spectra in Salton Trough

    Science.gov (United States)

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
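
    The final step described above, converting corner frequencies and moments into Brune-type stress drops, is commonly done with the Brune (1970) circular-source relations. The sketch below is a generic illustration of that conversion, not code from the study; the shear-wave velocity, the constant k, and the example moment and corner frequency are assumed values.

      import numpy as np

      def brune_stress_drop(moment_nm, fc_hz, beta_m_s=3500.0, k=0.32):
          """Brune-type stress drop (Pa) from seismic moment (N*m) and corner frequency (Hz).

          Source radius r = k * beta / fc; stress drop = (7/16) * M0 / r**3.
          """
          r = k * beta_m_s / fc_hz
          return (7.0 / 16.0) * moment_nm / r**3

      m0 = 10 ** (1.5 * 3.5 + 9.1)                          # moment of an Mw 3.5 event, N*m
      print(brune_stress_drop(m0, fc_hz=5.0) / 1e6, "MPa")  # illustrative values only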

  2. Large LOCA accident analysis for AP1000 under earthquake

    International Nuclear Information System (INIS)

    Yu, Yu; Lv, Xuefeng; Niu, Fenglei

    2015-01-01

    Highlights: • Seismic failure event probability is induced by uncertainties in PGA and in Am. • Uncertainty in PGA is shared by all the components at the same place. • Correlation induced by sharing the PGA value can be analyzed explicitly by the MC method. • Multiple component failures and accident sequences will occur under high PGA values. - Abstract: Seismic probabilistic safety assessment (PSA) is developed to give insight into nuclear power plant risk under earthquakes and the main contributors to that risk. However, the component failure probability, including the initiating event frequency, is a function of peak ground acceleration (PGA), and all the components, especially the different kinds of components at the same place, share the common ground shaking, which is one of the important factors influencing the result. In this paper, we propose an analysis method based on Monte Carlo (MC) simulation in which the effect of all components sharing the same PGA level can be expressed explicitly. The large LOCA accident in AP1000 is analyzed as an example. Based on the seismic hazard curve used in this paper, the core damage frequency is almost equal to the initiating event frequency; moreover, the frequency of each accident sequence is close to, and even equal to, the initiating event frequency, while the main contributors are seismic events, since multiple component and system failures happen simultaneously when a high value of PGA is sampled. The component failure probability is determined by the uncertainties in PGA and in component seismic capacity, and the former is the crucial element influencing the result.
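
    The shared-PGA effect described above can be illustrated with a few lines of Monte Carlo code: one PGA value is sampled per event and applied to every component, whose lognormal capacities are sampled independently. The sketch below is a toy illustration, not the paper's model; the hazard shape, the median capacities Am, and the uncertainty beta are assumed values.

      import numpy as np

      rng = np.random.default_rng(0)

      def mc_shared_pga(n=200_000, am=(0.6, 0.8), beta=0.4, annual_rate=1e-3, pga_ref=0.3):
          """Toy MC estimate of the joint failure frequency of two components sharing one PGA.

          am: median seismic capacities (g); beta: lognormal capacity uncertainty.
          Event PGAs are drawn from a simple power-law hazard, used here only for illustration.
          """
          u = rng.uniform(size=n)
          pga = pga_ref * u ** (-0.5)                  # inverse-CDF sample of the toy hazard
          caps = np.exp(np.log(am) + beta * rng.standard_normal((n, 2)))  # lognormal capacities
          fails = caps < pga[:, None]                  # both components see the same PGA
          return annual_rate * np.mean(fails.all(axis=1))   # joint failure frequency per year

      print(mc_shared_pga())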

  3. A procedure for assessing seismic hazard generated by Vrancea earthquakes and its application. III. A method for developing isoseismal and isoacceleration maps. Applications

    International Nuclear Information System (INIS)

    Enescu, D.; Enescu, B.D.

    2007-01-01

    A method for developing isoseismal and isoacceleration maps assumed to be valid for future strong earthquakes (M GR > 6.7) is described as the third stage of a procedure for assessing the seismic hazard generated by Vrancea earthquakes. The method relies on the results of the former two stages given by Enescu et al. and on further developments that are presented in this paper. Moreover, it is based on instrumental recording data. Major earthquakes that took place in Vrancea (November 10, 1940, M GR = 7.4; March 4, 1977, M GR = 7.2; and the strongest possible event) were examined as a way to test the method. The method is also applied for an earthquake of magnitude M GR = 6.7. Given the successful results of the tests, the method can be used for predicting isoseismal and isoacceleration maps for future Vrancea earthquakes of various magnitudes M GR ≥ 6.7. (authors)

  4. An Overview of Soil Models for Earthquake Response Analysis

    Directory of Open Access Journals (Sweden)

    Halida Yunita

    2015-01-01

    Full Text Available Earthquakes can damage thousands of buildings and infrastructure as well as cause the loss of thousands of lives. During an earthquake, the damage to buildings is mostly caused by the effect of local soil conditions. Depending on the soil type, the earthquake waves propagating from the epicenter to the ground surface will result in various behaviors of the soil. Several studies have been conducted to accurately obtain the soil response during an earthquake. The soil model used must be able to characterize the stress-strain behavior of the soil during the earthquake. This paper compares equivalent linear and nonlinear soil model responses. Analysis was performed on two soil types, Site Class D and Site Class E. An equivalent linear soil model leads to a constant value of shear modulus, while in a nonlinear soil model the shear modulus changes continuously, depending on the stress level, and shows inelastic behavior. The results from a comparison of both soil models are displayed in the form of maximum acceleration profiles and stress-strain curves.
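
    The equivalent linear idea mentioned above (a single strain-compatible secant modulus, versus a modulus that evolves with the stress level in a nonlinear model) can be sketched in a few lines. The example below is a generic illustration with a hypothetical hyperbolic modulus-reduction curve and assumed parameter values; it is not the analysis used in the paper.

      import numpy as np

      def g_over_gmax(gamma, gamma_ref=0.001):
          """Hypothetical hyperbolic modulus-reduction curve G/Gmax as a function of shear strain."""
          return 1.0 / (1.0 + gamma / gamma_ref)

      def equivalent_linear_modulus(gmax_pa, tau_pa, tol=1e-6, max_iter=50):
          """Iterate to a strain-compatible secant shear modulus for a given shear stress."""
          g = gmax_pa
          for _ in range(max_iter):
              gamma = tau_pa / g                       # strain implied by the current modulus
              g_new = gmax_pa * g_over_gmax(gamma)
              if abs(g_new - g) < tol * gmax_pa:
                  break
              g = g_new
          return g, gamma

      g, gamma = equivalent_linear_modulus(gmax_pa=80e6, tau_pa=40e3)
      print(f"secant G = {g / 1e6:.1f} MPa at shear strain {gamma:.2e}")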

  5. Observations and recommendations regarding landslide hazards related to the January 13, 2001 M-7.6 El Salvador earthquake

    Science.gov (United States)

    Jibson, Randall W.; Crone, Anthony J.

    2001-01-01

    The January 13, 2001 earthquake (M-7.6) off the coast of El Salvador triggered widespread damaging landslides in many parts of El Salvador. In the aftermath of the earthquake, the Salvadoran government requested technical assistance through the U.S. Agency for International Development (USAID); USAID, in turn, requested help from technical experts in landslide hazards from the U.S. Geological Survey. In response to that request, we arrived in El Salvador on January 31, 2001 and worked with USAID personnel and Salvadoran agency counterparts in visiting landslide sites and evaluating present and potential hazards. A preliminary, unofficial report was prepared at the end of our trip (February 9) to provide immediate information and assistance to interested agencies and parties. The current report is an updated and somewhat expanded version of that unofficial report. Because of the brief nature of this report, conclusions and recommendations contained herein should be considered tentative and may be revised in the future.

  6. Analysis of the local lithospheric magnetic activity before and after Panzhihua Mw = 6.0 earthquake (30 August 2008, China)

    Directory of Open Access Journals (Sweden)

    Q. Li

    2011-12-01

    Full Text Available Lithospheric ultra low frequency (ULF) magnetic activity has recently been considered a very promising candidate for application to short-term earthquake forecasting. However, the intensity of the ULF lithospheric magnetic field is very weak and often masked by much stronger ionospheric and magnetospheric signals. The study of pre-earthquake magnetic activity before the occurrence of a strong earthquake is a very hard problem, which consists of the identification and localization of the weak signal sources in earthquake-hazardous areas of the Earth's crust. For the separation and localization of such sources, we used a new polarization ellipse technique (Dudkin et al., 2010) to process data acquired from fluxgate magnetometers installed in the Sichuan province, China. Sichuan is the region of the strongest seismic activity on the territory of China. During the last century, about 40 earthquakes with magnitude M ≥ 6.5 occurred here in close proximity to heavily populated zones. The Panzhihua earthquake Mw = 6.0 occurred in the southern part of Sichuan province on 30 August 2008 at 8:30:52 UT. The earthquake hypocentre was located at 10 km depth. During the period from 30–31 August to the beginning of September 2008, many clustered aftershocks with magnitudes of up to 5.6 occurred near the earthquake epicentre. The data from three fluxgate magnetometers (belonging to the China magnetometer network and placed near the clustered earthquakes, at distances of 10–55 km from the main shock epicenter) have been processed. The separation between the magnetometers was in the range of 40–65 km. The analysis of the local lithospheric magnetic activity during the period of January–December 2008 and a possible source structure are presented in this paper.

  7. Natural time analysis of the Centennial Earthquake Catalog

    International Nuclear Information System (INIS)

    Sarlis, N. V.; Christopoulos, S.-R. G.

    2012-01-01

    By using the most recent version (1900–2007) of the Centennial Earthquake Catalog, we examine the properties of global seismicity. Natural time analysis reveals that the fluctuations of the order parameter κ1 of seismicity exhibit, for at least three orders of magnitude, a characteristic feature similar to that of the order parameter for other equilibrium or non-equilibrium critical systems, including self-organized critical systems. Moreover, we find non-trivial magnitude correlations for earthquakes of magnitude greater than or equal to 7.

  8. Time-decreasing hazard and increasing time until the next earthquake

    International Nuclear Information System (INIS)

    Corral, Alvaro

    2005-01-01

    The existence of a slowly (and monotonically) decreasing probability density for the recurrence times of earthquakes in the stationary case implies that the occurrence of an event at a given instant becomes less likely as the time since the previous event increases. Consequently, the expected waiting time to the next earthquake increases with the elapsed time; that is, the expected event recedes rapidly into the future. We have found direct empirical evidence of this counterintuitive behavior in two worldwide catalogs as well as in diverse regional catalogs. Universal scaling functions describe the phenomenon well.
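
    The behavior described above can be reproduced numerically for any suitable recurrence-time density by computing the expected residual waiting time E[T - t | T > t]. The sketch below is a generic illustration, not the paper's analysis; it assumes a gamma-type recurrence distribution (shape < 1) of the kind often used for rescaled recurrence times.

      import numpy as np
      from scipy.stats import gamma

      # Gamma recurrence-time model with shape < 1 and unit mean (assumed values).
      shape = 0.7
      dist = gamma(a=shape, scale=1.0 / shape)

      def expected_residual(t, t_max=200.0, n=20_000):
          """E[T - t | T > t], via numerical integration of the survival function."""
          s = np.linspace(t, t_max, n)
          return np.trapz(dist.sf(s), s) / dist.sf(t)

      for t in (0.0, 0.5, 1.0, 2.0, 5.0):
          print(t, round(expected_residual(t), 3))   # grows with the elapsed time t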

  9. Site Specific Probabilistic Seismic Hazard and Risk Analysis for Surrounding Communities of The Geysers Geothermal Development Area

    Science.gov (United States)

    Miah, M.; Hutchings, L. J.; Savy, J. B.

    2014-12-01

    We conduct a probabilistic seismic hazard and risk analysis for induced and tectonic earthquakes over a 50 km radius area centered on The Geysers, California, and for the next ten years. We calculate hazard with both a conventional and a physics-based approach. We estimate site-specific hazard. We convert hazard to risk of nuisance and of damage to structures per year and map the risk. For the conventional PSHA we assume the past ten years is indicative of hazard for the next ten years from Mnoise. We then interpolate within each geologic unit over finely gridded points; all grid points within a unit are weighted by distance from each data collection point, and the entire process is repeated for all of the other types of geologic units until the entire area is gridded and assigned a hazard value at every grid point. We found that nuisance and damage risks calculated by the conventional and physics-based approaches are almost identical. This is very surprising, since they were calculated by completely independent means. The conventional approach used the actual catalog of the past ten years of earthquakes to estimate the hazard for the next ten years, while the physics-based approach used geotechnical modeling to calculate the catalog for the next ten years. Similarly, for the conventional PSHA we utilized attenuation relations from past earthquakes recorded at The Geysers to translate the ground motion from the source to the site, while for the physics-based approach we calculated ground motion from simulation of actual earthquake rupture. Finally, the source of the earthquakes was the actual source for the conventional PSHA, while we assumed random fractures for the physics-based approach. From all this, we consider the conventional calculation, based on actual data, to validate the physics-based approach.

  10. WIPP fire hazards and risk analysis

    International Nuclear Information System (INIS)

    1991-05-01

    The purpose of this analysis was to conduct a fire hazards risk analysis of the Transuranic (TRU) contact-handled waste receipt, emplacement, and disposal activities at the Waste Isolation Pilot Plant (WIPP). The technical bases and safety envelope for these operations are defined in the approved WIPP Final Safety Analysis Report (FSAR). Although the safety documentation for the initial phase of the Test Program, the dry bin scale tests, has not yet been approved by the Department of Energy (DOE), reviews of the draft to date, including those by the Advisory Committee on Nuclear Facility Safety (ACNFS), have concluded that the dry bin scale tests present no significant risks in excess of those estimated in the approved WIPP FSAR. It is the opinion of the authors and reviewers of this analysis, based on sound engineering judgment and knowledge of the WIPP operations, that a Fire Hazards and Risk Analysis specific to the dry bin scale test program is not warranted prior to first waste receipt. This conclusion is further supported by the risk analysis presented in this document which demonstrates the level of risk to WIPP operations posed by fire to be extremely low. 15 refs., 41 figs., 48 tabs

  11. Fire hazard analysis for fusion energy experiments

    International Nuclear Information System (INIS)

    Alvares, N.J.; Hasegawa, H.K.

    1979-01-01

    The 2XIIB mirror fusion facility at Lawrence Livermore Laboratory (LLL) was used to evaluate the fire safety of state-of-the-art fusion energy experiments. The primary objective of this evaluation was to ensure the parallel development of fire safety and fusion energy technology. Through fault-tree analysis, we obtained a detailed engineering description of the 2XIIB fire protection system. This information helped us establish an optimum level of fire protection for experimental fusion energy facilities as well as evaluate the level of protection provided by various systems. Concurrently, we analyzed the fire hazard inherent to the facility using techniques that relate the probability of ignition to the flame spread and heat-release potential of construction materials, electrical and thermal insulations, and dielectric fluids. A comparison of the results of both analyses revealed that the existing fire protection system should be modified to accommodate the range of fire hazards inherent to the 2XIIB facility

  12. Decision analysis for INEL hazardous waste storage

    Energy Technology Data Exchange (ETDEWEB)

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft² of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  13. Decision analysis for INEL hazardous waste storage

    International Nuclear Information System (INIS)

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft² of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  14. Study on Frequency content in seismic hazard analysis in West Azarbayjan and East Azarbayjan provinces (Iran)

    Science.gov (United States)

    Behzadafshar, K.; Abbaszadeh Shahri, A.; Isfandiari, K.

    2012-12-01

    ABSTRACT: The Iran plate is prone to earthquakes, as certified by the occurrence of destructive events approximately every 5 years. Because of past great earthquakes and the large number of potential seismic sources (active faults), some of which are responsible for great earthquakes, north-west Iran, located at the junction of the Alborz and Zagros seismotectonic provinces (Mirzaii et al., 1998), is an interesting area for seismologists. Considering the population and the existence of large cities such as Tabriz, Ardabil and Orumiyeh, which play a crucial role in the industry and economy of Iran, the authors decided to focus on seismic hazard assessment in these two provinces in order to obtain ground accelerations for different frequency contents and to indicate the critical frequencies in the studied area. Although many studies have been carried out in north-west Iran, building code modifications also require frequency-content analysis to assess the seismic hazard more precisely, which is done in the present study. Furthermore, previous studies applied freely downloadable software released before 2000, whereas the most important advantage of this study is the application of professional industrial software written in 2009 and provided by the authors. This software addresses the weak points of the earlier programs, such as gridding of potential sources, attention to the seismogenic zone, and direct application of attenuation relationships. The obtained hazard maps illustrate that the maximum accelerations are experienced along a north-west to south-east direction; they increase as the frequency is reduced from 100 Hz to 10 Hz and then decrease as the frequency is reduced further (to 0.25 Hz). The maximum acceleration at the basement occurs for a frequency content of 10 Hz. Keywords: hazard map, frequency content, seismogenic zone, Iran

  15. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis

    Science.gov (United States)

    Schaefer, K.; Prakash, A.; Witte, W.

    2011-12-01

    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area that is dominated by a semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wildland fires, earthquakes, extreme winter storms, and other natural and man-made hazards. With a large area of 19,065 km2 and a population of approximately 97,000 residents, providing emergency services in a timely manner is a challenge. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network analysis based decision support tool that we have created for planning and emergency response. This tool will be used by Emergency Services (Fire/EMS), Emergency Management, the Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare for and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis was used to determine response times and to classify the borough by response time in order to facilitate the allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) a major wildfire threatening Fairbanks, (ii) a major earthquake, and (iii) loss of power during flooding in a flood-prone area. We also combined the network analysis results with high resolution imagery and elevation data to determine
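
    The response-time network analysis described above (performed in the study with ArcGIS and HAZUS-MH) can be illustrated in a generic way on a small road graph: compute the shortest travel time from each station to every reachable node and keep the minimum. The sketch below uses networkx as a stand-in for the GIS tooling; the station locations, node names, and travel times are hypothetical.

      import networkx as nx

      g = nx.Graph()
      g.add_weighted_edges_from([
          ("station_A", "junction_1", 4.0),        # edge weights are travel times in minutes
          ("junction_1", "neighborhood_1", 6.0),
          ("junction_1", "neighborhood_2", 9.0),
          ("station_B", "neighborhood_2", 3.0),
          ("station_B", "junction_2", 5.0),
          ("junction_2", "neighborhood_3", 7.0),
      ], weight="time")

      stations = ["station_A", "station_B"]
      response = {}
      for s in stations:
          times = nx.single_source_dijkstra_path_length(g, s, weight="time")
          for node, t in times.items():
              response[node] = min(t, response.get(node, float("inf")))

      # Classify nodes into response-time bands, e.g. <5, 5-10, >10 minutes.
      bands = {n: ("<5 min" if t < 5 else "5-10 min" if t <= 10 else ">10 min")
               for n, t in response.items()}
      print(bands)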

  16. Earthquake prediction in Japan and natural time analysis of seismicity

    Science.gov (United States)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    ' (SES) data are available, as in Greece, the natural time analysis of the seismicity after the initiation of the SES allows the determination of the time window of the impending mainshock through the evolution of the value of κ1 itself. It was found to work also for the 1989 M7.1 Loma Prieta earthquake. If SES data are not available, we rely solely on the evolution of the fluctuations of κ1, obtained by computing κ1 values using a natural time window of a certain length sliding through the earthquake catalog. The fluctuations of the order parameter, in terms of the variability, i.e., the standard deviation divided by the average, were found to increase dramatically when approaching the 11 March M9 super-giant earthquake. In fact, such an increase was also found for the M7.1 Kobe earthquake in 1995, the M8.0 Tokachi-oki earthquake in 2003, and the Landers and Hector Mine earthquakes in Southern California. It is worth mentioning that such an increase is obtained straightforwardly from ordinary earthquake catalogs without any adjustable parameters.
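
    In natural time analysis, the k-th of N events in a window is assigned the natural time χ_k = k/N and the normalized energy p_k (proportional to 10^(1.5 M_k)), and the order parameter is κ1 = Σ p_k χ_k² - (Σ p_k χ_k)². The sketch below illustrates the sliding-window variability (standard deviation divided by average of κ1) described above; the synthetic Gutenberg-Richter catalog and the window length are placeholders, not data from the study.

      import numpy as np

      def kappa1(magnitudes):
          """Order parameter kappa_1 of natural time for one window of events."""
          n = len(magnitudes)
          chi = np.arange(1, n + 1) / n                       # natural time chi_k = k/N
          energy = 10.0 ** (1.5 * np.asarray(magnitudes))     # seismic energy proxy
          p = energy / energy.sum()                           # normalized energies p_k
          return np.sum(p * chi**2) - np.sum(p * chi) ** 2

      def variability(magnitudes, window=100):
          """Variability (std/mean) of kappa_1 over windows sliding through the catalog."""
          k1 = np.array([kappa1(magnitudes[i:i + window])
                         for i in range(len(magnitudes) - window + 1)])
          return k1.std() / k1.mean()

      rng = np.random.default_rng(1)
      mags = 4.0 + rng.exponential(1.0 / np.log(10), size=2000)   # synthetic G-R catalog, b = 1
      print(variability(mags))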

  17. Teleseismic analysis of the 1990 and 1991 earthquakes near Potenza

    Directory of Open Access Journals (Sweden)

    G. Ekstrom

    1994-06-01

    Full Text Available Analysis of the available teleseismic data for two moderate earthquakes near the town of Potenza in the Southern Apennines shows that both involve strike-slip faulting on a plane oriented approximately east-west. Only the larger, 5 May 1990, earthquake is sufficiently large for analysis by conventional teleseismic waveform inversion methods, and is seen to consist of a foreshock followed 11 seconds later by the main release of moment. The focal mechanism and seismic moment of the 26 May 1991 earthquake are determined by quantitative comparison of its 15-60 s period surface waves with those generated by the 5 May 1990 event. The focal mechanisms for the two events are found to be very similar. The 1991 earthquake has a scalar moment that is approximately 18% that of the 1990 mainshock. Comparison of higher frequency P waves for the two events, recorded at regional distance, shows that the ratio of trace amplitudes is smaller than the ratio of scalar moments, suggesting that the stress drop for the 1991 event is distinctly smaller than for the 1990 mainshock.

  18. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they are put to use for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performances is an essential step also in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting out limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performances of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  19. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at operator error as the sole contributor to human failure within a system, but at a combination of all underlying factors.

  20. Current issues and related activities in seismic hazard analysis in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong-Moon [Korea Atomic Energy Research Inst., Taejon (Korea, Republic of); Lee, Jong-Rim; Chang, Chun-Joong

    1997-03-01

    This paper discusses some technical issues identified from the seismic hazard analyses for probabilistic safety assessment on the operating Korean nuclear power plants and the related activities to resolve the issues. Since there are no strong instrumental earthquake records in Korea, the seismic hazard analysis is mainly dependent on the historical earthquake records. Results of the past seismic hazard analyses show that there are many uncertainties in attenuation function and intensity level and that there is a need to improve the statistical method. The identification of the activity of the Yangsan Fault, which is close to nuclear power plant sites, has been an important issue. But the issue has not been resolved yet in spite of much research work done. Recently, some capable faults were found in the offshore area of Gulupdo Island in the Yellow Sea. It is anticipated that the results of research on both the Yangsan Fault and the reduction of uncertainty in seismic hazard analysis will have a significant influence on seismic design and safety assessment of nuclear power plants in the future. (author)

  1. Current issues and related activities in seismic hazard analysis in Korea

    International Nuclear Information System (INIS)

    Seo, Jeong-Moon; Lee, Jong-Rim; Chang, Chun-Joong.

    1997-01-01

    This paper discusses some technical issues identified from the seismic hazard analyses for probabilistic safety assessment on the operating Korean nuclear power plants and the related activities to resolve the issues. Since there are no strong instrumental earthquake records in Korea, the seismic hazard analysis is mainly dependent on the historical earthquake records. Results of the past seismic hazard analyses show that there are many uncertainties in attenuation function and intensity level and that there is a need to improve the statistical method. The identification of the activity of the Yangsan Fault, which is close to nuclear power plant sites, has been an important issue. But the issue has not been resolved yet in spite of much research work done. Recently, some capable faults were found in the offshore area of Gulupdo Island in the Yellow Sea. It is anticipated that the results of research on both the Yangsan Fault and the reduction of uncertainty in seismic hazard analysis will have a significant influence on seismic design and safety assessment of nuclear power plants in the future. (author)

  2. Seismic hazard analysis. A methodology for the Eastern United States

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D L

    1980-08-01

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions and associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)

  3. Surface-seismic imaging for nehrp soil profile classifications and earthquake hazards in urban areas

    Science.gov (United States)

    Williams, R.A.; Stephenson, W.J.; Odum, J.K.

    1998-01-01

    We acquired high-resolution seismic-refraction data on the ground surface in selected areas of the San Fernando Valley (SFV) to help explain the earthquake damage patterns and the variation in ground motion caused by the 17 January 1994 magnitude 6.7 Northridge earthquake. We used these data to determine the compressional- and shear-wave velocities (Vp and Vs) at 20 aftershock recording sites to 30-m depth (Vs30 and Vp30). Two other sites, located next to boreholes with downhole Vp and Vs data, show that we imaged very similar seismic-velocity structures in the upper 40 m. Overall, high site response appears to be associated with low Vs in the near surface, but there can be a wide range of site amplifications for a given NEHRP soil type. The data suggest that for the SFV, if the Vs30 is known, we can determine whether the earthquake ground motion will be amplified above a factor of 2 relative to a local rock site.
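
    Vs30 used above is the time-averaged shear-wave velocity over the top 30 m, Vs30 = 30 / Σ(h_i / Vs_i), computed from the layered velocity model obtained at each site. The sketch below is a generic illustration of that averaging; the three-layer profile is a hypothetical example, not one of the SFV sites.

      def vs30(thicknesses_m, velocities_m_s):
          """Time-averaged shear-wave velocity over the top 30 m of a layered profile."""
          depth, travel_time = 0.0, 0.0
          for h, v in zip(thicknesses_m, velocities_m_s):
              h_used = min(h, 30.0 - depth)          # truncate the layer stack at 30 m depth
              if h_used <= 0:
                  break
              travel_time += h_used / v
              depth += h_used
          return 30.0 / travel_time

      # Hypothetical profile: 5 m at 180 m/s, 15 m at 300 m/s, then a half-space at 550 m/s.
      print(vs30([5.0, 15.0, 1000.0], [180.0, 300.0, 550.0]))   # ~313 m/s, NEHRP class D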

  4. Physically based probabilistic seismic hazard analysis using broadband ground motion simulation: a case study for the Prince Islands Fault, Marmara Sea

    Science.gov (United States)

    Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali

    2016-08-01

    The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D /3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, and this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50 % hazards for all sites studied in the Marmara region.

  5. Physically-Based Probabilistic Seismic Hazard Analysis Using Broad-Band Ground Motion Simulation: a Case Study for Prince Islands Fault, Marmara Sea

    Science.gov (United States)

    Mert, A.

    2016-12-01

    The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically-based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically-based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGF). For the frequencies below 0.5 Hz, the simulations are obtained by using Synthetic Green's Functions (SGF), which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground-motion parameters. PSHA results are produced for 2%, 10% and 50% hazards for all studied sites in the Marmara Region.

  6. Job Hazards Analysis Among A Group Of Surgeons At Zagazig ...

    African Journals Online (AJOL)

    ... 75% respectively. Conclusion: The job hazard analysis model was effective in the assessment, evaluation and management of occupational hazards concerning surgeons and should be considered as part of a hospital-wide quality and safety program. Key Words: Job Hazard Analysis, Risk Management, Occupational Health Safety.

  7. 40 CFR 68.67 - Process hazard analysis.

    Science.gov (United States)

    2010-07-01

    ...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree...) The hazards of the process; (2) The identification of any previous incident which had a likely...

  8. Seismic hazard and seismic risk assessment based on the unified scaling law for earthquakes: Himalayas and adjacent regions

    Science.gov (United States)

    Nekrasova, A. K.; Kossobokov, V. G.; Parvez, I. A.

    2015-03-01

    For the Himalayas and neighboring regions, the maps of seismic hazard and seismic risk are constructed with the use of the estimates for the parameters of the unified scaling law for earthquakes (USLE), in which the Gutenberg-Richter law for magnitude distribution of seismic events within a given area is applied in the modified version with allowance for linear dimensions of the area, namely, log N(M, L) = A + B(5 - M) + C log L, where N(M, L) is the expected annual number of the earthquakes with magnitude M in the area with linear dimension L. The spatial variations in the parameters A, B, and C for the Himalayas and adjacent regions are studied on two time intervals from 1965 to 2011 and from 1980 to 2011. The difference in A, B, and C between these two time intervals indicates that seismic activity experiences significant variations on a scale of a few decades. With a global consideration of the seismic belts of the Earth overall, the estimates of coefficient A, which determines the logarithm of the annual average frequency of the earthquakes with a magnitude of 5.0 and higher in the zone with a linear dimension of 1 degree of the Earth's meridian, differ by a factor of 30 and more and mainly fall in the interval from -1.1 to 0.5. The values of coefficient B, which describes the balance between the number of earthquakes with different magnitudes, gravitate to 0.9 and range from less than 0.6 to 1.1 and higher. The values of coefficient C, which estimates the fractal dimension of the local distribution of epicenters, vary from 0.5 to 1.4 and higher. In the Himalayas and neighboring regions, the USLE coefficients mainly fall in the intervals of -1.1 to 0.3 for A, 0.8 to 1.3 for B, and 1.0 to 1.4 for C. The calculations of the local value of the expected peak ground acceleration (PGA) from the maximal expected magnitude provided the necessary basis for mapping the seismic hazards in the studied region. When doing this, we used the local estimates of the
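
    The USLE relation quoted above can be evaluated directly once A, B, and C have been estimated for a cell. The sketch below only illustrates that evaluation; the coefficient values are placeholders chosen inside the ranges reported for the region, not the mapped estimates.

      import numpy as np

      def usle_annual_number(m, l_deg, a=-0.4, b=1.0, c=1.2):
          """Expected annual number N(M, L) of earthquakes with magnitude >= m in an area of
          linear dimension l_deg, from log10 N(M, L) = A + B(5 - M) + C log10 L."""
          return 10.0 ** (a + b * (5.0 - m) + c * np.log10(l_deg))

      for m in (5.0, 6.0, 7.0):
          print(m, usle_annual_number(m, l_deg=1.0))   # illustrative coefficient values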

  9. Unexpected earthquake hazard revealed by Holocene rupture on the Kenchreai Fault (central Greece): Implications for weak sub-fault shear zones

    Science.gov (United States)

    Copley, Alex; Grützner, Christoph; Howell, Andy; Jackson, James; Penney, Camilla; Wimpenny, Sam

    2018-03-01

    High-resolution elevation models, palaeoseismic trenching, and Quaternary dating demonstrate that the Kenchreai Fault in the eastern Gulf of Corinth (Greece) has ruptured in the Holocene. Along with the adjacent Pisia and Heraion Faults (which ruptured in 1981), our results indicate the presence of closely-spaced and parallel normal faults that are simultaneously active, but at different rates. Such a configuration allows us to address one of the major questions in understanding the earthquake cycle, specifically what controls the distribution of interseismic strain accumulation? Our results imply that the interseismic loading and subsequent earthquakes on these faults are governed by weak shear zones in the underlying ductile crust. In addition, the identification of significant earthquake slip on a fault that does not dominate the late Quaternary geomorphology or vertical coastal motions in the region provides an important lesson in earthquake hazard assessment.

  10. Micro-earthquake signal analysis and hypocenter determination around Lokon volcano complex

    Energy Technology Data Exchange (ETDEWEB)

    Firmansyah, Rizky, E-mail: rizkyfirmansyah@hotmail.com [Geophysical Engineering, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Bandung, 40132 (Indonesia); Nugraha, Andri Dian, E-mail: nugraha@gf.itb.ac.id [Global Geophysical Group, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Bandung, 40132 (Indonesia); Kristianto, E-mail: kris@vsi.esdm.go.id [Center for Volcanology and Geological Hazard Mitigation (CVGHM), Geological Agency, Bandung, 40122 (Indonesia)

    2015-04-24

    Mount Lokon is one of five active volcanoes located in the North Sulawesi region. Since June 26th, 2011, a standby alert has been set by the Center for Volcanology and Geological Hazard Mitigation (CVGHM) for this mountain. The Mount Lokon volcano erupted on July 4th, 2011 and continued to erupt until August 28th, 2011. Due to its high seismic activity, this study focuses on the analysis of micro-earthquake signals and the determination of micro-earthquake hypocenter locations around the Lokon-Empung volcano complex before the 2011 eruption phase (time period of January 2009 up to March 2010). Determination of the hypocenter locations was conducted with the Geiger Adaptive Damping (GAD) method. We used an initial model from a previous study at Volcan de Colima, Mexico. The reason behind the model selection was the characteristics shared by Mount Lokon and Colima, both being andesitic stratovolcanoes with small Plinian and Vulcanian explosion types. In this study, event picking was limited to volcano-tectonic events of A and B types, hybrid and long-period events with a clear signal onset, and local tectonic events with maximum S-P times of not more than three seconds. As a result, we observed that the micro-earthquakes occurred in the area north-west of the Mount Lokon region.

  11. Micro-earthquake signal analysis and hypocenter determination around Lokon volcano complex

    International Nuclear Information System (INIS)

    Firmansyah, Rizky; Nugraha, Andri Dian; Kristianto

    2015-01-01

    Mount Lokon is one of five active volcanoes located in the North Sulawesi region. Since June 26th, 2011, a standby alert has been set by the Center for Volcanology and Geological Hazard Mitigation (CVGHM) for this mountain. The Mount Lokon volcano erupted on July 4th, 2011 and continued to erupt until August 28th, 2011. Due to its high seismic activity, this study focuses on the analysis of micro-earthquake signals and the determination of micro-earthquake hypocenter locations around the Lokon-Empung volcano complex before the 2011 eruption phase (time period of January 2009 up to March 2010). Determination of the hypocenter locations was conducted with the Geiger Adaptive Damping (GAD) method. We used an initial model from a previous study at Volcan de Colima, Mexico. The reason behind the model selection was the characteristics shared by Mount Lokon and Colima, both being andesitic stratovolcanoes with small Plinian and Vulcanian explosion types. In this study, event picking was limited to volcano-tectonic events of A and B types, hybrid and long-period events with a clear signal onset, and local tectonic events with maximum S-P times of not more than three seconds. As a result, we observed that the micro-earthquakes occurred in the area north-west of the Mount Lokon region.
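
    Geiger-type hypocenter determination linearizes the travel-time equations around a trial location and iterates least-squares updates; the GAD variant adds adaptive damping. The sketch below shows only the undamped Geiger step for a homogeneous half-space; the velocity, station coordinates, and picks are illustrative assumptions, not data or the velocity model from the study.

      import numpy as np

      V = 5.0  # assumed homogeneous P velocity, km/s

      def travel_time(hypo, stations):
          """P travel times (s) from hypocenter [x, y, z, t0] to stations in a half-space."""
          d = np.linalg.norm(stations - hypo[:3], axis=1)
          return hypo[3] + d / V

      def geiger_locate(t_obs, stations, x0, n_iter=10):
          """Iterative linearized least-squares location (undamped Geiger method)."""
          hypo = np.array(x0, dtype=float)               # [x, y, z, origin time]
          for _ in range(n_iter):
              d = np.linalg.norm(stations - hypo[:3], axis=1)
              res = t_obs - travel_time(hypo, stations)  # travel-time residuals
              # Partial derivatives of arrival time with respect to x, y, z and t0.
              G = np.column_stack([-(stations - hypo[:3]) / (d[:, None] * V),
                                   np.ones(len(stations))])
              dm, *_ = np.linalg.lstsq(G, res, rcond=None)
              hypo += dm
          return hypo

      stations = np.array([[0, 0, 0], [10, 0, 0], [0, 12, 0], [8, 9, 0]], float)  # km
      true_hypo = np.array([4.0, 5.0, 3.0, 0.0])
      t_obs = travel_time(true_hypo, stations)
      print(geiger_locate(t_obs, stations, x0=[1.0, 1.0, 5.0, 0.1]))   # recovers true_hypo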

  12. Real-time Position Based Population Data Analysis and Visualization Using Heatmap for Hazard Emergency Response

    Science.gov (United States)

    Ding, R.; He, T.

    2017-12-01

    With the increased popularity of mobile applications and services, there has been a growing demand for more advanced mobile technologies that utilize real-time Location Based Services (LBS) data to support natural hazard response efforts. Compared to traditional sources such as the census bureau, which often can only provide historical and static data, an LBS service can provide more current data to drive a real-time natural hazard response system to more accurately process and assess issues such as population density in areas impacted by a hazard. However, manually preparing or preprocessing the data to suit the needs of a particular application would be time-consuming. This research aims to implement a population heatmap visual analytics system based on real-time data for natural disaster emergency management. The system comprises a three-layered architecture, including data collection, data processing, and visual analysis layers. Real-time, location-based data meeting certain aggregation conditions are collected from multiple sources across the Internet, then processed and stored in a cloud-based data store. Parallel computing is utilized to provide fast and accurate access to the pre-processed population data based on criteria such as the disaster event, and to generate a location-based population heatmap as well as other types of visual digital outputs using auxiliary analysis tools. At present, a prototype system has been developed that geographically covers the entire region of China and combines the population heatmap with data from the Earthquake Catalogs database. Preliminary results indicate that the generation of dynamic population density heatmaps based on the prototype system has effectively supported rapid earthquake emergency rescue and evacuation efforts, as well as helping responders and decision makers to evaluate and assess earthquake damage. Correlation analyses that were conducted revealed that the aggregation and movement of people
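
    The core of the visualization layer, turning a stream of point locations into a population-density heatmap, can be illustrated with a simple 2D binning step. The sketch below is a generic illustration rather than the prototype's code; the coordinates are synthetic stand-ins for real-time LBS points.

      import numpy as np
      import matplotlib.pyplot as plt

      # Synthetic (lon, lat) points standing in for aggregated location-based data.
      rng = np.random.default_rng(2)
      lon = 104.0 + 0.5 * rng.standard_normal(50_000)
      lat = 30.5 + 0.5 * rng.standard_normal(50_000)

      # Bin the points into a grid; counts per cell approximate population density.
      density, xedges, yedges = np.histogram2d(lon, lat, bins=200)

      plt.imshow(density.T, origin="lower",
                 extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]],
                 cmap="hot", aspect="auto")
      plt.colorbar(label="points per cell")
      plt.xlabel("longitude")
      plt.ylabel("latitude")
      plt.title("Population heatmap from location points (synthetic)")
      plt.savefig("heatmap.png")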

  13. Analysis of Earthquake Catalogs for CSEP Testing Region Italy

    International Nuclear Information System (INIS)

    Peresan, A.; Romashkova, L.; Nekrasova, A.; Kossobokov, V.; Panza, G.F.

    2010-07-01

    A comprehensive analysis shows that the set of catalogs provided by the Istituto Nazionale di Geofisica e Vulcanologia (INGV, Italy) as the authoritative database for the Collaboratory for the Study of Earthquake Predictability - Testing Region Italy (CSEP-TRI), is hardly a unified one acceptable for the necessary tuning of models/algorithms, as well as for running rigorous prospective predictability tests at intermediate- or long-term scale. (author)

  14. Preliminary hazard analysis using sequence tree method

    International Nuclear Information System (INIS)

    Huang Huiwen; Shih Chunkuan; Hung Hungchih; Chen Minghuei; Yih Swu; Lin Jiinming

    2007-01-01

    A system-level PHA using the sequence tree method was developed to perform safety-related digital I and C system SSA. The conventional PHA is a brainstorming session among experts on various portions of the system to identify hazards through discussions. However, this conventional PHA is not a systematic technique, and the analysis results strongly depend on the experts' subjective opinions, so the analysis quality cannot be appropriately controlled. Therefore, this research developed a system-level, sequence-tree-based PHA, which can clarify the relationships among the major digital I and C systems. Two major phases are included in this sequence-tree-based technique. The first phase uses a table to analyze each event in SAR Chapter 15 for a specific safety-related I and C system, such as the RPS. The second phase uses the sequence tree to recognize which I and C systems are involved in the event, how the safety-related systems work, and how the backup systems can be activated to mitigate the consequences if the primary safety systems fail. In the sequence tree, the defense-in-depth echelons, including the Control echelon, Reactor trip echelon, ESFAS echelon, and Indication and display echelon, are arranged to construct the sequence tree structure. All the related I and C systems, including the digital systems and the analog back-up systems, are allocated to their specific echelons. With this system-centric, sequence-tree-based analysis, not only can preliminary hazards be identified systematically, but the vulnerability of the nuclear power plant can also be recognized. Therefore, an effective simplified D3 evaluation can be performed as well. (author)

  15. 327 Building fire hazards analysis implementation plan

    International Nuclear Information System (INIS)

    BARILO, N.F.

    1999-01-01

    In March 1998, the 327 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the U.S. Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in five areas and provided nine recommendations (11 items) to bring the 327 Building into compliance. A status is provided for each recommendation in this document. BWHC will use this Implementation Plan to bring the 327 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7

  16. An innovative assessment of the seismic hazard from Vrancea intermediate-depth earthquakes: Case studies in Romania and Bulgaria

    International Nuclear Information System (INIS)

    Panza, G.F.; Cioflan, C.; Marmureanu, G.; Kouteva, M.; Paskaleva, I.; Romanelli, F.

    2002-02-01

    An advanced procedure for ground motion modelling, capable of synthesizing the seismic ground motion from basic understanding of fault mechanism and seismic wave propagation, is applied to the case studies of Bucharest (Romania) and Russe, NE Bulgaria, exposed to the seismic hazard from Vrancea events. Synthetic seismic signals along representative geological cross sections in Bucharest and Russe have been computed and the energetic input spectra have been derived both from the synthetic signals and the few existing records. The theoretical signals are successfully compared with the available observations. The site response has been calculated for three recent, strong and intermediate-depth, Vrancea earthquakes: August 30, 1986 and May 30 and 31, 1990. The approach used differs significantly from today's engineering practice that relies upon rock-site hazard maps and applies the site correction at a later stage. The obtained results show that it is very useful to estimate the site effect via waveform modelling, considering simultaneously the geotechnical properties of the site, the position and geometry of the seismic source and the mechanical properties of the propagation medium. (author)

  17. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    Science.gov (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  18. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    Science.gov (United States)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    incorporated with additional GIS and statistical data to a comprehensive property-by-property geodatabase of the existing elements and values. This stock of elements and values geodatabase is furthermore the consistent basis for all natural hazard analyses and enables the comparison of the results. The study follows the generally accepted modules (i) hazard analysis, (ii) exposition analysis, and (iii) consequence analysis, whereas the exposition analysis estimates the elements at risk with their corresponding damage potentials and the consequence analysis estimates the PMLs. This multi-hazard analysis focuses on process types with a high to extreme potential of negative consequences on a regional scale. In this context, (i) floodings, (ii) rockslides with the potential of corresponding consequence effects (backwater ponding and outburst flood), (iii) earthquakes, (iv) hail events, and (v) winter storms were considered as hazard processes. Based on general hazard analyses (hazard maps) concrete scenarios and their spatial affectedness were determined. For the different hazard processes, different vulnerability approaches were considered to demonstrate their sensitivity and implication on the results. Thus, no absolute values of losses but probable loss ranges were estimated. It can be shown that the most serious amount of losses would arise from extreme earthquake events with loss burdens up to more than € 7 bn. solely on buildings and inventory. Possible extreme flood events could lead to losses between € 2 and 2.5 bn., whereas a severe hail swath which affects the central Inn valley could result in losses of ca. € 455 mill. (thereof € 285 mill. on vehicles). The potential most serious rockslide with additional consequence effects would result in losses up to ca. € 185 mill. and extreme winter storms can induce losses between € 100 mill. and 150 mill.

  19. Preliminary analysis of the rupture process of 11 March 2011 Tohoku-Oki earthquake

    Science.gov (United States)

    Vilotte, J.; Satriano, C.; Dionicio, V.; Lancieri, M.; Bernard, P.

    2011-12-01

    In particular, the largest aftershock (Mw > 7.9) that occurred off Ibaraki in the southeast termination of the main rupture is analyzed combining teleseismic back-projection and broadband strong motion analysis. This large aftershock raises important questions with regard to the understanding of the seismic hazard in the Tokyo area. Those results evidence a frequency dependent rupture process, with a down-dip short period radiation and a long period up-dip radiation producing large slip and most of the long period moment release. The down-dip short period radiation, with a weakly coherent slip distribution, is shown to be consistent with the complexity of the Kik-net strong motion recordings. The up-dip long period radiation, with a large coherent and compact slip distribution, is consistent with the tsunami source and the long period CMT analysis. This underlines the importance of an along-dip and along-strike segmentation. Finally, discussions are drawn based on a comparison between the Tohoku-Oki earthquake and the recent 2010 Maule earthquake in Central Chile, in the light of the 2007 Tocopilla earthquake in North Chile.

  20. Deep-Sea Turbidites as Guides to Holocene Earthquake History at the Cascadia Subduction Zone—Alternative Views for a Seismic-Hazard Workshop

    Science.gov (United States)

    Atwater, Brian F.; Griggs, Gary B.

    2012-01-01

    This report reviews the geological basis for some recent estimates of earthquake hazards in the Cascadia region between southern British Columbia and northern California. The largest earthquakes to which the region is prone are in the range of magnitude 8-9. The source of these great earthquakes is the fault down which the oceanic Juan de Fuca Plate is being subducted or thrust beneath the North American Plate. Geologic evidence for their occurrence includes sedimentary deposits that have been observed in cores from deep-sea channels and fans. Earthquakes can initiate subaqueous slumps or slides that generate turbidity currents and which produce the sedimentary deposits known as turbidites. The hazard estimates reviewed in this report are derived mainly from deep-sea turbidites that have been interpreted as proxy records of great Cascadia earthquakes. The estimates were first published in 2008. Most of the evidence for them is contained in a monograph now in press. We have reviewed a small part of this evidence, chiefly from Cascadia Channel and its tributaries, all of which head offshore the Pacific coast of Washington State. According to the recent estimates, the Cascadia plate boundary ruptured along its full length in 19 or 20 earthquakes of magnitude 9 in the past 10,000 years; its northern third broke during these giant earthquakes only, and southern segments produced at least 20 additional, lesser earthquakes of Holocene age. The turbidite case for full-length ruptures depends on stratigraphic evidence for simultaneous shaking at the heads of multiple submarine canyons. The simultaneity has been inferred primarily from turbidite counts above a stratigraphic datum, sandy beds likened to strong-motion records, and radiocarbon ages adjusted for turbidity-current erosion. In alternatives proposed here, this turbidite evidence for simultaneous shaking is less sensitive to earthquake size and frequency than previously thought. Turbidites far below a channel

  1. Guidance Index for Shallow Landslide Hazard Analysis

    Directory of Open Access Journals (Sweden)

    Cheila Avalon Cullen

    2016-10-01

    Full Text Available Rainfall-induced shallow landslides are one of the most frequent hazards on slanted terrains. Intense storms with high-intensity and long-duration rainfall have high potential to trigger rapidly moving soil masses due to changes in pore water pressure and seepage forces. Nevertheless, regardless of the intensity and/or duration of the rainfall, shallow landslides are influenced by antecedent soil moisture conditions. To date, no system exists that dynamically interrelates these two factors on large scales. This work introduces a Shallow Landslide Index (SLI) as the first implementation of antecedent soil moisture conditions for the hazard analysis of shallow rainfall-induced landslides. The proposed mathematical algorithm is built using a logistic regression method that systematically learns from a comprehensive landslide inventory. Initially, root-soil moisture and rainfall measurements modeled from AMSR-E and TRMM, respectively, are used as proxies to develop the index. The input dataset is randomly divided into training and verification sets using the Hold-Out method. Validation results indicate that the best-fit model predicts the highest number of cases correctly at 93.2% accuracy. Subsequently, as AMSR-E and TRMM stopped working in October 2011 and April 2015 respectively, root-soil moisture and rainfall measurements modeled by SMAP and GPM are used to develop models that calculate the SLI for 10, 7, and 3 days. The resulting models indicate a strong relationship (78.7%, 79.6%, and 76.8%, respectively) between the predictors and the predicted value. The results also highlight important remaining challenges such as adequate information for algorithm functionality and satellite based data reliability. Nevertheless, the experimental system can potentially be used as a dynamic indicator of the total amount of antecedent moisture and rainfall (for a given duration of time) needed to trigger a shallow landslide in a susceptible area. It is
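
    The index construction described above is, at its core, a logistic regression trained on landslide and no-landslide cases with antecedent soil moisture and rainfall as predictors, validated with a hold-out split. The sketch below shows that workflow on synthetic data using scikit-learn; the feature names, coefficients, and split ratio are illustrative assumptions, and this is not the authors' SLI code.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        # Synthetic training data: wetter antecedent soil plus heavier rainfall
        # raises the odds of a shallow landslide (assumed relationship).
        rng = np.random.default_rng(1)
        n = 2000
        soil_moisture = rng.uniform(0.05, 0.45, n)      # root-zone soil moisture
        rain_3day = rng.gamma(2.0, 15.0, n)             # 3-day rainfall (mm)
        logit = -8.0 + 12.0 * soil_moisture + 0.06 * rain_3day
        landslide = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([soil_moisture, rain_3day])
        X_train, X_test, y_train, y_test = train_test_split(
            X, landslide, test_size=0.3, random_state=0)   # Hold-Out method

        model = LogisticRegression().fit(X_train, y_train)
        sli = model.predict_proba(X_test)[:, 1]            # index values in [0, 1]
        print("hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))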

  2. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  3. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-03-01

    When training for a job in which human error has the potential of producing catastrophic results, an understanding of the hazards that may be encountered is of paramount importance. In high consequence activities, it is important that the training program be conducted in a safe environment and yet emphasize the potential hazards. Because of the high consequence of a human error the use of a high-fidelity simulation is of great importance to provide the safe environment the worker needs to learn and hone required skills. A hazards analysis identifies the operation hazards, potential human error, and associated positive measures that aid in the mitigation or prevention of the hazard. The information gained from the hazards analysis should be used in the development of training. This paper will discuss the integration of information from the hazards analysis into the development of simulation components of a training program.

  4. Fire hazard analysis for the fuel supply shutdown storage buildings

    International Nuclear Information System (INIS)

    REMAIZE, J.A.

    2000-01-01

    The purpose of a fire hazards analysis (FHA) is to comprehensively assess the risk from fire and other perils within individual fire areas in a DOE facility in relation to proposed fire protection so as to ascertain whether the objectives of DOE 5480.7A, Fire Protection, are met. This Fire Hazards Analysis was prepared as required by HNF-PRO-350, Fire Hazards Analysis Requirements, (Reference 7) for a portion of the 300 Area N Reactor Fuel Fabrication and Storage Facility

  5. Earthquake response analysis considering structure-soil-structure interaction

    International Nuclear Information System (INIS)

    Shiomi, T.; Takahashi, K.; Oguro, E.

    1981-01-01

    This paper proposes a numerical method of earthquake response analysis considering the structure-soil-structure interaction between two adjacent buildings. In this paper an analytical study is presented in order to show some typical features of coupling effects of two reactor buildings of the BWR-type nuclear power plant. The technical approach is a kind of substructure method, which at first evaluates the compliance properties with the foundation-soil-foundation interaction and then uses the compliance in determining seismic responses of the two super-structures during earthquake motions. For this purpose, it is assumed that the soil medium is an elastic half space for modeling and that the rigidity of any type of structures such as piping facilities connecting the adjacent buildings is negligible. The technical approach is mainly based on the following procedures. Superstructure stiffness is calculated by using the method which has been developed in our laboratory based on the Thin-Wall Beam Theory. Soil stiffness is expressed by a matrix with 12 x 12 elements as a function of frequency, which is calculated using the soil compliance functions proposed in Dr. Tajimi's Theory. These stiffness values may be expressed by complex numbers for modeling the damping mechanism of superstructures. We can solve eigenvalue problems with frequency dependent stiffness and the large-scale matrix using our method, which is based on condensing the matrix to a suitable size by the Rayleigh-Ritz method. Earthquake responses can then be solved in the frequency domain by Fourier Transform. (orig./RW)
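
    The frequency-domain solution step can be illustrated on a much simpler case than the coupled two-building model: the response of a single oscillator with a complex (damped) dynamic stiffness to a ground acceleration record, obtained by dividing the Fourier-transformed load by the dynamic stiffness and inverting the transform. The sketch below is a minimal Python/numpy analogue under these simplifying assumptions (single degree of freedom, synthetic input); it is not the authors' substructure code.

        import numpy as np

        # SDOF analogue of the frequency-domain substructure solution:
        # transform the effective earthquake force, divide by the complex
        # dynamic stiffness K(w) = k*(1 + 2i*zeta) - m*w**2, invert the FFT.
        m, k, zeta = 1.0e6, 4.0e8, 0.05          # mass [kg], stiffness [N/m], damping
        dt, n = 0.01, 4096
        t = np.arange(n) * dt
        ag = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t)   # synthetic ground accel.

        w = 2 * np.pi * np.fft.rfftfreq(n, dt)   # circular frequencies [rad/s]
        F = -m * np.fft.rfft(ag)                 # effective earthquake force spectrum
        K = k * (1 + 2j * zeta) - m * w**2       # complex (hysteretic) dynamic stiffness
        u = np.fft.irfft(F / K, n)               # relative displacement time history
        print("peak relative displacement [m]:", float(np.abs(u).max()))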

  6. Review of earthquake hazard assessments of plant sites at Paducah, Kentucky and Portsmouth, Ohio

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    Members of the US Geological Survey staff in Golden, Colorado, have reviewed the submissions of Lawrence Livermore National Laboratory (LLNL) staff and of Risk Engineering, Inc. (REI) (Golden, Colorado) for seismic hazard estimates for Department of Energy facilities at Portsmouth, Ohio, and Paducah, Kentucky. We reviewed the historical seismicity and seismotectonics near the two sites, and general features of the LLNL and EPRI/SOG methodologies used by LLNL and Risk Engineering respectively, and also the separate Risk Engineering methodology used at Paducah. We discussed generic issues that affect the modeling of both sites, and performed alternative calculations to determine sensitivities of seismic hazard results to various assumptions and models in an attempt to assign reasonable bounding values of the hazard. In our studies we find that peak acceleration values of 0.08 g for Portsmouth and 0.32 g for Paducah represent central values of the ground motions obtained at 1000-year return periods. Peak accelerations obtained in the LLNL and Risk Engineering studies have medians near these values (results obtained using the EPRI/SOG methodology appear low at both sites), and we believe that these medians are appropriate values for use in the evaluation of systems, structures, and components for seismic structural integrity and for the seismic design of new and improved systems, structures, and components at Portsmouth and Paducah.

  7. Review of earthquake hazard assessments of plant sites at Paducah, Kentucky, and Portsmouth, Ohio

    International Nuclear Information System (INIS)

    1992-03-01

    Members of the US Geological Survey staff in Golden, Colorado, have reviewed the submissions of Lawrence Livermore National Laboratory (LLNL) staff and of Risk Engineering, Inc. (REI) (Golden, Colorado) for seismic hazard estimates for Department of Energy facilities at Portsmouth, Ohio, and Paducah, Kentucky. We reviewed the historical seismicity and seismotectonics near the two sites, and general features of the LLNL and EPRI/SOG methodologies used by LLNL and Risk Engineering respectively, and also the separate Risk Engineering methodology used at Paducah. We discussed generic issues that affect the modeling of both sites, and performed alternative calculations to determine sensitivities of seismic hazard results to various assumptions and models in an attempt to assign reasonable bounding values of the hazard. In our studies we find that peak acceleration values of 0.08 g for Portsmouth and 0.32 g for Paducah represent central values of the ground motions obtained at 1000-year return periods. Peak accelerations obtained in the LLNL and Risk Engineering studies have medians near these values (results obtained using the EPRI/SOG methodology appear low at both sites), and we believe that these medians are appropriate values for use in the evaluation of systems, structures, and components for seismic structural integrity and for the seismic design of new and improved systems, structures, and components at Portsmouth and Paducah

  8. Review of earthquake hazard assessments of plant sites at Paducah, Kentucky and Portsmouth, Ohio

    International Nuclear Information System (INIS)

    1997-01-01

    Members of the US Geological Survey staff in Golden, Colorado, have reviewed the submissions of Lawrence Livermore National Laboratory (LLNL) staff and of Risk Engineering, Inc. (REI) (Golden, Colorado) for seismic hazard estimates for Department of Energy facilities at Portsmouth, Ohio, and Paducah, Kentucky. We reviewed the historical seismicity and seismotectonics near the two sites, and general features of the LLNL and EPRI/SOG methodologies used by LLNL and Risk Engineering respectively, and also the separate Risk Engineering methodology used at Paducah. We discussed generic issues that affect the modeling of both sites, and performed alternative calculations to determine sensitivities of seismic hazard results to various assumptions and models in an attempt to assign reasonable bounding values of the hazard. In our studies we find that peak acceleration values of 0.08 g for Portsmouth and 0.32 g for Paducah represent central values of the ground motions obtained at 1000-year return periods. Peak accelerations obtained in the LLNL and Risk Engineering studies have medians near these values (results obtained using the EPRI/SOG methodology appear low at both sites), and we believe that these medians are appropriate values for use in the evaluation of systems, structures, and components for seismic structural integrity and for the seismic design of new and improved systems, structures, and components at Portsmouth and Paducah

  9. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has a moderate seismicity, and on average the population feels 40 earthquakes per year or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies and, last but not least, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  10. Strategic crisis and risk communication during a prolonged natural hazard event: lessons learned from the Canterbury earthquake sequence

    Science.gov (United States)

    Wein, A. M.; Potter, S.; Becker, J.; Doyle, E. E.; Jones, J. L.

    2015-12-01

    While communication products are developed for monitoring and forecasting hazard events, less thought may have been given to crisis and risk communication plans. During larger (and rarer) events responsible science agencies may find themselves facing new and intensified demands for information and unprepared for effectively resourcing communications. In a study of the communication of aftershock information during the 2010-12 Canterbury Earthquake Sequence (New Zealand), issues are identified and implications for communication strategy noted. Communication issues during the responses included reliability and timeliness of communication channels for immediate and short decision time frames; access to scientists by those who needed information; unfamiliar emergency management frameworks; information needs of multiple audiences, audience readiness to use the information; and how best to convey empathy during traumatic events and refer to other information sources about what to do and how to cope. Other science communication challenges included meeting an increased demand for earthquake education, getting attention on aftershock forecasts; responding to rumor management; supporting uptake of information by critical infrastructure and government and for the application of scientific information in complex societal decisions; dealing with repetitive information requests; addressing diverse needs of multiple audiences for scientific information; and coordinating communications within and outside the science domain. For a science agency, a communication strategy would consider training scientists in communication, establishing relationships with university scientists and other disaster communication roles, coordinating messages, prioritizing audiences, deliberating forecasts with community leaders, identifying user needs and familiarizing them with the products ahead of time, and practicing the delivery and use of information via scenario planning and exercises.

  11. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-12-01

    A hazards analysis identifies the operation hazards and the positive measures that aid in the mitigation or prevention of the hazard. If the tasks are human intensive, the hazard analysis often credits the personnel training as contributing to the mitigation of the accident's consequence or prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase (analysis, design, development, implementation, and evaluation) model for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and the institutional training program, can provide valuable information to the hazards analysis effort. This paper will discuss the integration of the information from the hazards analysis into a training program. The paper will use the installation and removal of a piece of tooling that is used in a high-explosive operation. This example will be used to follow the systematic development of a training program and demonstrate the interaction and cooperation between the hazards analysis and training program.

  12. 324 Building fire hazards analysis implementation plan

    International Nuclear Information System (INIS)

    BARILO, N.F.

    1999-01-01

    In March 1998, the 324 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the U.S. Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in six areas and provided 20 recommendations to bring the 324 Building into compliance with DOE Order 5480.7A. Additionally, one observation was provided. A status is provided for each recommendation in this document. The actions for recommendations associated with the safety related part of the 324 Building and operation of the cells and support areas were evaluated using the Unreviewed Safety Question (USQ) process. BWHC will use this Implementation Plan to bring the 324 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7

  13. 324 Building fire hazards analysis implementation plan

    International Nuclear Information System (INIS)

    Eggen, C.D.

    1998-01-01

    In March 1998, the 324 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the US Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in six areas and provided 20 recommendations to bring the 324 Building into compliance with DOE Order 5480.7A. Additionally, one observation was provided. To date, four of the recommendations and the one observation have been completed. Actions identified for seven of the recommendations are currently in progress. Exemption requests will be transmitted to DOE-RL for three of the recommendations. Six of the recommendations are related to future shut down activities of the facility and the corrective actions are not being addressed as part of this plan. The actions for recommendations associated with the safety related part of the 324 Building and operation of the cells and support areas were evaluated using the Unreviewed Safety Question (USQ) process. Major Life Safety Code concerns have been corrected. The status of the recommendations and actions was confirmed during the July 1998 Fire Protection Assessment. BWHC will use this Implementation Plan to bring the 324 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7

  14. 327 Building fire hazards analysis implementation plan

    International Nuclear Information System (INIS)

    Eggen, C.D.

    1998-01-01

    In March 1998, the 327 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the US Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (B and WHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in five areas and provided nine recommendations (11 items) to bring the 327 Building into compliance. To date, actions for five of the 11 items have been completed. Exemption requests will be transmitted to DOE-RL for two of the items. Corrective actions have been identified for the remaining four items. The completed actions address combustible loading requirements associated with the operation of the cells and support areas. The status of the recommendations and actions was confirmed during the July 1998 Fire Protection Assessment. B and WHC will use this Implementation Plan to bring the 327 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7

  15. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work made in the past 25 years has established the Stressmeter as an automatic stress measurement system to study timing of forthcoming major earthquakes in support of the current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed. The cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development and the Stressmeter was introduced to the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed the constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as in all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England and Germany are found to be in good agreement with the Stressmeter results obtained in Japan. Based on this broad agreement, a solid geomechanical

  16. Investigating potential seismic hazard in the Gulf of Gökova (South Eastern Aegean Sea) deduced from recent shallow earthquake activity

    Science.gov (United States)

    Rontogianni, S.; Konstantinou, K. I.; Evangelidis, C.; Melis, N. S.

    2011-12-01

    concentrated to the south of the Gulf. The seismic activity moves to greater depths from the west to the inner part of the Gulf, from ~5 km to ~15 km. These observations indicate that the activity of the Datça fault has not decelerated as previously proposed; on the contrary, there is a probable connection between the GTF and the Fault of Kos. The fact that this is a 40° dipping fault, buried under thick 2.5 km sediment deposits, increases the possibility that a large earthquake could trigger a sediment landslide. Since the Gulf is quite narrow (~25 km N-S width), the expected effects could be devastating for the area. Further analysis of the major earthquake mechanisms is scheduled in order to investigate the potential hazard arising from this major fault.

  17. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada); Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)

    2016-09-15

    Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectra (UHS), obtained from the PSHA, are characterized by high-frequency content which differs from the original plant design basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high-frequency effects of the UHS can be mitigated through site response analysis with site-specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building in terms of High Confidence of Low Probability of Failure (HCLPF) can be improved. (author)
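
    For context, the HCLPF capacity quoted in fragility analyses of this kind is conventionally derived from the median seismic capacity and the logarithmic standard deviations representing randomness and uncertainty. A standard form of the relation (general fragility practice, not a formula quoted from this paper) is

        \mathrm{HCLPF} = A_m \exp\!\left[-1.645\,(\beta_R + \beta_U)\right],

    where A_m is the median ground-motion capacity, \beta_R the aleatory (randomness) log-standard deviation, and \beta_U the epistemic (uncertainty) log-standard deviation; the factor 1.645 corresponds to 95% confidence of less than about a 5% probability of failure.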

  18. Evidence of a Large Triggered Event in the Nepal Himalaya Following the Gorkha Earthquake: Implications Toward Enhanced Seismic Hazard

    Science.gov (United States)

    Mandal, Prantik

    2018-03-01

    A DC (double couple) constrained multiple point-source moment-tensor inversion is performed on the band-passed (0.008-0.10 Hz) displacement data of the 25 April (M w 7.8) 2015 Nepal mainshock, from 17 broadband stations in India. Our results reveal that the 25 April event (strike = 324°, dip = 14°, rake = 88°) ruptured the north-dipping main Himalayan thrust (MHT) at 16 km depth. We modeled the Coulomb failure stress changes (ΔCFS) produced by the slip on the fault plane of the 25 April Nepal mainshock. A strong correlation is obtained between the occurrence of aftershocks and the regions of increased positive ΔCFS below the aftershock zone of the 2015 Nepal mainshock. We notice that the predicted ΔCFS at 16 km depth shows a positive Coulomb stress of 0.06 MPa at the location of the 12 May 2015 event. Such small modeled stress changes can trigger events if the crust is already close to failure, and they can also advance the occurrence of future earthquakes. The main finding of our ΔCFS modeling is that the 25 April event increased the Coulomb stress by 0.06 MPa at 16 km depth below the site of the 12 May event, and thus this event can be termed triggered. We propose that the seismic hazard in the Himalaya is not only caused by the mainshock slip on the MHT; rather, the occurrence of a large triggered event on the MHT can also enhance our understanding of the seismic hazard in the Nepal Himalaya.
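
    For reference, the Coulomb failure stress change used in this kind of modeling is conventionally computed from the stress changes resolved on the receiver fault plane. In its standard form (general usage; the effective friction coefficient adopted in the paper is not restated here),

        \Delta\mathrm{CFS} = \Delta\tau + \mu'\,\Delta\sigma_n,

    where \Delta\tau is the change in shear stress in the slip direction, \Delta\sigma_n the change in normal stress (positive when the fault is unclamped), and \mu' the effective coefficient of friction; positive \Delta\mathrm{CFS} brings the receiver fault closer to failure.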

  19. Academia Sinica, TW E-science to Assistant Seismic Observations for Earthquake Research, Monitor and Hazard Reduction Surrounding the South China Sea

    Science.gov (United States)

    Huang, Bor-Shouh; Liu, Chun-Chi; Yen, Eric; Liang, Wen-Tzong; Lin, Simon C.; Huang, Win-Gee; Lee, Shiann-Jong; Chen, Hsin-Yen

    Since the experience of the 2004 giant Sumatra earthquake, seismic and tsunami hazards have been considered important issues in the South China Sea and its surrounding region, and have attracted many seismologists' interest. Currently, more than 25 broadband seismic instruments operated by the Institute of Earth Sciences, Academia Sinica are deployed in northern Vietnam to study the geodynamic evolution of the Red River fracture zone; they have recently been redistributed to southern Vietnam to study the geodynamic evolution and deep structures of the South China Sea. Similar stations are planned for deployment in the Philippines in the near future. It is planned that some high-quality stations will become permanent stations with continuous GPS observations added, with instruments maintained and operated by several cooperating institutes, for instance, the Institute of Geophysics, Vietnamese Academy of Sciences and Technology in Vietnam and the Philippine Institute of Volcanology and Seismology in the Philippines. Finally, those stations are planned to be upgraded to real-time transmission stations for earthquake monitoring and tsunami warning. However, high-speed data transfer between different agencies is always a critical issue for successful network operation. By taking advantage of both the EGEE and EUAsiaGrid e-Infrastructure, Academia Sinica Grid Computing Centre coordinates researchers from various Asian countries to construct a platform for high-performance data transfer and large parallel computation. Efforts from this data service and a newly built earthquake data centre for data management may greatly improve seismic network performance. Implementation of Grid infrastructure and e-science in this region may assist the development of earthquake research, monitoring and natural hazard reduction. In the near future, we will continue to seek new cooperation from the countries surrounding the South China Sea to install new seismic stations to construct a complete seismic network of the

  20. Fire hazards analysis for solid waste burial grounds

    International Nuclear Information System (INIS)

    McDonald, K.M.

    1995-01-01

    This document comprises the fire hazards analysis for the solid waste burial grounds, including TRU trenches, low-level burial grounds, radioactive mixed waste trenches, etc. It analyzes fire potential, and fire damage potential for these facilities. Fire scenarios may be utilized in future safety analysis work, or for increasing the understanding of where hazards may exist in the present operation

  1. Reduction of uncertainties in probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-02-01

    An integrated research program for the reduction of conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas as follows; Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the region of Korea for Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of a medium-size earthquake in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  2. Using Dynamic Fourier Analysis to Discriminate Between Seismic Signals from Natural Earthquakes and Mining Explosions

    Directory of Open Access Journals (Sweden)

    Maria C. Mariani

    2017-08-01

    Full Text Available A sequence of intraplate earthquakes occurred in Arizona at the same location where mining explosions were carried out in previous years. The explosions and some of the earthquakes generated very similar seismic signals. In this study Dynamic Fourier Analysis is used for discriminating signals originating from natural earthquakes and mining explosions. Frequency analysis of seismograms recorded at regional distances shows that, compared with the mining explosions, the earthquake signals have larger amplitudes in the frequency interval ~6 to 8 Hz and significantly smaller amplitudes in the frequency interval ~2 to 4 Hz. This type of analysis permits identifying characteristics in the frequency content of the seismograms that help detect potentially risky seismic events.
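
    The kind of band-limited spectral comparison described above can be sketched in a few lines of Python using only numpy: compute the amplitude spectrum, average it over the ~2-4 Hz and ~6-8 Hz intervals, and compare the two bands. The band edges and the synthetic test record below are illustrative assumptions, any discrimination threshold would have to be calibrated on real data, and this is not the authors' Dynamic Fourier Analysis code.

        import numpy as np

        def band_amplitude(signal, dt, f_lo, f_hi):
            """Mean spectral amplitude of `signal` between f_lo and f_hi (Hz)."""
            spec = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(len(signal), dt)
            band = (freqs >= f_lo) & (freqs <= f_hi)
            return spec[band].mean()

        def discriminant(signal, dt):
            """Ratio of ~6-8 Hz to ~2-4 Hz amplitude; following the pattern
            reported in the study, larger ratios suggest an earthquake-like
            source and smaller ratios an explosion-like source."""
            return band_amplitude(signal, dt, 6.0, 8.0) / band_amplitude(signal, dt, 2.0, 4.0)

        # Example with a synthetic record sampled at 100 Hz
        dt = 0.01
        t = np.arange(0, 60, dt)
        rng = np.random.default_rng(2)
        trace = (np.sin(2 * np.pi * 7.0 * t) + 0.3 * np.sin(2 * np.pi * 3.0 * t)
                 + 0.1 * rng.standard_normal(t.size))
        print("6-8 Hz / 2-4 Hz amplitude ratio:", round(discriminant(trace, dt), 2))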

  3. Slip in the 2010-2011 Canterbury Earthquakes, New Zealand and implications for future seismic hazard in Christchurch

    Science.gov (United States)

    Elliott, J. R.; Nissen, E.; England, P. C.; Jackson, J. A.; Lamb, S.; Li, Z.; Oehlers, M.; Parsons, B. E.

    2011-12-01

    crustal block with strain accommodated elsewhere around its boundaries. The fault parameters derived from the satellite observations of both the Darfield and Christchurch events reveal a 15-km-long gap in fault slip south-west of Christchurch which presents a continuing seismic hazard if a further unknown fault structure should exist there. The identification of such possible structures in the vicinity of the city is now a priority as the current gap has a similar length to the rupture in the 2011 Christchurch earthquake. Wallace, L. M., J. Beavan, R. McCaffrey, K. Berryman, and P. Denys, Balancing the plate motion budget in the South Island, New Zealand using GPS, geological and seismological data, Geophysical Journal International, 168, 332-352, doi:10.1111/j.1365-246X.2006.03183.x, 2007.

  4. Using Spatial Multi-Criteria Analysis and Ranking Tool (SMART) in earthquake risk assessment: a case study of Delhi region, India

    Directory of Open Access Journals (Sweden)

    Nishant Sinha

    2016-03-01

    Full Text Available This article is aimed at earthquake hazard, vulnerability and risk assessment as a case study to demonstrate the applicability of the Spatial Multi-Criteria Analysis and Ranking Tool (SMART), which is based on Saaty's multi-criteria decision analysis (MCDA) technique. The three specific study sites in Delhi were chosen for research as they correspond to typical patches of the urban environs, densely covered with residential, commercial and industrial units. The components affecting earthquake hazard are established in the form of geographic information system data-set layers including seismic zone, peak ground acceleration (PGA), soil characteristics, liquefaction potential, geological characteristics, land use, proximity to fault and epicentre. Only the physical vulnerability layers comprising building information, namely number of stories, year-built range, area, occupancy and construction type, derived from remote sensing imagery, were considered for the current research. SMART was developed for earthquake risk assessment, and weights were derived both at the component and element level. Based on weighted overlay techniques, the earthquake hazard and vulnerability layers were created, from which the risk maps were derived through multiplicative analysis. The developed risk maps may prove useful in the decision-making process and in formulating risk mitigation measures.
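
    The weighted-overlay and multiplicative steps described above amount to combining normalized criterion rasters with MCDA-derived weights into hazard and vulnerability surfaces and multiplying the two to obtain risk. The sketch below illustrates this in Python with numpy on tiny invented rasters; the layer names and weights are assumptions for illustration, not the SMART weights derived in the study.

        import numpy as np

        def weighted_overlay(layers, weights):
            """Weighted linear combination of normalized criterion rasters."""
            layers = np.asarray(layers, dtype=float)
            weights = np.asarray(weights, dtype=float)
            weights = weights / weights.sum()        # normalize weights (AHP-style)
            return np.tensordot(weights, layers, axes=1)

        # Tiny 3x3 example rasters, each already scaled to [0, 1]
        pga        = np.array([[0.9, 0.8, 0.7], [0.6, 0.5, 0.4], [0.3, 0.2, 0.1]])
        liquefact  = np.array([[0.2, 0.4, 0.6], [0.3, 0.5, 0.7], [0.1, 0.2, 0.3]])
        fault_prox = np.array([[1.0, 0.8, 0.6], [0.7, 0.5, 0.3], [0.4, 0.2, 0.1]])
        bldg_vuln  = np.array([[0.5, 0.6, 0.7], [0.4, 0.5, 0.6], [0.3, 0.4, 0.5]])

        hazard        = weighted_overlay([pga, liquefact, fault_prox], [0.5, 0.2, 0.3])
        vulnerability = bldg_vuln                    # single layer in this toy case
        risk          = hazard * vulnerability       # multiplicative risk analysis
        print(np.round(risk, 2))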

  5. Automatic analysis of the 2015 Gorkha earthquake aftershock sequence.

    Science.gov (United States)

    Baillard, C.; Lyon-Caen, H.; Bollinger, L.; Rietbrock, A.; Letort, J.; Adhikari, L. B.

    2016-12-01

    The Mw 7.8 Gorkha earthquake, which partially ruptured the Main Himalayan Thrust north of Kathmandu on the 25th April 2015, was the largest and most catastrophic earthquake to strike Nepal since the great M8.4 1934 earthquake. This mainshock was followed by multiple aftershocks, among them two notable events that occurred on the 12th May with magnitudes of 7.3 Mw and 6.3 Mw. Due to these recent events it became essential for the authorities and for the scientific community to better evaluate the seismic risk in the region through a detailed analysis of the earthquake catalog and, among other things, of the spatio-temporal distribution of the Gorkha aftershock sequence. Here we complement this first study with a microseismic study using seismic data from the eastern part of the Nepalese Seismological Center network associated with one broadband station in the Everest region. Our primary goal is to deliver an accurate catalog of the aftershock sequence. Due to the exceptional number of events detected we performed an automatic picking/locating procedure which can be split into 4 steps: 1) coarse picking of the onsets using a classical STA/LTA picker, 2) phase association of picked onsets to detect and declare seismic events, 3) Kurtosis pick refinement around theoretical arrival times to increase picking and location accuracy and, 4) local magnitude calculation based on waveform amplitudes. This procedure is time efficient (~1 s/event), reduces the location uncertainties considerably (~2 to 5 km errors) and increases the number of events detected compared to manual processing. Indeed, the automatic detection rate is 10 times higher than the manual detection rate. By comparing to the USGS catalog we were able to derive a new attenuation law to compute local magnitudes in the region. A detailed analysis of the seismicity shows a clear migration toward the east of the region and a sudden decrease of seismicity 100 km east of Kathmandu which may reveal the presence of a tectonic
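
    The first step of the procedure, coarse STA/LTA onset picking, can be sketched compactly in Python with numpy: form a characteristic function (here the squared trace), average it over short and long windows, and trigger where the ratio exceeds a threshold. The window lengths and threshold below are illustrative assumptions, not the values used by the authors.

        import numpy as np

        def sta_lta(trace, dt, sta_win=1.0, lta_win=30.0):
            """Simple STA/LTA ratio on the squared trace (centred moving averages)."""
            cf = trace.astype(float) ** 2
            nsta = max(1, int(sta_win / dt))
            nlta = max(1, int(lta_win / dt))
            sta = np.convolve(cf, np.ones(nsta) / nsta, mode="same")
            lta = np.convolve(cf, np.ones(nlta) / nlta, mode="same")
            return sta / np.maximum(lta, 1e-20)

        def detect_onsets(trace, dt, threshold=4.0):
            """Sample indices of upward crossings of the STA/LTA threshold."""
            above = sta_lta(trace, dt) > threshold
            return np.flatnonzero(above & ~np.roll(above, 1))

        # Synthetic example: background noise with an impulsive arrival at t = 60 s
        dt = 0.01
        rng = np.random.default_rng(3)
        trace = 0.1 * rng.standard_normal(12000)
        trace[6000:6200] += 2.0 * np.sin(2 * np.pi * 5.0 * np.arange(200) * dt)
        print("onset samples:", detect_onsets(trace, dt))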

  6. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  7. The evaluation of the earthquake hazard using the exponential distribution method for different seismic source regions in and around Ağrı

    Energy Technology Data Exchange (ETDEWEB)

    Bayrak, Yusuf, E-mail: ybayrak@agri.edu.tr [Ağrı İbrahim Çeçen University, Ağrı/Turkey (Turkey)]; Türker, Tuğba, E-mail: tturker@ktu.edu.tr [Karadeniz Technical University, Department of Geophysics, Trabzon/Turkey (Turkey)]

    2016-04-18

    The aim of this study was to determine the earthquake hazard using the exponential distribution method for different seismic source regions of Ağrı and its vicinity. A homogeneous earthquake catalog covering the instrumental period 1900-2015, with 456 earthquakes, has been examined for Ağrı and its vicinity. The catalog was compiled from several sources, including Bogazici University Kandilli Observatory and Earthquake Research Institute (KOERI), the National Earthquake Monitoring Center (NEMC), TUBITAK, TURKNET, the International Seismological Center (ISC), and the Incorporated Research Institutions for Seismology (IRIS). Ağrı and its vicinity are divided into 7 different seismic source regions on the basis of the epicenter distribution of earthquakes in the instrumental period, focal mechanism solutions, and existing tectonic structures. In the study, average magnitude values are calculated for specified magnitude ranges in each of the 7 seismic source regions. For each region, the largest difference between observed and expected cumulative probabilities across the magnitude classes is determined. The recurrence period and the annual number of earthquake occurrences are estimated for earthquakes in Ağrı and its vicinity. As a result, occurrence probabilities were determined for earthquakes above magnitude 3.2 in the 7 seismic source regions, with the largest expected magnitudes being greater than 6.7 for Region 1, greater than 4.7 for Region 2, greater than 5.2 for Region 3, greater than 6.2 for Region 4, greater than 5.7 for Region 5, greater than 7.2 for Region 6, and greater than 6.2 for Region 7. The highest observed magnitude among the 7 seismic source regions of Ağrı and its vicinity is magnitude 7, in Region 6. For Region 6, the occurrence years of future earthquakes were also estimated for the determined magnitudes; for a magnitude of 7.2 this was in 158
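
    For context, the exponential magnitude distribution used in this kind of evaluation follows directly from the Gutenberg-Richter relation. In a common form (general practice; the a- and b-values estimated in the paper are not restated here),

        \log_{10} N(m) = a - b\,m, \qquad P(M \ge m \mid M \ge m_0) = e^{-\beta\,(m - m_0)}, \quad \beta = b \ln 10, \qquad T(m) = \frac{1}{N(m)},

    where N(m) is the annual rate of earthquakes with magnitude at least m, m_0 is the completeness magnitude, and T(m) is the mean recurrence (return) period for magnitude m.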

  8. Probabilistic safety analysis of earth retaining structures during earthquakes

    Science.gov (United States)

    Grivas, D. A.; Souflis, C.

    1982-07-01

    A procedure is presented for determining the probability of failure of earth retaining structures under static or seismic conditions. Four possible modes of failure (overturning, base sliding, bearing capacity, and overall sliding) are examined and their combined effect is evaluated with the aid of combinatorial analysis. The probability of failure is shown to be a more adequate measure of safety than the customary factor of safety. As earth retaining structures may fail in four distinct modes, a system analysis can provide a single estimate for the probability of failure. A Bayesian formulation of the safety of retaining walls is found to provide an improved measure for the predicted probability of failure under seismic loading. The presented Bayesian analysis can account for the damage incurred by a retaining wall during an earthquake to provide an improved estimate for its probability of failure during future seismic events.
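
    If, for illustration, the four failure modes are treated as independent events in a series system, the combinatorial step reduces to the familiar union formula P_sys = 1 - prod(1 - P_i). The sketch below shows that simplified calculation in Python with invented per-mode probabilities; the paper's combinatorial analysis and Bayesian updating are more elaborate than this.

        import numpy as np

        def system_failure_probability(mode_probs):
            """Series-system failure probability assuming independent modes:
            P(system) = 1 - prod(1 - P_i).  Correlation between modes, which a
            full combinatorial analysis would capture, is ignored here."""
            mode_probs = np.asarray(mode_probs, dtype=float)
            return 1.0 - np.prod(1.0 - mode_probs)

        # Hypothetical per-mode probabilities under a given seismic load:
        # overturning, base sliding, bearing capacity, overall sliding
        p_modes = [0.02, 0.05, 0.01, 0.03]
        print("system probability of failure:",
              round(float(system_failure_probability(p_modes)), 4))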

  9. (Multi)fractality of Earthquakes by use of Wavelet Analysis

    Science.gov (United States)

    Enescu, B.; Ito, K.; Struzik, Z. R.

    2002-12-01

    The fractal character of earthquakes' occurrence, in time, space or energy, has by now been established beyond doubt and is in agreement with modern models of seismicity. Moreover, the cascade-like generation process of earthquakes -with one "main" shock followed by many aftershocks, having their own aftershocks- may well be described through multifractal analysis, well suited for dealing with such multiplicative processes. The (multi)fractal character of seismicity has been analysed so far by using traditional techniques, like the box-counting and correlation function algorithms. This work introduces a new approach for characterising the multifractal patterns of seismicity. The use of wavelet analysis, in particular of the wavelet transform modulus maxima, to multifractal analysis was pioneered by Arneodo et al. (1991, 1995) and applied successfully in diverse fields, such as the study of turbulence, the DNA sequences or the heart rate dynamics. The wavelets act like a microscope, revealing details about the analysed data at different times and scales. We introduce and perform such an analysis on the occurrence time of earthquakes and show its advantages. In particular, we analyse shallow seismicity, characterised by a high aftershock "productivity", as well as intermediate and deep seismic activity, known for its scarcity of aftershocks. We examine as well declustered (aftershocks removed) versions of seismic catalogues. Our preliminary results show some degree of multifractality for the undeclustered, shallow seismicity. On the other hand, at large scales, we detect a monofractal scaling behaviour, clearly put in evidence for the declustered, shallow seismic activity. Moreover, some of the declustered sequences show a long-range dependent (LRD) behaviour, characterised by a Hurst exponent, H > 0.5, in contrast with the memory-less, Poissonian model. We demonstrate that the LRD is a genuine characteristic and is not an effect of the time series probability

  10. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-15

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  11. Automated hazard analysis of digital control systems

    International Nuclear Information System (INIS)

    Garrett, Chris J.; Apostolakis, George E.

    2002-01-01

    Digital instrumentation and control (I and C) systems can provide important benefits in many safety-critical applications, but they can also introduce potential new failure modes that can affect safety. Unlike electro-mechanical systems, whose failure modes are fairly well understood and which can often be built to fail in a particular way, software errors are very unpredictable. There is virtually no nontrivial software that will function as expected under all conditions. Consequently, there is a great deal of concern about whether there is a sufficient basis on which to resolve questions about safety. In this paper, an approach for validating the safety requirements of digital I and C systems is developed which uses the Dynamic Flowgraph Methodology to conduct automated hazard analyses. The prime implicants of these analyses can be used to identify unknown system hazards, prioritize the disposition of known system hazards, and guide lower-level design decisions to either eliminate or mitigate known hazards. In a case study involving a space-based reactor control system, the method succeeded in identifying an unknown failure mechanism

  12. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Labor Market Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake

    Science.gov (United States)

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of economic Super Sectors in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each Super Sector to each Instrumental Intensity level. The analysis concerns the direct effect of the scenario earthquake on economic sectors and provides a baseline for the indirect and interactive analysis of an input-output model of the regional economy. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by the North American Industry Classification System. According to the analysis results, nearly 225,000 business
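
    As described above, exposure is the absolute count of a labor-market variable at each Instrumental Intensity level, and sensitivity is that count expressed as a percentage of the Super Sector total. A minimal pandas sketch of that tabulation follows; the column names and toy ZIP-code records are assumptions, not the CA EDD schema.

```python
import pandas as pd

# Hypothetical ZIP-code-level records: each row is a ZIP code with its assigned
# Instrumental Intensity, Super Sector, and employee count (toy numbers).
data = pd.DataFrame({
    "zip":       ["90001", "90002", "92101", "93501", "91701", "92301"],
    "intensity": ["VII",   "VIII",  "VI",    "IX",    "VII",   "VIII"],
    "sector":    ["Trade", "Trade", "Manufacturing", "Trade", "Manufacturing", "Trade"],
    "employees": [12000,   8000,    5000,    3000,    7000,    4000],
})

# Exposure: employees in each Super Sector exposed to each intensity level.
exposure = (data.groupby(["sector", "intensity"])["employees"]
                .sum()
                .unstack(fill_value=0))

# Sensitivity: exposure as a percentage of each Super Sector's total employment.
sensitivity = exposure.div(exposure.sum(axis=1), axis=0) * 100.0

print("Exposure (employees):\n", exposure, "\n")
print("Sensitivity (% of sector employment):\n", sensitivity.round(1))
```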

  13. Seismic hazard analysis of Sinop province, Turkey using ...

    Indian Academy of Sciences (India)

    1997-01-11

    Jan 11, 1997 ... 2008 in the Sinop province of Turkey this study presents a seismic hazard analysis based on ... Considering the development and improvement ... It is one of the most populated cities in the coun- ... done as reliably as the seismic hazard of region per- .... Seismic safety work of underground networks was.

  14. Fleeing to Fault Zones: Incorporating Syrian Refugees into Earthquake Risk Analysis along the East Anatolian and Dead Sea Rift Fault Zones

    Science.gov (United States)

    Wilson, B.; Paradise, T. R.

    2016-12-01

    The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages—placing stress on urban settings and increasing potential exposure to strong shaking. Yet, displaced populations are not traditionally captured in data sources used in earthquake risk analysis or loss estimations. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase of seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine degree of under-estimation resulting from forgoing migration data in loss modeling. We find that including refugee populations increased casualties by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context which unites physical, political, cultural, and socio-economic landscapes. Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration
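
    The under-estimation figures quoted above come from re-running a loss calculation with and without refugee populations. The sketch below reproduces that comparison in the simplest possible form: a hypothetical intensity-dependent fatality rate is applied to district populations with and without integrated refugees, and the relative increase in estimated fatalities is reported. The fatality rates and district numbers are illustrative assumptions, not the semi-empirical model used in the study.

```python
# Hypothetical per-district inputs: shaking intensity (MMI), resident population,
# and an estimate of integrated refugees (median scenario).
districts = [
    {"name": "A", "mmi": 8.0, "residents": 500_000, "refugees": 120_000},
    {"name": "B", "mmi": 7.0, "residents": 300_000, "refugees": 60_000},
    {"name": "C", "mmi": 6.0, "residents": 800_000, "refugees": 20_000},
]

def fatality_rate(mmi):
    """Illustrative fatality rate per person as a function of intensity (assumed)."""
    table = {6.0: 1e-5, 7.0: 1e-4, 8.0: 1e-3, 9.0: 5e-3}
    return table.get(mmi, 0.0)

def total_fatalities(include_refugees):
    total = 0.0
    for d in districts:
        pop = d["residents"] + (d["refugees"] if include_refugees else 0)
        total += pop * fatality_rate(d["mmi"])
    return total

base = total_fatalities(include_refugees=False)
with_refugees = total_fatalities(include_refugees=True)
print(f"Fatalities without refugees: {base:.0f}")
print(f"Fatalities with refugees:    {with_refugees:.0f}")
print(f"Relative increase: {100 * (with_refugees - base) / base:.1f}%")
```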

  15. Coulomb Stress Change and Seismic Hazard of Rift Zones in Southern Tibet after the 2015 Mw7.8 Nepal Earthquake and Its Mw7.3 Aftershock

    Science.gov (United States)

    Dai, Z.; Zha, X.; Lu, Z.

    2015-12-01

    In southern Tibet (30–34°N, 80–95°E), many north-trending rifts, such as the Yadong-Gulu and Lunggar rifts, are characterized by internally drained graben or half-graben basins bounded by active normal faults. Some developed rifts have become part of important transportation lines in Tibet, China. Since 1976, eighty-seven earthquakes with Mw > 5.0 have occurred in the rift regions, and fifty-five of these events have normal-faulting focal mechanisms according to the GCMT catalog. These rifts and normal faults are associated with both the EW-trending extension of southern Tibet and the convergence between India and Tibet. The 2015 Mw7.8 Nepal great earthquake and its Mw7.3 aftershock occurred on the Main Himalayan Thrust zone and caused tremendous damage in the Kathmandu region. Those earthquakes will lead to significant viscoelastic deformation and stress changes in southern Tibet in the future. To evaluate the seismic hazard in the active rift regions of southern Tibet, we modeled the slip distribution of the 2015 Nepal earthquakes using the InSAR displacement field from ALOS-2 satellite SAR data and calculated the Coulomb failure stress (CFS) change on the active normal faults in the rift zones. Because the estimated CFS depends on the geometrical parameters of the receiver faults, accurate fault parameters in the rift zones are needed. Some historical earthquakes have been studied using field data, teleseismic data and InSAR observations, but the results are not in agreement with each other. In this study, we re-evaluated the geometrical parameters of seismogenic faults in the rift zones using high-quality coseismic InSAR observations and teleseismic body-wave data. Finally, we evaluate the seismic hazard in the rift zones according to the estimated CFS values and the aftershock distribution.
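
    The quantity evaluated on the rift-bounding normal faults above is the Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn, where Δτ is the shear stress change resolved in the slip direction, Δσn the normal stress change (tension positive) and μ′ the effective friction coefficient. The sketch below resolves an arbitrary stress-change tensor onto a receiver fault defined by strike, dip and rake; the tensor values and μ′ = 0.4 are placeholders, and a real calculation would use coseismic stress tensors derived from a slip model such as the InSAR-based one in the study.

```python
import numpy as np

def fault_vectors(strike_deg, dip_deg, rake_deg):
    """Unit normal and slip vectors in (north, east, down) coordinates,
    following the Aki & Richards convention."""
    phi, delta, lam = np.radians([strike_deg, dip_deg, rake_deg])
    n = np.array([-np.sin(delta) * np.sin(phi),
                   np.sin(delta) * np.cos(phi),
                  -np.cos(delta)])
    s = np.array([np.cos(lam) * np.cos(phi) + np.sin(lam) * np.cos(delta) * np.sin(phi),
                  np.cos(lam) * np.sin(phi) - np.sin(lam) * np.cos(delta) * np.cos(phi),
                 -np.sin(lam) * np.sin(delta)])
    return n, s

def coulomb_stress_change(dsigma, strike, dip, rake, mu_eff=0.4):
    """dCFS = d_tau + mu' * d_sigma_n for a stress-change tensor dsigma (Pa),
    given in (north, east, down) axes with tension positive."""
    n, s = fault_vectors(strike, dip, rake)
    traction = dsigma @ n          # traction change on the fault plane
    d_sigma_n = traction @ n       # normal stress change (tension positive)
    d_tau = traction @ s           # shear stress change in the slip direction
    return d_tau + mu_eff * d_sigma_n

if __name__ == "__main__":
    # Placeholder coseismic stress-change tensor (Pa); in practice this comes from
    # the inverted slip model evaluated at the receiver-fault location and depth.
    dsigma = np.array([[ 2.0e4, -1.0e4, 0.0e0],
                       [-1.0e4, -3.0e4, 5.0e3],
                       [ 0.0e0,  5.0e3, 1.0e4]])
    # North-striking normal fault (rake -90), as for many southern Tibet rifts.
    dcfs = coulomb_stress_change(dsigma, strike=0.0, dip=60.0, rake=-90.0)
    print(f"dCFS = {dcfs:.1f} Pa")
```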

  16. The revaluation of the macroseismic effects of March 4, 1977 earthquake in the frame of the new seismic hazard assessment methodologies

    International Nuclear Information System (INIS)

    Pantea, A.; Constantin, Angela; Anghel, M.

    2002-01-01

    To increase the earthquake resistance of structures, design and construction norms require the best possible knowledge of seismic hazard parameters and the use of new seismic hazard assessment methodologies. One of these parameters is the seismic intensity of the earthquakes that occurred over the whole analyzed territory during as long a time interval as data are available for, especially for the strongest of them. For the Romanian territory, the strongest and best known earthquake from the point of view of macroseismic effects is the March 4, 1977 earthquake. Seismology by itself, without geophysics (solid earth physics), geology, geography, and geodesy, cannot fully, comprehensively and validly assess seismic hazards. Among those who have understood seismic hazard assessment as the result of cooperation between the geosciences as a whole and seismology, one may quote Bune, 1978; Pantea et al., 2002, etc. Assessing seismic hazards is a complex undertaking, for it draws on a vast amount of knowledge in numerous sectors of the geosciences, particularly solid earth physics as a branch of geophysics that also includes seismology, tectonic physics, gravimetry, geomagnetism, geochronology, etc. It involves processing the results of complex geophysical, seismologic, tectonic, and geologic studies. To get a picture of, and understand, the laws that govern seismogenesis, one has to know the relations among the measured physical quantities indicating the properties of the rocks (whether gravimetric, magnetometric, electrometric, seismometric, or others), the dynamics of tectonic structures, and their nature and geological characteristics. Geophysics can be relied upon to determine the deep internal structure of the earth that geological methods are unable to reveal. Geophysics, and implicitly seismology, can help resolve the problem by: 1. Identifying the areas of the seismic sources and their characteristics, including focal depth, M max [Bune, 1978], and the recurrence chart

  17. ODH, oxygen deficiency hazard cryogenic analysis

    International Nuclear Information System (INIS)

    Augustynowicz, S.D.

    1994-01-01

    An oxygen deficiency exists when the concentration of oxygen, by volume, drops to a level at which atmosphere-supplying respiratory protection must be provided. Since liquid cryogens can expand by factors of 700 (LN2) to 850 (LHe), the uncontrolled release into an enclosed space can easily cause an oxygen-deficient condition. An oxygen deficiency hazard (ODH) fatality rate per hour (OE) is defined as OE = Σ N_i P_i F_i, where N_i = number of components, P_i = probability of failure or operator error, and F_i = fatality factor. ODHs range from "unclassified" (OE < 10⁻⁹ 1/h) to class 4, which is the most hazardous (OE > 10⁻¹ 1/h). For Superconducting Super Collider Laboratory (SSCL) buildings where cryogenic systems exist, failure rate, fatality factor, reduced oxygen ratio, and fresh air circulation are examined
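
    The oxygen-deficiency fatality rate defined above, OE = Σ N_i P_i F_i, lends itself to a one-line calculation once component counts, failure rates and fatality factors are tabulated. The sketch below evaluates OE for a hypothetical cryogenic enclosure and checks it against the two thresholds quoted in the record (below 10⁻⁹ per hour unclassified, above 10⁻¹ per hour class 4); the component data are invented for illustration, and intermediate class boundaries are deliberately left to the facility's own classification table.

```python
# Hypothetical component inventory for one enclosure: (description, N_i, P_i, F_i)
# N_i = number of components, P_i = failure (or operator error) rate per hour,
# F_i = fatality factor for the resulting oxygen concentration.
components = [
    ("LN2 dewar rupture",      2, 1.0e-8, 1.00),
    ("Relief valve discharge", 6, 1.0e-6, 0.01),
    ("Transfer-line leak",     4, 3.0e-7, 0.10),
]

# OE = sum over sources of N_i * P_i * F_i  (fatalities per hour)
oe = sum(n * p * f for _, n, p, f in components)
print(f"OE = {oe:.2e} per hour")

# End-member thresholds quoted in the record; intermediate class boundaries are
# not given there, so only these two checks are made here.
if oe < 1.0e-9:
    print("Unclassified (OE < 1e-9 /h)")
elif oe > 1.0e-1:
    print("ODH class 4 (OE > 1e-1 /h)")
else:
    print("Intermediate ODH class; consult the facility's classification table")
```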

  18. Proceedings of conference XXXII; workshop on Future directions in evaluating earthquake hazards of Southern California

    Science.gov (United States)

    Brown, William M.; Kockelman, William J.; Ziony, Joseph I.

    1986-01-01

    Hydrologic data were collected at White Sands Missile Range, NM, in 1985. The total groundwater withdrawal in 1985 was 676,433,800 gallons. The 11 supply wells in the Post Headquarters well field produced 642,056,000 gallons, or about 95 percent of the total. The six Range area supply wells produced 34,377,800 gallons. The total groundwater withdrawal was 8,841,200 gallons less in 1985 than 1984. Water samples from six Post Headquarters supply wells were collected for major chemical analysis. Water samples from 19 other wells were collected for pH and specific-conductance analysis. Depth-to-water measurements in the Post Headquarters supply wells showed seasonal fluctuations as well as continued long-term declines. (USGS)

  19. Statistical analysis of the uncertainty related to flood hazard appraisal

    Science.gov (United States)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency can cause flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  20. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    Science.gov (United States)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located off the Sanriku coast (37.75°N ~ 41.00°N, 143.00°E ~ 144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spreading of the candlesticks prior to the occurrence of events greater than Mw 6.0. A second pattern is a convergence of the Bollinger Band, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle, and agree with the characteristics of both candlestick chart and Bollinger Band analysis. These results show a high correlation between patterns in earthquake occurrence and the trend analysis provided by these two statistical methods, and they support the appropriateness of applying these financial analysis methods to the analysis of earthquake occurrence.
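
    The Bollinger Band technique referred to above is simply a rolling mean bracketed by ± k rolling standard deviations; band convergence (a "squeeze") flags a possible change of regime. A minimal pandas sketch on a monthly earthquake-count series is shown below; the synthetic catalogue, 12-month window and k = 2 are assumptions, not the parameters used by the authors.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical monthly counts of Mw >= 4.0 events in the study region.
months = pd.date_range("1973-02-01", "2013-05-01", freq="MS")
counts = pd.Series(rng.poisson(lam=8, size=len(months)), index=months)

window, k = 12, 2.0
mid = counts.rolling(window).mean()
std = counts.rolling(window).std()
bands = pd.DataFrame({
    "count": counts,
    "middle": mid,
    "upper": mid + k * std,
    "lower": mid - k * std,
    "bandwidth": 2 * k * std,   # narrowing bandwidth = Bollinger 'squeeze'
})

# Flag months whose bandwidth falls in the lowest decile: candidate trend changes.
squeeze = bands["bandwidth"] < bands["bandwidth"].quantile(0.10)
print(bands.loc[squeeze].tail())
```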

  1. Analysis of soil radon data in earthquake precursory studies

    Directory of Open Access Journals (Sweden)

    Hari Prasad Jaishi

    2014-10-01

    Full Text Available Soil radon data were recorded at two selected sites along the Mat fault in Mizoram (India), which lies in the highest seismic zone of India. The study was carried out from July 2011 to May 2013 using LR-115 Type II films. Precursory changes in radon concentration were observed prior to some earthquakes that occurred around the measuring sites. A positive correlation was found between the measured radon data and the seismic activity in the region. Statistical analysis of the radon data together with the meteorological parameters was done using the multiple regression method. The results obtained show that the method employed was useful for removing the effect of meteorological parameters and for identifying radon maxima possibly caused by seismic activity.
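
    A compact version of the correction step described above (regressing radon on meteorological variables and inspecting the residuals for anomalies) is sketched below with numpy least squares; the synthetic radon, temperature, pressure and humidity series and the 2-sigma anomaly rule are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_days = 300

# Hypothetical daily series: meteorological drivers plus a radon response.
temperature = 20 + 8 * np.sin(np.linspace(0, 6 * np.pi, n_days)) + rng.normal(0, 1, n_days)
pressure = 1010 + rng.normal(0, 4, n_days)
humidity = 70 + rng.normal(0, 10, n_days)
radon = 50 + 1.5 * temperature - 0.8 * (pressure - 1010) + rng.normal(0, 5, n_days)
radon[200:205] += 40  # an injected 'precursory' excursion for demonstration

# Multiple linear regression: radon ~ const + temperature + pressure + humidity
X = np.column_stack([np.ones(n_days), temperature, pressure, humidity])
coef, *_ = np.linalg.lstsq(X, radon, rcond=None)
residuals = radon - X @ coef

# Residuals exceeding 2 standard deviations are candidate seismically driven maxima.
threshold = 2 * residuals.std(ddof=1)
anomalous_days = np.flatnonzero(residuals > threshold)
print("Regression coefficients:", np.round(coef, 2))
print("Days with radon residual > 2 sigma:", anomalous_days)
```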

  2. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    International Nuclear Information System (INIS)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka; Ken Yanagisawa; Tadashi Annaka

    2006-01-01

    As in seismic design, it is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, because even once the design basis tsunami height is set, the actual tsunami height may still exceed it due to uncertainties regarding the tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and the uncertainty in the tsunami hazard is displayed by the 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
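
    The fractile and mean hazard curves mentioned above are obtained by weighting the tsunami-height exceedance curves of the individual logic-tree branches. The sketch below computes the weighted mean and weighted percentiles across a few hypothetical branches at each tsunami height; the branch curves and weights are invented for illustration.

```python
import numpy as np

heights = np.linspace(0.5, 10.0, 20)           # tsunami height (m)

# Hypothetical logic-tree branches: (weight, annual exceedance curve over `heights`)
branches = [
    (0.4, 1e-2 * np.exp(-heights / 1.5)),
    (0.3, 2e-2 * np.exp(-heights / 1.0)),
    (0.2, 5e-3 * np.exp(-heights / 2.0)),
    (0.1, 1e-3 * np.exp(-heights / 3.0)),
]
weights = np.array([w for w, _ in branches])
curves = np.vstack([c for _, c in branches])   # shape (n_branches, n_heights)

def weighted_fractile(values, w, q):
    """Weighted q-fractile of `values` (1-D) with weights `w`."""
    order = np.argsort(values)
    cum = np.cumsum(w[order])
    return values[order][np.searchsorted(cum, q * cum[-1])]

mean_curve = weights @ curves
fractiles = {q: np.array([weighted_fractile(curves[:, j], weights, q)
                          for j in range(len(heights))])
             for q in (0.05, 0.16, 0.50, 0.84, 0.95)}

for h, m, p50 in zip(heights[::5], mean_curve[::5], fractiles[0.50][::5]):
    print(f"h = {h:4.1f} m  mean = {m:.2e}/yr  median = {p50:.2e}/yr")
```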

  3. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    Science.gov (United States)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault-type are also integrated into the hazard
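
    The Monte Carlo approach described above replaces the analytical hazard integral with many simulated periods of seismicity. The sketch below generates synthetic catalogues from a truncated Gutenberg-Richter distribution, converts each event to PGA at a site with a simple hypothetical attenuation relation plus lognormal aleatory scatter, and reads off the PGA with a 10% chance of exceedance in 50 years; every numerical choice (source geometry, attenuation coefficients, rates) is an assumption for illustration, not one of the models evaluated in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# --- assumed source model: one areal source surrounding a site at the origin ---
annual_rate = 5.0           # events per year with M >= mmin
mmin, mmax, b = 4.5, 7.5, 1.0
beta = b * np.log(10.0)

def sample_magnitudes(n):
    """Truncated Gutenberg-Richter sampling by inversion."""
    u = rng.uniform(size=n)
    c = 1.0 - np.exp(-beta * (mmax - mmin))
    return mmin - np.log(1.0 - u * c) / beta

def gmpe_pga(m, r_km):
    """Hypothetical attenuation relation: ln PGA(g) = a0 + a1*M - a2*ln(R + 10)."""
    ln_pga = -4.0 + 1.0 * m - 1.3 * np.log(r_km + 10.0)
    return np.exp(ln_pga + rng.normal(0.0, 0.6, size=np.shape(m)))

# --- simulate many 50-year windows and collect the maximum site PGA in each ----
n_windows, years = 5000, 50
max_pga = np.empty(n_windows)
for i in range(n_windows):
    n_events = rng.poisson(annual_rate * years)
    mags = sample_magnitudes(n_events)
    dists = np.sqrt(rng.uniform(0.0, 100.0**2, size=n_events))  # 0-100 km, area-uniform
    max_pga[i] = gmpe_pga(mags, dists).max() if n_events else 0.0

# PGA with 10% probability of exceedance in 50 years = 90th percentile of maxima.
print(f"PGA(10% in 50 yr) ~ {np.quantile(max_pga, 0.90):.3f} g")
```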

  4. NATURAL HAZARD ASSESSMENT OF SW MYANMAR - A CONTRIBUTION OF REMOTE SENSING AND GIS METHODS TO THE DETECTION OF AREAS VULNERABLE TO EARTHQUAKES AND TSUNAMI / CYCLONE FLOODING

    Directory of Open Access Journals (Sweden)

    George Pararas-Carayannis

    2009-01-01

    Full Text Available Myanmar, formerly Burma, is vulnerable to several natural hazards, such as earthquakes, cyclones, floods, tsunamis and landslides. The present study focuses on geomorphologic and geologic investigations of the south-western region of the country, based on satellite data (Shuttle Radar Topography Mission-SRTM, MODIS and LANDSAT. The main objective is to detect areas vulnerable to inundation by tsunami waves and cyclone surges. Since the region is also vulnerable to earthquake hazards, it is also important to identify seismotectonic patterns, the location of major active faults, and local site conditions that may enhance ground motions and earthquake intensities. As illustrated by this study, linear, topographic features related to subsurface tectonic features become clearly visible on SRTM-derived morphometric maps and on LANDSAT imagery. The GIS integrated evaluation of LANDSAT and SRTM data helps identify areas most susceptible to flooding and inundation by tsunamis and storm surges. Additionally, land elevation maps help identify sites greater than 10 m in elevation height, that would be suitable for the building of protective tsunami/cyclone shelters.

  5. Study on China’s Earthquake Prediction by Mathematical Analysis and its Application in Catastrophe Insurance

    Science.gov (United States)

    Jianjun, X.; Bingjie, Y.; Rongji, W.

    2018-03-01

    The purpose of this paper was to improve the level of catastrophe insurance. Firstly, earthquake predictions were carried out using mathematical analysis methods. Secondly, the policies and models of foreign catastrophe insurance schemes were compared. Thirdly, suggestions on catastrophe insurance for China were discussed. Further study should pay more attention to earthquake prediction by introducing big data.

  6. Atlas of Wenchuan-Earthquake Geohazards : Analysis of co-seismic and post-seismic Geohazards in the area affected by the 2008 Wenchuan Earthquake

    NARCIS (Netherlands)

    Tang, Chuan; van Westen, C.J.

    2018-01-01

    This atlas provides basic information and overviews of the occurrence of co-seismic landslides, the subsequent rainstorm-induced debris flows, and the methods used for hazard and risk assessment in the Wenchuan-earthquake affected area. The atlas pages are illustrated with maps, photos and graphs,

  7. Flash sourcing, or rapid detection and characterization of earthquake effects through website traffic analysis

    Directory of Open Access Journals (Sweden)

    Laurent Frobert

    2011-06-01

    Full Text Available This study presents the latest developments of an approach called 'flash sourcing', which provides information on the effects of an earthquake within minutes of its occurrence. The information is derived from an analysis of the traffic surges of the European–Mediterranean Seismological Centre website after felt earthquakes. These surges are caused by eyewitnesses to a felt earthquake, who are the first to be informed of, and hence the first concerned by, an earthquake occurrence. Flash sourcing maps the felt area and, at least in some circumstances, the regions affected by severe damage or network disruption. We illustrate how the flash-sourced information improves and speeds up the delivery of public earthquake information, and, beyond seismology, we consider what it can teach us about public responses when experiencing an earthquake. Future developments should improve the description of earthquake effects and potentially contribute to more efficient earthquake response by filling the information gap after the occurrence of an earthquake.

  8. Fire Hazards Analysis for the 200 Area Interim Storage Area

    International Nuclear Information System (INIS)

    JOHNSON, D.M.

    2000-01-01

    This documents the Fire Hazards Analysis (FHA) for the 200 Area Interim Storage Area. The Interim Storage Cask, Rad-Vault, and NAC-1 Cask are analyzed for fire hazards and the 200 Area Interim Storage Area is assessed according to HNF-PRO-350 and the objectives of DOE Order 5480.7A. This FHA addresses the potential fire hazards associated with the Interim Storage Area (ISA) facility in accordance with the requirements of DOE Order 5480.7A. It is intended to assess the risk from fire to ensure there are no undue fire hazards to site personnel and the public and to ensure property damage potential from fire is within acceptable limits. This FHA will be in the form of a graded approach commensurate with the complexity of the structure or area and the associated fire hazards

  9. Direct methods of soil-structure interaction analysis for earthquake loadings (IV)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J B; Kim, D S; Choi, J S; Kwon, K C; Kim, Y J; Lee, H J; Kim, S B; Kim, D K [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-15

    Methodologies of SSI analysis for earthquake loadings have been reviewed. Based on the finite element method incorporating infinite element technique for the unbounded exterior region, a computer program for the nonlinear seismic analysis named as 'KIESSI-QK' has been developed. The computer program has been verified using a free-field site-response problem. The Hualien FVT stochastic finite element analysis after backfill and the blind prediction of earthquake responses have been carried out utilizing the developed computer program. The earthquake response analysis for the LSST structure has also been performed and compared with the measured data.

  11. Singular limit analysis of a model for earthquake faulting

    DEFF Research Database (Denmark)

    Bossolini, Elena; Brøns, Morten; Kristiansen, Kristian Uldall

    2017-01-01

    In this paper we consider the one dimensional spring-block model describing earthquake faulting. By using geometric singular perturbation theory and the blow-up method we provide a detailed description of the periodicity of the earthquake episodes. In particular, the limit cycles arise from...

  12. Agent-based simulation for human-induced hazard analysis.

    Science.gov (United States)

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, the need for predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.

  13. An innovative view to the seismic hazard from strong Vrancea intermediate-depth earthquakes: the case studies of Bucharest (Romania) and Russe (Bulgaria)

    International Nuclear Information System (INIS)

    Panza, G.F.; Cioflan, C.; Marmureanu, G.; Kouteva, M.; Paskaleva, I.; Romanelli, F.

    2003-04-01

    An advanced procedure for ground motion modelling, capable of synthesizing the seismic ground motion from basic understanding of fault mechanism and seismic wave propagation, is applied to compute seismic signals at Bucharest (Romania) and Russe, NE Bulgaria, due to the seismic hazard from intermediate-depth Vrancea earthquakes. The theoretically obtained signals are successfully compared with the available observations. For both case studies site response estimates along selected geological cross sections are provided for three recent, strong and intermediate-depth, Vrancea earthquakes: August 30, 1986 and May 30 and 31, 1990. The applied ground motion modelling technique has proved that it is possible to investigate the local effects, taking into account both the seismic source and the propagation path effects. The computation of realistic seismic input, utilising the huge amount of geological, geophysical and geotechnical data, already available, goes well beyond the conventional deterministic approach and gives an economically valid scientific tool for seismic microzonation. (author)

  14. Generation of artificial earthquakes for dynamic analysis of nuclear power plant

    International Nuclear Information System (INIS)

    Tsushima, Y.; Hiromatsu, T.; Abe, Y.; Tamaki, T.

    1979-01-01

    A procedure for generating artificial earthquakes for the purpose of the dynamic analysis of nuclear power plants has been studied and relevant computer codes developed. This paper briefly describes the generation procedure employed in the computer codes and also presents the results for two artificial earthquakes generated as examples of input motions for the aseismic design of a BWR-type reactor building. Using one of the generated artificial earthquakes and two actually recorded earthquakes, the non-linear responses of the reactor building were computed and the results were compared with each other. From this comparison, it is concluded that the computer codes are practically usable and that the generated artificial earthquakes are useful and powerful as input motions for the dynamic analysis of a nuclear power plant. (author)
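
    A common way to generate artificial earthquake input motions of the kind described above is to superpose sinusoids with random phases, shape them with an intensity envelope, and (in a full procedure) iterate the amplitudes until a target response spectrum is matched. The sketch below performs only the first, non-iterative step; the duration, frequency band, envelope and 0.3 g target peak are illustrative assumptions, not the procedure implemented in the authors' codes.

```python
import numpy as np

rng = np.random.default_rng(4)

dt, duration = 0.01, 20.0                 # time step (s), total duration (s)
t = np.arange(0.0, duration, dt)

# Superpose sinusoids between 0.5 and 25 Hz with random phases.
freqs = np.linspace(0.5, 25.0, 200)
phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
signal = np.sum(np.sin(2.0 * np.pi * freqs[:, None] * t[None, :] + phases[:, None]),
                axis=0)

# Trapezoidal-style intensity envelope: 2 s rise, strong phase, exponential decay.
envelope = np.ones_like(t)
envelope[t < 2.0] = t[t < 2.0] / 2.0
decay = t > 12.0
envelope[decay] = np.exp(-0.4 * (t[decay] - 12.0))

accel = signal * envelope
accel *= 0.3 * 9.81 / np.abs(accel).max()   # scale to an assumed 0.3 g peak

print(f"Samples: {accel.size}, PGA = {np.abs(accel).max() / 9.81:.2f} g")
```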

  15. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-08-01

    Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to-date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to result in many of the advantages previously identified for microbiological HACCP procedures: more effective, efficient and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP becoming as effective as microbiological HACCP.

  16. Estimated airborne release of plutonium from the 102 Building at the General Electric Vallecitos Nuclear Center, Vallecitos, California, as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.; Hays, I.D.

    1980-12-01

    This report estimates the potential airborne releases of plutonium as a consequence of various severities of earthquake and wind hazard postulated for the 102 Building at the General Electric Vallecitos Nuclear Center in California. The releases are based on damage scenarios developed by other specialists. The hazard severities presented range up to a nominal velocity of 230 mph for wind hazard and are in excess of 0.8 g linear acceleration for earthquakes. The consequences of thrust faulting are considered. The approaches and factors used to estimate the releases are discussed. Release estimates range from 0.003 to 3 g Pu

  17. Direct methods of soil-structure interaction analysis for earthquake loadings (V)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J. B.; Choi, J. S.; Lee, J. J.; Park, D. U. [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-07-15

    Methodologies of SSI analysis for earthquake loadings have been reviewed. Based on the finite element method incorporating an infinite element technique for the unbounded exterior region, a computer program for nonlinear seismic analysis named 'KIESSI' has been developed. The computer program has been verified using a free-field site-response problem. Post-correlation analysis for the Hualien FVT after backfill and the blind prediction of earthquake responses have been carried out utilizing the developed computer program. The earthquake response analyses for three LSST structures (the Hualien, Lotung and Tepsco structures) have also been performed and compared with the measured data.

  18. Hospital stay as a proxy indicator for severe injury in earthquakes: a retrospective analysis.

    Science.gov (United States)

    Zhao, Lu-Ping; Gerdin, Martin; Westman, Lina; Rodriguez-Llanes, Jose Manuel; Wu, Qi; van den Oever, Barbara; Pan, Liang; Albela, Manuel; Chen, Gao; Zhang, De-Sheng; Guha-Sapir, Debarati; von Schreeb, Johan

    2013-01-01

    Earthquakes are the most violent type of natural disaster, and injuries are the dominant medical problem in the early phases after earthquakes. However, likely because of poor data availability, high-quality research on injuries after earthquakes is lacking. Length of hospital stay (LOS) has been validated as a proxy indicator for injury severity in high-income settings and could potentially be used in retrospective research on injuries after earthquakes. In this study, we assessed whether LOS is an adequate proxy indicator for severe injury in trauma survivors of an earthquake. A retrospective analysis was conducted using a database of 1,878 injured patients from the 2008 Wenchuan earthquake. Our primary outcome was severe injury, defined as a composite measure of serious injury or resource use. Secondary outcomes were serious injury and resource use, analysed separately. Non-parametric receiver operating characteristic (ROC) and area under the curve (AUC) analysis was used to test the discriminatory accuracy of LOS when used to identify severe injury. Taking an AUC of 0.7 or higher to indicate adequate discrimination, LOS did not perform adequately as a proxy indicator for severe injury in this population of earthquake survivors. However, LOS was found to be a proxy for major nonorthopaedic surgery and blood transfusion. These findings can be useful for retrospective research on earthquake-injured patients when detailed hospital records are not available.

  19. Comparison of Structurally Controlled Landslide Hazard Simulation to the Co-seismic Landslides Caused by the M 7.2 2013 Bohol Earthquake.

    Science.gov (United States)

    Galang, J. A. M. B.; Eco, R. C.; Lagmay, A. M. A.

    2014-12-01

    The M_w 7.2 October 15, 2013 Bohol earthquake is one of the more destructive earthquakes to hit the Philippines in the 21st century. The epicenter was located in Sagbayan municipality, central Bohol, and the event was generated by a previously unmapped reverse fault called the "Inabanga Fault". The earthquake resulted in 209 fatalities and over 57 million USD worth of damage. The earthquake generated co-seismic landslides, most of which were related to fault structures. Unlike rainfall-induced landslides, co-seismic landslides are triggered without warning. Preparations for this type of landslide rely heavily on the identification of fracture-related slope instability. To mitigate the impacts of co-seismic landslide hazards, morpho-structural orientations of discontinuity sets were mapped using remote sensing techniques with the aid of a Digital Terrain Model (DTM) obtained in 2012. The DTM used is an IFSAR-derived image with a 5-meter pixel resolution and approximately 0.5-meter vertical accuracy. Coltop 3D software was then used to identify similar structures, including measurement of their dip and dip directions. The chosen discontinuity sets were then keyed into Matterocking software to identify potential rock slide zones due to planar or wedged discontinuities. After identifying the structurally controlled unstable slopes, the rock mass propagation extent of the possible rock slides was simulated using Conefall. Separately, a manual landslide inventory was compiled using post-earthquake satellite images and LIDAR. The simulation results were compared to the landslide inventory, which identified at least 873 landslides. Out of the 873 landslides identified through the inventory, 786 or 90% intersect the simulated structurally controlled landslide hazard areas of Bohol. The results show the potential of this method to identify co-seismic landslide hazard areas for disaster mitigation. Along with computer methods to simulate shallow landslides, and debris flow

  20. Hazards analysis of TNX Large Melter-Off-Gas System

    International Nuclear Information System (INIS)

    Randall, C.T.

    1982-03-01

    Analysis of the potential safety hazards and an evaluation of the engineered safety features and administrative controls indicate that the LMOG System can be operated without undue hazard to employees or the public, or damage to equipment. The safety features provided in the facility design coupled with the planned procedural and administrative controls make the occurrence of serious accidents very improbable. A set of recommendations evolved during this analysis that was judged potentially capable of further reducing the probability of personnel injury or further mitigating the consequences of potential accidents. These recommendations concerned areas such as formic acid vapor hazards, hazard of feeding water to the melter at an uncontrolled rate, prevention of uncontrolled glass pours due to melter pressure excursions and additional interlocks. These specific suggestions were reviewed with operational and technical personnel and are being incorporated into the process. The safeguards provided by these recommendations are discussed in this report

  1. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    Science.gov (United States)

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
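
    The mediation-formula approach summarised above can be sketched compactly once the two working models are written down: a linear model for the mediator given exposure, and a proportional hazards model for the outcome whose baseline log cumulative hazard is approximated by a smooth parametric form (here a simple fractional-polynomial term, log H0(t) = γ0 + γ1 log t). Natural direct and indirect effects on the survival-probability scale then follow by averaging the conditional survival function over the mediator distribution. All coefficients below are invented; in practice they would be estimated from data such as the Jackson Heart Study.

```python
import numpy as np

rng = np.random.default_rng(5)

# --- assumed working models (coefficients would normally be estimated) ---------
# Mediator model:  M | A=a  ~  Normal(alpha0 + alpha_a * a, sd_m^2)
alpha0, alpha_a, sd_m = 0.0, 0.8, 1.0
# Outcome model:   lambda(t | a, m) = lambda0(t) * exp(beta_a * a + beta_m * m)
beta_a, beta_m = 0.4, 0.5
# Smooth baseline: log H0(t) = gamma0 + gamma1 * log t  (fractional-polynomial form)
gamma0, gamma1 = -2.0, 1.1

def survival(t, a, m):
    h0 = np.exp(gamma0 + gamma1 * np.log(t))          # baseline cumulative hazard
    return np.exp(-h0 * np.exp(beta_a * a + beta_m * m))

def counterfactual_survival(t, a, a_star, n_draws=100_000):
    """S_{a, a*}(t): exposure set to a, mediator drawn from its law under a*."""
    m = rng.normal(alpha0 + alpha_a * a_star, sd_m, size=n_draws)
    return survival(t, a, m).mean()

t0 = 5.0                                               # time point of interest
s00 = counterfactual_survival(t0, a=0, a_star=0)
s10 = counterfactual_survival(t0, a=1, a_star=0)
s11 = counterfactual_survival(t0, a=1, a_star=1)

print(f"Natural direct effect   (S_10 - S_00): {s10 - s00:+.4f}")
print(f"Natural indirect effect (S_11 - S_10): {s11 - s10:+.4f}")
print(f"Total effect            (S_11 - S_00): {s11 - s00:+.4f}")
```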

  2. Frequency Analysis of Aircraft hazards for License Application

    International Nuclear Information System (INIS)

    K. Ashley

    2006-01-01

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards

  3. Time history nonlinear earthquake response analysis considering materials and geometrical nonlinearity

    International Nuclear Information System (INIS)

    Kobayashi, T.; Yoshikawa, K.; Takaoka, E.; Nakazawa, M.; Shikama, Y.

    2002-01-01

    A time history nonlinear earthquake response analysis method was proposed and applied to earthquake response prediction analysis for the Large Scale Seismic Test (LSST) Program in Hualien, Taiwan, in which a 1/4-scale model of a nuclear reactor containment structure was constructed on a sandy gravel layer. In the analysis, both strain-dependent material nonlinearity and geometrical nonlinearity due to base mat uplift were considered. The 'Lattice Model' was employed as the soil-structure interaction model. An earthquake record on the soil surface at the site was used as the control motion and deconvoluted to obtain the input motion of the analysis model at GL-52 m, with a maximum acceleration of 300 Gal. Two analyses were carried out: (A) time history nonlinear and (B) equivalent linear, and the advantages of the time history nonlinear earthquake response analysis method are discussed
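
    At its core, a time-history nonlinear response analysis of the kind summarised above integrates the equation of motion of a structure with strain-dependent stiffness under a recorded or deconvolved input motion. The sketch below does this for a single-degree-of-freedom oscillator whose secant stiffness softens with displacement, driven by a synthetic ground acceleration; the softening law, damping and input are assumptions chosen only to illustrate the computation, not the Lattice Model used for the Hualien LSST analysis.

```python
import numpy as np
from scipy.integrate import solve_ivp

# --- assumed SDOF properties ---------------------------------------------------
m = 1.0e5                 # mass (kg)
k0 = 4.0e7                # initial stiffness (N/m)
zeta = 0.05               # viscous damping ratio
u_ref = 0.01              # displacement (m) at which softening becomes significant
c = 2.0 * zeta * np.sqrt(k0 * m)

def restoring_force(u):
    """Strain-dependent (softening) spring: secant stiffness k0 / (1 + |u|/u_ref)."""
    return k0 * u / (1.0 + np.abs(u) / u_ref)

# Synthetic input accelerogram: decaying random-phase sum of sinusoids (assumed).
rng = np.random.default_rng(6)
freqs = np.linspace(0.5, 10.0, 40)
phases = rng.uniform(0, 2 * np.pi, freqs.size)

def ground_accel(t):
    env = np.exp(-0.3 * t) * (1 - np.exp(-2.0 * t))
    return 2.0 * env * np.sum(np.sin(2 * np.pi * freqs * t + phases))

def rhs(t, y):
    # Relative coordinates: m*u'' + c*u' + f(u) = -m*a_g(t)
    u, v = y
    a = -(c * v + restoring_force(u)) / m - ground_accel(t)
    return [v, a]

sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0], max_step=0.005)
print(f"Peak relative displacement: {np.abs(sol.y[0]).max() * 1000:.1f} mm")
```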

  4. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
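
    A minimal numerical companion to the Poisson-GP framework discussed above is given below: exceedance magnitudes follow a Generalized Pareto distribution, arrivals follow a Poisson process whose rate trends upward in time, and the probability that a design level has not yet been exceeded by year t (together with the associated hazard and average return period) is tracked. The trend, GPD parameters and design level are all illustrative assumptions rather than values from the paper.

```python
import numpy as np
from scipy.stats import genpareto

# --- assumed nonstationary Poisson-GP model -------------------------------------
lam0, trend = 2.0, 0.02        # events/yr at t=0 and exponential trend per year
xi, scale = 0.1, 1.0           # GPD shape and scale for exceedance magnitudes
z_design = 4.0                 # design level (same units as the GPD magnitudes)

years = np.arange(1, 101)
lam_t = lam0 * np.exp(trend * years)              # time-varying arrival rate
p_exceed = genpareto.sf(z_design, c=xi, scale=scale)

# Annual probability that at least one event exceeds z_design in year t (hazard).
annual_hazard = 1.0 - np.exp(-lam_t * p_exceed)

# Survivor function of the failure time T = first year exceeding z_design,
# and its expectation (average return period), using E[T] = sum_t P(T > t).
survival = np.exp(-np.cumsum(lam_t * p_exceed))
expected_wait = 1.0 + survival.sum()

print(f"Hazard in year 1:   {annual_hazard[0]:.4f}")
print(f"Hazard in year 100: {annual_hazard[-1]:.4f}")
print(f"Average return period of the design level: ~{expected_wait:.1f} years")
```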

  5. Hazard screening application guide. Safety Analysis Report Update Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates the identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.

  6. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    The probabilistic risk analysis used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of analysing the consequences of this release and assessing the risk. The sequence of operations in the individual stages is characterized. The tasks which Czechoslovakia faces in the development of probabilistic risk analysis are listed, and the composition of the work team for coping with these tasks is recommended. (J.C.)

  7. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    International Nuclear Information System (INIS)

    SHULTZ MV

    2008-01-01

    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment, or require emergency response, in addition to those with significant impact on the facility worker, the offsite receptor, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety, but also for worker-safety management programs, emergency management, and environmental programs. This paper also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process

  8. Association between earthquake events and cholera outbreaks: a cross-country 15-year longitudinal analysis.

    Science.gov (United States)

    Sumner, Steven A; Turner, Elizabeth L; Thielman, Nathan M

    2013-12-01

    Large earthquakes can cause population displacement, critical sanitation infrastructure damage, and increased threats to water resources, potentially predisposing populations to waterborne disease epidemics such as cholera. The risk of cholera outbreaks after earthquake disasters remains uncertain, and a cross-country analysis of World Health Organization (WHO) cholera data that would contribute to this discussion has yet to be published. A cross-country longitudinal analysis was conducted among 63 low- and middle-income countries from 1995-2009. The association between earthquake disasters of various effect sizes and a relative spike in cholera rates for a given country was assessed utilizing fixed-effects logistic regression and adjusting for gross domestic product per capita, water and sanitation level, flooding events, percent urbanization, and under-five child mortality. Also, the association between large earthquakes and cholera rate increases of various degrees was assessed. Forty-eight of the 63 countries had at least one year with reported cholera infections during the 15-year study period. Thirty-six of these 48 countries had at least one earthquake disaster. In adjusted analyses, country-years with ≥10,000 persons affected by an earthquake had 2.26 times increased odds (95% CI, 0.89-5.72; P = .08) of having a greater than average cholera rate that year compared to country-years without such an earthquake. The association between large earthquake disasters and cholera infections appeared to weaken as higher levels of cholera rate increase were tested. A trend of increased risk of greater than average cholera rates when more people were affected by an earthquake in a country-year was noted; however, these findings did not reach statistical significance at traditional levels and may be due to chance. Frequent large-scale cholera outbreaks after earthquake disasters appeared to be relatively uncommon.

  9. Guidance document on practices to model and implement Earthquake hazards in extended PSA (final version). Volume 1

    International Nuclear Information System (INIS)

    Decker, K.; Hirata, K.; Groudev, P.

    2016-01-01

    The current report provides guidance for the assessment of seismo-tectonic hazards in level 1 and 2 PSA. The objective is to review existing guidance, identify methodological challenges, and propose novel guidance on key issues. Guidance for the assessment of vibratory ground motion and fault capability comprises the following: - listings of data required for the hazard assessment and methods to estimate data quality and completeness; - in-depth discussion of key input parameters required for hazard models; - discussions of commonly applied hazard assessment methodologies; - references to recent advances in science and technology. Guidance on the assessment of correlated or coincident hazards comprises chapters on: - screening of correlated hazards; - assessment of correlated hazards (natural and man-made); - assessment of coincident hazards. (authors)

  10. Ranking of several ground-motion models for seismic hazard analysis in Iran

    International Nuclear Information System (INIS)

    Ghasemi, H; Zare, M; Fukushima, Y

    2008-01-01

    In this study, six attenuation relationships are classified with respect to the ranking scheme proposed by Scherbaum et al (2004 Bull. Seismol. Soc. Am. 94 1–22). First, the strong motions recorded during the 2002 Avaj, 2003 Bam, 2004 Kojour and 2006 Silakhor earthquakes are consistently processed. Then the normalized residual sets are determined for each selected ground-motion model, considering the strong-motion records chosen. The main advantage of these records is that corresponding information about the causative fault plane has been well studied for the selected events. Such information is used to estimate several control parameters which are essential inputs for attenuation relations. The selected relations (Zare et al (1999 Soil Dyn. Earthq. Eng. 18 101–23); Fukushima et al (2003 J. Earthq. Eng. 7 573–98); Sinaeian (2006 PhD Thesis International Institute of Earthquake Engineering and Seismology, Tehran, Iran); Boore and Atkinson (2007 PEER, Report 2007/01); Campbell and Bozorgnia (2007 PEER, Report 2007/02); and Chiou and Youngs (2006 PEER Interim Report for USGS Review)) have been deemed suitable for predicting peak ground-motion amplitudes in the Iranian plateau. Several graphical techniques and goodness-of-fit measures are also applied for statistical distribution analysis of the normalized residual sets. Such analysis reveals ground-motion models, developed using Iranian strong-motion records as the most appropriate ones in the Iranian context. The results of the present study are applicable in seismic hazard assessment projects in Iran
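
    The ranking scheme applied above rests on the normalized (total) residuals z = (ln obs − ln median prediction)/σ of each candidate ground-motion model: if the model is adequate, the z values should resemble a standard normal sample. The sketch below computes these residuals for a hypothetical model and record set and reports the summary statistics (mean, standard deviation, median and a Kolmogorov-Smirnov test) typically inspected in Scherbaum-style rankings; the observations and model predictions are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical observed PGAs (g) and a candidate model's predictions for the
# same records: median prediction (g) and total aleatory sigma in ln units.
obs_pga = np.array([0.12, 0.05, 0.31, 0.08, 0.22, 0.15, 0.04, 0.27])
pred_median = np.array([0.10, 0.06, 0.25, 0.09, 0.18, 0.17, 0.05, 0.20])
sigma_ln = 0.6

# Normalized total residuals: z_i = (ln obs_i - ln median_i) / sigma_ln
z = (np.log(obs_pga) - np.log(pred_median)) / sigma_ln

ks_stat, ks_p = stats.kstest(z, "norm")
print(f"mean(z)   = {z.mean():+.2f}   (0 for an unbiased model)")
print(f"std(z)    = {z.std(ddof=1):.2f}   (1 if sigma is well calibrated)")
print(f"median(z) = {np.median(z):+.2f}")
print(f"KS test vs N(0,1): statistic = {ks_stat:.2f}, p-value = {ks_p:.2f}")
```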

  11. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures

    Science.gov (United States)

    Kalkan, Erol; Chopra, Anil K.

    2010-01-01

    Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.

  12. Automated economic analysis model for hazardous waste minimization

    International Nuclear Information System (INIS)

    Dharmavaram, S.; Mount, J.B.; Donahue, B.A.

    1990-01-01

    The US Army has established a policy of achieving a 50 percent reduction in hazardous waste generation by the end of 1992. To assist the Army in reaching this goal, the Environmental Division of the US Army Construction Engineering Research Laboratory (USACERL) designed the Economic Analysis Model for Hazardous Waste Minimization (EAHWM). The EAHWM was designed to allow the user to evaluate the life cycle costs for various techniques used in hazardous waste minimization and to compare them to the life cycle costs of current operating practices. The program was developed in the C language on an IBM-compatible PC and is consistent with other pertinent models for performing economic analyses. The potential hierarchical minimization categories used in EAHWM include source reduction, recovery and/or reuse, and treatment. Although treatment is no longer an acceptable minimization option, its use is widespread and has therefore been addressed in the model. The model allows for economic analysis for minimization of the Army's six most important hazardous waste streams. These include solvents, paint-stripping wastes, metal plating wastes, industrial waste sludges, used oils, and batteries and battery electrolytes. The EAHWM also includes a general application which can be used to calculate and compare the life cycle costs for minimization alternatives of any waste stream, hazardous or non-hazardous. The EAHWM has been fully tested and implemented in more than 60 Army installations in the United States.

  13. Stress Drops for Oceanic Crust and Mantle Intraplate Earthquakes in the Subduction Zone of Northeastern Japan Inferred from the Spectral Inversion Analysis

    Science.gov (United States)

    Si, H.; Ishikawa, K.; Arai, T.; Ibrahim, R.

    2017-12-01

    Understanding stress drop related to intraplate earthquakes in the subducting plate is very important for seismic hazard mitigation. In previous studies, Kita et al. (2015) analyzed stress drops for intraplate earthquakes under Hokkaido, Northern Japan, using S-coda wave spectral ratio analysis methods, and found that the stress drop for events occurring more than 10 km beneath the upper surface of the subducting plate (within the oceanic mantle) was larger than the stress drop for events occurring within 10 km of the upper surface of the subducting plate (in the oceanic crust). In this study, we focus on intraplate earthquakes that occur under Tohoku, Northeastern Japan, to determine whether similar stress drop differences may exist between earthquakes occurring within the upper 10 km of the subducting plate (within the oceanic crust) and those occurring deeper than 10 km (within the oceanic mantle), based on spectral inversion analysis of seismic waveforms recorded during the earthquakes. We selected 64 earthquakes with focal depths of 49-76 km and Mw 3.5-5.0 that occurred in the source area of the 2003 Miyagi-ken-oki earthquake (Mw 7.0) (region 1), and 82 earthquakes with focal depths of 49-67 km and Mw 3.5-5.5 in the source area of the 2011 Miyagi-ken-oki earthquake (Mw 7.1) (region 2). Records from the target earthquakes at 24 stations in region 1 and 21 stations in region 2 were used in the analysis. A 5-sec time window following S-wave onset was used for each station record. Borehole records of KiK-net station MYGH04 were used as the reference for both regions 1 and 2. We applied the spectral inversion analysis method of Matsunami et al. (2003) separately to regions 1 and 2. Our results show that stress drop generally increases with focal depth and that the stress drops for events occurring deeper than 10 km in the plate (within the oceanic mantle) were larger than those for events occurring within 10 km of the upper surface of the

  14. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data, which generally have known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the relative seismic risk of the world's major cities.
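    A toy version of the counting logic behind the nowcast, assuming a hypothetical, time-ordered list of catalog magnitudes: tabulate the number of "small" events between successive "large" events in the surrounding region, then report where the count accumulated since the last large event falls within that distribution.

        import numpy as np

        def nowcast_score(magnitudes, m_small=3.3, m_large=6.0):
            """Earthquake-potential score: percentile of the current small-event count
            (since the last large event) within the historical distribution of
            small-event counts between successive large events."""
            counts, current = [], 0
            for m in magnitudes:
                if m >= m_large:
                    counts.append(current)
                    current = 0
                elif m >= m_small:
                    current += 1
            if not counts:
                raise ValueError("catalog contains no large events")
            counts = np.array(counts)
            # Fraction of historical inter-event counts at or below the current count.
            return np.mean(counts <= current), current

        # Hypothetical catalog: mostly small events with a few large ones.
        rng = np.random.default_rng(1)
        mags = np.where(rng.random(5000) < 0.002, 6.2, rng.uniform(3.3, 5.0, 5000))
        score, n_small = nowcast_score(mags)
        print(f"current small-event count: {n_small}, nowcast score: {score:.2f}")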

  15. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 25. Parameters for Specifying Intensity-Related Earthquake Ground Motions.

    Science.gov (United States)

    1987-09-01

    [Only fragments of the original report survive in this record: pieces of the reference list (e.g., Sponheuer, W., 1969, Scale of Seismic Intensity, Proc. Fourth World Conf. on Earthquake Engineering, Santiago, Chile; Murphy, J. R., and O'Brien, L., ...) and column headers from a table relating Modified Mercalli intensity to predominant period, duration, and vertical-to-horizontal (V/H) ratios of peak acceleration, velocity, and displacement.]

  16. Flashsourcing or Real-Time Mapping of Earthquake Effects from Instantaneous Analysis of the EMSC Website Traffic

    Science.gov (United States)

    Bossu, R.; Gilles, S.; Roussel, F.

    2010-12-01

    Earthquake response efforts are often hampered by the lack of timely and reliable information on the earthquake impact. Rapid detection of damaging events and production of actionable information for emergency response personnel within minutes of their occurrence are essential to mitigate the human impacts from earthquakes. Economically developed countries deploy dense real-time accelerometric networks in regions of high seismic hazard to constrain scenarios from in-situ data. A cheaper alternative, named flashsourcing, is based on implicit data derived from the analysis of the visits by eyewitnesses, the first informed persons, to websites offering real-time earthquake information. We demonstrated in 2004 that widely felt earthquakes generate a surge of traffic, known as a flashcrowd, caused by people rushing to websites such as the EMSC's to find information about the shaking they have just felt. With detailed traffic analysis and metrics, widely felt earthquakes can be detected within one minute of the earthquake's occurrence. In addition, the geographical area where the earthquake has been felt is automatically mapped within 5 minutes by statistically analysing the IP locations of the eyewitnesses, without using any seismological data. These results have been validated on more than 150 earthquakes by comparing the automatic felt maps with the felt area derived from macroseismic questionnaires. In practice, the felt maps are available before the first location is published by the EMSC. We have also demonstrated the capacity to rapidly detect and map areas of widespread damage by detecting when visitors suddenly end their sessions on the website en masse. This has been successfully applied to time and map the massive power failure which plunged a large part of Chile into darkness in March 2010. If damage to power and communication lines cannot be discriminated from damage to buildings, the absence of sudden session closures precludes the possibility of heavy
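    The surge-detection step can be sketched as a simple rate test on per-minute hit counts, with hypothetical visit timestamps; the operational EMSC system uses more refined traffic metrics and IP geolocation that are not reproduced here.

        import numpy as np

        def detect_flashcrowd(hit_times_s, baseline_minutes=60, n_sigma=5.0):
            """Return the minutes (since start of the record) in which website hits
            exceed the rolling baseline mean by n_sigma standard deviations."""
            minutes = (np.asarray(hit_times_s) // 60).astype(int)
            counts = np.bincount(minutes)
            alarms = []
            for i in range(baseline_minutes, len(counts)):
                base = counts[i - baseline_minutes:i]
                if counts[i] > base.mean() + n_sigma * base.std(ddof=1):
                    alarms.append(i)
            return alarms

        # Hypothetical traffic: steady background plus a surge in minute 90.
        rng = np.random.default_rng(2)
        background = rng.uniform(0, 120 * 60, 6000)      # 2 hours of normal visits
        surge = rng.uniform(90 * 60, 91 * 60, 800)       # felt earthquake -> flashcrowd
        print(detect_flashcrowd(np.concatenate([background, surge])))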

  17. The 13 January 2001 El Salvador earthquake: A multidata analysis

    Science.gov (United States)

    Vallée, Martin; Bouchon, Michel; Schwartz, Susan Y.

    2003-04-01

    On 13 January 2001, a large normal faulting intermediate depth event (Mw = 7.7) occurred 40 km off the El Salvadorian coast (Central America). We analyze this earthquake using teleseismic, regional, and local data. We first build a kinematic source model by simultaneously inverting P and SH displacement waveforms and source time functions derived from surface waves using an empirical Green's function analysis. In an attempt to discriminate between the two nodal planes (30° trenchward dipping and 60° landward dipping), we perform identical inversions using both possible fault planes. After relocating the hypocentral depth at 54 km, we retrieve the kinematic features of the rupture using a combination of the Neighborhood algorithm of [1999] and the Simplex method allowing for variable rupture velocity and slip. We find updip rupture propagation yielding a centroid depth around 47 km for both assumed fault planes, with a larger variance reduction obtained using the 60° landward dipping nodal plane. We test the two possible fault models using regional broadband data and near-field accelerograms provided by [2001]. Near-field data confirm that the steeper landward dipping nodal plane is preferred. Rupture propagated mostly updip and to the northwest, resulting in a main moment release zone of approximately 25 km × 50 km with an average slip of ˜3.5 m. The large slip occurs near the interplate interface at a location where the dip of the slab steepens significantly. The occurrence of this event is well explained by bending of the subducting plate.

  18. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    International Nuclear Information System (INIS)

    Lewis, W.S.

    1994-01-01

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Upon further analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision incorporates the recommendations provided from the original hazards analysis as well. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for Safety Classification of thermal stabilization equipment.

  19. Performance of USGS one-year earthquake hazard map for natural and induced seismicity in the central and eastern United States

    Science.gov (United States)

    Brooks, E. M.; Stein, S.; Spencer, B. D.; Salditch, L.; Petersen, M. D.; McNamara, D. E.

    2017-12-01

    Seismicity in the central United States has dramatically increased since 2008 due to the injection of wastewater produced by oil and gas extraction. In response, the USGS created a one-year probabilistic hazard model and map for 2016 to describe the increased hazard posed to the central and eastern United States. Using the intensity of shaking reported to the "Did You Feel It?" system during 2016, we assess the performance of this model. Assessing the performance of earthquake hazard maps for natural and induced seismicity is conceptually similar but has practical differences. Maps that have return periods of hundreds or thousands of years, as commonly used for natural seismicity, can be assessed using historical intensity data that also span hundreds or thousands of years. Several different features stand out when assessing the USGS 2016 seismic hazard model for the central and eastern United States from induced and natural earthquakes. First, the model can be assessed as a forecast in one year, because event rates are sufficiently high to permit evaluation with one year of data. Second, because these models are projections from the previous year, implicitly assuming that fluid injection rates remain the same, misfit may reflect changes in human activity. Our results suggest that the model was very successful by the metric implicit in probabilistic seismic hazard assessment: namely, that the fraction of sites at which the maximum shaking exceeded the mapped value is comparable to that expected. The model also did well by a misfit metric that compares the spatial patterns of predicted and maximum observed shaking. This was true for both the central and eastern United States as a whole, and for the region within it with the highest amount of seismicity, Oklahoma and its surrounding area. The model performed least well in northern Texas, over-stating hazard, presumably because lower oil and gas prices and regulatory action reduced the water injection volume.
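    The fractional-exceedance metric described above reduces to a simple comparison, sketched here with hypothetical site data: the fraction of sites whose maximum observed shaking exceeded the mapped value should be comparable to the exceedance probability for which the map was drawn.

        import numpy as np

        def fractional_exceedance(observed_max, mapped_value, p_expected):
            """Fraction of sites where the observed maximum shaking exceeded the mapped
            value, returned alongside the fraction implied by the map's probability level."""
            f_observed = np.mean(np.asarray(observed_max) > np.asarray(mapped_value))
            return f_observed, p_expected

        # Hypothetical data: 1000 sites, map drawn for a 1% chance of exceedance in one year.
        rng = np.random.default_rng(3)
        mapped = rng.uniform(0.05, 0.4, 1000)                            # mapped PGA (g)
        observed = mapped * rng.lognormal(mean=-1.0, sigma=0.5, size=1000)
        f_obs, f_exp = fractional_exceedance(observed, mapped, p_expected=0.01)
        print(f"observed fraction {f_obs:.3f} vs expected {f_exp:.3f}")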

  20. Multiple injuries after earthquakes: a retrospective analysis on 1,871 injured patients from the 2008 Wenchuan earthquake.

    Science.gov (United States)

    Lu-Ping, Zhao; Rodriguez-Llanes, Jose Manuel; Qi, Wu; van den Oever, Barbara; Westman, Lina; Albela, Manuel; Liang, Pan; Gao, Chen; De-Sheng, Zhang; Hughes, Melany; von Schreeb, Johan; Guha-Sapir, Debarati

    2012-05-17

    Multiple injuries have been highlighted as an important clinical dimension of the injury profile following earthquakes, but studies are scarce. We investigated the pattern and combination of injuries among patients with two injuries following the 2008 Wenchuan earthquake. We also described the general injury profile, causes of injury and socio-demographic characteristics of the injured patients. A retrospective hospital-based analysis of 1,871 earthquake-injured patients, totaling 3,177 injuries, admitted between 12 and 31 May 2008 to the People's Hospital of Deyang city (PHDC). An electronic, webserver-based database with International Classification of Diseases (ICD)-10-based classification of earthquake-related injury diagnoses (IDs), anatomical sites and additional background variables of the inpatients was used. We analyzed this dataset for injury profile and number of injuries per patient. We then included all patients (856) with two injuries for more in-depth analysis. Possible spatial anatomical associations were determined a priori. Cross-tabulation and more complex frequency matrices for combination analyses were used to investigate the injury profile. Out of the 1,871 injured patients, 810 (43.3%) presented with a single injury. The rest had multiple injuries; 856 (45.8%) had two, 169 (9.0%) patients had three, 32 (1.7%) presented with four injuries, while only 4 (0.2%) were diagnosed with five injuries. The injury diagnoses of patients presenting with two injuries showed important anatomical intra-site or neighboring clustering, which explained 49.1% of the combinations. For fractures, the result was even more marked, as spatial clustering explained 57.9% of the association pattern. The most frequent combination of IDs was a double fracture, affecting 20.7% of the two-injury patients (n = 177). Another 108 patients (12.6%) presented with fractures associated with crush injury and organ-soft tissue injury. Of the 3,177 injuries, 1,476 (46.5%) were

  1. An evaluation of an operating BWR piping system damping during earthquake by applying auto regressive analysis

    International Nuclear Information System (INIS)

    Kitada, Y.; Makiguchi, M.; Komori, A.; Ichiki, T.

    1985-01-01

    The records of three earthquakes which had induced significant earthquake response in the piping system were obtained with the earthquake observation system. In the present paper, first, the eigenvalue analysis results for the piping system, based on the piping support (boundary) conditions, are described, and second, the frequency and damping factor evaluation results for each vibrational mode are described. In the present study, the Auto Regressive (AR) analysis method is used in the evaluation of natural frequencies and damping factors. The AR analysis applied here has a capability of direct evaluation of natural frequencies and damping factors from earthquake records observed on a piping system without any information on the input motions to the system. (orig./HP)
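    The essence of the AR approach, identifying natural frequencies and damping factors directly from a measured response record, can be sketched as a least-squares AR fit whose poles are converted to modal parameters; the model order, sampling interval, and test signal below are hypothetical.

        import numpy as np

        def ar_modal_parameters(x, dt, order=8):
            """Fit an AR(order) model x[n] = sum_k a_k x[n-k] + e[n] by least squares,
            then convert the AR poles to natural frequencies (Hz) and damping ratios."""
            x = np.asarray(x, dtype=float)
            X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
            y = x[order:]
            a, *_ = np.linalg.lstsq(X, y, rcond=None)
            # Poles of the AR model: roots of z^p - a_1 z^(p-1) - ... - a_p.
            poles = np.roots(np.concatenate(([1.0], -a))).astype(complex)
            s = np.log(poles) / dt                       # continuous-time poles
            modes = []
            for si in s:
                if si.imag > 0:                          # keep one of each complex pair
                    wn = abs(si)
                    modes.append((wn / (2 * np.pi), -si.real / wn))  # (freq Hz, damping)
            return sorted(modes)

        # Hypothetical test: a 2 Hz oscillation with 2% damping plus measurement noise.
        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        rng = np.random.default_rng(4)
        sig = (np.exp(-0.02 * 2 * np.pi * 2.0 * t) * np.sin(2 * np.pi * 2.0 * t)
               + 0.01 * rng.standard_normal(t.size))
        print(ar_modal_parameters(sig, dt)[:3])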

  2. Investigation of Pre-Earthquake Ionospheric Disturbances by 3D Tomographic Analysis

    Science.gov (United States)

    Yagmur, M.

    2016-12-01

    Ionospheric variations before earthquakes have been widely discussed phenomena in ionospheric studies. Clarifying the source and mechanism of these phenomena is highly important for earthquake forecasting. To better understand the mechanical and physical processes behind pre-seismic ionospheric anomalies, which might even be related to Lithosphere-Atmosphere-Ionosphere-Magnetosphere coupling, both statistical and 3D modeling analyses are needed. For this purpose, we first investigated the relation between ionospheric TEC anomalies and potential source mechanisms such as space weather activity and lithospheric phenomena like positive surface electric charges. To distinguish their effects on ionospheric TEC, we focused on pre-seismically active days. We then analyzed statistical data for 54 earthquakes of M ≥ 6 between 2000 and 2013, as well as the 2011 Tohoku and the 2016 Kumamoto earthquakes in Japan. By comparing TEC anomalies with solar activity via the Dst index, we found 28 events that might be related to earthquake activity. Following the statistical analysis, we also investigated the lithospheric effect on TEC changes on selected days. Among those days, we chose two case studies, the 2011 Tohoku and the 2016 Kumamoto earthquakes, for which 3D reconstructed images were produced using a 3D tomography technique with neural networks. The results will be shown in our presentation. Keywords: earthquake, 3D ionospheric tomography, positive and negative anomaly, geomagnetic storm, lithosphere

  3. Analysis of the Earthquake Impact towards water-based fire extinguishing system

    Science.gov (United States)

    Lee, J.; Hur, M.; Lee, K.

    2015-09-01

    Recently, separate performance requirements have been imposed on fire-extinguishing systems installed in buildings for the case of an earthquake: the systems must keep functioning before the building itself collapses. In particular, automatic sprinkler systems must maintain the integrity of their piping, without damage, even after a large earthquake. In this study, experiments were carried out to evaluate the effect of earthquakes on the piping of a water-based fire-extinguishing system installed in a building. Test structures for the water-based system were built step by step according to seismic construction practice and then subjected to seismic testing, and the earthquake response of the extinguishing piping in the building was measured. The magnitude of the acceleration applied by the vibration and the resulting displacements were measured and compared with the response data of the piping from the shaking table, in order to assess the need for seismic strengthening of the extinguishing water piping. Seismic design categories (SDC) are defined for four groups of building structures designed according to the seismic criteria (KBC2009), based on the importance of the group and the seismic intensity. For buildings in seismic design categories A and B, the analysis indicated that current fire-fighting facilities would retain their seismic performance in a real earthquake. For buildings in seismic design categories C and D, a level of seismic retrofit design is required to preserve the extinguishing function of the installed systems.

  4. The 2011 Mineral, Virginia, earthquake and its significance for seismic hazards in eastern North America: overview and synthesis

    Science.gov (United States)

    Horton, J. Wright; Chapman, Martin C.; Green, Russell A.

    2015-01-01

    The 23 August 2011 Mw (moment magnitude) 5.7 ± 0.1, Mineral, Virginia, earthquake was the largest and most damaging in the central and eastern United States since the 1886 Mw 6.8–7.0, Charleston, South Carolina, earthquake. Seismic data indicate that the earthquake rupture occurred on a southeast-dipping reverse fault and consisted of three subevents that progressed northeastward and updip. U.S. Geological Survey (USGS) "Did You Feel It?" intensity reports from across the eastern United States and southeastern Canada, rockfalls triggered at distances of up to 245 km, and regional groundwater-level changes are all consistent with efficient propagation of high-frequency seismic waves (∼1 Hz and higher) in eastern North America due to low attenuation.

  5. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    Science.gov (United States)

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  6. Earthquake scenario and probabilistic ground-shaking hazard maps for the Albuquerque-Belen-Santa Fe, New Mexico, corridor

    Science.gov (United States)

    Wong, I.; Olig, S.; Dober, M.; Silva, W.; Wright, D.; Thomas, P.; Gregor, N.; Sanford, A.; Lin, K.-W.; Love, D.

    2004-01-01

    New Mexico's population is concentrated along the corridor that extends from Belen in the south to Española in the north and includes Albuquerque and Santa Fe. The Rio Grande rift, which encompasses the corridor, is a major tectonically, volcanically, and seismically active continental rift in the western U.S. Although only one large earthquake (moment magnitude (M) ≥ 6) has possibly occurred in the New Mexico portion of the rift since 1849, paleoseismic data indicate that prehistoric surface-faulting earthquakes of M 6.5 and greater have occurred on average every 400 yrs on many faults throughout the Rio Grande rift.
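    As a worked illustration of what an average recurrence interval of roughly 400 years implies, the Poisson probability of at least one such surface-faulting earthquake in an exposure window of t years is 1 - exp(-t/T); the 50-year window below is a common but arbitrary choice.

        import math

        def poisson_probability(recurrence_yr, window_yr):
            """Probability of at least one event in the window, assuming a Poisson process."""
            return 1.0 - math.exp(-window_yr / recurrence_yr)

        # ~400-yr average recurrence, 50-yr exposure window (hypothetical choice).
        print(f"{poisson_probability(400.0, 50.0):.1%}")   # about 12%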

  7. Frequency spectrum method-based stress analysis for oil pipelines in earthquake disaster areas.

    Directory of Open Access Journals (Sweden)

    Xiaonan Wu

    Full Text Available When a long-distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and the theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine whether the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary axial, longitudinal and horizontal displacement directions under seismic action and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. The designer is able to utilize this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas, thereby improving the safe operation of the pipeline.

  8. Frequency spectrum method-based stress analysis for oil pipelines in earthquake disaster areas.

    Science.gov (United States)

    Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao

    2015-01-01

    When a long-distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and the theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine whether the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary axial, longitudinal and horizontal displacement directions under seismic action and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. The designer is able to utilize this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas, thereby improving the safe operation of the pipeline.

  9. Hazardous-waste analysis plan for LLNL operations

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Key to any successful waste management program is addressing the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan will address methodologies used for the Analysis of Hazardous Waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  10. Fire hazards analysis for the uranium oxide (UO3) facility

    International Nuclear Information System (INIS)

    Wyatt, D.M.

    1994-01-01

    The Fire Hazards Analysis (FHA) documents the deactivation end-point status of the UO3 complex fire hazards, fire protection and life safety systems. This FHA has been prepared for the Uranium Oxide Facility by Westinghouse Hanford Company in accordance with the criteria established in DOE 5480.7A, Fire Protection and RLID 5480.7, Fire Protection. The purpose of the Fire Hazards Analysis is to comprehensively and quantitatively assess the risk from a fire within individual fire areas in a Department of Energy facility so as to ascertain whether the objectives stated in DOE Order 5480.7, paragraph 4 are met. Particular attention has been paid to RLID 5480.7, Section 8.3, which specifies the criteria for deactivating fire protection in decommission and demolition facilities.

  11. Hazardous-waste analysis plan for LLNL operations

    International Nuclear Information System (INIS)

    Roberts, R.S.

    1982-01-01

    The Lawrence Livermore National Laboratory is involved in many facets of research ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Key to any successful waste management program is addressing the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan will address methodologies used for the Analysis of Hazardous Waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste

  12. Landslide hazards and systems analysis: A Central European perspective

    Science.gov (United States)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments using the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany enables to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures or rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits starts to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts to risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  13. 340 Waste handling Facility Hazard Categorization and Safety Analysis

    International Nuclear Information System (INIS)

    Rodovsky, T.J.

    2010-01-01

    The final hazard categorization for the deactivated 340 Waste Handling Facility (340 Facility) is presented in this document. This hazard categorization was prepared in accordance with DOE-STD-1027-92, Change Notice 1, Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports. The analysis presented in this document provides the basis for categorizing the facility as less than Hazard Category (HC) 3. Routine nuclear waste receiving, storage, handling, and shipping operations at the 340 Facility have been deactivated; however, the facility contains a small amount of radioactive liquid and/or dry saltcake in two underground vault tanks. A seismic event and hydrogen deflagration were selected as bounding accidents. The generation of hydrogen in the vault tanks without active ventilation was determined to reach a steady-state concentration of 0.33% by volume, which is significantly less than the lower flammability limit of 4%. Therefore, a hydrogen deflagration is not possible in these tanks. The unmitigated release from a seismic event was used to categorize the facility, consistent with the process defined in Nuclear Safety Technical Position (NSTP) 2002-2. The final sum-of-fractions calculation concluded that the facility is less than HC 3. The analysis did not identify any required engineered controls or design features. The Administrative Controls derived from the analysis are: (1) radiological inventory control, (2) facility change control, and (3) Safety Management Programs (SMPs). The facility configuration and radiological inventory shall be controlled to ensure that the assumptions in the analysis remain valid. The facility commitment to SMPs protects the integrity of the facility and environment by ensuring training, emergency response, and radiation protection. The full scale
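    The sum-of-fractions screening mentioned above can be sketched as follows; the inventories and Hazard Category 3 threshold quantities used here are entirely hypothetical placeholders, whereas the actual calculation uses the facility inventory records and the threshold values of DOE-STD-1027-92.

        def sum_of_fractions(inventory_ci, threshold_ci):
            """Sum of (radionuclide inventory / threshold quantity); a facility screens
            below the hazard category if the sum is less than 1."""
            return sum(inventory_ci[n] / threshold_ci[n] for n in inventory_ci)

        # Hypothetical inventories and threshold quantities (curies), for illustration only.
        inventory = {"Cs-137": 1.2, "Sr-90": 0.8, "Pu-239": 0.001}
        threshold = {"Cs-137": 60.0, "Sr-90": 22.0, "Pu-239": 0.52}
        sof = sum_of_fractions(inventory, threshold)
        print(f"sum of fractions = {sof:.3f} -> {'below' if sof < 1 else 'at or above'} the category threshold")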

  14. LOCAL SITE CONDITIONS INFLUENCING EARTHQUAKE INTENSITIES AND SECONDARY COLLATERAL IMPACTS IN THE SEA OF MARMARA REGION - Application of Standardized Remote Sensing and GIS-Methods in Detecting Potentially Vulnerable Areas to Earthquakes, Tsunamis and Other Hazards.

    Directory of Open Access Journals (Sweden)

    George Pararas-Carayannis

    2011-01-01

    Full Text Available The destructive earthquake that struck near the Gulf of Izmit along the North Anatolian fault in Northwest Turkey on August 17, 1999, not only generated a local tsunami that was destructive at Golcuk and other coastal cities in the eastern portion of the enclosed Sea of Marmara, but was also responsible for extensive damage from collateral hazards such as subsidence, landslides, ground liquefaction, soil amplification, compaction and underwater slumping of unconsolidated sediments. This disaster brought attention to the need to identify, in this highly populated region, local conditions that enhance earthquake intensities, tsunami run-up and other collateral disaster impacts. The focus of the present study is to illustrate briefly how standardized remote sensing techniques and GIS methods can help detect areas that are potentially vulnerable, so that disaster mitigation strategies can be implemented more effectively. Apparently, local site conditions exacerbate earthquake intensities and collateral disaster destruction in the Marmara Sea region. However, using remote sensing data, the causal factors can be determined systematically. With proper evaluation of satellite imagery and digital topographic data, specific geomorphologic/topographic settings that enhance disaster impacts can be identified. With a systematic GIS approach, based on Digital Elevation Model (DEM) data, geomorphometric parameters that influence the local site conditions can be determined. Digital elevation data, such as SRTM (Shuttle Radar Topography Mission) data with 90 m spatial resolution and ASTER data with 30 m resolution, interpolated up to 15 m, are readily available. Areas with the steepest slopes can be identified from slope gradient maps. Areas with highest curvatures susceptible to landslides can be identified from curvature maps. Coastal areas below the 10 m elevation susceptible to tsunami inundation can be clearly delineated. Height level maps can also help locate
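    The DEM-based screening described above can be sketched with a hypothetical elevation grid: compute slope from finite differences to flag landslide-susceptible cells, and flag low-lying cells below the 10 m threshold as potentially exposed to tsunami inundation.

        import numpy as np

        def dem_screening(dem, cell_size_m, slope_threshold_deg=30.0, inundation_level_m=10.0):
            """Boolean masks for steep cells (landslide-susceptible) and low-lying cells
            (tsunami-inundation-susceptible) derived from a digital elevation model."""
            dz_dy, dz_dx = np.gradient(dem, cell_size_m)
            slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
            return slope_deg > slope_threshold_deg, dem < inundation_level_m

        # Hypothetical 90 m grid: a flat coastal strip rising steeply inland.
        x = np.linspace(0.0, 5000.0, 56)
        dem = np.tile(np.maximum(0.0, 0.7 * (x - 1500.0)), (56, 1))
        steep, low_lying = dem_screening(dem, cell_size_m=90.0)
        print(f"steep cells: {steep.sum()}, low-lying cells: {low_lying.sum()}")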

  15. Seismic response analysis of Wolsung NPP structure and equipment subjected to scenario earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Choi, In Kil; Ahn, Seong Moon; Choun, Young Sun; Seo, Jeong Moon

    2005-03-15

    The standard response spectrum proposed by the US NRC has been used as the design earthquake for Korean nuclear power plant structures. However, it does not reflect the seismological and geological characteristics of Korea. In this study, seismic response analyses of the Wolsung NPP structure and equipment were performed. Three types of input motion were used: artificial time histories that envelop the US NRC Regulatory Guide 1.60 spectrum, probability-based scenario earthquake spectra developed for the Korean NPP site, and a typical near-fault earthquake recorded at thirty sites. The acceleration, displacement and shear force responses of the Wolsung containment structure due to the design earthquake were larger than those due to the other input earthquakes. However, because the displacement response increases abruptly once the Wolsung NPP structure behaves nonlinearly, a reassessment of the seismic safety margin based on displacement is necessary if the structure behaves nonlinearly, even though it has an adequate seismic safety margin within the elastic limit. Among the main safety-related devices, the electrical cabinets and pumps showed large responses to the scenario earthquake, which has high-frequency content. This strongly affects the seismic capacity of the main devices installed inside the building, and it means that the design earthquake is not necessarily conservative for the safety of safety-related nuclear power plant equipment.

  16. Seismic response analysis of Wolsung NPP structure and equipment subjected to scenario earthquakes

    International Nuclear Information System (INIS)

    Choi, In Kil; Ahn, Seong Moon; Choun, Young Sun; Seo, Jeong Moon

    2005-03-01

    The standard response spectrum proposed by the US NRC has been used as the design earthquake for Korean nuclear power plant structures. However, it does not reflect the seismological and geological characteristics of Korea. In this study, seismic response analyses of the Wolsung NPP structure and equipment were performed. Three types of input motion were used: artificial time histories that envelop the US NRC Regulatory Guide 1.60 spectrum, probability-based scenario earthquake spectra developed for the Korean NPP site, and a typical near-fault earthquake recorded at thirty sites. The acceleration, displacement and shear force responses of the Wolsung containment structure due to the design earthquake were larger than those due to the other input earthquakes. However, because the displacement response increases abruptly once the Wolsung NPP structure behaves nonlinearly, a reassessment of the seismic safety margin based on displacement is necessary if the structure behaves nonlinearly, even though it has an adequate seismic safety margin within the elastic limit. Among the main safety-related devices, the electrical cabinets and pumps showed large responses to the scenario earthquake, which has high-frequency content. This strongly affects the seismic capacity of the main devices installed inside the building, and it means that the design earthquake is not necessarily conservative for the safety of safety-related nuclear power plant equipment.

  17. Analysis of source spectra, attenuation, and site effects from central and eastern United States earthquakes

    International Nuclear Information System (INIS)

    Lindley, G.

    1998-02-01

    This report describes the results from three studies of source spectra, attenuation, and site effects of central and eastern United States earthquakes. In the first study, source parameter estimates taken from 27 previous studies were combined to test the assumption that the earthquake stress drop is roughly a constant, independent of earthquake size. 200 estimates of stress drop and seismic moment from eastern North American earthquakes were combined. It was found that the estimated stress drop from the 27 studies increases approximately as the square root of the seismic moment, from about 3 bars at 10^20 dyne-cm to 690 bars at 10^25 dyne-cm. These results do not support the assumption of a constant stress drop when estimating ground motion parameters from eastern North American earthquakes. In the second study, broadband seismograms recorded by the United States National Seismograph Network and cooperating stations have been analysed to determine Q_Lg as a function of frequency in five regions: the northeastern US, southeastern US, central US, northern Basin and Range, and California and western Nevada. In the third study, using spectral analysis, estimates have been made for the anelastic attenuation of four regional phases, and estimates have been made for the source parameters of 27 earthquakes, including the m_b 5.6, 14 April 1995, West Texas earthquake.
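    The quoted scaling can be reproduced schematically by a log-log regression of stress drop on seismic moment; the two anchor values cited above already imply a slope close to one half.

        import numpy as np

        # Anchor values quoted in the study: ~3 bars at 1e20 dyne-cm, ~690 bars at 1e25 dyne-cm.
        log_m0 = np.array([20.0, 25.0])
        log_sd = np.log10([3.0, 690.0])
        slope, intercept = np.polyfit(log_m0, log_sd, 1)
        print(f"log10(stress drop) ~ {slope:.2f} * log10(M0) + {intercept:.1f}")  # slope ~ 0.47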

  18. ANALYSIS OF REGULARITIES IN DISTRIBUTION OF EARTHQUAKES BY FOCAL DISPLACEMENT IN THE KURIL-OKHOTSK REGION BEFORE THE CATASTROPHIC SIMUSHIR EARTHQUAKE OF 15 NOVEMBER 2006

    Directory of Open Access Journals (Sweden)

    Timofei K. Zlobin

    2012-01-01

    Full Text Available The catastrophic Simushir earthquake occurred on 15 November 2006 in the Kuril-Okhotsk region in the Middle Kuril Islands, which is a transition zone between the Eurasian continent and the Pacific Ocean. It was followed by numerous strong earthquakes. It is established that the catastrophic earthquake was prepared on a site characterized by increased relative effective pressures, located at the border of the low-pressure area (Figure 1). Based on data from GlobalCMT (Harvard), earthquake focal mechanisms were reconstructed, and tectonic stresses, the seismotectonic setting and the earthquake distribution pattern were studied for analysis of the stress field in the region before the Simushir earthquake (Figures 2 and 3; Table 1). Five areas of various types of movement were determined. Three of them are stretched along the Kuril Islands. It is established that seismodislocations in earthquake focal areas are regularly distributed. In each of the determined areas, displacements of a specific type (shear or reverse shear) are concentrated and give evidence of the alternation of zones characterized by horizontal stretching and compression. The presence of the horizontal stretching and compression zones can be explained by a model of subduction (Figure 4). Detailed studies of the state of stress of the Kuril region confirm such zones (Figure 5). The established specific features of tectonic stresses before the catastrophic Simushir earthquake of 15 November 2006 contribute to studies of earthquake forecasting problems. The state of stress and the geodynamic conditions suggesting occurrence of new earthquakes can be assessed from the data on the distribution of horizontal compression, stretching and shear areas of the Earth's crust and the upper mantle in the Kuril region.

  19. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  20. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; m_b = 5.8) is still, 25 years later, one of the most painful events, deeply etched into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damages (561 dead, 10,000 injured, and 3000 families who lost their homes). Nowadays, the most frequent and important question is "what if this earthquake were repeated today?" In this study, we simulate the ground-motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damages. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo including economic and social losses. Generally, the earthquake risk assessment clearly indicates that the losses and damages may increase two- or three-fold in Cairo compared to the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimations reflect that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb) districts. Moreover, about 75% of casualties are in the same districts. Actually, an earthquake risk assessment for Cairo represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  1. Comprehensive analysis of earthquake source spectra in southern California

    OpenAIRE

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...
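    For context, Brune-type stress drops follow from the seismic moment and corner frequency; one common form of the relation, assumed here together with hypothetical event parameters, uses a source radius r = 0.372*beta/fc and stress drop 7*M0/(16*r^3).

        def brune_stress_drop(m0_nm, fc_hz, beta_ms=3500.0):
            """Brune-type stress drop (Pa) from seismic moment M0 (N*m), corner frequency
            fc (Hz), and shear-wave speed beta (m/s), assuming r = 0.372 * beta / fc."""
            radius = 0.372 * beta_ms / fc_hz
            return 7.0 * m0_nm / (16.0 * radius ** 3)

        # Hypothetical small event: M0 ~ 7e12 N*m, fc ~ 8 Hz -> roughly 0.7 MPa,
        # within the 0.2-20 MPa range reported above.
        print(f"{brune_stress_drop(7e12, 8.0) / 1e6:.2f} MPa")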

  2. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    Energy Technology Data Exchange (ETDEWEB)

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  3. Fire Hazards Analysis for the Inactive Equipment Storage Sprung Structure

    International Nuclear Information System (INIS)

    MYOTT, C.F.

    2000-01-01

    The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection, so as to ascertain whether the fire protection objectives of DOE Order 5480.1A are met. The order acknowledges a graded approach commensurate with the hazards involved

  4. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    International Nuclear Information System (INIS)

    JOHNSON, B.H.

    1999-01-01

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met

  5. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    Science.gov (United States)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB), which provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy

  6. Probabilistic seismic hazard analysis - lessons learned: A regulator's perspective

    International Nuclear Information System (INIS)

    Reiter, L.

    1990-01-01

    Probabilistic seismic hazard analysis is a powerful, rational and attractive tool for decision-making. It is capable of absorbing and integrating a wide range of information and judgement and their associated uncertainties into a flexible framework that permits the application of societal goals and priorities. Unfortunately, its highly integrative nature can obscure those elements which drive the results, its highly quantitative nature can lead to false impressions of accuracy, and its open embrace of uncertainty can make decision-making difficult. Addressing these problems can only help to increase its use and make it more palatable to those who need to assess seismic hazard and utilize the results. (orig.)

  7. Market mechanisms for compensating hazardous work: a critical analysis

    International Nuclear Information System (INIS)

    Shakow, D.

    1984-01-01

    Adam Smith's theory that the marketplace can compensate workers for social inequities (i.e., hazards, boredom, etc.) in the work place is applied to the nuclear industry. The author argues that market mechanisms are unlikely to ensure adequate compensation for work-related hazards. He summarizes and critiques the neoclassical compensating-wage hypothesis, then reviews empirical evidence in support of the hypothesis in light of an alternative hypothesis derived from the literature on labor market segmentation. He challenges the assumption of perfect labor mobility and perfect information. A promising direction for further research would be a structural analysis of the emerging market for temporary workers. 13 references, 2 figures

  8. ANALYSIS OF LABOUR ACCIDENTS OCCURRING IN DISASTER RESTORATION WORK FOLLOWING THE NIIGATA CHUETSU EARTHQUAKE (2004) AND THE NIIGATA CHUETSU-OKI EARTHQUAKE (2007)

    Science.gov (United States)

    Itoh, Kazuya; Noda, Masashi; Kikkawa, Naotaka; Hori, Tomohito; Tamate, Satoshi; Toyosawa, Yasuo; Suemasa, Naoaki

    Labour accidents in disaster-relief and disaster restoration work following the Niigata Chuetsu Earthquake (2004) and the Niigata Chuetsu-oki Earthquake (2007) were analysed and characterised in order to raise awareness of the risks and hazards in such work. The Niigata Chuetsu-oki Earthquake affected houses and buildings rather than roads and railways, which are generally disrupted due to landslides or slope failures caused by earthquakes. In this scenario, the predominant type of accident is a "fall to lower level," which increases mainly due to the fact that labourers are working to repair houses and buildings. On the other hand, landslides and slope failures were much more prevalent in the Niigata Chuetsu Earthquake, resulting in more accidents occurring in geotechnical works rather than in construction works. Therefore, care should be taken in preventing "fall to lower level" accidents associated with repair work on the roofs of low-rise houses, "cut or abrasion" accidents due to the demolition of damaged houses and "caught in or compressed by equipment" accidents in road works and water and sewage works.

  9. Hazard analysis of Clostridium perfringens in the Skylab Food System

    Science.gov (United States)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
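    The hazard limit rests on a simple exponential-growth calculation: with a measured generation (doubling) time g at the holding temperature, the population after t hours is N(t) = N0 * 2^(t/g). A small sketch with hypothetical numbers:

        def population_after(n0_per_g, hours, generation_time_h):
            """Vegetative cell count per gram after a holding period, assuming
            unrestricted exponential growth with the measured generation time."""
            return n0_per_g * 2.0 ** (hours / generation_time_h)

        # Hypothetical: 10 cells/g initially, 0.5 h generation time, 2 h warming period.
        print(population_after(10.0, 2.0, 0.5))   # 160 cells/g -> above a 100/g limit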

  10. Seismic hazard assessment: Issues and alternatives

    Science.gov (United States)

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used inter-changeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
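    To make the probabilistic side of the comparison concrete, a bare-bones hazard-curve calculation for a single source is sketched below: a truncated Gutenberg-Richter magnitude distribution is combined with a lognormal ground-motion model to give an annual exceedance rate for a target PGA. The recurrence parameters and the toy attenuation coefficients are hypothetical, not taken from any published model.

        import numpy as np
        from scipy.stats import norm

        def annual_exceedance_rate(a_target_g, rate_m_min, b_value, m_min, m_max,
                                   distance_km, sigma_ln=0.6, n_m=200):
            """lambda(PGA > a) for one source at a fixed distance: integrate
            P(PGA > a | m, r) over a truncated Gutenberg-Richter magnitude pdf."""
            m = np.linspace(m_min, m_max, n_m)
            beta = b_value * np.log(10.0)
            # Truncated exponential (Gutenberg-Richter) magnitude pdf.
            pdf = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
            # Toy ground-motion model (hypothetical coefficients): ln PGA in g.
            ln_median = -3.5 + 0.9 * m - 1.2 * np.log(distance_km + 10.0)
            p_exceed = norm.sf((np.log(a_target_g) - ln_median) / sigma_ln)
            return rate_m_min * np.trapz(pdf * p_exceed, m)

        # Hypothetical source: 0.05 events/yr with M >= 5, b = 1, Mmax = 7, site 20 km away.
        for a in (0.05, 0.1, 0.2, 0.4):
            lam = annual_exceedance_rate(a, rate_m_min=0.05, b_value=1.0,
                                         m_min=5.0, m_max=7.0, distance_km=20.0)
            print(f"PGA > {a:.2f} g : {lam:.2e} /yr")

    Repeating the integral over many target levels yields the hazard curve; a DSHA-style calculation would instead fix a controlling scenario (magnitude and distance) and report its ground motion directly.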

  11. Probabilistic aftershock hazard analysis, two case studies in West and Northwest Iran

    Science.gov (United States)

    Ommi, S.; Zafarani, H.

    2018-01-01

    Aftershock hazard maps contain essential information for the search and rescue process and for re-occupation after a main-shock. Accordingly, the main purposes of this article are to study the aftershock decay parameters and to estimate the expected high-frequency ground motions (i.e., Peak Ground Acceleration (PGA)) for recent large earthquakes in the Iranian plateau. To this end, the Ahar-Varzaghan doublet earthquake (August 11, 2012; MN = 6.5 and MN = 6.3) and the Ilam (Murmuri) earthquake (August 18, 2014; MN = 6.2) have been selected. The earthquake catalogue has been collected based on the Gardner and Knopoff (Bull Seismol Soc Am 64(5), 1363-1367, 1974) temporal and spatial windowing technique. The magnitude of completeness, the seismicity parameters (a, b) and the modified Omori law parameters (P, K, C) have been determined for these two earthquakes in the 14, 30, and 60 days after the mainshocks. The temporal changes of the parameters (a, b, P, K, C) have also been studied. The aftershock hazard maps for a probability of exceedance of 33% have been computed for the time periods of 14, 30, and 60 days after the Ahar-Varzaghan and Ilam (Murmuri) earthquakes. For calculating the expected PGA of aftershocks, regional and global ground motion prediction equations have been utilized. An amplification factor based on the site classes has also been applied in the calculation of PGA. These aftershock hazard maps show agreement between the PGAs of large aftershocks and the forecasted PGAs. The significant role of the b parameter in the Ilam (Murmuri) probabilistic aftershock hazard maps has also been investigated.
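
    As a reminder of how the (P, K, C) and (a, b) parameters enter such a forecast, the short sketch below evaluates the modified Omori law and its time integral to get expected aftershock counts for 14-, 30- and 60-day windows, and then uses a Gutenberg-Richter b-value to estimate the fraction of those events above a given magnitude. All parameter values are illustrative assumptions, not the values estimated in the paper.

    ```python
    import numpy as np

    # Modified Omori law: aftershock rate n(t) = K / (t + c)**p, with t in days after the mainshock.
    # The parameter values below are purely illustrative, not those estimated in the paper.
    K, c, p = 120.0, 0.05, 1.1

    def expected_count(t1, t2):
        """Expected number of aftershocks above the completeness magnitude in [t1, t2] days."""
        if np.isclose(p, 1.0):
            return K * (np.log(t2 + c) - np.log(t1 + c))
        return K * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p)) / (1.0 - p)

    for window in (14, 30, 60):
        print(f"0-{window} d: expected aftershocks above Mc = {expected_count(0.0, window):.0f}")

    # Gutenberg-Richter partitioning: fraction of those events with magnitude >= m
    b_value, m_c, m = 1.0, 2.5, 5.0       # illustrative b-value, completeness and target magnitude
    print(f"Fraction with M >= {m}: {10 ** (-b_value * (m - m_c)):.1e}")
    ```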

  12. Frequency Analysis of Aircraft hazards for License Application

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in "Identification of Aircraft Hazards" (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in "Identification of Aircraft Hazards" (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.
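
    For orientation, crash frequencies for overflights along an airway are commonly screened with a four-factor formula of the NUREG-0800 (Section 3.5.1.6) type, F = C x N x A / w. The sketch below evaluates it with purely illustrative placeholder numbers; none of the values come from the Yucca Mountain analysis.

    ```python
    # Airway-overflight screening formula of the NUREG-0800 type: F = C * N * A / w.
    # Every value below is an illustrative placeholder, not a number from this analysis.
    C = 4e-10       # in-flight crash rate per aircraft-mile
    N = 50_000      # flights per year through the corridor
    A = 0.02        # effective target area of the surface facilities, square miles
    w = 10.0        # corridor width, miles

    F = C * N * A / w
    print(f"Estimated crash frequency: {F:.2e} per year")
    print("Below a 1e-6/yr screening level" if F < 1e-6 else "Needs detailed analysis")
    ```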

  13. A review on earthquake and tsunami hazards of the Sumatran plate boundary: Observing expected and unexpected events after the Aceh-Andaman Mw 9.15 event

    Science.gov (United States)

    Natawidjaja, D.

    2013-12-01

    The 600-km Mentawai megathrust produced two giant historical earthquakes that generated large tsunamis in 1797 and 1833. The SuGAr (Sumatran GPS continuous Array) network, first deployed in 2002, shows that the subduction interface underlying the Mentawai Islands and the neighboring Nias section to the north is fully locked, thus confirming their potential hazards. Outreach activities to warn people about earthquake and tsunami hazards had begun four months before the 26 December 2004 Aceh-Andaman earthquake (Mw 9.15). In March 2005, the expected megathrust earthquake (Mw 8.7) hit the Nias-Simeulue area and killed about 2000 people, releasing the strain accumulated since the previous 1861 event (~Mw 8.5). Since then, many Mw 7 and smaller events have occurred in Sumatra, filling areas between and around the two giant ruptures and heightening seismicity in neighboring areas. In March 2007, a twin earthquake disaster (Mw 6.3 and Mw 6.4) broke two consecutive segments of the transcurrent Sumatran fault in the Lake Singkarak area. Only six months later, in September 2007, the rapid-fire failure of three consecutive megathrust patches (Mw 8.5, Mw 7.9 and Mw 7.0) ruptured a 250-km section of the southern Mentawai. This was a big surprise, since this particular section had been predicted to be very weakly coupled from modelling of the SuGAr data; the ruptures thereby bypassed the more hazardous, fully coupled section of the Mentawai lying between the 2005 and 2007 ruptures. In September 2009, a rare, unexpected event (Mw 7.6) suddenly ruptured an intracrustal fault in the subducted slab beneath Padang City and killed about 500 people. Padang had been preparing for the next tsunami, but not for strong shaking from a nearby major earthquake. This event seems to have remotely triggered another Mw 6.7 event on the Sumatran fault near Kerinci Lake, a few hundred kilometers south of Padang, in less than a day. Just a year later, in November 2010, again an unexpected large slow-slip event of

  14. Far field tsunami simulations of the 1755 Lisbon earthquake: Implications for tsunami hazard to the U.S. East Coast and the Caribbean

    Science.gov (United States)

    Barkan, R.; ten Brink, Uri S.; Lin, J.

    2009-01-01

    The great Lisbon earthquake of November 1st, 1755 with an estimated moment magnitude of 8.5-9.0 was the most destructive earthquake in European history. The associated tsunami run-up was reported to have reached 5-15 m along the Portuguese and Moroccan coasts and the run-up was significant at the Azores and Madeira Island. Run-up reports from a trans-oceanic tsunami were documented in the Caribbean, Brazil and Newfoundland (Canada). No reports were documented along the U.S. East Coast. Many attempts have been made to characterize the 1755 Lisbon earthquake source using geophysical surveys and modeling the near-field earthquake intensity and tsunami effects. Studying far field effects, as presented in this paper, is advantageous in establishing constraints on source location and strike orientation because trans-oceanic tsunamis are less influenced by near source bathymetry and are unaffected by triggered submarine landslides at the source. Source location, fault orientation and bathymetry are the main elements governing transatlantic tsunami propagation to sites along the U.S. East Coast, much more than distance from the source and continental shelf width. Results of our far and near-field tsunami simulations based on relative amplitude comparison limit the earthquake source area to a region located south of the Gorringe Bank in the center of the Horseshoe Plain. This is in contrast with previously suggested sources such as Marquês de Pombal Fault, and Gulf of Cádiz Fault, which are farther east of the Horseshoe Plain. The earthquake was likely to be a thrust event on a fault striking ~345° and dipping to the ENE as opposed to the suggested earthquake source of the Gorringe Bank Fault, which trends NE-SW. Gorringe Bank, the Madeira-Tore Rise (MTR), and the Azores appear to have acted as topographic scatterers for tsunami energy, shielding most of the U.S. East Coast from the 1755 Lisbon tsunami. Additional simulations to assess tsunami hazard to the U.S. East
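
    As a rough sense of the trans-Atlantic propagation times involved, long-wave (shallow-water) theory gives a phase speed of sqrt(g*h); the sketch below applies it with assumed, representative depth and distance values, which are not taken from the simulations in the paper.

    ```python
    import math

    # Long-wave phase speed c = sqrt(g * h); depth and distance are rough, assumed values
    # for an Atlantic crossing, not quantities from the paper.
    g = 9.81
    depth_m = 4000.0                 # representative open-ocean depth
    distance_km = 5500.0             # roughly Iberia to the U.S. East Coast

    speed = math.sqrt(g * depth_m)   # about 200 m/s
    hours = distance_km * 1e3 / speed / 3600.0
    print(f"Phase speed ~{speed:.0f} m/s, crossing time ~{hours:.1f} h")
    ```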

  15. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system

    Science.gov (United States)

    Takarada, S.

    2012-12-01

    The first Workshop of the Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The Accord consists of 10 recommendations, such as enhancing collaboration, sharing resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked with developing the next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high-quality volcanic eruption database, which contains compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which only represent a subset of possible future scenarios. Hence, different distributions from the previous deposits are mainly observed due to the differences in

  16. Multi-hazard risk analysis for management strategies

    Science.gov (United States)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. However, these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and the modeling approaches, as well as the incoherencies that arise when combining all these different aspects. Based on this concept, a flexible software package will be established, with ArcGIS as the central base, complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flow, rockfall, landslide, avalanche and flood are planned to be included in the tool for regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  17. Epidemiological analysis of trauma patients following the Lushan earthquake.

    Directory of Open Access Journals (Sweden)

    Li Zhang

    Full Text Available BACKGROUND: A 7.0-magnitude earthquake hit Lushan County in China's Sichuan province on April 20, 2013, resulting in 196 deaths and 11,470 injured. This study was designed to analyze the characteristics of the injuries and the treatment of the seismic victims. METHODS: After the earthquake, an epidemiological survey of injured patients was conducted by the Health Department of Sichuan Province. Epidemiological survey tools included paper-and-pencil questionnaires and a data management system based on the Access Database. Questionnaires were completed based on the medical records of inpatients with earthquake-related injuries. Outpatients and non-seismic injured inpatients were excluded. A total of 2010 patients from 140 hospitals were included. RESULTS: The most common type of injury was bone fracture (58.3%). Children younger than 10 years of age suffered fewer fractures and chest injuries, but more skin and soft-tissue injuries. Patients older than 80 years were more likely to suffer hip and thigh fractures, pelvis fractures, and chest injuries, whereas adult patients suffered more ankle and foot fractures. A total of 207 cases of calcaneal fracture were due to falls from height related to extreme panic. The most common type of infection in hospitalized patients was pulmonary infection. A total of 70.5% of patients had limb dysfunction, and 60.1% of this group received rehabilitation. Most patients received rehabilitation within 1 week, and the median duration of rehabilitation was 3 weeks. The cause of death of all seven hospitalized patients who died was severe traumatic brain injury; five of this group died within 24 h after the earthquake. CONCLUSIONS: Injuries varied as a function of the age of the victim. As many injuries were indirectly caused by the Lushan earthquake, disaster education is urgently needed to avoid secondary injuries.

  18. Slope instabilities triggered by the 2011 Lorca earthquake (Mw 5.1): a comparison and revision of hazard assessments of earthquake-triggered landslides in Murcia; Inestabilidades de ladera provocadas por el terremoto de Lorca de 2011 (Mw 5,1): comparacion y revision de estudios de peligrosidad de movimientos de ladera por efecto sismico en Murcia

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez-Peces, M. J.; Garcia-Mayordomo, J.; Martinez-Diaz, J. J.; Tsige, M.

    2012-11-01

    The Lorca basin has been the object of recent research aimed at studying the phenomenon of earthquake-induced landslides and their assessment within the context of different seismic scenarios, bearing in mind the influence of soil and topographical amplification effects. Nevertheless, it was not until the Lorca earthquakes of 11 May 2011 that it became possible to adopt a systematic approach to the problem. We provide here an inventory of slope instabilities triggered by the Lorca earthquakes comprising 100 cases, mainly small rock and soil falls (1 to 100 m³). The distribution of these instabilities is compared to two different earthquake-triggered landslide hazard maps: one considering the occurrence of the most probable earthquake for a 475-yr return period in the Lorca basin (Mw = 5.0), which was previously published on the basis of a low-resolution digital elevation model (DEM), and a second one matching the occurrence of the Mw = 5.1 2011 Lorca earthquake, which was undertaken using a higher resolution DEM. The most frequent Newmark displacement values related to the slope failures triggered by the 2011 Lorca earthquakes are smaller than 2 cm in both hazard scenarios and coincide with areas where significant soil and topographical seismic amplification effects have occurred.
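
    For readers unfamiliar with the Newmark displacement quoted above, the sketch below performs the classical rigid-block calculation: the ground acceleration in excess of a critical (yield) acceleration is integrated twice to give a cumulative sliding displacement. The synthetic accelerogram and the critical acceleration are assumed illustrative values, not data from the Lorca earthquake.

    ```python
    import numpy as np

    # Rigid-block Newmark sliding analysis with a synthetic, decaying sinusoidal accelerogram.
    dt = 0.01                                                     # time step, s
    t = np.arange(0.0, 10.0, dt)
    acc = 1.5 * np.exp(-0.3 * t) * np.sin(2.0 * np.pi * 2.0 * t)  # synthetic acceleration, m/s^2
    a_c = 0.8                                                     # assumed critical acceleration, m/s^2

    vel, disp = 0.0, 0.0
    for a in acc:
        if a > a_c or vel > 0.0:          # block starts or keeps sliding (one-directional)
            vel = max(vel + (a - a_c) * dt, 0.0)
            disp += vel * dt
    print(f"Newmark displacement: {disp * 100:.1f} cm")
    ```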

  19. Physics-Based Simulations of Natural Hazards

    Science.gov (United States)

    Schultz, Kasey William

    seismic hazard analysis of Iran. Finally I present a prototype method that couples tsunami modeling with Virtual Quake earthquake simulations to potentially aid in the development of the Pacific Rim tsunami early warning system.

  20. Environmentally Friendly Solution to Ground Hazards in Design of Bridges in Earthquake Prone Areas Using Timber Piles

    Science.gov (United States)

    Sadeghi, H.

    2015-12-01

    Bridges are major elements of infrastructure in all societies. Their safety and continued serviceability guarantee transportation and emergency access in urban and rural areas. However, these important structures are subject to earthquake-induced damage to their structures and foundations. The basic approaches to the proper support of foundations are (a) distributing imposed loads to the foundation in a way that it can resist those loads without excessive settlement or failure; (b) modifying the foundation ground with various available methods; and (c) a combination of "a" and "b". Engineers have to design foundations that meet all safety and serviceability criteria, but when there are numerous environmental and financial constraints, the use of some traditional methods becomes inevitable. This paper explains the application of timber piles to improve ground resistance to liquefaction and to secure the abutments of short- to medium-length bridges in an earthquake- and liquefaction-prone area on Bohol Island, Philippines. The limitations of the common ground-improvement methods (i.e., injection, dynamic compaction), owing to either environmental or financial concerns, along with the abundance of timber in the area, led the engineers to use a network of timber piles behind the backwalls of the bridge abutments. The suggested timber-pile network is simulated by numerical methods and its safety is examined. The results show that the compaction caused by driving the piles and the bearing capacity provided by the timbers reduce the settlement and lateral movements due to service and earthquake-induced loads.

  1. Fossil landscapes and youthful seismogenic sources in the central Apennines: excerpts from the 24 August 2016, Amatrice earthquake and seismic hazard implications

    Directory of Open Access Journals (Sweden)

    Gianluca Valensise

    2016-11-01

    Full Text Available We show and discuss the similarities among the 2016 Amatrice (Mw 6.0), 1997 Colfiorito-Sellano (Mw 6.0-5.6) and 2009 L’Aquila (Mw 6.3) earthquakes. They all occurred along the crest of the central Apennines and were caused by shallow-dipping faults between 3 and 10 km depth, as shown by their characteristic InSAR signature. We contend that these earthquakes delineate a seismogenic style that is characteristic of this portion of the central Apennines, where the upward propagation of seismogenic faults is hindered by the presence of pre-existing regional thrusts. This leads to an effective decoupling between the deeper seismogenic portion of the upper crust and its uppermost 3 km. The decoupling implies that active faults mapped at the surface do not connect with the seismogenic sources, and that their evolution may be controlled by passive readjustments to coseismic strains or even by purely gravitational motions. Seismic hazard analyses and estimates based on such faults should hence be considered with great caution, as they may not be representative of the true seismogenic potential.

  2. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    Science.gov (United States)

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  3. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    Directory of Open Access Journals (Sweden)

    W. F. Peng

    2012-03-01

    Full Text Available The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0–2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake, and the directions of deviation from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugate region. However, the probability of anomaly appearance and the extent of the anomalies in the magnetically conjugate region are smaller than near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.
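
    Anomaly detection in this literature is often done with a sliding quartile-based bound on the TEC time series. The sketch below shows that generic scheme (median plus or minus 1.5 times the interquartile range over the preceding 15 days) on synthetic data; it is an assumption-laden illustration and not necessarily the exact procedure used in this paper.

    ```python
    import numpy as np

    def tec_anomalies(tec, window=15, k=1.5):
        """Flag values outside median +/- k*IQR of the preceding `window` days."""
        flags = np.zeros(len(tec), dtype=bool)
        for i in range(window, len(tec)):
            q1, med, q3 = np.percentile(tec[i - window:i], [25, 50, 75])
            iqr = q3 - q1
            flags[i] = (tec[i] > med + k * iqr) or (tec[i] < med - k * iqr)
        return flags

    # Synthetic daily TEC series with one strong positive excursion at day 35
    rng = np.random.default_rng(3)
    series = 20.0 + rng.normal(0.0, 1.0, 40)
    series[35] += 8.0
    print(np.flatnonzero(tec_anomalies(series)))   # day 35 should appear among the flagged indices
    ```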

  4. Long term volcanic hazard analysis in the Canary Islands

    Science.gov (United States)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on developing hazard maps for the islands of Lanzarote and Tenerife, especially for land-use planning. The main handicap for these studies in the Canary Islands is the lack of well-reported historical eruptions, but also the lack of data such as geochronological, geochemical or structural information. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they need to be fed with a large amount of data that sometimes, as in the case of the Canary Islands, are not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS, helping to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000-scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated into the database. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information to assess long-term volcanic hazard analysis. HADA will permit

  5. Surface Fire Hazards Analysis Technical Report-Constructor Facilities

    International Nuclear Information System (INIS)

    Flye, R.E.

    2000-01-01

    The purpose of this Fire Hazards Analysis Technical Report (hereinafter referred to as Technical Report) is to assess the risk from fire within individual fire areas to ascertain whether the U.S. Department of Energy (DOE) fire safety objectives are met. The objectives identified in DOE Order 420.1, Change 2, Facility Safety, Section 4.2, establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public, or the environment; Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding defined limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events

  6. Aftershock stress analysis of the April 2015 Mw 7.8 Gorkha earthquake from the NAMASTE project

    Science.gov (United States)

    Pant, M.; Velasco, A. A.; Karplus, M. S.; Patlan, E.; Ghosh, A.; Nabelek, J.; Kuna, V. M.; Sapkota, S. N.; Adhikari, L. B.; Klemperer, S. L.

    2016-12-01

    Continental collision between the Indian plate and the Eurasian plate, converging at 45 mm/yr, has uplifted the northern part of Nepal, forming the Himalaya. Because of this convergence, the region has experienced large, devastating earthquakes, including the 1934 Mw 8.4 Nepal-Bihar earthquake and two recent earthquakes on April 25, 2015 (Mw 7.8, the Gorkha earthquake) and May 12, 2015 (Mw 7.2). These quakes killed thousands of people and caused billions of dollars of property loss. Despite some recent geologic and geophysical studies of this area, many tectonic questions remain unanswered. Shortly after the Gorkha earthquake, we deployed a seismic network, NAMASTE (Nepal Array Measuring Aftershock Seismicity Trailing Earthquake), to study the aftershocks of these two large events. Our network included 45 different seismic stations (16 short period, 25 broadband, and 4 strong motion sensors) that spanned the Gorkha rupture area. The deployment extends from south of the Main Frontal Thrust (MFT) to the Main Central Thrust (MCT) region, and it recorded aftershocks for more than ten months, from June 2015 to May 2016. We are leveraging high-precision earthquake locations and measuring and picking P-wave first-motion polarities to develop a catalog of focal mechanisms for the larger aftershocks. We will use this catalog to correlate the seismicity with the stress state of the Indo-Eurasian plate margin, hoping to address questions regarding the complex fault geometries and future earthquake hazards at this plate margin.

  7. Expert systems for assisting the analysis of hazards

    International Nuclear Information System (INIS)

    Evrard, J.M.; Martinez, J.M.; Souchet, Y.

    1990-01-01

    The advantage of applying expert systems to the analysis of safety in the operation of nuclear power plants is discussed. The expert systems apply a method based on a common representation of nuclear power plants. The main steps of the method are summarized. The applications presented concern the following fields: the analysis of hazards in the electric power supplies of a gas-graphite power plant; the evaluation of the availability of safety procedures in a PWR power plant; and the search for the sources of leakage in a PWR power plant. The analysis shows that expert systems are a powerful tool in the study of the safety of nuclear power plants [fr]

  8. Environmental risk analysis of hazardous material rail transportation

    International Nuclear Information System (INIS)

    Saat, Mohd Rapik; Werth, Charles J.; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P.L.

    2014-01-01

    Highlights: • Comprehensive, nationwide risk assessment of hazardous material rail transportation. • Application of a novel environmental (i.e. soil and groundwater) consequence model. • Cleanup cost and total shipment distance are the most significant risk factors. • Annual risk varies from $20,000 to $560,000 for different products. • Provides information on the risk cost associated with specific product shipments. -- Abstract: An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials
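
    To make the per-car-mile risk metric concrete, here is a toy expected-cost calculation in the spirit of route-based hazmat risk analysis. It is not the HMTECM; every number (shipments, accident rate, release probability, cleanup cost) is an illustrative placeholder.

    ```python
    # Toy route-level expected-cost calculation:
    # annual risk = shipments * route length * accident rate per car-mile
    #               * P(release | accident) * expected cleanup cost per release.
    shipments_per_year = 2_000
    route_miles = 900.0
    accident_rate = 1.5e-7        # mainline accidents per car-mile (placeholder)
    p_release = 0.05              # conditional probability of a release given an accident
    cleanup_cost = 1.2e6          # expected soil/groundwater cleanup cost per release, USD

    annual_risk = shipments_per_year * route_miles * accident_rate * p_release * cleanup_cost
    car_miles = shipments_per_year * route_miles
    print(f"Annual risk: ${annual_risk:,.0f}")
    print(f"Risk per car-mile: ${annual_risk / car_miles:.4f}")
    ```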

  9. Environmental risk analysis of hazardous material rail transportation

    Energy Technology Data Exchange (ETDEWEB)

    Saat, Mohd Rapik, E-mail: mohdsaat@illinois.edu [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States); Werth, Charles J.; Schaeffer, David [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States); Yoon, Hongkyu [Sandia National Laboratories, Albuquerque, NM 87123 (United States); Barkan, Christopher P.L. [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 1243 Newmark Civil Engineering Laboratory, 205 North Mathews Avenue, Urbana, IL 61801 (United States)

    2014-01-15

    Highlights: • Comprehensive, nationwide risk assessment of hazardous material rail transportation. • Application of a novel environmental (i.e. soil and groundwater) consequence model. • Cleanup cost and total shipment distance are the most significant risk factors. • Annual risk varies from $20,000 to $560,000 for different products. • Provides information on the risk cost associated with specific product shipments. -- Abstract: An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.

  10. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xueqin [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); School of Social Development and Public Policy, Beijing Normal University, Beijing 100875 (China); Li, Ning [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); Yuan, Shuai, E-mail: syuan@nmemc.org.cn [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); Xu, Ning; Shi, Wenqin; Chen, Weibin [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China)

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the multi-factor analysis and calculation. Considering the importance and the deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, whereas the three-dimensional Frank copula function showed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme-value fitting. Therefore, we suggest that the multivariate analysis method may be adopted in the forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • 2D function allows better fitting results at the lower tail of hazard factors. • Three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to the reality
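
    For reference, the sketch below shows how a bivariate Frank copula turns two marginal non-exceedance probabilities into "OR" and "AND" joint return periods. The dependence parameter, the probabilities and the mean inter-event time are illustrative assumptions (the inter-event time is simply 54 years divided by 91 events), not fitted values from the study.

    ```python
    import numpy as np

    def frank_copula(u, v, theta):
        """Bivariate Frank copula C(u, v; theta), theta != 0."""
        num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
        return -np.log(1.0 + num / (np.exp(-theta) - 1.0)) / theta

    # Illustrative inputs, not fitted values from the study:
    theta = 4.0          # dependence parameter between the two hazard factors
    mu = 54.0 / 91.0     # mean inter-event time in years (91 events in 54 years)
    u, v = 0.95, 0.90    # non-exceedance probabilities of the two hazard factors

    C = frank_copula(u, v, theta)
    T_or = mu / (1.0 - C)                    # either factor exceeds its threshold
    T_and = mu / (1.0 - u - v + C)           # both factors exceed their thresholds
    print(f"OR joint return period:  {T_or:.1f} yr")
    print(f"AND joint return period: {T_and:.1f} yr")
    ```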

  11. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    International Nuclear Information System (INIS)

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-01-01

    As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the multi-factor analysis and calculation. Considering the importance and the deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, whereas the three-dimensional Frank copula function showed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme-value fitting. Therefore, we suggest that the multivariate analysis method may be adopted in the forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • 2D function allows better fitting results at the lower tail of hazard factors. • Three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to the reality

  12. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro; Mai, Paul Martin; Yasuda, Tomohiro; Mori, Nobuhito

    2014-01-01

    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.
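
    The following is a generic spectral-synthesis sketch of a two-dimensional random slip field with a von Karman-type power spectrum: white noise is filtered in the wavenumber domain and rescaled to a target mean slip. It is not the modified Mai-Beroza procedure used in the paper, and every parameter (grid size, correlation lengths, Hurst exponent, mean slip) is an illustrative assumption.

    ```python
    import numpy as np

    # Spectral synthesis of a 2-D random slip field with a von Karman-type power spectrum.
    rng = np.random.default_rng(0)
    nx, nz = 128, 64                  # along-strike and down-dip grid points
    dx = 2.0                          # grid spacing, km
    ax, az, hurst = 40.0, 20.0, 0.8   # correlation lengths (km) and Hurst exponent (assumed)

    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    kz = 2.0 * np.pi * np.fft.fftfreq(nz, d=dx)
    KX, KZ = np.meshgrid(kx, kz, indexing="ij")
    psd = (1.0 + (KX * ax) ** 2 + (KZ * az) ** 2) ** (-(hurst + 1.0))   # von Karman-like spectrum

    noise = rng.standard_normal((nx, nz))
    slip = np.real(np.fft.ifft2(np.fft.fft2(noise) * np.sqrt(psd)))
    slip -= slip.min()                    # enforce non-negative slip
    slip *= 10.0 / slip.mean()            # rescale to an assumed 10 m mean slip
    print(f"mean slip = {slip.mean():.1f} m, max slip = {slip.max():.1f} m")
    ```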

  13. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro

    2014-09-01

    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.

  14. Landslide Hazard Analysis with Multidisciplinary Approach: İstanbul example

    Science.gov (United States)

    Kılıç, Osman; Baş, Mahmut; Yahya Menteşe, Emin; Tarih, Ahmet; Duran, Kemal; Gümüş, Salim; Rıza Yapar, Evrens; Emin Karasu, Muhammed; Acar Kara, Sema; Karaman, Abdullah; Özalaybey, Serdar; Zor, Ekrem; Ediger, Vedat; Arpat, Esen; Özgül, Necdet; Polat, Feyzi; Doǧan, Uǧur; Çakır, Ziyadin

    2017-04-01

    There are several methods that can be used to describe landslide mechanisms. While some of them are commonly used, there are relatively new methods that have proven useful. Obviously, each method has its own limitations, and thus the integrated use of these methods contributes to obtaining a realistic landslide model. The slopes of the Küçükçekmece and Büyükçekmece Lagoons, located on the Marmara Sea coast of İstanbul, Turkey, are among the most distinctive examples of complex-type landslides. The landslides in the area started developing at low sea level and appear to have ceased, or at least slowed to a minimum, after the sea-level rise, as opposed to the still-active landslides that continue to cause damage, especially on the valley slopes above the present sea level between the two lagoons. To clarify the characteristics of these slope movements and classify them as accurately as possible, the Directorate of Earthquake and Ground Research of the Istanbul Metropolitan Municipality launched a project in cooperation with the Marmara Research Center of the Scientific and Technological Research Council of Turkey (TÜBİTAK). The project draws on the techniques of different disciplines such as geology, geophysics, geomorphology, hydrogeology, geotechnics, geodesy, remote sensing and meteorology. The observations include detailed mapping of the topography by airborne LIDAR, deformation monitoring with more than 80 GPS stations, ground-based synthetic aperture radar measurements in 8 critical zones, 81 geological boreholes and more than 20 km of geophysical measurements. After three years of monitoring, the acquired data and the results, such as the landslide hazard map, were integrated into a GIS database to ease the tasks of urban planners and decision makers.

  15. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    Science.gov (United States)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent M 7.0 earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences, instead being in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. In particular in the context of global (climate) change, risk analysis is a research field of increasing importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability has, however, generally been lagging behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies for small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling and one element of effective risk analysis and emergency management. Therefore, accounting for the spatio-temporal dynamics of human vulnerability is in line with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype of a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high risk of earthquake, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently approved Special Emergency and Civil Protection Plan (PEERS) is based on a Seismic Intensity map and only contemplates resident population from the census as a proxy for human exposure. In the present work we map and analyze the spatio-temporal distribution of

  16. Fault roughness and strength heterogeneity control earthquake size and stress drop

    KAUST Repository

    Zielke, Olaf

    2017-01-13

    An earthquake's stress drop is related to the frictional breakdown during sliding and constitutes a fundamental quantity of the rupture process. High-speed laboratory friction experiments that emulate the rupture process imply stress drop values that greatly exceed those commonly reported for natural earthquakes. We hypothesize that this stress drop discrepancy is due to fault-surface roughness and strength heterogeneity: an earthquake's moment release and its recurrence probability depend not only on stress drop and rupture dimension but also on the geometric roughness of the ruptured fault and the location of failing strength asperities along it. Using large-scale numerical simulations for earthquake ruptures under varying roughness and strength conditions, we verify our hypothesis, showing that smoother faults may generate larger earthquakes than rougher faults under identical tectonic loading conditions. We further discuss the potential impact of fault roughness on earthquake recurrence probability. This finding provides important information, also for seismic hazard analysis.
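
    To connect the quantities discussed above, the sketch below uses the textbook relations M0 = mu*A*D, Mw = (2/3)(log10 M0 - 9.1) and the Eshelby circular-crack stress drop, comparing two assumed mean-slip values on the same rupture area. The rigidity, area and slips are illustrative placeholders, not values from the simulations.

    ```python
    import math

    mu = 3.0e10                       # assumed crustal rigidity, Pa

    def seismic_moment(area_km2, mean_slip_m):
        """M0 = mu * A * D, returned in N*m."""
        return mu * area_km2 * 1.0e6 * mean_slip_m

    def moment_magnitude(m0):
        """Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)

    def stress_drop_circular(m0, radius_m):
        """Eshelby circular crack: delta_sigma = (7/16) * M0 / r**3, returned in MPa."""
        return (7.0 / 16.0) * m0 / radius_m ** 3 / 1.0e6

    # Same rupture area, two assumed mean slips (e.g. a smoother vs. a rougher fault patch)
    area_km2 = 300.0
    radius_m = math.sqrt(area_km2 * 1.0e6 / math.pi)
    for slip in (0.5, 1.5):
        m0 = seismic_moment(area_km2, slip)
        print(f"D = {slip} m -> Mw = {moment_magnitude(m0):.2f}, "
              f"stress drop = {stress_drop_circular(m0, radius_m):.1f} MPa")
    ```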

  17. Flood Hazard and Risk Analysis in Urban Area

    Science.gov (United States)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about the quality of life but also about the safety of life and property, so the impact of disasters on life and property is the residents' most serious concern. For mitigating disaster impacts, flood hazard and risk analysis play an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung city was evaluated using statistics on social development factors. The hazard factors of Kaohsiung city were calculated from the simulated flood depths for six different return periods and four typhoon events that resulted in serious flooding in Kaohsiung city. The flood risk can be obtained by combining the flood hazard and the social vulnerability. The analysis results allow the authorities to strengthen disaster preparedness and to allocate more resources to high-risk areas.

  18. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including
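
    One common way geological fault data enter a hazard source model, whatever the software, is via a moment balance between the long-term slip rate and the moment released in characteristic events. The sketch below illustrates that bookkeeping with assumed fault dimensions, slip rate and characteristic slip; it is not OpenQuake code and the numbers are placeholders.

    ```python
    import math

    mu = 3.0e10                  # rigidity, Pa
    length_km, width_km = 80.0, 15.0
    slip_rate_mm_yr = 5.0        # assumed long-term slip rate
    char_slip_m = 2.0            # assumed mean slip in the characteristic event

    area_m2 = length_km * width_km * 1.0e6
    moment_rate = mu * area_m2 * slip_rate_mm_yr * 1.0e-3      # accumulated moment, N*m per year
    m0_char = mu * area_m2 * char_slip_m                       # moment of one characteristic event
    rate_char = moment_rate / m0_char                          # events per year

    mw_char = (2.0 / 3.0) * (math.log10(m0_char) - 9.1)
    print(f"Characteristic Mw ~ {mw_char:.1f}, recurrence interval ~ {1.0 / rate_char:.0f} yr")
    ```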

  19. Flash-sourcing or the rapid detection and characterisation of earthquake effects through clickstream data analysis

    Science.gov (United States)

    Bossu, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.

    2011-12-01

    Rapid characterisation of earthquake effects is essential for a timely and appropriate response in favour of victims and/or eyewitnesses. In the case of damaging earthquakes, any field observations that can fill the information gap characterising their immediate aftermath can contribute to more efficient rescue operations. This paper presents the latest developments of a method called "flash-sourcing" that addresses these issues. It relies on eyewitnesses, the first informed and the first concerned by an earthquake occurrence. More precisely, their use of the EMSC earthquake information website (www.emsc-csem.org) is analysed in real time to map the area where the earthquake was felt and to identify, at least under certain circumstances, zones of widespread damage. The approach is based on the natural and immediate convergence of eyewitnesses on the website, who rush to the Internet to investigate the cause of the shaking they have just felt, causing our traffic to surge. The area where an earthquake was felt is mapped simply by locating the Internet Protocol (IP) addresses during these traffic surges. In addition, the presence of eyewitnesses browsing our website within minutes of an earthquake occurrence excludes the possibility of widespread damage in the localities they originate from: in case of severe damage, the networks would be down. The validity of the information derived from this clickstream analysis is confirmed by comparisons with EMS98 macroseismic maps obtained from online questionnaires. The name of this approach, "flash-sourcing", is a combination of "flash-crowd" and "crowdsourcing", intended to reflect the rapidity of the data collation from the public. For computer scientists, a flash-crowd names a traffic surge on a website. Crowdsourcing means work being done by a "crowd" of people; it also characterises Internet and mobile applications collecting information from the public, such as online macroseismic questionnaires. Like crowdsourcing techniques, flash-sourcing is a
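
    As a toy illustration of the traffic-surge detection that underlies flash-sourcing, the snippet below flags minutes whose hit count exceeds a multiple of a rolling baseline. The window, factor and synthetic traffic are assumptions for demonstration; the real EMSC pipeline additionally geolocates the IP addresses of the incoming visitors.

    ```python
    from collections import deque

    def detect_surges(hits_per_minute, window=30, factor=5.0, min_hits=50):
        """Flag minutes whose hit count exceeds `factor` times a rolling baseline average."""
        baseline = deque(maxlen=window)
        surges = []
        for minute, hits in enumerate(hits_per_minute):
            avg = sum(baseline) / len(baseline) if baseline else hits
            if hits >= max(factor * avg, min_hits):
                surges.append(minute)
            baseline.append(hits)
        return surges

    # Synthetic traffic: quiet background, then a felt-earthquake surge starting at minute 40
    traffic = [12] * 40 + [300, 450, 380, 250, 150] + [20] * 10
    print(detect_surges(traffic))   # -> [40, 41, 42, 43]
    ```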

  20. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Sevougian, S. David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  1. Lithium-thionyl chloride cell system safety hazard analysis

    Science.gov (United States)

    Dampier, F. W.

    1985-03-01

    This system safety analysis for the lithium thionyl chloride cell is a critical review of the technical literature pertaining to cell safety and draws conclusions and makes recommendations based on this data. The thermodynamics and kinetics of the electrochemical reactions occurring during discharge are discussed with particular attention given to unstable SOCl2 reduction intermediates. Potentially hazardous reactions between the various cell components and discharge products or impurities that could occur during electrical or thermal abuse are described and the most hazardous conditions and reactions identified. Design factors influencing the safety of Li/SOCl2 cells, shipping and disposal methods and the toxicity of Li/SOCl2 battery components are additional safety issues that are also addressed.

  2. Seismic imaging beneath an InSAR anomaly in eastern Washington State: Shallow faulting associated with an earthquake swarm in a low-hazard area

    Science.gov (United States)

    Stephenson, William J.; Odum, Jackson K.; Wicks, Chuck; Pratt, Thomas L.; Blakely, Richard J.

    2016-01-01

    In 2001, a rare swarm of small, shallow earthquakes beneath the city of Spokane, Washington, caused ground shaking as well as audible booms over a five‐month period. Subsequent Interferometric Synthetic Aperture Radar (InSAR) data analysis revealed an area of surface uplift in the vicinity of the earthquake swarm. To investigate the potential faults that may have caused both the earthquakes and the topographic uplift, we collected ∼3  km of high‐resolution seismic‐reflection profiles to image the upper‐source region of the swarm. The two profiles reveal a complex deformational pattern within Quaternary alluvial, fluvial, and flood deposits, underlain by Tertiary basalts and basin sediments. At least 100 m of arching on a basalt surface in the upper 500 m is interpreted from both the seismic profiles and magnetic modeling. Two west‐dipping faults deform Quaternary sediments and project to the surface near the location of the Spokane fault defined from modeling of the InSAR data.

  3. Analysis of a school building damaged by the 2015 Ranau earthquake Malaysia

    Science.gov (United States)

    Takano, Shugo; Saito, Taiki

    2017-10-01

    On June 5th, 2015, a severe earthquake with a moment magnitude of 6.0 occurred in Ranau, Malaysia, at a focal depth of 10 km. Due to the earthquake, many facilities were damaged and 18 people were killed by rockfalls [1]. Because the British Standard (BS), which does not consider seismic forces in structural design, is adopted as the building regulation in Malaysia, the seismic resistance of Malaysian buildings is unclear. To protect human life and ensure building safety, it is important to understand the seismic resistance of existing buildings. The objective of this study is therefore to evaluate the seismic resistance of existing buildings in Malaysia designed to the British Standard. A school building that was damaged in the Ranau earthquake is selected as the target building. It is a four-story building whose ground floor is designed as a parking space for the staff. The structural type is infilled masonry, in which the main frame consists of reinforced concrete columns and beams and brick walls are installed within the frame. The analysis is performed using the STERA_3D software, a tool for analyzing the seismic performance of buildings developed by one of the authors. Firstly, the natural period of the building is calculated and compared with the result of a micro-tremor measurement. Secondly, a nonlinear push-over analysis is conducted to evaluate the horizontal load-bearing capacity of the building. Thirdly, an earthquake response analysis is conducted using the time-history acceleration data recorded during the Ranau earthquake by the seismograph installed at Kota Kinabalu. By comparing the results of the earthquake response analysis with the actual damage to the building, the cause of the damage is clarified.
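
    For a rough cross-check of the computed natural period against the micro-tremor measurement mentioned above, simple code-type approximations are often used. The sketch below evaluates an ASCE 7-style formula (Ta = Ct*H^x with the metric concrete-moment-frame coefficients) and the 0.1*N rule of thumb for an assumed storey height; these are generic estimates, not values from the paper, and masonry infill typically stiffens the frame and shortens the measured period.

    ```python
    n_storeys = 4
    storey_height_m = 3.5                      # assumed storey height
    height_m = n_storeys * storey_height_m

    t_asce = 0.0466 * height_m ** 0.9          # ASCE 7-type approximation, concrete moment frame (metric)
    t_rule = 0.1 * n_storeys                   # 0.1 * N rule of thumb
    print(f"Approximate period: {t_asce:.2f} s (ASCE-type), {t_rule:.2f} s (0.1N rule)")
    ```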

  4. Analysis of earthquake clustering and source spectra in the Salton Sea Geothermal Field

    Science.gov (United States)

    Cheng, Y.; Chen, X.

    2015-12-01

    The Salton Sea Geothermal Field is located within the tectonic step-over between the San Andreas Fault and the Imperial Fault. Since the 1980s, geothermal energy exploration has resulted in a step-like increase in microearthquake activity that mirrors the expansion of the geothermal field. Distinguishing naturally occurring from induced seismicity, and their corresponding characteristics (e.g., energy release), is important for hazard assessment. Between 2008 and 2014, seismic data recorded by a local borehole array were made publicly accessible by CalEnergy through the SCEC data center, and the high-quality local recordings of over 7000 microearthquakes provide a unique opportunity to sort out the characteristics of induced versus natural activity. We obtain high-resolution earthquake locations using improved S-wave picks, waveform cross-correlation and a new 3D velocity model. We then develop a method to identify spatially and temporally isolated earthquake clusters. These clusters are classified into aftershock-type, swarm-type, and mixed-type (aftershock-like, with low skew, low magnitude and shorter duration), based on the relative timing of the largest earthquakes and the moment release. The mixed-type clusters are mostly located at 3-4 km depth near injection wells, while aftershock-type and swarm-type clusters also occur farther from injection wells. By counting the number of aftershocks within one day following the mainshock in each cluster, we find that the mixed-type clusters have much higher aftershock productivity than the other types and than historic M4 earthquakes. We analyze the detailed spatial variation of the b-value and find that the mixed-type clusters are mostly located within high b-value patches, while large (M>3) earthquakes and other types of clusters are located within low b-value patches. We are currently processing P- and S-wave spectra to analyze the spatial-temporal correlation of earthquake stress parameters and seismicity characteristics. Preliminary results suggest that the
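
    Since the spatial b-value mapping is central to the interpretation above, here is the standard Aki-Utsu maximum-likelihood estimator, checked on a synthetic Gutenberg-Richter catalogue. The completeness magnitude, bin width and catalogue are illustrative assumptions, not the Salton Sea data.

    ```python
    import math
    import random

    def b_value_mle(mags, m_c, dm=0.1):
        """Aki-Utsu maximum-likelihood b-value with Utsu's binning correction:
        b = log10(e) / (mean(M) - (Mc - dm/2)), using only events with M >= Mc."""
        m = [x for x in mags if x >= m_c]
        mean_m = sum(m) / len(m)
        return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

    # Check on a synthetic catalogue drawn from a Gutenberg-Richter distribution with b = 1
    random.seed(1)
    b_true, m_c = 1.0, 1.0
    catalogue = [m_c + random.expovariate(b_true * math.log(10.0)) for _ in range(5000)]
    print(f"estimated b = {b_value_mle(catalogue, m_c, dm=0.0):.2f}")   # close to 1.0
    ```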

  5. Latitude-Time Total Electron Content Anomalies as Precursors to Japan's Large Earthquakes Associated with Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Jyh-Woei Lin

    2011-01-01

    Full Text Available The goal of this study is to determine whether principal component analysis (PCA) can be used to p