Sample records for earthquake hazard analysis

  1. Earthquake Hazard Analysis Methods: A Review (United States)

    Sari, A. M.; Fakhrurrozi, A.


    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia lie on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analysis of seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, the effectiveness of each technique should first be reviewed. Considering time efficiency and data accuracy, remote sensing is used as a reference for assessing earthquake hazard accurately and quickly, since it requires only the limited time available for sound decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas can be readily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, add value and strength to remote sensing as a method for assessing earthquake risk and damage. The technique is therefore expected to be considered in designing disaster-management policies and to help reduce the risk of natural disasters such as earthquakes in Indonesia.

  2. Earthquake Hazards Program: Earthquake Scenarios (United States)

    U.S. Geological Survey, Department of the Interior — A scenario represents one realization of a potential future earthquake by assuming a particular magnitude, location, and fault-rupture geometry and estimating...

  3. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor


    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude over all earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method within a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore, we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located
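The Kramer–Mayfield procedure described above convolves the deaggregated seismic hazard with a conditional liquefaction probability model to obtain an annual rate (and hence return period) of liquefaction. A minimal sketch of that convolution, with made-up deaggregation rates and a toy conditional-probability relation standing in for the Cetin et al. or Boulanger–Idriss models:

```python
import numpy as np

# Hypothetical deaggregated hazard: annual rate contributions broken down
# by PGA level and magnitude bin (all values are illustrative only).
pga_levels = np.array([0.05, 0.1, 0.2, 0.4])   # g
mags = np.array([5.0, 6.0, 7.0])
# delta_lambda[i, j]: annual rate of (pga_levels[i], mags[j]) shaking
delta_lambda = np.array([
    [2e-2, 8e-3, 1e-3],
    [5e-3, 3e-3, 8e-4],
    [8e-4, 9e-4, 4e-4],
    [5e-5, 1e-4, 1e-4],
])

def p_liquefaction(pga, mag, crr_75=0.15):
    """Toy conditional probability of liquefaction: a cyclic stress ratio
    proxy compared against magnitude-scaled resistance (CRR). This stands
    in for the published SPT/CPT-based relations."""
    msf = (mag / 7.5) ** -2.56       # magnitude scaling factor
    csr = 0.65 * pga                 # simplified CSR proxy (depth terms folded in)
    fs = crr_75 * msf / csr          # factor of safety against liquefaction
    return 1.0 / (1.0 + fs ** 3)     # smooth mapping from FS to probability

# Total annual rate of liquefaction: sum over all hazard contributions
rate = sum(p_liquefaction(a, m) * delta_lambda[i, j]
           for i, a in enumerate(pga_levels)
           for j, m in enumerate(mags))
return_period = 1.0 / rate           # direct estimate, as in the PBEE framework
```

The same double sum, evaluated with the model's conditional probability at each depth, yields the liquefaction hazard curves as a function of depth described in the abstract.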

  4. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis? (United States)

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji


    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  5. Global Earthquake Hazard Distribution - Peak Ground Acceleration (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Distribution-peak ground acceleration is a 2.5 minute grid of global earthquake hazards developed using Global Seismic Hazard Program...

  6. Global Earthquake Hazard Distribution - Peak Ground Acceleration (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Distribution-Peak Ground Acceleration is a 2.5 by 2.5 minute grid of global earthquake hazards developed using Global Seismic Hazard Program...

  7. Site specific seismic hazard analysis and determination of response spectra of Kolkata for maximum considered earthquake (United States)

    Shiuly, Amit; Sahu, R. B.; Mandal, Saroj


    This paper presents a site-specific seismic hazard analysis of Kolkata city, the former capital of India and the present capital of the state of West Bengal, situated on the world’s largest delta, the Bengal basin. For this purpose, the peak ground acceleration (PGA) for a maximum considered earthquake (MCE) at bedrock level has been estimated using an artificial neural network (ANN) based attenuation relationship developed from synthetic ground motion data for the region. Using the PGA corresponding to the MCE, a spectrum-compatible acceleration time history at bedrock level has been generated with a wavelet-based computer program, WAVEGEN. This bedrock time history has been converted to the corresponding surface-level history using SHAKE2000 for 144 borehole locations in the study region. Using the predicted values of PGA and peak ground velocity (PGV) at the surface, corresponding contours for the region have been drawn. For the MCE, the PGA at bedrock level of Kolkata city has been obtained as 0.184 g, while that at the surface level varies from 0.22 g to 0.37 g. Finally, Kolkata has been subdivided into eight seismic subzones, and for each subzone a response spectrum equation has been derived using polynomial regression analysis. This will be very helpful for structural and geotechnical engineers in designing safe and economical earthquake-resistant structures.
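Per-subzone response spectrum equations of the kind described above can be obtained by least-squares polynomial regression of spectral acceleration against period. A small sketch with invented spectrum values (loosely shaped like a design spectrum; these are not the Kolkata results):

```python
import numpy as np

# Illustrative surface response spectrum for one subzone: spectral
# acceleration Sa(T), in g, sampled at discrete periods T, in seconds.
T = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.75, 1.0, 1.5, 2.0])
Sa = np.array([0.30, 0.55, 0.80, 0.85, 0.60, 0.42, 0.30, 0.18, 0.12])

# Fit Sa(T) = c4*T^4 + c3*T^3 + c2*T^2 + c1*T + c0 by least squares,
# mirroring the per-subzone regression equations in the abstract.
coeffs = np.polyfit(T, Sa, deg=4)
sa_fit = np.polyval(coeffs, T)
rmse = np.sqrt(np.mean((sa_fit - Sa) ** 2))   # goodness of fit
```

An engineer can then evaluate `np.polyval(coeffs, T)` at any period within the fitted range to read off the design spectral acceleration for that subzone.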

  8. Assessment of the 1988 Saguenay earthquake: Implications on attenuation functions for seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Toro, G.R.; McGuire, R.K. (Risk Engineering, Inc., Golden, CO (United States))


    This study investigates the earthquake records from the 1988 Saguenay earthquake and examines the implications of these records with respect to ground-motion models used in seismic-hazard studies in eastern North America (ENA), specifically, to what extent the ground motions from this earthquake support or reject the various attenuation functions used in the EPRI and LLNL seismic-hazard calculations. Section 2 provides a brief description of the EPRI and LLNL attenuation functions for peak acceleration and for spectral velocities. Section 3 compares these attenuation functions with the ground motions from the Saguenay earthquake and from other relevant earthquakes. Section 4 reviews available seismological studies about the Saguenay earthquake, in order to understand its seismological characteristics and why some observations may differ from predictions. Section 5 examines the assumptions and methodology used in the development of the attenuation functions selected by LLNL ground-motion expert 5. Finally, Section 6 draws conclusions about the validity of the various sets of attenuation functions, in light of the Saguenay data and of other evidence presented here. 50 refs., 37 figs., 7 tabs.

  9. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation (United States)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.


    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at

  10. Spatial earthquake hazard assessment of Evansville, Indiana (United States)

    Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.


    The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined by using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area specifically is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground motion amplification is determined to be a moderate concern throughout the area. Differences between the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding groundwater elevations made in previous studies are also considered to have had a significant effect on the results.

  11. 13 CFR 120.174 - Earthquake hazards. (United States)


    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  12. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Gürpinar, Aybars, E-mail: [Nuclear & Risk Consultancy, Anisgasse 4, 1221 Vienna (Austria); Serva, Leonello, E-mail: [Independent Consultant, Via dei Dauni 1, 00185 Rome (Italy); Livio, Franz, E-mail: [Dipartimento di Scienza ed Alta Tecnologia, Università degli Studi dell’Insubria, Via Velleggio, 11, 22100 Como (Italy); Rizzo, Paul C., E-mail: [RIZZO Associates, 500 Penn Center Blvd., Suite 100, Pittsburgh, PA 15235 (United States)


    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  13. Global Earthquake Hazard Frequency and Distribution (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  14. Global Earthquake Hazard Frequency and Distribution (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 by 2.5 minute global grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  15. Earthquake hazard analysis for the different regions in and around Ağrı

    Energy Technology Data Exchange (ETDEWEB)

    Bayrak, Erdem, E-mail:; Yilmaz, Şeyda, E-mail: [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: [Ağrı İbrahim Çeçen University, Ağrı (Turkey)


    We investigated earthquake hazard parameters for the eastern part of Turkey by determining the a and b parameters of the Gutenberg–Richter magnitude–frequency relationship. For this purpose, the study area was divided into seven source zones based on their tectonic and seismotectonic regimes. The database used in this work was compiled for the instrumental period from different sources and catalogues, such as TURKNET, the International Seismological Centre (ISC), Incorporated Research Institutions for Seismology (IRIS) and The Scientific and Technological Research Council of Turkey (TUBITAK). We calculated the a value and the b value, the slope of the Gutenberg–Richter frequency–magnitude relationship, using the maximum likelihood (ML) method. We also estimated the mean return periods, the most probable maximum magnitude in a time period of t years, and the probability of occurrence of an earthquake with magnitude ≥ M during a time span of t years. We used the Zmap software to calculate these parameters. The lowest b value was obtained in Region 1, which covers the Cobandede Fault Zone, and the highest a value in Region 2, which covers the Kagizman Fault Zone. This conclusion is strongly supported by the probability value, which is largest (87%) for an earthquake with magnitude greater than or equal to 6.0; the mean return period for such a magnitude is lowest in this region (49 years). The most probable magnitude in the next 100 years was also calculated, and the highest value was obtained around the Cobandede Fault Zone. According to these parameters, Region 1, covering the Cobandede Fault Zone, is the most dangerous area in the eastern part of Turkey.
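The quantities in this abstract all follow from the Gutenberg–Richter relation log10 N(M) = a − bM combined with Poisson occurrence. A sketch of those formulas on a synthetic catalogue, using Aki's maximum-likelihood b estimator (catalogue numbers and durations are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic catalogue: magnitudes above completeness magnitude Mc = 3.0,
# drawn from a G-R distribution with true b = 1.0 (beta = b * ln 10).
mc, beta_true = 3.0, np.log(10)
mags = mc + rng.exponential(1.0 / beta_true, size=2000)
years = 50.0                         # assumed catalogue duration

# Aki (1965) maximum-likelihood estimate of the b value
b = 1.0 / (np.log(10) * (mags.mean() - mc))
# Annual rate of M >= Mc, then the a value of log10 N(M) = a - b*M
rate_mc = len(mags) / years
a = np.log10(rate_mc) + b * mc

def return_period(m):
    """Mean return period (years) of events with magnitude >= m."""
    return 1.0 / 10 ** (a - b * m)

def p_occurrence(m, t):
    """Poisson probability of at least one M >= m event in t years."""
    return 1.0 - np.exp(-t / return_period(m))
```

For example, `p_occurrence(6.0, 100.0)` gives the kind of t-year occurrence probability quoted in the abstract, and a low b value pushes `return_period` down for large magnitudes, which is why the low-b Region 1 is ranked most dangerous.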

  16. Earthquake Hazard Mitigation Strategy in Indonesia (United States)

    Karnawati, D.; Anderson, R.; Pramumijoyo, S.


    Because of the active tectonic setting of the region, the risks of geological hazards inevitably increase in the Indonesian Archipelago and other Asian countries. Encouraging communities living in vulnerable areas to adapt to the geological conditions will be the most appropriate strategy for earthquake risk reduction. Updating the earthquake hazard maps, enhancing existing land-use management, establishing public education strategies and methods, strengthening linkages among the stakeholders of disaster mitigation institutions, and establishing continuous public consultation are the main strategic programs for community resilience in earthquake-vulnerable areas. This paper highlights some important achievements of earthquake hazard mitigation programs in Indonesia, together with the difficulties in implementing such programs. Case examples of the Yogyakarta and Bengkulu earthquake mitigation efforts are also discussed as lessons learned. A new approach to developing earthquake hazard maps, initiated by mapping the psychological condition of the people living in vulnerable areas, is addressed as well.

  17. Incorporating the effects of topographic amplification in the analysis of earthquake-induced landslide hazards using logistic regression (United States)

    Lee, S. T.; Yu, T. T.; Peng, W. F.; Wang, C. L.


    Seismically induced landslide hazards are studied using seismic shaking intensity based on the topographic amplification effect. The estimation of the topographic effect includes the theoretical topographic amplification factors and the corresponding amplified ground motion. Digital elevation models (DEM) with a 5-m grid spacing are used. A logistic regression model and a geographic information system (GIS) are used to perform the seismic landslide hazard analysis. The 99 Peaks area, located 3 km away from the fault ruptured in the Chi-Chi earthquake, is used to test the proposed hypothesis. An inventory map of earthquake-triggered landslides is used to produce a dependent variable that takes a value of 0 (no landslides) or 1 (landslides). A set of independent parameters is assembled, including lithology, elevation, slope gradient, slope aspect, terrain roughness, land use, and Arias intensity (Ia) incorporating the topographic effect. Subsequently, logistic regression is used to find the best-fitting function describing the relationship between the occurrence and absence of landslides within an individual grid cell. The results of the seismic landslide hazard analysis that includes the topographic effect (AUROC = 0.890) are better than those of the analysis without it (AUROC = 0.874).
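The workflow above, fitting a logistic regression to a binary landslide inventory and scoring it with the area under the ROC curve, can be sketched end to end on synthetic grid-cell data. Predictor names follow the abstract, but every value, coefficient, and distribution here is invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Illustrative per-cell predictors: slope gradient (deg), Arias intensity
# with a crude topographic amplification, and terrain roughness.
slope = rng.uniform(0.0, 60.0, n)
arias = rng.lognormal(0.0, 0.5, n) * (1.0 + 0.01 * slope)
rough = rng.uniform(0.0, 1.0, n)

# Synthetic dependent variable: 1 (landslide) / 0 (no landslide),
# generated so that steep, strongly shaken cells fail more often.
logit_true = -6.0 + 0.08 * slope + 1.5 * arias
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_true))).astype(float)

# Standardize predictors and fit logistic regression by gradient descent.
F = np.column_stack([slope, arias, rough])
F = (F - F.mean(axis=0)) / F.std(axis=0)
X = np.column_stack([np.ones(n), F])
w = np.zeros(X.shape[1])
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / n          # full-batch gradient step

def auroc(score, label):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    order = score.argsort()
    ranks = np.empty(len(score))
    ranks[order] = np.arange(1, len(score) + 1)
    pos = label > 0.5
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

auc = auroc(X @ w, y)    # discrimination of the fitted susceptibility model
```

Comparing `auc` for predictor sets with and without the amplified Arias intensity mirrors the AUROC comparison (0.890 vs. 0.874) reported in the abstract.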

  18. The HayWired earthquake scenario—Earthquake hazards (United States)

    Detweiler, Shane T.; Wein, Anne M.


    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  19. Analysis on Two Typical Landslide Hazard Phenomena in The Wenchuan Earthquake by Field Investigations and Shaking Table Tests. (United States)

    Yang, Changwei; Zhang, Jianjing; Liu, Feicheng; Bi, Junwei; Jun, Zhang


    Based on our field investigations of landslide hazards in the Wenchuan earthquake, two findings can be reported: (1) multi-aspect terrain with free faces, such as isolated mountains and thin ridges, reacted intensely to the earthquake and was seriously damaged; (2) the slope angles of most landslides were larger than 45°. Considering these disaster phenomena, the causes are analyzed on the basis of shaking table tests of one-sided, two-sided and four-sided slopes. The test results show that: (1) the amplification of peak accelerations is strongest for four-sided slopes, weaker for two-sided slopes, and weakest for one-sided slopes, which indirectly explains why the former were most seriously damaged; (2) the amplification of peak accelerations gradually increases as the slope angle increases, with two inflection points at slope angles of 45° and 50°, which explains why landslide hazards mainly occurred on slopes steeper than 45°. The amplification along the slope strike direction is basically uniform, and its variation is smooth.

  20. Analysis on Two Typical Landslide Hazard Phenomena in The Wenchuan Earthquake by Field Investigations and Shaking Table Tests

    Directory of Open Access Journals (Sweden)

    Changwei Yang


    Based on our field investigations of landslide hazards in the Wenchuan earthquake, two findings can be reported: (1) multi-aspect terrain with free faces, such as isolated mountains and thin ridges, reacted intensely to the earthquake and was seriously damaged; (2) the slope angles of most landslides were larger than 45°. Considering these disaster phenomena, the causes are analyzed on the basis of shaking table tests of one-sided, two-sided and four-sided slopes. The test results show that: (1) the amplification of peak accelerations is strongest for four-sided slopes, weaker for two-sided slopes, and weakest for one-sided slopes, which indirectly explains why the former were most seriously damaged; (2) the amplification of peak accelerations gradually increases as the slope angle increases, with two inflection points at slope angles of 45° and 50°, which explains why landslide hazards mainly occurred on slopes steeper than 45°. The amplification along the slope strike direction is basically uniform, and its variation is smooth.


    DEFF Research Database (Denmark)

    Zania, Varvara; Tsompanakis, Yiannis; Psarropoulos, Prodromos

    Earthquake hazards may arise as a result of: (a) transient ground deformation, which is induced due to seismic wave propagation, and (b) permanent ground deformation, which is caused by abrupt fault dislocation. Since the adequate performance of waste landfills after an earthquake is of utmost...... importance, the current study examines the impact of both types of earthquake hazards by performing efficient finite-element analyses. These also took into account the potential slip displacement development along the geosynthetic interfaces of the composite base liner. At first, the development of permanent...

  2. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project (United States)

    Boyd, Oliver S.


    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  3. 76 FR 64325 - Advisory Committee on Earthquake Hazards Reduction Meeting (United States)


    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet... Directive/PPD-8: National Preparedness to National Earthquake Hazards Reduction Program (NEHRP) activities...

  4. Fragility analysis of flood protection structures in earthquake and flood prone areas around Cologne, Germany for multi-hazard risk assessment (United States)

    Tyagunov, Sergey; Vorogushyn, Sergiy; Munoz Jimenez, Cristina; Parolai, Stefano; Fleming, Kevin; Merz, Bruno; Zschau, Jochen


    The work presents a methodology for fragility analyses of fluvial earthen dikes in earthquake and flood prone areas. Fragility estimates are being integrated into the multi-hazard (earthquake-flood) risk analysis being undertaken within the framework of the EU FP7 project MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) for the city of Cologne, Germany. Scenarios of probable cascading events due to the earthquake-triggered failure of flood protection dikes and the subsequent inundation of surroundings are analyzed for the area between the gauges Andernach and Düsseldorf along the Rhine River. Along this river stretch, urban areas are partly protected by earthen dikes, which may be prone to failure during exceptional floods and/or earthquakes. The seismic fragility of the dikes is considered in terms of liquefaction potential (factor of safety), estimated by the use of the simplified procedure of Seed and Idriss. It is assumed that initiation of liquefaction at any point throughout the earthen dikes' body corresponds to the failure of the dike and, therefore, this should be taken into account for the flood risk calculations. The estimated damage potential of such structures is presented as a two-dimensional surface (as a function of seismic hazard and water level). Uncertainties in geometrical and geotechnical dike parameters are considered within the framework of Monte Carlo simulations. Taking into consideration the spatial configuration of the existing flood protection system within the area under consideration, seismic hazard curves (in terms of PGA) are calculated for sites along the river segment of interest at intervals of 1 km. The obtained estimates are used to calculate the flood risk when considering the temporal coincidence of seismic and flood events. Changes in flood risk for the considered hazard cascade scenarios are quantified and compared to the single-hazard scenarios.
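The dike fragility estimate above combines the Seed and Idriss simplified factor of safety against liquefaction with Monte Carlo sampling of uncertain geotechnical parameters. A toy version of that calculation, with an invented linear CRR proxy standing in for the published boundary curve and illustrative parameter distributions:

```python
import numpy as np

rng = np.random.default_rng(2)

def fs_liquefaction(pga, m, n1_60, sigma_v, sigma_v_eff, depth):
    """Simplified Seed-Idriss factor of safety: CSR from the 0.65 * PGA
    stress ratio with depth reduction rd, CRR from a crude linear proxy
    for the SPT-based boundary curve (illustrative, not the real curve)."""
    rd = 1.0 - 0.00765 * depth                    # stress reduction, z < 9.15 m
    csr = 0.65 * pga * (sigma_v / sigma_v_eff) * rd
    crr_75 = n1_60 / 90.0                         # toy CRR(M=7.5) from blow count
    msf = (m / 7.5) ** -2.56                      # magnitude scaling factor
    return crr_75 * msf / csr

def p_failure(pga, m, trials=20_000):
    """One fragility point: P(FS < 1) under uncertain dike/foundation
    parameters (distributions are assumptions for the sketch)."""
    n1_60 = rng.normal(12.0, 3.0, trials).clip(2.0)     # SPT blow count
    sigma_v = rng.normal(95.0, 5.0, trials)             # total stress, kPa
    sigma_ve = rng.normal(55.0, 5.0, trials).clip(20.0) # effective stress, kPa
    depth = rng.uniform(3.0, 6.0, trials)               # depth in dike body, m
    fs = fs_liquefaction(pga, m, n1_60, sigma_v, sigma_ve, depth)
    return float((fs < 1.0).mean())

# Fragility rises with the shaking level (PGA in g, scenario magnitude 6.5)
frag = [p_failure(a, 6.5) for a in (0.05, 0.15, 0.3)]
```

Evaluating `p_failure` over a grid of PGA and water levels would produce the two-dimensional damage-potential surface described in the abstract.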

  5. Probabilistic Tsunami Hazard Analysis of the Pacific Coast of Mexico: Case Study Based on the 1995 Colima Earthquake Tsunami

    Directory of Open Access Journals (Sweden)

    Nobuhito Mori


This study develops a novel computational framework to carry out probabilistic tsunami hazard assessment for the Pacific coast of Mexico. The new approach enables the consideration of stochastic tsunami source scenarios having variable fault geometry and heterogeneous slip that are constrained by an extensive database of rupture models for historical earthquakes around the world. The assessment focuses upon the 1995 Jalisco–Colima Earthquake Tsunami from a retrospective viewpoint. Numerous source scenarios of large subduction earthquakes are generated to assess the sensitivity and variability of tsunami inundation characteristics of the target region. Analyses of nine slip models along the Mexican Pacific coast are performed, and statistical characteristics of the slips (e.g., coherent structures of slip spectra) are estimated. The source variability allows exploring a wide range of tsunami scenarios for a moment magnitude (Mw) 8 subduction earthquake in the Mexican Pacific region, enabling thorough sensitivity analyses and quantification of the tsunami height variability. The numerical results indicate a strong sensitivity of maximum tsunami height to the major slip locations in the source, and major uncertainty at the first peak of tsunami waves.

  6. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting (United States)


    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet....m. The primary purpose of this meeting is to receive information on NEHRP earthquake related...

  7. Earthquake sources and seismic hazard in Southeastern Sicily

    Directory of Open Access Journals (Sweden)

    R. Rigano


A study of earthquakes (M > 5.3) affecting Southeastern Sicily was performed to define their seismic sources and to estimate seismic hazard in the region. An analysis of historical reports allowed us to reassess intensities of the 1542, 1693, 1818, 1848 and 1990 earthquakes using the new European Macroseismic Scale ’98. The new intensity data were used to define the parameters and orientation of the seismic sources. The sources obtained were compared with those computed using the MCS intensities retrieved from the Catalogue of Strong Italian Earthquakes. The adopted procedure gives results that are statistically significant, but in some cases both the epicentre location and the source azimuth are strongly affected by the azimuthal gap in the intensity distribution. This is most evident for the January 1693 earthquakes, for which the uncertainty in the macroseismic data yields significantly different solutions and does not allow the events to be associated with known active faults. Using the newly estimated intensity data and the site seismic histories, the seismic hazard for several localities was calculated. The highest probability of occurrence of destructive events (I = 10) was obtained in the area between Catania, Lentini and Augusta, suggesting that the seismogenic sources are located near the Ionian coast.

  8. Insights into earthquake hazard map performance from shaking history simulations. (United States)

    Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart


    Why recent large earthquakes caused shaking stronger than shown on earthquake hazard maps for common return periods is under debate. Explanations include: (1) Current probabilistic seismic hazard analysis (PSHA) is deficient. (2) PSHA is fine but some map parameters are wrong. (3) Low-probability events consistent with a map sometimes occur. This issue has two parts. Verification involves how well maps implement PSHA ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than the hazard map without being inconsistent. As reality gives only one history, it is hard to assess whether misfit between a map and actual shaking reflects chance or a map biased by inappropriate parameters.
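The verification idea described here, comparing simulated shaking histories against a map value with a 10% chance of exceedance in 50 years, can be mimicked with a toy model. The exponential hazard curve, its scale `a0`, and the independence of annual maxima are assumptions made purely for illustration:

```python
import math
import random

def simulated_exceedance_fraction(n_histories=2000, years=50, seed=1):
    # Toy hazard curve: annual-maximum shaking A (in g) with
    # P(A > a) = exp(-a / a0); not a real GMPE-based curve.
    random.seed(seed)
    a0 = 0.1
    # Annual exceedance probability giving a 10%-in-50-years map value,
    # then invert the hazard curve to get that "mapped" shaking level.
    p_annual = 1.0 - 0.9 ** (1.0 / years)
    a_map = -a0 * math.log(p_annual)
    exceed = 0
    for _ in range(n_histories):
        # One simulated shaking history: 'years' independent annual maxima.
        history_max = max(random.expovariate(1.0 / a0) for _ in range(years))
        if history_max > a_map:
            exceed += 1
    return exceed / n_histories
```

Averaged over many histories the exceedance fraction sits near the 10% target, while any single history can land well above or below it, which is the scatter the abstract describes.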

  9. Tiechanshan-Tunghsiao anticline earthquake analysis: Implications for northwestern Taiwan potential carbon dioxide storage site seismic hazard

    Directory of Open Access Journals (Sweden)

    Ruey-Juin Rau


We analyze the seismicity and earthquake focal mechanisms beneath the Tiechanshan-Tunghsiao (TCS-TH anticline over the last two decades for seismic hazard evaluation of a potential carbon dioxide storage site in northwestern Taiwan. Seismicity in the TCS-TH anticline indicates both spatial and temporal clustering at a depth range of 7 - 12 km. Thirteen 3.0 ≤ ML ≤ 5.2 earthquake focal mechanisms show a combination of thrust, strike-slip, and normal faulting mechanisms under the TCS-TH anticline. A 1992 ML 5.2 earthquake with a focal depth of ~10 km, the largest event ever recorded beneath the TCS-TH anticline during the last two decades, has a normal fault mechanism with the T-axis trending NNE-SSW and nodal planes oriented NNW-SSE, dipping either gently to the NNE or steeply to the SSW. Thrust fault mechanisms that occurred with mostly E-W or NWW-SEE striking P-axes and strike-slip faulting events indicate NWW-SEE striking P-axes and NNE-SSW trending T-axes, which are consistent with the regional plate convergence direction. For the strike-slip faulting events, if we take the N-S or NNW-SSE striking nodal planes as the fault planes, the strike-slip faults are sinistral motions and correspond to the Tapingting fault, which is a strike-slip fault reactivated from the inherited normal fault and intersects the Tiechanshan and Tunghsiao anticlines.

  10. Are seismic hazard assessment errors and earthquake surprises unavoidable? (United States)

    Kossobokov, Vladimir


demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations; and (c) a more ethically responsible control over how seismic hazard and seismic risk are implemented to protect public safety. It follows that the international GEM project is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to the available geological, geomorphologic, seismic, and tectonic evidence and data, combined with deterministic pattern-recognition methodologies, specifically when intending to PREDICT the PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized blocks-and-faults systems has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term, middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. This proves that contemporary science can do a better job in disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are LOW-PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate a shift in the community's mindset from pessimistic disbelief to the optimistic challenge of neo-deterministic hazard predictability.

  11. Analysis of earthquake parameters to generate hazard maps by integrating AHP and GIS for Küçükçekmece region

    National Research Council Canada - National Science Library

    Erden, T; Karaman, H


    ...) are used for simulating the results of the AHP on a spatial environment. In this study, it is aimed to generate a hierarchical structure of the model for the simulation of an earthquake hazard map (EHM...

  12. Earthquake Hazard Assessment: an Independent Review (United States)

    Kossobokov, Vladimir


Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of earthquake prediction strategies, in particular the Error Diagram, introduced by G. M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done before claiming hazardous areas and/or times. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making with regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".

  13. Great earthquakes hazard in slow subduction zones (United States)

    Marcaillou, B.; Gutscher, M.; Westbrook, G. K.


Research on the Sumatra-Andaman earthquake of 2004 has challenged two popular paradigms: that the strongest subduction earthquakes strike in regions of rapid plate convergence, and that rupture occurs primarily along the contact between the basement of the overriding plate and the downgoing plate. Subduction zones presenting similar structural and geodynamic characteristics (slow convergence and thick wedges of accreted sediment) may be capable of generating great megathrust earthquakes (M > 8.5) despite an absence of thrust-type earthquakes over the past 40 years. Existing deep seismic sounding data and hypocenters are used to constrain the geometry of several key slow subduction zones (Antilles, Hellenic, Sumatra). This geometry forms the basis for numerical modelling of fore-arc thermal structure, which is applied to calculate the estimated width of the seismogenic portion of the subduction fault plane. The margins with the thickest accretionary wedges are commonly found to have the widest (predicted) seismogenic zone. Furthermore, for these margins there exists a substantial (20-60 km wide) region above the up-dip limit whose contribution to tsunami generation is poorly understood. As the rigidity (mu) of these high-porosity sediments is low, co-seismic slip here can be expected to be slow. Accordingly, the contribution to seismic moment will be low, but the contribution to tsunami generation may be very high. Indeed, recent seismological data from Nankai indicate very-low-frequency shallow-thrust earthquakes beneath this portion of the accretionary wedge, long considered to be "aseismic". We propose that thick accumulations of sediment on the downgoing plate and the presence of a thick accretionary wedge can increase the maximum size of the potential rupture fault plane in two ways: 1) by thermally insulating the downgoing plate and thereby increasing the total downdip length of the fault which can rupture seismically, and 2) by "smoothing out" the

  14. Stability assessment of structures under earthquake hazard through GRID technology (United States)

    Prieto Castrillo, F.; Boton Fernandez, M.


This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation, and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits GRID technology and its main advantages (parallel intensive computing, huge storage capacity, and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements/changes to the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The metadata of these records is also stored in the GRID federated database. This metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. In this way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding
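The per-accelerogram computation that each GRID job performs, numerical integration of the structural ODEs to obtain the maximum displacement, can be sketched for a single-degree-of-freedom model. This is a stand-in for the paper's Java implementation; the oscillator parameters and the decaying-sinusoid "record" are illustrative:

```python
import math

def sdof_response(accel, dt, period=0.5, damping=0.05):
    # Damped linear oscillator driven by ground acceleration a_g(t):
    #   x'' + 2*zeta*omega*x' + omega**2 * x = -a_g(t)
    # integrated with a semi-implicit Euler step; returns peak |x|.
    omega = 2.0 * math.pi / period
    x = v = peak = 0.0
    for a_g in accel:
        acc = -a_g - 2.0 * damping * omega * v - omega ** 2 * x
        v += acc * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

# Synthetic decaying-sinusoid "record" standing in for a stored accelerogram.
dt = 0.01
record = [0.3 * math.exp(-0.5 * i * dt) * math.sin(10.0 * i * dt)
          for i in range(1000)]
peak_displacement = sdof_response(record, dt)
```

In the workflow above, each job would run this integration for one accelerogram retrieved from the SE and write the peak displacement back as the job result.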

  15. Global Positioning System data collection, processing, and analysis conducted by the U.S. Geological Survey Earthquake Hazards Program (United States)

    Murray, Jessica R.; Svarc, Jerry L.


    The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
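The time-series analysis step, estimating a constant interseismic velocity together with a coseismic offset, can be sketched with ordinary least squares. This deliberately omits the reference-frame alignment and correlated-noise model described in the abstract, and the synthetic numbers are made up:

```python
import numpy as np

def fit_position_series(t, pos, t_eq):
    # Least-squares fit of one position component with the model
    #   pos(t) = intercept + velocity * t + offset * H(t - t_eq)
    # where H is a step function at the earthquake time t_eq.
    t = np.asarray(t, dtype=float)
    step = np.heaviside(t - t_eq, 1.0)            # 0 before, 1 after the quake
    A = np.column_stack([np.ones_like(t), t, step])
    coef, *_ = np.linalg.lstsq(A, np.asarray(pos, dtype=float), rcond=None)
    return {"intercept": coef[0], "velocity": coef[1], "offset": coef[2]}

# Synthetic series: 5 mm/yr motion with a 20 mm coseismic jump at t = 3 yr.
t = np.linspace(0.0, 6.0, 61)
pos = 2.0 + 5.0 * t + 20.0 * (t >= 3.0)
fit = fit_position_series(t, pos, t_eq=3.0)
```

A real analysis would add postseismic transient terms and propagate the temporally correlated noise model into the velocity uncertainties.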

  16. Decision making based on analysis of benefit versus costs of preventive retrofit versus costs of repair after earthquake hazards (United States)

    Bostenaru Dan, M.


In this presentation, interventions on seismically vulnerable early reinforced concrete skeleton buildings from the interwar period are considered at different performance levels, from avoiding collapse up to assuring immediate post-earthquake functionality. Between these two poles lie degrees of damage depending on the performance aim set. The costs of retrofit and of post-earthquake repair differ depending on the targeted performance. Not only the earthquake but also the retrofit measure has an impact on a heritage building, for example on its appearance or its functional layout. Thus criteria of the structural engineer, the investor, the architect/conservator/urban planner, and the owner/inhabitants of the neighbourhood are considered in taking a benefit-cost decision. A benefit-cost-based decision is one element of a risk management process. A solution must be found for how much change to accept in retrofit and how much repairable damage to take into account. There are two impact studies. Numerical simulation was run for the building typology considered, for successive earthquakes selected in a deterministic way (1977, 1986 and two for 1991 from Vrancea, Romania, and 1978 Thessaloniki, Greece, respectively), considering also the case when retrofit is done between two earthquakes. The building typology itself was studied not only for Greece and Romania but for numerous European countries, including Italy. The typology was compared to earlier reinforced concrete buildings with the Hennebique system, in order to see to what extent these can be counted as structural heritage and to shape the criteria of the architect/conservator. Based on the typology study, two model buildings were designed; for one of these, different retrofit measures (side walls, structural walls, steel braces, steel jacketing) were considered, while for the other, one of these retrofit techniques (diagonal braces, which permits adding also active measures such as energy

  17. An integrated analysis of source parameters, seismogenic structure, and seismic hazards related to the 2014 MS 6.3 Kangding earthquake, China (United States)

    Xie, Zujun; Zheng, Yong; Liu, Chengli; Shan, Bin; Riaz, Muhammad Shahid; Xiong, Xiong


On 22 November 2014, an MS 6.3 earthquake occurred in Kangding County, China. The focal mechanism solution shows that the two nodal planes were 235°/82°/-173° and 144°/83°/-8°, and the focal depth was 9 km. Seismic slip of the Kangding earthquake was bilateral, with about 0.5 m maximum slip. The rupture zone was confined to depths ranging from 5 to 15 km and laterally extended along the slip and strike directions by about 10 and 12 km, respectively. Most of the seismic moment was released in the first 5 s of the rupture, resulting in an earthquake magnitude of MW 6.01. In contrast, a slip model obtained from interferometric synthetic aperture radar (InSAR) data indicates that the rupture zone was longer than that determined from the seismic data and that the earthquake magnitude should be about MW 6.2. Even when accounting for the contribution of the MS 5.8 aftershock and the other small aftershocks that occurred during the InSAR observation period, the total moment estimated from the seismic slip model was significantly smaller than that obtained from the InSAR data. Based on our analysis, we found that the inconsistency between the results determined from the seismic data and the InSAR data may be caused by the decrease in shear modulus at shallow depths, noise in the InSAR data, and the occurrence of some afterslip in the northwest region of the fault zone. The seismic slip of this earthquake was too small to release the accumulated energy within the entire Xianshuihe fault. We also found that the Coulomb stress in the northwest zone of the Kangding-Daofu seismic gap increased as a result of the historical earthquakes as well as the 2008 MS 8.0 Wenchuan and 2014 MS 6.3 Kangding earthquakes, suggesting that this area should be regarded as a high seismic hazard region in the future.
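The Coulomb stress argument in the closing sentence rests on the standard failure-stress change, ΔCFS = Δτ + μ′Δσn; a minimal sketch, with an assumed effective friction coefficient and made-up stress values:

```python
def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
    # dCFS = d_tau + mu' * d_sigma_n, with the normal-stress change taken
    # positive in unclamping; positive dCFS loads the receiver fault
    # toward failure. mu_eff = 0.4 is an assumed effective friction.
    return d_shear_mpa + mu_eff * d_normal_mpa
```

A segment such as the Kangding-Daofu gap is flagged as higher hazard when the cumulative ΔCFS from nearby earthquakes is positive.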

  18. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards (United States)

    McNamara, Daniel E.; Yeck, William; Barnhart, William D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, Amod; Hough, S.E.; Benz, Harley M.; Earle, Paul


The Gorkha earthquake on April 25th, 2015 was a long-anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a ~150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane, with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10–15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest (Mw 7.3) aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered high-hazard for future damaging earthquakes.

  20. Geotechnical hazards from large earthquakes and heavy rainfalls

    CERN Document Server

    Kazama, Motoki; Lee, Wei


    This book is a collection of papers presented at the International Workshop on Geotechnical Natural Hazards held July 12–15, 2014, in Kitakyushu, Japan. The workshop was the sixth in the series of Japan–Taiwan Joint Workshops on Geotechnical Hazards from Large Earthquakes and Heavy Rainfalls, held under the auspices of the Asian Technical Committee No. 3 on Geotechnology for Natural Hazards of the International Society for Soil Mechanics and Geotechnical Engineering. It was co-organized by the Japanese Geotechnical Society and the Taiwanese Geotechnical Society. The contents of this book focus on geotechnical and natural hazard-related issues in Asia such as earthquakes, tsunami, rainfall-induced debris flows, slope failures, and landslides. The book contains the latest information and mitigation technology on earthquake- and rainfall-induced geotechnical natural hazards. By dissemination of the latest state-of-the-art research in the area, the information contained in this book will help researchers, des...

  1. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs (United States)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui


The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records reaching back more than 1000 years and an updated, homogenized and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude Mmax, are estimated using a maximum likelihood algorithm that accounts for the incompleteness of the catalog. To compute the hazard, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years, assuming bedrock conditions. The resulting PGA and SA maps show significant spatio-temporal variation in the hazard values. In general, the hazard is found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
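Maximum-likelihood estimation of the Gutenberg–Richter b value for a catalog complete above a threshold magnitude is commonly done with the Aki/Utsu estimator; a minimal sketch (the bin width is an assumption, and the study's treatment of γ, Mmax, and historical incompleteness is not reproduced here):

```python
import math

def b_value_mle(mags, m_min, dm=0.1):
    # Aki/Utsu maximum-likelihood estimator for a catalog complete
    # above m_min, with magnitudes binned at width dm:
    #   b = log10(e) / (mean(M) - (m_min - dm/2))
    above = [m for m in mags if m >= m_min]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))
```

For a continuous (unbinned) catalog set dm = 0; a Gutenberg-Richter catalog with b = 1 has mean magnitude m_min + log10(e) ≈ m_min + 0.434.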

  2. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information (United States)

    Thompson, K. J.; Krantz, D. H.


The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). Prospect

  3. Salient beliefs about earthquake hazards and household preparedness. (United States)

    Becker, Julia S; Paton, Douglas; Johnston, David M; Ronan, Kevin R


    Prior research has found little or no direct link between beliefs about earthquake risk and household preparedness. Furthermore, only limited work has been conducted on how people's beliefs influence the nature and number of preparedness measures adopted. To address this gap, 48 qualitative interviews were undertaken with residents in three urban locations in New Zealand subject to seismic risk. The study aimed to identify the diverse hazard and preparedness-related beliefs people hold and to articulate how these are influenced by public education to encourage preparedness. The study also explored how beliefs and competencies at personal, social, and environmental levels interact to influence people's risk management choices. Three main categories of beliefs were found: hazard beliefs; preparedness beliefs; and personal beliefs. Several salient beliefs found previously to influence the preparedness process were confirmed by this study, including beliefs related to earthquakes being an inevitable and imminent threat, self-efficacy, outcome expectancy, personal responsibility, responsibility for others, and beliefs related to denial, fatalism, normalization bias, and optimistic bias. New salient beliefs were also identified (e.g., preparedness being a "way of life"), as well as insight into how some of these beliefs interact within the wider informational and societal context. © 2013 Society for Risk Analysis.

  4. Roaming earthquakes in China highlight midcontinental hazards (United States)

    Liu, Mian; Wang, Hui


    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  5. Deterministic and Nondeterministic Behavior of Earthquakes and Hazard Mitigation Strategy (United States)

    Kanamori, H.


    Earthquakes exhibit both deterministic and nondeterministic behavior. Deterministic behavior is controlled by length and time scales such as the dimension of seismogenic zones and plate-motion speed. Nondeterministic behavior is controlled by the interaction of many elements, such as asperities, in the system. Some subduction zones have strong deterministic elements which allow forecasts of future seismicity. For example, the forecasts of the 2010 Mw=8.8 Maule, Chile, earthquake and the 2012 Mw=7.6 Costa Rica earthquake are good examples in which useful forecasts were made within a solid scientific framework using GPS. However, even in these cases, because of the nondeterministic elements, uncertainties are difficult to quantify. In some subduction zones, nondeterministic behavior dominates because of complex plate boundary structures and defies useful forecasts. The 2011 Mw=9.0 Tohoku-Oki earthquake may be an example in which the physical framework was reasonably well understood, but complex interactions of asperities and insufficient knowledge about the subduction-zone structures led to the unexpected tragic consequence. Despite these difficulties, broadband seismology, GPS, and rapid data-processing and telemetry technology can contribute to effective hazard mitigation through a scenario-earthquake approach and real-time warning. A scale-independent relation between M0 (seismic moment) and the source duration, t, can be used for the design of average scenario earthquakes. However, outliers caused by the variation of stress drop, radiation efficiency, and aspect ratio of the rupture plane are often the most hazardous and need to be included in scenario earthquakes. The recent development of real-time technology will help seismologists cope with, and prepare for, devastating tsunamis and earthquakes. Combining a better understanding of earthquake diversity with modern technology is the key to effective and comprehensive hazard mitigation practices.
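    The scale-independent moment-duration relation mentioned in the abstract can be sketched as follows. This is a generic back-of-the-envelope version (t scales with the cube root of M0 under a constant stress drop and rupture velocity); the stress drop and velocity below are hypothetical round numbers, not values from the paper.

```python
# Sketch of a scale-independent M0-duration relation: for constant stress
# drop, rupture length L ~ (M0 / delta_sigma)**(1/3), and duration t ~ L / v.
# delta_sigma = 3 MPa and v = 2.5 km/s are illustrative assumptions.

def source_duration(m0_newton_meters, stress_drop_pa=3e6, rupture_velocity_mps=2500.0):
    """Rough source duration (s) from seismic moment (N*m)."""
    rupture_length_m = (m0_newton_meters / stress_drop_pa) ** (1.0 / 3.0)
    return rupture_length_m / rupture_velocity_mps

def moment_from_magnitude(mw):
    """Seismic moment (N*m) from moment magnitude (standard definition)."""
    return 10.0 ** (1.5 * mw + 9.1)

if __name__ == "__main__":
    # Events named in the abstract: Costa Rica 2012, Maule 2010, Tohoku-Oki 2011
    for mw in (7.6, 8.8, 9.0):
        t = source_duration(moment_from_magnitude(mw))
        print(f"Mw {mw}: ~{t:.0f} s")
```

The outliers the abstract warns about correspond to varying `stress_drop_pa` away from the average: a lower stress drop at fixed M0 gives a longer, slower rupture, the signature of hazardous "tsunami earthquakes."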

  6. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 18. Errors in Probabilistic Seismic Hazard Analysis. (United States)


    hazard is conditional on a given t.a. process representation of seismicity, symbolized here by the random process X(t). However, X(t) is not always known...Regionalized Variables and Its Applications, Les Cahiers du Centre de Morphologie Mathematique de Fontainbleau, No. 5. McGuire, R.K. and Shedlock

  7. Estimation of loss caused by earthquakes and secondary technological hazards

    Directory of Open Access Journals (Sweden)

    N. I. Frolova


    Assessments of expected loss and damage caused by earthquakes and secondary technological accidents are of primary importance for the development and implementation of preventive measure plans, as well as for emergency management immediately after a disaster. The paper addresses procedures for estimating the loss caused by strong events and secondary hazards using information technology. Examples of individual seismic risk zoning at the Russian federal and regional levels are given, as well as estimations of the consequences of scenario earthquakes, taking into account secondary technological hazards.

  8. Harmonized Probabilistic Seismic Hazard Assessment in Europe: Earthquake Geology Applied (United States)

    Woessner, J.; Danciu, L.; Giardini, D.; Share Consortium


    Probabilistic seismic hazard assessment (PSHA) aims to characterize the best available knowledge on seismic hazard of a study area, ideally taking into account all sources of uncertainty. Results from PSHAs form the baseline for informed decision-making and provide essential input to each risk assessment application. SHARE is an EC-FP7 funded project to create a testable time-independent community-based hazard model for the Euro-Mediterranean region. SHARE scientists are creating a model framework and infrastructure for a harmonized PSHA. The results will serve as a reference for the Eurocode 8 application and are envisioned to provide homogeneous input for state-of-the-art seismic safety assessment for critical industry. Harmonizing hazard is pursued on the input data level and the model building procedure across borders and tectonic features of the European-Mediterranean region. An updated earthquake catalog and a harmonized database of seismogenic sources, together with adjusted ground motion prediction equations (GMPEs), form the basis for a borderless assessment. We require transparent and reproducible strategies to estimate parameter values and their uncertainties within the source model assessment and the contributions of the GMPEs. The SHARE model accounts for uncertainties via a logic tree. Epistemic uncertainties within the seismic source model are represented by four source model options, including area sources, fault sources and kernel-smoothing approaches, and aleatory uncertainties for activity rates and maximum magnitudes. Epistemic uncertainties for predicted ground motions are considered by multiple GMPEs as a function of tectonic settings and treated as being correlated. For practical implementation, epistemic uncertainties in the source model (i.e. dip and strike angles) are treated as aleatory, and a mean seismicity model is considered. The final results contain the full distribution of ground motion variability.
This contribution will feature preliminary
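    The logic-tree mechanics described in the abstract reduce to a weighted average over branch hazard curves. A minimal sketch, with hypothetical branch names and numbers (not the SHARE model's actual branches or weights):

```python
# Each epistemic alternative (source model, GMPE, ...) is a logic-tree branch
# with a weight; weights sum to 1, and the mean hazard curve is the weighted
# average of branch curves evaluated at the same ground-motion levels.

def mean_hazard_curve(branches):
    """branches: list of (weight, [annual exceedance rate per PGA level])."""
    total_weight = sum(w for w, _ in branches)
    assert abs(total_weight - 1.0) < 1e-9, "logic-tree weights must sum to 1"
    n_levels = len(branches[0][1])
    return [sum(w * curve[i] for w, curve in branches) for i in range(n_levels)]

# Three hypothetical source-model branches (rates are illustrative):
branches = [
    (0.5, [1e-2, 2e-3, 1e-4]),   # area-source model
    (0.3, [2e-2, 1e-3, 5e-5]),   # fault-source model
    (0.2, [1e-2, 3e-3, 2e-4]),   # kernel-smoothed seismicity
]
print(mean_hazard_curve(branches))
```

Keeping the per-branch curves, rather than only the mean, is what lets the final results "contain the full distribution of ground motion variability," since fractiles can be read off the weighted set of branches.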

  9. USGS Training in Afghanistan: Modern Earthquake Hazards Assessments (United States)

    Medlin, J. D.; Garthwaite, M.; Holzer, T.; McGarr, A.; Bohannon, R.; Bergen, K.; Vincent, T.


    Afghanistan is located in a tectonically active region where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can present a significant hazard, not only from strong ground shaking, but also from liquefaction and extensive land sliding. The magnitude 6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghanistan to such hazards, and resulted in over 1000 fatalities. The USGS has provided the first of a series of Earth Science training courses to the Afghan Geological Survey (AGS). This course was concerned with modern earthquake hazard assessments, and is an integral part of a larger USGS effort to provide a comprehensive seismic-hazard assessment for Afghanistan. Funding for these courses is provided by the US Agency for International Development Afghanistan Reconstruction Program. The particular focus of this training course, held December 2-6, 2006 in Kabul, was on providing a background in the seismological and geological methods relevant to preparing for future earthquakes. Topics included identifying active faults, modern tectonic theory, geotechnical measurements of near-surface materials, and strong-motion seismology. With this background, participants may now be expected to educate other members of the community and be actively involved in earthquake hazard assessments themselves. The December, 2006, training course was taught by four lecturers, with all lectures and slides being presented in English and translated into Dari. Copies of the lectures were provided to the students in both hardcopy and digital formats. Class participants included many of the section leaders from within the AGS who have backgrounds in geology, geophysics, and engineering. Two additional training sessions are planned for 2007, the first entitled "Modern Concepts in Geology and Mineral Resource Assessments," and the second entitled "Applied Geophysics for Mineral Resource Assessments."

  10. Earthquake Hazard and Risk Assessment Based on Unified Scaling Law for Earthquakes: State of Gujarat, India (United States)

    Parvez, Imtiyaz A.; Nekrasova, Anastasia; Kossobokov, Vladimir


    The Gujarat state of India is one of the most seismically active intercontinental regions of the world. Historically, it has experienced many damaging earthquakes, including the devastating 1819 Rann of Kachchh and 2001 Bhuj earthquakes. The effect of the latter is grossly underestimated by the Global Seismic Hazard Assessment Program (GSHAP). To assess a more adequate earthquake hazard for the state of Gujarat, we apply the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter recurrence relation by taking into account the naturally fractal distribution of earthquake loci. USLE has evident implications since any estimate of seismic hazard depends on the size of the territory considered and, therefore, may differ dramatically from the actual one when scaled down to the proportion of the area of interest (e.g. of a city) from the enveloping area of investigation. We cross-compare the seismic hazard maps compiled for the same standard regular grid 0.2° × 0.2° (1) in terms of design ground acceleration based on the neo-deterministic approach, (2) in terms of probabilistic exceedance of peak ground acceleration by GSHAP, and (3) the one resulting from the USLE application. Finally, we present the maps of seismic risks for the state of Gujarat integrating the obtained seismic hazard, population density based on India's Census 2011 data, and a few model assumptions of vulnerability.
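    The downscaling point in the abstract can be made concrete with a small sketch of the USLE form, log10 N(M, L) = A + B·(5 − M) + C·log10 L, where N is the expected annual number of events of magnitude M or larger in a territory of linear size L. The coefficients below are illustrative placeholders, not fitted values for Gujarat.

```python
# Hedged sketch of the Unified Scaling Law for Earthquakes (USLE):
# log10 N(M, L) = A + B*(5 - M) + C*log10(L). With C near 1.2 rather than 2,
# earthquake rates scale sub-dimensionally with territory size because
# epicenters cluster on a fractal set. A, B, C here are assumed round numbers.
import math

def usle_annual_rate(magnitude, linear_size_km, a=-1.0, b=1.0, c=1.2):
    return 10.0 ** (a + b * (5.0 - magnitude) + c * math.log10(linear_size_km))

# The abstract's scaling caveat: halving the linear size of the territory
# lowers the expected rate by 2**C (about 2.3), not by 2**2 = 4 as a uniform
# areal density would suggest.
region = usle_annual_rate(5.0, 100.0)
city = usle_annual_rate(5.0, 50.0)
print(region / city)  # ratio is 2**1.2, roughly 2.3
```

This is why a hazard estimate derived for a large enveloping region can differ dramatically when rescaled to a city-sized area of interest.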

  11. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps (United States)

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha


    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near‐surface shear‐wave velocity model in a 1D equivalent‐linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm‐rock‐site condition, the new probabilistic seismic‐hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess‐covered till and drift deposits) show up to twice the ground‐motion values for peak ground acceleration (PGA), and similar ground‐motion values for 1.0 s spectral acceleration (SA). Probabilistic ground‐motion levels for lowland alluvial floodplain sites (generally the 20–40‐m‐thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground‐motion levels for PGA, and up to three times the ground‐motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%‐in‐50‐year probabilistic ground‐shaking model. The liquefaction hazard ranges from low in the uplands to high (more than 60% of the area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated

  12. Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale (United States)

    Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.


    The main objective of this paper was to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes the background and construction principles of the scale and presents some case studies in different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In this case and in unpopulated areas, ESI offers a unique way of assessing a reliable earthquake intensity. Last but not least, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.

  13. The 2016 Kumamoto Earthquakes: Cascading Geological Hazards and Compounding Risks

    Directory of Open Access Journals (Sweden)

    Katsuichiro Goda


    A sequence of two strike-slip earthquakes occurred on 14 and 16 April 2016 in the intraplate region of Kyushu Island, Japan, apart from subduction zones, and caused significant damage and disruption to the Kumamoto region. The analyses of the regional seismic catalog and available strong motion recordings reveal striking characteristics of the events, such as migrating seismicity, earthquake surface rupture, and major foreshock-mainshock earthquake sequences. To gain valuable lessons from the events, a UK Earthquake Engineering Field Investigation Team (EEFIT) was dispatched to Kumamoto, and earthquake damage surveys were conducted to relate observed earthquake characteristics to building and infrastructure damage caused by the earthquakes. The lessons learnt from the reconnaissance mission have important implications for current seismic design practice regarding the required seismic resistance of structures under multiple shocks and the seismic design of infrastructure subject to large ground deformation. The observations also highlight the consequences of cascading geological hazards on community resilience. To share the gathered damage data widely, geo-tagged photos are organized using Google Earth and the kmz file is made publicly available.

  14. Earthquake induced landslide hazard field observatory in the Avcilar peninsula (United States)

    Bigarre, Pascal; Coccia, Stella; Theoleyre, Fiona; Ergintav, Semih; Özel, Oguz; Yalçinkaya, Esref; Lenti, Luca; Martino, Salvatore; Gamba, Paolo; Zucca, Francesco; Moro, Marco


    Earthquake-triggered landslides have an increasingly disastrous impact in seismic regions due to fast-growing urbanization and infrastructure. Considering only disasters from the last fifteen years, among them the 1999 Chi-Chi earthquake, the 2008 Wenchuan earthquake, and the 2011 Tohoku earthquake, these events generated tens of thousands of coseismic landslides. These resulted in an appalling death toll and considerable damage, affecting the regional landscape including its main hydrological features. Despite a strong impetus in research during past decades, knowledge of these geohazards is still fragmentary, and databases of high-quality observational data are lacking. These phenomena call for further collaborative research aimed at enhancing preparedness and crisis management. The MARSITE project gathers research groups in a comprehensive monitoring activity developed in the Sea of Marmara Region, one of the most densely populated parts of Europe and rated at a high seismic risk level since the 1999 Izmit and Duzce devastating earthquakes. Besides the seismic threat, landslides in Turkey and in this region constitute an important source of loss. The 6th Work Package of the MARSITE project gathers 9 research groups to study earthquake-induced landslides, focusing on two sub-regional areas of high interest, among them the Cekmece-Avcilar peninsula, located west of Istanbul: a highly urbanized, concentrated landslide-prone area, highly susceptible to rainfall while also affected by very significant seismic site effects. A multidisciplinary research program based on pre-existing studies has been designed with objectives and tasks linked to constrain and tackle progressively some challenging issues related to data integration, modeling, monitoring and mapping technologies. Since the start of the project, progress has been made on several important points as follows. The photogeological interpretation and analysis of ENVISAT-ERS DIn

  15. Job Hazard Analysis

    National Research Council Canada - National Science Library


    .... Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...

  16. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GRAMS, W.H.


    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification.

  17. 2016 one-year seismic hazard forecast for the Central and Eastern United States from induced and natural earthquakes (United States)

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.


    The U.S. Geological Survey (USGS) has produced a 1-year seismic hazard forecast for 2016 for the Central and Eastern United States (CEUS) that includes contributions from both induced and natural earthquakes. The model assumes that earthquake rates calculated from several different time windows will remain relatively stationary and can be used to forecast earthquake hazard and damage intensity for the year 2016. This assessment is the first step in developing an operational earthquake forecast for the CEUS, and the analysis could be revised with updated seismicity and model parameters. Consensus input models consider alternative earthquake catalog durations, smoothing parameters, maximum magnitudes, and ground motion estimates, and represent uncertainties in earthquake occurrence and diversity of opinion in the science community. Ground shaking seismic hazard for 1-percent probability of exceedance in 1 year reaches 0.6 g (as a fraction of standard gravity [g]) in northern Oklahoma and southern Kansas, and about 0.2 g in the Raton Basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas. Near some areas of active induced earthquakes, hazard is higher than in the 2014 USGS National Seismic Hazard Model (NSHM) by more than a factor of 3; the 2014 NSHM did not consider induced earthquakes. In some areas, previously observed induced earthquakes have stopped, so the seismic hazard reverts to the 2014 NSHM. Increased seismic activity, whether defined as induced or natural, produces high hazard. Conversion of ground shaking to seismic intensity indicates that some places in Oklahoma, Kansas, Colorado, New Mexico, Texas, and Arkansas may experience damage if the induced seismicity continues unabated. The chance of having Modified Mercalli Intensity (MMI) VI or greater (damaging earthquake shaking) is 5–12 percent per year in north-central Oklahoma and southern Kansas, similar to the chance of damage caused by natural earthquakes

  18. Monitoring Geologic Hazards and Vegetation Recovery in the Wenchuan Earthquake Region Using Aerial Photography

    Directory of Open Access Journals (Sweden)

    Zhenwang Li


    On 12 May 2008, the 8.0-magnitude Wenchuan earthquake occurred in Sichuan Province, China, triggering thousands of landslides, debris flows, and barrier lakes, leading to a substantial loss of life and damage to the local environment and infrastructure. This study aimed to monitor the status of geologic hazards and vegetation recovery in a post-earthquake disaster area using high-resolution aerial photography from 2008 to 2011, acquired from the Center for Earth Observation and Digital Earth (CEODE) of the Chinese Academy of Sciences. The distribution and range of hazards were identified in 15 large, representative geologic hazard areas triggered by the Wenchuan earthquake. After conducting an overlay analysis, the variations of these hazards between successive years were analyzed to reflect the geologic hazard development and vegetation recovery. The results showed that in the first year after the Wenchuan earthquake, debris flows occurred frequently with high intensity. As a result, with the source material becoming less available and the slope structure stabilizing, the intensity and frequency of debris flows gradually decreased with time. The development rate of debris flows between 2008 and 2011 was 3% per year. The lithology played a dominant role in the formation of debris flows, and the topography and hazard size in the earthquake-affected area also had an influence on the debris flow development process. Meanwhile, the overall geologic hazard area decreased at 12% per year, and the vegetation recovery on the landslide mass was 15% to 20% per year between 2008 and 2011. The outcomes of this study provide supporting data for ecological recovery as well as debris flow control and prevention projects in hazard-prone areas.

  19. Seismic hazard in central Italy and the 2016 Amatrice earthquake

    Directory of Open Access Journals (Sweden)

    Carlo Meletti


    The Amatrice earthquake of August 24th, 2016 (Mw 6.0) struck an area that in the national reference seismic hazard model (MPS04) is characterized by expected horizontal peak ground acceleration (PGA) with 10% probability of exceedance in 50 years higher than 0.25 g. After the occurrence of moderate-to-large magnitude earthquakes with a strong impact on the population, such as the L’Aquila 2009 and Emilia 2012 ones (Mw 6.1 and 5.9, respectively), possible underestimations of the seismic hazard by MPS04 were investigated, in order to analyze and evaluate the possible need for its update. One of the most common misunderstandings is to compare recorded PGA only with PGA with 10% probability of exceedance in 50 years. Moreover, by definition, probabilistic models cannot be validated (or rejected) on the basis of a single event. However, comparisons of forecasted shaking with observed data are useful for understanding the consistency of the model. It is then worth highlighting the importance of these comparisons. In fact, MPS04 is the basis for the current Italian building code to provide the effective design procedures and, thus, any modification to the seismic hazard would also affect the building code. In this paper, comparisons between recorded ground motion during the Amatrice earthquake and seismic hazard estimates are performed, showing that the observed accelerations are consistent with the values expected by the MPS04 model.
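    The "10% probability of exceedance in 50 years" criterion quoted above corresponds, under the usual Poisson assumption, to a fixed mean return period via T = −t / ln(1 − p). This is standard hazard arithmetic, not a computation from the paper:

```python
# Convert an exceedance probability p over a time window t into the mean
# return period of the ground motion, assuming Poisson occurrence:
#   p = 1 - exp(-t / T)  =>  T = -t / ln(1 - p)
import math

def return_period_years(p_exceedance, window_years):
    return -window_years / math.log(1.0 - p_exceedance)

print(round(return_period_years(0.10, 50.0)))  # -> 475
```

The familiar 475-year return period behind most national design maps (including MPS04's reference hazard level) is exactly this conversion; comparing one recorded PGA against it, as the abstract notes, says little about the model on its own.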

  20. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.


    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  1. Earthquake hazard in Northeast India – A seismic microzonation ...

    Indian Academy of Sciences (India)

    Department of Geology and Geophysics, Indian Institute of Technology Kharagpur 721 302, India. A comprehensive analytical ..... Figure 4. (a) Predominant frequency and (b) site response distribution maps of the Guwahati region.

  2. Education and earthquake hazard preparedness: How do they fit together?


    Musacchio, G.; Bernhardsdottir, A.E.; Ferreira, M.A.; Falsaperla, S.


    In the context of natural disasters, education is a means of achieving mitigating action against severe damage from different sources. In regions prone to seismic activity, education is only a part of what can be defined as "earthquake hazard preparedness". Nevertheless, it is a significant part indeed, as it involves the building of awareness, the establishment of a culture of prevention, and even the increase of safety when it acts on the process of making future ...

  3. Integrating Real-time Earthquakes into Natural Hazard Courses (United States)

    Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.


    Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made implementing such real-time activities into classroom activities problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response is lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits the tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). 
These students are responsible for reacting to the alarm

  4. Hazard Assessment and Early Warning of Tsunamis: Lessons from the 2011 Tohoku earthquake (United States)

    Satake, K.


    The March 11, 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and one of the best-recorded subduction-zone earthquakes in the world. In particular, various offshore geophysical observations revealed large horizontal and vertical seafloor movements, and the tsunami was recorded on high-quality, high-sampling gauges. Analysis of such tsunami waveforms shows a temporal and spatial slip distribution during the 2011 Tohoku earthquake. The fault rupture started near the hypocenter and propagated into both deep and shallow parts of the plate interface. Very large, ~25 m, slip off Miyagi on the deep part of the plate interface corresponds to an interplate earthquake of M 8.8, similar in location and size to the 869 Jogan earthquake model, and was responsible for the large tsunami inundation in the Sendai and Ishinomaki plains. Huge slip, more than 50 m, occurred on the shallow part near the trench axis ~3 min after the earthquake origin time. This delayed shallow rupture (M 8.8) was similar to the 1896 "tsunami earthquake," and was responsible for the large tsunami on the northern Sanriku coast, measured ~100 km north of the largest slip. Thus the Tohoku earthquake can be decomposed into an interplate earthquake and the triggered "tsunami earthquake." The Japan Meteorological Agency issued a tsunami warning 3 minutes after the earthquake, and saved many lives. However, its initial estimation of tsunami height was an underestimate, because the earthquake magnitude was initially estimated as M 7.9, so the computed tsunami heights were lower. The JMA is attempting to improve the tsunami warning system, including technical developments to estimate the earthquake size within a few minutes by using various and redundant information, to deploy and utilize the offshore tsunami observations, and to issue a warning based on the worst-case scenario if the possibility of a giant earthquake exists. Predicting a trigger of another large earthquake would still be a challenge

  5. Long term (2004-2013) correlation analysis among SSTAs (Significant Sequences of TIR Anomalies) and Earthquakes (M>4) occurrence over Greece: examples of application within a multi-parametric system for continuous seismic hazard monitoring. (United States)

    Tramutoli, Valerio; Coviello, Irina; Eleftheriou, Alexander; Filizzola, Carolina; Genzano, Nicola; Lacava, Teodosio; Lisi, Mariano; Makris, John P.; Paciello, Rossana; Pergola, Nicola; Satriano, Valeria; vallianatos, filippos


    Real-time integration of multi-parametric observations is expected to significantly contribute to the development of operational systems for time-Dependent Assessment of Seismic Hazard (t-DASH) and earthquake short-term (from days to weeks) forecast. However, a very preliminary step in this direction is the identification of those parameters (chemical, physical, biological, etc.) whose anomalous variations can be, to some extent, associated with the complex process of preparation of major earthquakes. In this paper one of these parameters (the Earth's emitted radiation in the Thermal Infra-Red spectral region) is considered for its possible correlation with M≥4 earthquakes that occurred in Greece between 2004 and 2013. The RST (Robust Satellite Technique) data analysis approach and RETIRA (Robust Estimator of TIR Anomalies) index were used to preliminarily define, and then to identify, Significant Sequences of TIR Anomalies (SSTAs) in 10 years (2004-2013) of daily TIR images acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite. Taking into account physical models proposed to justify the existence of a correlation between TIR anomalies and earthquake occurrence, specific validation rules (in line with the ones used by the Collaboratory for the Study of Earthquake Predictability - CSEP - Project) have been defined to drive the correlation analysis process. The analysis shows that more than 93% of all identified SSTAs occur in the pre-fixed space-time window around the time and location of (M≥4) earthquakes, with a false positive rate smaller than 7%. The achieved results, and particularly the very low rate of false positives registered over such a long testing period, seem already sufficient (at least) to qualify TIR anomalies (identified by the RST approach and RETIRA index) among the parameters to be considered in the framework of a multi-parametric approach to time-Dependent Assessment of

  6. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (United States)

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.


    : USGS deterministic seismic hazard analysis program and three Next Generation Ground Motion Attenuation (NGA) models. Ground motion calculations incorporated the potential amplification of seismic shaking by near-surface soils, defined by a map of the average shear-wave velocity in the uppermost 30 m (VS30) developed by CGS. In addition to ground shaking, earthquakes cause ground failure, which can cause severe damage to buildings and lifelines. Ground failure includes surface fault rupture, liquefaction, and seismically induced landslides. For each earthquake scenario, potential surface fault displacements are estimated using deterministic and probabilistic approaches. Liquefaction occurs when saturated sediments lose their strength because of ground shaking. Zones of potential liquefaction are mapped by incorporating areas where loose sandy sediments, shallow groundwater, and strong earthquake shaking coincide in the earthquake scenario. The process for defining zones of potential landslide and rockfall incorporates rock strength, surface slope, existing landslides, and the ground motions caused by the earthquake scenario. Each scenario is illustrated with maps of seismic shaking potential and of fault displacement, liquefaction, and landslide potential. Seismic shaking is depicted by the distribution of shaking intensity, peak ground acceleration, and 1.0-second spectral acceleration. One-second spectral acceleration correlates well with structural damage to surface facilities. Acceleration greater than 0.2 g is often associated with strong to violent perceived ground shaking and may cause moderate to heavy damage. The extent of strong shaking is influenced by subsurface fault dip and near-surface materials. Strong shaking is more widespread in the hanging-wall regions of a normal fault. Larger ground motions also occur where young alluvial sediments amplify the shaking. Both of these effects can lead to strong shaking that extends farther from the fault on the valley side.

  7. Dynamic evaluation of seismic hazard and risks based on the Unified Scaling Law for Earthquakes (United States)

    Kossobokov, V. G.; Nekrasova, A.


    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A + B•(6 - M) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L, A characterizes the average annual rate of strong (M = 6) earthquakes, B determines the balance between magnitude ranges, and C estimates the fractal dimension of the seismic locus in projection onto the Earth's surface. The parameters A, B, and C of the USLE are used to assess, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground shaking parameters. After rigorous testing against the available seismic evidence from the past (e.g., historically reported macro-seismic intensity or paleo data), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure. The hazard maps for a given territory change dramatically when the methodology is applied to a moving time window of a certain size, e.g. about a decade long for an intermediate-term regional assessment, or exponentially increasing intervals for daily local strong-aftershock forecasting. The dynamical assessment of seismic hazard and risks is illustrated by applications to the territory of the Greater Caucasus and Crimea, and to the two-year series of aftershocks of the 11 October 2008 Kurchaloy, Chechnya earthquake, whose case history appears encouraging for further systematic testing as a potential short-term forecasting tool.
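The USLE formula quoted above can be evaluated directly. The sketch below is a minimal illustration; the parameter values A, B, and C are hypothetical placeholders, since in practice they are region-specific and must be fitted to an earthquake catalogue as the abstract describes:

```python
import math

def usle_annual_rate(M, L, A, B, C):
    """Expected annual number of earthquakes of magnitude M within an
    area of linear dimension L, per the USLE:
    log10 N(M, L) = A + B*(6 - M) + C*log10(L)."""
    return 10 ** (A + B * (6 - M) + C * math.log10(L))

# Illustrative (hypothetical) parameters, not fitted to any real region:
A, B, C = -1.0, 0.9, 1.2
n = usle_annual_rate(6.0, 100.0, A, B, C)  # rate of M = 6 events, L = 100 km
print(round(n, 2))  # 25.12 events/yr for these made-up parameters
```

Note how, at M = 6, the B term vanishes and the rate depends only on A and the area term C·log L, consistent with A being defined as the rate of strong (M = 6) events.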

  8. Seismic hazard analysis of Sinop province, Turkey using ...

    Indian Academy of Sciences (India)

    Using earthquakes of magnitude 4.0 and greater which occurred between 1 January 1900 and 31 December 2008 in the Sinop province of Turkey, this study presents a seismic hazard analysis based on probabilistic and statistical methods. According to the earthquake zonation map, Sinop is divided into first, second, third ...

  9. Basic earthquake engineering from seismology to analysis and design

    CERN Document Server

    Sucuoğlu, Halûk


    This book provides senior undergraduate students, master students and structural engineers who do not have a background in the field with core knowledge of structural earthquake engineering that will be invaluable in their professional lives. The basics of seismotectonics, including the causes, magnitude, and intensity of earthquakes, are first explained. Then the book introduces basic elements of seismic hazard analysis and presents the concept of a seismic hazard map for use in seismic design. Subsequent chapters cover key aspects of the response analysis of simple systems and building structures to earthquake ground motions, design spectrum, the adoption of seismic analysis procedures in seismic design codes, seismic design principles and seismic design of reinforced concrete structures. Helpful worked examples on seismic analysis of linear, nonlinear and base-isolated buildings, earthquake-resistant design of frame and frame-shear wall systems are included, most of which can be solved using a hand calcu...

  10. Historical cities and earthquakes: Florence during the last nine centuries and evaluations of seismic hazard

    Directory of Open Access Journals (Sweden)

    G. Ferrari


    Full Text Available The authors' aim in the following study is to contribute to the assessment of the seismic hazard of historical cities. From this preliminary analysis the general characteristics of the seismicity affecting Florence and the evaluation of its seismic hazard may be deduced. Florence is a city of world tourism, and its extraordinary artistic value and its usability constitute a great economic resource. From this perspective, the authors have tackled some aspects of its urban features (demography and main building types, successive phases in the growth of the city, etc.), aimed at the pooling of information as a basis for further, more specific analyses of seismic risk. The study is based on a review of 131 seismic events of potential interest for the site of Florence from the 12th century onward. In the case of each of these earthquakes, it was possible to verify the real seismic effects sustained, and thus to assess the seismic intensity at the site. This also enabled the limits in the application of the standard attenuation laws to be checked. Of all the earthquakes analyzed, those which caused the greatest effects on the urban area have also been identified: namely, the earthquake of 28 September 1453, and those of 18 May and 6 June 1895, both with Io=VIII MCS. From their overall analysis the authors have further extrapolated the data needed to statistically evaluate the probability of any future earthquake occurring, by intensity class.

  11. Evansville Area Earthquake Hazards Mapping Project (EAEHMP) - Progress Report, 2008 (United States)

    Boyd, Oliver S.; Haase, Jennifer L.; Moore, David W.


    Maps of surficial geology, deterministic and probabilistic seismic hazard, and liquefaction potential index have been prepared by various members of the Evansville Area Earthquake Hazard Mapping Project for seven quadrangles in the Evansville, Indiana, and Henderson, Kentucky, metropolitan areas. The surficial geologic maps feature 23 types of surficial geologic deposits, artificial fill, and undifferentiated bedrock outcrop, and include alluvial and lake deposits of the Ohio River valley. Probabilistic and deterministic seismic hazard and liquefaction hazard mapping is made possible by drawing on a wealth of information including surficial geologic maps, water well logs, and in-situ testing profiles from cone penetration tests, standard penetration tests, down-hole shear-wave velocity tests, and seismic refraction tests. These data were compiled and collected with contributions from the Indiana Geological Survey, Kentucky Geological Survey, Illinois State Geological Survey, United States Geological Survey, and Purdue University. Hazard map products are in progress and are expected to be completed by the end of 2009, with a public rollout in early 2010. Preliminary results suggest that there is a 2 percent probability that peak ground accelerations of about 0.3 g will be exceeded in much of the study area within 50 years, which is similar to the firm-rock site value of the 2002 USGS National Seismic Hazard Maps. Accelerations as high as 0.4-0.5 g may be exceeded along the edge of the Ohio River basin. Most of the region outside of the river basin has a low liquefaction potential index (LPI), where the probability that LPI is greater than 5 (that is, there is a high potential for liquefaction) for a M7.7 New Madrid type event is only 20-30 percent. Within the river basin, most of the region has high LPI, where the probability that LPI is greater than 5 for a New Madrid type event is 80-100 percent.

  12. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Directory of Open Access Journals (Sweden)

    C. H. Nelson


    Full Text Available We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazard risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are in (Cascadia) or very close to (San Andreas) the early window for another great earthquake.

    On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins
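The relative-dating (H) method described above converts the hemipelagic sediment thickness deposited between successive turbidites into an inter-event time by dividing it by the hemipelagic sedimentation rate. A minimal sketch, with hypothetical thickness and rate values chosen only so that the result matches the ~550 yr average recurrence cited for northern Cascadia Basin:

```python
def recurrence_time_h_method(hemipelagic_thickness_cm, sed_rate_cm_per_kyr):
    """H method: hemipelagic mud thickness between two turbidites divided by
    the hemipelagic sedimentation rate gives the inter-event time in years."""
    return hemipelagic_thickness_cm / sed_rate_cm_per_kyr * 1000.0

# Hypothetical values: 5.5 cm of hemipelagic mud at 10 cm/kyr
print(recurrence_time_h_method(5.5, 10.0))  # 550.0 yr
```

Because it needs no dateable material, the H method can be applied to every turbidite pair in a core, which is why the abstract calls it the most complete dataset of recurrence times.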

  13. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri (United States)

    Thenhaus, P.C.


    A sequence of three great earthquakes struck the Central United States during the winter of 1811-1812 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt and caused damage at far greater distances than any other earthquakes in U.S. history. The large population currently living within the damage area of these earthquakes means that widespread destruction and loss of life is likely if the sequence were repeated. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern U.S. - in 1755 (Cape Ann, Mass.), 1811-12 (New Madrid, Mo.), 1886 (Charleston, S.C.), and 1897 (Giles County, Va.) - are generally regarded as only historical phenomena (fig. 1). The social memory of these earthquakes no longer exists. A fundamental problem in the Eastern U.S., therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the "tools" that geoscientists have to study the region. The so-called earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of the recurrence times of the earthquakes, their potential size, and the expected ground motions. The term "earthquake risk," on the other hand, refers to aspects of the expected damage to man-made structures and to lifelines as a result of the earthquake hazard.

  14. Crustal structure and Seismic Hazard studies in Nigeria from ambient noise and earthquakes (United States)

    Kadiri, U. A.


    Studies of the crust, upper mantle, and seismic hazard have been carried out in Nigeria using ambient noise and earthquake data. The data were acquired from stations in Nigeria and from international agencies. Firstly, known depths of sediments in the Lower Benue Trough (LBT) were collected from wells; the resonance frequency (Fo) and average shear-wave velocities (Vs) were then computed using MATLAB. Secondly, average velocities were estimated from noise cross-correlation between seismic stations. Thirdly, the Moho depths beneath the Ife, Kaduna and Nsukka stations were estimated, as well as the Vp/Vs ratio, using a 2009 earthquake with epicenter in Nigeria. Finally, statistical methods and Probabilistic Seismic Hazard Assessment (PSHA) were used to compute seismic hazard parameters in Nigeria and its surroundings. The results showed that soils on the LBT, with an average shear-wave velocity of about 5684 m/s, would experience more amplification in case of an earthquake, compared to the basement complex in Nigeria. The Vs beneath the seismic stations in Nigeria were also estimated as 288 m/s, 1019 m/s, 940.6 m/s and 255.02 m/s in Ife, Nsukka, Awka, and Abakaliki respectively. The average velocity along the station paths was 4.5 km/s, and the Vp and Vs for the 100-500 km depth profile in parts of South West Nigeria increased from about 5.83 to 6.42 km/s and from 3.48 to 6.31 km/s respectively, with the Vp/Vs ratio decreasing from 1.68 to 1.02. Statistical analysis revealed a trend of increasing earthquake occurrence along the Mid-Atlantic Ridge, tending toward the West African region. The PSHA analysis shows the likelihood of earthquakes of different magnitudes occurring in Nigeria and other parts of West Africa in the future. This work is aimed at addressing critical issues regarding site-effect characterization, improved earthquake location, and robust seismic hazard assessment for planning in the choice of sites for critical facilities in Nigeria.
Keywords: Sediment thickness, Resonance Frequency, Average Velocity, Seismic Hazard, Nigeria
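The abstract computes resonance frequency (Fo) from sediment depth and shear-wave velocity. A common relation for this (an assumption here, the abstract does not state which formula was used) is the quarter-wavelength rule Fo = Vs / (4H) for a sediment layer of thickness H over rigid basement:

```python
def resonance_frequency(vs_m_per_s, thickness_m):
    """Fundamental site resonance frequency (Hz) via the quarter-wavelength
    rule Fo = Vs / (4H) for a single soft layer over bedrock."""
    return vs_m_per_s / (4.0 * thickness_m)

# Hypothetical column: 200 m of sediment with Vs = 288 m/s (the Ife value above)
print(round(resonance_frequency(288.0, 200.0), 2))  # 0.36 Hz
```

Thicker or softer sediment columns resonate at lower frequencies, which is why deep sedimentary troughs like the LBT are the sites of concern for long-period amplification.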

  15. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)



    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, ``Process Safety Management for Highly Hazardous Chemicals`` and ``Chemical Process Hazards Analysis,`` is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, ``Chemical Process Hazards Analysis,`` is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine whether a facility is covered by the PSM Rule, the reader should refer to the handbook ``Process Safety Management for Highly Hazardous Chemicals`` (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  16. Preparation of Synthetic Earthquake Catalogue and Tsunami Hazard Curves in Marmara Sea using Monte Carlo Simulations (United States)

    Bayraktar, Başak; Özer Sözdinler, Ceren; Necmioǧlu, Öcal; Meral Özel, Nurcan


    The Marmara Sea and its surroundings form one of the most populated areas in Turkey. Many densely populated cities, such as the megacity Istanbul with a population of more than 14 million, as well as a great number of large-capacity industrial facilities, refineries, ports and harbors, are located along the coasts of the Marmara Sea. The region is highly seismically active. There has been a wide range of studies in this region regarding the fault mechanisms, seismic activity, earthquakes and triggered tsunamis in the Sea of Marmara. Historical documents reveal that the region has experienced many earthquakes and tsunamis in the past. According to Altinok et al. (2011), 35 tsunami events happened in the Marmara Sea between 330 BC and 1999. As earthquakes are expected in the Marmara Sea with the breaking of segments of the North Anatolian Fault (NAF) in the future, the region should be investigated in terms of the possibility of tsunamis caused by earthquakes with specific return periods. This study aims to carry out probabilistic tsunami hazard analysis in the Marmara Sea. For this purpose, the possible sources of tsunami scenarios are specified by compiling the earthquake catalogues, historical records and scientific studies conducted in the region. After compiling all these data, a synthetic earthquake and tsunami catalogue is prepared using Monte Carlo simulations. For specific return periods, the possible epicenters, rupture lengths, widths and displacements are determined with Monte Carlo simulations, assuming the angles of the fault segments to be deterministic. For each earthquake of the synthetic catalogue, the tsunami wave heights will be calculated at specific locations along the Marmara Sea. As a further objective, this study will determine the tsunami hazard curves for specific locations in the Marmara Sea, including the tsunami wave heights and their probability of exceedance. This work is supported by SATREPS-MarDim Project (Earthquake and Tsunami Disaster Mitigation in the
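A synthetic catalogue of the kind described can be sketched by inverse-transform sampling of magnitudes from a truncated Gutenberg-Richter distribution. Everything below (the b-value, the magnitude bounds, the sample size) is a hypothetical illustration, not the actual parameters of the study:

```python
import math
import random

def sample_gr_magnitude(rng, b=1.0, m_min=5.0, m_max=7.4):
    """Inverse-transform sample from a doubly truncated Gutenberg-Richter
    magnitude distribution; b-value and bounds are hypothetical."""
    beta = b * math.log(10.0)                    # convert b-value to natural-log rate
    c = 1.0 - math.exp(-beta * (m_max - m_min))  # truncation normalization
    u = rng.random()
    return m_min - math.log(1.0 - u * c) / beta

rng = random.Random(42)  # fixed seed so the synthetic catalogue is reproducible
catalogue = [sample_gr_magnitude(rng) for _ in range(10000)]
print(min(catalogue) >= 5.0 and max(catalogue) <= 7.4)  # True
```

In a full Monte Carlo workflow each sampled magnitude would then be paired with a sampled epicenter and rupture geometry on a NAF segment, and fed to a tsunami propagation model to build the hazard curve.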

  17. Echo-sounding method aids earthquake hazard studies (United States)



    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasize the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  18. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra


    Full Text Available A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. The distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events, considering epistemic uncertainty in ground-motion modeling by using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen's seismic hazard are the events from the West Arabian Shield seismic zone.
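The quoted correspondence between a 10% probability of exceedance in 50 years and a 475-year return period follows from the standard Poisson assumption, under which T = -t / ln(1 - p):

```python
import math

def return_period(p_exceed, t_years):
    """Poisson-model return period T for exceedance probability p in t years:
    p = 1 - exp(-t/T), so T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50.0)))  # 475 yr (the 10%-in-50-yr level above)
print(round(return_period(0.50, 50.0)))  # 72 yr (the 50%-in-50-yr level)
```

The same relation in reverse converts any map's return period back into an exceedance probability for a chosen exposure time.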

  19. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies (United States)

    Arora, Shreya; Malik, Javed N.


    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), the 1934 Bihar-Nepal earthquake (Mw 8.2), the 1950 Assam earthquake (Mw 8.4), the 2005 Kashmir earthquake (Mw 7.6), and the 2015 Gorkha earthquake (Mw 7.8), is testimony to ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return periods. Some of the large magnitude earthquakes produced surface rupture, while some remained blind. Furthermore, due to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, it is certainly difficult to precisely determine the extent of surface rupture of these earthquakes, and also of those events that occurred in historic times. In this paper, we have compiled the paleoseismological data and recalibrated the radiocarbon ages from the trenches excavated by previous workers along the entire Himalaya, and compared the present earthquake scenario with the past. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting Mw to 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. The large magnitude Himalayan earthquakes, such as the 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam events, occurred within a time frame of 45 years; if such events were dated with an uncertainty of ±50 years, they could be mistaken for the remnants of one giant earthquake rupturing the entire Himalayan arc, leading to an overestimation of the seismic hazard scenario in the Himalaya.

  20. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions (United States)

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.


    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10^-9/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. Together, these observations may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than stresses built up by plate-tectonic loading. The latter model generally underlies the basic assumptions made in earthquake hazard assessment: that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.

  1. Hazard assessment of long-period ground motions for the Nankai Trough earthquakes (United States)

    Maeda, T.; Morikawa, N.; Aoi, S.; Fujiwara, H.


    100 m in horizontal and vertical, respectively. The grid spacing for the deep region is three times coarser. The total number of grid points is about three billion. The 3-D underground structure model used in the FD simulation is the Japan integrated velocity structure model (ERC, 2012). Our simulation is valid for periods longer than two seconds, given the lowest S-wave velocity and the grid spacing. However, because the characterized source model may not sufficiently represent short-period components, the reliable period range of this simulation should be interpreted with caution. Therefore, we consider periods longer than five seconds, instead of two seconds, for further analysis. We evaluate the long-period ground motions using velocity response spectra for the period range between five and 20 seconds. The preliminary simulation shows a large variation of response spectra at a given site. This large variation implies that the ground motion is very sensitive to the scenario considered, and that the variation must be studied to understand the seismic hazard. Our further study will obtain the hazard curves for the Nankai Trough earthquakes (M 8-9) by applying probabilistic seismic hazard analysis to the simulation results.

  2. Counterfactual Volcano Hazard Analysis (United States)

    Woo, Gordon


    The historical database of past disasters is a cornerstone of catastrophe risk assessment. While disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally meant 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive of a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and to see regularity in event patterns that are illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using a simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios.
However
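The 11% figure in the dice paradigm above is simply the chance of twelve consecutive non-sixes; a minimal Python check (the monthly-roll framing is from the abstract):

```python
# Chance that a monthly roll of a fair die yields no six all year:
# a "disaster-free" year despite twelve near-miss opportunities.
p_event = 1 / 6      # probability a near-miss becomes a disaster each month
months = 12

p_no_event = (1 - p_event) ** months
print(f"P(no event in a year) = {p_no_event:.3f}")  # ~0.112, i.e. ~11%
```

A Monte Carlo simulation over many counterfactual "years" converges to the same value, which is the essence of the stochastic re-simulation of history described above.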

  3. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    Energy Technology Data Exchange (ETDEWEB)

    Wong, I.G.; Green, R.K.; Sun, J.I. [Woodward-Clyde Federal Services, Oakland, CA (United States); Pezzopane, S.K. [Geological Survey, Denver, CO (United States); Abrahamson, N.A. [Abrahamson (Norm A.), Piedmont, CA (United States); Quittmeyer, R.C. [Woodward-Clyde Federal Services, Las Vegas, NV (United States)


    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (M{sub w}) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to M{sub w} 6 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904--1994 historical record, which contains events up to M{sub w} 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributor to the ground shaking hazard at Yucca Mountain is background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than the five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and the path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.
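Under the stationary Poisson assumption that underlies such PSHA results, the quoted return periods translate into exceedance probabilities over a chosen exposure time. A hedged sketch: the PGA/return-period pairs are from the abstract, while the 50-year exposure window is an assumption for illustration:

```python
import math

# Convert return periods into probabilities of exceeding each PGA level
# during a finite exposure time, assuming a stationary Poisson process:
# P(exceedance) = 1 - exp(-t / T), where T is the return period.
return_periods = {0.16: 500, 0.21: 1000, 0.28: 2000, 0.50: 10000}  # g : years
exposure = 50  # years (assumed design life, not from the paper)

for pga, T in return_periods.items():
    p_exceed = 1 - math.exp(-exposure / T)
    print(f"PGA {pga:.2f} g (T = {T:>5d} yr): "
          f"P(exceedance in {exposure} yr) = {p_exceed:.1%}")
```

For example, the 500-year level (0.16 g) has roughly a 10% chance of being exceeded in 50 years, the conventional benchmark for ordinary building codes.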

  4. Tsunami Hazard Assessment of Coastal South Africa Based on Mega-Earthquakes of Remote Subduction Zones (United States)

    Kijko, Andrzej; Smit, Ansie; Papadopoulos, Gerassimos A.; Novikova, Tatyana


    After the mega-earthquakes and concomitant devastating tsunamis in Sumatra (2004) and Japan (2011), we launched an investigation into the potential risk of tsunami hazard to the coastal cities of South Africa. This paper presents the analysis of the seismic hazard of seismogenic sources that could potentially generate tsunamis, as well as the analysis of the tsunami hazard to coastal areas of South Africa. The subduction zones of Makran, South Sandwich Island, Sumatra, and the Andaman Islands were identified as possible sources of mega-earthquakes and tsunamis that could affect the African coast. Numerical tsunami simulations were used to investigate the realistic and worst-case scenarios that could be generated by these subduction zones. The simulated tsunami amplitudes and run-up heights calculated for the coastal cities of Cape Town, Durban, and Port Elizabeth are relatively small and therefore pose no real risk to the South African coast. However, only distant tsunamigenic sources were considered and the results should therefore be viewed as preliminary.

  5. The effect of earthquake hazards induced by natural gas mining on Medically Unexplained Physical Symptoms and psychosocial problems: a longitudinal analysis.

    NARCIS (Netherlands)

    Duckers, M.L.; Yzermans, J.


    Study/Objective: To determine whether the chronic threat of exposure to mining-induced earthquakes in the northern part of the Netherlands is accompanied by a higher prevalence of Medically Unexplained Physical Symptoms (MUPS) and psychosocial problems. Background: The Groningen natural gas field

  6. Recent research in earth structure, earthquake and mine seismology, and seismic hazard evaluation in South Africa

    CSIR Research Space (South Africa)

    Wright, C


    Full Text Available Research in earth structure, earthquake and mine seismology, and seismic hazard evaluation in South Africa is summarized for the last four years. Improvements to the South African National Seismograph Network (SANSN) include the gradual replacement...

  7. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India (United States)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.


    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, have shown that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are by far unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to PSHA. The advanced approach considered in this study, namely NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full-waveform modelling, based on the ability to efficiently compute synthetic seismograms in complex, laterally heterogeneous anelastic media. In this way a set of ground-motion scenarios can be defined at both national and local scale, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. 
At the scenario scale, quick parametric studies can be easily

  8. The 24th January 2016 Hawassa earthquake: Implications for seismic hazard in the Main Ethiopian Rift (United States)

    Wilks, Matthew; Ayele, Atalay; Kendall, J.-Michael; Wookey, James


    Earthquakes of low to intermediate magnitudes are a commonly observed feature of continental rifting, particularly in regions of Quaternary to Recent volcanism such as the Main Ethiopian Rift (MER). Although the seismic hazard is estimated to be lower in the Hawassa region of the MER than further north and south, a significant earthquake occurred on the 24th January 2016 in the Hawassa caldera basin, close to the Corbetti volcanic complex. The event was felt up to 100 km away and caused structural damage and public anxiety in the city of Hawassa itself. In this paper we first refine the earthquake's location using data from global network and Ethiopian network stations. The resulting location is 7.0404°N, 38.3478°E at 4.55 km depth, which suggests that the event occurred on structures associated with the collapse of the Hawassa caldera in the early Pleistocene, and not through volcano-tectonic processes at Corbetti. At four stations we calculate local (ML) and moment (MW) magnitudes, scales more appropriate than mb at regional hypocentral distances, using a local scale (attenuation term) previously determined for the MER and spectral analysis, respectively; this gives magnitude estimates of 4.68 and 4.29. The event indicates predominantly normal slip on a N-S striking fault structure, which suggests that slip continues to occur on Wonji faults that have exploited weaknesses inherited from the preceding caldera collapse. These results, and two previous earthquakes of M > 5 in the Hawassa caldera, highlight that earthquakes continue to pose a risk to structures within the caldera basin. With this in mind, it is suggested that enhanced monitoring and public outreach should be considered.
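For context on the MW estimate, the standard moment-magnitude relation can be sketched. Note that the seismic moment below is back-computed purely for illustration (an assumption); the paper derives MW = 4.29 from spectral analysis, not from this value:

```python
import math

# Standard moment-magnitude definition (Hanks & Kanamori):
#   M_w = (2/3) * (log10(M0) - 9.1), with seismic moment M0 in N*m.
def moment_magnitude(m0_newton_metres: float) -> float:
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# Seismic moment consistent with the paper's M_w = 4.29 (illustrative only).
m0 = 10 ** (1.5 * 4.29 + 9.1)
print(f"M0 = {m0:.2e} N*m -> M_w = {moment_magnitude(m0):.2f}")
```

The same relation explains why MW saturates far less than mb at regional distances: it is tied directly to the seismic moment rather than to a band-limited amplitude.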

  9. Earthquake hazards of active blind-thrust faults under the central Los Angeles basin, California (United States)

    Shaw, John H.; Suppe, John


    We document several blind-thrust faults under the Los Angeles basin that, if active and seismogenic, are capable of generating large earthquakes (M = 6.3 to 7.3). Pliocene to Quaternary growth folds imaged in seismic reflection profiles record the existence, size, and slip rates of these blind faults. The growth structures have shapes characteristic of fault-bend folds above blind thrusts, as demonstrated by balanced kinematic models, geologic cross sections, and axial-surface maps. We interpret the Compton-Los Alamitos trend as a growth fold above the Compton ramp, which extends along strike from west Los Angeles to at least the Santa Ana River. The Compton thrust is part of a larger fault system, including a decollement and ramps beneath the Elysian Park and Palos Verdes trends. The Cienegas and Coyote Hills growth folds overlie additional blind thrusts in the Elysian Park trend that are not closely linked to the Compton ramp. Analysis of folded Pliocene to Quaternary strata yields slip rates of 1.4 ± 0.4 mm/yr on the Compton thrust and 1.7 ± 0.4 mm/yr on a ramp beneath the Elysian Park trend. Assuming that slip is released in large earthquakes, we estimate magnitudes of 6.3 to 6.8 for earthquakes on individual ramp segments based on geometric segment sizes derived from axial surface maps. Multiple-segment ruptures could yield larger earthquakes (M = 6.9 to 7.3). Relations among magnitude, coseismic displacement, and slip rate yield an average recurrence interval of 380 years for single-segment earthquakes and a range of 400 to 1300 years for multiple-segment events. If these newly documented blind thrust faults are active, they will contribute substantially to the seismic hazards in Los Angeles because of their locations directly beneath the metropolitan area.
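The single-segment recurrence estimate above follows the usual slip-budget logic: recurrence interval ≈ coseismic displacement / slip rate. A hedged sketch, in which the average coseismic slip is an illustrative assumption, not a value from the paper:

```python
# Slip-budget recurrence estimate: if all slip on a ramp segment is released
# in characteristic earthquakes, recurrence ~ coseismic slip / slip rate.
slip_rate_mm_per_yr = 1.4   # Compton thrust slip rate (from the abstract)
coseismic_slip_m = 0.53     # assumed average slip for a single-segment event

recurrence_yr = coseismic_slip_m * 1000.0 / slip_rate_mm_per_yr
print(f"Recurrence ~ {recurrence_yr:.0f} years")
```

With ~0.5 m of slip per event this gives a few hundred years, the same order as the 380-year single-segment interval quoted in the abstract.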

  10. Non-Poissonian earthquake occurrence in coupled stress release models and its effect on seismic hazard (United States)

    Kuehn, N. M.; Hainzl, S.; Scherbaum, F.


    Most seismic hazard estimations are based on the assumption of a Poisson process for earthquake occurrence, even though both observations and models indicate a departure of real seismic sequences from this simplistic assumption. Instrumental earthquake catalogues show earthquake clustering on regional scales, while the elastic rebound theory predicts a periodic recurrence of characteristic earthquakes on individual faults over longer timescales. Recent implementations of time-dependent hazard calculations in California and Japan are based on quasi-periodic recurrences of fault ruptures according to renewal models such as the Brownian Passage Time model. However, these renewal models neglect earthquake interactions and the dependence on the stressing history, which might destroy any regularity of earthquake recurrences in reality. To explore this, we investigate the (coupled) stress release model, a stochastic version of the elastic rebound hypothesis. In particular, we are interested in the time-variability of the occurrence of large earthquakes and its sensitivity to the occurrence of Gutenberg-Richter type earthquake activity and fault interactions. Our results show that in general large earthquakes occur quasi-periodically in the model: the occurrence probability of large earthquakes is strongly decreased shortly after a strong event and becomes constant on longer timescales. Although possible stress interaction between adjacent fault zones does not affect the recurrence-time distributions in each zone significantly, it leads to a temporal clustering of events on larger regional scales. The non-random characteristics, especially the quasi-periodic behaviour of large earthquakes, are even more pronounced if stress changes due to small earthquakes are less important. The recurrence-time distribution for the largest events is characterized by a coefficient of variation from 0.6 to 0.84, depending on the relative importance of small earthquakes.
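The coefficient of variation (CV = standard deviation / mean of inter-event times) quantifies the quasi-periodicity discussed above: a Poisson process has CV = 1, while CV < 1 indicates quasi-periodic recurrence. A hedged sketch using synthetic inter-event times (the distributions are illustrative stand-ins, not the stress release model itself):

```python
import random
import statistics

random.seed(1)

def cv(samples):
    """Coefficient of variation: population std / mean of inter-event times."""
    return statistics.pstdev(samples) / statistics.mean(samples)

# Poisson occurrence: exponential inter-event times, CV = 1.
poisson_like = [random.expovariate(1.0) for _ in range(100_000)]

# Quasi-periodic occurrence: gamma inter-event times with shape k have
# CV = 1/sqrt(k); shape 2 gives CV ~ 0.71, inside the paper's 0.6-0.84 range.
quasi_periodic = [random.gammavariate(2.0, 0.5) for _ in range(100_000)]

print(f"Poisson-like CV:    {cv(poisson_like):.2f}")
print(f"Quasi-periodic CV:  {cv(quasi_periodic):.2f}")
```

Fitting a renewal distribution to simulated (or palaeoseismic) recurrence times and reading off its CV is the usual way such model behaviour is summarized.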

  11. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan)

    Directory of Open Access Journals (Sweden)

    G. Babayev


    Full Text Available A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides, significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increasing vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information to identify the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown area. The PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of urban population, exposure, and the pattern of peak ground acceleration all contribute to the seismic risk, while the vulnerability factors play the most prominent role for all earthquake scenarios. Our results can support strategic countermeasure plans for earthquake risk mitigation in the city of Baku.
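The risk "convolution" described above can be caricatured as a per-cell combination of hazard, vulnerability, and exposure. All numbers below are illustrative assumptions, not values from the study:

```python
# Toy per-cell risk index: hazard (PGA) x vulnerability x exposure.
# Cell names echo the areas mentioned in the abstract; the values are invented.
cells = [
    {"name": "southern coast", "pga_g": 0.40, "vulnerability": 0.8, "exposure": 1.0e6},
    {"name": "downtown",       "pga_g": 0.30, "vulnerability": 0.6, "exposure": 2.0e6},
    {"name": "north-east",     "pga_g": 0.35, "vulnerability": 0.7, "exposure": 0.5e6},
]

for c in cells:
    risk = c["pga_g"] * c["vulnerability"] * c["exposure"]  # relative risk index
    print(f"{c['name']:>14s}: risk index = {risk:,.0f}")
```

The point of such an index is comparative: it ranks areas for mitigation priority rather than predicting absolute losses.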

  12. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used? (United States)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.


    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized as inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  13. Assessment of earthquake hazard in Turkey and neighboring regions

    Directory of Open Access Journals (Sweden)

    G. Birgoren


    Full Text Available The aim of this study is to conduct a probabilistic seismic hazard analysis for Turkey and neighboring regions, using the most recently developed attenuation relationships. The seismicity database is compiled from numerous sources, and the tectonic setting of the region has been studied in detail. Utilizing these two major categories of information together with the selected attenuation relationships, the seismic source zones are determined, and PGA contour maps are produced for specific return periods. The study is intended to serve as a reference for more advanced approaches and to stimulate discussion and suggestions on the database, assumptions and the inputs, and to pave the way for the probabilistic assessment of seismic hazard in the site selection and the design of engineering structures.

  14. Earthquakes (United States)

    Shedlock, Kaye M.; Pakiser, Louis Charles


    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  15. Wenchuan Earthquake Surface Fault Rupture and Disaster: A Lesson on Seismic Hazard Assessment and Mitigation

    Directory of Open Access Journals (Sweden)

    Yi Du


    Full Text Available The Ms 8.0 Wenchuan earthquake occurred along the Longmenshan Faults in China and was a great disaster. Most of the damage and casualties during the quake were concentrated along surface rupture zones: the 240-km-long Beichuan-Yingxiu Fault and the 70-km-long Jiangyou-Guanxian Fault. Although the Longmenshan Faults are well known and studied, the surface fault ruptures were not considered in mitigation planning, and the associated ground-motion hazard was therefore underestimated. Not considering fault rupture and underestimating the ground-motion hazard contributed to the disastrous effects of the earthquake. The lesson of the Wenchuan earthquake disaster is that fault rupture hazard must be assessed and considered in mitigation. Furthermore, the deterministic approach is more appropriate for fault rupture hazard assessment than the probabilistic approach.

  16. The Wenchuan, China M8.0 Earthquake: A Lesson and Implication for Seismic Hazard Mitigation (United States)

    Wang, Z.


    The Wenchuan, China M8.0 earthquake caused great damage and heavy casualties: 69,197 people were killed, 374,176 were injured, and 18,341 are still missing. The estimated direct economic loss is about 126 billion U.S. dollars. The Wenchuan earthquake again demonstrated that the earthquake itself does not kill people; the built environment and induced hazards, landslides in particular, do. Therefore, it is critical to strengthen the built environment, such as buildings and bridges, and to mitigate the induced hazards in order to avoid such disasters. As part of the so-called North-South Seismic Zone in China, the Wenchuan earthquake occurred along the Longmen Shan thrust belt, which forms a boundary between the Qinghai-Tibet Plateau and the Sichuan basin, and there is a long history (~4,000 years) of seismicity in the area. The historical records show that the area experienced high intensity (i.e., greater than IX) in the past several thousand years. In other words, the area is well known to have high seismic hazard because of its tectonic setting and seismicity. However, only intensity VII (0.1 to 0.15 g PGA) was considered in seismic design for the built environment in the area. This was one of the main reasons that so many buildings collapsed, particularly school buildings, during the Wenchuan earthquake. It is clear that the seismic design (i.e., the design ground motion or intensity) was not adequate in the area stricken by the Wenchuan earthquake. Lessons can be learned from the Wenchuan earthquake on seismic hazard and risk assessment, as well as on seismic hazard mitigation and seismic risk reduction.

  17. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008 (United States)

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.


    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: This progress report summarizes the

  18. Retrospective analysis of the Spitak earthquake

    Directory of Open Access Journals (Sweden)

    A. K. Tovmassian


    Full Text Available Based on the retrospective analysis of numerous data and studies of the Spitak earthquake, the present work attempts to shed light on different aspects of that catastrophic seismic event which occurred in Northern Armenia on December 7, 1988. The authors follow a chronological order of presentation, namely: changes in the geosphere, atmosphere, and biosphere during the preparation of the Spitak earthquake; foreshocks; main shock; aftershocks; focal mechanisms; historical seismicity; seismotectonic position of the source; strong motion records; site effects; the macroseismic effect; collapse of buildings and structures; rescue activities; earthquake consequences; and the lessons of the Spitak earthquake.

  19. Aftereffects of Subduction-Zone Earthquakes: Potential Tsunami Hazards along the Japan Sea Coast. (United States)

    Minoura, Koji; Sugawara, Daisuke; Yamanoi, Tohru; Yamada, Tsutomu


    The 2011 Tohoku-Oki Earthquake is a typical subduction-zone earthquake and the 4th largest earthquake since the beginning of instrumental observation of earthquakes in the 19th century. The 2011 Tohoku-Oki Earthquake displaced the northeast Japan island arc horizontally and vertically, and the displacement largely changed the tectonic situation of the arc from compressive to tensile. The 9th century in Japan was a period of natural hazards caused by frequent large-scale earthquakes. The aseismic tsunamis that inflicted damage on the Japan Sea coast in the 11th century were related to the occurrence of massive earthquakes that represented the final stage of a period of high seismic activity. Anti-compressive tectonics triggered by the subduction-zone earthquakes induced gravitational instability, which resulted in the generation of tsunamis caused by slope failure at the arc-back-arc boundary. The crustal displacement after the 2011 earthquake implies an increased risk of unexpected local tsunami flooding in Japan Sea coastal areas.

  20. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone (United States)

    Boyd, Oliver Salz; Magistrale, Harold


    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011. Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report.

  1. Recent destructive earthquakes and international collaboration for seismic hazard assessment in the East Asia region (United States)

    Hao, K.; Fujiwara, H.


    Recent destructive earthquakes in East Asia have claimed a third of a million lives. People learned from the lessons, but the lessons were forgotten after generations, even those carved in stone. Probabilistic seismic hazard assessment (SHA) is considered a scientific way to define earthquake zones and to guide urban planning and construction. NIED has promoted SHA as a national mission of Japan for over 10 years, and as international cooperation with neighboring countries since the 2008 Wenchuan earthquake. We initiated the China-Japan-Korea SHA strategic cooperative program for the next-generation map, supported by MOST-JST-NRF, in 2010. We also initiated a cooperative program with the Taiwan Earthquake Model in 2012, as well as with many other parties in the world. Consequently, NIED joined the Global Earthquake Model (GEM), as its SHA methodologies and technologies were highly valued. As a representative of Japan, NIED will continue to work closely with all members of GEM, not only on the GEM global components but also on its regional programs. Seismic hazard assessment has to be carried out with the existing information and its epistemic uncertainty. We routinely improve the existing models to carefully treat active faults, earthquake records, and magnitudes under the newest authorized information provided by the Earthquake Research Committee, Headquarters for Earthquake Research Promotion. After the 2011 Tohoku earthquake, we have been re-considering the national SHA maps even for long-term, low-probability cases. We have set up a platform to exchange SHA information and share our experiences, lessons, and knowledge internationally. Probabilistic SHA concepts and seismic risk mitigation issues need to be promoted internationally through outreach and the media. (Figure: major earthquakes in the East Asian region which claimed a third of a million lives; slab depth with contours, Hayes et al., 2011.)

  2. MGR External Events Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    L. Booth


    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design; Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  3. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation (United States)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.


    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  4. Turning the rumor of the May 11, 2011, earthquake prediction in Rome, Italy, into an information day on earthquake hazard

    Directory of Open Access Journals (Sweden)

    Concetta Nostro


    Full Text Available A devastating earthquake was predicted to hit Rome on May 11, 2011. This prediction was never officially released, but it grew on the internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this fed the credibility of the earthquake prediction. During the months preceding May 2011, the Istituto Nazionale di Geofisica e Vulcanologia (INGV was overwhelmed with requests for information about this prediction, by the inhabitants of Rome and by tourists. Given the echo of this earthquake prediction, on May 11, 2011, the INGV decided to organize an Open Day at its headquarters in Rome, to inform the public about Italian seismicity and earthquake physics. The Open Day was preceded by a press conference two days before, to talk with journalists about this prediction, and to present the Open Day. During this ‘Day’, 13 new videos were also posted on our YouTube/INGVterremoti channel to explain earthquake processes and hazards, and to provide periodic updates on seismicity in Italy from the seismicity monitoring room. On May 11, 2011, the INGV headquarters was peacefully invaded by over 3,000 visitors, from 10:00 am to 9:00 pm: families, students with and without teachers, civil protection groups, and many journalists. This initiative, which was put together in a few weeks, received very substantial feedback, and was a great opportunity to talk with journalists and the public about earthquake prediction and, more generally, about seismic risk in Italy.

  5. Earthquake induced landslide hazard: a multidisciplinary field observatory in the Marmara SUPERSITE (United States)

    Bigarré, Pascal


    Earthquake-triggered landslides have an increasingly disastrous impact in seismic regions due to fast-growing urbanization and infrastructure. Considering only disasters from the last fifteen years, among which the 1999 Chi-Chi earthquake, the 2008 Wenchuan earthquake, and the 2011 Tohoku earthquake, these events generated tens of thousands of coseismic landslides. These landslides caused a staggering death toll and considerable damage, and altered the regional landscape, including its main hydrological features. Despite a strong impetus in research during past decades, knowledge of these geohazards is still fragmentary, and databases of high-quality observational data are lacking. These phenomena call for further collaborative research aiming eventually to enhance preparedness and crisis management. As one of the three SUPERSITE concept FP7 projects dealing with long-term, high-level monitoring of major natural hazards at the European level, the MARSITE project gathers research groups in a comprehensive monitoring activity developed in the Sea of Marmara Region, one of the most densely populated parts of Europe, rated at a high seismic risk level since the 1999 Izmit and Duzce devastating earthquakes. Besides the seismic threat, landslides in Turkey and in this region constitute an important source of loss. The 1999 earthquake caused extensive landslides, while tsunami effects were observed during the post-event surveys in several places along the coasts of Izmit Bay. The 6th Work Package of the MARSITE project gathers 9 research groups to study earthquake-induced landslides, focusing on two sub-regional areas of high interest. First, the Cekmece-Avcilar peninsula, located west of Istanbul, is a highly urbanized, landslide-prone area showing high susceptibility to rainfall while also affected by very significant seismic site effects. 
Second, the off-shore entrance of the Izmit Gulf, close to the termination of the surface rupture of the 1999 earthquake

  6. Time-dependent neo-deterministic seismic hazard scenarios for the 2016 Central Italy earthquakes sequence (United States)

    Peresan, Antonella; Kossobokov, Vladimir; Romashkova, Leontina; Panza, Giuliano F.


    Predicting earthquakes and the related ground shaking is widely recognized as among the most challenging scientific problems, both for its societal relevance and for the intrinsic complexity of the problem. The development of reliable forecasting tools requires their rigorous formalization and testing, first in retrospect, and then in an experimental real-time mode, which implies a careful application of statistics to data sets of limited size and differing accuracy. Accordingly, the operational issues of prospective validation and use of time-dependent neo-deterministic seismic hazard scenarios are discussed, reviewing the results of their application in Italy and surroundings. Long-term practice and the results obtained for the Italian territory in about two decades of rigorous prospective testing support the feasibility of earthquake forecasting based on the analysis of seismicity patterns at the intermediate-term middle-range scale. Italy is the only country worldwide where two independent, globally tested algorithms, namely CN and M8S, are applied simultaneously; these handle multiple sets of seismic precursors to allow for a diagnosis of the intervals of time when a strong event is likely to occur inside a given region. Based on routinely updated space-time information provided by the CN and M8S forecasts, an integrated procedure has been developed that allows for the definition of time-dependent seismic hazard scenarios, through the realistic modeling of ground motion by the neo-deterministic approach (NDSHA). This scenario-based methodology permits the construction, at both regional and local scale, of scenarios of ground motion for the time interval when a strong event is likely to occur within the alerted areas. The CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, have been routinely updated since 2006. The issues and results from real-time testing of the integrated NDSHA scenarios are illustrated, with special

  7. Oklahoma experiences largest earthquake during ongoing regional wastewater injection hazard mitigation efforts (United States)

    Yeck, William; Hayes, Gavin; McNamara, Daniel E.; Rubinstein, Justin L.; Barnhart, William; Earle, Paul; Benz, Harley M.


    The 3 September 2016, Mw 5.8 Pawnee earthquake was the largest recorded earthquake in the state of Oklahoma. Seismic and geodetic observations of the Pawnee sequence, including precise hypocenter locations and moment tensor modeling, show that the Pawnee earthquake occurred on a previously unknown left-lateral strike-slip basement fault that intersects the mapped right-lateral Labette fault zone. The Pawnee earthquake is part of an unprecedented increase in the earthquake rate in Oklahoma that is largely considered the result of the deep injection of waste fluids from oil and gas production. If this is, indeed, the case for the M5.8 Pawnee earthquake, then it would be the largest event to have been induced by fluid injection. Since 2015, Oklahoma has undergone wide-scale mitigation efforts primarily aimed at reducing injection volumes. Thus far in 2016, the rate of M3 and greater earthquakes has decreased compared to 2015, while the cumulative moment—or energy released from earthquakes—has increased. This highlights the difficulty of earthquake hazard mitigation efforts, given the poorly understood long-term diffusive effects of wastewater injection and their connection to seismicity.

  8. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis (United States)

    Woo, Gordon


    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the subsequent months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard analysis at critical installations within Europe. In this paper, various salient European applications are given.
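As a minimal sketch of the idea (fixed-bandwidth Gaussian kernels; Woo's actual formulation additionally scales the kernel with event magnitude), smoothing an epicentre catalogue into a gridded activity-rate map can look like this, with all names and numbers invented for illustration:

```python
import numpy as np

def smoothed_rate(epicentres, grid_x, grid_y, bandwidth_km=30.0):
    """Gaussian kernel-smoothed event-count map on a rectangular grid.

    epicentres : (N, 2) array of (x, y) positions in km.
    Returns a (len(grid_y), len(grid_x)) array of expected counts per
    cell; its sum approaches N when the grid covers the whole region.
    """
    gx, gy = np.meshgrid(grid_x, grid_y)
    density = np.zeros_like(gx, dtype=float)
    norm = 1.0 / (2.0 * np.pi * bandwidth_km**2)
    for ex, ey in epicentres:
        d2 = (gx - ex) ** 2 + (gy - ey) ** 2
        density += norm * np.exp(-d2 / (2.0 * bandwidth_km**2))
    # convert density (events per km^2) to counts per grid cell
    return density * (grid_x[1] - grid_x[0]) * (grid_y[1] - grid_y[0])

# Two epicentres smoothed onto a 1-km grid
epi = np.array([[0.0, 0.0], [10.0, 5.0]])
gx = np.linspace(-200.0, 200.0, 401)
gy = np.linspace(-200.0, 200.0, 401)
rate_map = smoothed_rate(epi, gx, gy, bandwidth_km=20.0)
```

Note that no source-zone polygons are needed: the catalogue itself determines where the smoothed rate is high.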

  9. Oregon Hazard Explorer for Lifelines Program (OHELP): A web-based geographic information system tool for assessing potential Cascadia earthquake hazard (United States)

    Sharifi Mood, M.; Olsen, M. J.; Gillins, D. T.; Javadnejad, F.


    The Cascadia Subduction Zone (CSZ) is capable of generating earthquakes as powerful as moment magnitude 9, causing a great amount of damage to structures and facilities in Oregon. A series of deterministic earthquake analyses was performed for M9.0, M8.7, M8.4 and M8.1 scenarios, which produce persistent, long-lasting ground shaking together with other geological threats such as landslides, liquefaction-induced ground deformation, fault-rupture vertical displacement, tsunamis, etc. These ground deformations endanger urban structures, foundations, bridges, roadways, pipelines and other lifelines. Lifeline providers in Oregon, including the private and public entities responsible for transportation, electric and gas utilities, water and wastewater, fuel, airports, and harbors, face an aging infrastructure that was built before this extreme seismic risk was fully understood. As recently experienced in Chile and Japan, the three-to-five-minute-long earthquake scenario expected in Oregon necessitates a different method of risk mitigation for these major lifelines than those developed for the shorter shaking of crustal earthquakes. A web-based geographic information system tool was developed to fully assess the potential hazard from the multiple threats posed by Cascadia subduction zone earthquakes in the region. The purpose of the website is to provide easy access over the web to the latest and best available hazard information, including work completed for the recent Oregon Resilience Plan (ORP) (OSSPAC, 2013) and other work completed by the Department of Geology and Mineral Industries (DOGAMI) and the United States Geological Survey (USGS). The tool is designed for engineers, planners, geologists, and others who need this information to make appropriate decisions, and it requires only minimal knowledge of GIS.

  10. Earthquake Prediction Research In Iceland, Applications For Hazard Assessments and Warnings (United States)

    Stefansson, R.

    The first multinational earthquake prediction research project in Iceland was the European Council encouraged SIL project of the Nordic countries, 1988-1995. The path selected for this research was to study the physics of crustal processes leading to earthquakes. It was considered that small earthquakes, down to magnitude zero, were the most significant for this purpose, because of the detailed information which they provide both in time and space. The test area for the project was the earthquake-prone region of the South Iceland seismic zone (SISZ). The PRENLAB and PRENLAB-2 projects, 1996-2000, supported by the European Union, were a direct continuation of the SIL project, but with a more multidisciplinary approach. PRENLAB stands for "Earthquake prediction research in a natural laboratory". The basic objective was to advance our understanding in general of where, when and how dangerous earthquake motion might strike. Methods were developed to study crustal processes and conditions, using microearthquake information, continuous GPS, InSAR, theoretical modelling, fault mapping and paleoseismology. New algorithms were developed for short-term warnings. A very useful short-term warning was issued twice in the year 2000: one for the sudden start of an eruption of the Hekla volcano on February 26, and the other 25 hours before the second (in a sequence of two) magnitude 6.6 (Ms) earthquake in the South Iceland seismic zone on June 21, with the correct location and approximate size. A formal short-term warning, although not released to the public, was also issued before a magnitude 5 earthquake in November 1998. The presentation will briefly describe what these warnings were based on. A general hazard assessment was presented in scientific journals 10-15 years ago, assessing within a few kilometers the location of the faults of the two 2000 earthquakes and suggesting

  11. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited) (United States)

    Applegate, D.


    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  12. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)


    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during project conceptual design and provides input to it. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. The hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts on offsite personnel and the environment.

  13. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models. (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.


    In recent years, interest in landslide hazard assessment studies has increased substantially. Such studies are appropriate for evaluation and for developing mitigation plans in landslide-prone areas. Several techniques are available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinion and represent hazard levels in descriptive terms. Quantitative methods, on the other hand, are objective and are commonly used because of the correlation between the instability factors and the locations of landslides. Within this group, statistical approaches and newer heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis for assessing landslide hazard over large regions. However, they depend on the qualitative and quantitative data, scale, types of movement and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application was developed for El Salvador, a country in Central America where earthquake-triggered landslides are common phenomena. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 2001 January 13th earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. 
The results of the landslide susceptibility analysis are checked using landslide
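A minimal sketch of the logistic-regression half of such a comparison, with synthetic data standing in for the landslide inventory (the factor choice, coefficients, and training scheme here are assumptions for illustration, not the authors' calibrated model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: two instability factors per grid cell
# (slope gradient in degrees, peak ground acceleration in g) and a
# binary label: 1 = landslide in the inventory, 0 = none.
n = 400
slope = rng.uniform(0.0, 45.0, n)
pga = rng.uniform(0.05, 0.6, n)
X = np.column_stack([np.ones(n), slope, pga])      # intercept + factors
true_logit = -6.0 + 0.15 * slope + 6.0 * pga       # assumed true model
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit logistic regression by plain gradient descent on the log-loss.
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.001 * (X.T @ (p - y)) / n

def susceptibility(slope_deg, pga_g):
    """Predicted landslide probability for one cell."""
    z = w[0] + w[1] * slope_deg + w[2] * pga_g
    return float(1.0 / (1.0 + np.exp(-z)))
```

A steep cell under strong shaking (e.g. `susceptibility(40, 0.5)`) then scores higher than a flat cell under weak shaking (e.g. `susceptibility(5, 0.1)`), which is the ordering a susceptibility map encodes.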

  14. Earthquake Hazards Program: Risk-Targeted Ground Motion Calculator (United States)

    U.S. Geological Survey, Department of the Interior — This tool is used to calculate risk-targeted ground motion values from probabilistic seismic hazard curves in accordance with the site-specific ground motion...

  15. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone (United States)

    Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf


    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable, homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0, and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull, Frechet, and three-parameter Weibull distributions. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used EasyFit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three for this region.
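A sketch of this workflow using SciPy in place of EasyFit/Matlab; the synthetic inter-event times below stand in for the real catalogue, so only the mechanics (maximum-likelihood fit, K-S test, conditional probability) mirror the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical stand-in for inter-event times (years) between M >= 6.0
# earthquakes; real values would come from the 1900-2015 catalogue.
intervals = stats.weibull_min.rvs(c=1.4, scale=8.0, size=120, random_state=rng)

# Fit a two-parameter Weibull (location fixed at zero) by maximum likelihood.
shape, loc, scale = stats.weibull_min.fit(intervals, floc=0)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution.
ks = stats.kstest(intervals, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.2f} scale={scale:.2f} KS p-value={ks.pvalue:.3f}")

# Conditional probability that the next event occurs within t more years,
# given tau years already elapsed: [F(tau+t) - F(tau)] / [1 - F(tau)].
F = lambda x: stats.weibull_min.cdf(x, shape, loc, scale)
tau, t = 5.0, 10.0
cond_prob = (F(tau + t) - F(tau)) / (1.0 - F(tau))
```

Repeating the fit and K-S test with `stats.frechet_r`-style alternatives (in modern SciPy, `invweibull` or a three-parameter `weibull_min` with free `loc`) gives the cross-distribution comparison the study describes.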

  16. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail:; Bayrak, Erdem, E-mail: [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: [Ağrı İbrahim Çeçen University, Ağrı (Turkey)


    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable, homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0, and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull, Frechet, and three-parameter Weibull distributions. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used EasyFit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three for this region.

  17. Earthquake induced liquefaction hazard, probability and risk assessment in the city of Kolkata, India: its historical perspective and deterministic scenario (United States)

    Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.


    Liquefaction-induced ground failure is one of the leading causes of infrastructure damage due to the impact of large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Bramhaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has a moderately shallow groundwater table, especially in the post-monsoon seasons. In view of the burgeoning population, there has been unplanned expansion of settlements under hazardous geological, geomorphological, and hydrological conditions, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, reportedly induced Modified Mercalli Intensities of IV-V and VI-VII, respectively, in the city, triggering widespread to sporadic liquefaction with surface manifestations of sand boils, lateral spreading, ground subsidence, etc., thus posing a strong case for liquefaction potential analysis in the terrain. With the motivation of assessing the seismic hazard, vulnerability, and risk of the city of Kolkata through concerted federal funding stipulated for all the metros and emerging urban centers in India located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility of Kolkata under earthquake loading, employing modern multivariate techniques, and also to predict a deterministic liquefaction scenario for the city in the event of a probabilistic seismic hazard condition with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km2 area. The stochastically
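The "10% probability of exceedance in 50 years" and the "475-year return period" quoted above are linked through the Poisson occurrence model, and the conversion is worth sketching:

```python
import math

# Under a Poisson model, an exceedance probability p over an exposure
# time t_exp implies an annual rate lam = -ln(1 - p) / t_exp and a
# return period T = 1 / lam.
def return_period(p, t_exp):
    lam = -math.log(1.0 - p) / t_exp   # annual exceedance rate
    return 1.0 / lam

# The design level quoted in the abstract:
T = return_period(0.10, 50.0)
print(round(T, 1))   # about 474.6 years, conventionally rounded to 475
```

The same function gives, for example, roughly 2475 years for the "2% in 50 years" level used for maximum considered earthquake ground motions.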

  18. EFEHR - the European Facilities for Earthquake Hazard and Risk: beyond the web-platform (United States)

    Danciu, Laurentiu; Wiemer, Stefan; Haslinger, Florian; Kastli, Philipp; Giardini, Domenico


    European Facilities for Earthquake Hazard and Risk (EFEHR) represents the sustainable community resource for seismic hazard and risk in Europe. The EFEHR web platform is the main gateway to access data, models and tools, and provides expertise relevant for the assessment of seismic hazard and risk. The main services (databases and web-platform) are hosted at ETH Zurich and operated by the Swiss Seismological Service (Schweizerischer Erdbebendienst SED). The EFEHR web-portal ( collects and displays (i) harmonized datasets necessary for hazard and risk modeling, e.g. seismic catalogues, fault compilations, site amplifications, vulnerabilities, inventories; (ii) extensive seismic hazard products, namely hazard curves, uniform hazard spectra and maps for national and regional assessments; (iii) standardized configuration files for re-computing the regional seismic hazard models; (iv) relevant documentation of harmonized datasets, models and web-services. Today, EFEHR distributes the full output of the 2013 European Seismic Hazard Model, ESHM13, as developed within the SHARE project (; the latest results of the 2014 Earthquake Model of the Middle East (EMME14), derived within the EMME Project (; the 2001 Global Seismic Hazard Assessment Project (GSHAP) results; and the 2015 updates of the Swiss Seismic Hazard. New datasets related to either seismic hazard or risk will be incorporated as they become available. We present the current status of the EFEHR platform, with focus on the challenges, summaries of the up-to-date datasets, user experience and feedback, as well as the roadmap to future technological innovation beyond the web-platform development. We also show the new services foreseen to fully integrate with the seismological core services of the European Plate Observing System (EPOS).

  19. Sensitivity analysis of tall buildings in Semarang, Indonesia due to fault earthquakes with maximum 7 Mw (United States)

    Partono, Windu; Pardoyo, Bambang; Atmanto, Indrastono Dwi; Azizah, Lisa; Chintami, Rouli Dian


    Faults are among the dangerous earthquake sources that can cause building failure. Many buildings collapsed in the Yogyakarta (2006) and Pidie (2016) fault-source earthquakes, which had maximum magnitudes of 6.4 Mw. Following the research conducted by the Team for Revision of Seismic Hazard Maps of Indonesia 2010 and 2016, the Lasem, Demak and Semarang faults are the three closest earthquake sources surrounding Semarang. The ground motion from those three earthquake sources should be taken into account in structural design and evaluation. Most tall buildings in Semarang, with a minimum height of 40 meters, were designed and constructed following the 2002 and 2012 Indonesian Seismic Codes. This paper presents the results of a sensitivity analysis with emphasis on predicting the deformation and inter-story drift of existing tall buildings within the city under fault earthquakes. The analysis was performed by conducting dynamic structural analysis of 8 (eight) tall buildings using modified acceleration time histories. The modified acceleration time histories were calculated for three fault earthquakes with magnitudes from 6 Mw to 7 Mw, and were used because recorded time-history data for those three fault earthquakes are inadequate. The sensitivity of a building to earthquakes can be assessed by comparing the surface response spectra calculated using the seismic code with the surface response spectra calculated from the acceleration time histories of a specific earthquake event. If the surface response spectra calculated using the seismic code are greater than those calculated from the acceleration time histories, the structure will be stable enough to resist the earthquake force.


    Energy Technology Data Exchange (ETDEWEB)

    R. Longwell; J. Keifer; S. Goodin


    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) The occurrence of a fire related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees. (3) Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  1. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis (United States)

    Klügel, Jens-Uwe


    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  2. Assessment of Earthquake Hazard Parameters with Bayesian Approach Method Around Karliova Triple Junction, Eastern Turkey (United States)

    Türker, Tugba; Bayrak, Yusuf


    In this study, the Bayesian approach is used to evaluate the earthquake hazard parameters of maximum regional magnitude (Mmax), β value, and seismic activity rate or intensity (λ), and their uncertainties, for the next 5, 10, 25, 50, and 100 years around the Karlıova Triple Junction (KTJ). A compiled earthquake catalogue that is homogeneous for Ms ≥ 3.0 was prepared for the period from 1900 to 2017. The area is divided into four seismic source regions based on epicenter distribution, tectonics, seismicity, and faults around the KTJ. Two historical earthquakes are included: the 1866, Ms=7.2 event for Region 3 (between Bingöl-Karlıova-Muş-Bitlis: the Bahçeköy Fault Zone, Uzunpınar Fault Zone, Karakoçan Fault, Muş Fault Zone and Kavakbaşı Fault) and the 1874, Ms=7.1 event for Region 4 (between Malatya-Elazığ-Tunceli: the Palu Basin, Pütürge Basin, Erkenek Fault and Malatya Fault). The computed Mmax values are between 7.71 and 8.17. The quantiles of the distribution functions of true and apparent magnitude on a given time interval [0, T] are evaluated; the quantiles for the next time intervals of 5, 10, 25, 50, and 100 years are calculated at confidence limits for probability levels of 50, 70, and 90% around the KTJ. According to the computed earthquake hazard parameters, the Erzincan Basin-Ovacık Fault-Pülümür Fault-Yedisu Basin region is the most seismically active part of the KTJ. For this region the highest earthquake magnitude, 7.16, is estimated with a 90% probability level in the next 100 years, making it the most dangerous region compared to the others. The results of this study can be used in earthquake hazard studies of the East Anatolian region.
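For context, the β value referred to above is often first point-estimated from the catalogue with the classical Aki-Utsu maximum-likelihood formula; a Bayesian treatment then places uncertainties around such estimates. The magnitudes below are invented for illustration:

```python
import math

# Aki-Utsu maximum-likelihood estimate: beta = 1 / (mean(M) - M_min),
# with the Gutenberg-Richter b value given by beta / ln(10). For
# magnitudes binned at interval dM, M_min is usually replaced by
# M_min - dM/2 (Utsu's correction).
mags = [3.1, 3.3, 3.0, 4.2, 3.6, 5.1, 3.4, 3.8, 3.2, 4.5, 3.0, 3.7]
m_min = 3.0  # completeness threshold, matching Ms >= 3.0 in the abstract

beta = 1.0 / (sum(mags) / len(mags) - m_min)
b_value = beta / math.log(10)
print(f"beta = {beta:.2f}, b = {b_value:.2f}")
```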

  3. 14 CFR 437.29 - Hazard analysis. (United States)


    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  4. St. Louis Area Earthquake Hazards Mapping Project - December 2008-June 2009 Progress Report (United States)

    Williams, R.A.; Bauer, R.A.; Boyd, O.S.; Chung, J.; Cramer, C.H.; Gaunt, D.A.; Hempen, G.L.; Hoffman, D.; McCallister, N.S.; Prewett, J.L.; Rogers, J.D.; Steckel, P.J.; Watkins, C.M.


    This report summarizes the mission, the project background, the participants, and the progress of the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) for the period from December 2008 through June 2009. During this period, the SLAEHMP held five conference calls and two face-to-face meetings in St. Louis, participated in several earthquake awareness public meetings, held one outreach field trip for the business and government community, collected and compiled new borehole and digital elevation data from partners, and published a project summary.

  5. Next-Level ShakeZoning for Earthquake Hazard Definition in Nevada (United States)

    Louie, J. N.; Savran, W. H.; Flinchum, B. A.; Dudley, C.; Prina, N.; Pullammanappallil, S.; Pancha, A.


    We are developing "Next-Level ShakeZoning" procedures tailored for defining earthquake hazards in Nevada. The current federally sponsored tools (the USGS hazard maps and ShakeMap, and FEMA HAZUS) were developed as statistical summaries to match earthquake data from California, Japan, and Taiwan. The 2008 Wells and Mogul events in Nevada showed in particular that the generalized statistical approach taken by ShakeMap cannot match actual data on shaking from earthquakes in the Intermountain West, even to first order. Next-Level ShakeZoning relies on physics and geology, rather than statistics, to define earthquake shaking hazards. It follows theoretical and computational developments made over the past 20 years, capitalizing on detailed and specific local data sets to more accurately model the propagation and amplification of earthquake waves through the multiple geologic basins of the Intermountain West. Excellent new data sets are now available for Las Vegas Valley. Clark County, Nevada has completed the nation's first effort to map earthquake hazard class systematically through an entire urban area using Optim's SeisOpt® ReMi technique, which was adapted for large-scale data collection. Using the new Parcel Map in computing shaking in the Valley for scenario earthquakes is crucial for obtaining realistic predictions of ground motions. In an educational element of the project, a dozen undergraduate students have been computing 50 separate earthquake scenarios affecting Las Vegas Valley, using the Next-Level ShakeZoning process. Despite characterizing only the upper 30 meters, the Vs30 geotechnical shear velocity from the Parcel Map shows clear effects on 3-D shaking predictions computed so far at frequencies from 0.1 Hz up to 1.0 Hz. The effect of the Parcel Map is prominent even for the 0.1-Hz waves, despite the large mismatch of wavelength to geotechnical depths. Amplifications and de-amplifications affected by the Parcel Map exceed a factor of two, and are

  6. Seismic hazard analysis for Jayapura city, Papua

    Energy Technology Data Exchange (ETDEWEB)

    Robiana, R., E-mail:; Cipta, A. [Geological Agency, Diponegoro Road No.57, Bandung, 40122 (Indonesia)


    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model, representing the New Guinea Trench subduction zone (North Papuan Thrust); fault models, representing the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors derived using a geomorphological approach are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D, and E, with amplification factors between 0.5 and 6. Hazard maps are presented for a 10 % probability of exceedance in 50 years (a return period of about 500 years) for spectral periods of 0.0, 0.2, and 1.0 seconds.

  7. 21 CFR 120.7 - Hazard analysis. (United States)


    ... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.7 Hazard... to occur and thus, constitutes a food hazard that must be addressed in the HACCP plan. A food hazard... intended consumer. (e) HACCP plans for juice need not address the food hazards associated with...

  8. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team


    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, journalists. This initiative, built up in a few weeks, received very strong feedback, partly due to the media's highlighting of the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction versus probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take a day off and leave town or stay in public parks), we helped to reduce this fear and therefore the social cost of this strange Roman day. Another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach Laboratory, the Graphics Laboratory, and SissaMedialab. P.S. No large earthquake happened

  9. User’s Guide - Seismic Hazard Analysis (United States)


    Earthquake Magnitude Cutoff 8.5 example 8.8 Enter Site Longitude (Degrees) 117 example 115.0 Enter Site Latitude (Degrees) 38 example 38.5 Any Changes? Y / N...the art for assessing earthquake hazards in the United States catalogue of strong-motion earthquake records, Waterways Experiment Station, Vicksburg

  10. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018) (United States)

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.


    to the NSHM scenario were developed for the Hilton Creek and Hartley Springs Faults to account for different opinions on how far these two faults extend into Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice: the deterministic seismic hazard analysis program developed by Art Frankel of the USGS and three Next Generation Attenuation (NGA) ground-motion models. Ground motion calculations incorporated the potential amplification of seismic shaking by near-surface soils, defined by a map of the average shear-wave velocity in the uppermost 30 m (VS30) developed by CGS. In addition to ground shaking and shaking-related ground failure such as liquefaction and earthquake-induced landslides, earthquakes cause surface rupture displacement, which can lead to severe damage of buildings and lifelines. For each earthquake scenario, potential surface fault displacements are estimated using deterministic and probabilistic approaches. Liquefaction occurs when saturated sediments lose their strength because of ground shaking. Zones of potential liquefaction are mapped by incorporating areas where loose sandy sediments, shallow groundwater, and strong earthquake shaking coincide in the earthquake scenario. The process for defining zones of potential landslide and rockfall incorporates rock strength, surface slope, and existing landslides, with ground motions caused by the scenario earthquake. Each scenario is illustrated with maps of seismic shaking potential and fault displacement, liquefaction, and landslide potential. Seismic shaking is depicted by the distribution of shaking intensity, peak ground acceleration, and 1.0-second spectral acceleration. One-second spectral acceleration correlates well with structural damage to surface facilities. Acceleration greater than 0.2 g is often associated with strong ground shaking and may cause moderate to heavy damage.
The extent of strong shaking is influenced by subsurface fault dip and near

  11. Earthquake catalogs for the 2017 Central and Eastern U.S. short-term seismic hazard model (United States)

    Mueller, Charles S.


    The U.S. Geological Survey (USGS) makes long-term seismic hazard forecasts that are used in building codes. The hazard models usually consider only natural seismicity; non-tectonic (man-made) earthquakes are excluded because they are transitory or too small. In the past decade, however, thousands of earthquakes related to underground fluid injection have occurred in the central and eastern U.S. (CEUS), and some have caused damage. In response, the USGS is now also making short-term forecasts that account for the hazard from these induced earthquakes. Seismicity statistics are analyzed to develop recurrence models, accounting for catalog completeness. In the USGS hazard modeling methodology, earthquakes are counted on a map grid, recurrence models are applied to estimate the rates of future earthquakes in each grid cell, and these rates are combined with maximum-magnitude models and ground-motion models to compute the hazard. The USGS published a forecast for the years 2016 and 2017. Here, we document the development of the seismicity catalogs for the 2017 CEUS short-term hazard model. A uniform earthquake catalog is assembled by combining and winnowing pre-existing source catalogs. The initial, final, and supporting earthquake catalogs are made available here.
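The grid-counting step of the methodology described above can be sketched as follows. This is a minimal illustration on a plain lon/lat grid with a synthetic catalog; the actual USGS model additionally applies declustering, completeness corrections, and spatial smoothing, and the grid extent and cell size here are assumed:

```python
import numpy as np

def gridded_annual_rates(lons, lats, mags, m_min, catalog_years,
                         lon_range=(-110.0, -60.0), lat_range=(24.0, 50.0),
                         cell=0.5):
    """Count catalog earthquakes with M >= m_min in lon/lat cells and
    convert the counts to annual rates (events per cell per year)."""
    sel = mags >= m_min
    lon_edges = np.arange(lon_range[0], lon_range[1] + cell, cell)
    lat_edges = np.arange(lat_range[0], lat_range[1] + cell, cell)
    counts, _, _ = np.histogram2d(lons[sel], lats[sel],
                                  bins=[lon_edges, lat_edges])
    return counts / catalog_years

# Tiny synthetic catalog: three M >= 2.7 events in one cell over 10 years,
# plus one event below the magnitude threshold
lons = np.array([-97.4, -97.3, -97.2, -80.0])
lats = np.array([35.6, 35.7, 35.6, 40.0])
mags = np.array([3.1, 4.0, 2.8, 2.0])
rates = gridded_annual_rates(lons, lats, mags, 2.7, 10.0)
print(rates.max())   # 0.3 events per year in the busiest cell
```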

  12. A new earthquake catalogue for seismic hazard assessment of the NPP (Nuclear Power Plant) Jaslovske Bohunice, Slovakia, site (United States)

    Kysel, Robert; Kristek, Jozef; Moczo, Peter; Csicsay, Kristian; Cipciar, Andrej; Srbecky, Miroslav


    national agencies, we analyzed and estimated relations between them. For declustering we applied two independent methods. In the window method we applied parameters of the time-space windows proposed by Burkhard & Grünthal (2009). In the cluster method (Reasenberg 1985) we applied alternative sets of input parameters. For investigating time completeness we divided the catalogue into four subcatalogues corresponding to different seismogeological domains. The completeness was determined from the plots displaying cumulative number of events (for given subcatalogue and interval of magnitude) as a function of time. The homogenized catalogue consists of 2 652 earthquakes with moment magnitude larger than 1.5. The catalogue was subsequently used as an input source for hazard analysis.

  13. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.


    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fell outside the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential to affect the public.

  14. Seismic hazard in the Po Plain and the 2012 Emilia earthquakes

    Directory of Open Access Journals (Sweden)

    Carlo Meletti


    The Emilia earthquakes of May 20, 2012 (Ml 5.9, INGV; Mw 6.11), and May 29, 2012 (Ml 5.8, INGV; Mw 5.96), struck an area that in the national reference seismic hazard model [MPS04; Stucchi et al. 2011] is characterized by an expected horizontal peak ground acceleration (PGA), with a 10% probability of exceedance in 50 years, of between 0.10 g and 0.15 g (Figure 1), a medium level of seismic hazard in Italy. The strong impact of the earthquakes on a region that is not included among the most hazardous areas of Italy, together with the ground-motion data recorded by accelerometric networks, has given the population and the media the impression that the current seismic hazard map is not correct and thus needs to be updated. Since the MPS04 seismic hazard model was adopted by the current Italian building code [Norme Tecniche per le Costruzioni 2008, hereafter termed NTC08] as the basis for defining seismic action (the design spectra), any modification to the seismic hazard model would also affect the building code. The aim of this paper is to briefly present the data that support the seismic hazard model in the area, and to perform some comparisons between recorded ground motion and both seismic hazard estimates and design spectra. All of the comparisons presented in this study are for the horizontal components only, as the Italian hazard model does not provide estimates for the vertical component. […]

  15. Science, hazards, and policy questions for intraplate earthquakes in eastern North America (United States)

    Stein, S.; Newman, A.; Sella, G.; Dixon, T.; Liu, M.; Dokka, R.; Tomasello, J.


    Intraplate earthquakes in eastern North America and similar continental interiors pose unresolved scientific and societal issues. Resolving these issues will be challenging, and bears on our understanding of lithospheric and mantle rheology, continental evolution, and the earthquake process. Their causes can be viewed as some combination of two end-member models. In one, earthquakes occur almost randomly in a continent containing many long-lived fossil weak zones. Minor stress variations due to platewide driving forces and local stresses, such as those from glacial-isostatic adjustment and other density variations, cause transient seismicity as the locus of strain release migrates. If so, present regions of seismicity do not significantly differ from similar weak zones that are less active. Alternatively, seismicity concentrates on long-lived weak zones. For example, if such a zone under the New Madrid area relaxed recently, transient release of accumulated stress could cause large earthquakes more frequently than implied by geodetic or earthquake frequency-magnitude data. Such models can explain the lack of surface strain accumulation shown by GPS data, but there is little evidence for such weak zones or their recent initiation. Assessing the resulting hazard requires assumptions about the size, recurrence rate, and ground motion resulting from the larger earthquakes, none of which is well known. Hence hazard estimates have large uncertainties and, at least for New Madrid, are near the high end of possible estimates. The uncertainties also make choosing mitigation strategies challenging. For example, the proposed upgrade of New Madrid zone building codes to California levels seems likely to impose societal costs significantly exceeding the benefits.

  16. Induced and Natural Seismicity: Earthquake Hazards and Risks in Ohio: (United States)

    Besana-Ostman, G. M.; Worstall, R.; Tomastik, T.; Simmers, R.


    To adapt to the increasing need to regulate all operations related to both the Utica and Marcellus shale plays within the state, ODNR recently strengthened its regulatory capability through stricter permit requirements, additional human resources, and improved infrastructure. ODNR's efforts on seismic risk reduction related to induced seismicity led to stricter regulations and many infrastructure changes, particularly related to Class II wells. Permit-requirement changes and more seismic monitoring stations were implemented, together with additional injection-data reporting from selected Class II well operators. Considering the possible risks related to seismic events in a region of relatively low seismicity, correlations between the limited seismic data and injection-volume information were investigated. Interestingly, initial results showed indications of both plugging and fracturing episodes. Real-time data transmission from seismic stations and the availability of injection-volume data enabled ODNR to interact with operators and manage wells dynamically. Furthermore, initial geomorphic and structural analyses indicated possible active faults, oriented NE-SW, in the northern and western portions of the state. The newly mapped structures imply the possibility of relatively larger earthquakes in the region and consequently higher seismic risk. With the above-mentioned changes, ODNR has made a critical improvement to its principal regulatory role in the state for oil and gas operations, and also an important contribution to the state's seismic risk reduction endeavors. Close collaboration with other government agencies and the public, and working together with well operators, enhanced ODNR's capability to build a safety culture and achieve further public and industry participation toward a safer environment. Keywords: induced seismicity, injection wells, seismic risks

  17. The Integrated Hazard Analysis Integrator (United States)

    Morris, A. Terry; Massie, Michael J.


    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations, and disposal activities associated with hardware, software, maintenance, operations, and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems, and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls, and verifications used to mitigate the risk of catastrophic loss of crew, vehicle, and/or mission. Unfortunately, in the current age of increased technology dependence, there is a tendency to overlook the necessary and sufficient qualifications of the integrator, that is, the person or team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan, and then communicates and coordinates with large and small components, each contributing necessary hardware, software, and/or information to prevent catastrophic loss. In both the Challenger and Columbia accidents, lack of appropriate communication, management errors, and lack of resources dedicated to safety were cited as major contributors to the fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators, and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code, and analyzing competitive proposals. This paper will discuss the necessary and

  18. Probabilistic Earthquake-tsunami Hazard Assessment:The First Step Towards Resilient Coastal Communities


    De Risi, Raffaele; Goda, Katsu


    As more population migrates to coastal regions worldwide, earthquake-triggered tsunamis pose a greater threat than ever before. Stakeholders, decision makers, and emergency managers face an urgent need for operational decision-support tools that provide robust and accurate hazard assessments, when human lives and built environment are at risk. To meet this need, this study presents a new probabilistic procedure for estimating the likelihood that seismic intensity and tsunami inundation will e...

  19. Hazard maps of earthquake induced permanent displacements validated by site numerical simulation (United States)

    Vessia, Giovanna; Pisano, Luca; Parise, Mario; Tromba, Giuseppe


    Hazard maps of seismically induced instability at the urban scale can be drawn by means of GIS spatial interpolation tools starting from (1) a digital terrain model (DTM) and (2) geological and geotechnical hydro-mechanical site characterization. These maps are commonly related to a fixed return period of the natural phenomenon under study, or to a particular hazard scenario from the most significant past events. The maps can be used to guide planning activity as well as emergency actions, but their main limitation is that typically no reliability analysis is performed. Spatial variability and uncertainties in subsoil properties, poor description of geomorphological evidence of active instability, and geometrical approximations and simplifications in DTMs, among others, can be responsible for inaccurate maps. In this study, a method is proposed to control and increase the overall reliability of a hazard scenario map for earthquake-induced slope instability. The procedure can be summarized as follows: (1) GIS statistical tools are used to improve the spatial distribution of the hydro-mechanical properties of the surface lithologies; (2) hazard maps are drawn from the preceding information layer on both groundwater and mechanical properties of surficial deposits, combined with seismic parameters propagated by means of ground motion propagation equations; (3) point numerical stability analyses carried out by means of the finite element method (e.g. GeoStudio 2004) are performed to anchor the hazard map predictions to point quantitative analyses. These numerical analyses are used to generate a conversion scale from urban-scale to point estimates in terms of permanent displacements. Although this conversion scale differs from case to case, it can be suggested as a general method to convert the results of large-scale map analyses to site hazard assessment. 
In this study, the procedure is applied to the urban area of Castelfranci (Avellino province

  20. Reducing Vulnerability of Ports and Harbors to Earthquake and Tsunami Hazards (United States)

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.


    Recent scientific research suggests the Pacific Northwest could experience catastrophic earthquakes in the near future, both from distant and local sources, posing a significant threat to coastal communities. Damage could result from numerous earthquake-related hazards, such as severe ground shaking, soil liquefaction, landslides, land subsidence/uplift, and tsunami inundation. Because of their geographic location, ports and harbors are especially vulnerable to these hazards. Ports and harbors, however, are important components of many coastal communities, supporting numerous activities critical to the local and regional economy and possibly serving as vital post-event, response-recovery transportation links. A collaborative, multi-year initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to earthquake and tsunami hazards, involving Oregon Sea Grant (OSG), Washington Sea Grant (WSG), the National Oceanic and Atmospheric Administration Coastal Services Center (CSC), and the U.S. Geological Survey Center for Science Policy (CSP). Specific products of this research, planning, and outreach initiative include a regional stakeholder issues and needs assessment, a community-based mitigation planning process, a Geographic Information System (GIS)-based vulnerability assessment methodology, an educational website, and a regional data archive. This paper summarizes these efforts, including results of two pilot port-harbor community projects, one in Yaquina Bay, Oregon and the other in Sinclair Inlet, Washington. Finally, plans are outlined for outreach to other port and harbor communities in the Pacific Northwest and beyond, using "getting started" workshops and a web-based tutorial.

  1. Earthquake shaking hazard estimates and exposure changes in the conterminous United States (United States)

    Jaiswal, Kishor S.; Petersen, Mark D.; Rukstales, Kenneth S.; Leith, William S.


    A large portion of the population of the United States lives in areas vulnerable to earthquake hazards. This investigation aims to quantify population and infrastructure exposure within the conterminous U.S. that are subjected to varying levels of earthquake ground motions by systematically analyzing the last four cycles of the U.S. Geological Survey's (USGS) National Seismic Hazard Models (published in 1996, 2002, 2008 and 2014). Using the 2013 LandScan data, we estimate the numbers of people who are exposed to potentially damaging ground motions (peak ground accelerations at or above 0.1g). At least 28 million (~9% of the total population) may experience 0.1g level of shaking at relatively frequent intervals (annual rate of 1 in 72 years or 50% probability of exceedance (PE) in 50 years), 57 million (~18% of the total population) may experience this level of shaking at moderately frequent intervals (annual rate of 1 in 475 years or 10% PE in 50 years), and 143 million (~46% of the total population) may experience such shaking at relatively infrequent intervals (annual rate of 1 in 2,475 years or 2% PE in 50 years). We also show that there is a significant number of critical infrastructure facilities located in high earthquake-hazard areas (Modified Mercalli Intensity ≥ VII with moderately frequent recurrence interval).
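The correspondence between the annual rates and exceedance probabilities quoted above (1 in 72 years ↔ 50% PE in 50 years, 1 in 475 ↔ 10%, 1 in 2,475 ↔ 2%) follows from the usual Poisson assumption; a minimal check:

```python
import math

def prob_exceedance(return_period_yr, window_yr=50.0):
    """Poisson probability of at least one exceedance of a ground-motion
    level with the given mean return period, within the time window."""
    return 1.0 - math.exp(-window_yr / return_period_yr)

for rp in (72, 475, 2475):
    print(rp, round(100 * prob_exceedance(rp), 1), "% in 50 years")
# 72-year return period   -> ~50% PE in 50 years
# 475-year return period  -> ~10% PE in 50 years
# 2475-year return period -> ~2% PE in 50 years
```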

  2. Regional liquefaction hazard evaluation following the 2010-2011 Christchurch (New Zealand) earthquake sequence (United States)

    Begg, John; Brackley, Hannah; Irwin, Marion; Grant, Helen; Berryman, Kelvin; Dellow, Grant; Scott, David; Jones, Katie; Barrell, David; Lee, Julie; Townsend, Dougal; Jacka, Mike; Harwood, Nick; McCahon, Ian; Christensen, Steve


    Following the damaging 4 Sept 2010 Mw7.1 Darfield Earthquake, the 22 Feb 2011 Christchurch Earthquake and subsequent damaging aftershocks, we completed a liquefaction hazard evaluation for c. 2700 km2 of the coastal Canterbury region. Its purpose was to distinguish at a regional scale areas of land that, in the event of strong ground shaking, may be susceptible to damaging liquefaction from areas where damaging liquefaction is unlikely. This information will be used by local government for defining liquefaction-related geotechnical investigation requirements for consent applications. Following a review of historic records of liquefaction and existing liquefaction assessment maps, we undertook comprehensive new work that included: a geologic context from existing geologic maps; geomorphic mapping using LiDAR and integrating existing soil map data; compilation of lithological data for the surficial 10 m from an extensive drillhole database; modelling of depth to unconfined groundwater from existing subsurface and surface water data. Integrating and honouring all these sources of information, we mapped areas underlain by materials susceptible to liquefaction (liquefaction-prone lithologies present, or likely, in the near-surface, with shallow unconfined groundwater) from areas unlikely to suffer widespread liquefaction damage. Comparison of this work with more detailed liquefaction susceptibility assessment based on closely spaced geotechnical probes in Christchurch City provides a level of confidence in these results. We tested our susceptibility map by assigning a matrix of liquefaction susceptibility rankings to lithologies recorded in drillhole logs and local groundwater depths, then applying peak ground accelerations for four earthquake scenarios from the regional probabilistic seismic hazard model (25 year return = 0.13g; 100 year return = 0.22g; 500 year return = 0.38g and 2500 year return = 0.6g). 
Our mapped boundary between liquefaction-prone areas and areas
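The four scenario ground motions quoted from the regional probabilistic hazard model (25 yr = 0.13 g; 100 yr = 0.22 g; 500 yr = 0.38 g; 2500 yr = 0.6 g) lend themselves to a simple interpolation in log return period. This is an illustrative sketch only: the true hazard curve is not piecewise log-linear, so values between the tabulated points are rough approximations:

```python
import math

# Scenario PGAs quoted in the abstract: (return period in years, PGA in g)
SCENARIOS = [(25, 0.13), (100, 0.22), (500, 0.38), (2500, 0.60)]

def pga_for_return_period(T):
    """Log-linear interpolation of PGA between the tabulated scenarios;
    clamped to the end values outside the tabulated range."""
    if T <= SCENARIOS[0][0]:
        return SCENARIOS[0][1]
    if T >= SCENARIOS[-1][0]:
        return SCENARIOS[-1][1]
    for (t0, p0), (t1, p1) in zip(SCENARIOS, SCENARIOS[1:]):
        if t0 <= T <= t1:
            w = (math.log(T) - math.log(t0)) / (math.log(t1) - math.log(t0))
            return p0 + w * (p1 - p0)

print(round(pga_for_return_period(100), 2))   # 0.22 (tabulated point)
```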

  3. 2017 one-year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes (United States)

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert A.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.


    We produce the 2017 one-year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one-year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one-year) in five focus areas: Oklahoma-Kansas, the Raton Basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 magnitude (M) ≥ 4 and three M ≥ 5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma-Kansas focus area two earthquakes with M ≥ 4 occurred near Trinidad, Colorado (in the Raton Basin focus area), but no earthquakes with M ≥ 2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared to 2015, which may be related to decreased wastewater injection, caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.

  4. 2017 One‐year seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes (United States)

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert; Llenos, Andrea L.; Ellsworth, William L; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.


    We produce a one‐year 2017 seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one‐year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic‐hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma–Kansas, the Raton basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 moment magnitude (M) ≥4 and 3 M≥5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma–Kansas focus area, two earthquakes with M≥4 occurred near Trinidad, Colorado (in the Raton basin focus area), but no earthquakes with M≥2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground‐shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared with 2015, which may be related to decreased wastewater injection caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.

  5. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid


    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models, and the fundamentals of hazard analysis are discussed in terms of failure criteria. We also illustrate the flexibility of hazard modeling with distributions that approximate a range of alternative distributions.
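
    As an illustrative sketch of the kind of comparison the abstract describes (not the paper's own code), the hazard-rate functions of two common failure-time distributions can be contrasted directly; the distribution parameters below are arbitrary:

```python
def exponential_hazard(t, rate):
    # Constant hazard rate: the exponential model is memoryless.
    return rate

def weibull_hazard(t, shape, scale):
    # h(t) = (k/lam) * (t/lam)**(k-1); shape k > 1 gives an increasing
    # ("wear-out") hazard, k < 1 a decreasing ("infant-mortality") hazard.
    return (shape / scale) * (t / scale) ** (shape - 1)

# Compare the two hazard rates at a few times.
for t in (0.5, 1.0, 2.0):
    print(t, exponential_hazard(t, rate=1.0), weibull_hazard(t, shape=2.0, scale=1.0))
```

    With shape 2 the Weibull hazard grows linearly in time while the exponential hazard stays flat, which is the qualitative difference such simulation comparisons expose.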

  6. The analysis of historical seismograms: an important tool for seismic hazard assessment. Case histories from French and Italian earthquakes; L'analyse des sismogrammes historiques: un outil important pour l'evaluation de l'alea sismique. Etudes de cas de tremblements de terre en France et en Italie

    Energy Technology Data Exchange (ETDEWEB)

    Pino, N.A. [Istituto Nazionale di Geofisica e Vulcanologia, Osservatorio Vesuviano, Via Diocleziano 328, 80124 Napoli (Italy)


    Seismic hazard assessment relies on the knowledge of the source characteristics of past earthquakes. Unfortunately, seismic waveform analysis, representing the most powerful tool for the investigation of earthquake source parameters, is only possible for events that occurred in the last 100-120 years, i.e., since seismographs with known response functions were developed. Nevertheless, during this time significant earthquakes have been recorded by such instruments, and today, thanks in part to technological progress, these data can be recovered and analysed by means of modern techniques. In this paper, aiming to give a general sketch of the possible analyses and attainable results in historical seismogram studies, I briefly describe the major difficulties in processing the original waveforms and present a review of the results that I obtained from previous seismogram analysis of selected significant historical earthquakes that occurred during the first decades of the 20th century, including (A) the December 28, 1908, Messina straits (southern Italy), (B) the June 11, 1909, Lambesc (southern France) - both of which are the strongest ever recorded instrumentally in their respective countries - and (C) the July 13, 1930, Irpinia (southern Italy) events. For these earthquakes, the major achievements are the assessment of the seismic moment (A, B, C), the geometry and kinematics of faulting (B, C), and the fault length and an approximate slip distribution (A, C). The source characteristics of the studied events have also been interpreted in the frame of the tectonic environment active in the respective region of interest. In spite of the difficulties inherent in the investigation of old seismic data, these results demonstrate the invaluable and irreplaceable role of historical seismogram analysis in defining the local seismogenic potential and, ultimately, in assessing the seismic hazard. The retrieved information is crucial in areas where important civil engineering works

  7. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula (United States)

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.


    -motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.
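
    The two hazard levels quoted here correspond, under the usual Poisson assumption, to average return periods of roughly 2475 and 475 years; a minimal conversion (an illustration, not the authors' code) is:

```python
import math

def return_period(p_exceed, t_years):
    """Poisson return period for exceedance probability p_exceed in t_years."""
    return -t_years / math.log(1.0 - p_exceed)

def prob_exceed(t_years, t_return):
    """Probability of at least one exceedance in t_years, given a return period."""
    return 1.0 - math.exp(-t_years / t_return)

print(round(return_period(0.02, 50)))  # 2% in 50 years -> 2475
print(round(return_period(0.10, 50)))  # 10% in 50 years -> 475
```

    This is why 2%-in-50-years and 10%-in-50-years maps are often described as ~2475-year and ~475-year hazard levels.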

  8. Earthquake hazard assessment in the Zagros Orogenic Belt of Iran using a fuzzy rule-based model (United States)

    Farahi Ghasre Aboonasr, Sedigheh; Zamani, Ahmad; Razavipour, Fatemeh; Boostani, Reza


    Producing accurate seismic hazard maps and predicting hazardous areas is necessary for risk mitigation strategies. In this paper, a fuzzy logic inference system is utilized to estimate the earthquake potential and seismic zoning of the Zagros Orogenic Belt. In addition to their interpretability, fuzzy predictors can capture both nonlinearity and chaotic behavior of data, even where data are limited. Earthquake patterns in the Zagros have been assessed for intervals of 10 and 50 years using the fuzzy rule-based model, and the Molchan statistical procedure has been used to show that the forecasting model is reliable. The earthquake hazard maps for this area reveal some remarkable features that cannot be observed on conventional maps. Notably, some areas in the southern (Bandar Abbas), southwestern (Bandar Kangan) and western (Kermanshah) parts of Iran display high earthquake severity even though they are geographically far apart.
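
    The fuzzy-inference idea can be sketched in a few lines; the rule base, membership functions, and thresholds below are invented for illustration and are not the authors' Zagros model:

```python
def ramp_down(x, a, b):
    """Membership 1 below a, falling linearly to 0 at b."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def ramp_up(x, a, b):
    """Complementary membership: 0 below a, rising linearly to 1 at b."""
    return 1.0 - ramp_down(x, a, b)

def fuzzy_hazard(annual_rate):
    """Two toy rules with weighted-average (Sugeno-style) defuzzification:
    IF rate is low  THEN hazard score 0.2
    IF rate is high THEN hazard score 0.9"""
    mu_low = ramp_down(annual_rate, 1.0, 8.0)
    mu_high = ramp_up(annual_rate, 1.0, 8.0)
    return (0.2 * mu_low + 0.9 * mu_high) / (mu_low + mu_high)

print(fuzzy_hazard(0.5))   # low activity  -> 0.2
print(fuzzy_hazard(10.0))  # high activity -> 0.9
```

    Real systems use many rules over several inputs (e.g. seismicity rate, b-value, fault proximity), but the blend of overlapping memberships is the mechanism that lets fuzzy predictors interpolate smoothly between zones.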

  10. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau (United States)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail


    Sabah state in eastern Malaysia, unlike most other Malaysian states, experiences relatively frequent seismic activity: an earthquake of moderate magnitude occurs roughly every 20 years, originating mainly from two types of source, either local (e.g. Ranau and Lahad Datu) or regional (e.g. the Kalimantan and South Philippines subduction zones). The seismicity map of Sabah shows two zones of distinctive seismicity, near Ranau (close to Kota Kinabalu) and near Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins only in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Centre), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area, so the seismicity is modeled as line sources along these faults. Two main fault systems are believed to be the source of this activity, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become an important and much-needed study for the extensive development projects in Sabah, especially given the earthquake activity. A probabilistic approach is adopted for the present work, since it can provide the probability of exceeding various ground-motion levels expected from future large earthquakes. The output results are presented in terms of spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first complete hazard study for the area, the output can serve as a baseline and standard for future strategic planning in the area.
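
    When a short catalog must characterize a source, a standard first step is the maximum-likelihood Gutenberg-Richter b-value of Aki (1965) with Utsu's binning correction; the magnitude list below is invented for illustration and is not data from this study:

```python
import math

def b_value_mle(mags, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= completeness m_c,
    with Utsu's correction dm/2 for magnitudes binned at interval dm."""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

catalog = [4.0, 4.1, 4.3, 4.0, 5.2, 4.6, 4.8, 4.2, 4.4, 5.0]  # hypothetical
print(round(b_value_mle(catalog, m_c=4.0), 2))  # -> 0.85
```

    With only a handful of events, as at Ranau, the uncertainty on such an estimate is large, which is why the study falls back on mapped fault geometry for source characterization.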

  11. 14 CFR 437.55 - Hazard analysis. (United States)


    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  12. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources (United States)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.


    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagations and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures has unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a "physical-based" approach is ground-motion syntheses derived from physics and an understanding of the earthquake process. This is an overview paper and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. 
Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description rather than prediction.

  13. RISMUR II: New seismic hazard and risk study in Murcia Region after the Lorca Earthquake, 2011 (United States)

    Benito, Belen; Gaspar, Jorge; Rivas, Alicia; Quiros, Ligia; Ruiz, Sandra; Hernandez, Roman; Torres, Yolanda; Staller, Sandra


    The Murcia Region, located in the SE Iberian Peninsula, is one of the most seismically active areas of Spain. A system of active faults runs through the region, where the most recent damaging earthquakes in the country took place: in 1999, 2002, 2005 and 2011. The last one occurred in Lorca, causing 9 deaths and considerable material losses, including damage to the artistic heritage. The seismic emergency plan of the Murcia Region was developed in 2006, based on the results of the risk project RISMUR I, which among other conclusions identified Lorca as one of the municipalities with the highest risk in the province. After the Lorca earthquake of 2011, the previous study was revised in the project RISMUR II, incorporating data from this earthquake as well as updated databases of seismicity, active faults, strong-motion records, cadastre, vulnerability, etc. In addition, the new study includes some methodological innovations: the modeling of faults as independent units for hazard assessment, and analytical methods for risk estimation that use data from the earthquake to calibrate capacity and fragility curves. In this work the results of RISMUR II are presented and compared with those of RISMUR I. The main conclusion is an increase of the hazard along the SW-NE central fault system (Alhama de Murcia, Totana and Carrascoy), which implies higher expected damages in the populations nearest to these faults: Lorca, Totana, Alcantarilla and Murcia.

  14. No longer so clueless in seattle: Current assessment of earthquake hazards (United States)

    Weaver, C.S.


    The Pacific Northwest is an active subduction zone. Because of this tectonic setting, there are three distinct earthquake source zones in earthquake hazard assessments of the Seattle area. Offshore, the broad sloping interface between the Juan de Fuca and North America plates produces earthquakes as large as magnitude 9; on average these events occur every 400-600 years. The second source zone is within the subducting Juan de Fuca plate as it bends, at depths of 40-60 km, beneath the Puget lowland. Five earthquakes in this zone this century have had magnitudes greater than 6, including a magnitude 7.1 event in 1949. The third zone, the crust of the North America plate, is the least well known. Paleoseismic evidence shows that an event of approximately magnitude 7 occurred on the Seattle fault about 1000 years ago. Because large crustal events could be very damaging to the heavily urbanized areas of Puget Sound, their rate of occurrence and the area over which they are to be expected are the subject of considerable research.


  15. Internal Hazards Analysis for License Application

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett


    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  16. Earthquake Hazard and Risk in Sub-Saharan Africa: current status of the Global Earthquake model (GEM) initiative in the region (United States)

    Ayele, Atalay; Midzi, Vunganai; Ateba, Bekoa; Mulabisana, Thifhelimbilu; Marimira, Kwangwari; Hlatywayo, Dumisani J.; Akpan, Ofonime; Amponsah, Paulina; Georges, Tuluka M.; Durrheim, Ray


    Large magnitude earthquakes have been observed in Sub-Saharan Africa in the recent past, such as the Machaze event of 2006 (Mw 7.0) in Mozambique and the 2009 Karonga earthquake (Mw 6.2) in Malawi. The December 13, 1910 earthquake (Ms = 7.3) in the Rukwa rift (Tanzania) is the largest of all instrumentally recorded events known to have occurred in East Africa. The overall earthquake hazard in the region is on the lower side compared to other earthquake-prone areas of the globe. However, the risk level is high enough to deserve the attention of African governments and the donor community. The latest earthquake hazard map for sub-Saharan Africa was produced in 1999, and an update is long overdue, as construction activity is booming all over sub-Saharan Africa. To this effect, regional seismologists are working together under the GEM (Global Earthquake Model) framework to improve incomplete, inhomogeneous and uncertain catalogues. The working group is also contributing to the UNESCO-IGCP (SIDA) 601 project and assessing all possible sources of data for the catalogue, as well as for the seismotectonic characteristics that will help to develop a reasonable hazard model for the region. Current progress indicates that the region is more seismically active than previously thought. This demands a coordinated effort by regional experts to systematically compile all available information so as to mitigate earthquake risk in sub-Saharan Africa.

  17. Development of direct multi-hazard susceptibility assessment method for post-earthquake reconstruction planning in Nepal (United States)

    Mavrouli, Olga; Rana, Sohel; van Westen, Cees; Zhang, Jianqiang


    After the devastating 2015 Gorkha earthquake in Nepal, reconstruction activities have been delayed considerably, for many reasons of a political, organizational and technical nature. Because of the widespread occurrence of co-seismic landslides, and the expectation that these may be aggravated or re-activated in future years during the intense monsoon periods, there is a need to evaluate for thousands of sites whether they are suited for reconstruction. In this evaluation, multiple hazards such as rockfall, landslides, debris flows and flash floods should be taken into account. The application of indirect knowledge-based, data-driven or physically-based approaches is not suitable, for several reasons. Physically-based models generally require a large number of parameters, for which data is not available. Data-driven statistical methods depend on historical information, which is less useful after the occurrence of a major event such as an earthquake. Besides, they would lead to unacceptable levels of generalization, as the analysis is done on rather general causal-factor maps; the same holds for indirect knowledge-driven methods. However, location-specific hazard analysis is required, using a simple method that many people can apply at the local level. In this research, a direct scientific method was developed by which local-level technical staff can easily and quickly assess the post-earthquake multi-hazards following a decision-tree approach, using an app on a smartphone or tablet. The method assumes that a central organization, such as the Department of Soil Conservation and Watershed Management, generates spatial information beforehand that is used in the direct assessment at a certain location. Pre-earthquake, co-seismic and post-seismic landslide inventories are generated through the interpretation of Google Earth multi-temporal images, using anaglyph methods. 
Spatial data, such as Digital Elevation Models, land cover maps, and geological maps are
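
    The decision-tree idea amounts to a few nested threshold checks per site; the criteria and cutoffs below are entirely hypothetical placeholders, not the rules developed in this research:

```python
def reconstruction_suitability(slope_deg, dist_to_landslide_m, in_flood_zone):
    """Toy decision tree for screening a candidate reconstruction site.
    Thresholds are illustrative only."""
    if in_flood_zone:
        return "unsuitable"                  # flash-flood exposure rules the site out
    if slope_deg > 35:
        return "unsuitable"                  # steep slopes: rockfall/landslide prone
    if dist_to_landslide_m < 100:
        return "needs detailed assessment"   # close to a mapped co-seismic landslide
    return "suitable"

print(reconstruction_suitability(20, 500, False))  # -> suitable
```

    The appeal of this structure is that each branch is a question a field technician can answer on site from pre-computed map layers, with no model fitting required.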

  18. Geological Deformations and Potential Hazards Triggered by the 01-12-2010 Haiti Earthquake: Insights from Google Earth Imagery (United States)

    Doblas, M.; Benito, B.; Torres, Y.; Belizaire, D.; Dorfeuille, J.; Aretxabala, A.


    In this study we compare the different Google Earth imagery (GEI) available before and after the 01-12-2010 earthquake of Haiti and carry out a detailed analysis of the surface seismic-related geological deformations at the following sites: 1) the capital Port-au-Prince and other cities (Carrefour and Gresslier); 2) the mountainous area of the Massif de la Selle, which is transected by the "Enriquillo-Plantain-Garden" (EPG) interplate boundary fault (which supposedly triggered the earthquake); 3) some of the most important river channels and their corresponding deltas (Momanche, Grise and Frorse). The initial results of our research were published in March 2010 in a special web page created by the scientific community to try to mitigate the devastating effects of this catastrophe. Six types of surface geological deformations triggered by the seismic event have been identified with the GEI: liquefaction structures, chaotic rupture zones, coastal and domal uplifts, river-delta turnovers, faults/ruptures and landslides. Potential geological hazards triggered by the Haiti earthquake include landslides, inundations, reactivation of active tectonic elements (e.g., fractures), river-delta turnovers, etc. We analyzed the GEI again after the rainy season and, as expected, most of the geological deformations that we initially identified had been erased or modified by water washout, or buried by sediments. In this sense the GEI constitutes an invaluable instrument in the analysis of seismic geological hazards: we still have the possibility of comparing all the images before and after the earthquake recorded in its useful "time tool". These are in fact the only witnesses of most of the geological deformations triggered by the Haiti earthquake, stored in the virtual archives of the GEI; a field trip to the area today would be of little use, as most of these structures have disappeared. We will show

  19. Earthquake scenario in West Bengal with emphasis on seismic hazard microzonation of the city of Kolkata, India (United States)

    Nath, S. K.; Adhikari, M. D.; Maiti, S. K.; Devaraj, N.; Srivastava, N.; Mohapatra, L. D.


    Seismic microzonation is a process of estimating site-specific effects due to an earthquake on urban centers for its disaster mitigation and management. The state of West Bengal, located in the western foreland of the Assam-Arakan Orogenic Belt, the Himalayan foothills and Surma Valley, has been struck by several devastating earthquakes in the past, indicating the need for a seismotectonic review of the province, especially in light of probable seismic threat to its capital city of Kolkata, which is a major industrial and commercial hub in the eastern and northeastern region of India. A synoptic probabilistic seismic hazard model of Kolkata is initially generated at engineering bedrock (Vs30 ~ 760 m s-1) considering 33 polygonal seismogenic sources at two hypocentral depth ranges, 0-25 and 25-70 km; 158 tectonic sources; appropriate seismicity modeling; 14 ground motion prediction equations for three seismotectonic provinces, viz. the east-central Himalaya, the Bengal Basin and Northeast India selected through suitability testing; and appropriate weighting in a logic tree framework. Site classification of Kolkata performed following in-depth geophysical and geotechnical investigations places the city in D1, D2, D3 and E classes. Probabilistic seismic hazard assessment at a surface-consistent level - i.e., the local seismic hazard related to site amplification performed by propagating the bedrock ground motion with 10% probability of exceedance in 50 years through a 1-D sediment column using an equivalent linear analysis - predicts a peak ground acceleration (PGA) range from 0.176 to 0.253 g in the city. A deterministic liquefaction scenario in terms of spatial distribution of liquefaction potential index corresponding to surface PGA distribution places 50% of the city in the possible liquefiable zone. A multicriteria seismic hazard microzonation framework is proposed for judicious integration of multiple themes, namely PGA at the surface, liquefaction potential

  20. The 1843 earthquake: a maximising scenario for tsunami hazard assessment in the Northern Lesser Antilles? (United States)

    Roger, Jean; Zahibo, Narcisse; Dudon, Bernard; Krien, Yann


    The French Caribbean Islands are located over the Lesser Antilles active subduction zone, where a number of earthquakes have historically reached magnitude Mw 6.0 or more. According to available catalogs, these earthquakes have at times been able to trigger devastating local or regional tsunamis, either directly by the shaking or indirectly through induced landslides. For example, these islands suffered severely during the Mw ~7.5 Virgin Islands earthquake (1867), which triggered waves several meters high throughout the Lesser Antilles Arc, and, more recently, during the Mw 6.3 Les Saintes earthquake (2004), which was followed by a local 1 m high tsunami. However, in 1839 a Mw ~7.5 subduction earthquake occurred offshore Martinique, followed a few years later by the more famous 1843 Mw ~8.5 megathrust event, with an epicenter located approximately between Guadeloupe and Antigua, yet neither was reported to have produced a catastrophic tsunami. In this study we discuss the potential impact of a maximum credible scenario of tsunami generation for such a Mw 8.5 rupture at the subduction interface, using available geological information, numerical modeling of tsunami generation and propagation, and high-resolution bathymetric data, within the framework of tsunami hazard assessment for the French West Indies. Although the lack of historical tsunami data, especially for the 1843 event, remains unexplained, modeling results show that the tsunami impact is not uniformly distributed over the whole archipelago and can show important heterogeneities in maximum wave heights at specific places. This is readily explained by the bathymetry, by resonance phenomena due to the several islands around the mainland, and by the fringing coral reef partially surrounding those islands.

  1. Wenchuan Earthquake Surface Fault Rupture and Disaster: A Lesson on Seismic Hazard Assessment and Mitigation


    Yi Du; Furen Xie; Zhenming Wang


    The Ms 8.0 Wenchuan earthquake occurred along the Longmenshan Faults in China and was a great disaster. Most of the damage and casualties during the quake were concentrated along the surface rupture zones: the 240-km-long Beichuan-Yingxiu Fault and the 70-km-long Jiangyou-Guanxian Fault. Although the Longmenshan Faults are well known and studied, the surface fault ruptures were not considered in mitigation planning, and the associated ground-motion hazard was therefore underestimated. Not consid...

  2. Estimate of airborne release of plutonium from Babcock and Wilcox plant as a result of severe wind hazard and earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Mishima, J.; Schwendiman, L.C.; Ayer, J.E.


    As part of an interdisciplinary study to evaluate the potential radiological consequences of wind hazard and earthquake upon existing commercial mixed oxide fuel fabrication plants, the potential mass airborne releases of plutonium (source terms) from such events are estimated. The estimated source terms are based upon the fraction of enclosures damaged to three levels of severity (crush, puncture/penetrate, and loss of external filter, in order of decreasing severity), called the damage ratio, and the airborne release if all enclosures suffered that level of damage. The discussion of damage scenarios and source terms is divided into wind hazard and earthquake scenarios in order of increasing severity. The largest airborne releases from the building were for cases involving the catastrophic collapse of the roof over the major production areas: wind hazard at 110 mph and earthquakes with peak ground accelerations of 0.20 to 0.29 g. Wind hazards at higher air velocities and earthquakes with higher ground accelerations do not result in significantly greater source terms. The source terms were calculated as additional mass of respirable particles released over time up to 4 days; under these assumptions, approximately 98% of the mass of material of concern is made airborne from 2 h to 4 days after the event. The overall building source terms from the damage scenarios evaluated are shown in a table. The contribution of individual areas to the overall building source term is presented in order of increasing severity for wind hazard and earthquake.

  3. Earthquake risk reduction in the United States: An assessment of selected user needs and recommendations for the National Earthquake Hazards Reduction Program

    Energy Technology Data Exchange (ETDEWEB)



    This Assessment was conducted to improve the National Earthquake Hazards Reduction Program (NEHRP) by providing NEHRP agencies with information that supports their user-oriented setting of crosscutting priorities in the NEHRP strategic planning process. The primary objective of this Assessment was to take a "snapshot" evaluation of the needs of selected users throughout the major program elements of NEHRP. Secondary objectives were to conduct an assessment of the knowledge that exists (or is being developed by NEHRP) to support earthquake risk reduction, and to begin a process of evaluating how NEHRP is meeting user needs. An identification of NEHRP's strengths also resulted from the effort, since those strengths demonstrate successful methods that may be useful to NEHRP in the future. These strengths are identified in the text, and many of them represent important achievements since the Earthquake Hazards Reduction Act was passed in 1977.

  4. Summary of November 2010 meeting to evaluate turbidite data for constraining the recurrence parameters of great Cascadia earthquakes for the update of national seismic hazard maps (United States)

    Frankel, Arthur D.


    , 1996), which were the basis for seismic provisions in the 2000 International Building Code. These hazard maps used the paleoseismic studies to constrain the recurrence rate of great CSZ earthquakes. Goldfinger and his colleagues have since collected many more deep ocean cores and done extensive analysis on the turbidite deposits that they identified in the cores (Goldfinger and others, 2003, 2008, in press; Goldfinger, 2011). Using their dating of the sediments and correlation of features in the logs of density and magnetic susceptibility between cores, they developed a detailed chronology of great earthquakes along the CSZ for the past 10,000 years (Goldfinger and others, in press). These correlations consist of attempting to match the peaks and valleys in logs of density and magnetic susceptibility between cores separated, in some cases, by hundreds of kilometers. Based on this work, Goldfinger and others (2003, 2008, in press) proposed that the turbidite evidence indicated the occurrence of great earthquakes (Mw 8) that only ruptured the southern portion of the CSZ, as well as earthquakes with about Mw 9 that ruptured the entire length of the CSZ. For the southernmost portion of the CSZ, Goldfinger and others (in press) proposed a recurrence time of Mw 8 or larger earthquakes of about 230 years. This proposed recurrence time was shorter than the 500 year time that was incorporated in one scenario in the NSHM’s. It is important to note that the hazard maps of 1996 and later also included a scenario or set of scenarios with a shorter recurrence time for Mw 8 earthquakes, using rupture zones that are distributed along the length of the CSZ (Frankel and others, 1996; Petersen and others, 2008). 
Originally, this scenario was meant to correspond to the idea that some of the 500-year averaged ruptures seen in the paleoseismic evidence could have been a series of Mw 8 earthquakes that occurred over a short period of time (a few decades), rather than Mw 9 earthquakes.

  5. Coulomb static stress changes before and after the 23 October 2011 Van, eastern Turkey, earthquake (Mw = 7.1): implications for the earthquake hazard mitigation

    Directory of Open Access Journals (Sweden)

    M. Utkucu


    Coulomb stress changes before and after the 23 October 2011 Van, eastern Turkey, earthquake have been analysed using available data related to the background and the aftershock seismicity and the source faults. The coseismic stress changes of the background seismicity had slightly promoted stress over the rupture plane of the 2011 Van earthquake, while it yielded a stress shadow over the Gürpınar Fault which has been argued to have produced the 7 April 1646 Van earthquake. The stress shadow over the Gürpınar Fault has become more pronounced following the occurrence of the 2011 Van earthquake, meaning that the repetition of the 1646 Van earthquake has been further suppressed. Spatial distribution and source mechanisms of the 2011 Van earthquake's aftershocks have been utilised to define four clusters with regard to their relative location to the mainshock rupture. In addition, the aftershock sequence covers a much broader area toward the northeast. Correlations between the observed spatial patterns of the aftershocks and the coseismic Coulomb stress changes caused by the mainshock are determined by calculating the stress changes over both optimally oriented and specified fault planes. It is shown here that there is an apparent correlation between the mainshock stress changes and the observed spatial pattern of the aftershock occurrence, demonstrating the usefulness of the stress maps in constraining the likely locations of the upcoming aftershocks and mitigating earthquake hazard.
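The Coulomb metric behind analyses like this is commonly written ΔCFS = Δτ + μ′Δσn, where Δτ is the shear stress change in the slip direction, Δσn the normal stress change (unclamping positive), and μ′ the effective friction coefficient. A positive change promotes failure on a receiver fault; a negative one is a "stress shadow". A minimal sketch, with purely illustrative values (not data from the paper):

```python
# Coulomb failure stress change: dCFS = d_tau + mu_eff * d_sigma_n.
# d_tau: shear stress change resolved in the slip direction,
# d_sigma_n: normal stress change (unclamping positive),
# mu_eff: effective friction coefficient (0.4 is a common assumption).

def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """Return the Coulomb failure stress change, same units as the inputs."""
    return d_tau + mu_eff * d_sigma_n

# Hypothetical receiver faults (values in MPa, made up for illustration):
promoted = coulomb_stress_change(0.05, 0.02)    # positive -> failure promoted
shadowed = coulomb_stress_change(-0.08, -0.01)  # negative -> stress shadow
print(promoted, shadowed)
```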

  6. Coulomb static stress changes before and after the 23 October 2011 Van, eastern Turkey, earthquake (Mw = 7.1): implications for the earthquake hazard mitigation (United States)

    Utkucu, M.; Durmuş, H.; Yalçın, H.; Budakoğlu, E.; Işık, E.


    Coulomb stress changes before and after the 23 October 2011 Van, eastern Turkey, earthquake have been analysed using available data related to the background and the aftershock seismicity and the source faults. The coseismic stress changes of the background seismicity had slightly promoted stress over the rupture plane of the 2011 Van earthquake, while it yielded a stress shadow over the Gürpınar Fault which has been argued to have produced the 7 April 1646 Van earthquake. The stress shadow over the Gürpınar Fault has become more pronounced following the occurrence of the 2011 Van earthquake, meaning that the repetition of the 1646 Van earthquake has been further suppressed. Spatial distribution and source mechanisms of the 2011 Van earthquake's aftershocks have been utilised to define four clusters with regard to their relative location to the mainshock rupture. In addition, the aftershock sequence covers a much broader area toward the northeast. Correlations between the observed spatial patterns of the aftershocks and the coseismic Coulomb stress changes caused by the mainshock are determined by calculating the stress changes over both optimally oriented and specified fault planes. It is shown here that there is an apparent correlation between the mainshock stress changes and the observed spatial pattern of the aftershock occurrence, demonstrating the usefulness of the stress maps in constraining the likely locations of the upcoming aftershocks and mitigating earthquake hazard.

  7. Regional seismic hazard for Revithoussa, Greece: an earthquake early warning Shield and selection of alert signals

    Directory of Open Access Journals (Sweden)

    Y. Xu


    The feasibility of an earthquake early warning Shield in Greece is being explored as a European demonstration project. This will be the first early warning system in Europe. The island of Revithoussa is a liquid natural gas storage facility near Athens from which a pipeline runs to a gas distribution centre in Athens. The Shield is being centred on these facilities. The purpose here is to analyze seismicity and seismic hazard in relation to the Shield centre and the remote sensor sites in the Shield network, eventually to help characterize the hazard levels, seismic signals and ground vibration levels that might be observed or create an alert situation at a station. Thus this paper mainly gives estimation of local seismic hazard in the regional working area of Revithoussa by studying extreme peak ground acceleration (PGA) and magnitudes. Within the Shield region, the most important zone to be detected is WNW from the Shield centre at a relatively short distance (50 km or less): the Gulf of Corinth (active normal faults) region. This is the critical zone for early warning of strong ground shaking. A second key region of seismicity is at an intermediate distance (100 km or more) from the centre: the Hellenic seismic zone south or southeast from Peloponnisos. A third region to be detected is northeast of the centre at a relatively long distance (about 150 km): Lemnos Island and the neighboring region. Several parameters are estimated to characterize the seismicity and hazard. These include: the 50-year PGA with 90% probability of not being exceeded (pnbe) using the Theodulidis & Papazachos strong motion attenuation for Greece, PGANTP; the 50-year magnitude and also at the 90% pnbe, M50 and MP50, respectively. There are also estimates of the earthquake that is most likely to be felt at a damaging intensity level; these are the most perceptible earthquakes at intensities VI, VII and VIII with magnitudes MVI, MVII and MVIII
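A "50-year PGA with 90% probability of not being exceeded" maps onto a Poisson-model return period via exp(-t/T) = p, i.e. T = -t/ln(p). A short sketch of the conversion; the numbers are the standard textbook case, not values from this study:

```python
import math

def return_period(t_years, p_not_exceeded):
    """Poisson-model return period T such that exp(-t/T) = p_not_exceeded."""
    return -t_years / math.log(p_not_exceeded)

# 90% probability of non-exceedance in 50 years gives the familiar
# ~475-year return period used in many seismic codes.
T = return_period(50.0, 0.90)
print(round(T))  # -> 475
```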

  8. Tectonic Origin of the 1899 Yakutat Bay Earthquakes, Alaska, and Insights into Future Hazards (United States)

    Gulick, S. S.; LeVoir, M. A.; Haeussler, P. J.; Saustrup, S.


    On September 10th, the largest of four earthquakes (Mw 8.2) that occurred in southeast Alaska in 1899 produced a 6 m tsunami and may have produced as much as 14 m of co-seismic uplift. This earthquake had an epicenter somewhere near Yakutat or Disenchantment Bays. These bays lie at the transition between the Fairweather Fault (the Pacific-North American strike-slip plate boundary), and the Yakutat Terrane-North American subduction zone. The deformation front of this subduction zone is thought to include the eastern fault in the Pamplona Zone offshore, the Malaspina Fault onshore, and the Esker Creek Fault near Yakutat Bay. The 10 September 1899 event could have taken place on a Yakutat-North American megathrust that daylights in Yakutat or Disenchantment Bay. Alternatively, the 10 September 1899 earthquake could have originated from the Fairweather-Boundary and Yakutat faults, transpressive components of the Fairweather strike-slip system present in the Yakutat Bay region, or from thrusting along the Yakutat and Otemaloi Faults on the southeast flank of Yakutat Bay. Characterizing fault slip during the Alaskan earthquakes of 1899 is vital to assessing both subduction zone structure and seismic hazards in the Yakutat Bay area. Each possible fault model has a different implication for modern hazards. These results will be used to update seismic hazard and fault maps and assess future risk to the Yakutat Bay and surrounding communities. During Aug. 6-17th, we anticipate acquiring high-resolution, marine multichannel seismic data aboard the USGS vessel Alaskan Gyre in Yakutat and Disenchantment Bays to search for evidence of recent faulting and directly test these competing theories for the 10 September 1899 event. This survey uses the University of Texas Institute for Geophysics' mini-GI gun, 24-channel seismic streamer, portable seismic compressor system, and associated gun control and data acquisition system to acquire the data. The profiles have a nominal common

  9. Evaluating earthquake hazards in the Los Angeles region; an earth-science perspective (United States)

    Ziony, Joseph I.


    Potentially destructive earthquakes are inevitable in the Los Angeles region of California, but hazards prediction can provide a basis for reducing damage and loss. This volume identifies the principal geologically controlled earthquake hazards of the region (surface faulting, strong shaking, ground failure, and tsunamis), summarizes methods for characterizing their extent and severity, and suggests opportunities for their reduction. Two systems of active faults generate earthquakes in the Los Angeles region: northwest-trending, chiefly horizontal-slip faults, such as the San Andreas, and west-trending, chiefly vertical-slip faults, such as those of the Transverse Ranges. Faults in these two systems have produced more than 40 damaging earthquakes since 1800. Ninety-five faults have slipped in late Quaternary time (approximately the past 750,000 yr) and are judged capable of generating future moderate to large earthquakes and displacing the ground surface. Average rates of late Quaternary slip or separation along these faults provide an index of their relative activity. The San Andreas and San Jacinto faults have slip rates measured in tens of millimeters per year, but most other faults have rates of about 1 mm/yr or less. Intermediate rates of as much as 6 mm/yr characterize a belt of Transverse Ranges faults that extends from near Santa Barbara to near San Bernardino. The dimensions of late Quaternary faults provide a basis for estimating the maximum sizes of likely future earthquakes in the Los Angeles region: moment magnitude (M) 8 for the San Andreas, M 7 for the other northwest-trending elements of that fault system, and M 7.5 for the Transverse Ranges faults. Geologic and seismologic evidence along these faults, however, suggests that, for planning and designing noncritical facilities, appropriate sizes would be M 8 for the San Andreas, M 7 for the San Jacinto, M 6.5 for other northwest-trending faults, and M 6.5 to 7 for the Transverse Ranges faults. The

  10. Geodetic constraints on frictional properties and earthquake hazard in the Imperial Valley, Southern California (United States)

    Lindsey, Eric O.; Fialko, Yuri


    We analyze a suite of geodetic observations across the Imperial Fault in southern California that span all parts of the earthquake cycle. Coseismic and postseismic surface slips due to the 1979 M 6.6 Imperial Valley earthquake were recorded with trilateration and alignment surveys by Harsh (1982) and Crook et al. (1982), and interseismic deformation is measured using a combination of multiple interferometric synthetic aperture radar (InSAR)-viewing geometries and continuous and survey-mode GPS. In particular, we combine more than 100 survey-mode GPS velocities with InSAR data from Envisat descending tracks 84 and 356 and ascending tracks 77 and 306 (149 total acquisitions), processed using a persistent scatterers method. The result is a dense map of interseismic velocities across the Imperial Fault and surrounding areas that allows us to evaluate the rate of interseismic loading and along-strike variations in surface creep. We compare available geodetic data to models of the earthquake cycle with rate- and state-dependent friction and find that a complete record of the earthquake cycle is required to constrain key fault properties including the rate-dependence parameter (a - b) as a function of depth, the extent of shallow creep, and the recurrence interval of large events. We find that the data are inconsistent with a high (>30 mm/yr) slip rate on the Imperial Fault and investigate the possibility that an extension of the San Jacinto-Superstition Hills Fault system through the town of El Centro may accommodate a significant portion of the slip previously attributed to the Imperial Fault. Models including this additional fault are in better agreement with the available observations, suggesting that the long-term slip rate of the Imperial Fault is lower than previously suggested and that there may be a significant unmapped hazard in the western Imperial Valley.

  11. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal (United States)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.


    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region’s seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely, bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ - deviation of ground motion to the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels of three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard of a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in the emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations for all return periods. Moreover, seismic hazard of most Algarve’s parishes is dominated by the seismicity located
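The "modal values" of a disaggregation are simply the magnitude-location-epsilon bins that contribute most to the hazard at the chosen ground-motion level. A toy sketch of picking the controlling scenario; the bins and contributions below are made up for illustration, not the Algarve results:

```python
# Hypothetical disaggregation: fractional hazard contribution per
# (magnitude, distance_km, epsilon) bin. The modal bin is the
# controlling scenario for the site and return period.
bins = {
    (5.5, 20, 0.5): 0.12,
    (6.0, 40, 1.0): 0.31,
    (7.5, 150, 1.5): 0.24,
    (8.5, 300, 2.0): 0.33,  # e.g. a distant great-earthquake scenario
}

modal_bin = max(bins, key=bins.get)
print(modal_bin)  # -> (8.5, 300, 2.0), the controlling scenario
```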

  12. Network similarity and statistical analysis of earthquake seismic data (United States)

    Deyasi, Krishanu; Chakraborty, Abhijit; Banerjee, Anirban


    We study the structural similarity of earthquake networks constructed from seismic catalogs of different geographical regions. A hierarchical clustering of underlying undirected earthquake networks is shown using Jensen-Shannon divergence in graph spectra. The directed nature of links indicates that each earthquake network is strongly connected, which motivates us to study the directed version statistically. Our statistical analysis of each earthquake region identifies the hub regions. We calculate the conditional probability of the forthcoming occurrences of earthquakes in each region. The conditional probability of each event has been compared with their stationary distribution.
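The Jensen-Shannon divergence used above for hierarchical clustering of graph spectra is a symmetric, bounded variant of the Kullback-Leibler divergence. A minimal sketch; the two "spectra" below are made-up normalized histograms, not data from the paper:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence of discrete distributions p and q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: symmetric, bounded above by log(2)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical normalized graph-spectrum histograms of two earthquake networks
spec_a = [0.1, 0.4, 0.3, 0.2]
spec_b = [0.2, 0.3, 0.3, 0.2]
d = jensen_shannon(spec_a, spec_b)
print(d)  # small value -> structurally similar networks
```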

  13. Sensitivity Analysis of Evacuation Speed in Hypothetical NPP Accident by Earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)


    Effective emergency response can change the consequences of a nuclear power plant (NPP) accident, so it is an important consideration when establishing an emergency response plan and assessing the risk of a hypothetical NPP accident. The emergency response situation can change completely when the NPP accident is caused by an earthquake or tsunami, because roads and buildings may fail in the disaster. Among the various factors involved, this study focuses on evacuation speed: reasonable evacuation speeds for an earthquake scenario are investigated, and a sensitivity analysis of evacuation speed in a hypothetical earthquake-induced NPP accident is performed. Various references were reviewed, and an earthquake evacuation model was developed in which evacuees may switch from vehicles to walking when vehicle use becomes difficult due to intense traffic jams or the failure of buildings and roads. The population dose within 5 km / 30 km was found to increase in the earthquake situation because of the reduced evacuation speed, becoming 1.5 - 2 times larger in the severest earthquake evacuation scenario set up in this study. These results argue against applying the emergency response model used for normal evacuation situations when performing level 3 probabilistic safety assessment for earthquake and tsunami events; the data investigation and sensitivity analysis carried out here support the construction of a differentiated emergency response model for seismic hazard.

  14. The smart cluster method. Adaptive earthquake cluster identification and analysis in strong seismic regions (United States)

    Schaefer, Andreas M.; Daniell, James E.; Wenzel, Friedemann


    Earthquake clustering is an essential part of almost any statistical analysis of spatial and temporal properties of seismic activity. The nature of earthquake clusters and subsequent declustering of earthquake catalogues plays a crucial role in determining the magnitude-dependent earthquake return period and its respective spatial variation for probabilistic seismic hazard assessment. This study introduces the Smart Cluster Method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal cluster identification. It utilises the magnitude-dependent spatio-temporal earthquake density to adjust the search properties, subsequently analyses the identified clusters to determine directional variation and adjusts its search space with respect to directional properties. In the case of rapid subsequent ruptures like the 1992 Landers sequence or the 2010-2011 Darfield-Christchurch sequence, a reclassification procedure is applied to disassemble subsequent ruptures using near-field searches, nearest neighbour classification and temporal splitting. The method is capable of identifying and classifying earthquake clusters in space and time. It has been tested and validated using earthquake data from California and New Zealand. A total of more than 1500 clusters have been found in both regions since 1980 with Mmin = 2.0. Utilising the knowledge of cluster classification, the method has been adjusted to provide an earthquake declustering algorithm, which has been compared to existing methods. Its performance is comparable to established methodologies. The analysis of earthquake clustering statistics led to various new and updated correlation functions, e.g. for ratios between mainshock and strongest aftershock and general aftershock activity metrics.
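The SCM's adaptive, direction-aware search is beyond a short sketch, but the underlying idea of magnitude-dependent space-time windows can be illustrated with a simple single-pass decluster. The window scalings and catalog below are hypothetical placeholders, not the SCM's or any published coefficients:

```python
import math

# Simplified magnitude-dependent space-time windowing, in the spirit of
# classic window-based declustering (not the SCM algorithm itself).

def windows(mag):
    """Return (days, km) search window for a mainshock of magnitude mag.
    The scalings are hypothetical, for illustration only."""
    days = 10 ** (0.25 * mag + 0.5)
    km = 10 ** (0.12 * mag + 0.6)
    return days, km

def decluster(events):
    """events: list of (t_days, x_km, y_km, mag), sorted by time.
    An event inside the window of an earlier independent event is
    classified as dependent (aftershock) and removed."""
    mainshocks = []
    for t, x, y, m in events:
        dependent = False
        for mt, mx, my, mm in mainshocks:
            dt_max, dr_max = windows(mm)
            if t - mt <= dt_max and math.hypot(x - mx, y - my) <= dr_max:
                dependent = True
                break
        if not dependent:
            mainshocks.append((t, x, y, m))
    return mainshocks

# Toy catalog: a mainshock, a nearby aftershock two days later,
# and an unrelated distant event much later.
catalog = [(0.0, 0.0, 0.0, 6.5), (2.0, 3.0, 1.0, 4.0), (400.0, 80.0, 0.0, 5.0)]
print(decluster(catalog))  # aftershock at t=2 removed; distant event kept
```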

  15. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States (United States)

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.


    The U.S. Geological Survey (USGS) has produced a one‐year (2016) probabilistic seismic‐hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes and that is constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short‐term seismic activity rates (primarily 2014–2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground‐motion models. These alternatives represent uncertainties on how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. Both alternative submodel hazard maps depict high hazard, and they are combined in the final model. Results depict several ground‐shaking measures as well as intensity and include maps showing a high‐hazard level (1% probability of exceedance in 1 year or greater). Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north‐central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in

  16. Time-dependent neo-deterministic seismic hazard scenarios: Preliminary report on the M6.2 Central Italy earthquake, 24th August 2016

    CERN Document Server

    Peresan, Antonella; Romashkova, Leontina; Magrin, Andrea; Soloviev, Alexander; Panza, Giuliano F


    A scenario-based Neo-Deterministic approach to Seismic Hazard Assessment (NDSHA) is available nowadays, which permits considering a wide range of possible seismic sources as the starting point for deriving scenarios by means of full waveforms modeling. The method does not make use of attenuation relations and naturally supplies realistic time series of ground shaking, including reliable estimates of ground displacement, readily applicable to complete engineering analysis. Based on the neo-deterministic approach, an operational integrated procedure for seismic hazard assessment has been developed that allows for the definition of time dependent scenarios of ground shaking, through the routine updating of earthquake predictions, performed by means of the algorithms CN and M8S. The integrated NDSHA procedure for seismic input definition, which is currently applied to the Italian territory, combines different pattern recognition techniques, designed for the space-time identification of strong earthquakes, with al...

  17. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following hazardous scenarios received increased attention:
    • For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release.
    • For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of “critical controls” was identified for these scenarios (see Section 4) that prevent the occurrence or mitigate the effects of events with significant consequences.

  18. Pulsed Electric Processing of the Seismic-Active Fault for Earthquake Hazard Mitigation (United States)

    Novikov, V. A.; Zeigarnik, V. A.; Konev, Yu. B.; Klyuchkin, V. N.


    Previous field and laboratory investigations performed in Russia (1999-2008) showed a possibility of application of high-power electric current pulses generated by a pulsed MHD power system for triggering weak seismicity and the release of tectonic stresses in the Earth crust for earthquake hazard mitigation. The mechanism of the influence of the man-made electromagnetic field on the regional seismicity is not clear yet. One possible cause of the phenomenon may be formation of cracks in the rocks under fluid pressure increase due to Joule heat generation by electric current injected into the Earth crust. A detailed 3D calculation of electric current density in the Earth crust of Northern Tien Shan provided by a pulsed MHD power system connected to a grounded electric dipole showed that at the depth of earthquake epicenters (> 5 km) the electric current density is lower than 10^-7 A/m^2, which is not sufficient for an increase of pressure in the fluid-saturated porous geological medium due to Joule heat generation that could produce cracks resulting in fault propagation and release of tectonic stresses in the Earth crust. Nevertheless, under certain conditions, when electric current is injected into the fault through the casing pipes of deep wells with preliminary injection of conductive fluid into the fault, the current density may be high enough for a significant increase of mechanical pressure in the porous two-phase geological medium. Numerical analysis of crack formation triggered by high-power electric pulses based on generation of mechanical pressure in the geological medium was carried out. It was shown that calculation of the mechanical pressure impulse due to high-power electric current in the porous two-phase medium may be performed neglecting thermal conductance by solving the non-stationary equation of piezo-conductivity with Joule heat generation. For calculation of heat generation the known solution of the task of current spreading from spherical or

  19. Canister storage building hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.; Garvin, L.J.


    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  20. Cold Vacuum Drying Facility hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.


    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  1. Earthquake response analysis of 11-story RC building that suffered damage in 2011 East Japan Earthquake (United States)

    Shibata, Akenori; Masuno, Hidemasa


    An eleven-story RC apartment building suffered medium damage in the 2011 East Japan earthquake and was retrofitted for re-use. Strong motion records were obtained near the building. This paper discusses the inelastic earthquake response analysis of the building using an equivalent single-degree-of-freedom (1-DOF) system to account for the features of the damage. The method of converting the building frame into a 1-DOF system with tri-linear reducing-stiffness restoring force characteristics is given. The inelastic response analysis of the building against the earthquake using the equivalent inelastic 1-DOF system interpreted the level of actual damage well.

  2. A summary of hazard datasets and guidelines supported by the Global Earthquake Model during the first implementation phase

    Directory of Open Access Journals (Sweden)

    Marco Pagani


    The Global Earthquake Model (GEM) initiative promotes open, transparent and collaborative science aimed at the assessment of earthquake risk and its reduction worldwide. During the first implementation phase (2009-2014), GEM sponsored five projects aimed at the creation of global datasets and guidelines toward the creation of open, transparent and, as far as possible, homogeneous hazard input models. These projects concentrated on the following global databases and models: an instrumental catalogue, a historical earthquake archive and catalogue, a geodetic strain rate model, a database of active faults, and a set of ground motion prediction equations. This paper describes the main outcomes of these projects illustrating some initial applications as well as challenges in the creation of hazard models.

  3. The Abanico del Quindio alluvial fan, Armenia, Colombia: Active tectonics and earthquake hazard (United States)

    Vargas, Carlos A.; Nieto, Marco; Monsalve, Hugo; Montes, Luis; Valdes, Mireya


    The Abanico del Quindío (AQ) fan, a volcaniclastic deposit from the Ruiz-Tolima volcanic complex (RTVC), Colombia, provides insight into recent deformation in the Central Andes. The use of geological observations, geophysical measurements, and estimates of fault-scarp ages constrains the timing of recent tectonic activity. Gravity and magnetic analyses, along with geomorphologic cartography, allow the detection of lateral variations in basement distribution and at least three structural trends that cut the AQ: the Armenia fault (NNE), El Danubio fault (NNW), and Hojas Anchas fault (E-W). Recent deformation in the zone results from slip on the Armenia and El Danubio faults and suggests a maximum interval magnitude of 5.1 < Mw < 6.3, with ages ranging between 2560 ± 480 yr B.P. and 4120 ± 780 yr B.P. Although no surface ruptures are associated with historical events on the fault segments in this zone, blind structures may have influenced the hypocentral distribution of events recorded after the Armenia Earthquake (Mw 6.2, 25 January 1999). Further geophysical studies are needed to understand the Romeral Fault System and assess the earthquake hazard for the city of Armenia.

  4. Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Eisses, A.; Kell, A.; Kent, G. [UNR; Driscoll, N. [UCSD; Karlin, R.; Baskin, R. [USGS; Louie, J. [UNR; Pullammanappallil, S. [Optim


    Amy Eisses, Annie M. Kell, Graham Kent, Neal W. Driscoll, Robert E. Karlin, Robert L. Baskin, John N. Louie, Kenneth D. Smith, Sathish Pullammanappallil, 2011, Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada: presented at American Geophysical Union Fall Meeting, San Francisco, Dec. 5-9, abstract NS14A-08.

  5. Metrics, Bayes, and BOGSAT: Recognizing and Assessing Uncertainties in Earthquake Hazard Maps (United States)

    Stein, S. A.; Brooks, E. M.; Spencer, B. D.


    Recent damaging earthquakes in areas predicted to be relatively safe illustrate the need to assess how seismic hazard maps perform. At present, there is no agreed way of assessing how well a map performed. The metric implicit in current maps, that during a time interval predicted shaking will be exceeded only at a specific fraction of sites, is useful but permits maps to be nominally successful although they significantly underpredict or overpredict shaking, or nominally unsuccessful but predict shaking well. We explore metrics that measure the effects of overprediction and underprediction. Although no single metric fully characterizes map behavior, using several metrics can provide useful insight for comparing and improving maps. A related question is whether to regard larger-than-expected shaking as a low-probability event allowed by a map, or to revise the map to show increased hazard. Whether and how much to revise a map is complicated, because a new map that better describes the past may or may not better predict the future. The issue is like deciding after a coin has come up heads a number of times whether to continue assuming that the coin is fair and the run is a low-probability event, or to change to a model in which the coin is assumed to be biased. This decision can be addressed using Bayes' Rule, so that how much to change depends on the degree of one's belief in the prior model. Uncertainties are difficult to assess for hazard maps, which require subjective assessments and choices among many poorly known or unknown parameters. However, even rough uncertainty measures for estimates/predictions from such models, sometimes termed BOGSATs (Bunch Of Guys Sitting Around Table) by risk analysts, can give users useful information to make better decisions. We explore the extent of uncertainty via sensitivity experiments on how the predicted hazard depends on model parameters.
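The coin analogy above can be made concrete with Bayes' Rule: how far a run of heads should shift belief toward a biased-coin model depends on the prior. A toy sketch, with an arbitrary prior and bias chosen purely for illustration:

```python
from fractions import Fraction

def posterior_biased(n_heads, prior_biased=Fraction(1, 10), p_biased=Fraction(9, 10)):
    """P(coin is biased | n_heads in a row), for a coin that is either fair
    or biased toward heads with probability p_biased. Illustrative values."""
    like_biased = p_biased ** n_heads * prior_biased
    like_fair = Fraction(1, 2) ** n_heads * (1 - prior_biased)
    return like_biased / (like_biased + like_fair)

# A short run of heads barely moves a sceptical prior; a long run dominates it.
print(float(posterior_biased(2)))   # still modest
print(float(posterior_biased(10)))  # belief in bias now overwhelming
```

The same structure applies to hazard maps: the stronger the prior belief in the existing model, the more "larger-than-expected shaking" it takes before revising the map becomes the rational choice.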

  6. Historical analysis of US pipeline accidents triggered by natural hazards (United States)

    Girgin, Serkan; Krausmann, Elisabeth


    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  7. Variation of some Planetary seismic hazard indices on the occasion of Lefkada, Greece, earthquake of 17 November, 2015. (United States)

    Contadakis, Michael E.; Arabelos, Demetrios N.; Vergos, George; Spatala, Spyrous; Skeberis, Christos; Xenos, Tomas D.


    By the term "Planetary seismic hazard indices" we mean parameters or observables which indicate the degree of the mutual interactions of tectonically active areas on the Earth's surface with some parts or phenomena of the Geosphere and the near-Earth space. In this paper we investigate the variation of the tidal triggering effect efficiency, by means of the tidal seismicity compliance parameter p (Arabelos et al. 2016, Contadakis et al. 2009, Contadakis et al. 2012, Vergos et al. 2015), as well as the lower ionosphere variations, by means of the variation of the high-frequency limit, fo, of the ionospheric turbulence content (Contadakis et al. 2009, Contadakis et al. 2012, Contadakis et al. 2015), with the time and space proximity to the site of the earthquake occurrence. The results of our investigation are: (1) The mapping of the tidal seismicity compliance parameter p over Greece indicates an increasing tectonic stress criticality for the year 2015 in the area of the Ionian islands in relation to other areas in Greece, pointing to the area of a possible strong earthquake. (2) The high-frequency limit fo of the ionospheric turbulence content, measured by analyzing TEC variations, increases as the site and the moment of the earthquake occurrence are approached, pointing to the earthquake locus. (3) Finally, the analyzed data from the receiver of the INFREP network in Thessaloniki, Greece (40.59N, 22.78E) (Skeberis et al. 2015), which monitors VLF transmitters based in Tavolara and Niscemi, Italy, Keflavik, Iceland, and Anthorn, UK, show that the signals from the two VLF European transmitters transmitted over Lefkada indicate enhanced high-frequency variations, in accordance with the result of the TEC analysis, in the last ten days before the moment of the earthquake occurrence. References Arabelos, D.N., Contadakis, E.M., Vergos, G., Spatalas, S.D., 2016, Variation of the Earth tide-seismicity compliance parameter during the recent seismic activity in Fthiotida, central Greece

  8. Soil response along the coastal plain of Israel for seismic hazard assessments and earthquake scenario applications (United States)

    Shapira, A.; Zaslavsky, Y.; Gorstein, M.; Kalmanovich, M.


    About 2 million inhabitants of Israel, almost one third of the total population of the country, live in a narrow strip along the coast between the towns of Ashqelon in the south and Haifa in the north (130x10 km^2). Due to the high population density, this region may be considered a high seismic risk zone. The objective of this study was to derive the ground shaking characteristics, resonance frequencies and amplification factors for different site conditions along the coast. The quantitative assessment of the site response to seismic motions is made from the horizontal-to-vertical spectral ratios of ambient noise measurements at 190 sites. The loose sediments of the sand and alluvium units yield amplification factors of 2-3 in the frequency range 1.2-3.5 Hz. On the Carmel coast, the complex of calcareous sandstone and loose sediments, with a total thickness of 15-30 m, that covers the Judea Group carbonates may yield amplification factors of up to 8 at frequencies ranging from 2 to 6 Hz. The observed resonance frequencies and their amplifications were correlated with analytical functions that correspond to 1-D subsurface models. In many cases, we could not obtain useful borehole data to estimate the depth to the half-space. The thickness of the resonating soil layers was better estimated by correlating regional geological information with the mapped distribution of the predominant frequency and the maximum amplification across the investigated area. The Coastal Plain area was divided into six geographical zones, each characterized by its fundamental frequency and amplification factor. For each zone we adjusted the characteristic soil-column model. Under the assumption that the soil layers will respond linearly to the expected seismic ground motions, and incorporating the local site conditions, we implemented the Stochastic Evaluation of Earthquake Hazard procedure (Shapira and van Eck, 1993) to assess the site-specific uniform hazard in terms of peak ground acceleration
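
The horizontal-to-vertical spectral-ratio idea underlying the measurements above can be illustrated on synthetic data. Everything here (sampling rate, resonance frequency, noise model, window length) is an assumption for demonstration, not the study's data:

```python
import numpy as np

# Minimal synthetic sketch of the H/V spectral-ratio technique: the ratio of
# averaged horizontal to vertical amplitude spectra of ambient noise peaks
# near the site resonance frequency.
fs = 100.0                               # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)             # 60 s of synthetic "ambient noise"
rng = np.random.default_rng(0)

f0 = 2.0                                 # resonance frequency to recover
vert = rng.normal(size=t.size)           # vertical component: plain noise
horiz = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * f0 * t)  # amplified

seglen = 1000                            # average spectra over 10 s windows

def avg_spectrum(x):
    segs = x[: (x.size // seglen) * seglen].reshape(-1, seglen)
    return np.abs(np.fft.rfft(segs, axis=1)).mean(axis=0)

freqs = np.fft.rfftfreq(seglen, 1 / fs)
hv = avg_spectrum(horiz) / avg_spectrum(vert)

band = (freqs > 0.5) & (freqs < 10.0)    # search the 0.5-10 Hz band
print(f"estimated resonance: {freqs[band][hv[band].argmax()]:.1f} Hz")
```

Averaging the spectra over several windows stabilizes the ratio, which is why the peak lands at the injected resonance rather than at a random noise spike.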

  9. Estimating Earthquake Hazards in the San Pedro Shelf Region, Southern California (United States)

    Baher, S.; Fuis, G.; Normark, W. R.; Sliter, R.


    The San Pedro Shelf (SPS) region of the inner California Borderland offshore southern California poses a significant seismic hazard to the contiguous Los Angeles area, as a consequence of late Cenozoic compressional reactivation of mid-Cenozoic extensional faults. The extent of the hazard, however, is poorly understood because of the complexity of fault geometries and uncertainties in earthquake locations. The major faults in the region include the Palos Verdes, THUMS Huntington Beach and Newport-Inglewood fault zones. We report here the analysis and interpretation of wide-angle seismic-reflection and refraction data recorded as part of the Los Angeles Region Seismic Experiment line 1 (LARSE 1), multichannel seismic (MCS) reflection data obtained by the USGS (1998-2000), and industry borehole stratigraphy. The onshore-offshore velocity model, which is based on forward modeling of the refracted P-wave arrival times, is used to depth migrate the LARSE 1 section. Borehole stratigraphy allows correlation of the onshore and offshore velocity models because state regulations prevent collection of deep-penetration acoustic data nearshore (within 3 mi.). Our refraction study is an extension of the ten Brink et al. (2000) tomographic inversion of LARSE 1 data, which found high velocities (> 6 km/sec) at about ~3.5 km depth from the Catalina Fault (CF) to the SPS. We find these velocities shallower (around 2 km depth) beneath the Catalina Ridge (CR) and SPS, but at a depth of 2.5-3.0 km elsewhere in the study region. This change in velocity structure can provide additional constraints on the tectonic processes of this region. The structural horizons observed in the LARSE 1 reflection data are tied to adjacent MCS lines. We find localized folding and faulting at depth (~2 km) southwest of the CR and on the SPS slope. Quasi-laminar beds, possibly of pelagic origin, follow the contours of earlier folded (wavelength ~1 km) and faulted Cenozoic sedimentary and volcanic rocks.
Depth to

  10. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (United States)


    ... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Provisions § 123.6 Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard... fish or fishery product being processed in the absence of those controls. (b) The HACCP plan. Every...

  11. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    Energy Technology Data Exchange (ETDEWEB)

    Türker, Tuğba, E-mail: [Karadeniz Technical University, Department of Geophysics, Trabzon/Turkey (Turkey); Bayrak, Yusuf, E-mail: [Ağrı İbrahim Çeçen University, Ağrı/Turkey (Turkey)


    The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and is located in one of the regions of highest seismic activity. The NAFZ has experienced very large earthquakes from the past to the present. The aims of this study are to estimate the important parameters of the Gutenberg-Richter relationship (a and b values); taking these parameters into account, to examine the earthquakes between the years 1900 and 2015 for 10 different seismic source regions in the NAFZ; to estimate the occurrence probabilities and return periods of earthquakes in the fault zone in the coming years; and to assess the earthquake hazard of the NAFZ with the Poisson method. Region 2 experienced its largest earthquakes only in the historical period, and no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, M{sub S}=7.3 and 1897, M{sub S}=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 different seismic source regions, the relationships between cumulative number and magnitude are determined, from which the a and b parameters are estimated with the Gutenberg-Richter equation LogN=a-bM. A homogeneous earthquake catalog for M{sub S} magnitudes equal to or larger than 4.0 is used for the time period between 1900 and 2015. The catalog used in the study has been created from the International Seismological Centre (ISC) and Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI) databases: the earthquake data from 1900 to 1974 were obtained from KOERI and ISC, and from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 different seismic source regions. Among the highest occurrence probabilities estimated for the coming years is that of the Tokat-Erzincan region (Region 9), at 99%
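
The Gutenberg-Richter fit and Poisson occurrence probabilities described above can be sketched as follows. The magnitudes, cumulative counts, and catalog span below are invented for illustration and are not the paper's data:

```python
import numpy as np

# Fit log10 N = a - b*M by least squares on cumulative counts, then treat
# earthquake occurrence as a Poisson process to get exceedance probabilities
# and return periods. Catalog values here are assumed, not real.
mags = np.array([4.0, 4.5, 5.0, 5.5, 6.0, 6.5, 7.0])
counts = np.array([500, 180, 60, 20, 7, 2, 1])   # cumulative N(>= M), assumed

slope, a = np.polyfit(mags, np.log10(counts), 1)
b = -slope                                        # slope is -b in log10 N = a - b*M
print(f"a = {a:.2f}, b = {b:.2f}")

years = 115                                       # catalog span, e.g. 1900-2015
rate_m7 = 10 ** (a - b * 7.0) / years             # annual rate of M >= 7 events

# Poisson probability of at least one M >= 7 event in t years
for t in (10, 50, 100):
    p = 1.0 - np.exp(-rate_m7 * t)
    print(f"P(M>=7 within {t} yr) = {p:.2f}")
print(f"return period = {1.0 / rate_m7:.0f} yr")
```

The exceedance probability 1 - exp(-λt) and the return period 1/λ are the two quantities the study tabulates per source region.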

  12. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    Energy Technology Data Exchange (ETDEWEB)

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith


    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.

  13. Y-12 site-specific earthquake response analysis and soil liquefaction assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, S.B.; Hunt, R.J.; Manrod, W.E. III


    A site-specific earthquake response analysis and soil liquefaction assessment were performed for the Oak Ridge Y-12 Plant. The main purpose of these studies was to use the results of the analyses for evaluating the safety of the performance category (PC) -1, -2, and -3 facilities against natural phenomena seismic hazards. Earthquake response was determined for seven (7) one-dimensional soil columns (Fig. 12) using two horizontal components of the PC-3 design basis 2000-year seismic event. The computer program SHAKE 91 (Ref. 7) was used to calculate the absolute response accelerations on top of ground (soil/weathered shale) and rock outcrop. The SHAKE program has been validated for horizontal response calculations at periods of less than 2.0 seconds at several sites and consequently is widely accepted in the geotechnical earthquake engineering field for site response analysis.

  14. Risk analysis based on hazards interactions (United States)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost


    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event

  15. Lateral spread hazard mapping of the northern Salt Lake Valley, Utah, for a M7.0 scenario earthquake (United States)

    Olsen, M.J.; Bartlett, S.F.; Solomon, B.J.


    This paper describes the methodology used to develop a lateral spread-displacement hazard map for northern Salt Lake Valley, Utah, using a scenario M7.0 earthquake occurring on the Salt Lake City segment of the Wasatch fault. The mapping effort is supported by a substantial amount of geotechnical, geologic, and topographic data compiled for the Salt Lake Valley, Utah. ArcGIS routines created for the mapping project then input this information to perform site-specific lateral spread analyses using methods developed by Bartlett and Youd (1992) and Youd et al. (2002) at individual borehole locations. The distributions of predicted lateral spread displacements from the boreholes located spatially within a geologic unit were subsequently used to map the hazard for that particular unit. The mapped displacement zones consist of low hazard (0-0.1 m), moderate hazard (0.1-0.3 m), high hazard (0.3-1.0 m), and very high hazard (> 1.0 m). As expected, the produced map shows the highest hazard in the alluvial deposits at the center of the valley and in sandy deposits close to the fault. This mapping effort is currently being applied to the southern part of the Salt Lake Valley, Utah, and probabilistic maps are being developed for the entire valley. © 2007, Earthquake Engineering Research Institute.
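
The four displacement zones above amount to a simple classification step applied to each predicted displacement. A minimal sketch, with thresholds taken from the abstract (the function name is a hypothetical illustration, not the authors' code):

```python
# Bin a predicted lateral-spread displacement (metres) into the hazard
# classes used by the map legend: 0-0.1 low, 0.1-0.3 moderate,
# 0.3-1.0 high, > 1.0 very high.

def hazard_class(displacement_m):
    if displacement_m <= 0.1:
        return "low"
    if displacement_m <= 0.3:
        return "moderate"
    if displacement_m <= 1.0:
        return "high"
    return "very high"

print([hazard_class(d) for d in (0.05, 0.2, 0.6, 1.5)])
```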

  16. Earthquakes (United States)

    ... earthquake occurs in a populated area, it may cause property damage, injuries, and even deaths. If you live in a coastal area, there is the possibility of a tsunami. Damage from earthquakes can also lead to floods or fires. Although there are no guarantees of ...

  17. Research on the spatial analysis method of seismic hazard for island (United States)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying


    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: microscopically, its results provide parameters for seismic design, and macroscopically it is prerequisite work for the earthquake and comprehensive disaster prevention planning within island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. Existing seismic hazard analysis methods are compared in their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS’s Model Builder platform.
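
The fuzzy comprehensive evaluation step described above can be sketched as follows: each index contributes a membership vector over hazard grades, and a weighted aggregation yields the overall grade. The grades, weights, and the reduced 3-index membership matrix are invented for illustration (the method as described uses 11 indices):

```python
import numpy as np

# Weighted-average fuzzy comprehensive evaluation over hazard grades.
# Weights and memberships below are assumptions, not the paper's values.
grades = ["low", "medium", "high"]
weights = np.array([0.5, 0.3, 0.2])            # index weights, sum to 1
membership = np.array([                        # rows: indices, cols: grades
    [0.1, 0.3, 0.6],    # e.g. proximity to faults (hypothetical)
    [0.2, 0.5, 0.3],    # e.g. historical earthquake density (hypothetical)
    [0.6, 0.3, 0.1],    # e.g. gravity-anomaly gradient (hypothetical)
])

result = weights @ membership                  # aggregate membership vector
print(dict(zip(grades, np.round(result, 2))))
print("assessed grade:", grades[int(result.argmax())])
```

In a GIS workflow each raster cell would carry its own membership matrix, so the aggregation is evaluated per cell to produce the hazard map.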

  18. GNSS-monitoring of Natural Hazards: Ionospheric Detection of Earthquakes and Volcano Eruptions (United States)

    Shults, K.; Astafyeva, E.; Lognonne, P. H.


    During the last few decades earthquakes have been reported by many researchers as sources of strong perturbations in the ionosphere, and in the last few years the seismo-ionosphere coupling has been more and more discussed (e.g., Calais and Minster, 1998, Phys. Earth Planet. Inter., 105, 167-181; Afraimovich et al., 2010, Earth, Planets, Space, V.62, No.11, 899-904; Rolland et al., 2011, Earth Planets Space, 63, 853-857). Co-volcanic ionospheric perturbations have come under the scrutiny of science only in recent years, but observations have already shown that the mass and energy injections of volcanic activity can also excite oscillations in the ionosphere (Heki, 2006, Geophys. Res. Lett., 33, L14303; Dautermann et al., 2009, Geophys. Res., 114, B02202). The ionospheric perturbations are induced by acoustic and gravity waves generated in the neutral atmosphere by the seismic source or volcano eruption. The upward-propagating vibrations of the atmosphere interact with the plasma in the ionosphere through particle collisions and excite variations of electron density detectable with dual-frequency receivers of the Global Navigation Satellite System (GNSS). In addition to co-seismic ionospheric disturbance (CID) observations, ionospheric GNSS measurements have recently proved useful for obtaining ionospheric images of the seismic fault, providing information on its parameters and localization (Astafyeva et al., 2011, Geophys. Res. Letters, 38, L22104). This work describes how GNSS signals can be used for monitoring of natural hazards, using the examples of the 9 March 2011 M7.3 Tohoku foreshock and the April 2015 M7.8 Nepal earthquake as well as the April 2015 Calbuco volcano eruptions. We also show that use of high-resolution GNSS data can aid in plotting ionospheric images of the seismic fault.

  19. Subduction zone and crustal dynamics of western Washington; a tectonic model for earthquake hazards evaluation (United States)

    Stanley, Dal; Villaseñor, Antonio; Benz, Harley


    The Cascadia subduction zone is extremely complex in the western Washington region, involving local deformation of the subducting Juan de Fuca plate and complicated block structures in the crust. It has been postulated that the Cascadia subduction zone could be the source for a large thrust earthquake, possibly as large as M9.0. Large intraplate earthquakes from within the subducting Juan de Fuca plate beneath the Puget Sound region have accounted for most of the energy release in this century and future such large earthquakes are expected. Added to these possible hazards is clear evidence for strong crustal deformation events in the Puget Sound region near faults such as the Seattle fault, which passes through the southern Seattle metropolitan area. In order to understand the nature of these individual earthquake sources and their possible interrelationship, we have conducted an extensive seismotectonic study of the region. We have employed P-wave velocity models developed using local earthquake tomography as a key tool in this research. Other information utilized includes geological, paleoseismic, gravity, magnetic, magnetotelluric, deformation, seismicity, focal mechanism and geodetic data. Neotectonic concepts were tested and augmented through use of anelastic (creep) deformation models based on thin-plate, finite-element techniques developed by Peter Bird, UCLA. These programs model anelastic strain rate, stress, and velocity fields for given rheological parameters, variable crust and lithosphere thicknesses, heat flow, and elevation. Known faults in western Washington and the main Cascadia subduction thrust were incorporated in the modeling process. Significant results from the velocity models include delineation of a previously studied arch in the subducting Juan de Fuca plate. The axis of the arch is oriented in the direction of current subduction and asymmetrically deformed due to the effects of a northern buttress mapped in the velocity models. This

  20. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  1. Site Specific Probabilistic Seismic Hazard and Risk Analysis for Surrounding Communities of The Geysers Geothermal Development Area (United States)

    Miah, M.; Hutchings, L. J.; Savy, J. B.


    We conduct a probabilistic seismic hazard and risk analysis for induced and tectonic earthquakes for a 50 km radius area centered on The Geysers, California, and for the next ten years. We calculate hazard with both a conventional and a physics-based approach, estimate site-specific hazard, convert hazard to risk of nuisance and damage to structures per year, and map the risk. For the conventional PSHA we assume the past ten years is indicative of hazard for the next ten years from Msurprising since they were calculated by completely independent means. The conventional approach used the actual catalog of the past ten years of earthquakes to estimate the hazard for the next ten years, while the physics-based approach used geotechnical modeling to calculate the catalog for the next ten years. Similarly, for the conventional PSHA we utilized attenuation relations from past earthquakes recorded at The Geysers to translate the ground motion from the source to the site, while for the physics-based approach we calculated ground motion from simulation of actual earthquake rupture. Finally, the source of the earthquakes was the actual source for the conventional PSHA, while we assumed random fractures for the physics-based approach. From all this, we consider the calculation of the conventional approach, based on actual data, to validate the physics-based approach.

  2. Challenges to Seismic Hazard Analysis of Critical Infrastructures (United States)

    Klügel, J.


    meaningful design basis, contemporary methods of deterministic seismic hazard analysis will be discussed. It will be demonstrated that modern deterministic methods can be employed successfully both for the design of critical infrastructures and as a starting point for the development of modern probabilistic scenario-based methods. As a challenge for future developments, the question of an alternative presentation of the results of a seismic hazard analysis will be considered. In addition to traditional seismic hazard maps based on spectral or peak ground accelerations, it is suggested to develop similar maps in terms of Arias intensity and of CAV values. This will support the selection of appropriate earthquake time histories by civil engineers responsible for the design of critical infrastructures.

  3. The large earthquake and tsunami of AD 365 in the Hellenic Arc revisited: implications for tsunami hazard assessment (United States)

    Novikova, T.; Ezz, M.; Kijko, A.; Papadopoulos, G. A.


    The large tsunamigenic earthquake that shook the eastern Mediterranean in the second half of the 4th century AD, with a magnitude only roughly estimated as at least 8, has been considered one of the largest earthquakes ever reported in the Mediterranean Sea. A general consensus exists that it occurred on 21 July 365 in the western Hellenic Arc, causing a co-seismic uplift in western Crete of up to c. 9 m. However, the rupture zone is not well constrained so far. From historical and geological documentation it has been argued that the tsunami inundated not only near-field but also remote coastal sites across the entire basin of the Mediterranean. We critically reexamine the available documentary sources and geological information, create an inventory of the most credible coastal sites to have been inundated, and conclude that the tsunami propagation zone was very likely smaller than considered so far. From dislocation modeling scenarios we reproduce several candidate seismic sources and simulate the resulting tsunami numerically. Calculated tsunami wave heights and runups are then compared with those observed at credible coastal sites to determine the most likely rupture zone of the earthquake. The implications of such a determination for tsunami hazard assessment, particularly for the western Hellenic Arc as well as for Alexandria, are of great interest since for Alexandria there is the most reliable description of tsunami impact. Finally, by extending probabilistic hazard assessment from earthquakes to tsunamis, we perform a tsunami hazard assessment for Alexandria for a future 365-type tsunami by employing a combination of probability evaluation of earthquakes in the tectonic segment that generated the 365 event and tsunami numerical simulation.

  4. Multicriteria analysis in hazards assessment in Libya (United States)

    Zeleňáková, Martina; Gargar, Ibrahim; Purcz, Pavol


    Environmental hazards (natural and man-made) have always constituted a problem in many developing and developed countries. Many applications have proved that these problems can be addressed through planning studies and detailed information about the prone areas. Determining the time, location and size of the problem is important for decision makers in planning and management activities. It is important to know the risk represented by those hazards and to take actions to protect against them. Multicriteria analysis methods - the Analytic Hierarchy Process, pairwise comparison, and the ranking method - are used to analyse which hazard facing Libya is the most dangerous. The multicriteria analysis ends with a more or less stable ranking of the given alternatives and hence a recommendation as to which alternative(s) should be preferred. Regarding our problem of environmental risk assessment, the result will be a ranking or categorisation of hazards with regard to their risk level.
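
The Analytic Hierarchy Process step mentioned above can be sketched as follows: a pairwise comparison matrix is reduced to a priority vector via its principal eigenvector, with a consistency check. The three hazards and the Saaty-scale judgments are invented for illustration, not the paper's data:

```python
import numpy as np

# AHP sketch: pairwise comparison matrix -> principal-eigenvector priorities.
# Entries A[i][j] state how much more important hazard i is than hazard j
# on the 1-9 Saaty scale (values here are assumptions).
hazards = ["earthquake", "flood", "drought"]
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))               # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # normalized priority vector

# Consistency ratio (CR < 0.1 is conventionally acceptable); RI = 0.58 for n=3
ci = (eigvals.real[k] - 3) / (3 - 1)
cr = ci / 0.58
print(dict(zip(hazards, np.round(w, 3))), f"CR = {cr:.3f}")
```

The resulting ranking (here earthquake first) is the kind of "stable ranking of the given alternatives" the abstract refers to; the consistency ratio guards against contradictory pairwise judgments.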

  5. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007 (United States)

    Wills, Chris J.; Weldon, Ray J.; Bryant, W.A.


    This report describes the development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for use by scientists interested in producing either seismic hazard or deformation models to better understand the current seismic hazards in California. These parameters include descriptions of the geometry and rates of movement of faults throughout the state. These values are intended to provide a starting point for the development of more sophisticated deformation models which include known rates of movement on faults as well as geodetic measurements of crustal movement and the rates of movement of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps and the time-dependent seismic hazard calculations being developed for the WGCEP. Due to the multiple uses of this information, development of these parameters has been coordinated between USGS, CGS and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. This database has been implemented in Oracle and supports electronic access (e.g., for on-the-fly access). A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database, as well as any locked-down official releases (e.g., used in a published model for calculating earthquake probabilities or seismic shaking hazards), are part of the USGS Quaternary Fault and Fold Database. CGS has been primarily responsible for updating and editing the fault parameters, with extensive input from USGS and SCEC scientists.

  6. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective (United States)

    Wyss, Max


    An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven participants in an emergency meeting held on March 30 to assess the probability that a major event would follow the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of job, loss of retirement pension, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take its usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists who were not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone who has gross misunderstandings about seismology. They must be carefully prepared by experts. The more significant lesson is that the approach to calm the population and the standard probabilistic hazard and risk assessment, as practiced by GSHAP, are misleading. The latter has been criticized as

  7. A Summary of Instrumental Data on the Recent Strong Vrancea Earthquakes, and Implications for Seismic Hazard (United States)

    Sandi, Horea; Borcia, Ioan Sorin


    The paper is intended to summarize the most important instrumental data of direct relevance for engineering activities, obtained in connection with the strong Vrancea earthquakes of 4 March 1977, 30 August 1986, 30 May 1990, and 31 May 1990, and to point out some significant consequences and conclusions derived on this basis. Two main objectives of this analysis may be emphasized: (a) in-depth analysis of the radiation pattern; and (b) analysis of the spectral contents of ground motion in connection with the features of local conditions, and with the intention of assessing the relative importance of two main factors: source mechanism and long-distance wave propagation, versus features of local geological conditions. Some specific methodological developments used in this context may be mentioned: (a) use of a new approach to the quantification of ground motion intensity on the basis of instrumental (accelerographic) information; (b) analysis of radiation pattern in spectral and directivity terms; (c) parametric analysis of site-specific transfer functions for the local sequences of geological layers; and (d) a critical view on the outcome of post-earthquake survey techniques, keeping in view the implications of the spectral features of ground motion. The main results obtained are related to: (a) ground motion radiation features that have to be taken into account in connection with the data on the source mechanisms of the successive events dealt with; (b) expected spectral features of future strong ground motion at different sites; (c) methodological developments proposed for the assessment of local transfer functions; and (d) implications for microzonation activities.

  8. Spatial Analysis of Earthquake Fatalities in the Middle East, 1970-2008: First Results (United States)

    Khaleghy Rad, M.; Evans, S. G.; Brenning, A.


    earthquake at the epicenter. On the other hand, it is in inverse (negative) relation to elapsed time since 1970, focal depth and GDP of the country affected. These spatial and temporal patterns of life loss are consistent with the patterns expected within our conceptual framework in relationship with hazard, exposed population and proxies of vulnerability. Our findings suggest that for earthquakes with comparable physical characteristics, the number of fatalities has been falling since 1970 in the Middle East region. We interpret this as an overall reduction of vulnerability of the Middle East during 1970-2008. Ongoing research is focusing on more detailed analysis of particular indicators of vulnerability reduction such as the development of earthquake building codes and preparedness, and on the spatial disaggregation of exposed population and the attenuation of earthquake magnitude.

  9. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Kubicek


    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  10. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan


    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  11. Safety analysis of nuclear containment vessels subjected to strong earthquakes and subsequent tsunamis

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Feng; Li, Hong Zhi [Dept. Structural Engineering, Tongji University, Shanghai (China)


    Nuclear power plants under expansion and under construction in China are mostly located in coastal areas, which means they are at risk of suffering strong earthquakes and subsequent tsunamis. This paper presents a safety analysis for a new reinforced concrete containment vessel in such events. A finite element method-based model was built, verified, and first used to understand the seismic performance of the containment vessel under earthquakes with increased intensities. Then, the model was used to assess the safety performance of the containment vessel subject to an earthquake with peak ground acceleration (PGA) of 0.56g and subsequent tsunamis with increased inundation depths, similar to the 2011 Great East Japan earthquake and tsunami. Results indicated that the containment vessel reached Limit State I (concrete cracking) and Limit State II (concrete crushing) when the PGAs were in a range of 0.8–1.1g and 1.2–1.7g, respectively. The containment vessel reached Limit State I with a tsunami inundation depth of 10 m after suffering an earthquake with a PGA of 0.56g. A site-specific hazard assessment was conducted to consider the likelihood of tsunami sources.

  12. Vulnerability assessment of archaeological sites to earthquake hazard: An indicator based method integrating spatial and temporal aspects

    Directory of Open Access Journals (Sweden)

    Despina Minos-Minopoulos


    Across the world, numerous sites of cultural heritage value are at risk from a variety of human-induced and natural hazards such as war and earthquakes. Here we present and test a novel indicator-based method for assessing the vulnerability of archaeological sites to earthquakes. Vulnerability is approached as a dynamic element assessed through a combination of spatial and temporal parameters. The spatial parameters examine the susceptibility of the sites to the secondary Earthquake Environmental Effects of ground liquefaction, landslides and tsunami and are expressed through the Spatial Susceptibility Index (SSi). Parameters of physical vulnerability, economic importance and visitor density examine the temporal vulnerability of the sites, expressed through the Temporal Vulnerability Index (TVi). The equally weighted sum of the spatial and temporal indexes represents the total Archaeological Site Vulnerability Index (A.S.V.I.). The A.S.V.I. method is applied at 16 archaeological sites across Greece, allowing an assessment of their vulnerability. This then allows the establishment of a regional and national priority list for considering future risk mitigation. Results indicate that (i) the majority of the sites have low to moderate vulnerability to earthquake hazard, (ii) Neratzia Fortress on Kos and Heraion on Samos are characterised as highly vulnerable and should be prioritised for further studies and mitigation measures, and (iii) the majority of the sites are susceptible to at least one Earthquake Environmental Effect and present relatively high physical vulnerability attributed to the existing limited conservation works. This approach highlights the necessity for an effective vulnerability assessment methodology within the existing framework of disaster risk management for cultural heritage.

  13. Pedestrian Evacuation Analysis for Tsunami Hazards (United States)

    Jones, J. M.; Ng, P.; Wood, N. J.


    Recent catastrophic tsunamis in the last decade, as well as the 50th anniversary of the 1964 Alaskan event, have heightened awareness of the threats these natural hazards present to large and increasing coastal populations. For communities located close to the earthquake epicenter that generated the tsunami, strong shaking may also cause significant infrastructure damage, impacting the road network and hampering evacuation. There may also be insufficient time between the earthquake and first wave arrival to rely on a coordinated evacuation, leaving at-risk populations to self-evacuate on foot and across the landscape. Emergency managers evaluating these coastal risks need tools to assess the evacuation potential of low-lying areas in order to discuss mitigation options, which may include vertical evacuation structures to provide local safe havens in vulnerable communities. The U.S. Geological Survey has developed the Pedestrian Evacuation Analyst software tool for use by researchers and emergency managers to assist in the assessment of a community's evacuation potential by modeling travel times across the landscape and producing both maps of travel times and charts of population counts with corresponding times. The tool uses an anisotropic (directionally dependent) least cost distance model to estimate evacuation potential and allows for the variation of travel speed to measure its effect on travel time. The effectiveness of vertical evacuation structures on evacuation time can also be evaluated and compared with metrics such as travel time maps showing each structure in place and graphs displaying the percentage change in population exposure for each structure against the baseline. Using the tool, travel time maps and at-risk population counts have been generated for some coastal communities of the U.S. Pacific Northwest and Alaska. The tool can also be used to provide valuable decision support for tsunami vertical evacuation siting.
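The least-cost travel-time computation at the heart of such tools can be sketched as a Dijkstra search over a grid of walking speeds. The sketch below is isotropic (the Pedestrian Evacuation Analyst's model is anisotropic, accounting for slope direction), and all names and parameter values are illustrative, not taken from the tool:

```python
import heapq

def travel_times(speed, safe_cells, cell_size=10.0):
    """Dijkstra travel time (seconds) from every grid cell to the nearest
    safe cell. speed: 2D list of walking speeds (m/s); safe_cells: (row, col)
    locations of safe havens, e.g. high ground or vertical evacuation structures."""
    rows, cols = len(speed), len(speed[0])
    INF = float("inf")
    time = [[INF] * cols for _ in range(rows)]
    pq = []
    for r, c in safe_cells:              # safe cells start with zero travel time
        time[r][c] = 0.0
        heapq.heappush(pq, (0.0, r, c))
    while pq:
        t, r, c = heapq.heappop(pq)
        if t > time[r][c]:               # stale queue entry
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # edge cost: mean pace of the two cells times the cell size
                step = 0.5 * (1.0 / speed[r][c] + 1.0 / speed[nr][nc]) * cell_size
                if t + step < time[nr][nc]:
                    time[nr][nc] = t + step
                    heapq.heappush(pq, (t + step, nr, nc))
    return time
```

On a uniform 1 m/s grid with 10 m cells, a cell three steps from the safe cell gets a 30 s travel time; lowering the speed map over sand or wetland lengthens times accordingly, which is how travel-speed variation feeds the sensitivity analysis described above.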

  14. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection (United States)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.


    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA required at the design and permitting stage before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that in addition to potentially damaging ground motions, a significant risk associated with induced seismicity in general is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which in general occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high

  15. A physics-based earthquake simulator and its application to seismic hazard assessment in Calabria (Southern Italy) region (United States)

    Console, Rodolfo; Nardi, Anna; Carluccio, Roberto


    The characteristic earthquake hypothesis is not strongly supported by observational data because of the relatively short duration of historical and even paleoseismological records. For instance, for the Calabria (Southern Italy) region, historical information on strong earthquakes exists for at least two thousand years, but it can be considered complete for M > 6.0 only for the latest few centuries. As a consequence, characteristic earthquakes are seldom reported for individual fault segments, and hazard assessment is not reliably estimated by means of only the minor seismicity reported in the historical catalogs. Even if they cannot substitute the information contained in a good historical catalog, physics-based earthquake simulators have become popular in the recent literature, and their application has been justified for a number of reasons. In particular, earthquake simulators can provide interesting information on which renewal models can better describe the recurrence statistics, and how this is affected by features such as local fault geometry and kinematics. The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 100,000 events of magnitudes ≥ 4.5. The algorithm on which this simulator is based is constrained by several physical elements, such as an average slip rate due to tectonic loading for every single segment in the investigated fault system, the process of rupture growth and termination, and interaction between earthquake sources, including small magnitude events. Events nucleated in one segment are allowed to expand into neighboring segments, if they are separated by a given maximum range of distance. The application of our simulation algorithm to the Calabria region provides typical features in time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term periodicity of strong earthquakes, short

  16. Multifractal analysis of earthquakes in Kumaun Himalaya and its surrounding region (United States)

    Roy, P. N. S.; Mondal, S. K.


    Himalayan seismicity is related to the continuing northward convergence of the Indian plate against the Eurasian plate. Earthquakes in this region are mainly caused by the release of elastic strain energy. The Himalayan region can be attributed to a highly complex geodynamic process and is therefore well suited for multifractal seismicity analysis. Fractal analysis of earthquakes (mb ≥ 3.5) that occurred during 1973-2008 led to the detection of a clustering pattern in a narrow time span. This clustering was identified in three windows of 50 events each having low spatial correlation fractal dimension (D_C) values of 0.836, 0.946 and 0.285, mainly during the span of 1998 to 2005. This clustering may be considered an indication of a highly stressed region. The Gutenberg-Richter b-value was determined for the same subsets considered for the D_C estimation. Based on the fractal clustering pattern of events, we conclude that the clustered events are indicative of a highly stressed weak zone from which rupture propagation may eventually nucleate as a strong earthquake. Multifractal analysis gave some understanding of the heterogeneity of the fractal structure of the seismicity and the existence of a complex interconnected structure of the Himalayan thrust systems. The present analysis indicates an impending strong earthquake, which might help in better hazard mitigation for the Kumaun Himalaya and its surrounding region.
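The spatial correlation fractal dimension D_C used in studies like this one is commonly estimated with the Grassberger-Procaccia correlation integral: D_C is the slope of log C(r) versus log r, where C(r) is the fraction of event pairs closer than r. A minimal sketch (function names are ours, not from the paper):

```python
import numpy as np

def correlation_dimension(points, r_min, r_max, n_r=20):
    """Estimate the correlation fractal dimension D_C (Grassberger-Procaccia)
    from event coordinates, as the log-log slope of the correlation integral."""
    pts = np.asarray(points, dtype=float)
    if pts.ndim == 1:
        pts = pts[:, None]
    # all pairwise distances (upper triangle, each pair once)
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(len(pts), k=1)
    d = dist[iu]
    # correlation integral C(r): fraction of pairs with distance < r
    r = np.logspace(np.log10(r_min), np.log10(r_max), n_r)
    C = np.array([(d < ri).mean() for ri in r])
    slope, _ = np.polyfit(np.log(r), np.log(C), 1)
    return slope
```

Epicenters spread along a lineament give D_C near 1, while a tight cluster gives a much smaller value, which is how the low-D_C windows in the abstract are read as clustering.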

  17. Repository Subsurface Preliminary Fire Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan


    This fire hazard analysis identifies preliminary design and operations features, fire, and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c) which was prepared in accordance with AP-2.21Q, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities''; Attachment 4 of AP-ESH-008, ''Hazards Analysis System''; and AP-3.11Q, ''Technical Reports''. The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  18. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah (United States)

    Gori, Paula L.


    INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation, likelihood of occurrence, location, and severity of potential hazards, and the three elements needed for effective transfer, delivery, assistance, and encouragement, are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and

  19. Earthquakes and faults at Mt. Etna (Italy): time-dependent approach to the seismic hazard of the eastern flank (United States)

    Peruzza, L.; Azzaro, R.; D'Amico, S.; Tuve', T.


    A time-dependent approach to seismic hazard assessment, based on a renewal model using the Brownian Passage Time (BPT) distribution, has been applied to the best-known seismogenic faults at Mt. Etna volcano. These structures have been characterised by frequent coseismic surface displacement and a long list of historically well-documented earthquakes that occurred in the last 200 years (CMTE catalogue, Azzaro et al., 2000, 2002, 2006). Seismic hazard estimates, given in terms of earthquake rupture forecast, are conditioned on the time elapsed since the last event: impending events are expected on the S. Tecla Fault and, secondly, on the Moscatello Fault, both involved in the highly active geodynamic processes affecting the eastern flank of Mt. Etna. The mean recurrence time of major events is calibrated by merging the inter-event times observed at each fault; aperiodicity is tuned on b-values, following the approach proposed by Zoeller et al. (2008). Finally, we compare these mean recurrence times with the values obtained by using only geometrical and kinematic information, as defined in Peruzza et al. (2008) for faults in Italy. The time-dependent hazard assessment is compared with the stationary assumption of seismicity and validated in a retrospective forward model. Forecasted rates over a 5-year window (1 April 2009 to 1 April 2014), in magnitude bins compatible with macroseismic data, are available for testing in the frame of the CSEP (Collaboratory for the Study of Earthquake Predictability) project. Azzaro R., Barbano M.S., Antichi B., Rigano R.; 2000: Macroseismic catalogue of Mt. Etna earthquakes from 1832 to 1998. Acta Volcanol., 12 (1), 3-36 (with CD-ROM). Azzaro R., D'Amico S., Mostaccio A., Scarfì L.; 2002: Earthquakes with macroseismic effects in eastern Sicily - southern Calabria in the period January 1999 - December 2001. Quad. di Geof., 27, 1-59. Azzaro R., D'Amico S., Mostaccio A
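The BPT renewal calculation underlying such estimates can be illustrated directly: the conditional probability of rupture in the next ΔT years, given the time elapsed since the last event, follows from integrating the BPT density. A sketch with illustrative parameter values (mean recurrence and aperiodicity below are examples, not the paper's calibrated values):

```python
import numpy as np

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time density with mean recurrence time mu
    and aperiodicity alpha (equivalent to an inverse Gaussian)."""
    return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
           np.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t))

def conditional_probability(mu, alpha, t_elapsed, dt, t_max_factor=100, n=200000):
    """P(event in (t_elapsed, t_elapsed + dt] | no event up to t_elapsed),
    by trapezoidal integration of the BPT density."""
    t = np.linspace(1e-9, t_max_factor * mu, n)
    f = bpt_pdf(t, mu, alpha)

    def integral(a, b):
        m = (t >= a) & (t <= b)
        ts, fs = t[m], f[m]
        return float(np.sum(0.5 * (fs[1:] + fs[:-1]) * np.diff(ts)))

    return integral(t_elapsed, t_elapsed + dt) / integral(t_elapsed, t[-1])
```

The hallmark of renewal hazard, unlike the stationary Poisson assumption, is that the conditional probability grows with elapsed time, which is why faults whose last event is long past are flagged as having impending events.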

  20. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation (United States)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.


    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and the estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short
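The frequency-magnitude side of such a synthetic catalog can be illustrated with Gutenberg-Richter sampling and Aki's maximum-likelihood b-value estimate. This is a generic sketch of the statistics involved, not the RSQSIM workflow:

```python
import numpy as np

def sample_gr_magnitudes(n, b, m_min, rng):
    """Draw n magnitudes from a Gutenberg-Richter distribution truncated
    below at m_min, via inverse-CDF sampling: F(m) = 1 - 10**(-b (m - m_min))."""
    u = rng.random(n)
    return m_min - np.log10(1.0 - u) / b

def b_value_mle(mags, m_min):
    """Aki's maximum-likelihood b-value estimate for a catalog complete above m_min."""
    return np.log10(np.e) / (np.mean(mags) - m_min)
```

For example, one can synthesize a catalog down to the M2 floor mentioned above and check that the b-value recovered from the synthetic events matches the value used to generate them, a basic consistency test before feeding frequency-magnitude curves into PSHA.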

  1. Gambling score in earthquake prediction analysis (United States)

    Molchan, G.; Romashkova, L.


    The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has recently been suggested to use a new characteristic to evaluate the forecaster's skill, the gambling score (GS), which incorporates the difficulty of guessing each target event by using different weights for different alarms. We expand the parametrization of the GS and use the M8 prediction algorithm to illustrate difficulties of the new approach in the analysis of prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤ M < 8.5 events because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate of 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance taking into account the uncertainty of the spatial rate measure.
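One common formulation of the gambling score has the forecaster stake one reputation point on each alarm against a reference model that assigns probability p0 to the target event; a success pays odds inverse to p0, so rare events earn large credit. The sketch below uses that simple formulation and omits the paper's expanded parametrization and alarm weights:

```python
def gambling_score(hits, p0):
    """Total reputation gain for a set of alarms. For alarm i the forecaster
    stakes one point against a reference model assigning probability p0[i]
    to the target event: a success pays (1 - p0[i]) / p0[i], a failure
    loses the one-point stake. Hard-to-guess events (small p0) earn more."""
    return sum((1.0 - p) / p if hit else -1.0
               for hit, p in zip(hits, p0))
```

A single success on an event the reference model rated at 10 per cent earns 9 points, so one such hit outweighs several failed alarms; this is exactly the weighting sensitivity the abstract identifies as a difficulty.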

  2. The long-term geologic hazards in areas struck by large-magnitude earthquakes

    NARCIS (Netherlands)

    Wasowski, Janusz; Jibson, Randell W.; Huang, Runqiu; van Asch, Theo


    Large-magnitude earthquakes occur every year, but most hit remote and uninhabited regions and thus go unnoticed. Although populated areas are affected infrequently by large earthquakes, each time the outcomes are devastating in terms of life and property loss. The human and economic costs of natural

  3. Meeting of the Central and Eastern U.S. (CEUS) Earthquake Hazards Program October 28–29, 2009 (United States)

    Tuttle, Martitia; Boyd, Oliver; McCallister, Natasha


    On October 28th and 29th, 2009, the U.S. Geological Survey Earthquake Hazards Program held a meeting of Central and Eastern United States investigators and interested parties in Memphis, Tennessee. The purpose of the meeting was to bring together the Central and Eastern United States earthquake-hazards community to present and discuss recent research results, to promote communication and collaboration, to garner input regarding future research priorities, to inform the community about research opportunities afforded by the 2010–2012 arrival of EarthScope/USArray in the central United States, and to discuss plans for the upcoming bicentennial of the 1811–1812 New Madrid earthquakes. The two-day meeting included several keynote speakers, oral and poster presentations by attendees, and breakout sessions. The meeting is summarized in this report and can be subdivided into four primary sections: (1) summaries of breakout discussion groups; (2) list of meeting participants; (3) submitted abstracts; and (4) slide presentations. The abstracts and slides are included “as submitted” by the meeting participants and have not been subject to any formal peer review process; information contained in these sections reflects the opinions of the presenter at the time of the meeting and does not constitute endorsement by the U.S. Geological Survey.

  4. Citizens at risk from earthquake hazard in Dhaka city: Scaling risk factors from household to city region level

    Directory of Open Access Journals (Sweden)

    Ferdous Israt


    Dhaka city is under the looming threat of a cataclysmic earthquake. However, the factors that put citizens at risk may not be the same in all parts of the city. Dividing the city into three geographical areas: Old (Shankhari Bazaar), Developed (Segunbaghicha) and Newly Developing (Uttara 3rd Phase), this research explores the risk factors of earthquake hazard from household to city-region level. Based on FGDs at community level, in-depth interviews of experts and policymakers, observation and secondary sources of data, the study finds that citizens of Old Dhaka are at high risk because of the obsolete and dilapidated building structures they live in, whereas unauthorized high-rise buildings are a massive threat for the dwellers of developed Dhaka. The results of this research highlight the fact that extensive filling of low-lying lands increases earthquake risk and may cause severe liquefaction effects for the residents of newly developing areas of Dhaka. The comprehensive outcomes of this study emphasize strengthening the on-going public awareness programs, following the building codes strictly and implementing the disaster risk reduction approach in land use planning, which can possibly reduce earthquake risk in Dhaka city.

  5. 9 CFR 417.2 - Hazard Analysis and HACCP Plan. (United States)


    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard...) Physical hazards. (b) The HACCP plan. (1) Every establishment shall develop and implement a written HACCP...

  6. Detailed Analysis of a Multiplet Earthquake Sequence (United States)

    Iglesias, A.; Singh, S. K.; Garduño, V. H.


    The Mexican National Seismological Service reported a sequence of four small earthquakes (2.5 < M < 3.0) in Morelia, a city of 1,000,000 inhabitants and the capital of Michoacán State. A careful revision of the records from a three-component broad-band station located ~10 km from the earthquakes revealed a sequence of 7 earthquakes in a period of about 36 hours. The waveforms are remarkably similar to one another, so the events may be considered a "multiplet". In this work, we use the records from the broad-band station and a methodology based on coda wave interferometry to obtain the relative distance between pairs of events. The 21 inter-event distances obtained constitute an over-determined system for the relative positions of the events. A non-linear damped scheme is used to solve this over-determined system and obtain the spatial distribution of the 7 earthquakes. Results show that (1) distances between events are < 200 m, and (2) the sequence has an approximately linear distribution.
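The paper solves the over-determined distance system with a damped non-linear scheme; as a simpler, deterministic stand-in for the same step, classical multidimensional scaling recovers relative positions (up to rotation, translation and reflection) from a complete inter-event distance matrix:

```python
import numpy as np

def relative_positions(D, ndim=2):
    """Recover relative event coordinates from a complete inter-event
    distance matrix via classical multidimensional scaling (MDS).
    The solution is unique only up to rotation, translation and reflection."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:ndim]         # keep the ndim largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

With exact Euclidean distances (here, 21 pairwise values for 7 events) the recovered configuration reproduces the input distances; with noisy coda-wave estimates, an iterative damped inversion like the paper's is the more robust choice.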

  7. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)


    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  8. The hazard map of ML6.6 0206 Meinong earthquake near Guanmiao and its Neotectonic implication (United States)

    Chung, L. H.; Shyu, J. B. H.; Huang, M. H.; Yang, K. M.; Le Beon, M.; Lee, Y. H.; Chuang, R.; Yi, D.


    Serious damage occurred in SW Taiwan during the ML 6.6 0206 Meinong earthquake. Based on InSAR results, a 10 cm oval-shaped surface uplift is located 15 km away from the epicenter, together with two obvious N-S trending sharp phase changes near the Guanmiao area. Our field investigation shows that building damage and surface fractures are highly correlated with these two sharp phase changes. Here, we determine the detailed shallow subsurface geometry using seismic reflection data, geologic data, and field hazard investigation. The N-S trending surface deformation may have been induced by local shallow folding, while the large uplift west of Guanmiao may be related to pure-shear deformation of the thick clayey Gutingkeng (GTK) Formation. Our results imply not only that a moderate lower-crustal earthquake can trigger active structures at shallower depth, but also that such minor shallow active structures can cause serious damage and surface deformation.

  9. An integrated approach to earthquake-induced landslide hazard zoning based on probabilistic seismic scenario for Phlegrean Islands (Ischia, Procida and Vivara), Italy (United States)

    Caccavale, Mauro; Matano, Fabio; Sacchi, Marco


    In this study we present an integrated approach to assess earthquake-induced landslide hazard at the source area of the slope instability process. The method has been applied to the case study of Ischia, Procida and Vivara islands that represent an integral part of the Campi Flegrei, a densely populated, active volcanic area, located at the NW margin of the Naples Bay, Italy. The proposed method follows a stepwise procedure including: 1) Probabilistic Seismic Hazard Analysis (PSHA); 2) assessment of site and topographic effects; 3) input of the PSHA outputs into a classic sliding rigid-block analysis for slope instability (Newmark's approach); 4) construction of landslide frequency - magnitude curves for the estimate of the slope failure probability as a function of defined Newmark's threshold values under different probabilistic seismic scenarios; 5) construction of earthquake-induced landslide hazard maps at the source area, based on the integration of the probabilistic approach and the geological, morphological and geotechnical database available for the study area. The Probabilistic Seismic Hazard Analysis (PSHA) is aimed at the definition of the seismic input with different annual exceedance frequency. PSHA results, expressed in terms of Peak Ground Acceleration (PGA) at the bedrock, are calculated for 14 return periods (T) ranging from 10 to 2000 yr. PGA values have been corrected for the site effect associated with geological and morphologic conditions for each selected return period. Secondly, the corrected PGA values have been used as an input for the classic sliding rigid-block Newmark's approach, implemented in a Geographic Information System (GIS) to assess the relative potential for slope failure (landslide susceptibility) both in static (Factor of Safety, FS) and dynamic (Critical acceleration, ac) conditions. The combination of T-dependent, site-corrected PGA with the critical acceleration allowed for the calculation of the expected Newmark

  10. Earthquake Hazard and Segmented Fault Evolution, Hat Creek Fault, Northern California (United States)

    Blakeslee, M. W.; Kattenhorn, S. A.


    Precise insight into surface rupture and the evolution and mechanical interaction of segmented normal fault systems is critical for assessing the potential seismic hazard. The Hat Creek fault is a ~35 km long, NNW trending segmented normal fault system located on the western boundary of the Modoc Plateau and within the extending backarc basin of the Cascadia subduction zone in northern California. The Hat Creek fault has a prominent surface rupture showing evidence of multiple events in the past 15 ka, although there have been no historic earthquakes. In response to interactions with volcanic activity, the fault system has progressively migrated several km westward, causing older scarps to become seemingly inactive, and producing three distinct, semi-parallel scarps with different ages. The oldest scarp, designated the “Rim”, is the farthest west and has up to 352 m of throw. The relatively younger “Pali” scarp has up to 174 m of throw. The young “Active” scarp has a maximum throw of 65 m in the 24±6 ka Hat Creek basalt, with 20 m of throw in ~15 ka glacial gravels (i.e., a Holocene slip rate of ~1.3 mm/yr). Changes in the geometry and kinematics of the separate scarps during the faulting history imply the orientation of the stress field has rotated clockwise, now inducing oblique right-lateral motion. Previous studies suggested that the Active scarp consists of 7 left-stepping segments with a cumulative length of 23.5 km. We advocate that the Active scarp is actually composed of 8 or 9 segments and extends 4 km longer than previous estimates. This addition to the active portion of the fault is based on detailed mapping of a young surface rupture in the northern portion of the fault system. This ~30 m high young scarp offsets lavas that erupted from Cinder Butte, a low shield volcano, but has a similar geometry and properties as the Active scarp in the Hat Creek basalt. At its northern end, the Active scarp terminates at Cinder Butte. Our mapping

  11. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis. (United States)

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W


    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) Epidemics of infectious diseases, 2) Drought/famine, and 3) Conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) Conflicts, 2) Epidemics, 3) Drought/famine and 4) Environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.

  12. Java Programs for Using Newmark's Method and Simplified Decoupled Analysis to Model Slope Performance During Earthquakes (United States)

    Jibson, Randall W.; Jibson, Matthew W.


    Landslides typically cause a large proportion of earthquake damage, and the ability to predict slope performance during earthquakes is important for many types of seismic-hazard analysis and for the design of engineered slopes. Newmark's method for modeling a landslide as a rigid-plastic block sliding on an inclined plane provides a useful method for predicting approximate landslide displacements. Newmark's method estimates the displacement of a potential landslide block as it is subjected to earthquake shaking from a specific strong-motion record (earthquake acceleration-time history). A modification of Newmark's method, decoupled analysis, allows modeling landslides that are not assumed to be rigid blocks. This open-file report is available on CD-ROM and contains Java programs intended to facilitate performing both rigorous and simplified Newmark sliding-block analysis and a simplified model of decoupled analysis. For rigorous analysis, 2160 strong-motion records from 29 earthquakes are included along with a search interface for selecting records based on a wide variety of record properties. Utilities are available that allow users to add their own records to the program and use them for conducting Newmark analyses. Also included is a document containing detailed information about how to use Newmark's method to model dynamic slope performance. This program will run on any platform that supports the Java Runtime Environment (JRE) version 1.3, including Windows, Mac OSX, Linux, Solaris, etc. A minimum of 64 MB of available RAM is needed, and the fully installed program requires 400 MB of disk space.
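
The core of Newmark's rigid sliding-block method described above is a conditional double integration: the block gains velocity relative to the ground only while ground acceleration exceeds the critical acceleration, then decelerates back to rest, and the permanent displacement is the integral of that relative velocity. A minimal sketch follows; this is illustrative only, not the USGS Java programs themselves, and the function name and simple forward-Euler scheme are assumptions.

```python
# Minimal sketch of Newmark's rigid sliding-block calculation (illustrative
# only, not the USGS Java programs): the block gains velocity relative to
# the ground whenever ground acceleration exceeds the critical acceleration
# ac, then decelerates back to rest; displacement integrates the velocity.

def newmark_displacement(acc, dt, ac):
    """acc: ground acceleration history (m/s^2), dt: time step (s),
    ac: critical acceleration (m/s^2). Returns displacement in metres."""
    v = 0.0  # block velocity relative to the ground
    d = 0.0  # cumulative downslope displacement
    for a in acc:
        if v > 0.0:
            v = max(v + (a - ac) * dt, 0.0)  # sliding: may decelerate to rest
        elif a > ac:
            v = (a - ac) * dt                # sliding initiates this step
        d += v * dt
    return d
```

For example, a 1 s pulse of 2 m/s² against a critical acceleration of 1 m/s² yields about 1 m of permanent displacement: 0.5 m accumulated during the pulse and 0.5 m while the block coasts to rest.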

  13. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases (United States)

    Dunbar, P. K.; McCullough, H. L.


    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005: 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008: 87,652 deaths), and Haiti (magnitude 7.0, 2010: 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has observed the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami that caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, making it the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption that resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  14. The Effects of the Passage of Time from the 2011 Tohoku Earthquake on the Public’s Anxiety about a Variety of Hazards

    Directory of Open Access Journals (Sweden)

    Kazuya Nakayachi


    Full Text Available This research investigated whether the Japanese people’s anxiety about a variety of hazards, including earthquakes and nuclear accidents, has changed over time since the Tohoku Earthquake in 2011. Data from three nationwide surveys conducted in 2008, 2012, and 2015 were compared to examine the change in societal levels of anxiety toward 51 types of hazards. The same two-phase stratified random sampling method was used to create the list of participants in each survey. The results showed that anxiety about earthquakes and nuclear accidents had increased for a time after the Tohoku Earthquake, and then decreased over a four-year time frame with no severe earthquakes or nuclear accidents. It was also revealed that the anxiety level for some hazards other than earthquakes and nuclear accidents had decreased ten months after the Earthquake, and then remained unchanged over the following four years. Therefore, ironically, a major disaster might decrease public anxiety in general, at least for several years.

  15. The 1856 Djijelli (Algeria earthquake and tsunami: source parameters and implications for tsunami hazard in the Balearic Islands

    Directory of Open Access Journals (Sweden)

    J. Roger


    Full Text Available In 1856, one (or two) destructive earthquake(s) occurred off Djijelli (Algeria) and probably triggered a tsunami in the western Mediterranean Sea. Following recently published results of marine campaigns along the North-Algerian margin, a new source hypothesis for the earthquake has been proposed, and is constituted with a set of three "en échelon" fault segments positioned in agreement with previous studies of this earthquake and with macroseismic data available. The geometrical parameters for this source, in agreement with a Mw = 7.2 earthquake, display an average 40° NW dip, a 80° strike and mean dimensions of 80 km (length) × 20 km (width). A coseismic slip of 1.5 m is consistent with an average convergence rate of about 5–6 mm/yr and a recurrence period of 300–400 years. They are then introduced in the tsunami modelling code to study the propagation across the Mediterranean Sea with a special attention towards the Balearic Islands. A focus on the two major towns, Palma (Majorca) and Mahon (Minorca) Harbours, shows that these places are not the most exposed (maximum water heights less than 1 m) by tsunami waves coming from this part of the African margin. Specific amplifications revealed by modelling occur off the southern coast of Minorca and the southeastern coast of Majorca, mostly related to submarine bathymetric features, and are able to produce coastal wave heights larger than 1 to 2 m as offshore Alcalfar (Minorca). A deep submarine canyon south of Minorca leads to the amplification of waves up to two times on both sides of the canyon. However these modellings could not be compared to any historical observations, non-existent for these sites. This work is a contribution to the study of tsunami hazard in western Mediterranean based on modelling, and offers a first assessment of the tsunami exposure in the Balearic Islands.

  16. The 1856 Djijelli (Algeria) earthquake and tsunami: source parameters and implications for tsunami hazard in the Balearic Islands (United States)

    Roger, J.; Hébert, H.


    In 1856, one (or two) destructive earthquake(s) occurred off Djijelli (Algeria) and probably triggered a tsunami in the western Mediterranean Sea. Following recently published results of marine campaigns along the North-Algerian margin, a new source hypothesis for the earthquake has been proposed, and is constituted with a set of three "en échelon" fault segments positioned in agreement with previous studies of this earthquake and with macroseismic data available. The geometrical parameters for this source, in agreement with a Mw = 7.2 earthquake, display an average 40° NW dip, a 80° strike and mean dimensions of 80 km (length) × 20 km (width). A coseismic slip of 1.5 m is consistent with an average convergence rate of about 5–6 mm/yr and a recurrence period of 300–400 years. They are then introduced in the tsunami modelling code to study the propagation across the Mediterranean Sea with a special attention towards the Balearic Islands. A focus on the two major towns, Palma (Majorca) and Mahon (Minorca) Harbours, shows that these places are not the most exposed (maximum water heights less than 1 m) by tsunami waves coming from this part of the African margin. Specific amplifications revealed by modelling occur off the southern coast of Minorca and the southeastern coast of Majorca, mostly related to submarine bathymetric features, and are able to produce coastal wave heights larger than 1 to 2 m as offshore Alcalfar (Minorca). A deep submarine canyon south of Minorca leads to the amplification of waves up to two times on both sides of the canyon. However these modellings could not be compared to any historical observations, non-existent for these sites. This work is a contribution to the study of tsunami hazard in western Mediterranean based on modelling, and offers a first assessment of the tsunami exposure in the Balearic Islands.

  17. Global volcanic earthquake swarm database and preliminary analysis of volcanic earthquake swarm duration

    Directory of Open Access Journals (Sweden)

    S. R. McNutt


    Full Text Available Global data from 1979 to 1989 pertaining to volcanic earthquake swarms have been compiled into a custom-designed relational database. The database is composed of three sections: 1) a section containing general information on volcanoes, 2) a section containing earthquake swarm data (such as dates of swarm occurrence and durations), and 3) a section containing eruption information. The most abundant and reliable parameter, duration of volcanic earthquake swarms, was chosen for preliminary analysis. The distribution of all swarm durations was found to have a geometric mean of 5.5 days. Precursory swarms were then separated from those not associated with eruptions. The geometric mean precursory swarm duration was 8 days whereas the geometric mean duration of swarms not associated with eruptive activity was 3.5 days. Two groups of precursory swarms are apparent when duration is compared with the eruption repose time. Swarms with durations shorter than 4 months showed no clear relationship with the eruption repose time. However, the second group, lasting longer than 4 months, showed a significant positive correlation with the log10 of the eruption repose period. The two groups suggest that different suites of physical processes are involved in the generation of volcanic earthquake swarms.
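
The geometric mean used as the summary statistic above is exp of the mean of the log durations, which suits strongly skewed duration data better than the arithmetic mean. A one-line sketch; the sample durations are invented, not values from the database:

```python
import math

# Sketch of the summary statistic used above: the geometric mean,
# exp(mean(ln x)), appropriate for skewed, log-normal-like duration data.
# The example durations below are invented, not taken from the database.

def geometric_mean(durations_days):
    return math.exp(sum(math.log(d) for d in durations_days) / len(durations_days))

swarm_durations = [2.0, 4.0, 5.5, 11.0, 30.0]  # days (hypothetical sample)
```

Note how a single long swarm pulls the arithmetic mean up far more than it pulls the geometric mean.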

  18. Workflow Management of the SCEC Computational Platforms for Physics-Based Seismic Hazard Analysis (United States)

    Jordan, T. H.; Callaghan, S.; Maechling, P. J.; Juve, G.; Deelman, E.; Rynge, M.; Vahi, K.; Silva, F.


    Earthquake simulation has the potential to substantially improve seismic hazard and risk forecasting, but the practicality of using simulation results is limited by the scale and complexity of the computations. Here we will focus on the experience of the Southern California Earthquake Center (SCEC) in applying workflow management tools to facilitate physics-based seismic hazard analysis. This system-level problem can be partitioned into a series of computational pathways according to causal sequences described in terms of conditional probabilities. For example, the exceedance probabilities of shaking intensities at geographically distributed sites conditional on a particular fault rupture (a ground motion prediction model or GMPM) can be combined with the probabilities of different ruptures (an earthquake rupture forecast or ERF) to create a seismic hazard map. Deterministic simulations of ground motions from very large suites (millions) of ruptures, now feasible through high-performance computational facilities such as SCEC's CyberShake Platform, are allowing seismologists to replace empirical GMPMs with physics-based models that more accurately represent wave propagation through heterogeneous geologic structures, such as the sedimentary basins that amplify seismic shaking. One iteration of the current broadband CyberShake hazard model for the Los Angeles region, which calculates ground motions deterministically up to 0.5 Hz and stochastically up to 10 Hz, requires the execution of about 3.3 billion jobs, taking 12.8 million computer hours and producing 10 TB of simulation data. We will show how the scalability and reliability of CyberShake calculations on some of the nation's largest computers has been improved using the Pegasus Workflow Management System. We will also describe the current challenges of scaling these calculations up by an order of magnitude to create a California-wide hazard model, which will be based on the new Uniform California Earthquake
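
The probability combination described above, a ground-motion model conditioned on each rupture weighted by the ERF's annual rupture rates, can be sketched in a few lines. This is a schematic, not CyberShake's implementation: the lognormal GMPM form and every number are invented for illustration.

```python
import math
import numpy as np

# Schematic hazard-curve assembly (not SCEC's CyberShake code): the annual
# rate of exceeding each shaking level is the sum over ruptures of the
# ERF's annual rupture rate times the GMPM's conditional exceedance
# probability.  An assumed lognormal GMPM is used; all values are invented.

def hazard_curve(im_levels, rupture_rates, medians, sigmas):
    im = np.asarray(im_levels, float)
    total = np.zeros_like(im)
    for rate, med, sig in zip(rupture_rates, medians, sigmas):
        z = (np.log(im) - math.log(med)) / sig          # standard normal variate
        p_exceed = np.array([0.5 * math.erfc(v / math.sqrt(2.0)) for v in z])
        total += rate * p_exceed                        # rate_r * P(IM > im | r)
    return total                                        # annual exceedance rates
```

At the median ground motion of a single rupture the conditional exceedance probability is 0.5, so the hazard curve there equals half that rupture's annual rate.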

  19. The safety evaluation of earthquake emergency shelter based on the finite element analysis (United States)

    Sun, Baitao; Yu, Jingjing; Yan, Peilei


    The earthquake emergency shelter is a powerful safeguard against natural hazards, human-caused accidents and other emergencies, so evaluating whether buildings can serve as earthquake emergency shelters is particularly important. The evaluation systems adopted domestically so far lack a unified criterion and remain subjective, hence a quantitative evaluation is necessary. This paper takes Nenjiang County as an example of such a safety evaluation; the method combines the measured project profile with calculated anti-earthquake performance indices to comprehensively assess the buildings' anti-earthquake redundancy, and finally provides the identification results. As a preliminary summary, this safety identification method for earthquake emergency shelters has definite guiding significance.

  20. Seismic Hazard analysis of Adjaria Region in Georgia (United States)

    Jorjiashvili, Nato; Elashvili, Mikheil


    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
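
The truncated exponential Gutenberg-Richter distribution mentioned above gives the annual rate of events of magnitude m or larger between a minimum and maximum magnitude. A sketch of the standard form follows; the parameter values are invented for illustration and are not CRISIS2007 output.

```python
import math

# Sketch of the truncated exponential Gutenberg-Richter magnitude-frequency
# distribution used by CRISIS-style PSHA codes.  nu is the annual rate of
# events with M >= m_min, beta = b * ln(10); all parameter values invented.

def gr_truncated_rate(m, nu, b, m_min, m_max):
    """Annual rate of earthquakes with magnitude >= m, truncated at m_max."""
    if m >= m_max:
        return 0.0
    beta = b * math.log(10.0)
    num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
    den = 1.0 - math.exp(-beta * (m_max - m_min))
    return nu * num / den
```

By construction the rate equals nu at m_min, decays roughly exponentially with magnitude, and reaches zero at m_max.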

  1. The 2011 Lorca earthquake in the context of seismic hazard and risk in Murcia; El terremoto de Lorca (2011) en el contexto de la peligrosidad y el riesgo sismico en Murcia

    Energy Technology Data Exchange (ETDEWEB)

    Belin Oterino, B.; Rivas Medina, A.; Gaspar-Escribano, J. M.; Murphy, P.


    An analysis of the different aspects related to the May 11th, 2011 Lorca earthquake is presented, covering recorded ground motions and damage observed in different building typologies, and contrasting these observations with previous results on seismic hazard and seismic risk obtained in the province of Murcia. The essential question addressed in the analysis is whether observed ground motions and physical damage can be considered as expected or as anomalous in the frame of seismic risk in southeastern Spain. In this respect, a number of reflections are carried out and several lessons learned from the earthquake are extracted, which leads to the proposal of different recommendations for the future revision of the Spanish earthquake-resistant provisions, as well as for defining risk reduction measures in the region. (Author) 25 refs.

  2. Probability estimates of seismic event occurrence compared to health hazards - Forecasting Taipei's Earthquakes (United States)

    Fung, D. C. N.; Wang, J. P.; Chang, S. H.; Chang, S. C.


    Using a revised statistical model built on past seismic probability models, the probability of different magnitude earthquakes occurring within variable timespans can be estimated. The revised model is based on the Poisson distribution and includes the use of best-estimate values of the probability distribution of different magnitude earthquakes recurring on a fault from literature sources. Our study aims to apply this model to the Taipei metropolitan area with a population of 7 million, which lies in the Taipei Basin and is bounded by two normal faults: the Sanchaio and Taipei faults. The Sanchaio fault is suggested to be responsible for previous large magnitude earthquakes, such as the 1694 magnitude 7 earthquake in northwestern Taipei (Cheng et al., 2010). Based on a magnitude 7 earthquake return period of 543 years, the model predicts the occurrence of a magnitude 7 earthquake within 20 years at 1.81%, within 79 years at 6.77% and within 300 years at 21.22%. These estimates increase significantly when considering a magnitude 6 earthquake; the chance of one occurring within the next 20 years is estimated to be 3.61%, within 79 years 13.54% and within 300 years 42.45%. The 79-year period represents the average lifespan of the Taiwan population. In contrast, based on data from 2013, the probability of Taiwan residents experiencing heart disease or malignant neoplasm is 11.5% and 29%, respectively. The inference of this study is that the risk to the Taipei population from a potentially damaging magnitude 6 or greater earthquake occurring within their lifetime is just as great as that of suffering from heart disease or other health ailments.
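
The Poisson occurrence assumption underlying such estimates gives P = 1 - exp(-t/T) for at least one event within an exposure time t, given a mean return period T. Note that this textbook baseline does not reproduce the study's quoted percentages, since their revised model folds in a probability distribution over recurrence parameters; the sketch below only illustrates the generic formula.

```python
import math

# Generic Poisson occurrence probability: P = 1 - exp(-t/T) for at least one
# event in exposure time t with mean return period T.  This is the textbook
# baseline only; the study's revised model incorporates best-estimate
# parameter distributions, so its quoted values differ from this formula.

def poisson_occurrence_prob(t_years, return_period_years):
    return 1.0 - math.exp(-t_years / return_period_years)
```

A useful sanity check: with t equal to the return period itself, the probability is 1 - 1/e, about 63%, not 100%.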

  3. Tsunami hazard assessments with consideration of uncertain earthquake slip distribution and location (United States)

    Sepúlveda, Ignacio; Liu, Philip L.-F.; Grigoriu, Mircea; Pritchard, Matthew


    This paper proposes a stochastic approach to model the earthquake uncertainties in terms of the rupture location and the slip distribution for a future event with an expected earthquake magnitude. Once the statistical properties of the earthquake uncertainties are described, they are then propagated into the tsunami response and the inundation at assessed coastal areas. The slip distribution is modeled as a random field within a nonrectangular rupture area. The Karhunen-Loève (K-L) expansion method is used to generate samples of the random slip, and a translation model is employed to obtain target probability properties. A strategy is developed to specify the accuracy of the random samples in terms of the number of subfaults of the rupture area and the truncation of the K-L expansion. The propagation of uncertainty into the tsunami response is performed by means of a Stochastic Reduced Order Model (SROM). To illustrate the methodology, we investigated a case study in northern Chile. We first demonstrate that the stochastic approach generates consistent earthquake samples with respect to the target probability properties. We also show that the results obtained from the SROM are more accurate than those obtained with classic Monte Carlo simulations. To validate the methodology, we compared the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of earthquakes.
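
The K-L expansion step can be sketched as an eigendecomposition of the slip covariance followed by a truncated weighted sum of eigenvectors. The snippet below is a hypothetical 1-D illustration: the exponential covariance kernel, the function name and all parameter values are assumptions, not the authors' setup.

```python
import numpy as np

# Hypothetical 1-D sketch of Karhunen-Loeve sampling of a correlated random
# slip field over a line of subfaults (not the authors' implementation):
# eigendecompose an assumed exponential covariance, keep the leading modes,
# and synthesise each sample as sum_k xi_k * sqrt(lambda_k) * phi_k.

def kl_slip_samples(n_sub, corr_len, n_modes, n_samples, sigma=1.0, seed=0):
    x = np.arange(n_sub, dtype=float)
    cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    evals, evecs = np.linalg.eigh(cov)           # ascending eigenvalues
    evals, evecs = evals[::-1], evecs[:, ::-1]   # largest modes first
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal((n_samples, n_modes))      # K-L coefficients
    weights = np.sqrt(np.maximum(evals[:n_modes], 0.0)) # guard round-off
    return xi @ (weights * evecs[:, :n_modes]).T        # (n_samples, n_sub)
```

Truncating at n_modes < n_sub drops the smallest-variance modes, which is the accuracy trade-off the paper's sampling strategy quantifies.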

  4. Natural Hazard Public Policy Implications of the May 12, 2008 M7.9 Wenchuan Earthquake, Sichuan, China (United States)

    Cydzik, K.; Hamilton, D.; Stenner, H. D.; Cattarossi, A.; Shrestha, P. L.


    by the earthquake have allowed survivors to begin rebuilding their lives. However, the long-term impact of the earthquake continues to make headlines. Post-earthquake landslides and debris flows initiated by storm events have continued to impart devastation on the region. Events such as the Wenchuan Earthquake provide unique opportunities for engineers, scientists, and policy makers to collaborate for purposes of exploring the details of natural hazards and developing sound policies to protect lives and property in the future.

  5. A cross section of the Los Angeles Area: Seismically active fold and thrust belt, The 1987 Whittier Narrows earthquake, and earthquake hazard (United States)

    Davis, Thomas L.; Namson, Jay; Yerkes, Robert F.


    Retrodeformable cross sections across the Los Angeles area interpret the Pliocene to Quaternary deformation to be a developing basement-involved fold and thrust belt. The fold and thrust belt is seismically active as evidenced by the 1987 Whittier Narrows earthquake (ML = 5.9) and the 1971 San Fernando earthquake (MW = 6.6). The structural geology of the Los Angeles area is dominated by three major compressional uplift trends: (1) the Palos Verdes anticlinorium and western shelf, (2) the Santa Monica Mountains anticlinorium, and (3) the Verdugo Mountains-San Rafael Hills and the San Gabriel Mountains. These trends result from major thrust ramps off a detachment(s) at 10-15 km depth. Thrusts of the Verdugo Mountains-San Rafael Hills and the San Gabriel Mountains reach the surface; the other two uplifts are associated with blind thrusts. Compressional seismicity is concentrated along these thrust ramps. The 1987 Whittier Narrows earthquake probably occurred on the Elysian Park thrust which underlies the Santa Monica Mountains anticlinorium. The thrust interpretation accounts for the geometry of the anticlinorium, the seismological characteristics of the earthquake, and the geometry of coseismic uplift. The earthquake and aftershocks occurred within a structurally complex, narrow zone of Miocene and Pliocene northwest trending faults that cross the anticlinorium at a high angle. These northwest trending faults are interpreted to be reactivated faults now behaving as tears in the Elysian Park thrust and not the result of active right-lateral deformation extending into the Whittier Narrows area. Our analysis suggests the Whittier Narrows earthquake sequence occurred within a structurally weakened zone along the Elysian Park thrust. We also suggest that the Whittier fault is not an important Quaternary structure and may not be seismogenic. The regional cross section is a nonunique solution, and other possible solutions are considered. Multiple solutions arise from the

  6. Fractal analysis of the spatial distribution of earthquakes along the Hellenic Subduction Zone (United States)

    Papadakis, Giorgos; Vallianatos, Filippos; Sammonds, Peter



  7. The 1930 Irpinia earthquake: collection and analysis of historical waveforms (United States)

    Ferrari, G.; Megna, A.; Nardi, A.; Palombo, B.; Perniola, B.; Pino, N.


    The 1930 Irpinia earthquake is one of the most destructive events instrumentally recorded in Italy. Several large events occurred in the same area before (1456, 1694, 1702, 1732, 1910) and after (1962, 1980, 1983) 1930, and it has been hypothesized that significant differences characterize their source geometries. Early work by several authors, based on macroseismic studies and single-station waveform analysis, suggests a quasi-strike-slip mechanism on an approximately EW-oriented fault plane. Conversely, all the major events in the area display normal-fault mechanisms on Apennine-oriented (NW-SE) fault planes. In the present work we have collected about 45 waveforms for the 1930 earthquake, recorded at various European observatories, aiming to find valuable hints on source geometry and kinematics. The seismograms have been rasterized, digitized and processed within the framework of the SISMOS project. The study of this earthquake is part of a wider ongoing research program on the 20th century Irpinia earthquakes (1910, 1930, 1962 and 1980) within the collaboration between the TROMOS and SISMOS projects of the National Institute of Geophysics and Volcanology. The search for and recovery of historical recordings is a unique opportunity to shed light upon scientific aspects related to this kind of investigation. Preliminary results of the 1930 earthquake waveform analysis are presented here.

  8. Seismic hazard analysis of Tianjin area based on strong ground motion prediction (United States)

    Zhao, Boming


    Taking Tianjin as an example, this paper proposes a methodology and process for evaluating near-fault strong ground motions from future earthquakes, to mitigate earthquake damage to the metropolitan area and to important engineering structures. Strong ground motion was predicted for the main faults of Tianjin by a hybrid method consisting mainly of a 3D finite-difference method and stochastic Green's functions. Simulations were performed for the 3D structure of the Tianjin region using characterized asperity models; the characterized asperity model describing source heterogeneity follows the fault information from the project Tianjin Active Faults and Seismic Hazard Assessment. We simulated the worst case, in which two earthquakes occur separately. The results indicate that fault position, rupture process, and the sedimentary deposits of the basin significantly affect the amplification of the simulated ground motion. Our results also demonstrate the practicality of simulating broadband wave propagation, including basin-induced surface waves, for seismic hazard analysis near faults in urbanized areas.

  9. A new Bayesian Earthquake Analysis Tool (BEAT) (United States)

    Vasyura-Bathke, Hannes; Dutta, Rishabh; Jónsson, Sigurjón; Mai, Martin


    Modern earthquake source estimation studies increasingly use non-linear optimization strategies to estimate kinematic rupture parameters, often considering geodetic and seismic data jointly. However, the optimization process is complex and consists of several steps that need to be followed in the earthquake parameter estimation procedure. These include pre-describing or modeling the fault geometry, calculating the Green's functions (often assuming a layered elastic half-space), and estimating the distributed final slip and possibly other kinematic source parameters. Recently, Bayesian inference has become popular for estimating posterior distributions of earthquake source model parameters given measured/estimated/assumed data and model uncertainties. For instance, some research groups consider uncertainties of the layered medium and propagate these to the source parameter uncertainties. Other groups make use of informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed that efficiently explore the often high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational demands of these methods are high and estimation codes are rarely distributed along with the published results. Even if codes are made available, it is often difficult to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible. 
In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in earthquake source estimations, we undertook the effort of producing BEAT, a python package that comprises all the above-mentioned features in one

  10. Fire hazard analysis for fusion energy experiments

    Energy Technology Data Exchange (ETDEWEB)

    Alvares, N.J.; Hasegawa, H.K.


    The 2XIIB mirror fusion facility at Lawrence Livermore Laboratory (LLL) was used to evaluate the fire safety of state-of-the-art fusion energy experiments. The primary objective of this evaluation was to ensure the parallel development of fire safety and fusion energy technology. Through fault-tree analysis, we obtained a detailed engineering description of the 2XIIB fire protection system. This information helped us establish an optimum level of fire protection for experimental fusion energy facilities as well as evaluate the level of protection provided by various systems. Concurrently, we analyzed the fire hazard inherent to the facility using techniques that relate the probability of ignition to the flame spread and heat-release potential of construction materials, electrical and thermal insulations, and dielectric fluids. A comparison of the results of both analyses revealed that the existing fire protection system should be modified to accommodate the range of fire hazards inherent to the 2XIIB facility.
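
    The fault-tree step described above can be illustrated with a minimal sketch of probability propagation through AND/OR gates. The gate structure and event probabilities below are hypothetical, invented for illustration; they are not the 2XIIB fire protection tree.

```python
# Minimal fault-tree sketch: propagate basic-event probabilities up
# through AND/OR gates to a top event, assuming independent events.
# The tree and numbers are hypothetical, not the 2XIIB analysis.

def or_gate(*probs):
    """P(at least one input event occurs), assuming independence."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(*probs):
    """P(all input events occur), assuming independence."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic events (annual probabilities)
ignition = or_gate(0.01, 0.005)          # electrical fault OR hot work
suppression_fails = and_gate(0.05, 0.1)  # detector fails AND sprinkler fails
uncontrolled_fire = and_gate(ignition, suppression_fails)
print(f"P(uncontrolled fire) ~ {uncontrolled_fire:.2e}")
```

    A real analysis would, as in the abstract, combine such top-event probabilities with flame-spread and heat-release data for the installed materials.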

  11. Decision analysis for INEL hazardous waste storage

    Energy Technology Data Exchange (ETDEWEB)

    Page, L.A.; Roach, J.A.


    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft² of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  12. Tsunami hazards to U.S. coasts from giant earthquakes in Alaska (United States)

    Ryan, Holly; von Huene, Roland; Scholl, Dave; Kirby, Steve


    In the aftermath of Japan's devastating 11 March 2011 Mw 9.0 Tohoku earthquake and tsunami, scientists are considering whether and how a similar tsunami could be generated along the Alaskan-Aleutian subduction zone (AASZ). A tsunami triggered by an earthquake along the AASZ would cross the Pacific Ocean and cause extensive damage along highly populated U.S. coasts, with ports being particularly vulnerable. For example, a tsunami in 1946 generated by a Mw 8.6 earthquake near Unimak Pass, Alaska (Figure 1a), caused significant damage along the U.S. West Coast, took 150 lives in Hawaii, and inundated shorelines of South Pacific islands and Antarctica [Fryer et al., 2004; Lopez and Okal, 2006]. The 1946 tsunami occurred before modern broadband seismometers were in place, and the mechanisms that created it remain poorly understood.

  13. Update of the USGS 2016 One-year Seismic Hazard Forecast for the Central and Eastern United States From Induced and Natural Earthquakes (United States)

    Petersen, M. D.; Mueller, C. S.; Moschetti, M. P.; Hoover, S. M.; Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.; Rubinstein, J. L.; McGarr, A.; Rukstales, K. S.


    The U.S. Geological Survey released a 2016 one-year forecast for seismic hazard in the central and eastern U.S., which included the influence from both induced and natural earthquakes. This forecast was primarily based on 2015 declustered seismicity rates but also included longer-term rates, 10- and 20-km smoothing distances, earthquakes between Mw 4.7 and maximum magnitudes of 6.0 or 7.1, and 9 alternative ground motion models. Results indicate that areas in Oklahoma, Kansas, Colorado, New Mexico, Arkansas, Texas, and the New Madrid Seismic Zone have a significant chance for damaging ground shaking levels in 2016 (greater than 1% chance of exceeding 0.12 g PGA and MMI VI). We evaluate this one-year forecast by considering the earthquakes and ground shaking levels that occurred during the first half of 2016 (earthquakes not included in the forecast). During this period the full catalog records hundreds of events with M ≥ 3.0, but the declustered catalog eliminates most of these dependent earthquakes and results in much lower numbers of earthquakes. The declustered catalog based on USGS COMCAT indicates a M 5.1 earthquake occurred in the zone of highest hazard on the map. Two additional earthquakes of M ≥ 4.0 occurred in Oklahoma, and about 82 earthquakes of M ≥ 3.0 occurred, with 77 in Oklahoma and Kansas, 4 in the Raton Basin of Colorado/New Mexico, and 1 near Cogdell, Texas. In addition, 72 earthquakes occurred outside the zones of induced seismicity, with more than half in New Madrid and eastern Tennessee. The catalog rates in the first half of 2016 and the corresponding seismic hazard were generally lower than in 2015. For example, the zones for Irving, Venus, and Fashing, Texas; Sun City, Kansas; and north-central Arkansas did not experience any earthquakes with M ≥ 2.7 during this period. The full catalog rates were lower by about 30% in the Raton Basin and the Oklahoma-Kansas zones, but the declustered catalog rates did not drop as much. This decrease in earthquake

  14. An Overview of Soil Models for Earthquake Response Analysis

    Directory of Open Access Journals (Sweden)

    Halida Yunita


    Earthquakes can damage thousands of buildings and infrastructure as well as cause the loss of thousands of lives. During an earthquake, the damage to buildings is mostly caused by the effect of local soil conditions. Depending on the soil type, the earthquake waves propagating from the epicenter to the ground surface will result in various behaviors of the soil. Several studies have been conducted to accurately obtain the soil response during an earthquake. The soil model used must be able to characterize the stress-strain behavior of the soil during the earthquake. This paper compares equivalent linear and nonlinear soil model responses. Analysis was performed on two soil types, Site Class D and Site Class E. An equivalent linear soil model leads to a constant value of shear modulus, while in a nonlinear soil model, the shear modulus changes constantly, depending on the stress level, and shows inelastic behavior. The results from a comparison of both soil models are displayed in the form of maximum acceleration profiles and stress-strain curves.
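
    The core of the equivalent-linear idea can be sketched as a fixed-point iteration: the shear modulus is repeatedly updated until it is compatible with the strain it implies. The hyperbolic modulus-reduction curve and the soil parameters below are generic illustrations, not values from the paper.

```python
# Sketch of the equivalent-linear iteration: find a shear modulus G
# compatible with the mobilized shear strain. Curve and parameters
# are illustrative assumptions, not the paper's site data.

def modulus_reduction(gamma, gamma_ref=1e-3):
    """Hyperbolic G/Gmax curve: stiffness degrades with strain gamma."""
    return 1.0 / (1.0 + gamma / gamma_ref)

def equivalent_linear_G(tau, G_max, tol=1e-8, max_iter=100):
    """Iterate until G = G_max * reduction(tau / G) is self-consistent."""
    G = G_max
    for _ in range(max_iter):
        gamma = tau / G                      # strain implied by current G
        G_new = G_max * modulus_reduction(gamma)
        if abs(G_new - G) < tol * G_max:
            return G_new
        G = G_new
    return G

G_max = 60e6   # Pa, small-strain shear modulus (illustrative)
tau = 30e3     # Pa, imposed shear stress (illustrative)
G = equivalent_linear_G(tau, G_max)
print(f"strain-compatible G/Gmax = {G / G_max:.3f}")
```

    A fully nonlinear model, by contrast, would update the modulus at every time step of the response history rather than once per analysis.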

  15. A critical analysis of earthquakes and urban planning in Turkey. (United States)

    Sengezer, Betül; Koç, Ercan


    The land use plans and policies of developed countries that live with the threat of earthquakes are gaining importance in reducing or eliminating the long-term threat to people and property. In developing countries, however, these plans and policies seem to increase the level of vulnerability. This paper examines the effects of the earthquakes that have occurred in Turkey since 1992, with a particular focus on urbanisation and planning policies. It is based on extensive surveys carried out on location immediately after the earthquakes in Erzincan and Kocaeli-Gölcük in 1992 and 1999, respectively. The analysis takes into account several factors, including the height of buildings, geological conditions and the construction period. The authors conclude that land use planning can serve as a very useful instrument for mitigating the extent of disaster damage if it is part of an appropriate planning system. In the case of Turkey, the planning system needs to be reorganised for this purpose.

  16. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis (United States)

    Schaefer, K.; Prakash, A.; Witte, W.


    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area that is dominated by semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wild land fires, earthquakes, extreme winter storms and other natural and man-made hazards. Being a large 19,065 km2 area, with a population of approximately 97,000 residents, providing emergency services to residents in a timely manner is a challenge. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network analysis based decision support tool that we have created for planning and emergency response. This tool will be used by Emergency Service (Fire/EMS), Emergency Management, Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis resulted in determining response time and classifying the borough by response times to facilitate allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) major wildfire threatening Fairbanks, (ii) a major earthquake, (iii) loss of power during flooding in a flood-prone area. 
We also combined the network analysis results with high resolution imagery and elevation data to determine
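
    The response-time classification described above rests on shortest-path travel times over the road network. A minimal sketch, using a tiny invented graph rather than the actual FNSB road data:

```python
import heapq

# Dijkstra shortest travel times from a responder location over a
# road graph. The graph below is hypothetical; a real run would load
# the FNSB road network and measured segment travel times.

def response_times(graph, source):
    """Minutes of travel from source to every reachable node."""
    times = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        t, node = heapq.heappop(pq)
        if t > times.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, minutes in graph.get(node, []):
            nt = t + minutes
            if nt < times.get(nbr, float("inf")):
                times[nbr] = nt
                heapq.heappush(pq, (nt, nbr))
    return times

# node: [(neighbor, travel minutes), ...] -- hypothetical road segments
roads = {
    "station": [("A", 4.0), ("B", 7.0)],
    "A": [("B", 2.0), ("C", 8.0)],
    "B": [("C", 3.0)],
    "C": [],
}
times = response_times(roads, "station")
# classify areas by response time, e.g. under/over 10 minutes
zones = {n: ("fast" if t <= 10.0 else "slow") for n, t in times.items()}
```

    Running the same computation with a road segment removed (a blocked highway) shows how evacuation and response options change under a disaster scenario.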

  17. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.


    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting out limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performances of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition

  18. Integrating human factors into process hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kariuki, S.G. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany); Loewe, K. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany)]. E-mail:


    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at the operator error as the sole contributor to the human failure within a system but a combination of all underlying factors.

  19. Integrating landslide and liquefaction hazard and loss estimates with existing USGS real-time earthquake information products (United States)

    Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan


    The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction developed with our collaborators in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area by not only improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is based on the foundation of an expanding, openly available, case-history database we are compiling along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary hazards products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products. 

  20. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.


    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  1. Current issues and related activities in seismic hazard analysis in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong-Moon [Korea Atomic Energy Research Inst., Taejon (Korea, Republic of); Lee, Jong-Rim; Chang, Chun-Joong


    This paper discusses some technical issues identified from the seismic hazard analyses for probabilistic safety assessment of the operating Korean nuclear power plants, and the related activities to resolve these issues. Since there are no strong instrumental earthquake records in Korea, seismic hazard analysis depends mainly on the historical earthquake record. Results of past seismic hazard analyses show that there are many uncertainties in the attenuation function and intensity level, and that there is a need to improve the statistical method. The identification of the activity of the Yangsan Fault, which is close to nuclear power plant sites, has been an important issue, but it has not been resolved yet in spite of much research work. Recently, some capable faults were found in the offshore area of Gulupdo Island in the Yellow Sea. It is anticipated that the results of research on both the Yangsan Fault and the reduction of uncertainty in seismic hazard analysis will have a significant influence on the seismic design and safety assessment of nuclear power plants in the future. (author)

  2. Nowcasting Earthquakes (United States)

    Rundle, J. B.; Donnellan, A.; Grant Ludwig, L.; Turcotte, D. L.; Luginbuhl, M.; Gail, G.


    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), defined as the cumulative probability distribution P(n < n(t)) for the number of small earthquakes in the region. From the count of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)). The EPS ranks the current hazard and assigns a number between 0% and 100% to every region so defined, thus providing a unique measure. Physically, the EPS corresponds to an estimate of the level of progress through the earthquake cycle in the defined region at the current time.
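
    A sketch of how such a score can be computed: the EPS is the empirical cumulative probability that a completed large-earthquake cycle contained fewer small events than have occurred since the last large earthquake. The cycle counts below are invented for illustration.

```python
# Earthquake potential score (EPS) sketch: empirical P(n < n(t)) over
# past large-earthquake cycles. The counts are hypothetical, not from
# any real catalog.

def earthquake_potential_score(past_cycle_counts, current_count):
    """Fraction of past cycles with fewer small events than current_count."""
    below = sum(1 for n in past_cycle_counts if n < current_count)
    return below / len(past_cycle_counts)

# Hypothetical: small-earthquake counts in 20 completed large-event cycles
past_counts = [210, 180, 340, 150, 290, 260, 310, 170, 220, 330,
               240, 190, 280, 160, 300, 250, 230, 320, 200, 270]
current = 265   # small events observed since the last large earthquake
eps = earthquake_potential_score(past_counts, current)
print(f"EPS = {eps:.0%}")
```

    An EPS near 100% would indicate that nearly every past cycle ended before accumulating this many small events, i.e. the region is late in its cycle.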

  3. Geological and Seismological Evaluation of Earthquake Hazards at Ririe Dam, Idaho (United States)


    …extensive flows of basaltic lavas. The thick lower block is poorly understood but is believed to represent magmatic material. A mantle hot spot… Major faults and earthquakes of MM intensity VI and greater… the plain was produced by isostatic adjustments of the dense magmatic fill that was

  4. Heredity links natural hazards and human health: Apolipoprotein E gene moderates the health of earthquake survivors. (United States)

    Daly, Michael; MacLachlan, Malcolm


    This study aimed to investigate the role of the apolipoprotein ε4 allele in moderating the influence of an exogenous stressor, an earthquake, on health. A "natural experiment" design was used where the interaction between the presence of the apolipoprotein ε4 allele and the level of subjective and objective exposure to a devastating earthquake was examined in a population-based cohort of elderly Taiwanese (N = 718). The cognitive-affective dimension of health was assessed by measures of perceived control and depression and functional limitations were assessed using measures of instrumental activities of daily living and mobility. Overall health status was gauged using a single-item measure of self-rated health. Those who experienced damage to their property or were forced to move from their homes (high objective exposure) demonstrated low levels of self-rated health and somewhat lower perceived control a year later, only if they were apolipoprotein ε4 carriers. Similarly, those who found the earthquake severely distressing (high subjective exposure) were shown to have low levels of functioning and low self-rated health a year later, only if they possessed the ε4 allele. Our findings suggest that genetic variation in the apolipoprotein E gene may modify the health effects of the exogenous stress of natural disaster exposure.

  5. Surface-seismic imaging for nehrp soil profile classifications and earthquake hazards in urban areas (United States)

    Williams, R.A.; Stephenson, W.J.; Odum, J.K.


    We acquired high-resolution seismic-refraction data on the ground surface in selected areas of the San Fernando Valley (SFV) to help explain the earthquake damage patterns and the variation in ground motion caused by the 17 January 1994 magnitude 6.7 Northridge earthquake. We used these data to determine the compressional- and shear-wave velocities (Vp and Vs) at 20 aftershock recording sites to 30-m depth (Vs30 and Vp30). Two other sites, located next to boreholes with downhole Vp and Vs data, show that we imaged very similar seismic-velocity structures in the upper 40 m. Overall, high site response appears to be associated with low Vs in the near surface, but there can be a wide range of site amplifications for a given NEHRP soil type. The data suggest that for the SFV, if the Vs30 is known, we can determine whether the earthquake ground motion will be amplified above a factor of 2 relative to a local rock site.

  6. After the damages: Lessons learned from recent earthquakes for ground-motion prediction and seismic hazard assessment (C.F. Gauss Lecture) (United States)

    Cotton, Fabrice


    Recent damaging earthquakes (e.g. Japan 2011, Nepal 2015, Italy 2016) and the associated ground-shaking (ground-motion) records challenge the engineering models used to quantify seismic hazard. The goal of this presentation is to discuss the lessons learned from these recent events and their implications for ground-motion prediction and probabilistic seismic hazard assessment. The following points will be particularly addressed: 1) Recent observations clearly illustrate the dependency of ground-shaking on earthquake source related factors (e.g. fault properties and geometry, earthquake depth, directivity). The weaknesses of classical models and the impact of these factors on hazard evaluation will be analysed and quantified. 2) These observations also show that events of similar magnitude and style of faulting produce ground-motions which are highly variable. We will analyse this variability and show that the exponential growth of recorded data gives a unique opportunity to quantify regional or between-event shaking variations. Indeed, most seismic-hazard evaluations do not consider the regional specificities of earthquake or wave-propagation properties. There is little guidance in the literature on how this should be done, and we will show that this challenge is interdisciplinary, as structural geology, neotectonics and tomographic images can provide key understanding of these regional variations. 3) One of the key lessons of recent earthquakes is that extreme hazard scenarios and ground-shaking are difficult to predict. In other words, we need to mobilize "scientific imagination" and define new strategies based on the latest research results to capture epistemic uncertainties and integrate them in engineering seismology projects. We will discuss these strategies and show an example of their implementation to develop new seismic hazard maps of Europe (Share and Sera FP7 projects) and Germany.

  7. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region (United States)

    Ayele, Atalay


Seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale, and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area, available from recent studies, is used for area source zonation. For assessing the seismic hazard, the study area was divided into small grid cells of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each of these grid cells by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) values corresponding to 10% and 2% probabilities of exceedance in 50 years were calculated for all grid points for a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years, and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at a rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd-generation building code of Ethiopia.
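The two probability levels quoted above map to return periods through the Poisson occurrence model assumed in the abstract. A minimal sketch (standard library only; the function name is mine, not from the study):

```python
import math

def return_period(p_exceed: float, t_years: float) -> float:
    """Return period T (years) for a given probability of exceedance
    within t_years, assuming a Poisson model: p = 1 - exp(-t/T)."""
    return -t_years / math.log(1.0 - p_exceed)

# The two standard design levels used in the abstract:
print(round(return_period(0.10, 50)))  # 10% in 50 yr -> ~475-year return period
print(round(return_period(0.02, 50)))  # 2% in 50 yr  -> ~2475-year return period
```

Rounding recovers the familiar 475-year and 2475-year return periods for the 10%- and 2%-in-50-years design levels.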

  8. HAZUS Analysis of a Hosgri Fault Earthquake Scenario in Support of the Diablo Canyon Power Plant Earthquake Emergency Evacuation Study (United States)

    McLaren, M. K.; Nishenko, S. P.; Seligson, H.; Vardas, T.


    The objective of this project was to provide detailed bridge and roadway damage estimates within Diablo Canyon Power Plant's (DCPP) Emergency Planning Zone (EPZ) resulting from a Moment Magnitude (Mw) 7.2 scenario earthquake on the Hosgri Fault, to be used in the subsequent evacuation planning efforts. Scenario earthquake damage assessments implemented for this study utilized the Federal Emergency Management Agency's HAZUS (HAZUS-MH MR-4) natural hazard loss estimation software. Ground motion data for the M7.2 Hosgri Fault scenario were developed by the ShakeMap Development Team (Dr. David Wald and Dr. Kuo-wan Lin of the USGS) using Chiou and Youngs' "Next Generation Attenuation" (NGA) relationship. Liquefaction and landslide susceptibility within the DCPP Emergency Planning Zone were mapped by Fugro William Lettis & Associates (FWLA). Several bridge database improvements were implemented, derived from available information on bridge retrofit and replacement, provided by Caltrans and San Luis Obispo County Public Works personnel. Data on 186 Caltrans-owned bridges in San Luis Obispo County, including 22 with "Phase 2" bridge retrofits, were provided by Mark Yashinsky, Caltrans Office of Earthquake Engineering. Data on 13 County-owned bridges, including five that have had Phase 2 retrofit work completed and eight that have been replaced, were provided by Dave Flynn, County of San Luis Obispo Department of Public Works. In addition, enhanced roadway data within the EPZ were compiled and incorporated into HAZUS, including improved highway and roadway data available from ESRI (ArcGIS 9 Media Kit, ESRI Data and Maps), and street centerline data within the DCPP Plant limits, provided by FWLA. This study also leveraged earlier work conducted on behalf of the California Emergency Management Agency to test methodologies for improving the underlying building inventory databases for HAZUS (see: Improved building inventory

  9. Earthquake warning system for infrastructures : a scoping analysis.

    Energy Technology Data Exchange (ETDEWEB)

Brodsky, Nancy S.; O'Connor, Sharon L.; Stamber, Kevin Louis; Kelic, Andjelka; Fogleman, William E. (GRIT, Inc., Albuquerque, NM); Vugrin, Eric D.; Corbet, Thomas Frank, Jr.; Brown, Theresa Jean


This report provides the results of a scoping study evaluating the potential risk-reduction value of a hypothetical earthquake early-warning system. The study was based on an analysis of the actions that could be taken to reduce risks to population and infrastructure, how much time would be required to take each action, and the potential consequences of false alarms given the nature of the action. The results of the scoping analysis indicate that risks could be reduced through improving existing event notification systems and individual responses to the notifications, and through the production and utilization of more detailed risk maps for local planning. Detailed maps and training programs, based on existing knowledge of geologic conditions and processes, would reduce uncertainty in the consequence portion of the risk analysis. Uncertainties in the timing, magnitude, and location of earthquakes, and the potential impacts of false alarms, will present major challenges to the value of an early-warning system.

  10. Cascadia slow slip events and earthquake initiation theories: Hazards research with Plate Boundary Observatory geodetic data (Invited) (United States)

    Roeloffs, E. A.; Beeler, N. M.


The relationship of transient slow slip events (SSEs) to great earthquakes is a global focus of intense and critical hazards research. Plate Boundary Observatory (PBO) GPS and borehole strainmeter (BSM) networks in the Cascadia forearc provide detailed data that can be compared with simulations predicting how SSEs might evolve as a great earthquake approaches. Cascadia SSEs represent aseismic slip of a few cm in the direction of plate convergence over a period of days or weeks, in a depth range down-dip from the locked zone expected to generate the next great Cascadia subduction earthquake. During an SSE, shear stress borne in the SSE depth range is transferred up-dip at an above-background loading rate. If shear stress on the locked zone is continually accumulating, the daily probability of reaching a threshold failure stress is elevated during an SSE. Alternatively, if dynamic instability is due to rate-weakening fault strength, then SSEs still promote earthquake initiation, but that initiation may be delayed until after the SSE ends, and short-duration SSEs may have negligible effect. In some numerical simulations, great earthquakes could nucleate in the SSE depth range, where effective pressure is assumed to be low. Certain models predict that successive SSEs will slip to increasingly shallower depths, eventually encountering higher effective stress where shear heating can destabilize slip and lead to dynamic rupture. PBO GPS stations have recorded surface deformation from SSEs since their inception in 2003; borehole strainmeters (BSMs) have recorded SSE strain signals since 2007. GPS and seismic tremor data show that SSEs recur all along the Cascadia subduction zone. An SSE is in progress somewhere in Cascadia much of the time, so the short-term probability increase warranted by a typical SSE is presumably low. We could, however, detect differences among successive SSEs and use criteria informed by the models described above to judge whether a distinctive SSE

  11. Studies of crustal structure, seismic precursors to volcanic eruptions and earthquake hazard in the eastern provinces of the Democratic Republic of Congo

    CSIR Research Space (South Africa)

    Mavonga, T


    Full Text Available of an effort to monitor the volcanoes and quantitatively assess the earthquake hazard. This information can be used to regulate the settlement of displaced people and to 'build back better'. In order to investigate volcanic processes in the Virunga area, a...

  12. Micro-earthquake signal analysis and hypocenter determination around Lokon volcano complex

    Energy Technology Data Exchange (ETDEWEB)

Firmansyah, Rizky (Geophysical Engineering, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Bandung, 40132, Indonesia); Nugraha, Andri Dian (Global Geophysical Group, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Bandung, 40132, Indonesia); Kristianto (Center for Volcanology and Geological Hazard Mitigation (CVGHM), Geological Agency, Bandung, 40122, Indonesia)


Mount Lokon is one of five active volcanoes located in the North Sulawesi region. Since June 26th, 2011, a standby alert has been set for this mountain by the Center for Volcanology and Geological Hazard Mitigation (CVGHM). Mount Lokon erupted on July 4th, 2011 and continued to erupt until August 28th, 2011. Because of this high seismic activity, this study focuses on the analysis of micro-earthquake signals and the determination of micro-earthquake hypocenter locations around the Lokon-Empung volcano complex before the 2011 eruption phase (January 2009 to March 2010). Hypocenter locations were determined with the Geiger Adaptive Damping (GAD) method. We used an initial velocity model from a previous study at Volcán de Colima, Mexico; this model was selected because Mount Lokon and Colima share similar characteristics, both being andesitic stratovolcanoes with small Plinian and Vulcanian explosion types. Event picking was limited to volcano-tectonic events of A and B types, hybrid events, long-period events with a clear signal onset, and local tectonic events with maximum S-P times of no more than three seconds. As a result, we observed that the micro-earthquakes occurred northwest of the Mount Lokon region.

  13. Multi-scale earthquake hazard and risk in the Chinese mainland and countermeasures for the preparedness, mitigation, and management: an overview (United States)

    Wu, Z.; Jiang, C.; Ma, T.


Earthquake hazard and risk in the Chinese mainland exhibit multi-scale characteristics. Temporal scales from centuries to months, spatial scales from the whole mainland to specific engineering structures, and energy scales from great disastrous earthquakes to small earthquakes causing social disturbance and economic loss all feature in the complexity of earthquake disasters. To cope with such a complex challenge, several research and application projects have been undertaken in recent years. Lessons and experiences of the 2008 Wenchuan earthquake contributed much to the launching and conduct of these projects. Understandings of the scientific problems and the technical approaches taken in the mainstream studies in the Chinese mainland do not differ significantly from those in the international scientific community, although the use of some terminologies shows "cultural differences" - for instance, in the China Earthquake Administration (CEA), the terminology "earthquake forecast/prediction (study)" is generally used in a much broader sense, mainly indicating time-dependent seismic hazard at different spatio-temporal scales. Several scientific products have been produced to serve society in different forms. These scientific products have unique academic merits due to their long-term persistence and forward-forecast nature, which are essential for evaluating technical performance and falsifying scientific ideas. On the other hand, in the language of actor network theory (ANT) in science studies (or the sociology of science), the hierarchical "actors' network" that transforms the science into actions of the public and government for the preparedness, mitigation, and management of multi-scale earthquake disasters is still in need of careful construction and improvement.

  14. Hazard Analysis for Building 34 Vacuum Glove Box Assembly (United States)

    Meginnis, Ian


    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  15. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Small Business Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake (United States)

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.


The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that would cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of small businesses in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each business establishment size category to each Instrumental Intensity level. The analysis concerns the direct effect of the earthquake on small businesses. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data include the number of business establishments, employees, and quarterly payroll categorized by business establishment size.
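The exposure and sensitivity measures described above can be sketched with toy numbers: exposure aggregates absolute counts of a labor-market variable per Instrumental Intensity level, and sensitivity expresses each level's share as a percentage. The ZIP codes and counts below are invented for illustration, not data from the report:

```python
# Hypothetical ZIP-code-level records: (zip_code, Instrumental Intensity,
# number of small-business establishments assigned to that ZIP code).
zip_data = [
    ("90001", "VII", 120),
    ("90002", "VII", 80),
    ("90003", "VIII", 50),
    ("90004", "VI", 150),
]

# Exposure: absolute counts per intensity level.
exposure = {}
for _, mmi, count in zip_data:
    exposure[mmi] = exposure.get(mmi, 0) + count

# Sensitivity: percentage of the category's total exposed at each level.
total = sum(exposure.values())
sensitivity = {mmi: 100.0 * n / total for mmi, n in exposure.items()}

print(exposure)     # counts per intensity level
print(sensitivity)  # percentage of the category at each level
```

With these invented counts, intensity VII holds half of the category's establishments (200 of 400), VI holds 37.5%, and VIII holds 12.5%.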

  16. Operational Earthquake Forecasting: State of Knowledge and Guidelines for Implementation.


    Koshun Yamaoka; Gerassimos Papadopoulos; Gennady Sobolev; Warner Marzocchi; Ian Main; Raul Madariaga; Paolo Gasparini; Yun-Tai Chen; Jordan, Thomas H.; Jochen Zschau


    Following the 2009 L'Aquila earthquake, the Dipartimento della Protezione Civile Italiana (DPC), appointed an International Commission on Earthquake Forecasting for Civil Protection (ICEF) to report on the current state of knowledge of short-term prediction and forecasting of tectonic earthquakes and indicate guidelines for utilization of possible forerunners of large earthquakes to drive civil protection actions, including the use of probabilistic seismic hazard analysis in the wake of a lar...

  17. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria) (United States)

    Huttenlau, Matthias; Stötter, Johann


incorporated with additional GIS and statistical data into a comprehensive property-by-property geodatabase of the existing elements and values. This stock of elements and values geodatabase is furthermore the consistent basis for all natural hazard analyses and enables the comparison of the results. The study follows the generally accepted modules (i) hazard analysis, (ii) exposure analysis, and (iii) consequence analysis, whereby the exposure analysis estimates the elements at risk with their corresponding damage potentials and the consequence analysis estimates the PMLs. This multi-hazard analysis focuses on process types with a high to extreme potential of negative consequences on a regional scale. In this context, (i) floods, (ii) rockslides with the potential of corresponding consequence effects (backwater ponding and outburst floods), (iii) earthquakes, (iv) hail events, and (v) winter storms were considered as hazard processes. Based on general hazard analyses (hazard maps), concrete scenarios and their spatial extents were determined. For the different hazard processes, different vulnerability approaches were considered to demonstrate their sensitivity and implications for the results. Thus, no absolute values of losses but probable loss ranges were estimated. It can be shown that the most serious losses would arise from extreme earthquake events, with loss burdens of more than € 7 bn. on buildings and inventory alone. Possible extreme flood events could lead to losses between € 2 and 2.5 bn., whereas a severe hail swath affecting the central Inn valley could result in losses of ca. € 455 mill. (thereof € 285 mill. on vehicles). The potential most serious rockslide with additional consequence effects would result in losses of up to ca. € 185 mill., and extreme winter storms can induce losses between € 100 mill. and 150 mill.

  18. A System of Systems Interface Hazard Analysis Technique (United States)


Fragment: the record preserves only table-of-contents entries (a HAZOP process table; HAZOP guide words for software or system interface analysis; an example system-of-systems architecture table) and a partial sentence noting that Hazards and Operability (HAZOP) Analysis applies a systematic exploration of a system.

  19. Long-period analysis of the 2016 Kaikoura earthquake (United States)

    Duputel, Z.; Rivera, L.


The recent Mw = 7.8 Kaikoura (New Zealand) earthquake involved a remarkably complex rupture propagating through an intricate network of faults at the transition between the Alpine fault in the South Island and the Kermadec-Tonga subduction zone. We investigate the main features of this complicated rupture process using long-period seismological observations. Apparent Rayleigh-wave moment-rate functions reveal a clear northeastward directivity, with an unusually weak rupture initiation during the first 60 s followed by a major 20 s burst of moment rate. To further explore the rupture process, we perform a Bayesian exploration of multiple point-source parameters in a 3-D Earth model. The results show that the rupture initiated as a small strike-slip rupture and propagated to the northeast, triggering large slip on both strike-slip and thrust faults. The Kaikoura earthquake is thus a rare instance in which slip on intraplate faults triggers extensive interplate thrust faulting. This clearly outlines the importance of accounting for secondary faults when assessing seismic and tsunami hazard in subduction zones.

  20. Multi-hazard response analysis of a 5MW offshore wind turbine

    DEFF Research Database (Denmark)

    Katsanos, Evangelos; Sanz, A. Arrospide; Georgakis, Christos T.


Wind energy already has a dominant role on the scene of clean energy production. Well-promising markets, like China, India, Korea and Latin America, are the fields of expansion for new wind turbines, mainly installed in offshore environments where wind, wave and earthquake loads threaten the structural integrity and reliability of these energy infrastructures. Along these lines, a multi-hazard environment was considered herein and the structural performance of a 5 MW offshore wind turbine was assessed through time domain analysis. A fully integrated model of the offshore structure consisting...

  1. Guidance Index for Shallow Landslide Hazard Analysis

    Directory of Open Access Journals (Sweden)

    Cheila Avalon Cullen


Rainfall-induced shallow landslides are one of the most frequent hazards on sloped terrain. Intense storms with high-intensity, long-duration rainfall have a high potential to trigger rapidly moving soil masses owing to changes in pore water pressure and seepage forces. Nevertheless, regardless of the intensity and/or duration of the rainfall, shallow landslides are influenced by antecedent soil moisture conditions. To date, no system exists that dynamically interrelates these two factors on large scales. This work introduces a Shallow Landslide Index (SLI) as the first implementation of antecedent soil moisture conditions for the hazard analysis of shallow rainfall-induced landslides. The proposed mathematical algorithm is built using a logistic regression method that systematically learns from a comprehensive landslide inventory. Initially, root-soil moisture and rainfall measurements modeled from AMSR-E and TRMM, respectively, are used as proxies to develop the index. The input dataset is randomly divided into training and verification sets using the Hold-Out method. Validation results indicate that the best-fit model predicts the highest number of cases correctly, at 93.2% accuracy. Subsequently, because AMSR-E and TRMM stopped working in October 2011 and April 2015, respectively, root-soil moisture and rainfall measurements modeled by SMAP and GPM are used to develop models that calculate the SLI for 10, 7, and 3 days. The resulting models indicate a strong relationship (78.7%, 79.6%, and 76.8%, respectively) between the predictors and the predicted value. The results also highlight important remaining challenges, such as adequate information for algorithm functionality and satellite-based data reliability. Nevertheless, the experimental system can potentially be used as a dynamic indicator of the total amount of antecedent moisture and rainfall (for a given duration of time) needed to trigger a shallow landslide in a susceptible area. It is
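The logistic-regression form of an index like the SLI can be sketched as follows. The coefficients here are illustrative placeholders, not the values fitted from the AMSR-E/TRMM landslide inventory:

```python
import math

def shallow_landslide_index(soil_moisture: float, rain_mm: float,
                            b0: float = -4.0, b1: float = 5.0,
                            b2: float = 0.03) -> float:
    """Probability-like score in [0, 1] from antecedent root-zone soil
    moisture (0-1) and accumulated rainfall (mm) via a logistic model.
    The coefficients b0, b1, b2 are invented for illustration; a real
    index would fit them to a landslide inventory."""
    z = b0 + b1 * soil_moisture + b2 * rain_mm
    return 1.0 / (1.0 + math.exp(-z))

# Dry antecedent soil and light rain give a low index; saturated soil
# and heavy rain give a high one.
print(shallow_landslide_index(0.1, 10))
print(shallow_landslide_index(0.9, 120))
```

In practice the fitted model would be trained on inventory labels and validated on a hold-out set, as the abstract describes.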

  2. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications (United States)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie


    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
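The integration of multiple tsunami sources in one probabilistic framework, as described above, reduces in the simplest case to summing independent Poisson rates. The source classes and rates below are invented for illustration, not values from any published PTHA:

```python
import math

def prob_exceed(rates_per_year, exposure_years):
    """Probability that at least one source exceeds the intensity
    threshold within the exposure time, treating source classes as
    independent Poisson processes: P = 1 - exp(-sum(rate_i) * T)."""
    total_rate = sum(rates_per_year)
    return 1.0 - math.exp(-total_rate * exposure_years)

# Hypothetical mean annual rates at which each source class exceeds,
# say, a 1 m run-up at a target site:
rates = [1 / 500, 1 / 2000, 1 / 10000]  # earthquake, landslide, volcanic
print(round(prob_exceed(rates, 50), 3))
```

Repeating this for a range of intensity thresholds yields the hazard curve at the target site; epistemic uncertainty is then usually handled by weighting alternative rate models.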

  3. Source location and mechanism analysis of an earthquake triggered by the 2016 Kumamoto, southwestern Japan, earthquake (United States)

    Nakamura, Takeshi; Aoi, Shin


The 2016 Kumamoto earthquake (Mw 7.0) occurred in the central part of Kyushu Island, southwestern Japan, on April 16, 2016. The mainshock triggered an event with a maximum acceleration of 700 gal that caused severe damage to infrastructure and thousands of homes. We investigate the source location of the triggered event, and the timing of the large energy release, by employing the back-projection method on strong-motion network data. The optimal location is estimated to be [33.2750°, 131.3575°] (latitude, longitude) at a depth of 5 km, which is 80 km northeast of the epicenter of the mainshock. The timing is 33.5 s after the origin time of the mainshock. We also investigate the source mechanism by reproducing observed displacement waveforms at a near-source station. The waveforms of smaller events, convolved with a source time function of 1 s pulse width, are similar to the signature of the observed waveforms of the triggered event. The observations are also reproduced by synthetic waveforms for a normal-fault mechanism, and for a normal fault with strike-slip components, at the estimated location. Although our approach does not constrain the strike direction well, our waveform analysis indicates that the triggered earthquake occurred near the station that observed the strong motions, primarily via a normal-fault mechanism or a normal fault with strike-slip components.

  4. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.


When training for a job in which human error has the potential to produce catastrophic results, an understanding of the hazards that may be encountered is of paramount importance. In high-consequence activities, it is important that the training program be conducted in a safe environment and yet emphasize the potential hazards. Because of the high consequences of human error, the use of high-fidelity simulation is of great importance in providing the safe environment the worker needs to learn and hone required skills. A hazards analysis identifies the operational hazards, potential human errors, and associated positive measures that aid in the mitigation or prevention of the hazard. The information gained from the hazards analysis should be used in the development of training. This paper will discuss the integration of information from the hazards analysis into the development of the simulation components of a training program.

  5. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.


This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquakes losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference of the Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  6. Global Earthquake Casualties due to Secondary Effects: A Quantitative Analysis for Improving PAGER Losses (United States)

    Wald, David J.


    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey’s (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER’s overall assessment of earthquakes losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra–Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference of the Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability.

  7. Deep-Sea Turbidites as Guides to Holocene Earthquake History at the Cascadia Subduction Zone—Alternative Views for a Seismic-Hazard Workshop (United States)

    Atwater, Brian F.; Griggs, Gary B.


This report reviews the geological basis for some recent estimates of earthquake hazards in the Cascadia region between southern British Columbia and northern California. The largest earthquakes to which the region is prone are in the range of magnitude 8-9. The source of these great earthquakes is the fault down which the oceanic Juan de Fuca Plate is being subducted, or thrust, beneath the North American Plate. Geologic evidence for their occurrence includes sedimentary deposits observed in cores from deep-sea channels and fans: earthquakes can initiate subaqueous slumps or slides that generate turbidity currents, which in turn produce the sedimentary deposits known as turbidites. The hazard estimates reviewed in this report are derived mainly from deep-sea turbidites that have been interpreted as proxy records of great Cascadia earthquakes. The estimates were first published in 2008, and most of the evidence for them is contained in a monograph now in press. We have reviewed a small part of this evidence, chiefly from Cascadia Channel and its tributaries, all of which head offshore the Pacific coast of Washington State. According to the recent estimates, the Cascadia plate boundary ruptured along its full length in 19 or 20 earthquakes of magnitude 9 in the past 10,000 years; its northern third broke during these giant earthquakes only, and southern segments produced at least 20 additional, lesser earthquakes of Holocene age. The turbidite case for full-length ruptures depends on stratigraphic evidence for simultaneous shaking at the heads of multiple submarine canyons. The simultaneity has been inferred primarily from turbidite counts above a stratigraphic datum, sandy beds likened to strong-motion records, and radiocarbon ages adjusted for turbidity-current erosion. In the alternatives proposed here, this turbidite evidence for simultaneous shaking is less sensitive to earthquake size and frequency than previously thought. Turbidites far below a channel

  8. Fire hazards analysis of transuranic waste storage and assay facility

    Energy Technology Data Exchange (ETDEWEB)

    Busching, K.R., Westinghouse Hanford


    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  9. Hydrogeochemical precursors of strong earthquakes in Kamchatka: further analysis

    Directory of Open Access Journals (Sweden)

    P. F. Biagi


For many years, ion and gas content data have been collected from the groundwater of three deep wells in the southern area of the Kamchatka peninsula, Russia. In the last ten years, five earthquakes with M > 6.5 have occurred within 250 km of the wells. In a previous study, we investigated whether the hydrogeochemical time series contained precursors. The technique was to treat each signal with an amplitude exceeding three times the standard deviation as an irregularity, and to define anomalies as irregularities occurring simultaneously in the data of more than one parameter at a well. Using this method, we identified 11 anomalies, 8 of which were possible successes and 3 of which were failures as earthquake precursors. Precursors were obtained for all five earthquakes considered. In this paper, we allow for the cross-correlation found between the gas data sets and, in some cases, between the ion data sets; no cross-correlation has been found between gas and ion content data. Any correlation undermines the idea that an anomaly can be identified from irregularities appearing simultaneously in different parameters at the same site. To refine the technique, we re-examined the hydrogeochemical data and defined as anomalies only those irregularities occurring simultaneously in the data of two or more uncorrelated parameters. We then restricted the analysis to the gas content data alone and to the ion content data alone. In the first case we found 6 successes and 2 failures, and in the second case only 3 successes. In the first case, precursors appear for only three of the five earthquakes considered, and in the second case for only two, but these are the earthquakes nearest to the wells. Interestingly, this shows that when a strict set of rules for defining an anomaly is used, the method produces only successes, and when less restrictive rules are used, earthquakes further from the wells are implicated, but
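The screening procedure described in this abstract (flag any sample exceeding three standard deviations as an irregularity, then call it an anomaly only when two or more parameters show irregularities simultaneously) can be sketched as follows. This is a minimal illustration on synthetic series, not the authors' actual data or code:

```python
import numpy as np

def irregularities(series, k=3.0):
    """Flag samples that deviate from the series mean by more than k standard deviations."""
    x = np.asarray(series, dtype=float)
    return np.abs(x - x.mean()) > k * x.std()

def anomalies(param_series, min_params=2):
    """Time steps where at least `min_params` parameters show an irregularity at once."""
    flags = np.vstack([irregularities(s) for s in param_series])
    return np.where(flags.sum(axis=0) >= min_params)[0]

# Synthetic example: two uncorrelated gas-content series with a shared spike at step 50.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 100)
b = rng.normal(0.0, 1.0, 100)
a[50] += 10.0
b[50] += 10.0
print(anomalies([a, b]))  # the shared spike at step 50 is flagged
```

The refinement proposed in the paper corresponds to restricting `param_series` to mutually uncorrelated parameters before applying the coincidence rule.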

  10. Source fault geometry of the 2015 Gorkha earthquake (Mw 7.9), Nepal, derived from a dense aftershock observation and earthquake reflection analysis (United States)

    Kurashimo, Eiji; Sato, Hiroshi; Sakai, Shin'ichi; Hirata, Naoshi; Prasad Gajurel, Ananta; Pani Adhikari, Dabda; Nath Upreti, Bishal; Subedi, Krishana; Yagi, Hiroshi; Nidhi Bhattarai, Tara; Ishiyama, Tatsuya


The megathrust beneath the Himalayan foothills produced the Mw 7.9 Gorkha earthquake on 25 April 2015 in Nepal. The geometry of the source fault provides basic information for understanding the active tectonics of the area and for forecasting seismic hazards. We sought a seismic image of the source fault using precise hypocentral determination and the detection of reflectors in crustal earthquake seismograms. To constrain the geometry of the source fault, aftershocks were observed with a dense linear array deployed across the focal area from Hetauda to Syabru Besi, passing through Kathmandu, along a 90 km-long, N-S trending seismic line. The aftershocks were observed at 35 stations, deployed at intervals of 3-10 km. Earthquakes were recorded using 4.5 Hz three-component sensors and off-line recorders for a total of two months in two separate deployments between August 15 and November 28, 2015. A total of 716 earthquake events were detected and their hypocenters determined using a 1-dimensional velocity structure. Precise hypocenters were determined for 609 events, with an error of less than ±0.5 km per event, using double-difference tomographic analysis. The obtained hypocenter distribution portrays a gently northward-dipping fault zone at 5-10 km depth. The aftershock distribution accords well with the rupture area estimated from the analysis of crustal movements. Seismicity is very low in the area 65-85 km north of the Main Boundary Thrust (MBT), which coincides with an area of large co-seismic slip as deduced from InSAR and GPS data. Using the seismograms of the dense linear array, two reflectors were identified 60 to 80 km north of the MBT. The shallower reflector corresponds to the plate boundary, and the lower one lies within the Indian slab. The source fault geometry estimated from the hypocentral distribution and earthquake reflections is divided into two parts at about 80 km from the MBT: the southern part dips north at 5 degrees and the northern part dips 13

  11. Preliminary Tsunami Hazard Analysis for Uljin NPP Site using Tsunami Propagation Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

Rhee, Hyunme; Kim, Minkyu; Choi, Inkil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]; Sheen, Donghoon [Chonnam National Univ., Gwangju (Korea, Republic of)]


Tsunami hazard analysis is based on seismic hazard analysis methods. Seismic hazard analysis has been performed using either deterministic or probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the probabilistic approach can better account for the uncertainties of the hazard analysis. Studies on probabilistic tsunami hazard analysis (PTHA) were therefore performed in this work. The study focused on wave propagation analysis, which is the main difference between seismic hazard analysis and tsunami hazard analysis.

  12. Region-specific deterministic and probabilistic seismic hazard ...

    Indian Academy of Sciences (India)

magnitude of (a) 6.8 and (b) 7.8 (Nepal 2015 earthquake). 7.1 Deterministic seismic hazard analysis. The worst-case scenario map, i.e., the deterministic hazard map, is required for estimating seismic vulnerability and seismic losses and for seismic disaster planning and mitigation. Usually, one or more earthquakes are identified by ...

  13. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.


A hazards analysis identifies the operational hazards and the positive measures that aid in the mitigation or prevention of those hazards. If the tasks are human-intensive, the hazard analysis often credits personnel training as contributing to the mitigation of an accident's consequences or the prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase model (analysis, design, development, implementation, and evaluation) for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground, and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and of the institutional training program, can provide valuable information to the hazards analysis effort. This paper discusses the integration of information from the hazards analysis into a training program, using as an example the installation and removal of a piece of tooling used in a high-explosive operation. This example is used to follow the systematic development of a training program and to demonstrate the interaction and cooperation between the hazards analysis and the training program.

  14. Historical earthquake research in Austria (United States)

    Hammerl, Christa


Austria has moderate seismicity; on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently, with an average recurrence period of about 75 years. For this reason, historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences, as is the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly, along with the most important studies, and, as an example of a recently completed case study, one of the strongest earthquakes in Austria's past, the earthquake of 17 July 1670, is presented. Research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  15. InSAR Analysis of the 2011 Hawthorne (Nevada) Earthquake Swarm: Implications of Earthquake Migration and Stress Transfer (United States)

    Zha, X.; Dai, Z.; Lu, Z.


The 2011 Hawthorne earthquake swarm occurred in the central Walker Lane zone, near the border between California and Nevada. The swarm included an Mw 4.4 event on April 13, an Mw 4.6 on April 17, and an Mw 3.9 on April 27. Due to the lack of near-field seismic instruments, it is difficult to obtain accurate source information from the seismic data for these moderate-magnitude events. ENVISAT InSAR observations captured the deformation caused mainly by three events during the 2011 Hawthorne earthquake swarm. The surface traces of the three seismogenic sources could be identified from the local topography and interferogram phase discontinuities, and the epicenters could be determined using the interferograms and the relocated earthquake distribution. An apparent earthquake migration is revealed by the InSAR observations and the earthquake distribution. Analysis and modeling of the InSAR data show that the three moderate-magnitude earthquakes were produced by slip on three previously unrecognized faults in the central Walker Lane. Two of the seismogenic sources are northwest-striking, right-lateral strike-slip faults with some thrust-slip component, and the other is a northeast-striking, thrust-slip fault with some strike-slip component. The former two faults are roughly parallel to each other and almost perpendicular to the latter. This spatial relationship and the nature of the seismogenic faults suggest that the central Walker Lane has been undergoing southeast-northwest horizontal compressive deformation, consistent with the regional crustal movement revealed by GPS measurements. The Coulomb failure stresses on the fault planes were calculated using the preferred slip model and the Coulomb 3.4 software package. For the Mw 4.6 earthquake, the Mw 4.4 event increased the Coulomb stress by ~0.1 bar. For the Mw 3.9 event, the Mw 4.6 earthquake increased the Coulomb stress by ~1.0 bar. This indicates that the preceding
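The Coulomb stress transfer calculation named in this abstract is, in its simplest scalar form, the change in Coulomb failure stress on a receiver fault, ΔCFS = Δτ + μ′Δσn. A minimal sketch follows; the effective friction coefficient and the input stresses are illustrative assumptions, not values from the study's slip models:

```python
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """Coulomb failure stress change on a receiver fault (same units in, same out).

    d_tau     : shear-stress change resolved in the slip direction
    d_sigma_n : normal-stress change, positive for unclamping
    mu_eff    : effective friction coefficient (0.4 is a commonly used default)
    """
    return d_tau + mu_eff * d_sigma_n

# Illustrative values in bar (not the study's): 0.8 bar of shear loading
# plus 0.5 bar of unclamping on the receiver plane.
print(coulomb_stress_change(0.8, 0.5))  # → 1.0
```

In practice a package such as Coulomb 3.4 evaluates this quantity over gridded receiver-fault planes from a full elastic dislocation model rather than from two scalars.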

  16. Review of earthquake hazard assessments of plant sites at Paducah, Kentucky and Portsmouth, Ohio

    Energy Technology Data Exchange (ETDEWEB)



Members of the US Geological Survey staff in Golden, Colorado, have reviewed the submissions of Lawrence Livermore National Laboratory (LLNL) staff and of Risk Engineering, Inc. (REI) (Golden, Colorado) for seismic hazard estimates for Department of Energy facilities at Portsmouth, Ohio, and Paducah, Kentucky. We reviewed the historical seismicity and seismotectonics near the two sites, the general features of the LLNL and EPRI/SOG methodologies used by LLNL and Risk Engineering respectively, and the separate Risk Engineering methodology used at Paducah. We discussed generic issues that affect the modeling of both sites and performed alternative calculations to determine the sensitivity of the seismic hazard results to various assumptions and models, in an attempt to assign reasonable bounding values of the hazard. In our studies we find that peak acceleration values of 0.08 g for Portsmouth and 0.32 g for Paducah represent central values of the ground motions obtained at 1000-year return periods. Peak accelerations obtained in the LLNL and Risk Engineering studies have medians near these values (results obtained using the EPRI/SOG methodology appear low at both sites), and we believe that these medians are appropriate for use in the evaluation of systems, structures, and components for seismic structural integrity and for the seismic design of new and improved systems, structures, and components at Portsmouth and Paducah.

  17. Strategic crisis and risk communication during a prolonged natural hazard event: lessons learned from the Canterbury earthquake sequence (United States)

    Wein, A. M.; Potter, S.; Becker, J.; Doyle, E. E.; Jones, J. L.


    While communication products are developed for monitoring and forecasting hazard events, less thought may have been given to crisis and risk communication plans. During larger (and rarer) events responsible science agencies may find themselves facing new and intensified demands for information and unprepared for effectively resourcing communications. In a study of the communication of aftershock information during the 2010-12 Canterbury Earthquake Sequence (New Zealand), issues are identified and implications for communication strategy noted. Communication issues during the responses included reliability and timeliness of communication channels for immediate and short decision time frames; access to scientists by those who needed information; unfamiliar emergency management frameworks; information needs of multiple audiences, audience readiness to use the information; and how best to convey empathy during traumatic events and refer to other information sources about what to do and how to cope. Other science communication challenges included meeting an increased demand for earthquake education, getting attention on aftershock forecasts; responding to rumor management; supporting uptake of information by critical infrastructure and government and for the application of scientific information in complex societal decisions; dealing with repetitive information requests; addressing diverse needs of multiple audiences for scientific information; and coordinating communications within and outside the science domain. For a science agency, a communication strategy would consider training scientists in communication, establishing relationships with university scientists and other disaster communication roles, coordinating messages, prioritizing audiences, deliberating forecasts with community leaders, identifying user needs and familiarizing them with the products ahead of time, and practicing the delivery and use of information via scenario planning and exercises.

  18. Multi-Resolution Clustering Analysis and Visualization of Around One Million Synthetic Earthquake Events (United States)

    Kaneko, J. Y.; Yuen, D. A.; Dzwinel, W.; Boryszko, K.; Ben-Zion, Y.; Sevre, E. O.


The study of seismic patterns with synthetic data is important for analyzing the seismic hazard of faults because one can precisely control the spatial and temporal domains. Using modern clustering analysis from statistics and recently introduced visualization software, AMIRA, we have examined the multi-resolution nature of a total assemblage of 922,672 earthquake events in 4 numerically simulated models, which have different constitutive parameters, with 2 disparately different time intervals in a 3D spatial domain. The evolution of stress and slip on the fault plane was simulated with 3D elastic dislocation theory for a configuration representing the central San Andreas Fault (Ben-Zion, J. Geophys. Res., 101, 5677-5706, 1996). The 4 models represent various levels of fault zone disorder and have the following brittle properties and names: uniform properties (model U), a Parkfield-type asperity (A), fractal properties (F), and multi-size heterogeneities (model M). We employed the MNN (mutual nearest neighbor) clustering method and developed a C program that simultaneously calculates a number of parameters related to the locations of the earthquakes and their magnitude values. Visualization was then used to look at the geometrical locations of the hypocenters and the evolution of seismic patterns. We wrote an AmiraScript that allows us to pass the parameters in an interactive format. With data sets consisting of 150-year time intervals, we have unveiled the distinctly multi-resolutional nature of the spatial-temporal pattern of small and large earthquake correlations shown previously by Eneva and Ben-Zion (J. Geophys. Res., 102, 24513-24528, 1997). In order to search for clearer possible stationary patterns and substructures within the clusters, we have also carried out the same analysis for corresponding data sets with time extending to several thousand years. The larger data sets were studied with finer and finer time intervals and multi
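A mutual nearest neighbor (MNN) grouping of the kind named in this abstract can be sketched as follows: link two events when each is among the other's k nearest neighbors, and take the connected components of that link graph as clusters. This is a simplified stand-in for the authors' C program (which also tracks magnitude-related parameters), applied here to synthetic 3-D hypocenters:

```python
import numpy as np

def mutual_nn_clusters(points, k=3):
    """Cluster labels from mutual k-nearest-neighbor links (connected components)."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # a point is not its own neighbor
    knn = np.argsort(d, axis=1)[:, :k]   # indices of each point's k nearest neighbors
    parent = list(range(n))              # union-find over mutual links
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in knn[i]:
            if i in knn[j]:              # mutual: each is in the other's k-NN list
                parent[find(i)] = find(int(j))
    roots = [find(i) for i in range(n)]
    remap = {r: c for c, r in enumerate(dict.fromkeys(roots))}
    return [remap[r] for r in roots]

# Two well-separated synthetic hypocenter groups in 3-D (illustrative only)
a = np.random.default_rng(1).normal(0.0, 0.1, (10, 3))
b = a + np.array([10.0, 0.0, 0.0])
labels = mutual_nn_clusters(np.vstack([a, b]), k=3)
print(labels)
```

Because the two groups are 10 units apart while intra-group scatter is ~0.1, no mutual links form across groups, so the two groups never share a label.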

  19. Implosion, earthquake, and explosion recordings from the 2000 Seattle Kingdome Seismic Hazards Investigation of Puget Sound (SHIPS), Washington (United States)

    Brocher, Thomas M.; Pratt, Thomas L.; Weaver, Craig S.; Snelson, Catherine M.; Frankel, Arthur D.


    This report describes seismic data obtained in Seattle, Washington, March 24-28, 2000, during a Seismic Hazards Investigation of Puget Sound (SHIPS). The seismic recordings obtained by this SHIPS experiment, nicknamed Kingdome SHIPS, were designed to (1) measure site responses throughout Seattle and to (2) help define the location of the Seattle fault. During Kingdome SHIPS, we recorded the Kingdome implosion, four 150-lb (68-kg) shots, and a Mw = 7.6 teleseism using a dense network of seismographs deployed throughout Seattle. The seismographs were deployed at a nominal spacing of 1 km in a hexagonal grid extending from Green Lake in the north to Boeing Field in the south. The Seattle Kingdome was a domed sports stadium located in downtown Seattle near the Seattle fault. The Seattle Kingdome was imploded (demolished) at 8:32 AM local time (16:32 UTC) on March 26 (JD 086), 2000. The seismic energy produced by implosion of the Kingdome was equivalent to a local earthquake magnitude of 2.3. Strong impacts produced by the implosion of the Kingdome generated seismic arrivals to frequencies as low as 0.1 Hz. Two shots located north of the Seattle fault, where the charges were detonated within the ground water column (Discovery and Magnuson Parks), were much more strongly coupled than were the two shots to the south of the Seattle fault, where the shots were detonated above the water table (Lincoln and Seward Parks). Thirty-eight RefTek stations, scattered throughout Seattle, recorded the Mw = 7.6 Japan Volcano Islands earthquake (22.4°N, 143.6°E, 104 km depth) of 28 March 2000 (JD 088). This teleseism produced useful signals for periods between 4 and 7 seconds. Only a few recordings of small magnitude local earthquakes were made, and these recordings are not presented. In this report, we describe the acquisition of these data, discuss the processing and merging of the data into common shot gathers, and illustrate the acquired data. We also describe the format and

  20. Fire Hazard Analysis for Turbine Building of NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Seung Jun [KMENT, Seoul (Korea, Republic of); Park, Jun Hyun [Korea Electric Power Research Institute, Taejon (Korea, Republic of)


In order to demonstrate the fire safety of operating nuclear power plants, a plant-specific fire hazard analysis should be performed, and the effect of design changes on fire safety should be reviewed periodically. At the fire-vulnerability estimation stage, the factors that influence fire vulnerability, including ignition sources, combustibles, fire barriers, and fire protection features such as detection, alarm, suppression, and evacuation, are investigated. At the fire hazard assessment stage, ignition and propagation hazards, passive and active fire protection features, and the fire protection program, such as the pre-fire plan and related procedures, are investigated. Based on the results of the fire hazard analysis, a reasonable improvement plan for fire protection can be established. This paper describes the results of a fire hazard analysis, classified by fire area, for the turbine building, where fire hazards and fire frequencies are relatively high in an operating nuclear power plant.

  1. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada); Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)


Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectra (UHS) obtained from the PSHA are characterized by high-frequency content that differs from the original plant design basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high-frequency effects of the UHS can be mitigated through site response analysis with site-specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building in terms of High Confidence of Low Probability of Failure (HCLPF) can be improved. (author)
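The HCLPF capacity quoted above is conventionally obtained from a lognormal fragility model with median capacity Am and log-standard deviations βR (randomness) and βU (uncertainty): HCLPF = Am · exp(−1.645(βR + βU)), i.e., the 95%-confidence, 5%-failure-probability capacity. A minimal sketch, with illustrative numbers rather than the plant's actual fragility parameters:

```python
import math

def hclpf(a_m, beta_r, beta_u):
    """High Confidence of Low Probability of Failure capacity (same units as a_m).

    a_m    : median seismic capacity (e.g., in g)
    beta_r : log-standard deviation for randomness
    beta_u : log-standard deviation for uncertainty
    """
    return a_m * math.exp(-1.645 * (beta_r + beta_u))

# Illustrative fragility parameters only, not the CANDU 6 service building's:
print(round(hclpf(1.2, 0.25, 0.35), 3))  # → 0.447
```

Reducing βU through refined site response and soil-structure interaction analysis, as the paper describes, raises the HCLPF even when the median capacity Am is unchanged.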

  2. Miocene to present deformation rates in the Yakima Fold Province and implications for earthquake hazards in central Washington State, USA (United States)

    Staisch, Lydia; Sherrod, Brian; Kelsey, Harvey; Blakely, Richard; Möller, Andreas; Styron, Richard


, which lack syntectonic growth strata, we exploit 2-m LiDAR data and invert stream profiles to solve analytically for a linear relative uplift rate. From the stream profile inversion, we see an increase in incision rates in Pliocene time and suggest that this increase is tectonically controlled. Our analyses indicate that deformation rates along the Manastash and Umtanum Ridge anticlines are significantly higher than along the Saddle Mountains. We use our new estimates of slip rates along individual anticlines to calculate the time required to accumulate enough strain energy for a large-magnitude earthquake (M≥7) along faults within the YFP. Our results indicate that it takes between several hundred and several thousand years to accumulate sufficient strain energy for a M≥7 earthquake, with the greatest hazard posed by the Umtanum Ridge anticline.

  3. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions (United States)

    De Risi, Raffaele; Goda, Katsuichiro


Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
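The conventional (empirical) hazard curve described above can be sketched by summing the annual rates of all simulated events whose intensity measure exceeds each threshold. The toy event catalogue below is an assumption for illustration, not the paper's simulations:

```python
import numpy as np

def hazard_curve(depths, rates, thresholds):
    """Empirical hazard curve: mean annual rate at which the intensity measure
    (here, inundation depth) exceeds each threshold, obtained by summing the
    annual occurrence rates of all simulated events that exceed it."""
    depths = np.asarray(depths, dtype=float)
    rates = np.asarray(rates, dtype=float)
    return np.array([rates[depths > x].sum() for x in thresholds])

# Toy catalogue: 4 simulated events with inundation depth (m) and annual rate (1/yr)
d = [0.5, 1.2, 2.5, 4.0]
r = [1e-2, 5e-3, 1e-3, 2e-4]
print(hazard_curve(d, r, [1.0, 2.0, 3.0]))  # rates of exceedance: 0.0062, 0.0012, 0.0002 per year
```

The paper's robust variant would replace this direct empirical summation with a Bayesian fit to the simulated exceedance data, yielding a smoother curve from fewer simulations.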

  4. 14 CFR 417.227 - Toxic release hazard analysis. (United States)


Excerpt from 14 CFR 417.227 (Aeronautics and Space; DEPARTMENT OF TRANSPORTATION; LICENSING; LAUNCH SAFETY; Flight Safety Analysis), "Toxic release hazard analysis" (2010-01-01 edition): the analysis must account for members of the public on land and on any waterborne vessels, populated offshore structures, and aircraft ...

  5. Reduction of uncertainties in probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)


An integrated research program for the reduction of conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas, as follows. Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the Region of Korea as Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of a medium-size earthquake in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  6. Use of remote sensing and seismotectonic parameters for seismic hazard analysis of Bangalore

    Directory of Open Access Journals (Sweden)

    T. G. Sitharam


Deterministic Seismic Hazard Analysis (DSHA) for Bangalore, India, has been carried out by considering past earthquakes, assumed subsurface fault rupture lengths, and a point-source synthetic ground motion model. The sources have been identified using satellite remote sensing images, the seismotectonic atlas map of India, and relevant field studies. The Maximum Credible Earthquake (MCE) has been determined by considering the regional seismotectonic activity within about a 350 km radius around Bangalore. The seismotectonic map has been prepared by considering the faults, lineaments, and shear zones in the area and more than 470 past moderate earthquakes with moment magnitude 3.5 and above. In addition, about 1300 earthquake tremors with moment magnitude less than 3.5 have been considered for the study. The shortest distance from Bangalore to each source is measured, and the Peak Horizontal Acceleration (PHA) is then calculated for each source and event moment magnitude using a regional attenuation relation for peninsular India. Based on the Wells and Coppersmith (1994) relationship, a subsurface fault rupture length of about 3.8% of the total fault length was shown to match past earthquake events in the area. To simulate synthetic ground motions, the Boore (1983, 2003) SMSIM programs have been used and the PHA for the different locations evaluated. From the above approaches, a PHA of 0.15 g was established. This value was obtained for a maximum credible earthquake having a moment magnitude of 5.1 on the Mandya-Channapatna-Bangalore lineament source. This particular source has been identified as a vulnerable source for Bangalore. From this study, it is clear that the Bangalore area can be described as a seismically moderately active region. It is also recommended that the southern part of Karnataka, in particular Bangalore, Mandya and Kolar, be upgraded from the current Indian Seismic Zone II to Seismic Zone III
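The DSHA step of evaluating PHA for each source and keeping the controlling (maximum) value can be sketched as follows. The attenuation form, its coefficients, and the source list below are illustrative placeholders, not the peninsular-India relation or the Mandya-Channapatna-Bangalore source parameters used in the study:

```python
import math

def pga_generic(m_w, r_km, c=(-3.5, 1.0, 1.2, 10.0)):
    """Generic attenuation form ln(PGA[g]) = c1 + c2*M - c3*ln(R + c4).
    Coefficients are hypothetical placeholders for illustration only."""
    c1, c2, c3, c4 = c
    return math.exp(c1 + c2 * m_w - c3 * math.log(r_km + c4))

def deterministic_pha(sources):
    """DSHA controlling value: the maximum PGA over all (magnitude, distance) sources."""
    return max(pga_generic(m, r) for m, r in sources)

# Hypothetical source list: (MCE moment magnitude, shortest distance to site in km)
sources = [(5.1, 15.0), (6.0, 120.0), (5.5, 80.0)]
print(round(deterministic_pha(sources), 3))  # → 0.104 (nearby moderate event controls)
```

As in the study, a nearby moderate-magnitude source can control the hazard over larger but more distant events, which is why the shortest source-to-site distance matters so much in DSHA.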

  7. ShakeMap Atlas 2.0: an improved suite of recent historical earthquake ShakeMaps for global hazard analyses and loss model calibration (United States)

    Garcia, D.; Mah, R.T.; Johnson, K.L.; Hearne, M.G.; Marano, K.D.; Lin, K.-W.; Wald, D.J.


We introduce the second version of the U.S. Geological Survey ShakeMap Atlas, an openly available compilation of nearly 8,000 ShakeMaps of the most significant global earthquakes between 1973 and 2011. This revision of the Atlas includes: (1) a new version of the ShakeMap software that improves data usage and uncertainty estimation; (2) an updated earthquake source catalogue that includes regional locations and finite-fault models; (3) a refined strategy for selecting prediction and conversion equations based on a new seismotectonic regionalization scheme; and (4) vastly more macroseismic intensity and ground-motion data from regional agencies. All these changes make the new Atlas a self-consistent, calibrated ShakeMap catalogue that constitutes an invaluable resource for investigating near-source strong ground motion, as well as for seismic hazard, scenario, risk, and loss-model development. To this end, the Atlas will provide a hazard base layer for PAGER loss calibration and for the Earthquake Consequences Database within the Global Earthquake Model initiative.

  8. Hazard and operability (HAZOP) analysis. A literature review. (United States)

    Dunjó, Jordi; Fthenakis, Vasilis; Vílchez, Juan A; Arnaldos, Josep


    Hazard and operability (HAZOP) methodology is a Process Hazard Analysis (PHA) technique used worldwide for studying not only the hazards of a system but also its operability problems, by exploring the effects of any deviations from design conditions. Our paper is the first HAZOP review intended to gather HAZOP-related literature from books, guidelines, standards, major journals, and conference proceedings, with the purpose of classifying the research conducted over the years and defining the HAZOP state of the art.

  9. Using Spatial Multi-Criteria Analysis and Ranking Tool (SMART) in earthquake risk assessment: a case study of Delhi region, India

    Directory of Open Access Journals (Sweden)

    Nishant Sinha


    Full Text Available This article is aimed at earthquake hazard, vulnerability and risk assessment as a case study to demonstrate the applicability of the Spatial Multi-Criteria Analysis and Ranking Tool (SMART), which is based on Saaty's multi-criteria decision analysis (MCDA) technique. Three specific study sites in Delhi were chosen for the research as they correspond to a typical patch of the urban environs, fully occupied by residential, commercial and industrial units. The components affecting earthquake hazard are established as geographic information system data-set layers, including seismic zone, peak ground acceleration (PGA), soil characteristics, liquefaction potential, geological characteristics, land use, and proximity to fault and epicentre. Only the physical vulnerability layers comprising building information, namely number of stories, year-built range, area, occupancy and construction type, derived from remote sensing imagery, were considered in the current research. SMART was developed for earthquake risk assessment, and weights were derived both at the component level and at the element level. Based on weighted overlay techniques, the earthquake hazard and vulnerability layers were created, from which the risk maps were derived through multiplicative analysis. The developed risk maps may prove useful in the decision-making process and in formulating risk mitigation measures.

  10. FRISK: computer program for seismic risk analysis using faults as earthquake sources (United States)

    McGuire, Robin K.


    This computer program makes probabilistic seismic hazard calculations at sites affected by earthquakes occurring on faults, which are defined by the user as a series of line segments. The length of rupture of the fault as a function of earthquake magnitude is accounted for, and ground motion estimates at the site are made using the magnitude of the earthquake and the closest distance from the site to the rupture zone. Uncertainty in the earthquake magnitude, in the rupture length given magnitude, in the location of the rupture zone on the fault, in the maximum possible magnitude of earthquakes, and in the ground motion at the site given the earthquake's size, rupture length, and location, is accounted for explicitly. FRISK (Fault RISK) was written to take advantage of repeated calculations, so that seismic hazard analyses for several ground motion parameters (for instance, peak ground acceleration, velocity, and displacement), and for several sites, are most efficiently made with one execution of the program rather than with repeated executions. The program uses a step-truncated exponential distribution for earthquake magnitude, a lognormal distribution for rupture length given magnitude, a uniform distribution for rupture location on faults, and a lognormal distribution of site amplitude given the magnitude of the earthquake and the distance from the rupture zone to the site. The program has been structured so that other functions may easily be substituted if this is appropriate for a particular problem; for example, a wide range of deterministic or probabilistic geophysical models for estimating ground motion may be incorporated, and the program will yield probabilistic estimates of seismic hazard.
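The distributional assumptions listed above can be exercised in a small Monte Carlo sketch. Every number below (fault geometry, event rate, rupture-length and attenuation coefficients) is a hypothetical placeholder, not FRISK's actual input; only the structure mirrors the abstract: truncated exponential magnitudes, lognormal rupture length given magnitude, uniform rupture location on the fault, and lognormal ground motion at the site.

```python
import math
import random

random.seed(0)

# --- hypothetical inputs, for illustration only ---
FAULT_LEN = 50.0       # straight fault along the x-axis, km
SITE = (25.0, 10.0)    # site 10 km off the fault trace
M0, MMAX = 5.0, 7.5    # magnitude bounds of the truncated exponential
BETA = 2.0             # exponential decay rate (b-value * ln 10)
NU = 0.1               # events/yr with M >= M0 on this fault
SIGMA_LNL = 0.3        # lognormal sigma of rupture length given M
SIGMA_LNA = 0.6        # lognormal sigma of ground motion

def sample_magnitude():
    """Truncated exponential (Gutenberg-Richter) magnitude on [M0, MMAX]."""
    u = random.random()
    c = 1.0 - math.exp(-BETA * (MMAX - M0))
    return M0 - math.log(1.0 - u * c) / BETA

def rupture_length(m):
    """Lognormal rupture length; median from a magnitude-scaling law."""
    median = 10 ** (-2.44 + 0.59 * m)
    return min(median * math.exp(random.gauss(0.0, SIGMA_LNL)), FAULT_LEN)

def closest_distance(m):
    """Uniform rupture location; closest site-to-rupture distance (km)."""
    L = rupture_length(m)
    center = random.uniform(L / 2, FAULT_LEN - L / 2)
    x = min(max(SITE[0], center - L / 2), center + L / 2)
    return math.hypot(x - SITE[0], SITE[1])

def pga_g(m, r):
    """Lognormal ground motion; hypothetical attenuation coefficients."""
    ln_median = -3.5 + 1.2 * m - 1.8 * math.log(r + 10.0)
    return math.exp(ln_median + random.gauss(0.0, SIGMA_LNA))

def annual_exceedance_rate(threshold_g, n=20000):
    """Annual rate of PGA > threshold: event rate times P(exceed | event)."""
    hits = sum(1 for _ in range(n)
               if pga_g((m := sample_magnitude()), closest_distance(m)) > threshold_g)
    return NU * hits / n
```

Running `annual_exceedance_rate` for several thresholds traces out a hazard curve; FRISK evaluates the same nested integrals analytically and reuses the intermediate terms across sites and ground motion parameters.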

  11. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  12. Coulomb static stress changes before and after the 23 October 2011 Van, eastern Turkey, earthquake (MW= 7.1): implications for the earthquake hazard mitigation

    National Research Council Canada - National Science Library

    Utkucu, M; Durmuş, H; Yalçın, H; Budakoğlu, E; Işık, E


      Coulomb stress changes before and after the 23 October 2011 Van, eastern Turkey, earthquake have been analysed using available data related to the background and the aftershock seismicity and the source faults...

  13. Hazardous Waste Site Analysis (Small Site Technology) (United States)


    RCRA required all treaters, storers, and/or disposers to either have permits by November 1980, or qualify for interim status by notifying... carbon dioxide or compressed liquid-state propane) is used as a solvent to extract organic hazardous constituents from waste. Additional processing

  14. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis (United States)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Patricia


    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis for natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks, the magnitudes, are governed by the exponential distribution implied by the Gutenberg-Richter relation. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely used in IIS hazard problems. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma, and of a proxy of such seismicity: IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution model implied by the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready

  15. The evaluation of the earthquake hazard using the exponential distribution method for different seismic source regions in and around Ağrı

    Energy Technology Data Exchange (ETDEWEB)

    Bayrak, Yusuf, E-mail: [Ağrı İbrahim Çeçen University, Ağrı/Turkey (Turkey); Türker, Tuğba, E-mail: [Karadeniz Technical University, Department of Geophysics, Trabzon/Turkey (Turkey)


    The aim of this study was to determine the earthquake hazard for different seismic source regions in Ağrı and its vicinity using the exponential distribution method. A homogeneous earthquake catalog covering the instrumental period 1900-2015, with 456 earthquakes, has been examined for Ağrı and its vicinity. The catalog was compiled from several sources, including Bogazici University Kandilli Observatory and Earthquake Research Institute (KOERI), the National Earthquake Monitoring Center (NEMC), TUBITAK, TURKNET, the International Seismological Centre (ISC) and the Incorporated Research Institutions for Seismology (IRIS). Ağrı and its vicinity were divided into 7 different seismic source regions on the basis of the epicenter distribution of instrumental-period earthquakes, focal mechanism solutions, and existing tectonic structures. In the study, the average magnitude value was calculated for the specified magnitude ranges in each of the 7 seismic source regions. For each region, the largest difference between the observed and expected cumulative probabilities across the determined magnitude classes was identified. The recurrence periods and annual numbers of earthquake occurrences were then estimated for Ağrı and its vicinity. As a result, occurrence probabilities were determined for the 7 seismic source regions for earthquakes greater than magnitude 6.7 in Region 1, greater than 4.7 in Region 2, greater than 5.2 in Region 3, greater than 6.2 in Region 4, greater than 5.7 in Region 5, greater than 7.2 in Region 6 and greater than 6.2 in Region 7. The highest observed magnitude across the 7 seismic source regions of Ağrı and its vicinity is 7, in Region 6. For Region 6, the estimated future occurrence interval for an earthquake of the determined magnitude 7.2 was 158
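Under the exponential (Gutenberg-Richter) model used in studies of this kind, the recurrence period and annual occurrence count follow directly from the fitted rate. A sketch with hypothetical a- and b-values; the paper's regional coefficients are not reproduced here:

```python
def annual_rate(m, a=4.0, b=0.9):
    """Annual number of earthquakes with magnitude >= m, from
    log10 N(M >= m) = a - b*m (hypothetical Gutenberg-Richter fit)."""
    return 10 ** (a - b * m)

def recurrence_period_years(m, a=4.0, b=0.9):
    """Mean return period T = 1 / N(M >= m), in years."""
    return 1.0 / annual_rate(m, a, b)
```

With these placeholder coefficients a magnitude 7.2 event recurs about every 300 years; a region-specific fit such as the study's Region 6 estimate corresponds to a different (a, b) pair.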

  16. A Short Term Seismic Hazard Assessment in Christchurch, New Zealand, After the M 7.1, 4 September 2010 Darfield Earthquake: An Application of a Smoothing Kernel and Rate-and-State Friction Model

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan


    Full Text Available The Mw 6.3, 21 February 2011 Christchurch, New Zealand, earthquake is regarded as an aftershock of the M 7.1, 4 September 2010 Darfield earthquake. It nevertheless caused severe damage in downtown Christchurch. Such a circumstance points out the importance of the aftershock sequence in seismic hazard evaluation and suggests re-evaluating the seismic hazard immediately after a large earthquake occurs. For this purpose, we propose a probabilistic seismic hazard assessment (PSHA) that takes the disturbance of the short-term seismicity rate into account and can be applied as easily as the classical PSHA. In our approach, the treatment of the background seismicity rate is the same as in the zoneless approach, which uses a bandwidth function as a smoothing kernel in the neighboring region of each earthquake. The rate-and-state friction model, driven by the Coulomb stress changes imparted by large earthquakes, is used to calculate the fault-interaction-based disturbance in the seismicity rate for the PSHA. We apply this approach to evaluate the seismic hazard in Christchurch after the occurrence of the M 7.1, 4 September 2010 Darfield earthquake. Results show an increase in seismic hazard due to the stress increase in the region around the rupture plane, which extends to Christchurch. This provides a suitable basis for the application of a time-dependent PSHA using updated earthquake information.
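The zoneless background rate described above can be sketched with a 2-D Gaussian kernel over past epicenters; the bandwidth value used below is an illustrative assumption, not the one chosen in the paper:

```python
import math

def smoothed_rate(grid_points, epicenters, bandwidth_km):
    """Zoneless background seismicity rate: each past epicenter is
    spread over the grid with a 2-D Gaussian kernel whose standard
    deviation is the bandwidth; each event integrates to 1 over the plane."""
    two_s2 = 2.0 * bandwidth_km ** 2
    norm = 1.0 / (math.pi * two_s2)
    rates = []
    for gx, gy in grid_points:
        total = 0.0
        for ex, ey in epicenters:
            d2 = (gx - ex) ** 2 + (gy - ey) ** 2
            total += norm * math.exp(-d2 / two_s2)
        rates.append(total)
    return rates
```

In a time-dependent PSHA of the kind proposed above, this smoothed field is what a rate-and-state disturbance factor then multiplies near the mainshock rupture.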

  17. Automatic analysis of the 2015 Gorkha earthquake aftershock sequence. (United States)

    Baillard, C.; Lyon-Caen, H.; Bollinger, L.; Rietbrock, A.; Letort, J.; Adhikari, L. B.


    The Mw 7.8 Gorkha earthquake, which partially ruptured the Main Himalayan Thrust north of Kathmandu on 25 April 2015, was the largest and most catastrophic earthquake to strike Nepal since the great M8.4 1934 earthquake. The mainshock was followed by multiple aftershocks, among them two notable events on 12 May with magnitudes of Mw 7.3 and Mw 6.3. Because of these recent events, it became essential for the authorities and for the scientific community to better evaluate the seismic risk in the region through a detailed analysis of the earthquake catalog, amongst others the spatio-temporal distribution of the Gorkha aftershock sequence. Here we complement this first study with a microseismic study using seismic data from the eastern part of the Nepalese Seismological Center network together with one broadband station in Everest. Our primary goal is to deliver an accurate catalog of the aftershock sequence. Because of the exceptional number of events detected, we performed an automatic picking/locating procedure that can be split into 4 steps: 1) coarse picking of the onsets using a classical STA/LTA picker; 2) phase association of picked onsets to detect and declare seismic events; 3) kurtosis-based pick refinement around theoretical arrival times to increase picking and location accuracy; and 4) local magnitude calculation based on waveform amplitudes. This procedure is time efficient (about 1 sec/event), considerably reduces the location uncertainties (2 to 5 km errors) and increases the number of events detected compared to manual processing. Indeed, the automatic detection rate is 10 times higher than the manual detection rate. By comparison with the USGS catalog we were able to derive a new attenuation law to compute local magnitudes in the region. A detailed analysis of the seismicity shows a clear migration toward the east of the region and a sudden decrease of seismicity 100 km east of Kathmandu, which may reveal the presence of a tectonic
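Step 1 of the procedure, the classical STA/LTA picker, can be sketched in a few lines; the window lengths and trigger threshold below are illustrative choices, not those of the authors:

```python
def sta_lta(trace, nsta, nlta):
    """Short-term over long-term average of signal energy; the ratio
    jumps when an impulsive arrival enters the short window.
    Entries before one full LTA window are left at zero."""
    energy = [s * s for s in trace]
    ratio = [0.0] * len(trace)
    for i in range(nlta, len(trace)):
        sta = sum(energy[i - nsta:i]) / nsta
        lta = sum(energy[i - nlta:i]) / nlta
        ratio[i] = sta / lta if lta > 0.0 else 0.0
    return ratio

def trigger_onset(ratio, threshold):
    """Index of the first sample whose ratio exceeds the threshold,
    i.e. the coarse pick; None if nothing triggers."""
    for i, r in enumerate(ratio):
        if r > threshold:
            return i
    return None
```

The coarse pick this returns is what the kurtosis refinement of step 3 then sharpens around the theoretical arrival time.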

  18. A Situational Analysis of Priority Disaster Hazards in Uganda ...

    African Journals Online (AJOL)

    Background: Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazards and vulnerability analyses provide vital information that can be used for development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for ...

  19. Seismic Hazard Analysis of the Bandung Triga 2000 Reactor Site


    Parithusta, Rizkita; P, Sindur; Mangkoesoebroto


    SEISMIC HAZARD ANALYSIS OF THE BANDUNG TRIGA 2000 REACTOR SITE. A seismic hazard analysis of the West Java region is carried out to estimate the peak ground acceleration at the Bandung TRIGA 2000 nuclear reactor site. Both the probabilistic and deterministic approaches are employed to better capture the uncertainties considering the enclosing fault systems. Comprehensive analysis is performed based on the newly revised catalog of seismic data, the most recent results of the construction of se...

  20. Site amplification in Wellington city, New Zealand, determined from analysis of recent earthquake sequences (United States)

    Kaiser, A. E.; Francois-Holden, C.; Benites, R. A.


    New Zealand's capital city of Wellington lies astride the Pacific-Australian plate boundary in an area of high seismic hazard. Several large crustal faults capable of generating earthquakes of magnitude >7 cut through the region, and the subduction interface lies at relatively shallow depth (about 25 km) below. The central city is situated on an alluvial basin with variable bedrock depth, capable of generating complex 3D site amplification. Furthermore, much of the rest of the city is spread across high topographic relief with the potential for significant local topographic and basin-edge effects. In 2013, the Cook Strait earthquake sequence produced the highest ground shaking experienced in the region in recent decades, and included two earthquakes of Mw 6.6 situated approximately 50 km from Wellington. Peak ground accelerations recorded during the sequence reached 0.2 g during both major events, and spectral accelerations recorded in the central city reached approximately 20-30% of the current building design level. Ground motions during the Cook Strait sequence in Wellington were highly variable and strongly dependent on local site conditions. We use this new data to present an analysis of local ground motion effects in terms of amplification and polarization, using horizontal-to-vertical and site-to-reference spectral ratios calculated for a range of Wellington stations spanning rock to deep soil conditions. We also investigate horizontal and vertical site amplification using a spectral inversion technique to separate source, path and site influences on ground motion. Our results are generally in good agreement with existing microzonation maps; however, some differences may result from the complex role of 3D bedrock and surface topography in the region.
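Per frequency bin, a horizontal-to-vertical spectral ratio of the kind used here reduces to a ratio of Fourier amplitudes. A naive-DFT sketch; real analyses average smoothed spectra over many windows and combine both horizontal components:

```python
import cmath
import math

def dft_amplitude(x, k):
    """Amplitude of the k-th DFT bin of a real-valued window."""
    n = len(x)
    return abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                   for t in range(n)))

def hv_ratio(horizontal, vertical, k):
    """Horizontal-to-vertical spectral ratio at DFT bin k; peaks in
    this ratio are a standard proxy for site resonance."""
    return dft_amplitude(horizontal, k) / dft_amplitude(vertical, k)
```

A site-to-reference ratio has the same shape, with the vertical trace replaced by the recording from a nearby rock reference station.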

  1. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Labor Market Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake (United States)

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.


    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of economic Super Sectors in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each Super Sector to each Instrumental Intensity level. The analysis concerns the direct effect of the scenario earthquake on economic sectors and provides a baseline for the indirect and interactive analysis of an input-output model of the regional economy. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by the North American Industry Classification System. According to the analysis results, nearly 225,000 business

  2. Fleeing to Fault Zones: Incorporating Syrian Refugees into Earthquake Risk Analysis along the East Anatolian and Dead Sea Rift Fault Zones (United States)

    Wilson, B.; Paradise, T. R.


    The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages—placing stress on urban settings and increasing potential exposure to strong shaking. Yet, displaced populations are not traditionally captured in data sources used in earthquake risk analysis or loss estimations. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase of seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine degree of under-estimation resulting from forgoing migration data in loss modeling. We find that including refugee populations increased casualties by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context which unites physical, political, cultural, and socio-economic landscapes. Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration

  3. Arc flash hazard analysis and mitigation

    CERN Document Server

    Das, J C


    "All the aspects of arc flash hazard calculations and their mitigation have been covered. Knowledge of electrical power systems up to undergraduate level is assumed. The calculations of short-circuits, protective relaying and varied electrical system configurations in industrial power systems are addressed. Protection systems address differential relays, arc flash sensing relays, protective relaying coordination, current transformer operation and saturation, and applications to major electrical equipment from the arc flash considerations. Current technologies and strategies for arc flash mitigation have been covered. A new algorithm for the calculation of arc flash hazard, accounting for the decaying nature of short-circuit currents, is included. There are many practical examples and study cases. Review questions and references follow each chapter"--

  4. Analysis of intermediate period correlations of coda from deep earthquakes (United States)

    Poli, Piero; Campillo, Michel; de Hoop, Maarten


    We aim at assessing quantitatively the nature of the signals that appear in coda wave correlations at periods >20 s. These signals contain transient constituents with arrival times corresponding to deep seismic phases. These (body-wave) constituents can be used for imaging. To evaluate this approach, we calculate the autocorrelations of the vertical-component seismograms for the Mw 8.4 Sea of Okhotsk earthquake at 400 stations in the Eastern US, using data from 1 h before to 50 h after the earthquake. By using array analysis and mode identification, we discover the dominant role played by high-quality-factor normal modes in the emergence of strong coherent phases such as ScS-like and P'P'df-like arrivals. We then make use of geometrical quantization to derive the constituent rays associated with particular modes, and gain insight into the ballistic reverberation of the rays that contributes to the emergence of body waves. Our study indicates that the signals measured in the spatially averaged autocorrelations have a physical significance, but a direct interpretation of the ScS-like and P'P'df-like phases is not trivial. Indeed, even a single simple measurement of the long-period late coda in a limited period band could provide valuable information on the deep structure by using the temporal information of its autocorrelation, a procedure that could also be useful for planetary exploration.

  5. Reauthorization of the Earthquake Hazards Reduction Act. Hearing before the Subcommittee on Science, Technology, and Space of the Committee on Commerce, Science, and Transportation, United States Senate, One Hundredth Congress, First Session, April 23, 1987

    Energy Technology Data Exchange (ETDEWEB)


    Seven geologists, engineers, and emergency planners testified about the risks and preparations to deal with the possibility of large earthquakes, which can occur in the central and eastern part of the US as well as on the West Coast. The goal of the Earthquake Hazards Act of 1977 was to reduce the cost in human life and property damage. The witnesses reviewed progress in terms of improved building codes, community awareness, and emergency planning. A new issue was that of earthquake insurance and the capacity of financial institutions to cope with the magnitude of losses that are associated with a major earthquake. Additional material submitted for the record follows the testimony.

  6. Research on response spectrum of dam based on scenario earthquake (United States)

    Zhang, Xiaoliang; Zhang, Yushan


    Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential source with the greatest contribution to the site hazard is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA). Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum based on the scenario earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The empirical analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods. It is easy for practitioners to accept and provides a basis for the seismic engineering of hydraulic structures.

  7. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) (United States)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.


    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
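Step 4's ensemble treatment of epistemic uncertainty amounts to reading weighted quantiles off the set of alternative model outcomes. A minimal sketch with made-up hazard values and weights:

```python
def ensemble_quantile(values, weights, q):
    """q-th weighted quantile over alternative-model outcomes: the
    ensemble analogue of reading a percentile off a logic tree."""
    pairs = sorted(zip(values, weights))
    total = sum(w for _, w in pairs)
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc / total >= q:
            return v
    return pairs[-1][0]
```

Repeating this per target point and per intensity level yields, for example, the median and the 84th-percentile hazard curves of the ensemble.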


    Directory of Open Access Journals (Sweden)

    George Pararas-Carayannis


    Full Text Available Myanmar, formerly Burma, is vulnerable to several natural hazards, such as earthquakes, cyclones, floods, tsunamis and landslides. The present study focuses on geomorphologic and geologic investigations of the south-western region of the country, based on satellite data (Shuttle Radar Topography Mission-SRTM, MODIS and LANDSAT. The main objective is to detect areas vulnerable to inundation by tsunami waves and cyclone surges. Since the region is also vulnerable to earthquake hazards, it is also important to identify seismotectonic patterns, the location of major active faults, and local site conditions that may enhance ground motions and earthquake intensities. As illustrated by this study, linear, topographic features related to subsurface tectonic features become clearly visible on SRTM-derived morphometric maps and on LANDSAT imagery. The GIS integrated evaluation of LANDSAT and SRTM data helps identify areas most susceptible to flooding and inundation by tsunamis and storm surges. Additionally, land elevation maps help identify sites greater than 10 m in elevation height, that would be suitable for the building of protective tsunami/cyclone shelters.

  9. Earthquake forecasting test for Kanto district: Analysis of an earthquake catalog considering focal depth (United States)

    Yokoi, S.; Tsuruoka, H.; Hirata, N.


    We started research on constructing a 3-dimensional (3D) earthquake forecasting model for the Kanto district in Japan under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity in this area ranges from the shallow part down to a depth of 80 km due to the subducting Philippine Sea and Pacific plates, we need to study the effect of the earthquake depth distribution. We are developing forecasting models based on the results of 2D modeling. In the first step of the study, we defined the 3D forecasting region in Kanto on a grid of 0.1° x 0.1° horizontally and 10 km in depth from 0 km to 100 km. It was then confirmed that the RI model showed good performance in the 3D forecasting model compared with a 2D model using a single undivided column from 0 km to 100 km in depth. The RI model (Nanjo, 2011) learned past seismicity from the JMA catalog for the 10 years from 1998 to 2009 to estimate probabilities of earthquakes from November 2009 to January 2010. Because we aim to improve the forecasting performance for large earthquakes, we need a longer period of earthquake data than current studies use. In this study, we analyzed the completeness magnitude (Mc) of the JMA catalog from 1970 to 2007 in 3 depth ranges, 0-30 km, 30-60 km and 60-100 km, by the maximum curvature method (Wiemer and Wyss, 2000) to assess the quality of the catalog as a function of hypocenter depth. This method tended to estimate Mc smaller than the visual inspection method does. The time sequence of Mc from 1970 to 1997 decreased independently of depth, which means that the detection limit of the hypocenters is homogeneous in depth and that the quality of the catalog improved with time. On the other hand, Mc from 1997 to 2007 showed a heterogeneous distribution with depth. In this presentation, we discuss how to use this heterogeneous catalog to develop a 3D forecasting model in Japan. The authors thank the Japan Meteorological Agency for the earthquake catalog. This work is sponsored by the
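The maximum curvature method mentioned above takes the most populated magnitude bin as Mc. A sketch; the +0.2 correction is a common empirical add-on and an assumption here, not a value from the abstract:

```python
from collections import Counter

def mc_maxc(mags, dm=0.1, correction=0.2):
    """Completeness magnitude by the maximum curvature method: take
    the magnitude bin with the most events (the peak of the
    non-cumulative frequency-magnitude curve) and add an empirical
    correction (commonly +0.2)."""
    bins = Counter(round(m / dm) * dm for m in mags)
    peak = max(bins, key=bins.get)
    return round(peak + correction, 1)
```

As the abstract notes, the raw peak tends to underestimate Mc relative to visual inspection, which is why the correction term is usually added.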

  10. Analysis of soil radon data in earthquake precursory studies

    Directory of Open Access Journals (Sweden)

    Hari Prasad Jaishi


    Full Text Available Soil radon data were recorded at two selected sites along the Mat fault in Mizoram (India), which lies in the highest seismic zone of India. The study was carried out from July 2011 to May 2013 using LR-115 Type II films. Precursory changes in radon concentration were observed prior to some earthquakes that occurred around the measuring sites, and a positive correlation was found between the measured radon data and seismic activity in the region. Statistical analysis of the radon data together with the meteorological parameters was carried out using the multiple regression method. The results show that the method employed was useful for removing the effect of the meteorological parameters and for identifying radon maxima possibly caused by seismic activity.
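    The regression step described here, removing meteorological effects so that residual radon anomalies stand out, can be sketched with ordinary least squares. Everything below (coefficients, noise levels, and the injected spike) is synthetic, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
temperature = 20 + 8 * np.sin(np.linspace(0, 6 * np.pi, n)) + rng.normal(0, 1, n)
pressure = 1010 + rng.normal(0, 5, n)

# Synthetic radon series driven by meteorology, plus one non-meteorological spike
radon = 50 + 1.5 * temperature - 0.8 * (pressure - 1010) + rng.normal(0, 2, n)
radon[150] += 40  # hypothetical seismically induced anomaly

# Multiple regression of radon on the meteorological parameters
X = np.column_stack([np.ones(n), temperature, pressure])
beta, *_ = np.linalg.lstsq(X, radon, rcond=None)
residual = radon - X @ beta

# Residuals exceeding 3 standard deviations flag candidate precursory maxima
anomalies = np.flatnonzero(np.abs(residual) > 3 * residual.std())
```
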

  11. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach. (United States)

    Ropkins, K; Beck, A J


    Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to offer many of the advantages previously identified for microbiological HACCP procedures: they are more effective, efficient and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP becoming as effective as microbiological HACCP.

  12. Analysis of post-earthquake landslide activity and geo-environmental effects (United States)

    Tang, Chenxiao; van Westen, Cees; Jetten, Victor


    Large earthquakes can cause huge losses to human society through ground shaking, fault rupture, and the high density of co-seismic landslides that can be triggered in mountainous areas. In areas affected by such large earthquakes, the threat of landslides continues after the earthquake, as co-seismic landslides may be reactivated by high-intensity rainfall events. Huge amounts of landslide material remain on the slopes after an earthquake, leading to a high frequency of landslides and debris flows that threaten lives and create great difficulties for post-seismic reconstruction in earthquake-hit regions. Without critical information such as the frequency and magnitude of landslides after a major earthquake, reconstruction planning and hazard mitigation works are difficult. The area hit by the Mw 7.9 Wenchuan earthquake in 2008, Sichuan province, China, shows some typical examples of poor reconstruction planning due to lack of information: huge debris flows destroyed several re-constructed settlements. This research aims to analyze the decay in post-seismic landslide activity in areas that have been hit by a major earthquake, taking the area hit by the 2008 Wenchuan earthquake as the study area. The study will analyze the factors that control post-earthquake landslide activity through the quantification of landslide volume changes as well as through numerical simulation of their initiation process, to obtain a better understanding of the potential threat of post-earthquake landslides as a basis for mitigation planning. The research will make use of high-resolution stereo satellite images, UAV and Terrestrial Laser Scanning (TLS) data to obtain multi-temporal DEMs to monitor the change of loose sediments and post-seismic landslide activity. 
A debris flow initiation model that incorporates the volume of source materials, vegetation re-growth, and intensity-duration of the triggering precipitation, and that evaluates
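    The intensity-duration control on debris-flow triggering mentioned above is commonly expressed as a power-law threshold of the form I = a·D^(-b). A toy check using Caine's (1980) global threshold coefficients, which are generic values and not the post-seismic thresholds this study aims to derive:

```python
def exceeds_id_threshold(intensity_mm_h, duration_h, a=14.82, b=0.39):
    """True if a rainfall event of the given mean intensity (mm/h) and
    duration (h) plots above the I = a * D**-b threshold line."""
    return intensity_mm_h > a * duration_h ** (-b)

storm_a = exceeds_id_threshold(10.0, 24.0)  # long, moderately intense storm
storm_b = exceeds_id_threshold(3.0, 1.0)    # short, weak shower
```

    Post-earthquake thresholds are typically lower than such global values because of the abundance of loose co-seismic debris.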

  13. Baseline geophysical data for hazard management in coastal areas in relation to earthquakes and tsunamis

    Directory of Open Access Journals (Sweden)

    KSR Murthy


    Full Text Available A systematic study of geophysical data of the Eastern Continental Margin of India was taken up to identify the land–ocean tectonic lineaments over the east coast of India and the possible neotectonic activity associated with them. These studies helped in delineating the offshore extension of some of the coastal lineaments. Analysis of magnetic, gravity and shallow seismic data, combined with reported seismicity data, indicates moderate seismicity associated with some of these land–ocean tectonic lineaments of the Eastern Continental Margin of India. The coastal/offshore regions of Vizianagaram (north of Visakhapatnam and Ongole of the Andhra Pradesh margin and the Puducherry shelf of the Tamil Nadu margin have been identified as zones of weakness where neotectonic activity has been established. Bathymetry data over the Eastern Continental Margin of India revealed the morphology of the shelf and slope of this margin, which in turn can be used as baseline data for tsunami surge models. A detailed bathymetry map and sections of the Nagapattinam–Cuddalore shelf (from 10.5° to about 12°N indicate that one of the main reasons for the higher run-up heights and inundation along the Nagapattinam–Cuddalore coast during the Indian Ocean Tsunami of 26 December 2004 could be the concave shape of the shelf with a gentle slope, which might have accelerated the tsunami surge, allowing it to sweep inland with great force. Structural control also appears to be a contributing factor for the tsunami surge.

  14. Direct methods of soil-structure interaction analysis for earthquake loadings (IV)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J. B.; Kim, D. S.; Choi, J. S.; Kwon, K. C.; Kim, Y. J.; Lee, H. J.; Kim, S. B.; Kim, D. K. [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)


    Methodologies of SSI analysis for earthquake loadings have been reviewed. Based on the finite element method incorporating an infinite element technique for the unbounded exterior region, a computer program for nonlinear seismic analysis named 'KIESSI-QK' has been developed. The computer program has been verified using a free-field site-response problem. The Hualien FVT stochastic finite element analysis after backfill and the blind prediction of earthquake responses have been carried out utilizing the developed computer program. The earthquake response analysis for the LSST structure has also been performed and compared with the measured data.

  15. Multifractal analysis of earthquakes in Kumaun Himalaya and its ...

    Indian Academy of Sciences (India)

    Earthquakes in this region are mainly caused due to release of elastic strain energy. The Himalayan region can be attributed to .... the Himalayas and the aseismic slip rate simu- lated below the Higher Himalayas suggests that ..... observed power-law build-up of intermediate events before a great earthquake represent the ...

  16. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator. (United States)

    Wang, Wei; Albert, Jeffrey M


    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
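    The key approximation step described here, modeling the baseline log cumulative hazard with a simple parametric basis, can be illustrated on a known baseline. For a Weibull baseline, log H0(t) = k·log t − k·log λ is exactly linear in log t, so any fractional-polynomial basis that includes log t recovers it; the basis powers below are a typical choice, not necessarily the ones from the paper:

```python
import numpy as np

# True Weibull baseline: H0(t) = (t / lam)**k
k, lam = 1.5, 2.0
t = np.linspace(0.1, 10.0, 200)
log_H0 = k * np.log(t) - k * np.log(lam)

# Fractional-polynomial design matrix (powers drawn from the usual FP set)
B = np.column_stack([np.ones_like(t), np.log(t), np.sqrt(t), t])
coef, *_ = np.linalg.lstsq(B, log_H0, rcond=None)
max_err = np.max(np.abs(B @ coef - log_H0))
```

    In the mediation formula, the fitted baseline is then combined with the Cox regression coefficients to evaluate survival probabilities under different mediator values.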

  17. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy) (United States)

    Grasso, S.; Maugeri, M.

    After the Summit held in Washington on August 20-22 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area has been formed. The Group has so far identified the following hazards: (1) Seismic hazard (including hazard for historical buildings); (2) Hazard linked to the quantity and quality of water; (3) Landslide hazard; (4) Volcanic hazard. The analysis of such hazards implies the creation and management of data banks, which can only be used if the data are properly geo-referenced to allow their crossed use. The obtained results must therefore be represented on geo-referenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). The south-eastern area of Sicily, called the "Iblea" seismic area, is considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. The correct evaluation of seismic hazard is therefore highly affected by risk factors due to the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on the damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either

  18. Analysis of Landslides Triggered by October 2005, Kashmir Earthquake. (United States)

    Mahmood, Irfan; Qureshi, Shahid Nadeem; Tariq, Shahina; Atique, Luqman; Iqbal, Muhammad Farooq


    The October 2005 Kashmir earthquake main event was triggered along the Balakot-Bagh Fault, which runs from Bagh to Balakot, and caused most of its damage in and around these areas. Major landslides were activated during and after the earthquake, inflicting heavy damage in the area in terms of both infrastructure and casualties. These landslides were mainly attributed to the earthquake exceeding the minimum triggering threshold, the geology of the area, climatologic and geomorphologic conditions, mudflows, widening of roads without stability assessment, and heavy rainfall after the earthquake. The landslides were mainly rock and debris falls. The Hattian Bala rock avalanche, the largest landslide associated with the earthquake, completely destroyed a village and blocked the valley, creating a lake. The present study shows that the fault rupture and fault geometry have a direct influence on the distribution of landslides and that a high-frequency band of landslides was triggered along the rupture zone. The number of landslides increased due to the 2005 earthquake and its aftershocks, and most of the landslides occurred along faults, rivers and roads. It is observed that the stability of a landslide mass is greatly influenced by the amplitude, frequency and duration of earthquake-induced ground motion. Most of the slope failures along the roads resulted from the alteration of slopes during road widening, and from seepage during the rainy season immediately after the earthquake. Landslides occurred mostly in weakly cemented and indurated rocks, colluvial sand and cemented soils. It is also worth noting that fissures and ground cracks induced by the main shock and aftershocks are still present, and they pose a major potential threat of future landslides in case of another earthquake or under extreme weather conditions.

  19. Comparison of Structurally Controlled Landslide Hazard Simulation to the Co-seismic Landslides Caused by the M 7.2 2013 Bohol Earthquake. (United States)

    Galang, J. A. M. B.; Eco, R. C.; Lagmay, A. M. A.


    The M_w 7.2 October 15, 2013 Bohol earthquake is one of the most destructive earthquakes to hit the Philippines in the 21st century. The epicenter was located in Sagbayan municipality, central Bohol, and the event was generated by a previously unmapped reverse fault called the "Inabanga Fault". The earthquake resulted in 209 fatalities and over 57 million USD worth of damage. It also generated co-seismic landslides, most of which were related to fault structures. Unlike rainfall-induced landslides, the trigger for co-seismic landslides arrives without warning, so preparation for this type of landslide relies heavily on the identification of fracture-related slope instability. To mitigate the impacts of co-seismic landslide hazards, morpho-structural orientations of discontinuity sets were mapped using remote sensing techniques with the aid of a Digital Terrain Model (DTM) obtained in 2012. The DTM used is an IFSAR-derived image with a 5-meter pixel resolution and approximately 0.5 meter vertical accuracy. Coltop 3D software was then used to identify similar structures, including measurement of their dip and dip directions. The chosen discontinuity sets were then keyed into the Matterocking software to identify potential rock slide zones due to planar or wedge discontinuities. After identifying the structurally controlled unstable slopes, the rock mass propagation extent of the possible rock slides was simulated using Conefall. Separately, a manual landslide inventory was compiled using post-earthquake satellite images and LIDAR, which identified at least 873 landslides. Of these, 786 or 90% intersect the simulated structurally controlled landslide hazard areas of Bohol. The results show the potential of this method to identify co-seismic landslide hazard areas for disaster mitigation. Along with computer methods to simulate shallow landslides, and debris flow
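    The structurally controlled screening that tools like Matterocking perform can be illustrated with a classic Markland-style kinematic test for planar sliding. The orientations below are made up, and the ±20° strike-parallelism tolerance is a common rule of thumb, not a value from the study:

```python
def planar_sliding_possible(slope_dip, slope_dd, joint_dip, joint_dd,
                            friction_angle, dd_tolerance=20.0):
    """Kinematic test for planar sliding on a discontinuity:
    (1) the joint must dip less steeply than the slope face (daylighting),
    (2) more steeply than the friction angle, and
    (3) its dip direction must lie within dd_tolerance of the slope's."""
    dd_diff = abs((joint_dd - slope_dd + 180.0) % 360.0 - 180.0)
    return dd_diff <= dd_tolerance and friction_angle < joint_dip < slope_dip

unstable = planar_sliding_possible(60, 90, 40, 95, 30)   # all three criteria met
stable = planar_sliding_possible(60, 90, 40, 150, 30)    # joint faces away from slope
```
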

  20. Hazard function theory for nonstationary natural hazards (United States)

    Read, L.; Vogel, R. M.


    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
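    For the Poisson-GP magnitude model above, the hazard function of the 2-parameter Generalized Pareto has a simple closed form: with survival S(x) = (1 + ξx/σ)^(−1/ξ) and density f(x) = (1/σ)(1 + ξx/σ)^(−1/ξ−1), the ratio gives h(x) = f(x)/S(x) = 1/(σ + ξx). A small sketch verifying this numerically (parameter values are arbitrary):

```python
import numpy as np

def gp_hazard(x, sigma, xi):
    """Closed-form hazard of the Generalized Pareto: h(x) = 1 / (sigma + xi * x)."""
    return 1.0 / (sigma + xi * x)

def gp_hazard_numeric(x, sigma, xi, eps=1e-6):
    """Finite-difference check via h(x) = -d/dx log S(x)."""
    log_S = lambda v: (-1.0 / xi) * np.log1p(xi * v / sigma)
    return -(log_S(x + eps) - log_S(x)) / eps

x = np.array([0.0, 1.0, 5.0])
h_closed = gp_hazard(x, sigma=2.0, xi=0.3)
h_numeric = gp_hazard_numeric(x, sigma=2.0, xi=0.3)
```

    For ξ > 0 (heavy-tailed events) the hazard decreases with x, which feeds directly into the return-period and reliability metrics discussed in the abstract.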

  1. Hazard screening application guide. Safety Analysis Report Update Program

    Energy Technology Data Exchange (ETDEWEB)



    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for, and extent of, follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes, as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates the identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program, and interfaces with environmental organizations will be established in order to share information.

  2. Direct methods of soil-structure interaction analysis for earthquake loadings (V)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J. B.; Choi, J. S.; Lee, J. J.; Park, D. U. [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)


    Methodologies of SSI analysis for earthquake loadings have been reviewed. Based on the finite element method incorporating an infinite element technique for the unbounded exterior region, a computer program for nonlinear seismic analysis named 'KIESSI' has been developed. The computer program has been verified using a free-field site-response problem. Post-correlation analysis for the Hualien FVT after backfill and blind prediction of earthquake responses have been carried out utilizing the developed computer program. The earthquake response analyses for three LSST structures (Hualien, Lotung and Tepsco structures) have also been performed and compared with the measured data.

  3. Hospital stay as a proxy indicator for severe injury in earthquakes: a retrospective analysis. (United States)

    Zhao, Lu-Ping; Gerdin, Martin; Westman, Lina; Rodriguez-Llanes, Jose Manuel; Wu, Qi; van den Oever, Barbara; Pan, Liang; Albela, Manuel; Chen, Gao; Zhang, De-Sheng; Guha-Sapir, Debarati; von Schreeb, Johan


    Earthquakes are the most violent type of natural disaster, and injuries are the dominant medical problem in their early phases. However, likely because of poor data availability, high-quality research on injuries after earthquakes is lacking. Length of hospital stay (LOS) has been validated as a proxy indicator for injury severity in high-income settings and could potentially be used in retrospective research on injuries after earthquakes. In this study, we assessed whether LOS is an adequate proxy indicator for severe injury in trauma survivors of an earthquake. A retrospective analysis was conducted using a database of 1,878 injured patients from the 2008 Wenchuan earthquake. Our primary outcome was severe injury, defined as a composite measure of serious injury or resource use. Secondary outcomes were serious injury and resource use, analysed separately. Non-parametric receiver operating characteristic (ROC) and area under the curve (AUC) analysis was used to test the discriminatory accuracy of LOS when used to identify severe injury. LOS was not found to be an adequate proxy indicator for severe injury overall in these earthquake survivors. However, LOS was found to be a proxy for major nonorthopaedic surgery and blood transfusion. These findings can be useful for retrospective research on earthquake-injured patients when detailed hospital records are not available.
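    The nonparametric AUC used here equals the Mann-Whitney probability that a randomly chosen case outranks a randomly chosen non-case on the marker. A self-contained sketch with made-up LOS values (not data from the study):

```python
import numpy as np

def roc_auc(marker, outcome):
    """Nonparametric ROC AUC via the Mann-Whitney statistic:
    P(marker_case > marker_noncase), with ties counted as 1/2."""
    pos = marker[outcome == 1]
    neg = marker[outcome == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (pos.size * neg.size)

# Hypothetical lengths of stay (days) and severe-injury indicators
los = np.array([3, 5, 7, 2, 10, 14, 6, 21, 4, 9])
severe = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 0])
auc = roc_auc(los, severe)
```

    An AUC near 0.5 indicates no discrimination; values approaching 1 indicate that LOS ranks severe cases above non-severe ones.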

  4. Earthquake-by-earthquake fold growth above the Puente Hills blind thrust fault, Los Angeles, California: Implications for fold kinematics and seismic hazard (United States)

    Leon, Lorraine A.; Christofferson, Shari A.; Dolan, James F.; Shaw, John H.; Pratt, Thomas L.


    Boreholes and high-resolution seismic reflection data collected across the forelimb growth triangle above the central segment of the Puente Hills thrust fault (PHT) beneath Los Angeles, California, provide a detailed record of incremental fold growth during large earthquakes on this major blind thrust fault. These data document fold growth within a discrete kink band that narrows upward from ˜460 m at the base of the Quaternary section (200-250 m depth) toward the surface. Most (>82% at 250 m depth) folding and uplift occur within discrete kink bands, thereby enabling us to develop a paleoseismic history of the underlying blind thrust fault. The borehole data reveal that the youngest part of the growth triangle in the uppermost 20 m comprises three stratigraphically discrete growth intervals marked by southward-thickening sedimentary strata that are separated by intervals in which sediments do not change thickness across the site. We interpret the intervals of growth as occurring after the formation of now-buried paleofold scarps during three large PHT earthquakes in the past 8 kyr. The intervening intervals of no growth record periods of structural quiescence and deposition at the regional, near-horizontal stream gradient at the study site. Minimum uplift in each of the scarp-forming events, which occurred at 0.2-2.2 ka (event Y), 3.0-6.3 ka (event X), and 6.6-8.1 ka (event W), ranged from ˜1.1 to ˜1.6 m, indicating minimum thrust displacements of ≥2.5 to 4.5 m. Such large displacements are consistent with the occurrence of large-magnitude earthquakes (Mw > 7). Cumulative minimum uplift in the past three events was 3.3 to 4.7 m, suggesting cumulative thrust displacement of ≥7 to 10.5 m. These values yield a minimum Holocene slip rate for the PHT of ≥0.9 to 1.6 mm/yr. The borehole and seismic reflection data demonstrate that dip within the kink band is acquired incrementally, such that older strata that have been deformed by more earthquakes dip more steeply than younger strata.
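    The slip-rate numbers in this abstract follow from simple geometry: dip-slip displacement d on a fault dipping δ produces uplift u = d·sin δ, and the minimum rate is cumulative displacement over the age of the oldest event. The sketch below assumes a fault dip of about 27°, which is consistent with the uplift-to-displacement ratios quoted but is not stated in this excerpt:

```python
import math

cumulative_uplift = (3.3, 4.7)   # m, past three events (from the abstract)
oldest_event_ka = (6.6, 8.1)     # ka, age bounds of event W (from the abstract)
dip = math.radians(27.0)         # assumed fault dip (not given in the excerpt)

# Uplift u = d * sin(dip)  =>  dip-slip displacement d = u / sin(dip)
d_min = cumulative_uplift[0] / math.sin(dip)   # ~7.3 m
d_max = cumulative_uplift[1] / math.sin(dip)   # ~10.4 m

# Minimum slip rate: least displacement over the oldest age (m/ka = mm/yr)
rate_min = d_min / oldest_event_ka[1]          # ~0.9 mm/yr
rate_max = d_max / oldest_event_ka[0]          # ~1.6 mm/yr
```
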

  5. SCEC Community Modeling Environment (SCEC/CME) - Seismic Hazard Analysis Applications and Infrastructure (United States)

    Maechling, P. J.; Jordan, T. H.; Kesselman, C.; Moore, R.; Minster, B.; SCEC ITR Collaboration


    The Southern California Earthquake Center (SCEC) has formed a Geoscience/IT partnership to develop an advanced information infrastructure for system-level earthquake science in Southern California. This SCEC/ITR partnership comprises SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. This collaboration recently completed the second year of a five-year National Science Foundation (NSF) funded ITR project called the SCEC Community Modeling Environment (SCEC/CME). The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed by project collaborators include a Probabilistic Seismic Hazard Analysis system called OpenSHA [Field et al., this meeting]. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. A Rupture Dynamic Model (RDM) has also been developed that couples a rupture dynamics simulation into an anelastic wave model. The collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of SHA programs. To support computationally expensive simulations, we have constructed a grid-based system utilizing Globus software [Kesselman et al., this meeting]. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC, NPACI and Teragrid High Performance Computing Centers. 
We have


    Energy Technology Data Exchange (ETDEWEB)



    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of the process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment or require emergency response, in addition to those with significant impact to the facility worker, the offsite receptor, and the 100-meter receptor. There is also now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety, but also for worker-safety management programs, emergency management, and environmental programs. It also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  7. ICA-based polarization analysis on volcano-tectonic earthquakes (United States)

    Falanga, Mariarosaria; De Lauro, Enza; De Martino, Salvatore; Petrosino, Simona


    A new approach for the analysis of the polarization of seismic signals is proposed. The method is based on Independent Component Analysis and allows the identification and separation of the basic sources, which are naturally polarized in the vertical and horizontal planes. The results from the case study of a swarm of volcano-tectonic earthquakes that occurred at Campi Flegrei in October 2015 are impressive: a clear separation of the P- and S-wave seismic phases in the time domain is obtained. In addition, the efficiency of the method in retrieving the polarization parameters is demonstrated by comparison with other standard techniques. The presented approach provides wavefield decomposition and polarization analysis in a single step, thus avoiding cumbersome a priori filtering procedures and segmentation of the signals. It is useful for discriminating and analysing different seismic phases and can be applied to a variety of volcanic and tectonic signals; it can therefore strongly support studies of propagation and source mechanisms. Moreover, thanks to its speed and robustness, this stand-alone tool can be used routinely in volcano monitoring practice.
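    One of the "standard techniques" such ICA results are typically compared against is covariance-based polarization analysis: eigen-decomposition of the three-component covariance matrix (as in Jurkevics, 1988). A synthetic sketch with a made-up rectilinear arrival (signal, azimuth, and noise level are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
t = np.linspace(0.0, 2.0, n)
wavelet = np.sin(2 * np.pi * 8 * t) * np.exp(-((t - 1.0) ** 2) / 0.02)

# Rectilinear arrival polarized along azimuth 30 deg (horizontal), plus noise
az_true = np.radians(30.0)
direction = np.array([np.sin(az_true), np.cos(az_true), 0.0])  # (E, N, Z)
X = np.outer(wavelet, direction) + 0.02 * rng.normal(size=(n, 3))

# Eigen-decomposition of the 3x3 covariance matrix of the components
eigval, eigvec = np.linalg.eigh(np.cov(X.T))     # ascending order
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]   # largest first

rectilinearity = 1.0 - (eigval[1] + eigval[2]) / (2.0 * eigval[0])
azimuth = np.degrees(np.arctan2(eigvec[0, 0], eigvec[1, 0])) % 180.0
```

    Unlike the single-step ICA approach described above, this covariance method is usually applied to pre-filtered, windowed segments of the signal.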

  8. Analysis of Groundwater level Changes in Wells Sensitive to Earthquakes (United States)

    Liu, C.; Lee, C.; Chia, Y.; Hsiao, C.; Kuo, K.


    Earthquake-related groundwater level changes have often been observed in many places in Taiwan, which is located at the boundary between the Eurasian plate and the Philippine Sea plate. For instance, more than 160 monitoring well stations recorded coseismic changes during the 1999 Chi-Chi earthquake. These stations, which consist of one to five wells of different depths, were installed in the coastal plain or in hillsides. In this study, we analyze monitoring data from four well stations (Pingding, Chukou, Yuanlin and Donher) to investigate the sensitivity of well water level to earthquakes. The variation of groundwater level with natural and human factors, such as rainfall, barometric pressure, earth tides and pumping, was studied to understand the background changes in these wells. We found various relations between the magnitude and epicentral distance of earthquakes and the co-seismic groundwater level changes at different wells. The sensitivity of the monitoring wells was estimated from the ratio of the number of co-seismic groundwater level changes to the number of large earthquakes during the recording period. Earthquake-related co-seismic groundwater level changes may reflect the redistribution of crustal stress and strain. However, coseismic changes in multiple-well monitoring stations may vary with depth. Also, water level data from wells with a higher sampling rate show more detail in co-seismic and background changes. Therefore, high-resolution and high-frequency data are essential for future studies of groundwater level changes in response to earthquakes or fault movement.

  9. Seismic Hazard Analysis and Uniform Hazard Spectra for Different Regions of Kerman

    Directory of Open Access Journals (Sweden)

    Gholamreza Ghodrati Amiri


    Full Text Available This paper presents a seismic hazard analysis and uniform hazard spectra for different regions of Kerman city. A collected catalogue containing both historical and instrumental events, covering the period from the 8th century AD until now within a 200 km radius, was used, and the seismic sources were modeled. The Kijko method was applied for estimating the seismic parameters, considering the lack of suitable seismic data, the inaccuracy of the available information, and the uncertainty of magnitude in different periods. To determine the peak ground acceleration, the calculations were performed using the logic tree method with two weighted attenuation relations: Ghodrati et al. (weight 0.6) and Zare et al. (weight 0.4). The analysis was conducted for 13×8 grid points over the Kerman region and adjacent areas with the SEISRISK III software, and the spectral attenuation relationships of Ghodrati et al. were used to determine the seismic spectra.
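    The 0.6/0.4 logic-tree weighting can be illustrated as a weighted combination of two ground-motion estimates. The attenuation relations below are invented placeholders (the actual Ghodrati et al. and Zare et al. relations are not reproduced here), and a full PSHA would weight the resulting hazard curves rather than single PGA estimates:

```python
import math

def atten_branch_a(mag, dist_km):
    # Placeholder attenuation relation: ln PGA(g) = c0 + c1*M - c2*ln(R + 10)
    return math.exp(-3.5 + 0.9 * mag - 1.2 * math.log(dist_km + 10.0))

def atten_branch_b(mag, dist_km):
    # Second placeholder branch with slightly different coefficients
    return math.exp(-3.2 + 0.8 * mag - 1.1 * math.log(dist_km + 10.0))

def weighted_pga(mag, dist_km, weights=(0.6, 0.4)):
    """Combine the two branch estimates with logic-tree weights."""
    return (weights[0] * atten_branch_a(mag, dist_km)
            + weights[1] * atten_branch_b(mag, dist_km))

pga = weighted_pga(6.5, 20.0)   # PGA in g for a hypothetical M6.5 at 20 km
```
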

  10. Association between earthquake events and cholera outbreaks: a cross-country 15-year longitudinal analysis. (United States)

    Sumner, Steven A; Turner, Elizabeth L; Thielman, Nathan M


    Large earthquakes can cause population displacement, damage to critical sanitation infrastructure, and increased threats to water resources, potentially predisposing populations to waterborne disease epidemics such as cholera. The risk of cholera outbreaks after earthquake disasters remains uncertain, and a cross-country analysis of World Health Organization (WHO) cholera data that would contribute to this discussion had yet to be published. A cross-country longitudinal analysis was conducted among 63 low- and middle-income countries from 1995-2009. The association between earthquake disasters of various effect sizes and a relative spike in cholera rates for a given country was assessed utilizing fixed-effects logistic regression, adjusting for gross domestic product per capita, water and sanitation level, flooding events, percent urbanization, and under-five child mortality. The association between large earthquakes and cholera rate increases of various degrees was also assessed. Forty-eight of the 63 countries had at least one year with reported cholera infections during the 15-year study period, and 36 of these 48 countries had at least one earthquake disaster. In adjusted analyses, country-years with ≥10,000 persons affected by an earthquake had 2.26 times increased odds (95% CI, 0.89-5.72; P = .08) of having a greater-than-average cholera rate that year compared to country-years with no such earthquake. The association between large earthquake disasters and cholera infections appeared to weaken as higher levels of cholera rate increase were tested. A trend of increased risk of greater-than-average cholera rates when more people were affected by an earthquake in a country-year was noted; however, these findings did not reach statistical significance at traditional levels and may be due to chance. Frequent large-scale cholera outbreaks after earthquake disasters appear to be relatively uncommon.
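    The odds-ratio scale used in this analysis can be illustrated with an unadjusted 2×2 computation and a Wald confidence interval. The counts below are invented for illustration and do not reproduce the paper's adjusted fixed-effects estimate:

```python
import math

# Hypothetical counts of country-years (made up):
#   a: earthquake-affected with above-average cholera;  b: affected, not above
#   c: unaffected with above-average cholera;           d: unaffected, not above
a, b, c, d = 18, 22, 150, 410

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

    When the 95% interval spans 1.0, as in the paper's adjusted result, the association does not reach statistical significance at traditional levels.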

  11. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, W.S.


    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Upon further analysis, the thermal stabilization system has instead been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision also incorporates the recommendations provided in the original hazards analysis. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for safety classification of the thermal stabilization equipment.


    Kataoka, Shojiro; Nagaya, Kazuhiro; Yabe, Masaaki; Matsuoka, Kazunari; Kaneko, Masahiro

    Dynamic analysis has been widely used in recent years for the seismic design of bridges with horizontal-force-distributing structures. There has been little research, however, on the dynamic analysis of such bridges using real earthquake data. In this research, we carried out dynamic analyses using accelerograms recorded during the 2011 off the Pacific coast of Tohoku earthquake (Mw9.0). The analytical earthquake response shows good agreement with the observed response when the collision and friction forces acting between side blocks and shoes are taken into account.

  13. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures (United States)

    Kalkan, Erol; Chopra, Anil K.


    Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
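The scaling step can be illustrated for a linear-elastic SDF system: integrate the equation of motion under the record, then scale the record so its peak deformation matches the target. This is a simplified sketch; the actual MPS method uses the inelastic first-'mode' SDF system derived from pushover analysis, and the record, period, and target value below are illustrative.

```python
import numpy as np

def sdf_peak_deformation(ag, dt, period=1.0, zeta=0.05):
    """Peak deformation of a linear-elastic SDF oscillator (unit mass)
    under base acceleration ag, via Newmark average-acceleration
    (gamma = 1/2, beta = 1/4) time stepping."""
    wn = 2.0 * np.pi / period
    k, c = wn**2, 2.0 * zeta * wn
    u, v = 0.0, 0.0
    a = -ag[0]                      # initial acceleration (u = v = 0)
    khat = k + 2.0 * c / dt + 4.0 / dt**2
    umax = 0.0
    for i in range(len(ag) - 1):
        dp = -(ag[i + 1] - ag[i])   # incremental effective load
        dph = dp + (4.0 / dt + 2.0 * c) * v + 2.0 * a
        du = dph / khat
        dv = 2.0 * du / dt - 2.0 * v
        da = 4.0 * du / dt**2 - 4.0 * v / dt - 2.0 * a
        u, v, a = u + du, v + dv, a + da
        umax = max(umax, abs(u))
    return umax

def mps_scale_factor(ag, dt, target, period=1.0):
    """Scale factor so the scaled record produces the target peak
    deformation. A linear response scales with the record, so a single
    ratio suffices; the inelastic case requires iteration."""
    return target / sdf_peak_deformation(ag, dt, period)

record = np.sin(np.linspace(0.0, 4.0 * np.pi, 400))  # synthetic record
s = mps_scale_factor(record, dt=0.01, target=0.05)
```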

  14. Sensitivity analysis of seismic hazard for the northwestern portion of the state of Gujarat, India (United States)

    Petersen, M.D.; Rastogi, B.K.; Schweig, E.S.; Harmsen, S.C.; Gomberg, J.S.


    We test the sensitivity of seismic hazard to three fault source models for the northwestern portion of Gujarat, India. The models incorporate different characteristic earthquake magnitudes on three faults with individual recurrence intervals of either 800 or 1600 years. These recurrence intervals imply that large earthquakes occur on one of these faults every 266-533 years, similar to the rate of historic large earthquakes in this region during the past two centuries and for earthquakes in intraplate environments like the New Madrid region in the central United States. If one assumes a recurrence interval of 800 years for large earthquakes on each of three local faults, the peak ground accelerations (PGA; horizontal) and 1-Hz spectral acceleration ground motions (5% damping) are greater than 1 g over a broad region for a 2% probability of exceedance in 50 years' hazard level. These probabilistic PGAs at this hazard level are similar to median deterministic ground motions. The PGAs for 10% in 50 years' hazard level are considerably lower, generally ranging between 0.2 g and 0.7 g across northwestern Gujarat. Ground motions calculated from our models that consider fault interevent times of 800 years are considerably higher than other published models even though they imply similar recurrence intervals. These higher ground motions are mainly caused by the application of intraplate attenuation relations, which account for less severe attenuation of seismic waves when compared to the crustal interplate relations used in these previous studies. For sites in Bhuj and Ahmedabad, magnitude (M) 7 3/4 earthquakes contribute most to the PGA and the 0.2- and 1-s spectral acceleration ground motion maps at the two considered hazard levels. © 2004 Elsevier B.V. All rights reserved.
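The two hazard levels quoted (2% and 10% probability of exceedance in 50 years) map to mean return periods under the usual Poisson occurrence assumption; a quick sketch:

```python
import math

def return_period(p_exceed, t_years):
    """Mean return period (years) for exceedance probability p_exceed
    in t_years, assuming Poisson occurrence: p = 1 - exp(-t / T)."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.02, 50)))  # 2% in 50 yr → 2475
print(round(return_period(0.10, 50)))  # 10% in 50 yr → 475
```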

  15. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 25. Parameters for Specifying Intensity-Related Earthquake Ground Motions. (United States)


    [Abstract not recoverable: the record text consists of garbled table fragments relating Modified Mercalli intensity to predominant period, duration, and vertical-to-horizontal (V/H) ratios of acceleration, velocity, and displacement, together with citation fragments (including "and Sponheuer, W. 1969. Scale of Seismic Intensity: Proc. Fourth World Conf. on Earthquake Engineering, Santiago, Chile" and "Murphy, J. R., and O'Brien, L.").]

  16. Geomorphological Displacement as a Combined Process of Tectonics and Mass-Movement in the 2011 East Japan Earthquake(Mega-Earthquake and Geomorphic Hazards)


    Hiroshi, Yagi; Faculty of Education, Arts and Sciences, Yamagata University


    The triggered earthquake of M7.0 occurred in the vicinity of Iwaki City on 11 April 2011. It caused co-seismic surface rupture along the pre-existing Idosawa and Yunotake faults and showed normal faulting with a slight strike-slip sense. However, the strike-slip senses along those faults are not consistent, which implies that converse plunge toward the midpoint along the Idosawa fault was causing sag and drag depression. Such phenomena are attributed to a local tensile stress field induced by quick shift of ...

  17. Analysis of Earthquake Recordings Obtained from the Seafloor Earthquake Measurement System (SEMS) Instruments Deployed off the Coast of Southern California (United States)

    Boore, D.M.; Smith, C.E.


    For more than 20 years, a program has been underway to obtain records of earthquake shaking on the seafloor at sites offshore of southern California, near oil platforms. The primary goal of the program is to obtain data that can help determine if ground motions at offshore sites are significantly different than those at onshore sites; if so, caution may be necessary in using onshore motions as the basis for the seismic design of oil platforms. We analyze data from eight earthquakes recorded at six offshore sites; these are the most important data recorded on these stations to date. Seven of the earthquakes were recorded at only one offshore station; the eighth event was recorded at two sites. The earthquakes range in magnitude from 4.7 to 6.1. Because of the scarcity of multiple recordings from any one event, most of the analysis is based on the ratio of spectra from vertical and horizontal components of motion. The results clearly show that the offshore motions have very low vertical motions compared to those from an average onshore site, particularly at short periods. Theoretical calculations find that the water layer has little effect on the horizontal components of motion but that it produces a strong spectral null on the vertical component at the resonant frequency of P waves in the water layer. The vertical-to-horizontal ratios for a few selected onshore sites underlain by relatively low shear-wave velocities are similar to the ratios from offshore sites for frequencies less than about one-half the water layer P-wave resonant frequency, suggesting that the shear-wave velocities beneath a site are more important than the water layer in determining the character of the ground motions at lower frequencies.
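The two quantities central to this analysis, the vertical-to-horizontal spectral ratio and the water-layer P-wave resonance frequency, can be sketched as follows. This assumes a water P-wave velocity of 1500 m/s and quarter-wavelength resonance f0 = Vp/(4h); the spectral smoothing applied in practice is omitted:

```python
import numpy as np

def water_layer_resonance(depth_m, vp_water=1500.0):
    """Fundamental P-wave resonance of the water layer: f0 = Vp / (4 h)."""
    return vp_water / (4.0 * depth_m)

def vh_spectral_ratio(vert, horiz, dt):
    """Unsmoothed ratio of Fourier amplitude spectra |V(f)| / |H(f)|."""
    freqs = np.fft.rfftfreq(len(vert), dt)
    ratio = np.abs(np.fft.rfft(vert)) / (np.abs(np.fft.rfft(horiz)) + 1e-12)
    return freqs, ratio

# For a site under 100 m of water, the vertical spectral null sits near:
print(water_layer_resonance(100.0))  # → 3.75 (Hz)
```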

  18. Paleo-earthquake signatures from the South Wagad Fault (SWF), Wagad Island, Kachchh, Gujarat, western India: A potential seismic hazard (United States)

    Malik, Javed N.; Gadhavi, Mahendrasinh S.; Kothyari, Girish Ch; Satuluri, Sravanthi


    In the last 500 years, Kachchh has experienced several large-magnitude earthquakes (6.0 ≤ M ≤ 7.8); however, not all were accompanied by surface rupture. The 1819 Allah Bund earthquake (Mw7.8) was accompanied by surface rupture, whereas the 2001 Bhuj event (Mw7.6) occurred at a depth of 23 km on an E-W striking, south-dipping thrust fault that remained blind. Discontinuities between the denser, brittle basement (?) and the overlying softer, ductile Mesozoic-Tertiary-Quaternary succession resulted in different faulting geometries. Normal faults associated with the rift were reactivated as reverse faults during inversion tectonics, propagated into the sedimentary succession, and were arrested. Thrust ramps that developed along the discontinuities were accompanied by surface ruptures. Folded structures along the South Wagad Fault (SWF), an active thrust, exhibit lateral propagation and linkage of fold segments, suggestive of fault-related fold growth. Paleoseismic investigations revealed evidence of at least three paleo-earthquakes. Event I occurred before BCE 5080; Event II, between BCE 4820 and 2320, was probably responsible for massive damage at Dholavira, a Harappan site. Event III, between BCE 1230 and 04, most likely caused severe damage to Dholavira. An Archaeo-seismological Quality Factor (AQF) of 0.5 suggests that Dholavira is vulnerable to earthquakes from nearby active faults. With a recurrence interval of 1500-2000 yr, the occurrence of a large-magnitude earthquake on the SWF cannot be ruled out.

  19. Hazardous-waste analysis plan for LLNL operations

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R.S.


    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes, and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan addresses all hazardous waste.

  20. Landslide hazards and systems analysis: A Central European perspective (United States)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas


    Part of the problem with assessing landslide hazards is understanding the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments with the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that were transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures and rising government debt. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits is starting to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  1. Towards a probabilistic tsunami hazard analysis for the Gulf of Cadiz (United States)

    Løvholt, Finn; Urgeles, Roger


    Landslides and volcanic flank collapses constitute a significant portion of all known tsunami sources, and they are less constrained geographically than earthquakes as they are not tied to large fault zones. While landslides have mostly produced local tsunamis historically, prehistoric evidence shows that landslides can also produce ocean-wide tsunamis. Because the landslide-induced tsunami probability is more difficult to quantify than that induced by earthquakes, the landslide tsunami hazard is less well understood. To improve our understanding and methodologies for dealing with this hazard, we present results and methods for a preliminary landslide probabilistic tsunami hazard assessment (LPTHA) for submerged landslides in the Gulf of Cadiz. The literature on LPTHA is sparse, and studies have so far fallen into two groups: the first based on observed magnitude-frequency distributions (MFDs), the second based on simplified geotechnical slope stability analysis. We argue that the MFD-based approach is best suited when a sufficient amount of data covering a wide range of volumes is available, although uncertainties in the dating of the landslides often represent a potentially large source of bias. To this end, the relatively rich availability of landslide data in the Gulf of Cadiz makes this area suitable for developing and testing LPTHA models. In the presentation, we will first explore the landslide data and statistics, including spatial factors such as slope versus volume relationships, faults, etc. Examples of how random realizations can be used to distribute tsunami sources over the study area will be demonstrated. Furthermore, computational strategies for simulating both the landslide and the tsunami generation in a simplified way will be described. 
To this end, we use a depth-averaged viscoplastic landslide model coupled to a numerical tsunami model to represent a set of idealized tsunami sources, which are in turn
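For the MFD-based branch of such an LPTHA, random landslide volumes must be drawn from an observed magnitude-frequency distribution. A minimal sketch using inverse-transform sampling of a truncated power law; the exponent and volume bounds are hypothetical, not values fitted to the Gulf of Cadiz dataset:

```python
import random

def sample_landslide_volume(v_min, v_max, beta, rng=random):
    """One volume draw from a truncated power-law MFD with survivor
    function S(v) ~ v**(-beta), by inverse-transform sampling of the
    truncated CDF F(v) = (v_min**-b - v**-b) / (v_min**-b - v_max**-b)."""
    u = rng.random()
    a, b = v_min ** (-beta), v_max ** (-beta)
    return (a - u * (a - b)) ** (-1.0 / beta)

# Hypothetical bounds (m^3) and exponent:
rng = random.Random(42)
volumes = [sample_landslide_volume(1e6, 1e9, 0.8, rng) for _ in range(1000)]
```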

  2. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach. (United States)

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G


    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  3. Multiple injuries after earthquakes: a retrospective analysis on 1,871 injured patients from the 2008 Wenchuan earthquake. (United States)

    Lu-Ping, Zhao; Rodriguez-Llanes, Jose Manuel; Qi, Wu; van den Oever, Barbara; Westman, Lina; Albela, Manuel; Liang, Pan; Gao, Chen; De-Sheng, Zhang; Hughes, Melany; von Schreeb, Johan; Guha-Sapir, Debarati


    Multiple injuries have been highlighted as an important clinical dimension of the injury profile following earthquakes, but studies are scarce. We investigated the pattern and combination of injuries among patients with two injuries following the 2008 Wenchuan earthquake. We also described the general injury profile, causes of injury and socio-demographic characteristics of the injured patients. A retrospective hospital-based analysis was conducted of 1,871 earthquake-injured patients, totaling 3,177 injuries, admitted between 12 and 31 May 2008 to the People's Hospital of Deyang city (PHDC). An electronic, webserver-based database with International Classification of Diseases (ICD)-10-based classification of earthquake-related injury diagnoses (IDs), anatomical sites and additional background variables of the inpatients was used. We analyzed this dataset for injury profile and number of injuries per patient. We then included all patients (856) with two injuries for more in-depth analysis. Possible spatial anatomical associations were determined a priori. Cross-tabulation and more complex frequency matrices for combination analyses were used to investigate the injury profile. Of the 1,871 injured patients, 810 (43.3%) presented with a single injury. The rest had multiple injuries; 856 (45.8%) had two, 169 (9.0%) had three, 32 (1.7%) presented with four injuries, while only 4 (0.2%) were diagnosed with five injuries. The injury diagnoses of patients presenting with two injuries showed important anatomical intra-site or neighboring clustering, which explained 49.1% of the combinations. For fractures, the result was even more marked, as spatial clustering explained 57.9% of the association pattern. The most frequent combination of IDs was a double fracture, affecting 20.7% of the two-injury patients (n = 177). Another 108 patients (12.6%) presented with fractures associated with crush injury and organ-soft tissue injury. Of the 3,177 injuries, 1,476 (46.5%) were
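The combination analysis described above, building a frequency matrix over unordered pairs of injury diagnoses, can be sketched with a simple counter; the records below are illustrative, not the Wenchuan data:

```python
from collections import Counter

def combination_matrix(two_injury_patients):
    """Frequency of unordered pairs of injury diagnoses among patients
    admitted with exactly two injuries. Sorting each pair makes
    ('crush', 'fracture') and ('fracture', 'crush') count together."""
    return Counter(tuple(sorted(pair)) for pair in two_injury_patients)

# Illustrative two-injury records:
patients = [("fracture", "fracture"),
            ("crush", "fracture"),
            ("fracture", "fracture"),
            ("fracture", "soft-tissue")]
print(combination_matrix(patients).most_common(1))
# → [(('fracture', 'fracture'), 2)]
```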

  4. Earthquake scenario and probabilistic ground-shaking hazard maps for the Albuquerque-Belen-Santa Fe, New Mexico, corridor (United States)

    Wong, I.; Olig, S.; Dober, M.; Silva, W.; Wright, D.; Thomas, P.; Gregor, N.; Sanford, A.; Lin, K.-W.; Love, D.


    New Mexico's population is concentrated along the corridor that extends from Belen in the south to Española in the north and includes Albuquerque and Santa Fe. The Rio Grande rift, which encompasses the corridor, is a major tectonically, volcanically, and seismically active continental rift in the western U.S. Although only one large earthquake (moment magnitude (M) ≥ 6) has possibly occurred in the New Mexico portion of the rift since 1849, paleoseismic data indicate that prehistoric surface-faulting earthquakes of M 6.5 and greater have occurred on average every 400 yrs on many faults throughout the Rio Grande rift.

  5. The earthquake and tsunami of 1865 November 17: evidence for far-field tsunami hazard from Tonga (United States)

    Okal, Emile A.; Borrero, José; Synolakis, Costas E.


    Historical reports of an earthquake in Tonga in 1865 November identify it as the only event from that subduction zone which generated a far-field tsunami observable without instruments. Run-up heights reached 2 m in Rarotonga and 80 cm in the Marquesas Islands. Hydrodynamic simulations require a moment of 4 × 10^28 dyn·cm, a value significantly larger than previous estimates of the maximum size of earthquake to be expected at the Tonga subduction zone. This warrants an upwards re-evaluation of the tsunami risk from Tonga to the Cook Islands and the various Polynesian chains, which had hitherto been regarded as minor.
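The quoted seismic moment corresponds to a moment magnitude via the standard Hanks-Kanamori relation:

```python
import math

def moment_magnitude(m0_dyn_cm):
    """Hanks-Kanamori relation: Mw = (2/3) * log10(M0[dyn*cm]) - 10.7."""
    return (2.0 / 3.0) * math.log10(m0_dyn_cm) - 10.7

print(round(moment_magnitude(4e28), 2))  # → 8.37
```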

  6. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, J.K.; Chander, H.


    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities be designed, constructed, and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, important application issues are summarized, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  7. Frequency spectrum method-based stress analysis for oil pipelines in earthquake disaster areas. (United States)

    Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao


    When a long distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine if the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary seismic action axial, longitudinal and horizontal displacement directions and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. The designer is able to utilize this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas; therefore, improving the safe operation of the pipeline.

  8. Frequency Spectrum Method-Based Stress Analysis for Oil Pipelines in Earthquake Disaster Areas (United States)

    Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao


    When a long distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine if the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary seismic action axial, longitudinal and horizontal displacement directions and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. The designer is able to utilize this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas; therefore, improving the safe operation of the pipeline. PMID:25692790

  9. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis (United States)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.


    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy

  10. Environmental Impact and Hazards Analysis Critical Control Point ...

    African Journals Online (AJOL)

    Tsire is a local meat delicacy (kebab) in northern Nigeria, which has become popular and widely acceptable throughout the country and even beyond. Three production sites of tsire were evaluated for the environmental impact and hazard analysis critical control point (HACCP) on the microbiological and chemical qualities ...

  11. Development of Hazard Analysis Critical Control Points (HACCP ...

    African Journals Online (AJOL)

    Development of Hazard Analysis Critical Control Points (HACCP) and Enhancement of Microbial Safety Quality during Production of Fermented Legume Based ... Nigerian Food Journal ... Critical control points during production of iru and okpehe, two fermented condiments, were identified in four processors in Nigeria.

  12. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    Energy Technology Data Exchange (ETDEWEB)



    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  13. 14 CFR 417.223 - Flight hazard area analysis. (United States)


    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff to the planned safe flight state of § 417.219(c), including each planned impact, for an orbital... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard areas...

  14. HADES: Microprocessor Hazard Analysis via Formal Verification of Parameterized Systems

    Directory of Open Access Journals (Sweden)

    Lukáš Charvát


    HADES is a fully automated verification tool for pipeline-based microprocessors that aims at flaws caused by improperly handled data hazards. It focuses on single-pipeline microprocessors designed at the register transfer level (RTL) and deals with read-after-write, write-after-write, and write-after-read hazards. HADES combines several techniques, including data-flow analysis, error pattern matching, SMT solving, and abstract regular model checking. It has been successfully tested on several microprocessors for embedded applications.
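The three hazard classes HADES targets can be illustrated with a toy check over instruction read/write register sets; this is a simplified illustration, not HADES's RTL-level data-flow analysis:

```python
def classify_hazards(first, second):
    """Data hazards when `second` issues after `first` in a pipeline.
    Each instruction is a (reads, writes) pair of register-name sets."""
    r1, w1 = first
    r2, w2 = second
    hazards = set()
    if w1 & r2:
        hazards.add("RAW")  # read-after-write: second reads what first writes
    if w1 & w2:
        hazards.add("WAW")  # write-after-write: both write the same register
    if r1 & w2:
        hazards.add("WAR")  # write-after-read: second overwrites first's input
    return hazards

# add r3, r1, r2  followed by  sub r4, r3, r1:
print(classify_hazards(({"r1", "r2"}, {"r3"}), ({"r3", "r1"}, {"r4"})))
# → {'RAW'}
```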

  15. LOCAL SITE CONDITIONS INFLUENCING EARTHQUAKE INTENSITIES AND SECONDARY COLLATERAL IMPACTS IN THE SEA OF MARMARA REGION - Application of Standardized Remote Sensing and GIS-Methods in Detecting Potentially Vulnerable Areas to Earthquakes, Tsunamis and Other Hazards.

    Directory of Open Access Journals (Sweden)

    George Pararas-Carayannis


    The destructive earthquake that struck near the Gulf of Izmit along the North Anatolian fault in Northwest Turkey on August 17, 1999, not only generated a local tsunami that was destructive at Golcuk and other coastal cities in the eastern portion of the enclosed Sea of Marmara, but was also responsible for extensive damage from collateral hazards such as subsidence, landslides, ground liquefaction, soil amplification, compaction and underwater slumping of unconsolidated sediments. This disaster drew attention to the need to identify, in this highly populated region, local conditions that enhance earthquake intensities, tsunami run-up and other collateral disaster impacts. The focus of the present study is to illustrate briefly how standardized remote sensing techniques and GIS methods can help detect areas that are potentially vulnerable, so that disaster mitigation strategies can be implemented more effectively. Apparently, local site conditions exacerbate earthquake intensities and collateral disaster destruction in the Marmara Sea region. However, using remote sensing data, the causal factors can be determined systematically. With proper evaluation of satellite imagery and digital topographic data, specific geomorphologic/topographic settings that enhance disaster impacts can be identified. With a systematic GIS approach based on Digital Elevation Model (DEM) data, geomorphometric parameters that influence the local site conditions can be determined. Digital elevation data, such as SRTM (Shuttle Radar Topography Mission) data with 90 m spatial resolution and ASTER data with 30 m resolution, interpolated up to 15 m, are readily available. Areas with the steepest slopes can be identified from slope-gradient maps. Areas with the highest curvatures, susceptible to landslides, can be identified from curvature maps. Coastal areas below 10 m elevation susceptible to tsunami inundation can be clearly delineated. Height level maps can also help locate

  16. Earthquake Emergency Education in Dushanbe, Tajikistan (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.


    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  17. Multi-scenario analysis: a new hybrid approach to inform earthquake disaster risk planning (United States)

    Robinson, Tom; Rosser, Nick


    Current earthquake risk assessments take one of two approaches: deterministic (scenario) or probabilistic, but both have notable limitations. Deterministic approaches are limited by a focus on a single scenario, as the results of the analysis are only relevant to the scenario selected, which is unlikely to represent the earthquake that occurs next, nor its impacts. Alternatively, probabilistic approaches are sensitive to the completeness of evidence of past earthquakes, which is inadequate in most seismically-active parts of the world. Consequently, earthquake risk assessments have failed to inform planning prior to major earthquakes such as the 2005 Kashmir and 2008 Wenchuan disasters. This study presents a new hybrid approach for earthquake risk assessments that maintains the high detail of deterministic approaches but considers numerous scenarios simultaneously, similar to probabilistic approaches. The aim of such an approach is to identify impacts that recur in multiple scenarios, or impacts that occur irrespective of the given scenario. Such recurring impacts can be considered the most likely consequences to occur in the next earthquake, despite the precise details of the next earthquake remaining unknown. To demonstrate this, we apply the method to Nepal, one of the most seismically at-risk nations in the world. We model 30 different potential earthquake scenarios throughout the country with magnitude ranges 8.6 to 7.0 for three different times of day (night-time, mid-week day-time, weekend day-time) for a total of 90 different scenarios. By combining the results from each scenario for individual districts, we are able to assess which districts are most at risk of losses in the next earthquake. By focussing on fatalities as a percentage of total population, we rank each district by its: (a) median modelled fatalities; (b) percentage of scenarios with >0 fatalities; (c) inter-quartile range of modelled fatalities; and (d) maximum modelled fatalities. 
Combining

  18. Seismic hazard assessment: Issues and alternatives (United States)

    Wang, Z.


    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used inter-changeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been pro-claimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. ?? 2010 Springer Basel AG.

  19. Probabilistic aftershock hazard analysis, two case studies in West and Northwest Iran (United States)

    Ommi, S.; Zafarani, H.


    Aftershock hazard maps contain the essential information for search and rescue process, and re-occupation after a main-shock. Accordingly, the main purposes of this article are to study the aftershock decay parameters and to estimate the expected high-frequency ground motions (i.e., Peak Ground Acceleration (PGA)) for recent large earthquakes in the Iranian plateau. For this aim, the Ahar-Varzaghan doublet earthquake (August 11, 2012; M N =6.5, M N =6.3), and the Ilam (Murmuri) earthquake (August 18, 2014 ; M N =6.2) have been selected. The earthquake catalogue has been collected based on the Gardner and Knopoff (Bull Seismol Soc Am 64(5), 1363-1367, 1974) temporal and spatial windowing technique. The magnitude of completeness and the seismicity parameters (a, b) and the modified Omori law parameters (P, K, C) have been determined for these two earthquakes in the 14, 30, and 60 days after the mainshocks. Also, the temporal changes of parameters (a, b, P, K, C) have been studied. The aftershock hazard maps for the probability of exceedance (33%) have been computed in the time periods of 14, 30, and 60 days after the Ahar-Varzaghan and Ilam (Murmuri) earthquakes. For calculating the expected PGA of aftershocks, the regional and global ground motion prediction equations have been utilized. Amplification factor based on the site classes has also been implied in the calculation of PGA. These aftershock hazard maps show an agreement between the PGAs of large aftershocks and the forecasted PGAs. Also, the significant role of b parameter in the Ilam (Murmuri) probabilistic aftershock hazard maps has been investigated.

  20. Update earthquake risk assessment in Cairo, Egypt (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan


    The Cairo earthquake (12 October 1992; m b = 5.8) is still and after 25 years one of the most painful events and is dug into the Egyptians memory. This is not due to the strength of the earthquake but due to the accompanied losses and damages (561 dead; 10,000 injured and 3000 families lost their homes). Nowadays, the most frequent and important question that should rise is "what if this earthquake is repeated today." In this study, we simulate the same size earthquake (12 October 1992) ground motion shaking and the consequent social-economic impacts in terms of losses and damages. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo including economic and social losses. Generally, the earthquake risk assessment clearly indicates that "the losses and damages may be increased twice or three times" in Cairo compared to the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie in high seismic risks, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are in low seismic risk level. Moreover, the building damage estimations reflect that Gharb is the highest vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated buildings damages are concentrated within the most densely populated (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb) districts. Moreover, about 75 % of casualties are in the same districts. Actually, an earthquake risk assessment for Cairo represents a crucial application of the HAZUS earthquake loss estimation model for risk management. 
Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  1. Hazard analysis of Clostridium perfringens in the Skylab Food System (United States)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.


    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.

  2. Frequency Analysis of Aircraft hazards for License Application

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley


    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.

  3. Earthquake Damage to Transportation Systems (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Earthquakes represent one of the most destructive natural hazards known to man. A serious result of large-magnitude earthquakes is the disruption of transportation...

  4. Site-Specific Earthquake Response Analysis for Portsmouth Gaseous Diffusion Plant, Portsmouth, Ohio (United States)


    paper; L-9T-13 a’’’m I Incluerys biligrphricam eeentSain nDt 1.te-seii arthquake hzrdsos analysis -- hit- Portsmouth.2 Gaseous df Difusion...Lysmer, J. 1987. "Relationsbips Between Soil Conditions and Earthquake Ground Motions in Mexico City in the Earthquake of Sept. 19, 1985," Report EERC-87

  5. Choosing Appropriate Hazards Analysis Techniques For Your Process (United States)


    Study ( HAZOP ); (v) Failure Mode and Effects Analysis (FMEA); (vi ) Fault Tree Analysis; or (vii) An appropriate equivalent methodology.” The safety...CFR 1910.119: ! Checklist ! What-if ! What-if Checklist ! Hazards and Operability Study ( HAZOP ) ! Fault Tree / Logic Diagram ! Failure Modes and...than the other methods and are more appropriate for a simple process. The HAZOP has found much use in the petroleum and chemical industries and the

  6. Standard Compliant Hazard and Threat Analysis for the Automotive Domain


    Kristian Beckers; Jürgen Dürrwang; Dominik Holling


    The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied to...

  7. Singular limit analysis of a model for earthquake faulting

    DEFF Research Database (Denmark)

    Bossolini, Elena; Brøns, Morten; Kristiansen, Kristian Uldall


    In this paper we consider the one dimensional spring-block model describing earthquake faulting. By using geometric singular perturbation theory and the blow-up method we provide a detailed description of the periodicity of the earthquake episodes. In particular, the limit cycles arise from...... of the blow-up method to recover the hyperbolicity. This enables the identification of a new attracting manifold that organises the dynamics at infinity. This in turn leads to the formulation of a conjecture on the behaviour of the limit cycles as the time-scale separation increases. We provide the basic...


    Itoh, Kazuya; Noda, Masashi; Kikkawa, Naotaka; Hori, Tomohito; Tamate, Satoshi; Toyosawa, Yasuo; Suemasa, Naoaki

    Labour accidents in disaster-relief and disaster restoration work following the Niigata Chuetsu Earthquake (2004) and the Niigata Chuetsu-oki Earthquake (2007) were analysed and characterised in order to raise awareness of the risks and hazards in such work. The Niigata Chuetsu-oki Earthquake affected houses and buildings rather than roads and railways, which are generally disrupted due to landslides or slope failures caused by earthquakes. In this scenario, the predominant type of accident is a "fall to lower level," which increases mainly due to the fact that labourers are working to repair houses and buildings. On the other hand, landslides and slope failures were much more prevalent in the Niigata Chuetsu Earthquake, resulting in more accidents occurring in geotechnical works rather than in construction works. Therefore, care should be taken in preventing "fall to lower level" accidents associated with repair work on the roofs of low-rise houses, "cut or abrasion" accidents due to the demolition of damaged houses and "caught in or compressed by equipment" accidents in road works and water and sewage works.

  9. Visual Analysis on Tidal Triggering Earthquake: the 2011 M9.0 Tohoku-Oki Earthquake being a case (United States)

    Wu, Lixin; Mao, Wenfei; Ma, Weiyu; Zheng, Shuo


    The lunar-solar tidal stresses due to the gravitational attraction of the Moon and Sun to the Earth change with time and place inside the lithosphere. Although it is probably at least three orders of magnitude smaller than the tectonic stresses upon which they are superimposed, the tidal stress rates may be two orders of magnitude larger than the tectonic stress rate averaged over the recurrence interval for a seismic region, which is possible to trigger an earthquake as the last straw. For more than a century, researchers have sought to detect the effect of tidal stress on individual earthquake or to make statistical analysis on global earthquake-tide correlations. Unfortunately, not all of the reports on tidal stress triggering earthquake are consistent. Some have found that large and shallow earthquakes are more likely to be triggered by tidal stress than deep and small ones, yet there have been many positive correlations noted for shallow small ones. To observe the possible tidal triggering it is necessary to resolve the tidal stress onto the fault surface or subduction zone to determine if the normal component or shear components are compatible with the fault motion. This analysis requires accurate focal mechanisms and detailed geometry of the fault surface. Although the 2011 Tohoku-Oki Magnitude 9.0 earthquake in Japan was found tend to occur near the time of maximum tidal stress amplitude, the triggering mechanism is not yet clear. After interpolating the irregular triangular network of the subduction zone, produced by Mark Simons et al. (2011), into a regular gridded network, we calculated the tidal stress at each grid in period between 30 days before and 7 days after the great shocking. The dynamical tidal stress were resolved to three components including trend stress, dip stress and normal stress. For the first time, the spatio-temporal evolution of the tidal components on the fault surface or subduction zone are mapped in three dimension. These maps

  10. Analysis of a Possibility of Electromagnetic Earthquake Triggering by Ionospheric Disturbations (United States)

    Novikov, V.; Ruzhin, Y.


    It is well known that some ionospheric perturbations precede strong earthquakes, and there are attempts to detect and apply them as precursors for short-term earthquake prediction. In that case it is assumed that the processes of earthquake preparation in lithosphere can provide disturbances in ionosphere. From another hand, theoretical, field, and laboratory experimental results obtained during implementation of research projects in Russia within recent ten years demonstrated an evidence of artificial electromagnetic triggering of earthquakes, when electric current density provided by special pulsed power systems at the earthquake source depth (5-10 km) is 10^-7 - 10^-8 A/m^2 is comparable with the density of telluric currents induced in the crust by ionospheric disturbances. In this case it may be supposed that some reported preseismic ionosperic anomalies provide triggering effect for earthquake occurrence. To clear the details of ionosphere-lithosphere coupling and a possibility of electromagnetic triggering of seismic events an analysis of ionospheric precursors of earthquakes, statistical analysis of geomagnetic field variations and seismic activity, laboratory studies of dynamics of deformation of stressed rocks under the electromagnetic impact, as well as theoretical analysis of the possible mechanisms of interaction of rocks with electromagnetic field and their verification in laboratory experiments at the special test equipment, which simulates behavior of the fault zone under external triggering factors were catrried out. A model of electromagnetic triggering of seismic events caused by ionospheric electromagnetic perturbations is proposed based on the fluid migration to the fault under critical stressed state due to interaction of conductive fluid with telluric currents and geomagnetic field. A possibility of development of physical method of short-term earthquake prediction based on electromagnetic triggering effects is discussed.


    Directory of Open Access Journals (Sweden)

    Katarzyna CHRUZIK


    Full Text Available International air law imposes an obligation on the part of transport operators to operationalize risk management, and hence develop records of hazards and estimate the level of risk in the respective organization. Air transport is a complex system combining advanced technical systems, operators and procedures. Sources of hazards occur in all of these closely related and mutually interacting areas, which operate in highly dispersed spaces with a short time horizon. A highly important element of risk management is therefore to identify sources of danger, not only in terms of their own internal risks (the source of threats and activation of threats within the same transport organization, but also in the area of common risk (sources of threats beyond the transport system to which the activation of the hazard is related and external risks (sources of threats outside the transport system. The overall risk management of a transport organization should consider all three risk areas. The paper presents an analysis of internal sources of threats to civil air operations and the resulting main risk areas. The article complements a previous paper by the same authors entitled “Analysis of external sources of hazards in civil air operations”.

  12. Far field tsunami simulations of the 1755 Lisbon earthquake: Implications for tsunami hazard to the U.S. East Coast and the Caribbean (United States)

    Barkan, R.; ten Brink, Uri S.; Lin, J.


    The great Lisbon earthquake of November 1st, 1755 with an estimated moment magnitude of 8.5-9.0 was the most destructive earthquake in European history. The associated tsunami run-up was reported to have reached 5-15??m along the Portuguese and Moroccan coasts and the run-up was significant at the Azores and Madeira Island. Run-up reports from a trans-oceanic tsunami were documented in the Caribbean, Brazil and Newfoundland (Canada). No reports were documented along the U.S. East Coast. Many attempts have been made to characterize the 1755 Lisbon earthquake source using geophysical surveys and modeling the near-field earthquake intensity and tsunami effects. Studying far field effects, as presented in this paper, is advantageous in establishing constraints on source location and strike orientation because trans-oceanic tsunamis are less influenced by near source bathymetry and are unaffected by triggered submarine landslides at the source. Source location, fault orientation and bathymetry are the main elements governing transatlantic tsunami propagation to sites along the U.S. East Coast, much more than distance from the source and continental shelf width. Results of our far and near-field tsunami simulations based on relative amplitude comparison limit the earthquake source area to a region located south of the Gorringe Bank in the center of the Horseshoe Plain. This is in contrast with previously suggested sources such as Marqu??s de Pombal Fault, and Gulf of C??diz Fault, which are farther east of the Horseshoe Plain. The earthquake was likely to be a thrust event on a fault striking ~ 345?? and dipping to the ENE as opposed to the suggested earthquake source of the Gorringe Bank Fault, which trends NE-SW. Gorringe Bank, the Madeira-Tore Rise (MTR), and the Azores appear to have acted as topographic scatterers for tsunami energy, shielding most of the U.S. East Coast from the 1755 Lisbon tsunami. Additional simulations to assess tsunami hazard to the U.S. 
East

  13. Uncertainty analysis for seismic hazard in Northern and Central Italy (United States)

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.


    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three input-variables branch points representing alternative values for b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of 95% confidence band and Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli Region (???0.10 g) for PGA and in the Friuli and Central Apennines regions (???0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (???0.15 g) and PGA (???0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for 10% exceedance in 50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennine regions, around 20-30%, than the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.

  14. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system (United States)

    Takarada, S.


    The first Workshop of Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The Accord consists of 10 recommendations like enhancing collaboration, sharing of resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website ( was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked to make the next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high quality volcanic eruption database, which contains compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which only represent a subset of possible future scenarios. Hence, different distributions from the previous deposits are mainly observed due to the differences in

  15. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D. L


    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectral for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimted seismic hazard in this region of the country. 29 refs., 15 tabs.

  16. Dynamic Response Analysis of Cable of Submerged Floating Tunnel under Hydrodynamic Force and Earthquake

    Directory of Open Access Journals (Sweden)

    Zhiwen Wu


    Full Text Available A simplified analysis model of cable for submerged floating tunnel subjected to parametrically excited vibrations in the ocean environment is proposed in this investigation. The equation of motion of the cable is obtained by a mathematical method utilizing the Euler beam theory and the Galerkin method. The hydrodynamic force induced by earthquake excitations is formulated to simulate real seaquake conditions. The random earthquake excitation in the time domain is formulated by the stochastic phase spectrum method. An analytical model for analyzing the cable for submerged floating tunnel subjected to combined hydrodynamic forces and earthquake excitations is then developed. The sensitivity of key parameters including the hydrodynamic, earthquake, and structural parameters on the dynamic response of the cable is investigated and discussed. The present model enables a preliminary examination of the hydrodynamic and seismic behavior of cable for submerged floating tunnel and can provide valuable recommendations for use in design and operation of anchor systems for submerged floating tunnel.

  17. Slope instabilities triggered by the 2011 Lorca earthquake (M{sub w} 5.1): a comparison and revision of hazard assessments of earthquake-triggered landslides in Murcia; Inestabilidades de ladera provocadas por el terremoto de Lorca de 2011 (Mw 5,1): comparacion y revision de estudios de peligrosidad de movimientos de ladera por efecto sismico en Murcia

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez-Peces, M. J.; Garcia-Mayordomo, J.; Martinez-Diaz, J. J.; Tsige, M.


    The Lorca basin has been the object of recent research aimed at studying the phenomenon of earthquake induced landslides and their assessment within the context of different seismic scenarios, bearing in mind the influence of soil and topographical amplification effects. Nevertheless, it was not until the Lorca earthquakes of 11 May 2011 that it became possible to adopt a systematic approach to the problem. We provide here an inventory of slope instabilities triggered by the Lorca earthquakes comprising 100 cases, mainly small rock and soil falls (1 to 100 m{sup 3}). The distribution of these instabilities is compared to two different earthquake-triggered landslide hazard maps: one considering the occurrence of the most probable earthquake for a 475-yr return period in the Lorca basin (M{sub w} = 5.0), which was previously published on the basis of a low-resolution digital elevation model (DEM), and a second one matching the occurrence of the M{sub w} = 5.1 2011 Lorca earthquake, which was undertaken using a higher resolution DEM. The most frequent Newmark displacement values related to the slope failures triggered by the 2011 Lorca earthquakes are smaller than 2 cm in both hazard scenarios and coincide with areas where significant soil and topographical seismic amplification effects have occurred.

  18. Multifractal analysis of earthquakes in Kumaun Himalaya and its ...

    Indian Academy of Sciences (India)

    Himalayan seismicity is related to the continuing northward convergence of the Indian plate against the Eurasian plate. Earthquakes in this region are caused mainly by the release of elastic strain energy. The Himalayan region is governed by a highly complex geodynamic process and is therefore well suited for multifractal ...

  19. Hazardous Materials Routing Study Phase II: Analysis of Hazardous Materials Truck Routes in Proximity to the Dallas Central Business District (United States)


    This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...

  20. Amplification of Earthquake Ground Motions in Washington, DC, and Implications for Hazard Assessments in Central and Eastern North America (United States)

    Pratt, Thomas L.; Horton, J. Wright; Muñoz, Jessica; Hough, Susan E.; Chapman, Martin C.; Olgun, C. Guney


    The extent of damage in Washington, DC, from the 2011 Mw 5.8 Mineral, VA, earthquake was surprising for an epicenter 130 km away; U.S. Geological Survey "Did-You-Feel-It" reports suggest that Atlantic Coastal Plain and other unconsolidated sediments amplified ground motions in the city. We measure this amplification relative to bedrock sites using earthquake signals recorded on a temporary seismometer array. The spectral ratios show strong amplification in the 0.7 to 4 Hz frequency range for sites on sediments. This range overlaps with resonant frequencies of buildings in the city as inferred from their heights, suggesting amplification at frequencies to which many buildings are vulnerable to damage. Our results emphasize that local amplification can raise moderate ground motions to damaging levels in stable continental regions, where low attenuation extends shaking levels over wide areas and unconsolidated deposits on crystalline metamorphic or igneous bedrock can result in strong contrasts in near-surface material properties.
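The spectral ratios described above compare Fourier amplitude spectra of the same earthquake recorded on sediment and on bedrock. A minimal sketch of that standard technique (the smoothing choice and helper name are illustrative; the study's actual processing parameters are not given in the abstract):

```python
import numpy as np

def spectral_ratio(sediment, bedrock, dt, smooth=5):
    """Site amplification estimate: ratio of smoothed Fourier amplitude
    spectra of a sediment-site record to a bedrock reference record."""
    f = np.fft.rfftfreq(len(sediment), dt)
    amp_s = np.abs(np.fft.rfft(sediment))
    amp_b = np.abs(np.fft.rfft(bedrock))
    k = np.ones(smooth) / smooth          # simple boxcar smoothing window
    amp_s = np.convolve(amp_s, k, mode="same")
    amp_b = np.convolve(amp_b, k, mode="same")
    return f, amp_s / np.maximum(amp_b, 1e-12)  # guard against division by ~0
```

Peaks in the returned ratio over, say, 0.7 to 4 Hz would indicate sediment amplification in that band, as reported for the Washington, DC, array.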

  1. Epidemiological analysis of trauma patients following the Lushan earthquake.

    Directory of Open Access Journals (Sweden)

    Li Zhang

    Full Text Available BACKGROUND: A 7.0-magnitude earthquake hit Lushan County in China's Sichuan province on April 20, 2013, resulting in 196 deaths and 11,470 injured. This study was designed to analyze the characteristics of the injuries and the treatment of the seismic victims. METHODS: After the earthquake, an epidemiological survey of injured patients was conducted by the Health Department of Sichuan Province. Survey tools included paper-and-pencil questionnaires and a data management system based on the Access database. Questionnaires were completed based on the medical records of inpatients with earthquake-related injuries; outpatients and inpatients with non-seismic injuries were excluded. A total of 2010 patients from 140 hospitals were included. RESULTS: The most common type of injury was bone fracture (58.3%). Children younger than 10 years of age suffered fewer fractures and chest injuries, but more skin and soft-tissue injuries. Patients older than 80 years were more likely to suffer hip and thigh fractures, pelvis fractures, and chest injuries, whereas adult patients suffered more ankle and foot fractures. A total of 207 cases of calcaneal fracture were due to falls from height related to extreme panic. The most common infection in hospitalized patients was pulmonary infection. A total of 70.5% of patients had limb dysfunction, and 60.1% of this group received rehabilitation. Most patients began rehabilitation within 1 week, and the median duration of rehabilitation was 3 weeks. All seven hospitalized patients who died succumbed to severe traumatic brain injuries; five of them died within 24 h after the earthquake. CONCLUSIONS: Injuries varied as a function of the age of the victim. Because many injuries were caused indirectly by the Lushan earthquake, disaster education is urgently needed to avoid secondary injuries.

  2. Fault zone regulation, seismic hazard, and social vulnerability in Los Angeles, California: Hazard or urban amenity? (United States)

    Toké, Nathan A.; Boone, Christopher G.; Arrowsmith, J. Ramón


    Public perception and regulation of environmental hazards are important factors in the development and configuration of cities. Throughout California, probabilistic seismic hazard mapping and geologic investigations of active faults have spatially quantified earthquake hazard. In Los Angeles, these analyses have informed earthquake engineering, public awareness, the insurance industry, and the government regulation of developments near faults. Understanding the impact of natural hazards regulation on the social and built geography of cities is vital for informing future science and policy directions. We constructed a relative social vulnerability index classification for Los Angeles to examine the social condition within regions of significant seismic hazard, including areas regulated as Alquist-Priolo (AP) Act earthquake fault zones. Despite hazard disclosures, social vulnerability is lowest within AP regulatory zones and vulnerability increases with distance from them. Because the AP Act requires building setbacks from active faults, newer developments in these zones are bisected by parks. Parcel-level analysis demonstrates that homes adjacent to these fault zone parks are the most valuable in their neighborhoods. At a broad scale, a Landsat-based normalized difference vegetation index shows that greenness near AP zones is greater than the rest of the metropolitan area. In the parks-poor city of Los Angeles, fault zone regulation has contributed to the construction of park space within areas of earthquake hazard, thus transforming zones of natural hazard into amenities, attracting populations of relatively high social status, and demonstrating that the distribution of social vulnerability is sometimes more strongly tied to amenities than hazards.

  3. Environmental risk analysis of hazardous material rail transportation. (United States)

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L


    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials. Published by Elsevier B.V.

  4. Environmentally Friendly Solution to Ground Hazards in Design of Bridges in Earthquake Prone Areas Using Timber Piles (United States)

    Sadeghi, H.


    Bridges are major elements of infrastructure in all societies. Their safety and continued serviceability guarantee transportation and emergency access in urban and rural areas. However, these important structures are subject to earthquake-induced damage in both structure and foundations. The basic approaches to the proper support of foundations are (a) distribution of imposed loads to the foundation in a way that it can resist those loads without excessive settlement or failure; (b) modification of the foundation ground with various available methods; and (c) a combination of (a) and (b). The engineer faces the task of designing foundations that meet all safety and serviceability criteria, but when there are numerous environmental and financial constraints, the use of some traditional methods becomes inevitable. This paper explains the application of timber piles to improve ground resistance to liquefaction and to secure the abutments of short- to medium-length bridges in an earthquake- and liquefaction-prone area on Bohol Island, Philippines. The limitations of the common ground improvement methods (i.e., injection, dynamic compaction), arising from either environmental or financial concerns, along with the abundance of timber in the area, led the engineers to use a network of timber piles behind the backwalls of the bridge abutments. The suggested timber pile network is simulated by numerical methods and its safety is examined. The results show that the compaction caused by driving the piles and the bearing capacity provided by the timbers reduce the settlement and lateral movements due to service and earthquake-induced loads.

  5. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances (United States)

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  6. Parameter estimation in Probabilistic Seismic Hazard Analysis: current problems and some solutions (United States)

    Vermeulen, Petrus


    A typical Probabilistic Seismic Hazard Analysis (PSHA) comprises identification of seismic source zones, determination of hazard parameters for these zones, selection of an appropriate ground motion prediction equation (GMPE), and integration over probabilities according to the Cornell-McGuire procedure. Determination of hazard parameters often does not receive the attention it deserves, and problems therein are therefore often overlooked. Here, many of these problems are identified, and some of them addressed. The parameters that need to be determined are those associated with the frequency-magnitude law, those associated with the earthquake recurrence law in time, and the parameters controlling the GMPE. This study is concerned with the frequency-magnitude law and the temporal distribution of earthquakes, not with GMPEs. The Gutenberg-Richter law is usually adopted for the frequency-magnitude relation, and a Poisson process for earthquake recurrence in time. Accordingly, the parameters that need to be determined are the slope parameter of the Gutenberg-Richter frequency-magnitude law, i.e. the b-value, the maximum magnitude mmax up to which the Gutenberg-Richter law applies, and the mean recurrence frequency, λ, of earthquakes. If the "Parametric-Historic procedure" is used instead of the Cornell-McGuire procedure, these parameters need not be known before the PSHA computations; they are estimated directly during the computation. The resulting relation for the frequency of ground motion parameters has a functional form analogous to the frequency-magnitude law, described by a parameter γ (analogous to the b-value of the Gutenberg-Richter law) and the maximum possible ground motion amax (analogous to mmax). Originally, the approach could be applied only to simple GMPEs; recently, however, the method was extended to incorporate more complex forms of GMPEs. With regard to the parameter mmax, there are numerous methods of estimation
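For the b-value mentioned above, the standard estimator (not specific to this paper) is Aki's maximum-likelihood formula, with Utsu's half-bin correction when magnitudes are reported in discrete bins. A minimal sketch:

```python
import numpy as np

def b_value_aki(mags, m_c, dm=0.0):
    """Maximum-likelihood b-value (Aki, 1965) for magnitudes at or above
    the completeness magnitude m_c. Utsu's correction dm/2 accounts for
    magnitudes binned at interval dm (use dm=0 for continuous values)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]                    # keep only the complete part of the catalogue
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
```

For a Gutenberg-Richter catalogue, magnitudes above m_c are exponentially distributed with rate b·ln(10), so the estimator recovers the generating b-value from synthetic data.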

  7. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xueqin [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); School of Social Development and Public Policy, Beijing Normal University, Beijing 100875 (China); Li, Ning [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); Yuan, Shuai, E-mail: [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); Xu, Ning; Shi, Wenqin; Chen, Weibin [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China)


    As a random event, a natural disaster has a complex occurrence mechanism, and the comprehensive analysis of multiple hazard factors is important in disaster risk assessment. To improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered when analysing and calculating multiple factors. Given the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. Comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function fit better at the lower tail of the hazard factors, whereas the three-dimensional Frank copula function fit better at the middle and upper tails. For dust storm disasters with short return periods, however, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme-value fitting. We therefore suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. The exploration of this method also lays the foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • The 2D function allows better fitting results at the lower tail of hazard factors. • The three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to the reality
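The bivariate Frank copula and the "AND" joint return period used in such studies have standard closed forms; a minimal sketch (here `u`, `v` are the marginal non-exceedance probabilities of two hazard factors, `theta` is the fitted dependence parameter, and `mu` is the assumed mean interarrival time of events; fitting `theta` to data is omitted):

```python
import numpy as np

def frank_copula(u, v, theta):
    """Bivariate Frank copula:
    C(u, v) = -(1/theta) * ln(1 + (e^{-theta*u}-1)(e^{-theta*v}-1)/(e^{-theta}-1))."""
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

def joint_return_period_and(u, v, theta, mu=1.0):
    """'AND' joint return period: mean interarrival time mu divided by the
    probability that BOTH factors exceed their thresholds,
    P(U > u, V > v) = 1 - u - v + C(u, v)."""
    p_and = 1.0 - u - v + frank_copula(u, v, theta)
    return mu / p_and
```

In the near-independence limit (theta close to 0) the copula reduces to C(u, v) = u·v, so two 10-year marginal thresholds (u = v = 0.9 with mu = 1 year) give a joint "AND" return period of about 100 years.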

  8. Multi-Hazard Sustainability: Towards Infrastructure Resilience (United States)

    Lin, T.


    Natural and anthropogenic hazards pose significant challenges to civil infrastructure. This presents opportunities in investigating site-specific hazards in structural engineering to aid mitigation and adaptation efforts. This presentation will highlight: (a) recent advances in hazard-consistent ground motion selection methodology for nonlinear dynamic analyses, (b) ongoing efforts in validation of earthquake simulations and their effects on tall buildings, and (c) a pilot study on probabilistic sea-level rise hazard analysis incorporating aleatory and epistemic uncertainties. High performance computing and visualization further facilitate research and outreach to improve resilience under multiple hazards in the face of climate change.

  9. Directivity in NGA earthquake ground motions: Analysis using isochrone theory (United States)

    Spudich, P.; Chiou, B.S.J.


    We present correction factors that may be applied to the ground motion prediction relations of Abrahamson and Silva, Boore and Atkinson, Campbell and Bozorgnia, and Chiou and Youngs (all in this volume) to model the azimuthally varying distribution of the GMRotI50 component of ground motion (commonly called 'directivity') around earthquakes. Our correction factors may be used for planar or nonplanar faults having any dip or slip rake (faulting mechanism). Our correction factors predict directivity-induced variations of spectral acceleration that are roughly half of the strike-slip variations predicted by Somerville et al. (1997), and use of our factors reduces record-to-record sigma by about 2-20% at 5 sec or greater period. ?? 2008, Earthquake Engineering Research Institute.

  10. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    Directory of Open Access Journals (Sweden)

    W. F. Peng


    Full Text Available The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0–2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake, and the directions of deviation from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugate region, although both the probability of appearance and the extent of the anomalies there are smaller than near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.
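The abstract does not spell out its anomaly-detection criterion; a common convention in TEC studies is a sliding-quartile test that flags values outside median ± k·IQR of the preceding days. A minimal sketch under that assumption (window length and k are illustrative, not from the paper):

```python
import numpy as np

def tec_anomalies(tec, window=15, k=1.5):
    """Flag TEC samples lying outside median +/- k*IQR of the preceding
    `window` samples (a common sliding-quartile anomaly criterion).
    Returns +1 for a positive anomaly, -1 for negative, 0 for normal."""
    tec = np.asarray(tec, dtype=float)
    flags = np.zeros(tec.size, dtype=int)
    for i in range(window, tec.size):
        ref = tec[i - window:i]                      # trailing reference window
        q1, med, q3 = np.percentile(ref, [25, 50, 75])
        iqr = q3 - q1
        if tec[i] > med + k * iqr:
            flags[i] = 1
        elif tec[i] < med - k * iqr:
            flags[i] = -1
    return flags
```

Applied to a GIM TEC time series for the grid cell nearest the epicenter, such a test would isolate the positive and negative excursions the study associates with the 0–2 days before each event.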

  11. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro


    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.

  12. [Analysis of patients with bone injury in Wenchuan earthquake]. (United States)

    Yi, Min; Pei, Fu-xing; Song, Yue-ming; Yang, Tian-fu; Huang, Fu-guo; Tu, Chong-qi; Cen, Shi-qiang; Xiang, Zhou; Li, Jian; Liu, Hao; Liu, Lei; Yang, Jing; Wang, Guang-lin; Liu, Li-Min; Shen, Bin; Zhou, Zong-ke; Zeng, Jian-cheng


    To evaluate the patients with bone injury in the Wenchuan earthquake, data from 1410 such patients treated between May 12th and June 15th, 2008 were analyzed to assess clinical intervention and disaster-management experience. The patients ranged in age from 4 to 103 years. In all, 744 cases (52.7%) suffered blunt injuries, 379 (26.9%) buried injuries, and 287 (20.4%) falling injuries; 1317 cases had fractures and 93 had limb soft-tissue injuries; 261 patients had combined injuries of other body parts, including 45 with paralysis; 66 had crush syndrome, 25 gas gangrene, 76 acute kidney failure, and 26 multiple organ failure. A total of 912 operations were performed, including 402 fracture fixations, 224 debridements, 152 debridement-and-suture procedures, 85 amputations, 29 skin grafts, 8 fixations of joint dislocation, 5 surgical flap transplantations, 4 nerve and tendon sutures, 2 arthroscopies, and 1 joint replacement. Of the 66 crush syndrome patients, 49 received continuous renal replacement therapy; among these, 9 cases bled from named arteries and 20 blood vessels became embolized. One of the 1410 patients died, of multiple organ failure. Among the patients with bone injury in the Wenchuan earthquake, elderly patients outnumbered the young; injuries were often accompanied by other complications; open injuries were severely contaminated; crush syndrome was difficult to manage; and paraplegia cases were few, whereas amputees were more numerous.

  13. Ground Liquefaction and Deformation Analysis of Breakwater Structures Under Earthquakes

    Directory of Open Access Journals (Sweden)

    Zhao Jie


    Full Text Available Ground liquefaction and deformation is one of the important causes of damage to engineering structures. China's current code for seismic design of breakwaters, like the code for port and waterway engineering, is based on a single-level seismic design method. However, this code cannot accurately reflect the seismic performance of breakwater structures subjected to different seismic intensities. In this paper, the author used the finite difference software FLAC3D to analyze the state of, and compute the seismic responses of, a breakwater structure. The pore pressure ratio and displacement of the breakwater foundation under different earthquakes were studied. The results show that smaller earthquakes have little influence on the serviceability of the foundation, whereas severe earthquakes can liquefy parts of it; in the latter case, obvious changes in pore pressure and foundation displacement are found. In particular, when the seismic peak acceleration reaches 0.2g, liquefaction appears in the foundation, concentrated mainly on the upper right side of the structure. In addition, the survey of excess pore pressure and displacement values in the sand layers of the breakwater shows that when the excess pore pressure ratio nears 1.0, displacement and overturning of the structure are relatively large, resulting in varying degrees of damage. This research can provide theoretical and design references for similar engineering structures.

  14. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco


    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? 
Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  15. Flood Hazard and Risk Analysis in Urban Area (United States)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien


    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about quality of life but also about the safety of life and property, so the impact of disasters on life and property is the problem of greatest concern to residents. For mitigating disaster impacts, flood hazard and risk analysis plays an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung City was evaluated using statistics on social development factors. The hazard factors of Kaohsiung City were calculated from simulated flood depths for six different return periods and four typhoon events that caused serious flooding in Kaohsiung City. The flood risk was then obtained by combining the flood hazard and social vulnerability. The analysis results help the authorities strengthen disaster preparedness and allocate more resources to high-risk areas.

  16. Fault roughness and strength heterogeneity control earthquake size and stress drop

    KAUST Repository

    Zielke, Olaf


    An earthquake's stress drop is related to the frictional breakdown during sliding and constitutes a fundamental quantity of the rupture process. High-speed laboratory friction experiments that emulate the rupture process imply stress drop values that greatly exceed those commonly reported for natural earthquakes. We hypothesize that this stress drop discrepancy is due to fault-surface roughness and strength heterogeneity: an earthquake's moment release and its recurrence probability depend not only on stress drop and rupture dimension but also on the geometric roughness of the ruptured fault and the location of failing strength asperities along it. Using large-scale numerical simulations of earthquake ruptures under varying roughness and strength conditions, we verify our hypothesis, showing that smoother faults may generate larger earthquakes than rougher faults under identical tectonic loading conditions. We further discuss the potential impact of fault roughness on earthquake recurrence probability. This finding provides important information for seismic hazard analysis as well.
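For context on the stress drop values discussed above: the quantity is conventionally estimated from seismic moment and rupture dimension via Eshelby's circular-crack relation (a standard seismological formula, not this study's simulation method). A minimal sketch:

```python
def moment_from_mw(mw):
    """Seismic moment M0 in N*m from moment magnitude (Hanks & Kanamori, 1979):
    Mw = (2/3) * (log10(M0) - 9.05)."""
    return 10.0 ** (1.5 * mw + 9.05)

def stress_drop_circular(m0, radius_m):
    """Static stress drop in Pa for a circular crack of radius radius_m
    (Eshelby, 1957): delta_sigma = (7/16) * M0 / r^3."""
    return 7.0 / 16.0 * m0 / radius_m ** 3
```

For example, an Mw 6.0 event with a 3 km rupture radius gives a stress drop of roughly 18 MPa; typical values reported for natural earthquakes fall in the 0.1 to 100 MPa range, well below the laboratory-derived values the abstract contrasts them with.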

  17. Northern Caribbean Tsunami Hazard: Earthquake and Gravity Source Contribution of the Tsunami of 2010 in Haïti (United States)

    Poupardin, Adrien; Hébert, Hélène; Calais, Eric; Gailler, Audrey


    The Mw 7 earthquake of January 12, 2010, in Haïti was followed by a tsunami with wave heights reaching 3 m in some locations (Grand Goâve, Jacmel) on either side of the Presqu'Ile du Sud, where the event took place. The tsunami was also recorded at DART buoy 42407 (about 600 km southeast of the earthquake source) and at a tide gauge in Santo Domingo (Dominican Republic). In the hours following the event, the National Earthquake Information Center (NEIC) suggested rupture of a south-dipping segment of the Enriquillo-Plantain Garden fault (EPGF). Fritz et al. (2013) used the NEIC source model to simulate the tsunami height and match coastal run-up measurements and DART data by (1) increasing coseismic slip on the EPGF while keeping a constant Mo by scaling the regional rigidity, and (2) invoking a coastal submarine landslide in addition to ground motion. Since then, several studies have considerably improved our understanding of the 2010 Haiti earthquake source using GPS, InSAR, seismological, geological, and/or teleseismic data (Meng et al., 2012; Hayes et al., 2010; Symithe et al., 2013). All show that rupture occurred on a north-dipping blind fault (Leogâne fault), with one third of its moment expressed by reverse motion and up to 60 cm of coastal uplift. Here we revisit the January 12, 2010 Haiti tsunami by modeling run-up heights, DART, and tide gauge observations using these recent source models as input parameters. We propagate the tsunami using a non-linear shallow water model able to account for the shoaling effect thanks to nested bathymetric grids. Simulations indicate run-up heights much lower than observed (1) in Grand Goâve Bay, consistent with the hypothesis of a landslide-triggered tsunami at this location, and (2) along the southern coast of Hispaniola and at the DART buoy, although closest to the observations when using Symithe et al.'s source model. We also find wave heights up to 1 m in Port-au-Prince (harbor and coastal shantytowns) when using

  18. REASSESSMENT OF TSUNAMI HAZARD IN THE CITY OF IQUIQUE, CHILE, AFTER THE PISAGUA EARTHQUAKE OF APRIL 2014 (United States)

    Cienfuegos, R.; Suarez, L.; Aránguiz, R.; Gonzalez, G.; González-Carrasco, J. F.; Catalan, P. A.; Dominguez, J. C.; Tomita, T.


    On April 1st, 2014, an Mw 8.1 earthquake occurred at 23:46:50 UTC (20:46:50 local time), with its epicenter located off the coast of Pisagua, 68 km north of the city of Iquique (An et al., 2014). The potential risk of earthquake and tsunami in this area was widely recognized by the scientific community (Chlieh et al., 2004). Nevertheless, the energy released by this earthquake and the associated slip distribution were much smaller than expected. In the present contribution, we reassess the tsunami hazard for the north of Chile taking into account the occurrence of the recent events, focusing on the potential impact that a worst-case scenario could produce in the city of Iquique. For that purpose, an updated tsunami source is derived using updated information on the seismic and co-seismic tectonic displacements available from historical and geological information and from the dense GPS and seismometer networks in the north of Chile. The updated tsunami source is used to generate initial conditions for a tsunami and to analyze the following aspects: (i) large-scale hydrodynamics; (ii) arrival times, maximum flow depths, and inundation area; and (iii) the potential impact on the port of Iquique, and more specifically the drift of containers that the tsunami could produce. This analysis is essential to reassess the tsunami hazard in Iquique and to evaluate evacuation plans and mitigation options regarding port operation. Tsunami propagation and inundation are simulated using the STOC model (Tomita and Honda, 2010) and a high-resolution Lidar topographic database. References: An, C. et al. (2014). Tsunami source and its validation of the 2014 Iquique, Chile Earthquake, Geophys. Res. Lett., 41, doi:10.1002/2014GL060567. Chlieh et al. (2004). Crustal deformation and fault slip during the seismic cycle in the north Chile subduction zone, from GPS and InSAR observations, Geophys. J. Int., 158(2), 695-711, doi:10.1111/j.1365-246X.2004.02326.x. Tomita, T., & Honda, K. (2010

  19. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Sevougian, S. David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drill string), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  20. Rupture characteristics of the 2016 Meinong earthquake revealed by the back projection and directivity analysis of teleseismic broadband waveforms (United States)

    Jian, Pei-Ru; Hung, Shu-Huei; Meng, Lingsen; Sun, Daoyuan


    The 2016 Mw 6.4 Meinong earthquake struck a previously unrecognized fault zone in the midcrust beneath southern Taiwan and inflicted heavy casualties in the populated Tainan City, about 30 km northwest of the epicenter. Because of its relatively short rupture duration and P wave trains contaminated by large-amplitude depth phases and reverberations generated in the source region, accurate characterization of the rupture process and source properties of such a shallow strong earthquake remains challenging. Here we present the first high-resolution MUltiple SIgnal Classification back projection source image, obtained using both P and depth-phase sP waves recorded at two large and dense arrays, to understand the source behavior and consequent hazards of this peculiar catastrophic event. The results, further corroborated by directivity analysis, indicate a unilateral rupture propagating northwestward and slightly downward on the shallow NE-dipping fault plane. The source radiation process is primarily characterized by a single peak of 7 s duration, with a total rupture length of 17 km and an average rupture speed of 2.4 km/s. The rupture terminated immediately east of the prominent off-fault aftershock cluster about 20 km northwest of the hypocenter. Synergistic amplification of ground shaking by the directivity and strong excitation of sP and reverberations mainly caused the destruction concentrated in the area farther to the northwest, away from the rupture zone.
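The array-processing idea behind back projection can be illustrated with a plain delay-and-sum stack (a simplified sketch, not the MUSIC formulation used in the study; the grids, delays, and waveforms below are synthetic):

```python
import numpy as np

def back_project(waveforms, station_delays, dt):
    """Delay-and-sum back projection: for each candidate source grid
    point, align the station waveforms by the predicted travel-time
    delays and measure the coherent stack power.

    waveforms      : (n_stations, n_samples) array of recordings
    station_delays : (n_grid, n_stations) predicted delays in seconds
    dt             : sample interval in seconds
    Returns an (n_grid,) array of stack power; the maximum marks the
    most likely radiator location for this time window.
    """
    n_grid, n_sta = station_delays.shape
    n_samp = waveforms.shape[1]
    power = np.zeros(n_grid)
    for g in range(n_grid):
        stack = np.zeros(n_samp)
        for s in range(n_sta):
            # undo the predicted delay so a true source at g stacks coherently
            shift = int(round(station_delays[g, s] / dt))
            stack += np.roll(waveforms[s], -shift)
        power[g] = np.sum(stack ** 2)
    return power
```

In practice the stack is computed in sliding time windows, so the power maximum traces the rupture front through space and time.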

  1. Flash-sourcing or the rapid detection and characterisation of earthquake effects through clickstream data analysis (United States)

    Bossu, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.


    Rapid characterisation of earthquake effects is essential for a timely and appropriate response in favour of victims and/or of eyewitnesses. In the case of damaging earthquakes, any field observations that can fill the information gap characterising their immediate aftermath can contribute to more efficient rescue operations. This paper presents the latest developments of a method called "flash-sourcing" addressing these issues. It relies on eyewitnesses, the first informed and the first concerned by an earthquake occurrence. More precisely, their use of the EMSC earthquake information website is analysed in real time to map the area where the earthquake was felt and to identify, at least under certain circumstances, zones of widespread damage. The approach is based on the natural and immediate convergence of eyewitnesses on the website, who rush to the Internet to investigate the cause of the shaking they just felt, causing our traffic to increase. The area where an earthquake was felt is mapped simply by locating Internet Protocol (IP) addresses during traffic surges. In addition, the presence of eyewitnesses browsing our website within minutes of an earthquake occurrence excludes the possibility of widespread damage in the localities they originate from: in case of severe damage, the networks would be down. The validity of the information derived from this clickstream analysis is confirmed by comparisons with EMS98 macroseismic maps obtained from online questionnaires. The name of this approach, "flash-sourcing", is a combination of "flash-crowd" and "crowdsourcing", intending to reflect the rapidity of the data collation from the public. For computer scientists, a flash-crowd names a traffic surge on a website. Crowdsourcing means work being done by a "crowd" of people; it also characterises Internet and mobile applications collecting information from the public, such as online macroseismic questionnaires. Like crowdsourcing techniques, flash-sourcing is a
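The surge-detection step of flash-sourcing can be sketched as a simple anomaly test on geolocated hit counts (an illustrative toy, not the EMSC implementation; the locality names, baseline rates, and threshold are hypothetical):

```python
from collections import Counter

def detect_felt_regions(hits, baseline, threshold=5.0):
    """Flag localities whose post-event hit count exceeds a multiple
    of their normal baseline traffic -- a toy version of mapping a
    'felt area' from an IP-geolocated traffic surge.

    hits     : list of locality names, one per website hit in the
               minutes following an earthquake (already geolocated)
    baseline : dict mapping locality -> expected hits per time window
    Returns the set of localities with an anomalous surge.
    """
    counts = Counter(hits)
    return {loc for loc, n in counts.items()
            if n > threshold * baseline.get(loc, 1.0)}
```

Conversely, a locality whose traffic stays at baseline shortly after a strong event would, per the paper's argument, be a candidate for network outage and hence possible damage.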

  2. Introduction to Plate Boundaries and Natural Hazards

    NARCIS (Netherlands)

    Duarte, João C.; Schellart, Wouter P.


    A great variety of natural hazards occur on Earth, including earthquakes, volcanic eruptions, tsunamis, landslides, floods, fires, tornadoes, hurricanes, and avalanches. The most destructive of these hazards, earthquakes, tsunamis, and volcanic eruptions, are mostly associated with tectonic plate boundaries.

  3. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan. (United States)


    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each...

  4. Safety analysis of contained low-hazard biotechnology applications. (United States)

    Pettauer, D; Käppeli, O; van den Eede, G


    A technical safety analysis has been performed on a containment-level-2 pilot plant in order to assess an upgrading of the existing facility, which should comply with good manufacturing practices. The results were obtained by employing the hazard and operability (HAZOP) assessment method and are discussed in the light of the appropriateness of this procedural tool for low-hazard biotechnology applications. The potential release of micro-organisms accounts only for a minor part of the hazardous consequences. However, in certain cases the release of a large or moderate amount of micro-organisms would not be immediately identified. Most of the actions required to avoid these consequences fall into the realm of operational procedures. As a major part of potential failures result from human errors, standard operating procedures play a prominent role when establishing the concept of safety management. The HAZOP assessment method was found to be adequate for the type of process under investigation. The results also may be used for the generation of checklists which, in most cases, are sufficient for routine safety assurance.

  5. Tree-ring analysis in natural hazards research - an overview (United States)

    Stoffel, M.; Bollschweiler, M.


    The understanding of geomorphic processes and knowledge of past events are important tasks for the assessment of natural hazards. Tree rings have on varied occasions proved to be a reliable tool for the acquisition of data on past events. In this review paper, we provide an overview of the use of tree rings in natural hazards research, starting with a description of the different types of disturbances caused by geomorphic processes and the resulting growth reactions. Thereafter, a summary is presented of the different methods commonly used for the analysis and interpretation of reactions in affected trees. We illustrate selected results from dendrogeomorphological investigations of geomorphic processes, with an emphasis on fluvial (e.g., flooding, debris flows) and mass-movement processes (e.g., landslides, snow avalanches), for which a wealth of data has been generated over the past few decades. We also present results from rockfall and permafrost studies, where data are much scarcer, although tree-ring studies have proved to be of great value in these fields as well. Most studies using tree rings have focused on alpine environments in Europe and North America, whereas other parts of the world have so far been widely neglected by dendrogeomorphologists. We therefore challenge researchers to focus on regions with other, distinct climates, to examine less frequently studied processes, and to broaden and improve the approaches and methods commonly used in tree-ring research, so as to allow a better understanding of geomorphic processes, natural hazards and risk.

  6. Hazardous materials transportation: a risk-analysis-based routing methodology. (United States)

    Leonelli, P; Bonvicini, S; Spadoni, G


    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; arc costs are then defined to take into account both out-of-pocket transportation expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed and, finally, further research developments are proposed.
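For a single origin-destination shipment, the minimum cost flow formulation described above reduces to a capacity-constrained shortest path, which can be sketched as follows (a simplified stand-in for the OPTIPATH formulation; the graph, costs, and capacities are hypothetical):

```python
import heapq

def cheapest_route(graph, origin, dest, shipment):
    """Dijkstra search over risk-adjusted arc costs, excluding arcs
    whose risk-based capacity cannot carry the shipment.

    graph    : {node: [(neighbor, cost, capacity), ...]} where cost
               already combines out-of-pocket and risk-related terms
               and capacity is the max vehicles allowed by the risk
               criteria on that arc
    shipment : number of vehicles to route
    Returns (total_cost, path) or (None, []) if no feasible route.
    """
    pq = [(0.0, origin, [origin])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dest:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, arc_cost, cap in graph.get(node, []):
            if cap >= shipment and nbr not in seen:
                heapq.heappush(pq, (cost + arc_cost, nbr, path + [nbr]))
    return None, []
```

Note how the capacity filter changes the optimum: a large shipment may be forced off the cheapest arcs onto a longer but lower-risk route.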

  7. Using SAR and GPS for Hazard Management and Response: Progress and Examples from the Advanced Rapid Imaging and Analysis (ARIA) Project (United States)

    Owen, S. E.; Simons, M.; Hua, H.; Yun, S. H.; Agram, P. S.; Milillo, P.; Sacco, G. F.; Webb, F.; Rosen, P. A.; Lundgren, P.; Milillo, G.; Manipon, G. J. M.; Moore, A. W.; Liu, Z.; Polet, J.; Cruz, J.


    ARIA is a joint JPL/Caltech project to automate synthetic aperture radar (SAR) and GPS imaging capabilities for scientific understanding, hazard response, and societal benefit. We have built a prototype SAR and GPS data system that forms the foundation for hazard monitoring and response capability, as well as providing imaging capabilities important for science studies. Together, InSAR and GPS have the ability to capture surface deformation in high spatial and temporal resolution. For earthquakes, this deformation provides information that is complementary to seismic data on location, geometry and magnitude of earthquakes. Accurate location information is critical for understanding the regions affected by damaging shaking. Regular surface deformation measurements from SAR and GPS are useful for monitoring changes related to many processes that are important for hazard and resource management such as volcanic deformation, groundwater withdrawal, and landsliding. Observations of SAR coherence change have a demonstrated use for damage assessment for hazards such as earthquakes, tsunamis, hurricanes, and volcanic eruptions. These damage assessment maps can be made from imagery taken day or night and are not affected by clouds, making them valuable complements to optical imagery. The coherence change caused by the damage from hazards (building collapse, flooding, ash fall) is also detectable with intelligent algorithms, allowing for rapid generation of damage assessment maps over large areas at fine resolution, down to the spatial scale of single family homes. We will present the progress and results we have made on automating the analysis of SAR data for hazard monitoring and response using data from the Italian Space Agency's (ASI) COSMO-SkyMed constellation of X-band SAR satellites. Since the beginning of our project with ASI, our team has imaged deformation and coherence change caused by many natural hazard events around the world. We will present progress on our

  8. Hazard function theory for nonstationary natural hazards (United States)

    Read, Laura K.; Vogel, Richard M.


    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
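For GPD-distributed exceedance magnitudes, the hazard rate in the magnitude domain has a simple closed form, h(x) = 1/(sigma + xi*x), which can be checked numerically (a sketch under standard GPD notation with scale sigma and shape xi; this illustrates the distributional building block, not the paper's full derivation relating the event series X to the failure-time series T):

```python
def gpd_survival(x, sigma, xi):
    """Exceedance probability P(X > x) for the generalized Pareto
    distribution (xi != 0), with scale sigma > 0 and shape xi."""
    return (1.0 + xi * x / sigma) ** (-1.0 / xi)

def gpd_hazard(x, sigma, xi):
    """Hazard rate h(x) = f(x) / S(x) for GPD exceedances.
    It simplifies to 1 / (sigma + xi * x): decreasing in x for
    xi > 0 (heavy tail), increasing for xi < 0 (bounded tail)."""
    return 1.0 / (sigma + xi * x)
```

The sign of the shape parameter thus controls whether the conditional risk of a yet-larger event grows or shrinks with magnitude, which is exactly the kind of behavior the hazard-function framework makes explicit.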

  9. Analysis of a school building damaged by the 2015 Ranau earthquake Malaysia (United States)

    Takano, Shugo; Saito, Taiki


    On June 5th, 2015, a severe earthquake with a moment magnitude of 6.0 occurred in Ranau, Malaysia, at a focal depth of 10 km. Due to the earthquake, many facilities were damaged and 18 people were killed by rockfalls [1]. Because the British Standard (BS) is adopted as the regulation for building construction in Malaysia, seismic forces are not considered in structural design, so the seismic resistance of Malaysian buildings is unclear. To secure human life and building safety, it is important to assess the seismic resistance of buildings. The objective of this study is to evaluate the seismic resistance of existing buildings in Malaysia built to the British Standard. A school building that was damaged in the Ranau earthquake is selected as the target building. The building has four stories, with the ground floor designed as a parking space for the staff. The structural type is infilled masonry, where the main frame is composed of reinforced concrete columns and beams with brick installed inside the frame as walls. Analysis is performed using the STERA_3D software, a tool for analyzing the seismic performance of buildings developed by one of the authors. Firstly, the natural period of the building is calculated and compared with the result of micro-tremor measurement. Secondly, nonlinear push-over analysis is conducted to evaluate the horizontal load-bearing capacity of the building. Thirdly, earthquake response analysis is conducted using the time-history acceleration data measured during the Ranau earthquake by the seismograph installed at Kota Kinabalu. By comparing the results of the earthquake response analysis with the actual damage to the building, the cause of the damage is clarified.
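The time-history response step can be illustrated with a linear single-degree-of-freedom oscillator integrated by the Newmark average-acceleration method (a generic sketch of earthquake response analysis, not the nonlinear STERA_3D model used in the study; the record and parameters are illustrative):

```python
import math

def newmark_sdof(accel, dt, period, damping=0.05):
    """Peak relative displacement (m) of a linear-elastic SDOF
    oscillator of unit mass under ground acceleration accel (m/s^2),
    integrated with Newmark beta = 1/4, gamma = 1/2 (unconditionally
    stable average-acceleration scheme)."""
    wn = 2.0 * math.pi / period            # natural circular frequency
    k, c = wn ** 2, 2.0 * damping * wn     # unit-mass stiffness, damping
    beta, gamma = 0.25, 0.5
    u = v = 0.0
    a = -accel[0]                          # initial equation of motion
    keff = k + gamma * c / (beta * dt) + 1.0 / (beta * dt ** 2)
    umax = 0.0
    for ag in accel[1:]:
        # effective load from the previous state plus the new ground motion
        p = (-ag
             + u / (beta * dt ** 2) + v / (beta * dt) + (0.5 / beta - 1.0) * a
             + c * (gamma * u / (beta * dt)
                    + (gamma / beta - 1.0) * v
                    + dt * (gamma / (2.0 * beta) - 1.0) * a))
        un = p / keff
        vn = (gamma * (un - u) / (beta * dt)
              + (1.0 - gamma / beta) * v
              + dt * (1.0 - gamma / (2.0 * beta)) * a)
        an = (un - u) / (beta * dt ** 2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
        u, v, a = un, vn, an
        umax = max(umax, abs(u))
    return umax
```

Run over a measured accelerogram at the building's measured natural period, this single number is the kind of demand estimate that is then compared against the push-over capacity curve.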

  10. A structural analysis in seismic archaeology: the walls of Noto and the 1693 earthquake

    Directory of Open Access Journals (Sweden)

    G. Lombardini


    Full Text Available A crucial problem for seismic archaeology is how to recognize seismic effects and how to date them. On an experimental basis, we proposed that the problem be reversed and that we begin at the other end: i.e., by analyzing already known seismic effects on ancient structures, testified by written sources, so as to be able to "calibrate" the types of possible observations and any subsequent elaborations. The choice of the walls of Noto was suggested by the fact that Noto was abandoned following the earthquake of 1693 (I0 = XI MCS, Me 7.5), which had already been studied in depth as part of an ING research programme (1988-92). Moreover, following recent research, this event has been reconstructed to a high quality standard. Photogrammetric measurements were made on several parts of the town walls to build a numerical model aimed at ascertaining specific aspects of the earthquake damage. An estimate of the ground acceleration during the earthquake has been attempted via non-linear finite-element analyses of a building located by the main city gate. The analyses show that, in order to obtain the collapse of the building's vault, a ground acceleration of 0.5 to 0.7 g had to be reached during the earthquake. This result, typical of a strong earthquake such as that of 1693, proves that an approach based on finite-element analysis and sound engineering judgment may be systematically applied to historical earthquake sites to obtain estimates of ground acceleration in historical earthquakes. On the whole, this work aimed at starting the second development phase of the study of the great event of 1693, of which the macroseismic effects are known. In the meantime, some possibilities of tackling structural analyses in seismic archaeology are being explored.

  11. Seismicity, seismic regionalization, earthquake risk, statistics, and probability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chinnery, M.A.; North, R.G.


    Observational data relating surface wave magnitude M/sub s/ to seismic moment M/sub 0/ are used to convert a well-known frequency-M/sub s/ plot into a frequency-M/sub 0/ relationship, which turns out to be remarkably linear. There is no evidence of an upper bound to M/sub 0/, on the basis of presently available evidence. The possibility exists that extremely large earthquakes (M/sub 0/ = 10/sup 31/ dyne-cm or greater) may occur from time to time.
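The conversion from a frequency-Ms law to a frequency-M0 law can be sketched using a standard moment-magnitude scaling, log10 M0 = 1.5 Ms + 16.1 (M0 in dyne-cm). This calibration and the Gutenberg-Richter constants below are illustrative assumptions; the paper fits its own observational Ms-M0 data:

```python
import math

def ms_to_moment(ms):
    """Seismic moment (dyne-cm) from surface-wave magnitude Ms,
    using the illustrative scaling log10 M0 = 1.5 Ms + 16.1."""
    return 10.0 ** (1.5 * ms + 16.1)

def gr_frequency_of_moment(m0, a=8.0, b=1.0):
    """Annual number of events with moment >= m0, obtained by
    substituting Ms(M0) into a Gutenberg-Richter law
    log10 N = a - b * Ms (a and b are hypothetical values here).
    The result is linear in log10 M0 with slope -b/1.5, matching
    the linear frequency-moment relation noted in the abstract."""
    ms = (math.log10(m0) - 16.1) / 1.5
    return 10.0 ** (a - b * ms)
```

Because the mapping from Ms to log M0 is affine, any linear frequency-magnitude law necessarily maps to a linear frequency-moment law; the interesting empirical question is whether the data bend away from that line at the largest moments.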

  12. Seismic hazard analysis of nuclear installations in France. Current practice and research

    Energy Technology Data Exchange (ETDEWEB)

    Mohammadioun, B. [CEA Centre d'Etudes de Fontenay-aux-Roses, 92 (France). Inst. de Protection et de Surete Nucleaire


    The methodology put into practice in France for the evaluation of seismic hazard at the sites of nuclear facilities is founded on data assembled country-wide over the past 15 years in geology, geophysics and seismology. It is appropriate to the regional seismotectonic context (intraplate), characterized notably by diffuse seismicity. Extensive use is made of information drawn from historical seismicity. The regulatory practice described in RFS I.2.c is reexamined periodically and updated so as to take advantage of new earthquake data and of the results gained from research work. Acquisition of the basic data, such as the identification of active faults and the quantification of site effects, which will be needed to achieve improved preparedness against severe earthquake hazard in the 21st century, will necessarily be the fruit of close international cooperation and collaboration, which should accordingly be actively promoted. (J.P.N.)

  13. Evaluation of an active learning module to teach hazard and risk in Hazard Analysis and Critical Control Points (HACCP) classes. (United States)

    Oyarzabal, Omar A; Rowe, Ellen


    The terms hazard and risk are significant building blocks for the organization of risk-based food safety plans. Unfortunately, these terms are not clear to some personnel working in food manufacturing facilities. In addition, there are few examples of active learning modules for teaching adult participants the principles of hazard analysis and critical control points (HACCP). In this study, we evaluated the effectiveness of an active learning module for teaching hazard and risk to participants of HACCP classes provided by University of Vermont Extension in 2015 and 2016. This interactive module comprises a questionnaire; group play of a dice game that we have previously introduced in the teaching of HACCP; a discussion of the terms hazard and risk; and a self-assessment questionnaire to evaluate the teaching of hazard and risk. Of the 71 adult participants who completed this module, 40 (56%) provided the most appropriate definition of hazard, 19 (27%) provided the most appropriate definition of risk, 14 (20%) provided the most appropriate definitions of both hazard and risk, and 23 (32%) did not provide an appropriate definition of either hazard or risk. Self-assessment data showed an improvement in the understanding of these terms (P < 0.05). Thirty participants (42%) stated that the most valuable thing they learned with this interactive module was the difference between hazard and risk, and 40 participants (65%) responded that they had not attended similar presentations in the past. The fact that fewer than one third of the participants answered the definitions of hazard and risk correctly at baseline is not surprising. However, these results highlight the need to incorporate modules that discuss these important food safety terms and to include more active learning modules in food safety classes. 
This study suggests that active learning helps food personnel better understand important food safety

  14. Evaluation of an active learning module to teach hazard and risk in Hazard Analysis and Critical Control Points (HACCP) classes

    Directory of Open Access Journals (Sweden)

    Omar A. Oyarzabal


    Full Text Available The terms hazard and risk are significant building blocks for the organization of risk-based food safety plans. Unfortunately, these terms are not clear for some personnel working in food manufacturing facilities. In addition, there are few examples of active learning modules for teaching adult participants the principles of hazard analysis and critical control points (HACCP). In this study, we evaluated the effectiveness of an active learning module to teach hazard and risk to participants of HACCP classes provided by the University of Vermont Extension in 2015 and 2016. This interactive module is comprised of a questionnaire; group playing of a dice game that we have previously introduced in the teaching of HACCP; the discussion of the terms hazard and risk; and a self-assessment questionnaire to evaluate the teaching of hazard and risk. From 71 adult participants that completed this module, 40 participants (56%) provided the most appropriate definition of hazard, 19 participants (27%) provided the most appropriate definition of risk, 14 participants (20%) provided the most appropriate definitions of both hazard and risk, and 23 participants (32%) did not provide an appropriate definition for hazard or risk. Self-assessment data showed an improvement in the understanding of these terms (P < 0.05). Thirty participants (42%) stated that the most valuable thing they learned with this interactive module was the difference between hazard and risk, and 40 participants (65%) responded that they did not attend similar presentations in the past. The fact that less than one third of the participants answered properly to the definitions of hazard and risk at baseline is not surprising. However, these results highlight the need for the incorporation of modules to discuss these important food safety terms and include more active learning modules to teach food safety classes. 
This study suggests that active learning helps food personnel better understand important

  15. Risk prediction of Critical Infrastructures against extreme natural hazards: local and regional scale analysis (United States)

    Rosato, Vittorio; Hounjet, Micheline; Burzel, Andreas; Di Pietro, Antonio; Tofani, Alberto; Pollino, Maurizio; Giovinazzi, Sonia


    Natural hazard events can induce severe impacts on the built environment; they can hit wide and densely populated areas containing large numbers of (inter)dependent technological systems, whose damage can cause the failure or malfunctioning of further services, spreading the impacts over wider geographical areas. The EU project CIPRNet (Critical Infrastructures Preparedness and Resilience Research Network) is realizing an unprecedented Decision Support System (DSS) that operationally performs risk prediction on Critical Infrastructures (CI) by predicting the occurrence of natural events (from long-term weather forecasts to short-term nowcasts), correlating the intrinsic vulnerabilities of CI elements with the strength of the different events' manifestations, and analysing the resulting Damage Scenario. The Damage Scenario is then transformed into an Impact Scenario, in which damage to individual CI elements is transformed into service outages at the micro (local area) or meso (regional) scale. At the smaller scale, the DSS simulates detailed city models (where CI dependencies are explicitly accounted for) that provide important input for crisis management organizations, whereas at the regional scale, using an approximate System-of-Systems model describing systemic interactions, the focus is on raising awareness. The DSS has allowed the development of a novel simulation framework for predicting the shake maps originating from a given seismic event, considering shock wave propagation in inhomogeneous media and the subsequently produced damage, estimated from building vulnerabilities on the basis of a phenomenological model [1, 2]. Moreover, for areas containing river basins, when abundant precipitation is expected, the DSS solves 1D/2D hydrodynamic models of the river basins to predict the runoff and the corresponding flood dynamics. 
This calculation allows the estimation of the Damage Scenario and triggers the evaluation of the Impact Scenario

  16. Hazard analysis of a computer based medical diagnostic system. (United States)

    Chudleigh, M F


    Medical screening of sectors of the population is now a routine and vital part of health care: an example is cervical smear testing. There is currently significant interest in the possible introduction of semi-automated microscopy systems for cervical cytology and one such experimental system is now undergoing laboratory trials. A collaborative project has been set up to demonstrate the benefits and constraints that arise from applying safety-critical methods developed in other domains to such a diagnostic system. We have carried out a system hazard analysis, successfully using the HAZOP technique adapted from the petrochemical industry.

  17. Marine natural hazards in coastal zone: observations, analysis and modelling (Plinius Medal Lecture) (United States)

    Didenkulova, Ira


    Giant surface waves approaching the coast frequently cause extensive coastal flooding, destruction of coastal constructions and loss of lives. Such waves can be generated by various phenomena: strong storms and cyclones, underwater earthquakes, high-speed ferries, and aerial and submarine landslides. The most famous examples of such events are the catastrophic tsunami in the Indian Ocean, which occurred on 26 December 2004, and hurricane Katrina (28 August 2005) in the Atlantic Ocean. The huge storm in the Baltic Sea on 9 January 2005, which produced unexpectedly long waves in many areas of the Baltic Sea, and the unusually high surge created by long waves from high-speed ferries should also be mentioned as examples of regional marine natural hazards connected with the extensive runup of certain types of waves. The processes of wave shoaling and runup for all these different marine natural hazards (tsunami, coastal freak waves, ship waves) are studied based on rigorous solutions of nonlinear shallow-water theory. The key and novel results presented here are: i) parameterization of basic formulas for extreme runup characteristics for bell-shaped waves, showing that they depend only weakly on the initial wave shape, which is usually unknown in real sea conditions; ii) runup analysis of periodic asymmetric waves with a steep front, as such waves penetrate inland over larger distances and with larger velocities than symmetric waves; iii) statistical analysis of irregular wave runup, demonstrating that nearshore wave nonlinearity does not influence the probability distribution of the velocity of the moving shoreline or its moments, but does influence the vertical displacement of the moving shoreline (runup). Wave runup on convex beaches and in narrow bays, which allow abnormal wave amplification, is also discussed. The described analytical results are used to explain observed extreme runup of tsunami, freak (sneaker) waves and ship waves on different coasts

  18. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    Energy Technology Data Exchange (ETDEWEB)



    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state of the art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the landmark Lawrence Livermore National Laboratory and EPRI PSHA studies of the 1980s, and examined ways to improve on the present state of the art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation of the state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.
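The core PSHA computation, summing scenario exceedance probabilities into an annual rate of exceedance, can be sketched as follows (a minimal Cornell-style illustration; the source description and lognormal ground-motion model are hypothetical stand-ins, and a real PSHA integrates over full magnitude-distance distributions and logic trees of expert-weighted models):

```python
import math

def exceedance_rate(a_target, sources, gmpe_sigma=0.6):
    """Annual rate of ground motion exceeding a_target at a site.

    sources : list of (nu, scenarios), where nu is the source's annual
              activity rate and scenarios is a list of
              (weight, median_a) pairs -- the magnitude-distance
              distribution collapsed to a median ground motion per
              scenario, lognormally distributed with gmpe_sigma.
    """
    def phi_complement(z):  # P(Z > z) for a standard normal variate
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    rate = 0.0
    for nu, scenarios in sources:
        for weight, median_a in scenarios:
            z = (math.log(a_target) - math.log(median_a)) / gmpe_sigma
            rate += nu * weight * phi_complement(z)
    return rate
```

Evaluating this rate over a range of target accelerations yields the site's hazard curve; the expert-elicitation machinery (TI/TFI) that the report recommends governs how the inputs to this sum are chosen and weighted.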

  19. Multi-hazard response analysis of a 5MW offshore wind turbine

    DEFF Research Database (Denmark)

    Katsanos, Evangelos; Sanz, A. Arrospide; Georgakis, Christos T.


    Wind energy already plays a dominant role in clean energy production. Promising markets such as China, India, Korea and Latin America are fields of expansion for new wind turbines, mainly installed in offshore environments, where wind, wave and earthquake loads threaten...... response of the turbine’s tower was found to be severely affected by the earthquake excitations. Moreover, fragility analysis based on acceleration capacity thresholds for the nacelle’s equipment corroborated that the earthquake excitations may adversely affect the reliability and availability of wind...
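Fragility analysis of the kind mentioned in this abstract is commonly expressed as a lognormal fragility function: the probability that a component's acceleration capacity is exceeded given a demand level. A minimal sketch, where the median capacity and dispersion are illustrative placeholders, not values from the study:

```python
import math

def lognormal_fragility(demand, theta, beta):
    """P(capacity exceeded | demand), lognormal fragility curve.

    theta: median acceleration capacity (same units as demand, e.g. g)
    beta:  lognormal dispersion (standard deviation of ln capacity)
    """
    z = (math.log(demand) - math.log(theta)) / beta
    return 0.5 * math.erfc(-z / math.sqrt(2.0))

# Illustrative nacelle-equipment curve: median capacity 0.25 g, dispersion 0.5
for a in (0.10, 0.25, 0.50):
    print(f"P(failure | {a:.2f} g) = {lognormal_fragility(a, 0.25, 0.5):.3f}")
```

By construction the curve passes through 0.5 at the median capacity and rises monotonically with demand.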

  20. Identifying nursing hazards in the emergency department: a new approach to nursing job hazard analysis. (United States)

    Ramsay, Jim; Denny, Frank; Szirotnyak, Kara; Thomas, Jonathan; Corneliuson, Elizabeth; Paxton, Kim L


    It is widely acknowledged that nurses are crucial components of the healthcare system. In their roles, nurses are regularly confronted with a variety of biological, physical, and chemical hazards during the course of performing their duties. The safety of nurses themselves, and subsequently that of their patients, depends directly upon the degree to which nurses have knowledge of occupational hazards specific to their jobs and managerial mechanisms for mitigating those hazards. The level of occupational safety and health training resources available to nurses, as well as management support, are critical factors in preventing adverse outcomes from routine job-related hazards. This study will identify gaps in self-protective safety education for registered nurses working in emergency departments as well as for nursing students. Furthermore, this study reviews the nature and scope of occupational nursing hazards, and the degree to which current nursing education and position descriptions (or functional statements) equip nurses to recognize and address the hazards inherent in their jobs. This study has three parts. First, a literature review was performed to summarize the nature and scope of occupational nursing hazards. Second, the safety components of position descriptions from 29 Veterans Affairs (VA) hospitals across the United States were obtained and evaluated by an expert panel of occupational health nurses. Finally, an expert panel of occupational health nurses evaluated the degree to which nursing accreditation standards are integrated with OSHA's list of known emergency department hazards; and a separate expert panel of occupational health nurses evaluated the degree to which current VA emergency department nursing position descriptions incorporated hazard recognition and control strategies. Ultimately, prevention of job-related injuries for nurses, and subsequently their patients, will depend directly on the degree to which nurses can identify and control the

  1. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers


    Full Text Available The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.
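The ISO 26262 hazard analysis and risk assessment that this abstract refers to classifies each hazardous event by severity (S1-S3), exposure (E1-E4) and controllability (C1-C3), and maps the combination to an Automotive Safety Integrity Level (ASIL). A minimal sketch, using the often-noted equivalence of the standard's lookup table to a simple sum rule; treat this as an assumption and verify any safety-relevant use against ISO 26262-3 Table 4 itself:

```python
# Sum-rule reading of the ISO 26262-3 ASIL determination table:
# index S + E + C of 10 -> ASIL D, 9 -> C, 8 -> B, 7 -> A, 6 or below -> QM.
ASIL_BY_SUM = {7: "ASIL A", 8: "ASIL B", 9: "ASIL C", 10: "ASIL D"}

def asil(severity, exposure, controllability):
    """Return the ASIL for class indices S in 1..3, E in 1..4, C in 1..3."""
    if not (1 <= severity <= 3 and 1 <= exposure <= 4 and 1 <= controllability <= 3):
        raise ValueError("class index out of range")
    return ASIL_BY_SUM.get(severity + exposure + controllability, "QM")

print(asil(3, 4, 3))  # worst case S3/E4/C3 -> ASIL D
print(asil(1, 2, 1))  # low severity, low exposure -> QM
```

The ASIL assigned here then drives the safety measures required per feature; the ISO 27001 threat analysis the paper proposes sits alongside this, not inside it.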

  2. Characterising Seismic Hazard Input for Analysis Risk to Multi-System Infrastructures: Application to Scenario Event-Based Models and extension to Probabilistic Risk (United States)

    Weatherill, G. A.; Silva, V.


    The potential human and economic cost of earthquakes to complex urban infrastructures has been demonstrated in the most emphatic manner by recent large earthquakes such as those of Haiti (January 2010), Christchurch (September 2010 and February 2011) and Tohoku (March 2011). Consideration of seismic risk for a homogeneous portfolio, such as a single building t