WorldWideScience

Sample records for understanding earthquake hazards

  1. Awareness and understanding of earthquake hazards at school

    Science.gov (United States)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim to develop age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of the educational activities we have carried out in recent years are presented here. We describe our experience with primary and intermediate schools, where we explain the earthquake phenomenon and its effects to children through hands-on activities, and we also illustrate some teaching interventions for high school students. In past years we lectured classes, led laboratory and field activities, and organized summer internships for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers, and experts in safety procedures. To combine the objective of disseminating earthquake culture, including knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab using cheap tools and instrumentation. At selected schools we provided the low-cost seismometers of the Quake-Catcher Network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze the recorded data. Within the same project we are going to train

  2. Cascading hazards: Understanding triggering relations between wet tropical cyclones, landslides, and earthquakes

    Science.gov (United States)

    Wdowinski, S.; Peng, Z.; Ferrier, K.; Lin, C. H.; Hsu, Y. J.; Shyu, J. B. H.

    2017-12-01

    Earthquakes, landslides, and tropical cyclones are extreme hazards that pose significant threats to human life and property. Some of the couplings between these hazards are well known. For example, sudden, widespread landsliding can be triggered by large earthquakes and by extreme rainfall events like tropical cyclones. Recent studies have also shown that earthquakes can be triggered by erosional unloading over 100-year timescales. In a NASA supported project, titled "Cascading hazards: Understanding triggering relations between wet tropical cyclones, landslides, and earthquake", we study triggering relations between these hazard types. The project focuses on such triggering relations in Taiwan, which is subjected to very wet tropical storms, landslides, and earthquakes. One example for such triggering relations is the 2009 Morakot typhoon, which was the wettest recorded typhoon in Taiwan (2850 mm of rain in 100 hours). The typhoon caused widespread flooding and triggered more than 20,000 landslides, including the devastating Hsiaolin landslide. Six months later, the same area was hit by the 2010 M=6.4 Jiashian earthquake near Kaohsiung city, which added to the infrastructure damage induced by the typhoon and the landslides. Preliminary analysis of temporal relations between main-shock earthquakes and the six wettest typhoons in Taiwan's past 50 years reveals similar temporal relations between M≥5 events and wet typhoons. Future work in the project will include remote sensing analysis of landsliding, seismic and geodetic monitoring of landslides, detection of microseismicity and tremor activities, and mechanical modeling of crustal stress changes due to surface unloading.

  3. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  4. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    Science.gov (United States)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to the public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). Prospect

  5. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
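
    The completeness periods quoted above can be applied as a simple filter on a historical catalog. The sketch below assumes a hypothetical list of (year, epicentral intensity) pairs; only the thresholds themselves come from the abstract.

```python
# Sketch: apply the stated completeness periods to a historical catalog.
# The example events are hypothetical; only the thresholds
# (1878/>IV, 1750/>VI, 1600/>VIII, 1300/>IX) come from the abstract.

COMPLETENESS = [   # (start year, minimum epicentral intensity, exclusive)
    (1878, 4),     # complete since 1878 for intensities greater than IV
    (1750, 6),     # complete since 1750 for intensities greater than VI
    (1600, 8),     # complete since 1600 for intensities greater than VIII
    (1300, 9),     # complete since 1300 for intensities greater than IX
]

def is_complete(year, intensity):
    """True if an event falls inside one of the completeness windows."""
    return any(year >= start and intensity > imin for start, imin in COMPLETENESS)

catalog = [(1356, 10), (1601, 9), (1700, 5), (1855, 7), (1880, 5)]  # illustrative
print([ev for ev in catalog if is_complete(*ev)])  # drops only (1700, 5)
```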

  6. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed

  7. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    13 CFR, Business Credit and Assistance (2010-01-01), Requirements Imposed Under Other Laws and Orders Applying to All Business Loans, § 120.174 Earthquake hazards: ... the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  8. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia lie on active plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. In practice, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference for assessing earthquake hazard accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as one of the methods in the assessment of earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  9. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

    demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations, and (c) a more ethically responsible control over how seismic hazard and seismic risk assessments are used to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to the available geological, geomorphological, seismic, and tectonic evidence and data combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT THE PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term, middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. This shows that contemporary science can do a better job in disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are LOW-PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate a shift in the community's mindset from pessimistic disbelief to the optimistic challenge of neo-deterministic hazard predictability.

  10. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  11. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    Science.gov (United States)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in the year 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10⁵ m³), produced major damage to buildings and infrastructure and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km², mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10⁵ m³) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km³ and 12 km³ produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with volumes of less than 1×10³ m³. The present work aims to define the relationship between the above-described earthquake intensities and the size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provide useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  12. Composite Earthquake Catalog of the Yellow Sea for Seismic Hazard Studies

    Science.gov (United States)

    Kang, S. Y.; Kim, K. H.; LI, Z.; Hao, T.

    2017-12-01

    The Yellow Sea (a.k.a. West Sea in Korea) is an epicontinental and semi-closed sea located between Korea and China. Recent earthquakes in the Yellow Sea, including, but not limited to, the Seogyuckryulbi-do (1 April 2014, magnitude 5.1), Heuksan-do (21 April 2013, magnitude 4.9), and Baekryung-do (18 May 2013, magnitude 4.9) earthquakes, and the earthquake swarm in the Boryung offshore region in 2013, remind us of the seismic hazards affecting east Asia. This series of earthquakes in the Yellow Sea raised numerous questions. Unfortunately, both governments have difficulty monitoring seismicity in the Yellow Sea because earthquakes occur beyond their seismic networks. For example, the epicenters of the magnitude 5.1 earthquake in the Seogyuckryulbi-do region in 2014 reported by the Korea Meteorological Administration and the China Earthquake Administration differed by approximately 20 km. This illustrates the difficulty of monitoring and locating earthquakes in the region, despite the huge effort made by both governments. A joint effort is required not only to overcome the limits posed by political boundaries and geographical location but also to study the seismicity and the underground structures responsible. Although the well-established and developing seismic networks in Korea and China have provided an unprecedented amount and quality of seismic data, the high-quality catalog is limited to the last few decades, far shorter than a major earthquake cycle. It is also noted that the earthquake catalog from either country is biased toward its own territory and cannot provide a complete picture of seismicity in the Yellow Sea. In order to understand seismic hazard and tectonics in the Yellow Sea, a composite earthquake catalog has been developed. We gathered earthquake information spanning the last 5,000 years from various sources. There are good reasons to believe that some listings refer to the same earthquake but with different source parameters. We established criteria in order to provide consistent
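
    A core step in building such a composite catalog is deciding when two agency listings describe the same earthquake. The sketch below illustrates one simple, hypothetical matching rule based on origin-time, epicentral-distance, and magnitude windows; the thresholds and example coordinates are assumptions, not the criteria established in the study.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two epicentres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def same_event(ev1, ev2, dt_max_s=60.0, dr_max_km=50.0, dm_max=0.7):
    """Hypothetical matching rule: two listings are treated as duplicates when
    their origin times, epicentres, and magnitudes agree within fixed windows."""
    return (abs(ev1["t"] - ev2["t"]) <= dt_max_s
            and haversine_km(ev1["lat"], ev1["lon"], ev2["lat"], ev2["lon"]) <= dr_max_km
            and abs(ev1["mag"] - ev2["mag"]) <= dm_max)

# Illustrative listings of one offshore event as reported by two agencies
# (coordinates assumed; epicentres offset by roughly 20 km)
agency_a = {"t": 0.0, "lat": 37.68, "lon": 124.63, "mag": 5.1}
agency_b = {"t": 2.0, "lat": 37.75, "lon": 124.80, "mag": 5.0}
print(same_event(agency_a, agency_b))  # True -> merged into one composite entry
```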

  13. ASSESSMENT OF EARTHQUAKE HAZARDS ON WASTE LANDFILLS

    DEFF Research Database (Denmark)

    Zania, Varvara; Tsompanakis, Yiannis; Psarropoulos, Prodromos

    Earthquake hazards may arise as a result of: (a) transient ground deformation, which is induced by seismic wave propagation, and (b) permanent ground deformation, which is caused by abrupt fault dislocation. Since the adequate performance of waste landfills after an earthquake is of utmost...... importance, the current study examines the impact of both types of earthquake hazards by performing efficient finite-element analyses. These analyses also took into account the potential slip displacement development along the geosynthetic interfaces of the composite base liner. At first, the development of permanent...

  14. Seismicity and seismic hazard in Sabah, East Malaysia from earthquake and geodetic data

    Science.gov (United States)

    Gilligan, A.; Rawlinson, N.; Tongkul, F.; Stephenson, R.

    2017-12-01

    While the levels of seismicity are low in most of Malaysia, the state of Sabah in northern Borneo has moderate levels of seismicity. Notable earthquakes in the region include the 1976 M6.2 Lahad Datu earthquake and the 2015 M6 Ranau earthquake. The recent Ranau earthquake resulted in the deaths of 18 people on Mt Kinabalu, an estimated 100 million RM (US$23 million) of damage to buildings, roads, and infrastructure from shaking, and flooding, reduced water quality, and damage to farms from landslides. Over the last 40 years the population of Sabah has increased to over four times what it was in 1976, yet seismic hazard in Sabah remains poorly understood. Using seismic and geodetic data we hope to better quantify the hazards posed by earthquakes in Sabah, and thus help to minimize risk. In order to do this we need to know about the locations of earthquakes, the types of earthquakes that occur, and the faults that are generating them. We use data from 15 MetMalaysia seismic stations currently operating in Sabah to develop a region-specific velocity model from receiver functions and a pre-existing surface wave model. We use this new velocity model to (re)locate earthquakes that occurred in Sabah from 2005-2016, including a large number of aftershocks from the 2015 Ranau earthquake. We use a probabilistic nonlinear earthquake location program to locate the earthquakes and then refine their relative locations using a double-difference method. The recorded waveforms are further used to obtain moment tensor solutions for these earthquakes. Earthquake locations and moment tensor solutions are then compared with the locations of faults throughout Sabah. Faults are identified from high-resolution IFSAR images and subsequent fieldwork, with a particular focus on the Lahad Datu and Ranau areas. Used together, these seismic and geodetic data can help us to develop a new seismic hazard model for Sabah, as well as aiding in the delivery of outreach activities regarding seismic hazard

  15. Geotechnical hazards from large earthquakes and heavy rainfalls

    CERN Document Server

    Kazama, Motoki; Lee, Wei

    2017-01-01

    This book is a collection of papers presented at the International Workshop on Geotechnical Natural Hazards held July 12–15, 2014, in Kitakyushu, Japan. The workshop was the sixth in the series of Japan–Taiwan Joint Workshops on Geotechnical Hazards from Large Earthquakes and Heavy Rainfalls, held under the auspices of the Asian Technical Committee No. 3 on Geotechnology for Natural Hazards of the International Society for Soil Mechanics and Geotechnical Engineering. It was co-organized by the Japanese Geotechnical Society and the Taiwanese Geotechnical Society. The contents of this book focus on geotechnical and natural hazard-related issues in Asia such as earthquakes, tsunami, rainfall-induced debris flows, slope failures, and landslides. The book contains the latest information and mitigation technology on earthquake- and rainfall-induced geotechnical natural hazards. By dissemination of the latest state-of-the-art research in the area, the information contained in this book will help researchers, des...

  16. Playing against nature: improving earthquake hazard mitigation

    Science.gov (United States)

    Stein, S. A.; Stein, J.

    2012-12-01

    The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Whether and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones able to withstand tsunamis as large as that of March 2011 is too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus, in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Society therefore needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the
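
    The cost-balancing idea sketched above amounts to choosing a mitigation level x that minimizes total cost Q(x) = C(x) + E[damage | x]. The toy example below uses hypothetical cost and expected-damage functions; it is a minimal illustration of the framework, not the authors' parameterization.

```python
import numpy as np

# Sketch of the framework described above: choose a mitigation level x that
# minimizes total cost Q(x) = C(x) + E[damage | x]. Both functions below are
# hypothetical placeholders.

x = np.linspace(0.0, 10.0, 1001)              # mitigation level (e.g. seawall height, m)
mitigation_cost = 50.0 * x                    # cost grows with mitigation level
expected_damage = 2000.0 * np.exp(-0.6 * x)   # expected loss falls as mitigation rises

total_cost = mitigation_cost + expected_damage
x_opt = x[np.argmin(total_cost)]
print(f"optimal mitigation level ~ {x_opt:.2f}, total cost ~ {total_cost.min():.0f}")
```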

  17. Lower bound earthquake magnitude for probabilistic seismic hazard evaluation

    International Nuclear Information System (INIS)

    McCann, M.W. Jr.; Reed, J.W.

    1990-01-01

    This paper presents the results of a study that develops an engineering and seismological basis for selecting a lower-bound magnitude (LBM) for use in seismic hazard assessment. As part of a seismic hazard analysis the range of earthquake magnitudes that are included in the assessment of the probability of exceedance of ground motion must be defined. The upper-bound magnitude is established by earth science experts based on their interpretation of the maximum size of earthquakes that can be generated by a seismic source. The lower-bound or smallest earthquake that is considered in the analysis must also be specified. The LBM limits the earthquakes that are considered in assessing the probability that specified ground motion levels are exceeded. In the past there has not been a direct consideration of the appropriate LBM value that should be used in a seismic hazard assessment. This study specifically looks at the selection of a LBM for use in seismic hazard analyses that are input to the evaluation/design of nuclear power plants (NPPs). Topics addressed in the evaluation of a LBM are earthquake experience data at heavy industrial facilities, engineering characteristics of ground motions associated with small-magnitude earthquakes, probabilistic seismic risk assessments (seismic PRAs), and seismic margin evaluations. The results of this study and the recommendations concerning a LBM for use in seismic hazard assessments are discussed. (orig.)
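
    To illustrate why the lower-bound magnitude matters, the sketch below evaluates a simplified annual exceedance-rate integral with two LBM values (3.75 and 5.0, the values quoted for the LLNL and EPRI studies). The Gutenberg-Richter parameters, the source distance, and the ground-motion relation are hypothetical placeholders.

```python
import numpy as np
from scipy.stats import norm

# Illustrative-only sensitivity of hazard to the lower-bound magnitude (LBM).
# G-R parameters, site distance and the ground-motion relation below are
# hypothetical; the LBM values 3.75 and 5.0 are those quoted for the LLNL
# and EPRI studies.

a_val, b_val, m_max = 4.0, 1.0, 7.5        # hypothetical Gutenberg-Richter parameters
r_km, target_pga_g = 15.0, 0.05            # hypothetical site distance and PGA level

def annual_exceedance_rate(m_min):
    m = np.linspace(m_min, m_max, 500)
    dm = m[1] - m[0]
    # incremental annual rates from the (untruncated, for simplicity) G-R relation
    rate_density = b_val * np.log(10) * 10 ** (a_val - b_val * m)
    # hypothetical GMPE: ln PGA = -4.0 + 1.0*M - 1.3*ln(R), sigma_lnPGA = 0.6
    ln_median = -4.0 + 1.0 * m - 1.3 * np.log(r_km)
    p_exceed = 1.0 - norm.cdf(np.log(target_pga_g), loc=ln_median, scale=0.6)
    return np.sum(rate_density * p_exceed) * dm

for m_min in (3.75, 5.0):
    rate = annual_exceedance_rate(m_min)
    print(f"LBM = {m_min}: annual rate of PGA > {target_pga_g} g ~ {rate:.2e}")
```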

  18. Earthquake Prediction Research In Iceland, Applications For Hazard Assessments and Warnings

    Science.gov (United States)

    Stefansson, R.

    Earthquake prediction research in Iceland, applications for hazard assessments and warnings. The first multinational earthquake prediction research project in Iceland was the European Council encouraged SIL project of the Nordic countries, 1988-1995. The path selected for this research was to study the physics of crustal processes leading to earthquakes. It was considered that small earthquakes, down to magnitude zero, were the most significant for this purpose, because of the detailed information which they provide both in time and space. The test area for the project was the earthquake-prone region of the South Iceland seismic zone (SISZ). The PRENLAB and PRENLAB-2 projects, 1996-2000, supported by the European Union, were a direct continuation of the SIL project, but with a more multidisciplinary approach. PRENLAB stands for "Earthquake prediction research in a natural laboratory". The basic objective was to advance our understanding in general of where, when and how dangerous earthquake motion might strike. Methods were developed to study crustal processes and conditions, by microearthquake information, continuous GPS, InSAR, theoretical modelling, fault mapping and paleoseismology. New algorithms were developed for short-term warnings. A very useful short-term warning was issued twice in the year 2000, one for the sudden start of an eruption of the Hekla volcano on February 26, and the other 25 hours before the second (in a sequence of two) magnitude 6.6 (Ms) earthquake in the South Iceland seismic zone on June 21, with the correct location and approximate size. A formal short-term warning, although not going to the public, was also issued before a magnitude 5 earthquake in November 1998. In the presentation it will be shortly described what these warnings were based on. A general hazard assessment was presented in scientific journals 10-15 years ago assessing within a few kilometers the location of the faults of the two 2000 earthquakes and suggesting

  19. Updated earthquake catalogue for seismic hazard analysis in Pakistan

    Science.gov (United States)

    Khan, Sarfraz; Waseem, Muhammad; Khan, Muhammad Asif; Ahmed, Waqas

    2018-03-01

    A reliable and homogenized earthquake catalogue is essential for seismic hazard assessment in any area. This article describes the compilation and processing of an updated earthquake catalogue for Pakistan. The earthquake catalogue compiled in this study for the region (a quadrangle bounded by the geographical limits 40-83° E and 20-40° N) includes 36,563 earthquake events, with moment magnitudes (Mw) of 4.0-8.3, spanning from 25 AD to 2016. Relationships are developed between the moment magnitude and the body-wave and surface-wave magnitude scales to unify the catalogue in terms of Mw. The catalogue includes earthquakes from Pakistan and neighbouring countries to minimize the effects of geopolitical boundaries in seismic hazard assessment studies. Earthquakes reported by local and international agencies as well as individual catalogues are included. The proposed catalogue is further used to obtain the magnitude of completeness after removal of dependent events by using four different algorithms. Finally, seismicity parameters of the seismic sources are reported, and recommendations are made for seismic hazard assessment studies in Pakistan.
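
    Two of the catalogue-processing steps mentioned above, estimating the magnitude of completeness and fitting Gutenberg-Richter seismicity parameters, can be sketched as follows. The catalogue here is synthetic, and the maximum-curvature and Aki-Utsu estimators are generic choices, not necessarily the algorithms used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Mw catalogue standing in for a declustered catalogue; real input
# would come from the homogenized catalogue described above.
mags = np.round(rng.exponential(scale=1.0 / np.log(10), size=5000) + 4.0, 1)

def mc_max_curvature(m, bin_width=0.1):
    """Magnitude of completeness via the maximum-curvature method:
    the lower edge of the most populated magnitude bin."""
    bins = np.arange(m.min(), m.max() + bin_width, bin_width)
    counts, edges = np.histogram(m, bins=bins)
    return edges[np.argmax(counts)]

def b_value_mle(m, mc, bin_width=0.1):
    """Aki/Utsu maximum-likelihood b-value for events with M >= Mc."""
    m_sel = m[m >= mc]
    return np.log10(np.e) / (m_sel.mean() - (mc - bin_width / 2.0))

mc = mc_max_curvature(mags)
print(f"Mc ~ {mc:.1f}, b ~ {b_value_mle(mags, mc):.2f}")
```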

  20. A global probabilistic tsunami hazard assessment from earthquake sources

    Science.gov (United States)

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  1. Deterministic Earthquake Hazard Assessment by Public Agencies in California

    Science.gov (United States)

    Mualchin, L.

    2005-12-01

    Even in its short recorded history, California has experienced a number of damaging earthquakes that have resulted in new codes and other legislation for public safety. In particular, the 1971 San Fernando earthquake produced some of the most lasting results, such as the Hospital Safety Act, the Strong Motion Instrumentation Program, the Alquist-Priolo Special Studies Zone Act, and the California Department of Transportation's (Caltrans) fault-based deterministic seismic hazard (DSH) map. The latter product provides values for earthquake ground motions based on Maximum Credible Earthquakes (MCEs), defined as the largest earthquakes that can reasonably be expected on faults in the current tectonic regime. For surface fault rupture displacement hazards, detailed study of the same faults applies. Originally, hospitals, dams, and other critical facilities used seismic design criteria based on deterministic seismic hazard analyses (DSHA). However, probabilistic methods grew and took hold by introducing earthquake design criteria based on time factors and by quantifying "uncertainties" through procedures such as logic trees. These probabilistic seismic hazard analyses (PSHA) ignored the DSH approach. Some agencies were influenced to adopt only the PSHA method. However, deficiencies in the PSHA method are becoming recognized, and the use of the method is now becoming a focus of strong debate. Caltrans is in the process of producing the fourth edition of its DSH map. The reason for preferring the DSH method is that Caltrans believes it is more realistic than the probabilistic method for assessing earthquake hazards that may affect critical facilities, and is the best available method for ensuring public safety. Its time-invariant values help to produce robust design criteria that are soundly based on physical evidence. And it is the method for which there is the least opportunity for unwelcome surprises.

  2. Earthquake hazard zonation using peak ground acceleration (PGA) approach

    International Nuclear Information System (INIS)

    Irwansyah, E; Winarko, E; Rasjid, Z E; Bekti, R D

    2013-01-01

    The objective of this research is to develop seismic hazard zones for building infrastructure in the city of Banda Aceh, Indonesia, using peak ground acceleration (PGA) computed with global and local attenuation functions. PGA is calculated using an attenuation function that describes the correlation between local ground motion intensity, earthquake magnitude, and the distance from the earthquake's epicentre. The data used come from the earthquake damage catalogue available from the Indonesian meteorology, climatology and geophysics agency (BMKG), spanning the years 1973-2011. The research methodology consists of six steps: developing the grid, calculating the distance from the epicentre to the centroid of each grid cell, calculating PGA values, developing the computer application, plotting the PGA values at the grid centroids, and delineating earthquake hazard zones using a kriging algorithm. The conclusion of this research is that the global attenuation function developed by [20] can be applied to calculate PGA values in the city of Banda Aceh. On a micro scale, Banda Aceh can be divided into three hazard zones: a low hazard zone with PGA values of 0.8767 to 0.8780 gals, a medium hazard zone with PGA values of 0.8781 to 0.8793 gals, and a high hazard zone with PGA values of 0.8794 to 0.8806 gals.
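
    The first steps of this workflow (gridding the city, computing distances to an epicentre, and evaluating PGA with an attenuation function) can be sketched as below. The coefficients are hypothetical placeholders, since the attenuation function of the study's reference [20] is not reproduced here; the resulting values would then be interpolated by kriging as described.

```python
import numpy as np

# Sketch of the grid, distance, and PGA steps of the workflow described above.
# The attenuation coefficients below are hypothetical; the study uses the
# global attenuation function of its reference [20], not reproduced here.

lats = np.linspace(5.50, 5.62, 25)            # rough bounding box of Banda Aceh
lons = np.linspace(95.28, 95.40, 25)
grid_lat, grid_lon = np.meshgrid(lats, lons, indexing="ij")

eq_lat, eq_lon, mag, depth_km = 4.9, 94.8, 6.5, 30.0   # hypothetical event

deg_km = 111.2
dist_km = np.hypot((grid_lat - eq_lat) * deg_km,
                   (grid_lon - eq_lon) * deg_km * np.cos(np.radians(eq_lat)))
hypo_km = np.hypot(dist_km, depth_km)

# Generic attenuation form ln(PGA) = c0 + c1*M - c2*ln(R) - c3*R (placeholder)
c0, c1, c2, c3 = -3.5, 0.9, 1.0, 0.003
pga_g = np.exp(c0 + c1 * mag - c2 * np.log(hypo_km) - c3 * hypo_km)

print(f"PGA over the grid: {pga_g.min():.4f} - {pga_g.max():.4f} g")
# Values at grid centroids would then be interpolated (e.g. by kriging) to
# delineate low/medium/high hazard zones as in the study.
```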

  3. Understanding dynamic friction through spontaneously evolving laboratory earthquakes.

    Science.gov (United States)

    Rubino, V; Rosakis, A J; Lapusta, N

    2017-06-29

    Friction plays a key role in how ruptures unzip faults in the Earth's crust and release waves that cause destructive shaking. Yet dynamic friction evolution is one of the biggest uncertainties in earthquake science. Here we report on novel measurements of evolving local friction during spontaneously developing mini-earthquakes in the laboratory, enabled by our ultrahigh-speed full-field imaging technique. The technique captures the evolution of displacements, velocities and stresses of dynamic ruptures, whose rupture speeds range from sub-Rayleigh to supershear. The observed friction has a complex evolution, featuring initial velocity strengthening followed by substantial velocity weakening. Our measurements are consistent with rate-and-state friction formulations supplemented with flash heating, but not with widely used slip-weakening friction laws. This study develops a new approach for measuring the local evolution of dynamic friction and has important implications for understanding earthquake hazard, since the laws governing the frictional resistance of faults are vital ingredients in physically based predictive models of the earthquake source.
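
    For readers unfamiliar with the friction laws mentioned above, the sketch below integrates the standard Dieterich-Ruina rate-and-state formulation with the aging-law state evolution for a step in slip velocity. Parameter values are illustrative laboratory-scale numbers, and the flash-heating extension discussed in the abstract is not included.

```python
import numpy as np

# Minimal sketch of the Dieterich-Ruina rate-and-state friction law with the
# "aging" state-evolution equation, integrated for a step in slip velocity.
# Parameter values are illustrative; flash heating is not included.

mu0, a, b = 0.6, 0.010, 0.015     # reference friction and direct/evolution effects
v0, d_c = 1e-6, 1e-5              # reference velocity (m/s) and critical slip distance (m)

def friction(v, theta):
    return mu0 + a * np.log(v / v0) + b * np.log(v0 * theta / d_c)

def run_velocity_step(v_new=1e-5, dt=1e-3, n_steps=10000):
    theta = d_c / v0                                # steady state at the reference velocity
    mus = []
    for _ in range(n_steps):
        theta += dt * (1.0 - v_new * theta / d_c)   # aging law
        mus.append(friction(v_new, theta))
    return np.array(mus)

mu = run_velocity_step()
print(f"peak friction {mu.max():.3f} -> steady state {mu[-1]:.3f}")
# With a < b the interface is velocity weakening: after the initial direct-effect
# jump, friction relaxes to a lower steady-state value at the higher velocity.
```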

  4. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    Science.gov (United States)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and the statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in north-east Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for a region. In this study, the procedure for earthquake hazard evaluation developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.

  5. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    Science.gov (United States)

    Boyd, Oliver Salz; Magistrale, Harold

    2011-01-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011 Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  6. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan)

    Directory of Open Access Journals (Sweden)

    G. Babayev

    2010-12-01

    Full Text Available A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides, significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increasing vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information to identify the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown. The PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of urban population, exposure, and the pattern of peak ground acceleration contribute to the seismic risk, while the vulnerability factors play a more prominent role for all earthquake scenarios. Our results can support the elaboration of strategic countermeasure plans for earthquake risk mitigation in the city of Baku.
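
    The risk definition quoted above (a convolution of hazard, vulnerability, and exposure) can be illustrated with a toy grid calculation. All arrays and the simple damage model below are hypothetical placeholders for the scenario PGA maps, vulnerability index, and exposure layers described in the study.

```python
import numpy as np

# Toy sketch of "risk = hazard x vulnerability x exposure" on a city grid.
# All arrays are hypothetical placeholders.

rng = np.random.default_rng(1)
shape = (20, 20)                                  # coarse grid over the city

pga_g = 0.1 + 0.3 * rng.random(shape)             # scenario PGA field (g)
vulnerability = rng.random(shape)                 # 0 (robust) .. 1 (fragile)
exposure = rng.integers(0, 1000, shape)           # e.g. building value per cell

# Simple damage model: damage ratio grows with PGA, scaled by vulnerability
damage_ratio = np.clip(vulnerability * (pga_g / 0.4), 0.0, 1.0)
risk = damage_ratio * exposure                    # expected loss per cell

hotspots = np.argwhere(risk > np.percentile(risk, 95))
print(f"total expected loss: {risk.sum():.0f}; top-5% cells: {len(hotspots)}")
```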

  7. Earthquake Hazard and Risk in Alaska

    Science.gov (United States)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including: extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rates. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increases the number of crustal faults from ten in 2007 to 91 faults in the 2015 model. This includes the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth. In this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), and large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as much to the overall risk. We will review these recurrence rates and present the results and impact for Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact model changes have on Alaska earthquake risk. Risk metrics considered include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the

  8. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008

    Science.gov (United States)

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.

    2009-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: http://earthquake.usgs.gov/regional/ceus/urban_map/st_louis/index.php. This progress report summarizes the

  9. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    International Nuclear Information System (INIS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-01-01

    The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, triggering a tsunami. These events caused many casualties and heavy material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of this research include a literature study, the collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by values of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented as maps for a return period of 2475 years, together with earthquake hazard curves for the cities of Medan and Banda Aceh. (paper)
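
    The 2475-year return period used above corresponds, under the usual Poisson occurrence assumption, to roughly a 2% probability of exceedance in 50 years; the short conversion is shown below.

```python
import math

# Under a Poisson occurrence model, P = 1 - exp(-t / T) relates an observation
# window t to a return period T. The 2475-year return period quoted above is
# the standard ~2%-in-50-years level.

def poisson_prob_of_exceedance(t_years, return_period_years):
    return 1.0 - math.exp(-t_years / return_period_years)

def return_period_from_prob(t_years, prob):
    return -t_years / math.log(1.0 - prob)

print(f"P(exceed in 50 yr | T=2475 yr) = {poisson_prob_of_exceedance(50, 2475):.3f}")
print(f"T for 2% in 50 yr = {return_period_from_prob(50, 0.02):.0f} years")
```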

  10. Advancing Understanding of Earthquakes by Drilling an Eroding Convergent Margin

    Science.gov (United States)

    von Huene, R.; Vannucchi, P.; Ranero, C. R.

    2010-12-01

    A program of IODP with great societal relevance is sampling and instrumenting the seismogenic zone. The zone generates great earthquakes that trigger tsunamis and submarine slides, thereby endangering coastal communities containing over sixty percent of the earth’s population. To assess and mitigate this endangerment it is urgent to advance understanding of fault dynamics that allows more timely anticipation of hazardous seismicity. Seismogenesis on accreting and eroding convergent plate boundaries apparently differs because of dissimilar materials along the interplate fault. As the history of instrumentally recorded earthquakes expands, the difference becomes clearer. The more homogeneous clay, silt and sand subducted at accreting margins is associated with great earthquakes (M 9), whereas the fragmented upper plate rock that can dominate subducted material along an eroding margin plate interface is associated with many tsunamigenic earthquakes (Bilek, 2010). Few areas have been identified where the seismogenic zone can be reached with scientific drilling. In IODP, accreting margins are studied on the NanTroSeize drill transect off Japan, where the ultimate drilling of the seismogenic interface may occur by the end of IODP. The eroding Costa Rica margin will be studied in CRISP, where a drill program will begin in 2011. The Costa Rican geophysical site survey will be complete with acquisition and processing of 3D seismic data in 2011, but the entire drilling will not be accomplished in IODP. It is appropriate that the accreting margin study be accomplished soon considering the indications of a pending great earthquake that will affect a country that has devoted enormous resources to IODP. However, understanding the erosional end-member is scientifically as important to an understanding of fault mechanics. Transoceanic tsunamis affect the entire Pacific rim, where most subduction zones are eroding margins. The Costa Rican subduction zone is less complex operationally and

  11. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    Science.gov (United States)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, various methods of fuzzy set theory, known as fuzzy mathematics, have been applied to the quantitative estimation of time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in the temporal variation of seismic activity and seismic gaps can be examined, and the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards on different time scales can be estimated. This paper mainly deals with medium- and short-term precursors observed in Japan and China.
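
    As a generic illustration of fuzzy clustering applied to seismicity indices, the sketch below implements a compact fuzzy c-means on synthetic two-dimensional feature vectors (for example, event rate and energy release per time window). It is not the specific fuzzy-netting or fuzzy-equivalent-relation methods used in the paper.

```python
import numpy as np

# Compact fuzzy c-means sketch on synthetic "seismicity index" feature vectors.
# This is a generic illustration of fuzzy clustering, not the paper's methods.

def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), n_clusters))
    u /= u.sum(axis=1, keepdims=True)             # memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = um.T @ X / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (dist ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

rng = np.random.default_rng(42)
quiet = rng.normal([1.0, 1.0], 0.2, size=(30, 2))     # low rate, low energy release
active = rng.normal([4.0, 5.0], 0.4, size=(30, 2))    # high rate, high energy release
X = np.vstack([quiet, active])

centers, memberships = fuzzy_c_means(X)
print("cluster centers:\n", np.round(centers, 2))
print("membership of first sample:", np.round(memberships[0], 2))
```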

  12. Long Aftershock Sequences within Continents and Implications for Earthquake Hazard Assessment

    Science.gov (United States)

    Stein, S. A.; Liu, M.

    2014-12-01

    Recent seismicity in the Tangshan region in North China has prompted concern about a repetition of the 1976 M7.8 earthquake that destroyed the city, killing more than 242,000 people. However, the decay of seismicity there implies that the recent earthquakes are probably aftershocks of the 1976 event. This 37-year sequence is an example of the phenomenon that aftershock sequences within continents are often significantly longer than the typical 10 years at plate boundaries. The long sequence of aftershocks in continents is consistent with a simple friction-based model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Hence the slowly-deforming continents tend to have aftershock sequences significantly longer than at rapidly-loaded plate boundaries. This effect has two consequences for hazard assessment. First, within the heavily populated continents that are typically within plate interiors, assessments of earthquake hazards rely significantly on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. This assumption would lead to overestimation of the hazard in presently active areas and underestimation elsewhere, if some of these small events are aftershocks. Second, successful attempts to remove aftershocks from catalogs used for hazard assessment would underestimate the hazard, because much of the hazard is due to the aftershocks, and the declustering algorithms implicitly assume short aftershock sequences and thus do not remove long-duration ones.
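
    The inverse scaling invoked above can be summarized by a rate-and-state-based relation of the form t_a ≈ A·σ / (stressing rate) for aftershock-sequence duration; the few lines below evaluate it for illustrative loading rates (all parameter values are order-of-magnitude assumptions, not values from the study).

```python
# Sketch of the inverse scaling between aftershock-sequence duration and fault
# loading rate implied by rate-and-state (Dieterich-type) models:
#   t_a ~ (A * sigma) / stressing_rate
# All numbers below are illustrative order-of-magnitude assumptions.

def aftershock_duration_years(a_sigma_mpa, stressing_rate_mpa_per_yr):
    """Approximate aftershock duration t_a = (A*sigma) / stressing rate."""
    return a_sigma_mpa / stressing_rate_mpa_per_yr

# Plate boundary: fast loading; continental interior: ~10x slower loading
for label, rate in [("plate boundary", 0.1), ("continental interior", 0.01)]:
    t_a = aftershock_duration_years(a_sigma_mpa=1.0, stressing_rate_mpa_per_yr=rate)
    print(f"{label}: t_a ~ {t_a:.0f} years")
```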

  13. Insights into earthquake hazard map performance from shaking history simulations

    Science.gov (United States)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher
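
    A toy version of the simulation strategy described above is sketched below: many 50-year shaking histories are drawn for a single site from an assumed earthquake-rate model and a hypothetical ground-motion relation, and the scatter of each history's maximum shaking around the "mapped" level is examined. Source and ground-motion parameters are placeholders, not those of the study.

```python
import numpy as np

# Toy shaking-history simulation for one site. The source model (Poisson
# occurrence, truncated Gutenberg-Richter magnitudes) and the ground-motion
# relation are hypothetical placeholders.

rng = np.random.default_rng(7)
years, n_histories = 50, 2000
annual_rate, b = 0.2, 1.0                  # events/yr with M >= m_min near the site
m_min, m_max = 5.0, 7.5

def sample_magnitudes(n):
    """Truncated Gutenberg-Richter sampling by inverse transform."""
    u = rng.random(n)
    span = 10 ** (-b * m_min) - 10 ** (-b * m_max)
    return -np.log10(10 ** (-b * m_min) - u * span) / b

def pga_g(mags):
    """Hypothetical GMPE at a fixed 20 km distance, with lognormal scatter."""
    ln_median = -4.0 + 1.0 * mags - 1.3 * np.log(20.0)
    return np.exp(ln_median + 0.6 * rng.standard_normal(mags.shape))

max_shaking = np.zeros(n_histories)
for i in range(n_histories):
    n_events = rng.poisson(annual_rate * years)
    if n_events:
        max_shaking[i] = pga_g(sample_magnitudes(n_events)).max()

mapped = np.quantile(max_shaking, 0.90)    # level with 10% exceedance in 50 years
p10, p50, p90 = np.percentile(max_shaking, [10, 50, 90])
print(f"max 50-yr shaking across histories: 10th {p10:.3f} g, median {p50:.3f} g, 90th {p90:.3f} g")
print(f"fraction of histories exceeding the mapped level: {(max_shaking > mapped).mean():.2f}")
```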

  14. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    Science.gov (United States)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-09-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures have unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a physics-based approach is ground-motion syntheses derived from physics and an understanding of the earthquake process. This is an overview paper, and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction

  15. National Earthquake Hazards Program at a Crossroads

    Science.gov (United States)

    Showstack, Randy

    The U.S. National Earthquake Hazards Reduction Program, which turns 25 years old on 1 October 2003, is passing through two major transitions, which experts said either could weaken or strengthen the program. On 1 March, a federal government reorganization placed NEHRP's lead agency, the Federal Emergency Management Agency (FEMA), within the new Department of Homeland Security (DHS). A number of earthquake scientists and engineers expressed concern that NEHRP, which already faces budgetary and organizational challenges and lacks visibility, could end up being marginalized in the bureaucratic shuffle. Some experts, though, as well as agency officials, said they hope DHS will recognize synergies between dealing with earthquakes and terrorist attacks.

  16. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    Science.gov (United States)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), 1934 Bihar-Nepal earthquake (Mw 8.2), 1950 Assam earthquake (Mw 8.4), 2005 Kashmir earthquake (Mw 7.6), and 2015 Gorkha earthquake (Mw 7.8), is testimony to ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return periods. Some of the large magnitude earthquakes produced surface rupture, while others remained blind. Furthermore, owing to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, it is difficult to precisely determine the extent of surface rupture of these earthquakes, as well as of those that occurred during historic times. In this paper, we compile the paleoseismological data, recalibrate the radiocarbon ages from trenches excavated by previous workers along the entire Himalaya, and compare the resulting earthquake scenario with the past record. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting Mw to 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. The large magnitude Himalayan earthquakes, such as 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam, occurred within a time frame of 45 years. If such events can only be dated to within ±50 years, there is a high possibility that they may be considered remnants of one giant earthquake rupturing the entire Himalayan arc, leading to an overestimation of the seismic hazard scenario in the Himalaya.

  17. Assessment of earthquake-induced tsunami hazard at a power plant site

    International Nuclear Information System (INIS)

    Ghosh, A.K.

    2008-01-01

    This paper presents a study of the tsunami hazard due to submarine earthquakes at a power plant site on the east coast of India. The paper considers various sources of earthquakes from the tectonic information and records of past earthquakes and tsunamis. A magnitude-frequency relationship for the earthquake occurrence rate and a simplified model for tsunami run-up height as a function of earthquake magnitude and the distance between the source and the site have been developed. Finally, considering equal likelihood of generation of earthquakes anywhere on each of the faults, the tsunami hazard has been evaluated and presented as a relationship between tsunami height and its mean recurrence interval (MRI). The probability of exceedance of a certain wave height in a given period of time is also presented. These studies will be helpful in estimating the tsunami-induced flooding potential at the site.
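
    As a rough illustration of how an exceedance probability can be derived from a mean recurrence interval, the sketch below assumes Poisson arrivals of tsunamis exceeding a given wave height; the MRI value is a placeholder, not a result from the paper.

    ```python
    import math

    def prob_exceedance(mri_years: float, window_years: float) -> float:
        """Probability of at least one exceedance in a time window,
        assuming exceedances follow a Poisson process with mean
        recurrence interval `mri_years`."""
        return 1.0 - math.exp(-window_years / mri_years)

    # Hypothetical example: a wave height with a 500-year MRI
    # has roughly a 9.5% chance of being exceeded in 50 years.
    print(prob_exceedance(500.0, 50.0))
    ```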

  18. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    Science.gov (United States)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  19. Integrating Real-time Earthquakes into Natural Hazard Courses

    Science.gov (United States)

    Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.

    2001-12-01

    Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made bringing such real-time activities into the classroom problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response are lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). These students are responsible for reacting to the alarm

  20. The earthquake of January 13, 1915 and the seismic hazard of the area

    International Nuclear Information System (INIS)

    Scarascia Mugnozza, Gabriele; Hailemikael, Salomon; Martini, Guido

    2015-01-01

    The January 13, 1915, magnitude 7.0 Marsica earthquake devastated the Fucino basin and its surroundings, causing about 30,000 casualties and entirely destroying several towns, among them the major municipality of the area, the town of Avezzano. In this paper, we briefly review the main characteristics of the earthquake and its effects on the environment. Furthermore, based on the Italian building code and ongoing seismic microzonation investigations, we describe the seismic hazard of the area struck by the earthquake in terms of both probabilistic seismic hazard assessment and the contribution of site effects to the seismic hazard estimate. All the studies confirm the very high level of seismic hazard of the Fucino territory.

  1. Probabilistic earthquake hazard analysis for Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life; therefore, earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations, and epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The contour maps show that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).

  2. Earthquake Hazard and Risk in New Zealand

    Science.gov (United States)

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

    To quantify risk in New Zealand we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), updates several key source parameters. These updates include: implementation of a new set of crustal faults including multi-segment ruptures, updated subduction zone geometry and recurrence rates, and new background rates together with a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 from the 2002 model to the 2012 model, which now includes over 500 individual fault sources, among them many offshore faults in the northern, east-central, and southwestern regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard changes in our updated model with those from the previous version, and we discuss the changes between the two maps as well as their drivers. We then examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city with the majority of the risk exposure in the country (Auckland) lies in the region of lowest hazard, where little is known about the location of faults and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. Thus small changes to the background rates
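
    To make the risk metrics concrete, here is a minimal sketch of how an average annual loss might be approximated from a loss exceedance-frequency curve by numerical integration; the curve values are invented placeholders, not RMS model output.

    ```python
    # Hypothetical exceedance curve: annual exceedance frequency (per year)
    # for a set of loss thresholds (in an arbitrary currency).
    loss_levels = [1e6, 1e7, 1e8, 1e9]        # placeholder loss thresholds
    exceed_freq = [1e-1, 1e-2, 1e-3, 1e-4]    # placeholder annual frequencies

    def average_annual_loss(losses, freqs):
        """Approximate AAL as the area under the exceedance-frequency curve,
        integrated over loss with the trapezoidal rule."""
        aal = 0.0
        for i in range(len(losses) - 1):
            width = losses[i + 1] - losses[i]
            aal += 0.5 * (freqs[i] + freqs[i + 1]) * width
        return aal

    print(f"approximate AAL: {average_annual_loss(loss_levels, exceed_freq):,.0f}")
    ```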

  3. The wicked problem of earthquake hazard in developing countries: the example of Bangladesh

    Science.gov (United States)

    Steckler, M. S.; Akhter, S. H.; Stein, S.; Seeber, L.

    2017-12-01

    Many developing nations in earthquake-prone areas confront a tough problem: how much of their limited resources to use mitigating earthquake hazards? This decision is difficult because it is unclear when an infrequent major earthquake may happen, how big it could be, and how much harm it may cause. This issue faces nations with profound immediate needs and ongoing rapid urbanization. Earthquake hazard mitigation in Bangladesh is a wicked problem. It is the world's most densely populated nation, with 160 million people in an area the size of Iowa. Complex geology and sparse data make assessing a possibly-large earthquake hazard difficult. Hence it is hard to decide how much of the limited resources available should be used for earthquake hazard mitigation, given other more immediate needs. Per capita GDP is $1200, so Bangladesh is committed to economic growth and resources are needed to address many critical challenges and hazards. In their subtropical environment, rural Bangladeshis traditionally relied on modest mud or bamboo homes. Their rapidly growing, crowded capital, Dhaka, is filled with multistory concrete buildings likely to be vulnerable to earthquakes. The risk is compounded by the potential collapse of services and accessibility after a major temblor. However, extensive construction as the population shifts from rural to urban provides opportunity for earthquake-risk reduction. While this situation seems daunting, it is not hopeless. Robust risk management is practical, even for developing nations. It involves recognizing uncertainties and developing policies that should give a reasonable outcome for a range of the possible hazard and loss scenarios. Over decades, Bangladesh has achieved a thousandfold reduction in risk from tropical cyclones by building shelters and setting up a warning system. Similar efforts are underway for earthquakes. Smart investments can be very effective, even if modest. Hence, we suggest strategies consistent with high

  4. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    Science.gov (United States)

    McNamara, D. E.; Yeck, W. L.; Barnhart, W. D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, A.; Hough, S. E.; Benz, H. M.; Earle, P. S.

    2017-09-01

    The Gorkha earthquake on April 25th, 2015 was a long anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha, Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include: (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a 150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10-15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered high hazard for future damaging earthquakes.

  5. Earthquake Hazard for Aswan High Dam Area

    Science.gov (United States)

    Ismail, Awad

    2016-04-01

    Earthquake activity and seismic hazard analysis are important components of the seismic assessment of very essential structures such as major dams. The Aswan High Dam (AHD) created the second man-made reservoir in the world (Lake Nasser) and is constructed near urban areas, posing a high-risk potential for downstream life and property. The dam area is one of the seismically active regions in Egypt and is crossed by several faults, which trend dominantly east-west and north-south. Epicenters cluster around the active faults in the northern part of the lake and near the AHD location. The space-time distribution of the seismicity and its relation to the lake water level fluctuations were studied. The Aswan seismicity separates into shallow and deep seismic zones, between 0 and 14 km and between 14 and 30 km, respectively. These two seismic zones behave differently over time, as indicated by the seismicity rate, lateral extent, b-value, and spatial clustering, and the activity is characterized by earthquake swarm sequences showing activation of the clustered events over time and space. The effect of the North African drought (1982 to present) is clearly seen in the reservoir water level. As the water level decreased and left the most active fault segments uncovered, the shallow activity was found to be more sensitive to rapid discharging than to filling. This study indicates that geology, topography, lineations in seismicity, offsets in the faults, changes in fault trends, and focal mechanisms are closely related. No relation was found between earthquake activity and either ground-water table fluctuations or water temperatures measured in wells located around the Kalabsha area. The peak ground acceleration at the dam site is estimated based on strong ground motion simulation. These seismic hazard analyses indicate that the AHD is stable under the present seismicity. The most recent earthquake epicenters are located approximately 5 km west of the AHD structure. This suggests that the AHD must be

  6. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    Science.gov (United States)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records extending back more than 1000 years and an updated, homogenized, and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude M max, are estimated using a maximum likelihood algorithm that allows for the incompleteness of the catalog. To compute the hazard, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree that accounts for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years assuming bedrock conditions. The resulting PGA and SA maps show significant spatio-temporal variation in the hazard values. In general, the hazard is found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
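
    For readers unfamiliar with the seismicity parameters mentioned above, the sketch below shows the classical Aki/Utsu maximum likelihood estimate of the Gutenberg-Richter b value and the mean annual rate above a completeness magnitude. The catalog is a made-up toy list, and this is a generic illustration, not the authors' exact algorithm (which also handles catalog incompleteness).

    ```python
    import math

    def gutenberg_richter_mle(magnitudes, m_c, bin_width=0.1, years=100.0):
        """Aki/Utsu maximum likelihood b value and mean annual rate gamma
        for events at or above the completeness magnitude m_c."""
        mags = [m for m in magnitudes if m >= m_c]
        mean_mag = sum(mags) / len(mags)
        # Utsu correction shifts m_c down by half a magnitude bin.
        b_value = math.log10(math.e) / (mean_mag - (m_c - bin_width / 2.0))
        gamma = len(mags) / years  # mean number of events per year above m_c
        return b_value, gamma

    # Toy catalog of magnitudes (placeholder values only).
    toy_catalog = [4.1, 4.3, 4.0, 5.2, 4.6, 4.8, 6.1, 4.2, 4.4, 5.5]
    print(gutenberg_richter_mle(toy_catalog, m_c=4.0))
    ```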

  7. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    Science.gov (United States)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers, and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science is responsible for not coping with the challenging changes of Exposure and its Vulnerability inflicted by growing population, its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe Society for this lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e., log N(M,L) = A - B·(M-6) + C·log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of USLE are used to estimate, first, the expected maximum magnitude in a time interval at each seismically prone cell of a uniform grid covering the region of interest, and then the corresponding expected ground shaking parameters, including macro-seismic intensity. After rigorous testing against the available seismic evidence from the past (e.g., historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on USLE is illustrated by application to the seismic region of the Greater Caucasus.
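
    As a simple illustration of how the USLE relation quoted above can be evaluated, the sketch below computes the expected annual number of earthquakes of magnitude M for an area of linear dimension L; the coefficient values are placeholders chosen for illustration, not the calibrated parameters of the study.

    ```python
    import math

    def usle_annual_rate(magnitude, length_km, A, B, C):
        """Expected annual number of earthquakes N(M, L) from the
        Unified Scaling Law for Earthquakes:
            log10 N(M, L) = A - B * (M - 6) + C * log10(L)."""
        log_n = A - B * (magnitude - 6.0) + C * math.log10(length_km)
        return 10.0 ** log_n

    # Placeholder coefficients (illustrative only, not calibrated values).
    A, B, C = -2.5, 0.9, 1.2
    print(usle_annual_rate(magnitude=6.0, length_km=100.0, A=A, B=B, C=C))
    ```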

  8. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah

    Science.gov (United States)

    Gori, Paula L.

    1993-01-01

    engineering studies. Translated earthquake hazard maps have also been developed to identify areas that are particularly vulnerable to various causes of damage such as ground shaking, surface rupturing, and liquefaction. The implementation of earthquake hazard reduction plans is now under way in various communities in Utah. The results of a survey presented in this paper indicate that technical public officials (planners and building officials) have an understanding of the earthquake hazards and how to mitigate the risks. Although the survey shows that the general public has a slightly lower concern about the potential for economic losses, they recognize the potential problems and can support a number of earthquake mitigation measures. The study suggests that many community groups along the Wasatch Front, including volunteer groups, business groups, and elected and appointed officials, are ready for action-oriented educational programs. These programs could lead to a significant reduction in the risks associated with earthquake hazards. A DATA BASE DESIGNED FOR URBAN SEISMIC HAZARDS STUDIES: A computerized data base has been designed for use in urban seismic hazards studies conducted by the U.S. Geological Survey. The design includes file structures for 16 linked data sets, which contain geological, geophysical, and seismological data used in preparing relative ground response maps of large urban areas. The data base is organized along relational data base principles. A prototype urban hazards data base has been created for evaluation in two urban areas currently under investigation: the Wasatch Front region of Utah and the Puget Sound area of Washington. The initial implementation of the urban hazards data base was accomplished on a microcomputer using dBASE III Plus software and transferred to minicomputers and a work station. A MAPPING OF GROUND-SHAKING INTENSITIES FOR SALT LAKE COUNTY, UTAH: This paper documents the development of maps showing a

  9. Up-to-date Probabilistic Earthquake Hazard Maps for Egypt

    Science.gov (United States)

    Gaber, Hanan; El-Hadidy, Mahmoud; Badawy, Ahmed

    2018-04-01

    An up-to-date earthquake hazard analysis has been performed for Egypt using a probabilistic seismic hazard approach. In the current study, we use a complete and homogeneous earthquake catalog covering the time period between 2200 BC and 2015 AD. Three seismotectonic models representing the seismic activity in and around Egypt are used. A logic-tree framework is applied to allow for the epistemic uncertainty in the declustering parameters, minimum magnitude, seismotectonic setting and ground-motion prediction equations. The hazard analysis is performed on a 0.5° × 0.5° grid for rock site conditions, for peak ground acceleration (PGA) and spectral acceleration at 0.2-, 0.5-, 1.0- and 2.0-s periods. The hazard is estimated for three return periods (72, 475 and 2475 years) corresponding to 50, 10 and 2% probabilities of exceedance in 50 years. The uniform hazard spectra for the cities of Cairo, Alexandria, Aswan and Nuwbia are constructed. The hazard maps show that the highest ground acceleration values are expected in the northeastern part of Egypt around the Gulf of Aqaba (PGA up to 0.4 g for a 475-year return period) and in southern Egypt around the city of Aswan (PGA up to 0.2 g for a 475-year return period). The Western Desert of Egypt is characterized by the lowest level of hazard (PGA lower than 0.1 g for a 475-year return period).
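
    The correspondence between exceedance probabilities and return periods quoted above follows from the usual Poisson assumption; the small sketch below reproduces that conversion (a generic relation, not code from the paper).

    ```python
    import math

    def return_period(prob_exceedance: float, window_years: float) -> float:
        """Return period implied by a probability of exceedance over a
        time window, assuming Poisson occurrence of exceedances."""
        return -window_years / math.log(1.0 - prob_exceedance)

    # 50% in 50 years -> ~72 years; 10% -> ~475 years; 2% -> ~2475 years.
    for p in (0.50, 0.10, 0.02):
        print(f"{p:.0%} in 50 yr -> {return_period(p, 50.0):.0f} yr")
    ```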

  10. Earthquake Hazard Assessment: an Independent Review

    Science.gov (United States)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null-hypothesis as a measure of the alerted space, is evident, and such testing must be done before claiming hazardous areas and/or times. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters with respect to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making with regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".

  11. The 2012 Ferrara seismic sequence: Regional crustal structure, earthquake sources, and seismic hazard

    Science.gov (United States)

    Malagnini, Luca; Herrmann, Robert B.; Munafò, Irene; Buttinelli, Mauro; Anselmi, Mario; Akinci, Aybige; Boschi, E.

    2012-10-01

    Inadequate seismic design codes can be dangerous, particularly when they underestimate the true hazard. In this study we use data from a sequence of moderate-sized earthquakes in northeast Italy to validate and test a regional wave propagation model which, in turn, is used to understand some weaknesses of the current design spectra. Our velocity model, while regionalized and somewhat ad hoc, is consistent with geophysical observations and the local geology. In the 0.02-0.1 Hz band, this model is validated by using it to calculate moment tensor solutions of 20 earthquakes (5.6 ≥ MW ≥ 3.2) in the 2012 Ferrara, Italy, seismic sequence. The seismic spectra observed for the relatively small main shock significantly exceeded the design spectra to be used in the area for critical structures. Observations and synthetics reveal that the ground motions are dominated by long-duration surface waves, which, apparently, the design codes do not adequately anticipate. In light of our results, the present seismic hazard assessment in the entire Pianura Padana, including the city of Milan, needs to be re-evaluated.

  12. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    Science.gov (United States)

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes that have occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network, supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large earthquake rates would be grossly underestimated by extrapolating small earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude
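
    To illustrate the "kinked" magnitude distribution described above, here is a minimal sketch of a bilinear Gutenberg-Richter rate model with b near 1.0 below M 5.0 and near 0.6 above it; the a-value is a placeholder, so the absolute rates are illustrative only.

    ```python
    def kinked_gr_annual_rate(magnitude, a=3.0, m_kink=5.0, b_low=1.0, b_high=0.6):
        """Annual rate of earthquakes at or above `magnitude` for a bilinear
        (kinked) Gutenberg-Richter relation, continuous at the kink magnitude.
        The a-value is a placeholder, not a calibrated Hawaiian value."""
        if magnitude < m_kink:
            log_n = a - b_low * magnitude
        else:
            log_n = a - b_low * m_kink - b_high * (magnitude - m_kink)
        return 10.0 ** log_n

    for m in (4.0, 5.0, 6.0, 7.0):
        print(m, kinked_gr_annual_rate(m))
    ```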

  13. Earthquake Hazard in the New Madrid Seismic Zone Remains a Concern

    Science.gov (United States)

    Frankel, A.D.; Applegate, D.; Tuttle, M.P.; Williams, R.A.

    2009-01-01

    There is broad agreement in the scientific community that a continuing concern exists for a major destructive earthquake in the New Madrid seismic zone. Many structures in Memphis, Tenn., St. Louis, Mo., and other communities in the central Mississippi River Valley region are vulnerable and at risk from severe ground shaking. This assessment is based on decades of research on New Madrid earthquakes and related phenomena by dozens of Federal, university, State, and consulting earth scientists. Considerable interest has developed recently from media reports that the New Madrid seismic zone may be shutting down. These reports stem from published research that used global positioning system (GPS) instruments to make geodetic measurements of strain in the Earth's crust. Because of a lack of measurable strain at the surface in some areas of the seismic zone over the past 14 years, arguments have been advanced that there is no buildup of stress at depth within the New Madrid seismic zone and that the zone may no longer pose a significant hazard. As part of the consensus-building process used to develop the national seismic hazard maps, the U.S. Geological Survey (USGS) convened a workshop of experts in 2006 to evaluate the latest findings in earthquake hazards in the Eastern United States. These experts considered the GPS data from New Madrid available at that time, which also showed little to no ground movement at the surface. The experts did not find the GPS data to be a convincing reason to lower the assessment of earthquake hazard in the New Madrid region, especially in light of the many other types of data that are used to construct the hazard assessment, several of which are described here.

  14. The Wenchuan, China M8.0 Earthquake: A Lesson and Implication for Seismic Hazard Mitigation

    Science.gov (United States)

    Wang, Z.

    2008-12-01

    The Wenchuan, China M8.0 earthquake caused great damage and enormous casualties: 69,197 people were killed, 374,176 people were injured, and 18,341 people are still missing. The estimated direct economic loss is about 126 billion U.S. dollars. The Wenchuan earthquake again demonstrated that the earthquake itself does not kill people; the built environment and induced hazards, landslides in particular, do. Therefore, it is critical to strengthen the built environment, such as buildings and bridges, and to mitigate the induced hazards in order to avoid such disasters. As part of the so-called North-South Seismic Zone in China, the Wenchuan earthquake occurred along the Longmen Shan thrust belt, which forms the boundary between the Qinghai-Tibet Plateau and the Sichuan basin, and there is a long history (~4,000 years) of seismicity in the area. The historical records show that the area experienced high intensity (i.e., greater than IX) in the past several thousand years. In other words, the area is well known to have high seismic hazard because of its tectonic setting and seismicity. However, only intensity VII (0.1 to 0.15g PGA) had been considered in the seismic design of the built environment in the area. This was one of the main reasons that so many buildings collapsed, particularly school buildings, during the Wenchuan earthquake. It is clear that the seismic design (i.e., the design ground motion or intensity) was not adequate in the Wenchuan earthquake-stricken area. A lesson can be learned from the Wenchuan earthquake on seismic hazard and risk assessment. A lesson can also be learned from this earthquake on seismic hazard mitigation and/or seismic risk reduction.

  15. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to dedicated HPC clusters up to Cloud computing. In this way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs by end-users who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios. We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large scale hazard maps of ground shaking at bedrock to the definition of local scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.

  16. Monitoring Geologic Hazards and Vegetation Recovery in the Wenchuan Earthquake Region Using Aerial Photography

    Directory of Open Access Journals (Sweden)

    Zhenwang Li

    2014-03-01

    On 12 May 2008, the 8.0-magnitude Wenchuan earthquake occurred in Sichuan Province, China, triggering thousands of landslides, debris flows, and barrier lakes, leading to a substantial loss of life and damage to the local environment and infrastructure. This study aimed to monitor the status of geologic hazards and vegetation recovery in the post-earthquake disaster area using high-resolution aerial photography from 2008 to 2011, acquired from the Center for Earth Observation and Digital Earth (CEODE), Chinese Academy of Sciences. The distribution and range of hazards were identified in 15 large, representative geologic hazard areas triggered by the Wenchuan earthquake. After conducting an overlay analysis, the variations of these hazards between successive years were analyzed to characterize geologic hazard development and vegetation recovery. The results showed that in the first year after the Wenchuan earthquake, debris flows occurred frequently and with high intensity. Subsequently, as the source material became less available and the slope structure stabilized, the intensity and frequency of debris flows gradually decreased with time. The development rate of debris flows between 2008 and 2011 was 3% per year. Lithology played a dominant role in the formation of debris flows, and the topography and hazard size in the earthquake-affected area also influenced the debris flow development process. Meanwhile, the overall geologic hazard area decreased at 12% per year, and vegetation recovery on the landslide masses was 15% to 20% per year between 2008 and 2011. The outcomes of this study provide supporting data for ecological recovery as well as debris flow control and prevention projects in hazard-prone areas.

  17. Crustal structure and Seismic Hazard studies in Nigeria from ambient noise and earthquakes

    Science.gov (United States)

    Kadiri, U. A.

    2016-12-01

    Crustal structure, upper mantle, and seismic hazard studies have been carried out in Nigeria using ambient noise and earthquake data. The data were acquired from stations in Nigeria and from international agencies. Firstly, known depths of sediments in the Lower Benue Trough (LBT) were collected from wells; the resonance frequency (Fo) and average shear-wave velocities (Vs) were then computed using Matlab. Secondly, average velocities were estimated from noise cross-correlation along paths between seismic stations. Thirdly, the Moho depths beneath the Ife, Kaduna and Nsukka stations were estimated, as well as the Vp/Vs ratio, using a 2009 earthquake with its epicenter in Nigeria. Finally, statistical analysis and probabilistic seismic hazard assessment (PSHA) were used to compute seismic hazard parameters for Nigeria and its surroundings. The results showed that soils on the LBT, with an average shear wave velocity of about 5684 m/s, would experience more amplification in case of an earthquake than the basement complex in Nigeria. The Vs beneath the seismic stations in Nigeria was estimated as 288 m/s, 1019 m/s, 940.6 m/s and 255.02 m/s at Ife, Nsukka, Awka, and Abakaliki respectively. The average velocity along the station paths was 4.5 km/s, and the Vp and Vs for the 100-500 km depth profile in parts of South West Nigeria increased from about 5.83 to 6.42 km/s and 3.48 to 6.31 km/s respectively, with the Vp/Vs ratio decreasing from 1.68 to 1.02. Statistical analysis revealed a trend of increasing earthquake occurrence along the Mid-Atlantic Ridge, tending toward the West African region. The PSHA shows the likelihood of earthquakes of different magnitudes occurring in Nigeria and other parts of West Africa in the future. This work is aimed at addressing critical issues regarding site effect characterization, improved earthquake location, and robust seismic hazard assessment for planning and the choice of sites for critical facilities in Nigeria. Keywords: Sediment thickness, Resonance Frequency, Average Velocity, Seismic Hazard, Nigeria

  18. The 2016 Kumamoto Earthquakes: Cascading Geological Hazards and Compounding Risks

    Directory of Open Access Journals (Sweden)

    Katsuichiro Goda

    2016-08-01

    A sequence of two strike-slip earthquakes occurred on 14 and 16 April 2016 in the intraplate region of Kyushu Island, Japan, away from the subduction zones, and caused significant damage and disruption to the Kumamoto region. Analyses of the regional seismic catalog and available strong motion recordings reveal striking characteristics of the events, such as migrating seismicity, earthquake surface rupture, and major foreshock-mainshock earthquake sequences. To gain valuable lessons from the events, a UK Earthquake Engineering Field Investigation Team (EEFIT) was dispatched to Kumamoto, and earthquake damage surveys were conducted to relate the observed earthquake characteristics to the building and infrastructure damage caused by the earthquakes. The lessons learnt from the reconnaissance mission have important implications for current seismic design practice regarding the required seismic resistance of structures under multiple shocks and the seismic design of infrastructure subject to large ground deformation. The observations also highlight the consequences of cascading geological hazards on community resilience. To share the gathered damage data widely, geo-tagged photos are organized using Google Earth and the kmz file is made publicly available.

  19. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    Science.gov (United States)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET); 2) establishing three North Anatolian fault crossing profiles (10 sites per profile) at locations that experienced major surface-faulting earthquakes at different times in the past, to examine strain accumulation as a function of time in the earthquake cycle (2004); 3) repeat observations of selected sites in the fault-crossing profiles (2005); 4) repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation; 5) refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area; 6) continuing development of models for afterslip and distributed viscoelastic deformation over the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager and Eric Hetland) who are developing models for southern California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003-2005.

  20. 2017 One‐year seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes

    Science.gov (United States)

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert; Llenos, Andrea L.; Ellsworth, William L.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2017-01-01

    We produce a one‐year 2017 seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one‐year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic‐hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma–Kansas, the Raton basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 moment magnitude (M) ≥4 and 3 M≥5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma–Kansas focus area, two earthquakes with M≥4 occurred near Trinidad, Colorado (in the Raton basin focus area), but no earthquakes with M≥2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground‐shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared with 2015, which may be related to decreased wastewater injection caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.

  1. How Can Museum Exhibits Enhance Earthquake and Tsunami Hazard Resiliency?

    Science.gov (United States)

    Olds, S. E.

    2015-12-01

    Creating a natural disaster-ready community requires interoperating scientific, technical, and social systems. In addition to the technical elements that need to be in place, communities and individuals need to be prepared to react when a natural hazard event occurs. Natural hazard awareness and preparedness training and education often take place through informal learning at science centers and formal K-12 education programs, as well as through awareness raising via strategically placed informational tsunami warning signs and placards. Museums and science centers are influential in raising science literacy within a community; however, can science centers enhance earthquake and tsunami resiliency by providing hazard science content and preparedness exhibits? Museum docents and informal educators are uniquely situated within the community. They are transmitters and translators of science information to broad audiences. Through interaction with the public, docents are well positioned to be informants of the knowledge, beliefs, and feelings of science center visitors. They themselves are life-long learners, both constantly learning from the museum content around them and sharing this content with visitors. They are also members of the communities where they live. In-depth interviews with museum informal educators and docents were conducted at a science center in the coastal Pacific Northwest. This region has the potential to be struck by a great 9+ Mw earthquake and subsequent tsunami. During the interviews, docents described how they applied learning from natural hazard exhibits at the science visitor center to their daily lives. During the individual interviews, the museum docents described their awareness (knowledge, attitudes, and behaviors) of natural hazards where they live and work, the feelings evoked as they learned about their hazard vulnerability, the extent to which they applied this learning and awareness to their lives, such as creating an evacuation plan, whether

  2. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States

    Science.gov (United States)

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a one-year (2016) probabilistic seismic-hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes, constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short-term seismic activity rates (primarily 2014-2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground-motion models. These alternatives represent uncertainties in how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and to the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. Both alternative submodel hazard maps depict high hazard, and they are combined in the final model. Results depict several ground-shaking measures as well as intensity and include maps showing a high-hazard level (1% probability of exceedance in 1 year or greater). Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north-central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in
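
    To illustrate the equal weighting of the two rate submodels described above, here is a minimal sketch that averages two hazard curves (annual exceedance rates versus ground-motion level) with equal logic-tree weights; the curve values are placeholders, not USGS model output.

    ```python
    # Placeholder annual frequencies of exceedance at the same PGA levels
    # for an "informed" and an "adaptive" submodel (illustrative values).
    pga_levels = [0.05, 0.1, 0.2, 0.4, 0.6]           # g
    informed   = [3e-2, 1e-2, 3e-3, 6e-4, 1e-4]
    adaptive   = [2e-2, 8e-3, 2e-3, 4e-4, 8e-5]

    weights = {"informed": 0.5, "adaptive": 0.5}       # equal logic-tree weights

    combined = [
        weights["informed"] * fi + weights["adaptive"] * fa
        for fi, fa in zip(informed, adaptive)
    ]

    for pga, rate in zip(pga_levels, combined):
        print(f"PGA {pga:.2f} g: combined annual exceedance rate {rate:.1e}")
    ```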

  3. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    Science.gov (United States)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively, based on late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process for large, important projects, for example, dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published
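
    As a schematic of the deterministic (DSHA) workflow described above, the sketch below takes the MCE magnitude and closest distance for each fault, evaluates a generic attenuation relationship of the form ln(PGA) = c0 + c1*M - c2*ln(R + c3), and keeps the largest value at the site. The fault list and coefficients are invented placeholders, not CDMG/CGS values or any published attenuation relationship.

    ```python
    import math

    # Hypothetical faults: (name, MCE moment magnitude, closest distance to site in km).
    faults = [
        ("Fault A", 7.5, 12.0),
        ("Fault B", 6.8, 5.0),
        ("Fault C", 8.0, 60.0),
    ]

    def schematic_pga(magnitude, distance_km, c0=-3.5, c1=0.9, c2=1.2, c3=10.0):
        """Median PGA (g) from a schematic attenuation relationship
        ln(PGA) = c0 + c1*M - c2*ln(R + c3). Coefficients are placeholders."""
        return math.exp(c0 + c1 * magnitude - c2 * math.log(distance_km + c3))

    # DSHA-style site value: the largest median MCE ground motion over all faults.
    controlling = max(faults, key=lambda f: schematic_pga(f[1], f[2]))
    print("controlling fault:", controlling[0],
          "PGA ~", round(schematic_pga(controlling[1], controlling[2]), 2), "g")
    ```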

  4. Earthquake hazard assessment in the Zagros Orogenic Belt of Iran using a fuzzy rule-based model

    Science.gov (United States)

    Farahi Ghasre Aboonasr, Sedigheh; Zamani, Ahmad; Razavipour, Fatemeh; Boostani, Reza

    2017-08-01

    Producing accurate seismic hazard maps and predicting hazardous areas is necessary for risk mitigation strategies. In this paper, a fuzzy logic inference system is utilized to estimate the earthquake potential and seismic zoning of the Zagros Orogenic Belt. In addition to their interpretability, fuzzy predictors can capture both nonlinearity and chaotic behavior of data where the number of data points is limited. Here, earthquake patterns in the Zagros have been assessed for intervals of 10 and 50 years using a fuzzy rule-based model. The Molchan statistical procedure has been used to show that our forecasting model is reliable. The earthquake hazard maps for this area reveal some remarkable features that cannot be observed on conventional maps. Notably, some areas in the southern (Bandar Abbas), southwestern (Bandar Kangan) and western (Kermanshah) parts of Iran display high earthquake severity even though they are geographically far apart.
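
    For readers unfamiliar with fuzzy rule-based prediction, the sketch below shows a zero-order Sugeno-style inference with two triangular membership functions over a single seismicity indicator; the membership breakpoints, rule outputs, and input value are invented for illustration and bear no relation to the Zagros rule base.

    ```python
    def triangular(x, left, peak, right):
        """Triangular membership function returning a degree in [0, 1]."""
        if x <= left or x >= right:
            return 0.0
        if x <= peak:
            return (x - left) / (peak - left)
        return (right - x) / (right - peak)

    def fuzzy_hazard(activity_rate):
        """Zero-order Sugeno inference: each rule maps a fuzzy set on the
        input to a crisp hazard level; the output is the weighted average."""
        # Rule 1: IF activity is LOW  THEN hazard = 0.2 (placeholder output)
        # Rule 2: IF activity is HIGH THEN hazard = 0.8 (placeholder output)
        w_low = triangular(activity_rate, 0.0, 1.0, 5.0)
        w_high = triangular(activity_rate, 2.0, 8.0, 14.0)
        total = w_low + w_high
        if total == 0.0:
            return 0.0
        return (w_low * 0.2 + w_high * 0.8) / total

    print(fuzzy_hazard(3.0))  # hypothetical events per year in a grid cell
    ```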

  5. Kinematics, mechanics, and potential earthquake hazards for faults in Pottawatomie County, Kansas, USA

    Science.gov (United States)

    Ohlmacher, G.C.; Berendsen, P.

    2005-01-01

    Many stable continental regions have subregions with poorly defined earthquake hazards. Analysis of minor structures (folds and faults) in these subregions can improve our understanding of the tectonics and earthquake hazards. Detailed structural mapping in Pottawatomie County has revealed a suite consisting of two uplifted blocks aligned along a northeast trend and surrounded by faults. The first uplift is located southwest of the second. The northwest and southeast sides of these uplifts are bounded by northeast-trending right-lateral faults. To the east, both uplifts are bounded by north-trending reverse faults, and the first uplift is bounded by a north-trending high-angle fault to the west. The structural suite occurs above a basement fault that is part of a series of north-northeast-trending faults that delineate the Humboldt Fault Zone of eastern Kansas, an integral part of the Midcontinent Rift System. The favored kinematic model is a contractional stepover (push-up) between echelon strike-slip faults. Mechanical modeling using the boundary element method supports the interpretation of the uplifts as contractional stepovers and indicates that an approximately east-northeast maximum compressive stress trajectory is responsible for the formation of the structural suite. This stress trajectory suggests potential activity during the Laramide Orogeny, which agrees with the age of kimberlite emplacement in adjacent Riley County. The current stress field in Kansas has a N85°W maximum compressive stress trajectory that could potentially produce earthquakes along the basement faults. Several epicenters of seismic events (

  6. Baseline geophysical data for hazard management in coastal areas in relation to earthquakes and tsunamis

    Digital Repository Service at National Institute of Oceanography (India)

    Murthy, K.S.R.

    … is another factor for some of the intraplate earthquakes in the South Indian Shield, which includes the Eastern and Western Continental Margins of India. Baseline geophysical data for hazard management in coastal areas in relation to earthquakes … surge. Keywords: hazard management, marine geophysical data, geomorphology and tsunami surge, coastal seismicity.

  7. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Science.gov (United States)

    2010-08-17

    ... accommodate Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov... of Technology, 365 Innovation Drive, Memphis, TN 38152-3115. Please note admittance instructions...: Trends and developments in the science and engineering of earthquake hazards reduction; The effectiveness...

  8. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    Science.gov (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard, taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the Kramer and Mayfield procedure to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore, we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located
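    A compressed sketch of the performance-based combination described above is given below: hypothetical joint (PGA, magnitude) annual rates stand in for the PSHA disaggregation, and a placeholder logistic triggering curve stands in for the SPT/CPT-based conditional-probability models; the annual rate of liquefaction is their rate-weighted sum and its reciprocal is the return period. All numerical values are assumptions for illustration.

```python
import math

# Hypothetical joint annual rates of (PGA [g], magnitude) bins, as would come from
# PSHA disaggregation; these numbers are placeholders, not the Dunaharaszti results.
disagg = [
    (0.05, 5.0, 2.0e-2),
    (0.10, 5.5, 8.0e-3),
    (0.20, 6.0, 2.0e-3),
    (0.30, 6.5, 5.0e-4),
]

def p_liq_given_shaking(pga_g, magnitude, crr=0.15):
    """Illustrative conditional probability of liquefaction: a logistic function of a
    simplified cyclic stress ratio (with a common magnitude scaling form) against a
    soil resistance term `crr`. Placeholder for an SPT/CPT-based triggering model."""
    msf = 10 ** 2.24 / magnitude ** 2.56    # magnitude scaling factor (common form)
    csr = 0.65 * pga_g * 1.5                # assumes sigma_v/sigma'_v ~ 1.5, rd ~ 1
    x = (csr / msf - crr) / 0.05            # arbitrary slope of the logistic curve
    return 1.0 / (1.0 + math.exp(-x))

# Performance-based combination: the annual rate of liquefaction is the rate-weighted
# sum of the conditional probabilities over all (PGA, M) scenarios.
annual_rate = sum(rate * p_liq_given_shaking(pga, m) for pga, m, rate in disagg)
print(f"annual rate of liquefaction ~ {annual_rate:.2e}, return period ~ {1/annual_rate:.0f} yr")
```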

  9. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    Science.gov (United States)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project entitled "Network of Research Infrastructures for European Seismology, NERIES". This work consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters on the basis of rapid inversion of data from on-line regional broadband stations. It also includes an estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. By using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships.
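    The following sketch illustrates the general flavor of a multi-level rapid loss estimate (predicted intensity per district, a vulnerability curve giving a mean damage ratio, and exposed replacement values); the districts, curve parameters, and exposure values are invented and do not reproduce ELER's inventory or vulnerability models.

```python
import math

def mean_damage_ratio(intensity_mmi, theta=8.5, beta=0.15):
    """Lognormal-CDF-shaped vulnerability curve in macroseismic intensity (placeholder)."""
    return 0.5 * (1.0 + math.erf(math.log(intensity_mmi / theta) / (beta * math.sqrt(2.0))))

districts = [
    # (name, predicted intensity (MMI), exposed replacement value in million EUR) -- hypothetical
    ("District A", 8.0, 1200.0),
    ("District B", 7.0, 800.0),
    ("District C", 6.0, 2500.0),
]

total = 0.0
for name, mmi, value in districts:
    loss = mean_damage_ratio(mmi) * value   # mean damage ratio times exposed value
    total += loss
    print(f"{name}: MDR = {mean_damage_ratio(mmi):.2f}, loss ~ {loss:.0f} M EUR")
print(f"Total estimated loss ~ {total:.0f} M EUR")
```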

  10. Effects of Strike-Slip Fault Segmentation on Earthquake Energy and Seismic Hazard

    Science.gov (United States)

    Madden, E. H.; Cooke, M. L.; Savage, H. M.; McBeck, J.

    2014-12-01

    Many major strike-slip faults are segmented along strike, including those along plate boundaries in California and Turkey. Failure of distinct fault segments at depth may be the source of multiple pulses of seismic radiation observed for single earthquakes. However, how and when segmentation affects fault behavior and energy release is the basis of many outstanding questions related to the physics of faulting and seismic hazard. These include the probability for a single earthquake to rupture multiple fault segments and the effects of segmentation on earthquake magnitude, radiated seismic energy, and ground motions. Using numerical models, we quantify components of the earthquake energy budget, including the tectonic work acting externally on the system, the energy of internal rock strain, the energy required to overcome fault strength and initiate slip, the energy required to overcome frictional resistance during slip, and the radiated seismic energy. We compare the energy budgets of systems of two en echelon fault segments with various spacing that include both releasing and restraining steps. First, we allow the fault segments to fail simultaneously and capture the effects of segmentation geometry on the earthquake energy budget and on the efficiency with which applied displacement is accommodated. Assuming that higher efficiency correlates with higher probability for a single, larger earthquake, this approach has utility for assessing the seismic hazard of segmented faults. Second, we nucleate slip along a weak portion of one fault segment and let the quasi-static rupture propagate across the system. Allowing fractures to form near faults in these models shows that damage develops within releasing steps and promotes slip along the second fault, while damage develops outside of restraining steps and can prohibit slip along the second fault. Work is consumed in both the propagation of and frictional slip along these new fractures, impacting the energy available

  11. Aftereffects of Subduction-Zone Earthquakes: Potential Tsunami Hazards along the Japan Sea Coast.

    Science.gov (United States)

    Minoura, Koji; Sugawara, Daisuke; Yamanoi, Tohru; Yamada, Tsutomu

    2015-10-01

    The 2011 Tohoku-Oki Earthquake is a typical subduction-zone earthquake and is the 4th largest earthquake since the beginning of instrumental observation of earthquakes in the 19th century. In fact, the 2011 Tohoku-Oki Earthquake displaced the northeast Japan island arc horizontally and vertically. The displacement largely changed the tectonic situation of the arc from compressive to tensile. The 9th century in Japan was a period of natural hazards caused by frequent large-scale earthquakes. The aseismic tsunamis that inflicted damage on the Japan Sea coast in the 11th century were related to the occurrence of massive earthquakes that represented the final stage of a period of high seismic activity. Anti-compressive tectonics triggered by the subduction-zone earthquakes induced gravitational instability, which resulted in the generation of tsunamis caused by slope failure at the arc-back-arc boundary. The crustal displacement after the 2011 earthquake implies an increased risk of unexpected local tsunami flooding in the Japan Sea coastal areas.

  12. Seismic hazard maps for earthquake-resistant construction designs

    International Nuclear Information System (INIS)

    Ohkawa, Izuru

    2004-01-01

    Based on the idea that seismic phenomena in Japan, which vary from locality to locality, should be reflected in the design of specific nuclear facilities at specific sites, the present research program was started to produce seismic hazard maps representing the geographical distribution of seismic load factors. First, recent research data on historical earthquakes and materials on active faults in Japan were documented. Differences in character due to locality are expressed as dynamic loads that take specific building properties into consideration. Next, the hazard evaluation corresponding to the seismic-resistance factor is given as the response index (spectrum) of an adequately selected building, for example a nuclear power station, with the help of the results of statistical analysis. (S. Ohno)

  13. Reducing Vulnerability of Ports and Harbors to Earthquake and Tsunami Hazards

    Science.gov (United States)

    Wood, Nathan J.; Good, James W.; Goodwin, Robert F.

    2002-01-01

    Recent scientific research suggests the Pacific Northwest could experience catastrophic earthquakes in the near future, both from distant and local sources, posing a significant threat to coastal communities. Damage could result from numerous earthquake-related hazards, such as severe ground shaking, soil liquefaction, landslides, land subsidence/uplift, and tsunami inundation. Because of their geographic location, ports and harbors are especially vulnerable to these hazards. Ports and harbors, however, are important components of many coastal communities, supporting numerous activities critical to the local and regional economy and possibly serving as vital post-event, response-recovery transportation links. A collaborative, multi-year initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to earthquake and tsunami hazards, involving Oregon Sea Grant (OSG), Washington Sea Grant (WSG), the National Oceanic and Atmospheric Administration Coastal Services Center (CSC), and the U.S. Geological Survey Center for Science Policy (CSP). Specific products of this research, planning, and outreach initiative include a regional stakeholder issues and needs assessment, a community-based mitigation planning process, a Geographic Information System (GIS) — based vulnerability assessment methodology, an educational web-site and a regional data archive. This paper summarizes these efforts, including results of two pilot port-harbor community projects, one in Yaquina Bay, Oregon and the other in Sinclair Inlet, Washington. Finally, plans are outlined for outreach to other port and harbor communities in the Pacific Northwest and beyond, using "getting started" workshops and a web-based tutorial.

  14. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    Science.gov (United States)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  15. Probabilistic tsunami hazard assessment based on the long-term evaluation of subduction-zone earthquakes along the Sagami Trough, Japan

    Science.gov (United States)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Maeda, T.; Matsuyama, H.; Toyama, N.; Kito, T.; Murata, Y.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.; Hakamata, T.

    2017-12-01

    For the forthcoming large earthquakes along the Sagami Trough, where the Philippine Sea Plate is subducting beneath the northeast Japan arc, the Earthquake Research Committee (ERC) / Headquarters for Earthquake Research Promotion, Japanese government (2014a) assessed that M7- and M8-class earthquakes will occur there and defined the possible extent of the earthquake source areas. The occurrence probability within the next 30 years (from Jan. 1, 2014) was assessed as 70% for the M7-class earthquakes and 0%-5% for the M8-class earthquakes. First, we set 10 possible earthquake source areas (ESAs) for M8-class earthquakes and 920 ESAs for M7-class earthquakes. Next, we constructed 125 characterized earthquake fault models (CEFMs) for M8-class earthquakes and 938 CEFMs for M7-class earthquakes, based on the "tsunami recipe" of ERC (2017) (Kitoh et al., 2016, JpGU). All the CEFMs are allowed to have a large slip area to express fault slip heterogeneity. For all the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation, using FDM, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. Finally, we re-distributed the occurrence probability over all CEFMs (Abe et al., 2014, JpGU) and gathered exceedance probabilities for variable tsunami heights, calculated from all the CEFMs, at every observation point along the Pacific coast to obtain the PTHA. We incorporated aleatory uncertainties inherent in tsunami calculation and earthquake fault slip heterogeneity. We considered two kinds of probabilistic hazard models: one is a "present-time hazard model", under the assumption that earthquake occurrence basically follows a renewal process based on the BPT distribution when the latest faulting time is known; the other is a "long-time averaged hazard model", under the assumption that earthquake occurrence follows a stationary Poisson process. We fixed our viewpoint, for example, on the probability that the tsunami height will exceed 3 meters at coastal points in next
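    The two hazard-model assumptions mentioned above can be contrasted with a small sketch: a 30-year occurrence probability under a stationary Poisson model versus a BPT renewal model conditioned on the elapsed time since the last event. The recurrence interval, aperiodicity, and elapsed time below are assumed values, not those of the Sagami Trough sources.

```python
from math import exp
from scipy.stats import invgauss

T_mean = 200.0    # mean recurrence interval [yr] (assumed)
alpha = 0.24      # aperiodicity of the BPT distribution (assumed)
t_elapsed = 95.0  # time since the last event [yr] (assumed)
dt = 30.0         # forecast window [yr]

# Poisson ("long-time averaged") model: depends only on the mean rate.
p_poisson = 1.0 - exp(-dt / T_mean)

# BPT renewal ("present-time") model: the Brownian passage time distribution is an
# inverse Gaussian with mean T_mean and shape T_mean / alpha**2 (scipy parameterization).
bpt = invgauss(mu=alpha**2, scale=T_mean / alpha**2)
p_bpt = (bpt.cdf(t_elapsed + dt) - bpt.cdf(t_elapsed)) / (1.0 - bpt.cdf(t_elapsed))

print(f"30-yr probability, Poisson model: {p_poisson:.3f}")
print(f"30-yr probability, BPT renewal model (conditional): {p_bpt:.3f}")
```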

  16. Unexpected earthquake hazard revealed by Holocene rupture on the Kenchreai Fault (central Greece): Implications for weak sub-fault shear zones

    Science.gov (United States)

    Copley, Alex; Grützner, Christoph; Howell, Andy; Jackson, James; Penney, Camilla; Wimpenny, Sam

    2018-03-01

    High-resolution elevation models, palaeoseismic trenching, and Quaternary dating demonstrate that the Kenchreai Fault in the eastern Gulf of Corinth (Greece) has ruptured in the Holocene. Along with the adjacent Pisia and Heraion Faults (which ruptured in 1981), our results indicate the presence of closely-spaced and parallel normal faults that are simultaneously active, but at different rates. Such a configuration allows us to address one of the major questions in understanding the earthquake cycle, specifically what controls the distribution of interseismic strain accumulation? Our results imply that the interseismic loading and subsequent earthquakes on these faults are governed by weak shear zones in the underlying ductile crust. In addition, the identification of significant earthquake slip on a fault that does not dominate the late Quaternary geomorphology or vertical coastal motions in the region provides an important lesson in earthquake hazard assessment.

  17. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    Science.gov (United States)

    Nyst, M.

    2012-12-01

    The recent Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch, New Zealand earthquakes and the following aftershock activity completely changed the existing view of earthquake hazard in the Christchurch area. Not only have several faults been added to the New Zealand fault database, the main shocks were also followed by significant increases in seismicity due to high aftershock activity throughout the Christchurch region that is still ongoing. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set, the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to look at their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return-period exceedance benchmark, typically between the 100- and 250-year return period loss level, and then reserve the amount of money needed to account for that return-period loss level, their so-called capacity. This component of risk management is not too sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because the secondary uncertainties are taken into account when calculating the exceedance probability, even the longer return period losses can still experience significant impact from the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate for all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting the annual premiums. By annualizing the expected losses
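    A minimal sketch of the two insurance metrics described above, using an invented stochastic event set: AAL as the rate-weighted sum of event losses, and a loss-exceedance curve obtained by summing the rates of all events exceeding a loss threshold.

```python
events = [
    # (annual rate, loss in million NZD) -- hypothetical stochastic event set
    (1.0e-2, 50.0),
    (4.0e-3, 200.0),
    (1.0e-3, 900.0),
    (2.5e-4, 3000.0),
]

# Average annual loss: sum of (rate * loss) over all damaging events.
aal = sum(rate * loss for rate, loss in events)
print(f"AAL ~ {aal:.2f} M NZD/yr")

# Loss-exceedance curve: annual rate of exceeding each loss threshold.
for threshold in (100.0, 500.0, 2000.0):
    rate_exc = sum(rate for rate, loss in events if loss > threshold)
    print(f"annual rate of loss > {threshold:>6.0f} M NZD ~ {rate_exc:.2e} "
          f"(return period ~ {1/rate_exc:.0f} yr)")
```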

  18. Earthquake shaking hazard estimates and exposure changes in the conterminous United States

    Science.gov (United States)

    Jaiswal, Kishor S.; Petersen, Mark D.; Rukstales, Kenneth S.; Leith, William S.

    2015-01-01

    A large portion of the population of the United States lives in areas vulnerable to earthquake hazards. This investigation aims to quantify the population and infrastructure within the conterminous U.S. exposed to varying levels of earthquake ground motions by systematically analyzing the last four cycles of the U.S. Geological Survey's (USGS) National Seismic Hazard Models (published in 1996, 2002, 2008 and 2014). Using the 2013 LandScan data, we estimate the numbers of people who are exposed to potentially damaging ground motions (peak ground accelerations at or above 0.1g). At least 28 million (~9% of the total population) may experience 0.1g level of shaking at relatively frequent intervals (annual rate of 1 in 72 years or 50% probability of exceedance (PE) in 50 years), 57 million (~18% of the total population) may experience this level of shaking at moderately frequent intervals (annual rate of 1 in 475 years or 10% PE in 50 years), and 143 million (~46% of the total population) may experience such shaking at relatively infrequent intervals (annual rate of 1 in 2,475 years or 2% PE in 50 years). We also show that a significant number of critical infrastructure facilities are located in high earthquake-hazard areas (Modified Mercalli Intensity ≥ VII with moderately frequent recurrence interval).
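    The paired return periods and exceedance probabilities quoted above follow from the standard Poisson relation PE = 1 - exp(-t/T); a short check under that assumption:

```python
from math import exp

exposure_time = 50.0  # years
for t_return in (72.0, 475.0, 2475.0):
    pe = 1.0 - exp(-exposure_time / t_return)   # probability of exceedance in 50 years
    print(f"return period {t_return:>6.0f} yr  ->  {100 * pe:.0f}% PE in 50 yr")
# -> roughly 50%, 10%, and 2%, matching the values quoted in the abstract.
```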

  19. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    Science.gov (United States)

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses because of potential earthquakes and also for preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault Zone to the east of the study area. In this report, an earthquake scenario is intended to depict the potential consequences of significant earthquakes. A scenario earthquake is not necessarily the largest or most damaging earthquake possible on a recognized fault. Rather it is both large enough and likely enough that emergency planners should consider it in regional emergency response plans. In particular, the ground motion predicted for a given scenario earthquake does not represent a full probabilistic hazard assessment, and thus it does not provide the basis for hazard zoning and earthquake-resistant building design. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). Alternatives

  20. Original earthquake design basis in light of recent seismic hazard studies

    International Nuclear Information System (INIS)

    Petrovski, D.

    1993-01-01

    To outline the framework within which efforts have been made in the eastern countries to construct earthquake-resistant nuclear power plants, a review of the development and application of the seismic zoning map of the USSR is given. The normative values of seismic intensity and acceleration are discussed from the perspective of recent probabilistic seismic hazard studies. To that end, the methodology of probabilistic seismic hazard analysis is briefly presented in this paper. (author)

  1. Application of High Performance Computing to Earthquake Hazard and Disaster Estimation in Urban Area

    Directory of Open Access Journals (Sweden)

    Muneo Hori

    2018-02-01

    Integrated earthquake simulation (IES) is a seamless simulation for analyzing all processes of earthquake hazard and disaster. There are two difficulties in carrying out IES, namely, the requirement of large-scale computation and the requirement of numerous analysis models for structures in an urban area; they are addressed by taking advantage of high performance computing (HPC) and by developing a system of automated model construction. HPC is a key element in developing IES, as it is needed to analyze wave propagation and amplification processes in an underground structure; a high-fidelity model of the underground structure has more than 100 billion degrees of freedom. Examples of IES for the Tokyo Metropolis are presented; the numerical computation is made using the K computer, the supercomputer of Japan. The estimation of earthquake hazard and disaster for a given earthquake scenario is made by the ground motion simulation and the urban area seismic response simulation, respectively, for a target area of 10,000 m × 10,000 m.

  2. Recent research in earth structure, earthquake and mine seismology, and seismic hazard evaluation in South Africa

    CSIR Research Space (South Africa)

    Wright, C

    2003-07-01

    … of earthquakes, earthquake hazard and earth structure in South Africa was prepared for the centennial handbook of the International Association of Seismology and the Physics of the Earth's Interior (IASPEI). References to theses completed in the last four …

  3. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    Science.gov (United States)

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 per cent (1 s.d.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock from less than 1 per cent at M ≥ 3 to 6.5 ± 2.5 per cent (1 s.d.) at M ≥ 5. The main shock will most likely occur in the first hour after the foreshock, and the probability that a main shock will occur in the first hour decreases with elapsed time from the occurrence of the possible foreshock by approximately the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California increases the earthquake hazard within a small space-time window several orders of magnitude above the normal background level.

  4. A procedure for the determination of scenario earthquakes for seismic design based on probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Hirose, Jiro; Muramatsu, Ken

    2002-03-01

    This report presents a study on the procedures for the determination of scenario earthquakes for seismic design of nuclear power plants (NPPs) based on probabilistic seismic hazard analysis (PSHA). In recent years, the use of PSHA, which is a part of seismic probabilistic safety assessment (PSA), to determine the design basis earthquake motions for NPPs has been proposed. The identified earthquakes are called probability-based scenario earthquakes (PBSEs). The concept of PBSEs originates both from a study by the US NRC and from Ishikawa and Kameda. The assessment of PBSEs is composed of seismic hazard analysis and identification of dominant earthquakes. The objectives of this study are to formulate the concept of PBSEs and to examine the procedures for determining PBSEs for a domestic NPP site. This report consists of three parts, namely, procedures to compile analytical conditions for PBSEs, an assessment to identify PBSEs for a model site using Ishikawa's concept, and an examination of the uncertainties involved in the analytical conditions. The results obtained from the examination of PBSEs using Ishikawa's concept are as follows. (a) Since PBSEs are expressed by hazard-consistent magnitude and distance in terms of a prescribed reference probability, it is easy to obtain a concrete image of the earthquakes that determine the ground response spectrum to be considered in the design of NPPs. (b) Source contribution factors provide information on the importance of the earthquake source regions and/or active faults, and allow the selection of a couple of PBSEs based on their importance to the site. (c) Since analytical conditions involve uncertainty, sensitivity analyses on uncertainties that would affect seismic hazard curves and the identification of PBSEs were performed on various aspects and provided useful insights for the assessment of PBSEs. A result from this sensitivity analysis was that, although the difference in selection of attenuation equations led to a
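    As a sketch of item (a), hazard-consistent magnitude and distance can be read off a disaggregation matrix at the reference probability; the bin centers and contributions below are placeholders, not values from the model site.

```python
import numpy as np

magnitudes = np.array([5.5, 6.0, 6.5, 7.0, 7.5])   # magnitude bin centers
distances = np.array([10.0, 30.0, 60.0, 100.0])    # distance bin centers [km]

# contribution[i, j]: fractional contribution of bin (M_i, R_j) to the hazard at the
# reference exceedance probability (hypothetical values).
contribution = np.array([
    [0.02, 0.04, 0.03, 0.01],
    [0.05, 0.08, 0.05, 0.02],
    [0.08, 0.12, 0.07, 0.03],
    [0.06, 0.10, 0.06, 0.02],
    [0.04, 0.07, 0.04, 0.01],
])
contribution /= contribution.sum()   # normalize contributions to 1

# Hazard-consistent (mean) magnitude and distance
m_bar = (contribution.sum(axis=1) * magnitudes).sum()
r_bar = (contribution.sum(axis=0) * distances).sum()

# Modal (single most-contributing) bin
i, j = np.unravel_index(np.argmax(contribution), contribution.shape)
print(f"mean scenario:  M {m_bar:.1f} at {r_bar:.0f} km")
print(f"modal scenario: M {magnitudes[i]:.1f} at {distances[j]:.0f} km")
```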

  5. Tectonic styles of future earthquakes in Italy as input data for seismic hazard

    Science.gov (United States)

    Pondrelli, S.; Meletti, C.; Rovida, A.; Visini, F.; D'Amico, V.; Pace, B.

    2017-12-01

    In a recent elaboration of a new seismogenic zonation and hazard model for Italy, we tried to understand how many indications we have on the tectonic style of future earthquakes/ruptures. Using all available or recomputed seismic moment tensors for relevant seismic events (Mw starting from 4.5) of the last 100 yrs, first-arrival focal mechanisms for less recent earthquakes, and also geological data on past activated faults, we assembled a database of thousands of data points covering the Italian peninsula and the regions around it. After several summations of seismic moment tensors, over regular grids of different dimensions and different thicknesses of the seismogenic layer, we applied the same procedure to each of the 50 area sources that were designed in the seismogenic zonation. The results for several seismic zones are very stable; e.g., along the southern Apennines we expect future earthquakes to be mostly extensional, although in the outer part of the chain strike-slip events are possible. In the northern part of the Apennines we also expect different, opposite tectonic styles for different hypocentral depths. In several zones, characterized by a low seismic moment release, defined for the study region using 1000 yrs of catalog, the most likely tectonic style of future earthquakes is less clear. It is worth noting that for some zones the possible greatest earthquake may not be represented in the available observations. We also added to our analysis the computation of the seismic release rate, computed using a distributed completeness identified for single great events of the historical seismic catalog for Italy. All these information layers, overlapped and compared, may be used to characterize each new seismogenic zone.

  6. Echo-sounding method aids earthquake hazard studies

    Science.gov (United States)

    ,

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasizes the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  7. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    Science.gov (United States)

    Bostenaru Dan, M.

    2009-04-01

    This special issue includes selected papers on the topic of earthquake impact from the sessions held in 2004 in Nice, France, and in 2005 in Vienna, Austria, at the first and second European Geosciences Union General Assemblies, respectively. Since its start in 1999 in The Hague, Netherlands, the hazard of earthquakes has been the most popular topic of the session. The call in 2004 read: Nature's forces including earthquakes, floods, landslides, high winds and volcanic eruptions can inflict losses on urban settlements and man-made structures such as infrastructure. In Europe, recent years have seen significant losses from earthquakes in south and south-eastern Europe, floods in central Europe, and wind storms in western Europe. Meanwhile, significant progress has been made in understanding disasters. Several scientific fields contribute to a holistic approach in the evaluation of capacities, vulnerabilities and hazards, the main factors in mitigating urban disasters due to natural hazards. An important part of the session is devoted to assessment of earthquake shaking and loss scenarios, including both physical damage and human casualties. Early warning and rapid damage evaluation are of utmost importance for addressing the safety of many essential facilities, for emergency management of events and for disaster response. In case of earthquake occurrence, strong motion networks, data processing and interpretation lead to preliminary estimations (scenarios) of the geographical distribution of damage. Factual information on inflicted damage, like that obtained from shaking maps or aerial imagery, permits a confrontation with simulated damage maps in order to define a more accurate picture of the overall losses. Most recent developments towards quantitative and qualitative simulation of natural hazard impacts on urban areas, which provide decision-making support for urban disaster management, and success stories of and lessons learned from disaster

  8. Earthquake risk reduction in the United States: An assessment of selected user needs and recommendations for the National Earthquake Hazards Reduction Program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This Assessment was conducted to improve the National Earthquake Hazards Reduction Program (NEHRP) by providing NEHRP agencies with information that supports their user-oriented setting of crosscutting priorities in the NEHRP strategic planning process. The primary objective of this Assessment was to take a "snapshot" evaluation of the needs of selected users throughout the major program elements of NEHRP. Secondary objectives were to conduct an assessment of the knowledge that exists (or is being developed by NEHRP) to support earthquake risk reduction, and to begin a process of evaluating how NEHRP is meeting user needs. An identification of NEHRP's strengths also resulted from the effort, since those strengths demonstrate successful methods that may be useful to NEHRP in the future. These strengths are identified in the text, and many of them represent important achievements since the Earthquake Hazards Reduction Act was passed in 1977.

  9. Dynamic evaluation of seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    Science.gov (United States)

    Kossobokov, V. G.; Nekrasova, A.

    2016-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A + B•(6 - M) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L, A characterizes the average annual rate of strong (M = 6) earthquakes, B determines the balance between magnitude ranges, and C estimates the fractal dimension of the seismic locus in projection onto the Earth's surface. The parameters A, B, and C of the USLE are used to assess, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground shaking parameters. After rigorous testing against the available seismic evidence from the past (e.g., the historically reported macro-seismic intensity or paleo data), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure. The hazard maps for a given territory change dramatically when the methodology is applied to a moving time window of a certain size, e.g. about a decade long for an intermediate-term regional assessment or exponentially increasing intervals for daily local strong aftershock forecasting. The dynamical assessment of seismic hazard and risks is illustrated by applications to the territory of the Greater Caucasus and Crimea and the two-year series of aftershocks of the 11 October 2008 Kurchaloy, Chechnya earthquake, whose case history appears to be encouraging for further systematic testing as a potential short-term forecasting tool.
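    A direct evaluation of the USLE formula quoted above, with placeholder values for A, B, and C (not the parameters estimated for any of the regions studied):

```python
import math

def usle_annual_number(m, length_km, a=-3.0, b=0.9, c=1.2):
    """Expected annual number of earthquakes of magnitude m within an area of linear
    dimension length_km, following the abstract's formula:
    log10 N(M, L) = A + B*(6 - M) + C*log10 L. A, B, C are illustrative values."""
    return 10 ** (a + b * (6.0 - m) + c * math.log10(length_km))

for m in (4.0, 5.0, 6.0):
    print(f"M {m:.1f}, L = 100 km: N ~ {usle_annual_number(m, 100.0):.2f} events/yr")
```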

  10. Assessing the Utility of and Improving USGS Earthquake Hazards Program Products

    Science.gov (United States)

    Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.

    2010-12-01

    A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products and information, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities as well as participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and information is widely used by media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey more clearly the impact of earthquakes, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.

  11. Disaggregated seismic hazard and the elastic input energy spectrum: An approach to design earthquake selection

    Science.gov (United States)

    Chapman, Martin Colby

    1998-12-01

    The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions for the analysis may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity (V_ea) and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon (NEHRP) site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression

  12. Semi-automated landform classification for hazard mapping of soil liquefaction by earthquake

    Science.gov (United States)

    Nakano, Takayuki

    2018-05-01

    Soil liquefaction damage was caused by the huge earthquake in Japan, and similar damage is a concern for future huge earthquakes. On the other hand, preparation of soil liquefaction risk maps (soil liquefaction hazard maps) is impeded by the difficulty of evaluating soil liquefaction risk. Generally, relative soil liquefaction risk can be evaluated from landform classification data by using empirical rules based on the relationship between the extent of soil liquefaction damage and landform classification items associated with past earthquakes. Therefore, I rearranged the relationship between landform classification items and soil liquefaction risk in an intelligible way, in order to enable the evaluation of soil liquefaction risk from landform classification data appropriately and efficiently. I also developed a new method of generating landform classification data of 50-m grid size from existing landform classification data of 250-m grid size by using digital elevation model (DEM) data and multi-band satellite image data, in order to evaluate soil liquefaction risk in greater spatial detail. It is expected that the products of this study will contribute to efficient production of soil liquefaction hazard maps by local governments.
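    A highly simplified, hypothetical illustration of the downscaling idea (not the author's algorithm): subdivide each 250-m cell into 5 x 5 blocks of 50-m cells and reassign cells whose DEM-derived slope is inconsistent with the parent landform class. The class codes, slope threshold, and reassignment rule are all assumptions.

```python
import numpy as np

def downscale(coarse_class, dem_slope_50m, steep_threshold_deg=5.0):
    """coarse_class: (H, W) array of landform class codes on the 250-m grid.
    dem_slope_50m: (5H, 5W) array of slope in degrees on the 50-m grid."""
    # Replicate each coarse cell into a 5 x 5 block of fine cells.
    fine = np.kron(coarse_class, np.ones((5, 5), dtype=coarse_class.dtype))
    FLOODPLAIN, TERRACE = 1, 2          # hypothetical class codes
    # Steep 50-m cells inside a "floodplain" cell are reassigned to "terrace".
    reassign = (fine == FLOODPLAIN) & (dem_slope_50m > steep_threshold_deg)
    fine[reassign] = TERRACE
    return fine

coarse = np.array([[1, 2], [1, 1]])                        # toy 2 x 2 coarse grid
slope = np.random.default_rng(0).uniform(0, 10, (10, 10))  # toy 50-m slope grid
print(downscale(coarse, slope))
```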

  13. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    Science.gov (United States)

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near-surface shear-wave velocity model in a 1D equivalent-linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm-rock-site condition, the new probabilistic seismic-hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess-covered till and drift deposits) show up to twice the ground-motion values for peak ground acceleration (PGA), and similar ground-motion values for 1.0 s spectral acceleration (SA). Probabilistic ground-motion levels for lowland alluvial floodplain sites (generally the 20–40-m-thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground-motion levels for PGA, and up to three times the ground-motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%-in-50-year probabilistic ground-shaking model. The liquefaction hazard ranges from low (less than 1% of area expected to liquefy) in the uplands to high (more than 60% of area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated

  14. Seismological and geological investigation for earthquake hazard in the Greater Accra Metropolitan Area

    International Nuclear Information System (INIS)

    Doku, M. S.

    2013-07-01

    A seismological and geological investigation of earthquake hazard in the Greater Accra Metropolitan Area (GAMA) was undertaken. The research aimed to employ a mathematical model to estimate the seismic stress for the study area by generating a complete, unified and harmonized earthquake catalogue spanning 1615 to 2012. Seismic events were sourced from Leydecker, G. and P. Amponsah (1986), Ambraseys and Adams (1986), Amponsah (2008), the Geological Survey Department, Accra, Ghana, Amponsah (2002), the National Earthquake Information Service, United States Geological Survey, Denver, Colorado 80225, USA, the International Seismological Centre and the National Data Centre of the Ghana Atomic Energy Commission. Events occurring in the study area were used to create an epicentral intensity map and a seismicity map of the study area after interpolation of missing seismic magnitudes. The least squares method and the maximum likelihood estimation method were employed to evaluate b-values of 0.6 and 0.9, respectively, for the study area. A thematic map of epicentral intensity superimposed on the geology of the study area was also developed to help understand the relationship between the fractured, jointed and sheared geology and the seismic events. The results indicate that the stress level of GAMA has a telling effect on its seismicity and that events are prevalent at fractured, jointed and sheared zones. (au)
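    The two b-value estimators mentioned above can be sketched on a synthetic catalogue: a least-squares fit to the cumulative Gutenberg-Richter counts and the Aki maximum-likelihood estimate. The completeness magnitude and synthetic magnitudes below are placeholders, not the GAMA catalogue.

```python
import numpy as np

rng = np.random.default_rng(42)
mc = 3.0   # completeness magnitude (assumed)
# Synthetic magnitudes drawn from an exponential distribution with true b = 1.0.
mags = mc + rng.exponential(scale=1.0 / (1.0 * np.log(10)), size=500)

# Least squares: fit log10(N >= M) versus M over well-sampled threshold magnitudes.
thresholds = np.arange(mc, mags.max(), 0.1)
counts = np.array([np.sum(mags >= m) for m in thresholds])
valid = counts >= 5                                   # skip poorly sampled bins
slope, intercept = np.polyfit(thresholds[valid], np.log10(counts[valid]), 1)
b_ls = -slope

# Maximum likelihood (Aki 1965): b = log10(e) / (mean(M) - Mc).
b_mle = np.log10(np.e) / (mags.mean() - mc)

print(f"b (least squares) ~ {b_ls:.2f}")
print(f"b (maximum likelihood) ~ {b_mle:.2f}")
```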

  15. Coulomb Failure Stress Accumulation in Nepal After the 2015 Mw 7.8 Gorkha Earthquake: Testing Earthquake Triggering Hypothesis and Evaluating Seismic Hazards

    Science.gov (United States)

    Xiong, N.; Niu, F.

    2017-12-01

    A Mw 7.8 earthquake struck Gorkha, Nepal, on April 25, 2015, resulting in more than 8000 deaths and 3.5 million homeless. The earthquake initiated 70 km west of Kathmandu and propagated eastward, rupturing an area of approximately 150 km by 60 km in size. However, the earthquake failed to fully rupture the locked fault beneath the Himalaya, suggesting that the regions south of Kathmandu and west of the current rupture are still locked and a much more powerful earthquake might occur in the future. Therefore, the seismic hazard of the unruptured region is of great concern. In this study, we investigated the Coulomb failure stress (CFS) accumulation on the unruptured fault transferred by the Gorkha earthquake and some nearby historical great earthquakes. First, we calculated the co-seismic CFS changes of the Gorkha earthquake on the nodal planes of 16 large aftershocks to quantitatively examine whether they were brought closer to failure by the mainshock. It is shown that at least 12 of the 16 aftershocks were encouraged by an increase of CFS of 0.1-3 MPa. The correspondence between the distribution of off-fault aftershocks and the increased CFS pattern also validates the applicability of the earthquake triggering hypothesis in the thrust regime of Nepal. With this validation, we calculated the co-seismic CFS change on the locked region imparted by the Gorkha earthquake and historical great earthquakes. A newly proposed ramp-flat-ramp-flat fault geometry model was employed, and the source parameters of historical earthquakes were computed with the empirical scaling relationship. A broad region south of Kathmandu and west of the current rupture was shown to be positively stressed, with CFS change roughly ranging between 0.01 and 0.5 MPa. The maximum CFS increase (>1 MPa) was found in the updip segment south of the current rupture, implying a high seismic hazard. Since the locked region may be additionally stressed by the post-seismic relaxation of the lower
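    The Coulomb failure stress change used in this record is commonly written as dCFS = d_tau + mu' * d_sigma_n; a minimal sketch with invented receiver-fault stress changes (the Nepal values are not reproduced here):

```python
def coulomb_stress_change(d_tau_mpa, d_sigma_n_mpa, mu_eff=0.4):
    """dCFS = shear stress change in the slip direction of the receiver fault plus
    effective friction times the normal stress change (tension/unclamping positive)."""
    return d_tau_mpa + mu_eff * d_sigma_n_mpa

receivers = [
    # (label, shear stress change [MPa], normal stress change [MPa]) -- hypothetical
    ("updip segment south of rupture", 0.8, 0.5),
    ("western unruptured segment", 0.05, 0.1),
    ("region within the rupture", -1.5, -0.3),
]
for label, d_tau, d_sn in receivers:
    dcfs = coulomb_stress_change(d_tau, d_sn)
    state = "brought closer to failure" if dcfs > 0 else "relaxed"
    print(f"{label}: dCFS = {dcfs:+.2f} MPa ({state})")
```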

  16. Earthquake induced liquefaction hazard, probability and risk assessment in the city of Kolkata, India: its historical perspective and deterministic scenario

    Science.gov (United States)

    Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.

    2018-01-01

    Liquefaction-induced ground failure is one among the leading causes of infrastructure damage due to the impact of large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Brahmaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has a moderately shallow groundwater table, especially in the post-monsoon seasons. In view of the burgeoning population, there has been unplanned expansion of settlements under hazardous geological, geomorphological, and hydrological conditions, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, reportedly induced Modified Mercalli Intensities of IV-V and VI-VII, respectively, in the city, triggering widespread to sporadic liquefaction with surface manifestations of sand boils, lateral spreading, ground subsidence, etc., thus posing a strong case for liquefaction potential analysis in the terrain. With the motivation of assessing the seismic hazard, vulnerability, and risk of the city of Kolkata through concerted federal funding stipulated for all the metros and upstart urban centers in India located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility condition of Kolkata under earthquake loading employing modern multivariate techniques, and also to predict a deterministic liquefaction scenario for the city in the event of a probabilistic seismic hazard condition with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km2 area. The stochastically

  17. Earthquake and volcano hazard notices: An economic evaluation of changes in risk perceptions

    Science.gov (United States)

    Bernknopf, R.L.; Brookshire, D.S.; Thayer, M.A.

    1990-01-01

    Earthquake and volcano hazard notices were issued for the Mammoth Lakes, California, area by the U.S. Geological Survey under the authority granted by the Disaster Relief Act of 1974. The effects on investment, recreation visitation, and risk perceptions are explored. The hazard notices did not affect recreation visitation, although investment was affected. A perceived loss in the market value of homes was documented. Risk perceptions were altered for property owners. Communication of the probability of an event over time would enhance hazard notices as a policy instrument and would mitigate unnecessary market perturbations. © 1990.

  18. Pakistan’s Earthquake and Tsunami Hazards Potential Impact on Infrastructure

    Directory of Open Access Journals (Sweden)

    GEORGE PARARAS-CARAYANNIS

    2011-06-01

    Interaction of the Indian, Arabian and Eurasian tectonic plates has resulted in the formation of major active fault systems in South Asia. Compression along the tectonic boundaries results in thrust or reverse faulting and zones of crustal deformation characterized by high seismic activity and continuing orogenesis. The more intense seismic activity occurs near regions of thrust faulting developing at the Himalayan foothills. In northern Pakistan, the Hindu Kush Mountains converge with the Karakoram Range to form a part of the Himalayan mountain system. Northern, western and southern Pakistan, Kashmir, northern India and Afghanistan lie along such zones of high seismic activity. In Pakistan, most of the earthquakes occur in the northern and western regions along the boundary of the Indian tectonic plate with the Iranian and Afghan micro-plates. The active zone extends from the Makran region in the southwest to the Hazara-Kashmir syntaxial bend in the north. Southwest Pakistan is vulnerable to both earthquake and tsunami hazards. In 2005, earthquakes devastated northern Pakistan and Kashmir and severely affected the cities of Muzaffarabad, Islamabad and Rawalpindi, causing severe destruction to the infrastructure of the northern region. A major earthquake along an extensive transform fault system in 1935 destroyed the city of Quetta and the adjoining region. A major earthquake along the northern Arabian Sea in 1945 generated a very destructive tsunami along the coasts of the Baluchistan and Sindh Provinces. The region near Karachi is vulnerable as it is located near four major faults where destructive earthquakes and tsunamis have occurred in the past. Given Pakistan’s vulnerability and extensive infrastructure development in recent years, the present study briefly reviews the earthquake and tsunami risk factors and assesses the impact that such disasters can have on the country’s critical infrastructure - which includes

  19. Estimate of airborne release of plutonium from Babcock and Wilcox plant as a result of severe wind hazard and earthquake

    International Nuclear Information System (INIS)

    Mishima, J.; Schwendiman, L.C.; Ayer, J.E.

    1978-10-01

    As part of an interdisciplinary study to evaluate the potential radiological consequences of wind hazard and earthquake upon existing commercial mixed oxide fuel fabrication plants, the potential mass airborne releases of plutonium (source terms) from such events are estimated. The estimated source terms are based upon the fraction of enclosures damaged to three levels of severity (crush, puncture/penetrate, and loss of external filter, in order of decreasing severity), called the damage ratio, and the airborne release if all enclosures suffered that level of damage. The discussion of damage scenarios and source terms is divided into wind hazard and earthquake scenarios in order of increasing severity. The largest airborne releases from the building were for cases involving the catastrophic collapse of the roof over the major production areas--wind hazard at 110 mph and earthquakes with peak ground accelerations of 0.20 to 0.29 g. Wind hazards at higher air velocities and earthquakes with higher ground acceleration do not result in significantly greater source terms. The source terms were calculated as additional mass of respirable particles released with time up to 4 days; under these assumptions, approximately 98% of the mass of material of concern is made airborne from 2 h to 4 days after the event. The overall building source terms from the damage scenarios evaluated are shown in a table. The contribution of individual areas to the overall building source term is presented in order of increasing severity for wind hazard and earthquake

  20. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

    Earthquakes are one of the most destructive natural hazards on our planet Earth. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw 9.0) and the 26 December 2004 Sumatra (Mw 9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and economical structures in earthquake-prone regions.

  1. Hazard-consistent response spectra in the Region of Murcia (Southeast Spain): comparison to earthquake-resistant provisions

    OpenAIRE

    Gaspar Escribano, Jorge M.; Benito Oterino, Belen; Garcia Mayordomo, Julian

    2008-01-01

    Hazard-consistent ground-motion characterisations of three representative sites located in the Region of Murcia (southeast Spain) are presented. This is the area where the last three damaging events in Spain occurred and there is a significant amount of data for comparing them with seismic hazard estimates and earthquake-resistant provisions. Results of a probabilistic seismic hazard analysis are used to derive uniform hazard spectra (UHS) for the 475-year return period, on rock and soil cond...

  2. Tsunami Hazard Assessment of Coastal South Africa Based on Mega-Earthquakes of Remote Subduction Zones

    Science.gov (United States)

    Kijko, Andrzej; Smit, Ansie; Papadopoulos, Gerassimos A.; Novikova, Tatyana

    2017-11-01

    After the mega-earthquakes and concomitant devastating tsunamis in Sumatra (2004) and Japan (2011), we launched an investigation into the potential risk of tsunami hazard to the coastal cities of South Africa. This paper presents the analysis of the seismic hazard of seismogenic sources that could potentially generate tsunamis, as well as the analysis of the tsunami hazard to coastal areas of South Africa. The subduction zones of Makran, South Sandwich Island, Sumatra, and the Andaman Islands were identified as possible sources of mega-earthquakes and tsunamis that could affect the African coast. Numerical tsunami simulations were used to investigate the realistic and worst-case scenarios that could be generated by these subduction zones. The simulated tsunami amplitudes and run-up heights calculated for the coastal cities of Cape Town, Durban, and Port Elizabeth are relatively small and therefore pose no real risk to the South African coast. However, only distant tsunamigenic sources were considered and the results should therefore be viewed as preliminary.

  4. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    Science.gov (United States)

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  5. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    International Nuclear Information System (INIS)

    Wong, I.G.; Green, R.K.; Sun, J.I.; Pezzopane, S.K.; Abrahamson, N.A.; Quittmeyer, R.C.

    1996-01-01

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (Mw) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to Mw 6 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to Mw 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper
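
    As a side note for readers, the quoted return periods map to exceedance probabilities over a design exposure time through the usual Poisson assumption; this is a standard relation, not part of the study itself. A short Python illustration using the PGA values quoted above and an arbitrary 50-year exposure:

      import math

      # Return periods quoted in the abstract, converted to exceedance probabilities
      # over a 50-year exposure time under a Poisson occurrence assumption.
      pga_by_return_period = {500: 0.16, 1000: 0.21, 2000: 0.28, 10000: 0.50}   # PGA in g
      exposure_years = 50.0

      for t_r, pga in pga_by_return_period.items():
          p_exceed = 1.0 - math.exp(-exposure_years / t_r)
          print(f"PGA {pga:.2f} g (return period {t_r:5d} yr): "
                f"P(exceedance in {exposure_years:.0f} yr) = {p_exceed:.3f}")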

  6. The revaluation of the macroseismic effects of March 4, 1977 earthquake in the frame of the new seismic hazard assessment methodologies

    International Nuclear Information System (INIS)

    Pantea, A.; Constantin, Angela; Anghel, M.

    2002-01-01

    To increase the earthquake resistance of structures, design and construction norms require the best possible knowledge of the seismic hazard parameters and the use of new methodologies of seismic hazard assessment. One of these parameters is the seismic intensity of the earthquakes that occurred over the whole analyzed territory during as long a time interval as data are available for, especially for the strongest of them. For the Romanian territory, the strongest earthquake, and the best known from the point of view of its macroseismic effects, is that of March 4, 1977. Seismology by itself, without geophysics (solid earth physics), geology, geography, and geodesy, cannot fully, comprehensively and validly assess seismic hazards. Among those who have understood seismic hazard assessment as the result of cooperation between the geosciences as a whole and seismology, one may quote Bune, 1978; Pantea et al., 2002, etc. Assessing seismic hazards is a complex undertaking, for it draws on a vast amount of knowledge in numerous sectors of the geosciences, particularly solid earth physics as a branch of geophysics that also includes seismology, tectonic physics, gravimetry, geomagnetism, geochronology, etc. It involves processing the results of complex geophysical, seismological, tectonic, and geologic studies. To get a picture of, and understand, the laws that govern seismogenesis, one has to know the relations among the measured physical quantities indicating the properties of the rocks (whether gravimetric, magnetometric, electrometric, seismometric, or others), the dynamics of tectonic structures, and their nature and geological characteristics. Geophysics can be relied upon to determine the deep internal structure of the earth that geological methods are unable to reveal. Geophysics, and implicitly seismology, can help resolve the problem by: 1. identifying the areas of the seismic sources and their characteristics, including focal depth, Mmax [Bune, 1978], and the recurrence chart

  7. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable, homogeneous earthquake catalogue covering the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three distributions in this region.
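
    A hedged sketch of this kind of distribution comparison, using scipy on a synthetic set of inter-event times rather than the authors' catalogue or their Easyfit/Matlab workflow:

      import numpy as np
      from scipy import stats

      # Fit candidate distributions to (synthetic) inter-event times and rank them
      # with the Kolmogorov-Smirnov statistic. Illustrative only.
      rng = np.random.default_rng(0)
      inter_event_times = 5.0 * rng.weibull(1.4, 60)          # years, synthetic sample

      candidates = {
          "Weibull (2-parameter)": stats.weibull_min,          # location fixed at zero
          "Frechet": stats.invweibull,                         # inverse Weibull = Frechet
          "Weibull (3-parameter)": stats.weibull_min,          # location left free
      }

      for name, dist in candidates.items():
          if name == "Weibull (2-parameter)":
              params = dist.fit(inter_event_times, floc=0)
          else:
              params = dist.fit(inter_event_times)
          d_stat, p_value = stats.kstest(inter_event_times, dist.cdf, args=params)
          print(f"{name:22s}  K-S D = {d_stat:.3f}  p = {p_value:.3f}")

    The distribution with the smallest K-S statistic (largest p-value) would be preferred, which mirrors the kind of ranking reported in the abstract.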

  8. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    Science.gov (United States)

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  9. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    Science.gov (United States)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  10. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    Science.gov (United States)

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazards warning systems.
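
    The TEC observable referred to above is commonly formed from dual-frequency GPS measurements with a standard relation. A hedged sketch follows; differential code biases, carrier-phase leveling, and the full GAIM assimilation are deliberately omitted:

      # Slant TEC from dual-frequency GPS pseudoranges (standard textbook relation).
      F1, F2 = 1575.42e6, 1227.60e6          # GPS L1 and L2 carrier frequencies (Hz)
      K = 40.3                               # ionospheric refraction constant

      def slant_tec_tecu(p1_m, p2_m):
          """Slant TEC in TEC units (1 TECU = 1e16 el/m^2) from L1/L2 pseudoranges in metres."""
          stec = (F1**2 * F2**2) / (K * (F1**2 - F2**2)) * (p2_m - p1_m)
          return stec / 1e16

      # A 4 m L2-minus-L1 range difference corresponds to roughly 38 TECU.
      print(f"{slant_tec_tecu(22_000_000.0, 22_000_004.0):.1f} TECU")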

  11. ThinkHazard!: an open-source, global tool for understanding hazard information

    Science.gov (United States)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone

    2016-04-01

    Rapid and simple access to added-value natural hazard and disaster risk information is a key issue for various stakeholders of the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims at providing users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool, and discuss limitations from a scientific as well as an operational perspective.

  12. Protection of the human race against natural hazards (asteroids, comets, volcanoes, earthquakes)

    Science.gov (United States)

    Smith, Joseph V.

    1985-10-01

    Although we justifiably worry about the danger of nuclear war to civilization, and perhaps even to survival of the human race, we tend to consider natural hazards (e.g., comets, asteroids, volcanoes, earthquakes) as unavoidable acts of God. In any human lifetime, a truly catastrophic natural event is very unlikely, but ultimately one will occur. For the first time in human history we have sufficient technical skills to begin protection of Earth from some natural hazards. We could decide collectively throughout the world to reassign resources: in particular, reduction of nuclear and conventional weapons to a less dangerous level would allow concomitant increase of international programs for detection and prevention of natural hazards. Worldwide cooperation to mitigate natural hazards might help psychologically to lead us away from the divisive bickering that triggers wars. Future generations could hail us as pioneers of peace and safety rather than curse us as agents of death and destruction.

  13. Scientists Engage South Carolina Community in Earthquake Education and Preparedness

    Science.gov (United States)

    Hall, C.; Beutel, E.; Jaume', S.; Levine, N.; Doyle, B.

    2008-12-01

    Scientists at the College of Charleston are working with the state of South Carolina's Emergency Management Division to increase awareness and understanding of earthquake hazards throughout South Carolina. As part of this mission, the SCEEP (South Carolina Earthquake Education and Preparedness) program was formed at the College of Charleston to promote earthquake research, outreach, and education in the state of South Carolina. Working with local, regional, state and federal offices, SCEEP has developed education programs for everyone from professional hazard management teams to formal and informal educators. SCEEP also works with the media to ensure accurate reporting of earthquake and other hazard information and to increase the public's understanding of earthquake science and earthquake seismology. As part of this program, we have developed a series of activities that can be checked out by educators for use in their classrooms and in informal education venues. These activities are designed to provide educators with the information and tools they need to adequately, informatively, and enjoyably teach about earthquakes and earth science. The toolkits contain seven activities meeting a variety of National Education Standards, not only in Science, but also in Geography, Math, Social Studies, Arts Education, History and Language Arts - providing a truly multidisciplinary toolkit for educators. The activities provide information on earthquake myths, seismic waves, elastic rebound, vectors, liquefaction, location of an epicenter, and finally South Carolina earthquakes. The activities are engaging and inquiry based, implementing strategies proven effective for piquing learners' interest in scientific phenomena. All materials are provided within the toolkit, so it is truly check and go. While the SCEEP team has provided instructions and grade-level suggestions for implementing each activity in an educational setting, the educator has full rein on what to showcase

  14. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    Energy Technology Data Exchange (ETDEWEB)

    Wong, I.G.; Green, R.K.; Sun, J.I. [Woodward-Clyde Federal Services, Oakland, CA (United States); Pezzopane, S.K. [Geological Survey, Denver, CO (United States); Abrahamson, N.A. [Abrahamson (Norm A.), Piedmont, CA (United States); Quittmeyer, R.C. [Woodward-Clyde Federal Services, Las Vegas, NV (United States)

    1996-12-31

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (M{sub w}) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to M{sub w} 6 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to M{sub w} 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.

  15. Probing The Structure North China To Better Understand Its Evolution, Natural Resources, And Seismic Hazards (Invited)

    Science.gov (United States)

    Keller, G. R.; Gao, R.; Qu, G.; Li, Q.; Liu, M.

    2010-12-01

    also recorded across the southern portion of this array. This profile crossed a region where the 3 main faults that pose the major hazard to the city are expressed at the surface. Some shots along this profile were also recorded by the 3-D array, and an earthquake occurred along the edge of the array during one of the recording windows. Together, these data are producing an improved understanding of the structure of this area and will aid hazard assessments. These efforts are also being used as a basis to conduct comparative studies to better understand seismic hazards in the central U.S. and the tectonic evolution of both regions.

  16. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  17. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau

    Science.gov (United States)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail

    2018-02-01

    Sabah state in eastern Malaysia, unlike most other Malaysian states, is characterized by frequent seismological activity; an earthquake of moderate magnitude is generally experienced roughly every 20 years, originating mainly from two types of source, either local (e.g. Ranau and Lahad Datu) or regional (e.g. the Kalimantan and South Philippines subductions). The seismicity map of Sabah shows the presence of two zones of distinctive seismicity; these zones are near Ranau (near Kota Kinabalu) and Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Centre), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area. Hence, the seismicity of the area is modeled as line sources referring to these faults. Two main fault systems are believed to be the source of such activity, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become an important and necessary study for the extensive development projects in Sabah, especially given the presence of earthquake activity. A probabilistic seismic hazard assessment is adopted for the present work since it can provide the probabilities of various ground-motion levels expected from future large earthquakes. The output results are presented in terms of spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first time that a complete hazard study has been done for the area, the output will serve as a baseline and standard for any future strategic plans in the area.

  18. Approaches that use seismic hazard results to address topics of nuclear power plant seismic safety, with application to the Charleston earthquake issue

    International Nuclear Information System (INIS)

    Sewell, R.T.; McGuire, R.K.; Toro, G.R.; Stepp, J.C.; Cornell, C.A.

    1990-01-01

    Plant seismic safety indicators include seismic hazard at the SSE (safe shut-down earthquake) acceleration, seismic margin, reliability against core damage, and reliability against offsite consequences. This work examines the key role of hazard analysis in evaluating these indicators and in making rational decisions regarding plant safety. The paper outlines approaches that use seismic hazard results as a basis for plant seismic safety evaluation and applies one of these approaches to the Charleston earthquake issue. This approach compares seismic hazard results that account for the Charleston tectonic interpretation, using the EPRI-Seismicity Owners Group (SOG) methodology, with hazard results that are consistent with historical tectonic interpretations accepted in regulation. Based on hazard results for a set of 21 eastern U.S. nuclear power plant sites, the comparison shows that no systematic 'plant-to-plant' increase in hazard accompanies the Charleston hypothesis; differences in mean hazards for the two interpretations are generally insignificant relative to current uncertainties in seismic hazard. (orig.)

  19. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    Science.gov (United States)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently on-going initiatives like the creation of a suite of tools for the creation of PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.

  20. Fragility analysis of flood protection structures in earthquake and flood prone areas around Cologne, Germany for multi-hazard risk assessment

    Science.gov (United States)

    Tyagunov, Sergey; Vorogushyn, Sergiy; Munoz Jimenez, Cristina; Parolai, Stefano; Fleming, Kevin; Merz, Bruno; Zschau, Jochen

    2013-04-01

    The work presents a methodology for fragility analyses of fluvial earthen dikes in earthquake and flood prone areas. Fragility estimates are being integrated into the multi-hazard (earthquake-flood) risk analysis being undertaken within the framework of the EU FP7 project MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) for the city of Cologne, Germany. Scenarios of probable cascading events due to the earthquake-triggered failure of flood protection dikes and the subsequent inundation of surroundings are analyzed for the area between the gauges Andernach and Düsseldorf along the Rhine River. Along this river stretch, urban areas are partly protected by earthen dikes, which may be prone to failure during exceptional floods and/or earthquakes. The seismic fragility of the dikes is considered in terms of liquefaction potential (factor of safety), estimated by the use of the simplified procedure of Seed and Idriss. It is assumed that initiation of liquefaction at any point throughout the earthen dikes' body corresponds to the failure of the dike and, therefore, this should be taken into account for the flood risk calculations. The estimated damage potential of such structures is presented as a two-dimensional surface (as a function of seismic hazard and water level). Uncertainties in geometrical and geotechnical dike parameters are considered within the framework of Monte Carlo simulations. Taking into consideration the spatial configuration of the existing flood protection system within the area under consideration, seismic hazard curves (in terms of PGA) are calculated for sites along the river segment of interest at intervals of 1 km. The obtained estimates are used to calculate the flood risk when considering the temporal coincidence of seismic and flood events. Changes in flood risk for the considered hazard cascade scenarios are quantified and compared to the single-hazard scenarios.
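
    The liquefaction screening mentioned above rests on the simplified Seed-Idriss comparison of a cyclic stress ratio (CSR) with the soil's cyclic resistance ratio (CRR). A hedged Python sketch of that factor-of-safety check, with illustrative inputs rather than the MATRIX project's dike parameters:

      # Simplified Seed-Idriss liquefaction check: factor of safety = CRR / CSR.
      # All numerical inputs below are illustrative only.

      def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
          """CSR = 0.65 (a_max/g) (sigma_v / sigma_v') r_d, with a common r_d approximation."""
          r_d = 1.0 - 0.00765 * depth_m if depth_m <= 9.15 else 1.174 - 0.0267 * depth_m
          return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

      def factor_of_safety(crr, a_max_g, sigma_v, sigma_v_eff, depth_m):
          return crr / cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m)

      # Example: PGA 0.2 g, total stress 60 kPa, effective stress 40 kPa, depth 3 m, CRR 0.15.
      fs = factor_of_safety(0.15, 0.20, 60.0, 40.0, 3.0)
      print(f"FS = {fs:.2f} -> {'liquefaction expected' if fs < 1 else 'no liquefaction triggered'}")

    In the fragility analysis described above, a factor of safety below one at any point in the dike body is treated as dike failure; repeating such a check over sampled geometrical and geotechnical parameters is what the Monte Carlo step formalizes.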

  1. Pattern recognition methodologies and deterministic evaluation of seismic hazard: A strategy to increase earthquake preparedness

    International Nuclear Information System (INIS)

    Peresan, Antonella; Panza, Giuliano F.; Gorshkov, Alexander I.; Aoudia, Abdelkrim

    2001-05-01

    Several algorithms, structured according to a general pattern-recognition scheme, have been developed for the space-time identification of strong events. Currently, two such algorithms are applied to the Italian territory: one for the recognition of earthquake-prone areas and the other, namely the CN algorithm, for earthquake prediction purposes. These procedures can be viewed as independent experts, hence they can be combined to better constrain the alerted seismogenic area. We examine here the possibility of integrating CN intermediate-term, medium-range earthquake predictions, pattern recognition of earthquake-prone areas and deterministic hazard maps, in order to associate CN Times of Increased Probability (TIPs) with a set of appropriate scenarios of ground motion. The advantage of this procedure consists mainly in the time information provided by the predictions, which is useful for increasing the preparedness of safety measures and for indicating priorities for detailed seismic risk studies to be performed at a local scale. (author)

  2. Development of uniform hazard response spectra for rock sites considering line and point sources of earthquakes

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Kushwaha, H.S.

    2001-12-01

    Traditionally, the seismic design basis ground motion has been specified by normalised response spectral shapes and peak ground acceleration (PGA). The mean recurrence interval (MRI) used to be computed for the PGA only. It is shown that the MRI associated with such response spectra is not the same at all frequencies. The present work develops uniform hazard response spectra, i.e. spectra having the same MRI at all frequencies, for line and point sources of earthquakes, by using a large number of strong-motion accelerograms recorded on rock sites. The sensitivity of the results to changes in various parameters has also been presented. This work is an extension of an earlier work for areal sources of earthquakes. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)
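
    Conceptually, a uniform hazard response spectrum is read off a family of hazard curves by picking, at every frequency, the spectral acceleration whose mean recurrence interval equals the target MRI. A hedged Python sketch with invented hazard-curve numbers (not this paper's data):

      import numpy as np

      # Read a uniform hazard spectrum (UHS) off a set of hazard curves: at each
      # frequency, find the spectral acceleration whose annual exceedance rate
      # equals 1 / (target MRI). Hazard-curve numbers below are invented.
      sa_grid = np.array([0.05, 0.1, 0.2, 0.4, 0.8])           # spectral acceleration (g)
      hazard_curves = {                                        # annual rate of exceedance per Sa level
          1.0:  np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5]),
          5.0:  np.array([5e-2, 2e-2, 6e-3, 1e-3, 1e-4]),
          10.0: np.array([4e-2, 1.5e-2, 4e-3, 8e-4, 8e-5]),
      }

      target_mri = 1000.0                                      # years
      target_rate = 1.0 / target_mri

      for freq, rates in sorted(hazard_curves.items()):
          # interpolate in log-log space; np.interp needs increasing abscissae
          log_sa = np.interp(np.log(target_rate), np.log(rates[::-1]), np.log(sa_grid[::-1]))
          print(f"{freq:5.1f} Hz: UHS Sa for a {target_mri:.0f}-yr MRI ~ {np.exp(log_sa):.3f} g")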

  3. Aftershock Duration of the 1976 Ms 7.8 Tangshan Earthquake: Implication for the Seismic Hazard Model with a Sensitivity Analysis

    Science.gov (United States)

    Zhong, Q.; Shi, B.

    2011-12-01

    The Ms 7.8 earthquake that occurred in Tangshan, China, on July 28, 1976 caused at least 240,000 deaths. The mainshock was followed by two large aftershocks: an Ms 7.1 event about 15 hours after the mainshock and an Ms 6.9 event on 15 November. The aftershock sequence continues to date, making the regional seismic activity rate around the Tangshan main fault much higher than it was before the main event. If these aftershocks are included in the local main-event catalog for PSHA calculations, the resulting seismic hazard will be overestimated in this region and underestimated elsewhere. However, it is always difficult for seismologists to accurately determine the duration of aftershock sequences and to identify aftershocks in a main-event catalog. In this study, using theoretical inference and the empirical relations given by Dieterich, we derive a plausible duration of the aftershock sequence of the Ms 7.8 Tangshan earthquake. The aftershock duration from a log-log regression approach is about 120 years according to the empirical Omori relation. Based on Dieterich's approach, the aftershock duration is a function of the remote shear stressing rate, the normal stress acting on the fault plane, and fault frictional constitutive parameters. In general, the shear stressing rate can be estimated in three ways: 1. The shear stressing rate can be written as a function of the static stress drop and a mean earthquake recurrence time; in this case, the duration of the aftershock sequence is about 70-100 years, although the recurrence time carries a great deal of uncertainty. 2. Ziv and Rubin derived a general relation between the shear stressing rate, fault slip speed and fault width, under the consideration that aftershock duration does not scale with mainshock magnitude; under this assumption, the aftershock duration is about 80 years. 3. The shear stressing rate can also be
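
    A minimal sketch of the modified Omori-law reasoning used above: the sequence is taken to end when the decaying aftershock rate reaches the regional background rate. The parameter values below are hypothetical, not a recalculation for Tangshan:

      # Modified Omori law and the "rate drops to background" definition of duration.
      def omori_rate(t_years, K, c, p):
          """Aftershock rate (events/yr) at time t after the mainshock."""
          return K / (c + t_years) ** p

      def aftershock_duration(K, c, p, background_rate):
          """Time until the Omori rate falls to the background rate (closed form)."""
          return (K / background_rate) ** (1.0 / p) - c

      K, c, p = 200.0, 0.05, 1.1          # hypothetical Omori parameters
      background = 1.0                     # hypothetical background rate, events/yr in the zone
      print(f"illustrative aftershock duration: {aftershock_duration(K, c, p, background):.0f} years")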

  4. Workshop on New Madrid geodesy and the challenges of understanding intraplate earthquakes

    Science.gov (United States)

    Boyd, Oliver; Calais, Eric; Langbein, John; Magistrale, Harold; Stein, Seth; Zoback, Mark

    2013-01-01

    On March 4, 2011, 26 researchers gathered in Norwood, Massachusetts, for a workshop sponsored by the U.S. Geological Survey and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazard. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. Several researchers were invited by the organizing committee to give overview presentations while all participants were encouraged to present their most recent ideas. The overview presentations appear in this report along with a set of recommendations.

  5. Wicked Problems in Natural Hazard Assessment and Mitigation

    Science.gov (United States)

    Stein, S.; Steckler, M. S.; Rundle, J. B.; Dixon, T. H.

    2017-12-01

    Social scientists have defined "wicked" problems that are "messy, ill-defined, more complex than we fully grasp, and open to multiple interpretations based on one's point of view... No solution to a wicked problem is permanent or wholly satisfying, which leaves every solution open to easy polemical attack." These contrast with "tame" problems in which necessary information is available and solutions - even if difficult and expensive - are straightforward to identify and execute. Updating the U.S.'s aging infrastructure is a tame problem, because what is wrong and how to fix it are clear. In contrast, addressing climate change is a wicked problem because its effects are uncertain and the best strategies to address them are unclear. An analogous approach can be taken to natural hazard problems. In tame problems, we have a good model of the process, good information about past events, and data implying that the model should predict future events. In such cases, we can make a reasonable assessment of the hazard that can be used to develop mitigation strategies. Earthquake hazard mitigation for San Francisco is a relatively tame problem. We understand how the earthquakes result from known plate motions, have information about past earthquakes, and have geodetic data implying that future similar earthquakes will occur. As a result, it is straightforward to develop and implement mitigation strategies. However, in many cases, hazard assessment and mitigation is a wicked problem. How should we prepare for a great earthquake on plate boundaries where tectonics favor such events but we have no evidence that they have occurred and hence how large they may be or how often to expect them? How should we assess the hazard within plates, for example in the New Madrid seismic zone, where large earthquakes have occurred but we do not understand their causes and geodetic data show no strain accumulating? How can we assess the hazard and make sensible policy when the recurrence of

  6. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    Science.gov (United States)

    Wyss, Max

    2013-04-01

    An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven people who had participated in an emergency meeting on March 30, assessing the probability of a major event following the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of job, loss of retirement rent, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take their usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone who has gross misunderstandings about seismology. They must be carefully prepared by experts. The more significant lesson is that the approach of calming the population and the standard probabilistic hazard and risk assessment, as practiced by GSHAP, are misleading. The latter has been criticized as

  7. Incorporating human-triggered earthquake risks into energy and water policies

    Science.gov (United States)

    Klose, C. D.; Seeber, L.; Jacob, K. H.

    2010-12-01

    A comprehensive understanding of earthquake risks in urbanized regions requires an accurate assessment of both urban vulnerabilities and hazards from earthquakes, including ones whose timing might be affected by human activities. Socioeconomic risks associated with human-triggered earthquakes are often misconstrued and receive little scientific, legal, and public attention. Worldwide, more than 200 damaging earthquakes associated with industrialization and urbanization have been documented since the beginning of the 20th century. Geomechanical pollution due to large-scale geoengineering activities can advance the clock of earthquakes, trigger new seismic events or even shut down natural background seismicity. Such activities include mining, hydrocarbon production, fluid injection, water reservoir impoundment and deep-well geothermal energy production. This type of geohazard has impacts on human security at regional and national levels. Some planned or considered future engineering projects raise particularly strong concerns about triggered earthquakes, such as the sequestration of carbon dioxide by injecting it deep underground and large-scale natural gas production in the Marcellus shale in the Appalachian Basin. Worldwide examples of earthquakes are discussed, including their associated losses of human life and monetary losses (e.g., the 1989 Newcastle and Volkershausen earthquakes, the 1993 Killari earthquake, the 2006 Basel earthquake, and the 2008 Wenchuan earthquake). An overview is given of global statistics of human-triggered earthquakes, including the depths and time delays of triggering. Lastly, strategies are described, including risk mitigation measures such as urban planning adaptations and seismic hazard mapping.

  8. Deep-Sea Turbidites as Guides to Holocene Earthquake History at the Cascadia Subduction Zone—Alternative Views for a Seismic-Hazard Workshop

    Science.gov (United States)

    Atwater, Brian F.; Griggs, Gary B.

    2012-01-01

    This report reviews the geological basis for some recent estimates of earthquake hazards in the Cascadia region between southern British Columbia and northern California. The largest earthquakes to which the region is prone are in the range of magnitude 8-9. The source of these great earthquakes is the fault down which the oceanic Juan de Fuca Plate is being subducted or thrust beneath the North American Plate. Geologic evidence for their occurrence includes sedimentary deposits that have been observed in cores from deep-sea channels and fans. Earthquakes can initiate subaqueous slumps or slides that generate turbidity currents and which produce the sedimentary deposits known as turbidites. The hazard estimates reviewed in this report are derived mainly from deep-sea turbidites that have been interpreted as proxy records of great Cascadia earthquakes. The estimates were first published in 2008. Most of the evidence for them is contained in a monograph now in press. We have reviewed a small part of this evidence, chiefly from Cascadia Channel and its tributaries, all of which head offshore the Pacific coast of Washington State. According to the recent estimates, the Cascadia plate boundary ruptured along its full length in 19 or 20 earthquakes of magnitude 9 in the past 10,000 years; its northern third broke during these giant earthquakes only, and southern segments produced at least 20 additional, lesser earthquakes of Holocene age. The turbidite case for full-length ruptures depends on stratigraphic evidence for simultaneous shaking at the heads of multiple submarine canyons. The simultaneity has been inferred primarily from turbidite counts above a stratigraphic datum, sandy beds likened to strong-motion records, and radiocarbon ages adjusted for turbidity-current erosion. In alternatives proposed here, this turbidite evidence for simultaneous shaking is less sensitive to earthquake size and frequency than previously thought. Turbidites far below a channel

  9. Perception of Natural Hazards and Risk among University of Washington Students

    Science.gov (United States)

    Herr, K.; Brand, B.; Hamlin, N.; Ou, J.; Thomas, B.; Tudor, E.

    2012-12-01

    Familiarity with a given population's perception of natural hazards and the threats they present is vital for the development of effective education prior to and emergency management response after a natural event. While much work has been done in other active tectonic regions, perception of natural hazards and risk among Pacific Northwest (PNW) residents is poorly constrained. The objective of this work is to assess the current perception of earthquake and volcanic hazards and risk in the PNW, and to better understand the factors which drive the public's behavior concerning preparedness and response. We developed a survey to assess knowledge of the natural hazards common to the region, the perception of risk concerning these hazards, and the level of preparedness should a natural hazard occur. The survey was distributed to University of Washington students and employees via an internet link as part of a class project in 'Living with Volcanoes' (ESS 106) in March of 2012, which returned more than 900 responses. The UW student population was chosen as our first "population" to assess because of its uniqueness as a large, semi-transient population (typical residence of less than 5 years). Only 50% of participants correctly reported their proximity to an active volcano, indicating either a lack of knowledge of active volcanoes in the region or poor spatial awareness. When asked which areas were most at risk from lahars, respondents indicated that all areas close to the hazard source, including topographically elevated regions, were at a higher risk than more distal and low-lying localities that are also at high risk, indicating a lack of knowledge concerning the topographic dependency of this hazard. Participants perceived themselves to be better able to cope with an earthquake than with a volcanic event. This perception may be due to lack of knowledge of volcanic hazards and their extent or due to a false sense of security concerning earthquakes fostered by regular

  10. Global assessment of human losses due to earthquakes

    Science.gov (United States)

    Silva, Vitor; Jaiswal, Kishor; Weatherill, Graeme; Crowley, Helen

    2014-01-01

    Current studies have demonstrated a sharp increase in human losses due to earthquakes. These alarming levels of casualties suggest the need for large-scale investment in seismic risk mitigation, which, in turn, requires an adequate understanding of the extent of the losses, and location of the most affected regions. Recent developments in global and uniform datasets such as instrumental and historical earthquake catalogues, population spatial distribution and country-based vulnerability functions, have opened an unprecedented possibility for a reliable assessment of earthquake consequences at a global scale. In this study, a uniform probabilistic seismic hazard assessment (PSHA) model was employed to derive a set of global seismic hazard curves, using the open-source software OpenQuake for seismic hazard and risk analysis. These results were combined with a collection of empirical fatality vulnerability functions and a population dataset to calculate average annual human losses at the country level. The results from this study highlight the regions/countries in the world with a higher seismic risk, and thus where risk reduction measures should be prioritized.
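
    The country-level average annual loss described above is, in essence, a convolution of a seismic hazard curve with a fatality vulnerability function. A hedged sketch of that calculation with invented numbers (not OpenQuake output):

      import numpy as np

      # Average annual loss from a hazard curve and a fatality vulnerability function.
      # All numbers are illustrative only.
      im = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8])                     # intensity measure, e.g. PGA (g)
      annual_exceed_rate = np.array([1e-1, 3e-2, 1e-2, 4e-3, 1e-3, 2e-4])
      expected_fatalities = np.array([0.0, 5.0, 40.0, 150.0, 600.0, 1500.0])   # per event at that IM

      # Occurrence rate of each IM bin = drop in exceedance rate across the bin.
      occurrence_rate = -np.diff(annual_exceed_rate)
      fatalities_mid = 0.5 * (expected_fatalities[:-1] + expected_fatalities[1:])
      average_annual_loss = float(np.sum(occurrence_rate * fatalities_mid))
      print(f"average annual human losses (illustrative): {average_annual_loss:.1f} fatalities/yr")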

  11. Large earthquakes and creeping faults

    Science.gov (United States)

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  12. How to eliminate non-damaging earthquakes from the results of a probabilistic seismic hazard analysis (PSHA)-A comprehensive procedure with site-specific application

    International Nuclear Information System (INIS)

    Kluegel, Jens-Uwe

    2009-01-01

    The results of probabilistic seismic hazard analyses are frequently presented in terms of uniform hazard spectra or hazard curves with spectral accelerations as the output parameter. The calculation process is based on the evaluation of the probability of exceedance of specified acceleration levels without consideration of the damaging effects of the causative earthquakes. The same applies to the empirical attenuation equations for spectral accelerations used in PSHA models. This makes interpreting and using the results in engineering or risk applications difficult. Uniform hazard spectra and the associated hazard curves may contain a significant amount of contributions from weak, low-energy earthquakes not able to damage the seismically designed structures of nuclear power plants. For the development of realistic engineering designs and for realistic seismic probabilistic risk assessments (seismic PRA) it is necessary to remove the contribution of non-damaging earthquakes from the results of a PSHA. A detailed procedure for the elimination of non-damaging earthquakes based on the CAV (cumulative absolute velocity) filtering approach was developed and applied to the results of the large-scale PEGASOS probabilistic seismic hazard study for the site of the Goesgen nuclear power plant. The procedure considers the full scope of epistemic uncertainty and aleatory variability present in the PEGASOS study. It involves the development of a set of empirical correlations for CAV and the subsequent development of a composite distribution for the probability of exceedance of the damaging threshold of 0.16 g-s. Additionally, a method was developed to measure the difference in the damaging effects of earthquakes of different strengths by the ratio of a power function of Arias intensity or, in the ideal case, by the ratio of the square roots of the associated strong-motion durations. The procedure was applied for the update of the Goesgen seismic PRA and for the confirmation of a
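
    A minimal sketch of the CAV screen itself, using the plain definition (integral of |a(t)| dt) on a synthetic accelerogram; the windowed, noise-filtered CAV variants used in practice, and the study's empirical correlations, are omitted:

      import numpy as np

      # Cumulative absolute velocity of an accelerogram and the 0.16 g-s screen.
      def cumulative_absolute_velocity(accel_g, dt):
          """CAV = integral of |a(t)| dt, acceleration in g, time step in s (units: g-s)."""
          return float(np.sum(np.abs(accel_g)) * dt)

      dt = 0.01                                                        # s
      t = np.arange(0.0, 20.0, dt)
      accel = 0.05 * np.exp(-0.3 * t) * np.sin(2.0 * np.pi * 2.0 * t)  # synthetic record (g)

      cav = cumulative_absolute_velocity(accel, dt)
      verdict = "retained as potentially damaging" if cav >= 0.16 else "screened out as non-damaging"
      print(f"CAV = {cav:.3f} g-s -> {verdict}")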

  13. Introduction to the focus section on the 2015 Gorkha, Nepal, earthquake

    Science.gov (United States)

    Hough, Susan E.

    2015-01-01

    It has long been recognized that Nepal faces high earthquake hazard, with the most recent large (Mw>7.5) events in 1833 and 1934. When the 25 April 2015 Mw 7.8 Gorkha earthquake struck, it appeared initially to be a realization of worst fears. In spite of its large magnitude and proximity to the densely populated Kathmandu valley, however, the level of damage was lower than anticipated, with most vernacular structures within the valley experiencing little or no structural damage. Outside the valley, catastrophic damage did occur in some villages, associated with the high vulnerability of stone masonry construction and, in many cases, landsliding. The unexpected observations from this expected earthquake provide an urgent impetus to understand the event itself and to better characterize hazard from future large Himalayan earthquakes. Toward this end, articles in this special focus section present and describe available data sets and initial results that better illuminate and interpret the earthquake and its effects.

  14. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  15. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    Science.gov (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified in two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinions and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and are commonly used because of the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and newer heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis to assess landslide hazard over large regions. However, they depend on the qualitative and quantitative data, scale, types of movements and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. One application has been developed in El Salvador, a country of Central America where earthquake-triggered landslides are a common phenomenon. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 13 January 2001 earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. The results of the landslide susceptibility analysis are checked using landslide
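
    As a brief illustration of the quantitative, statistical branch described above, the sketch below fits a logistic regression to a purely synthetic table of terrain attributes and landslide occurrence labels using scikit-learn; the feature names, synthetic data and library choice are assumptions, and the snippet does not reproduce the calibrated El Salvador models or the ANN comparison.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        # Hypothetical predictor matrix: one row per terrain cell with slope (deg),
        # elevation (m), mean annual precipitation (mm) and terrain roughness (0-1).
        rng = np.random.default_rng(42)
        n = 2000
        X = np.column_stack([
            rng.uniform(0, 60, n),
            rng.uniform(0, 2500, n),
            rng.uniform(500, 2500, n),
            rng.uniform(0, 1, n),
        ])
        # Synthetic occurrence labels: steeper, rougher cells fail more often.
        p_true = 1.0 / (1.0 + np.exp(-(0.08 * X[:, 0] + 2.0 * X[:, 3] - 4.0)))
        y = rng.binomial(1, p_true)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

        # Susceptibility = predicted probability of landslide occurrence per cell.
        susceptibility = model.predict_proba(X_te)[:, 1]
        print("AUC:", round(roc_auc_score(y_te, susceptibility), 3))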

  16. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude and rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of these extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events and the influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for the mitigation of earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).
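
    The frequency-magnitude relationship mentioned above is the Gutenberg-Richter law, log10 N(≥m) = a − b·m. A minimal sketch of estimating its b-value with the Aki (1965) maximum-likelihood formula on a synthetic catalog follows; the completeness magnitude and the catalog itself are assumptions for illustration only.

        import numpy as np

        def b_value_aki(magnitudes, mc):
            # Aki (1965) maximum-likelihood b-value for a catalog complete above mc.
            # For magnitudes binned at dM, mc is commonly replaced by mc - dM/2 (Utsu).
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= mc]
            return np.log10(np.e) / (m.mean() - mc)

        # Synthetic Gutenberg-Richter catalog with a true b-value of 1.0 above Mc = 2.0.
        rng = np.random.default_rng(1)
        mags = 2.0 + rng.exponential(scale=1.0 / np.log(10.0), size=5000)
        print("estimated b-value:", round(b_value_aki(mags, mc=2.0), 2))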

  17. Vulnerability assessment of archaeological sites to earthquake hazard: An indicator based method integrating spatial and temporal aspects

    Directory of Open Access Journals (Sweden)

    Despina Minos-Minopoulos

    2017-07-01

    Full Text Available Across the world, numerous sites of cultural heritage value are at risk from a variety of human-induced and natural hazards such as war and earthquakes. Here we present and test a novel indicator-based method for assessing the vulnerability of archaeological sites to earthquakes. Vulnerability is approached as a dynamic element assessed through a combination of spatial and temporal parameters. The spatial parameters examine the susceptibility of the sites to the secondary Earthquake Environmental Effects of ground liquefaction, landslides and tsunami and are expressed through the Spatial Susceptibility Index (SSi). Parameters of physical vulnerability, economic importance and visitor density examine the temporal vulnerability of the sites, expressed through the Temporal Vulnerability Index (TVi). The equally weighted sum of the spatial and temporal indexes represents the total Archaeological Site Vulnerability Index (A.S.V.I.). The A.S.V.I. method is applied at 16 archaeological sites across Greece, allowing an assessment of their vulnerability. This then allows the establishment of a regional and national priority list for considering future risk mitigation. Results indicate that (i) the majority of the sites have low to moderate vulnerability to earthquake hazard, (ii) Neratzia Fortress on Kos and Heraion on Samos are characterised as highly vulnerable and should be prioritised for further studies and mitigation measures, and (iii) the majority of the sites are susceptible to at least one Earthquake Environmental Effect and present relatively high physical vulnerability attributed to the existing limited conservation works. This approach highlights the necessity for an effective vulnerability assessment methodology within the existing framework of disaster risk management for cultural heritage.
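
    A minimal sketch of the equally weighted index combination described above, assuming the spatial and temporal indexes have already been normalized to a common scale; the weights and example values are illustrative placeholders rather than scores from the study.

        def asvi(ssi, tvi, w_spatial=0.5, w_temporal=0.5):
            # Archaeological Site Vulnerability Index as the equally weighted sum of the
            # Spatial Susceptibility Index (SSi) and Temporal Vulnerability Index (TVi),
            # both assumed normalized to the same 0-1 scale.
            return w_spatial * ssi + w_temporal * tvi

        # Hypothetical site: high spatial susceptibility, moderate temporal vulnerability.
        print(asvi(0.8, 0.4))   # -> 0.6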

  18. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Science.gov (United States)

    Nelson, C. H.; Gutiérrez Pastor, J.; Goldfinger, C.; Escutia, C.

    2012-11-01

    We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in the northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are in (Cascadia) or very close (San Andreas) to the early window for another great earthquake. On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins cause seismic strengthening of the sediment, which
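
    The relative-dating (H method) arithmetic summarized above reduces to dividing the hemipelagic sediment thickness between successive turbidites by the hemipelagic sedimentation rate; the sketch below shows the calculation with purely illustrative numbers, not measured core values.

        def recurrence_time_h_method(hemipelagic_thickness_cm, sed_rate_cm_per_kyr):
            # Interseismic time represented by one turbidite couplet (in years):
            # hemipelagic thickness divided by hemipelagic sedimentation rate.
            return hemipelagic_thickness_cm / sed_rate_cm_per_kyr * 1000.0

        # Illustrative only: 5.5 cm of hemipelagic mud at 10 cm/kyr gives ~550 yr,
        # comparable to the average Cascadia recurrence time quoted above.
        print(recurrence_time_h_method(5.5, 10.0), "yr")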

  19. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  20. A consideration of hazards, earthquakes, aircraft crashes, explosions and fires in the safety of laboratories and plants

    International Nuclear Information System (INIS)

    Doumenc, A.; Faure, J.; Mohammadioun, B.; Jacquet, P.

    1987-03-01

    Although laboratories and plants differ from nuclear reactors both in their characteristics and sitings, safety measures developed for the hazards of earthquakes, aircraft crashes, explosions and fires are very similar. These measures provide a satisfactory level of safety for these installations [fr

  1. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    International Nuclear Information System (INIS)

    Delorey, Andrew A.; Elst, Nicholas J. van der; Johnson, Paul Allan

    2016-01-01

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding a fault's stress state and poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest success in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified, and thus we developed a technique to improve the identification of very small magnitude events. We identify events by applying a method known as inter-station seismic coherence, in which we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find that the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress threshold and critical state behavior. Lastly, our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.
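
    The inter-station coherence approach is only summarized in the abstract above; as a rough illustration of the underlying idea (a weak signal common to two stations raises coherence where uncorrelated noise does not), the sketch below applies scipy.signal.coherence to two synthetic traces. The event waveform, noise level and frequency band are all assumptions.

        import numpy as np
        from scipy.signal import coherence

        fs = 100.0                              # sampling rate (Hz)
        t = np.arange(0.0, 60.0, 1.0 / fs)      # 60 s window
        rng = np.random.default_rng(7)

        # Weak transient common to both stations, buried in independent station noise.
        event = 0.5 * np.exp(-((t - 30.0) ** 2) / 0.5) * np.sin(2 * np.pi * 8.0 * t)
        sta1 = event + rng.standard_normal(t.size)
        sta2 = event + rng.standard_normal(t.size)

        f, cxy = coherence(sta1, sta2, fs=fs, nperseg=1024)
        band = (f > 5.0) & (f < 12.0)
        print("mean 5-12 Hz inter-station coherence:", round(float(cxy[band].mean()), 3))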

  2. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    Science.gov (United States)

    Delorey, Andrew; Van Der Elst, Nicholas; Johnson, Paul

    2017-01-01

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding a fault's stress state and poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest success in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified, and thus we developed a technique to improve the identification of very small magnitude events. We identify events by applying a method known as inter-station seismic coherence, in which we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find that the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress threshold and critical state behavior. Our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.

  3. Keeping pace with the science: Seismic hazard analysis in the central and eastern United States

    International Nuclear Information System (INIS)

    Coppersmith, K.J.; Youngs, R.R.

    1989-01-01

    Our evolving tectonic understanding of the causes and locations of earthquakes in the central and eastern US (CEUS) has been a challenge to probabilistic seismic hazard analyses (PSHA) methodologies. The authors summarize some of the more significant advances being made in characterizing the location, maximum earthquake size, recurrence, and ground motions associated with CEUS earthquakes

  4. Multidisciplinary Geo-scientific Hazard Analyses: Istanbul Microzonation Projects

    Science.gov (United States)

    Kara, Sema; Baş, Mahmut; Kılıç, Osman; Tarih, Ahmet; Yahya Menteşe, Emin; Duran, Kemal

    2017-04-01

    Istanbul (Turkey) is located at the western end of the North Anatolian Fault and hence is an earthquake-prone city with a population that exceeds 15 million people. In addition, the city is still growing as a center of commerce, tourism and culture, which steadily increases the exposure. Although Istanbul grew faster during the last decade than ever before in its history, precautions against a possible earthquake have also increased steadily. The two big earthquakes that occurred in 1999 alongside Istanbul (in Kocaeli and Duzce Provinces) became the trigger events that accelerated disaster risk reduction activities in Istanbul. Following a loss estimation study carried out by the Japanese International Cooperation Agency (JICA) in 2001 and the Istanbul Earthquake Master Plan prepared by researchers from four major universities in 2003, it was concluded that understanding and analyzing the geological structure of Istanbul was the main concern. Thereafter, Istanbul Metropolitan Municipality's Directorate of Earthquake and Ground Research (DEGRE) carried out two major geo-scientific studies called "microzonation studies", covering 650 km2 of Istanbul's urbanized areas between 2006 and 2009. The studies were called "microzonation" because the analysis resolution was as dense as 250 m grids and included various assessments of hazards such as ground shaking, liquefaction, karstification, landslides, flooding, and surface faulting. After the evaluation of geological, geotechnical and geophysical measurements, Earthquake and Tsunami Hazard Maps for all of Istanbul, together with slope, engineering geology, ground water level, faulting, ground shaking, inundation, shear wave velocity and soil classification maps for the project areas, were obtained. In the end, "Land Suitability Maps" were derived from the combination of these inputs using a multi-hazard approach. As a result, microzonation is a tool for risk-oriented urban planning, consisting of interdisciplinary multi-hazard risk analyses. The outputs of
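
    A generic weighted-overlay sketch of the kind of multi-hazard combination described above, in which normalized hazard layers on a common grid are blended into a single suitability surface; the layer names, grid and weights are hypothetical and do not reproduce the DEGRE methodology.

        import numpy as np

        # Hypothetical normalized hazard layers on a common 250 m grid (0 = low, 1 = high).
        shape = (40, 40)
        rng = np.random.default_rng(3)
        layers = {
            "ground_shaking": rng.random(shape),
            "liquefaction":   rng.random(shape),
            "landslide":      rng.random(shape),
            "inundation":     rng.random(shape),
        }
        # Hypothetical weights summing to 1; a real study would assign them from
        # expert judgment and the site investigations described above.
        weights = {"ground_shaking": 0.4, "liquefaction": 0.25,
                   "landslide": 0.2, "inundation": 0.15}

        combined_hazard = sum(w * layers[k] for k, w in weights.items())
        land_suitability = 1.0 - combined_hazard   # higher hazard -> lower suitability
        print("suitability range:",
              round(float(land_suitability.min()), 2), "to",
              round(float(land_suitability.max()), 2))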

  5. How fault geometry controls earthquake magnitude

    Science.gov (United States)

    Bletery, Q.; Thomas, A.; Karlstrom, L.; Rempel, A. W.; Sladen, A.; De Barros, L.

    2016-12-01

    Recent large megathrust earthquakes, such as the Mw 9.3 Sumatra-Andaman earthquake in 2004 and the Mw 9.0 Tohoku-Oki earthquake in 2011, astonished the scientific community. The first event occurred in a relatively low-convergence-rate subduction zone where events of its size were unexpected. The second event involved 60 m of shallow slip in a region thought to be aseismically creeping and hence incapable of hosting very large magnitude earthquakes. These earthquakes highlight gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution. Here we show that gradients in dip angle exert a primary control on mega-earthquake occurrence. We calculate the curvature along the major subduction zones of the world and show that past mega-earthquakes occurred on flat (low-curvature) interfaces. A simplified analytic model demonstrates that shear strength heterogeneity increases with curvature. Stress loading on flat megathrusts is more homogeneous and hence more likely to be released simultaneously over large areas than on highly curved faults. Therefore, the absence of asperities on large faults might counter-intuitively be a source of higher hazard.
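
    The curvature calculation itself is not given in the abstract above; as a rough stand-in, the sketch below approximates interface curvature from the second spatial derivatives of a gridded slab-depth surface. The synthetic surface, grid spacing and Laplacian-style proxy are assumptions, not the authors' definition.

        import numpy as np

        def curvature_proxy(depth_km, spacing_km):
            # Rough curvature proxy: magnitude of the Laplacian of a gridded slab-depth
            # surface (sum of second spatial derivatives). Low values correspond to the
            # flat megathrust geometries discussed above.
            dz_dy, dz_dx = np.gradient(depth_km, spacing_km)
            d2z_dy2 = np.gradient(dz_dy, spacing_km, axis=0)
            d2z_dx2 = np.gradient(dz_dx, spacing_km, axis=1)
            return np.abs(d2z_dx2 + d2z_dy2)

        # Synthetic slab surface: depth increases down-dip with a gentle along-strike bend.
        x = np.linspace(0.0, 400.0, 81)    # along strike (km), 5 km spacing
        y = np.linspace(0.0, 200.0, 41)    # down dip (km), 5 km spacing
        X, Y = np.meshgrid(x, y)
        depth = 0.15 * Y + 5.0 * np.sin(X / 150.0)

        curv = curvature_proxy(depth, spacing_km=5.0)
        print("maximum curvature proxy (1/km):", round(float(curv.max()), 5))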

  6. Earthquake Scenarios Based Upon the Data and Methodologies of the U.S. Geological Survey's National Seismic Hazard Mapping Project

    Science.gov (United States)

    Rukstales, K. S.; Petersen, M. D.; Frankel, A. D.; Harmsen, S. C.; Wald, D. J.; Quitoriano, V. R.; Haller, K. M.

    2011-12-01

    The U.S. Geological Survey's (USGS) National Seismic Hazard Mapping Project (NSHMP) utilizes a database of over 500 faults across the conterminous United States to constrain earthquake source models for probabilistic seismic hazard maps. Additionally, the fault database is now being used to produce a suite of deterministic ground motions for earthquake scenarios that are based on the same fault source parameters and empirical ground motion prediction equations used for the probabilistic hazard maps. Unlike the calculated hazard map ground motions, local soil amplification is applied to the scenario calculations based on the best available Vs30 (average shear-wave velocity down to 30 meters) mapping, or in some cases using topographic slope as a proxy. Systematic outputs include all standard USGS ShakeMap products, including GIS, KML, XML, and HAZUS input files. These data are available from the ShakeMap web pages with a searchable archive. The scenarios are being produced within the framework of a geographic information system (GIS) so that alternative scenarios can readily be produced by altering fault source parameters, Vs30 soil amplification, as well as the weighting of ground motion prediction equations used in the calculations. The alternative scenarios can then be used for sensitivity analysis studies to better characterize uncertainty in the source model and convey this information to decision makers. By providing a comprehensive collection of earthquake scenarios based upon the established data and methods of the USGS NSHMP, we hope to provide a well-documented source of data which can be used for visualization, planning, mitigation, loss estimation, and research purposes.

  7. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Directory of Open Access Journals (Sweden)

    C. H. Nelson

    2012-11-01

    Full Text Available We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in the northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are in (Cascadia) or very close (San Andreas) to the early window for another great earthquake.

    On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins

  8. GEOS seismograms for aftershocks of the earthquakes of December 7, 1988, near Spitak, Armenia SSR, during the time period 30 December 1988 14:00 through 2 January 1989 (UTC): Chapter D in Results and data from seismologic and geologic studies following earthquakes of December 7, 1988, near Spitak, Armenia SSR (Open-File Report 89-163)

    Science.gov (United States)

    Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward

    1989-01-01

    The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world.The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible.This report provides seismologic and geologic data collected during the time period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but other regions of the world exposed to high seismic risk

  9. Marmara Island earthquakes, of 1265 and 1935; Turkey

    Directory of Open Access Journals (Sweden)

    Y. Altınok

    2006-01-01

    Full Text Available The long-term seismicity of the Marmara Sea region in northwestern Turkey is relatively well recorded. Some large and some of the smaller events are clearly associated with fault zones known to be seismically active, which have distinct morphological expressions and have generated damaging earthquakes both before and since. Some less common, moderate-size earthquakes have occurred in the vicinity of the Marmara Islands in the western Marmara Sea. This paper presents an extended summary of the most important of these events, the earthquakes of 1265 and 1935, which have since been known as the Marmara Island earthquakes. The informative data and the approaches used therefore have the potential to document earthquake ruptures of fault segments and to extend the records of earthquakes, rock falls and abnormal sea waves observed during such events far beyond known history, thus improving hazard evaluations and the fundamental understanding of the earthquake process.

  10. Meeting of the Central and Eastern U.S. (CEUS) Earthquake Hazards Program October 28–29, 2009

    Science.gov (United States)

    Tuttle, Martitia; Boyd, Oliver; McCallister, Natasha

    2013-01-01

    On October 28th and 29th, 2009, the U.S. Geological Survey Earthquake Hazards Program held a meeting of Central and Eastern United States investigators and interested parties in Memphis, Tennessee. The purpose of the meeting was to bring together the Central and Eastern United States earthquake-hazards community to present and discuss recent research results, to promote communication and collaboration, to garner input regarding future research priorities, to inform the community about research opportunities afforded by the 2010–2012 arrival of EarthScope/USArray in the central United States, and to discuss plans for the upcoming bicentennial of the 1811–1812 New Madrid earthquakes. The two-day meeting included several keynote speakers, oral and poster presentations by attendees, and breakout sessions. The meeting is summarized in this report and can be subdivided into four primary sections: (1) summaries of breakout discussion groups; (2) list of meeting participants; (3) submitted abstracts; and (4) slide presentations. The abstracts and slides are included “as submitted” by the meeting participants and have not been subject to any formal peer review process; information contained in these sections reflects the opinions of the presenter at the time of the meeting and does not constitute endorsement by the U.S. Geological Survey.

  11. Roaming earthquakes in China highlight midcontinental hazards

    Science.gov (United States)

    Liu, Mian; Wang, Hui

    2012-11-01

    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  12. An innovative assessment of the seismic hazard from Vrancea intermediate-depth earthquakes: Case studies in Romania and Bulgaria

    International Nuclear Information System (INIS)

    Panza, G.F.; Cioflan, C.; Marmureanu, G.; Kouteva, M.; Paskaleva, I.; Romanelli, F.

    2002-02-01

    An advanced procedure for ground motion modelling, capable of synthesizing the seismic ground motion from a basic understanding of the fault mechanism and seismic wave propagation, is applied to the case studies of Bucharest (Romania) and Russe, NE Bulgaria, exposed to the seismic hazard from Vrancea events. Synthetic seismic signals along representative geological cross sections in Bucharest and Russe have been computed, and the energetic input spectra have been derived both from the synthetic signals and from the few existing records. The theoretical signals are successfully compared with the available observations. The site response has been calculated for three recent, strong, intermediate-depth Vrancea earthquakes: August 30, 1986 and May 30 and 31, 1990. The approach used differs significantly from today's engineering practice, which relies upon rock-site hazard maps and applies the site correction at a later stage. The obtained results show that it is very useful to estimate the site effect via waveform modelling, considering simultaneously the geotechnical properties of the site, the position and geometry of the seismic source and the mechanical properties of the propagation medium. (author)

  13. Social Media: Gateway to Public Preparedness and Understanding of GeoHazards

    Science.gov (United States)

    Ballmann, J. E.; Bohon, W.; Bartel, B. A.

    2016-12-01

    The clear, timely communication of natural hazards information is critical to providing the public with the tools and information they need to make informed decisions before, during, and after events such as earthquakes, tsunamis, and volcanic eruptions. For the geohazards community, this is a multi-sector collaboration involving partners from national, state, and local governments, businesses, educational organizations, non-profit groups, and scientific institutions, for the benefit and participation of the whole community. Communications channels must be clear, consistent, and unified for the sake of maximum reach. One method of public communication that has proven to be particularly effective in disseminating hazards-related information is social media. The broad social and geographic reach of social media coupled with its ubiquitous use in all age groups makes it a powerful way to reach large segments of the population. Social media is already widely used by mass media and scientific organizations to communicate science and hazards. However, it is important that science organizations present a united and clear message, particularly about hazards preparation and response. The Southern California Earthquake Center (SCEC), UNAVCO, and the Incorporated Research Institutions for Seismology (IRIS) have created a Joint Social Media Task Force. The objective of this collaboration is 1) to build social media communities and improve the reach of science messaging, 2) to create and present consistent and clear messaging across social media platforms and regional facilities, 3) to promote outstanding products and educational information, 4) to assist and collaborate in regional, national and international efforts (TweetChats, Reddit fora, ShakeOut, etc.) and 5) to assist and support the efforts of FEMA, the USGS and other partner organizations during crisis situations. Here, we outline the difficulties and successes of creating such an alliance and provide a road map

  14. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages in implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full waveform tsunami computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
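
    A minimal sketch of the slip-weighted Green's function summation described above, assuming precomputed unit-slip sea-surface waveforms for each subfault at a single coastal point; the waveforms, time axis and slip values are synthetic placeholders rather than output of a tsunami propagation code.

        import numpy as np

        def synthesize_tsunami(unit_slip_waveforms, slip):
            # Sum precomputed unit-slip subfault tsunami waveforms, weighted by slip.
            # unit_slip_waveforms: (n_subfaults, n_time) sea-surface height per metre
            # of slip at one coastal point; slip: (n_subfaults,) slip values in metres.
            return np.asarray(slip) @ np.asarray(unit_slip_waveforms)

        # Synthetic unit-slip waveforms for three subfaults at a single coastal location.
        t = np.linspace(0.0, 3600.0, 721)             # one hour, 5 s samples
        gf = np.vstack([
            0.02 * np.sin(2 * np.pi * (t - lag) / 1200.0) * ((t > lag) & (t < lag + 2400))
            for lag in (300.0, 600.0, 900.0)
        ])
        slip = [4.0, 8.0, 2.0]                        # hypothetical slip distribution (m)

        eta = synthesize_tsunami(gf, slip)            # combined coastal waveform (m)
        print("peak tsunami height:", round(float(eta.max()), 2), "m")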

  15. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    Science.gov (United States)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes is then followed by a period of quiescence during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence made in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard representing the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals has been calculated for illustrative purposes.
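
    A schematic of the three-part hazard function described above, assuming simple exponential forms for the decaying and growing terms; the functional shapes, parameter names and values are illustrative placeholders rather than the calibrated model of the paper.

        import numpy as np

        def composite_hazard(t_yr, lam_bg=0.05, a_post=0.5, tau_post=50.0,
                             a_pre=0.4, t_next=400.0, tau_pre=80.0):
            # Schematic hazard rate (events/yr) at time t after the last large-earthquake
            # cluster: decaying post-cluster term + term growing toward the next expected
            # cluster + constant background of small-to-moderate earthquakes.
            t = np.asarray(t_yr, dtype=float)
            decaying = a_post * np.exp(-t / tau_post)
            increasing = a_pre * np.exp(-np.maximum(t_next - t, 0.0) / tau_pre)
            return lam_bg + decaying + increasing

        for t in (10.0, 100.0, 300.0, 400.0):
            print(f"t = {t:5.0f} yr  ->  hazard rate = {float(composite_hazard(t)):.3f} /yr")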

  16. A Virtual Tour of the 1868 Hayward Earthquake in Google Earth™

    Science.gov (United States)

    Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.

    2007-12-01

    The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake drawing on scientific and historic information. We will use Google Earth™ software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America, to more specific information about the 1868 Hayward earthquake itself. Text and Google Earth™ layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.

  17. Earthquake induced landslide hazard field observatory in the Avcilar peninsula

    Science.gov (United States)

    Bigarre, Pascal; Coccia, Stella; Theoleyre, Fiona; Ergintav, Semih; Özel, Oguz; Yalçinkaya, Esref; Lenti, Luca; Martino, Salvatore; Gamba, Paolo; Zucca, Francesco; Moro, Marco

    2015-04-01

    SAR temporal series analysis has been undertaken, providing a global but accurate identification and characterization of the gravitational phenomena covering the area. Evaluation of the resolution and identification of landslide hazard-related features using spaceborne multispectral/hyperspectral image data has been realized. Advantage has been taken of a vast drilling and geological-geotechnical survey program undertaken by the Istanbul Metropolitan Area to obtain important data to complete the geological model of the landslide, as well as one deep borehole to set up permanent instrumentation on a quite large, slow landslide fully encircled by a dense built environment. The selected landslide was instrumented in 2014 with a real-time observational system including GPS, rainfall, piezometer and seismic monitoring. The objectives of this permanent monitoring system are, first, to detect and quantify the interaction between seismic motion, rainfall and mass movement, building a database open to the scientific community in the future, and second, to help calibrate dynamic numerical geomechanical simulations intended to study the sensitivity to seismic loading. Last but not least, important geophysical field work has been conducted to assess the seismic site effects already noticed during the 1999 earthquake. Data, metadata and main results are now being progressively compiled and formatted for appropriate integration into the cloud monitoring infrastructure for data sharing.

  18. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, journalists. This initiative, built up in a few weeks, received very large feedback, partly because the media highlighted the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction versus probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take a day off and leave the town or stay in public parks), we contributed to reducing this feeling and therefore the social cost of this strange Roman day. Moreover, another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach laboratory, the Graphics Laboratory and SissaMedialab. P.S. No large earthquake happened

  19. An innovative view to the seismic hazard from strong Vrancea intermediate-depth earthquakes: the case studies of Bucharest (Romania) and Russe (Bulgaria)

    International Nuclear Information System (INIS)

    Panza, G.F.; Cioflan, C.; Marmureanu, G.; Kouteva, M.; Paskaleva, I.; Romanelli, F.

    2003-04-01

    An advanced procedure for ground motion modelling, capable of synthesizing the seismic ground motion from basic understanding of fault mechanism and seismic wave propagation, is applied to compute seismic signals at Bucharest (Romania) and Russe, NE Bulgaria, due to the seismic hazard from intermediate-depth Vrancea earthquakes. The theoretically obtained signals are successfully compared with the available observations. For both case studies site response estimates along selected geological cross sections are provided for three recent, strong and intermediate-depth, Vrancea earthquakes: August 30, 1986 and May 30 and 31, 1990. The applied ground motion modelling technique has proved that it is possible to investigate the local effects, taking into account both the seismic source and the propagation path effects. The computation of realistic seismic input, utilising the huge amount of geological, geophysical and geotechnical data, already available, goes well beyond the conventional deterministic approach and gives an economically valid scientific tool for seismic microzonation. (author)

  20. Academia Sinica, TW E-science to Assistant Seismic Observations for Earthquake Research, Monitor and Hazard Reduction Surrounding the South China Sea

    Science.gov (United States)

    Huang, Bor-Shouh; Liu, Chun-Chi; Yen, Eric; Liang, Wen-Tzong; Lin, Simon C.; Huang, Win-Gee; Lee, Shiann-Jong; Chen, Hsin-Yen

    Since the experience of the 2004 giant Sumatra earthquake, seismic and tsunami hazards have been considered important issues in the South China Sea and its surrounding region, and have attracted much interest from seismologists. Currently, more than 25 broadband seismic instruments operated by the Institute of Earth Sciences, Academia Sinica are deployed in northern Vietnam to study the geodynamic evolution of the Red River fracture zone; they were recently redistributed to southern Vietnam to study the geodynamic evolution and deep structures of the South China Sea. Similar stations are planned for deployment in the Philippines in the near future. In the plan, some high-quality stations may become permanent stations with continuous GPS observations added, and the instruments will be maintained and operated by several cooperating institutes, for instance the Institute of Geophysics, Vietnamese Academy of Sciences and Technology in Vietnam and the Philippine Institute of Volcanology and Seismology in the Philippines. Finally, those stations are planned to be upgraded to real-time transmission stations for earthquake monitoring and tsunami warning. However, high-speed data transfer between different agencies is always a critical issue for successful network operation. By taking advantage of both the EGEE and EUAsiaGrid e-Infrastructures, the Academia Sinica Grid Computing Centre coordinates researchers from various Asian countries to construct a platform for high-performance data transfer and large parallel computation. Efforts from this data service and a newly built earthquake data centre for data management may greatly improve seismic network performance. Implementation of Grid infrastructure and e-science in this region may assist the development of earthquake research, monitoring and natural hazard reduction. In the near future, we will continue to seek new cooperation from the countries surrounding the South China Sea to install new seismic stations to construct a complete seismic network of the

  1. Evaluating earthquake hazards in the Los Angeles region; an earth-science perspective

    Science.gov (United States)

    Ziony, Joseph I.

    1985-01-01

    Potentially destructive earthquakes are inevitable in the Los Angeles region of California, but hazards prediction can provide a basis for reducing damage and loss. This volume identifies the principal geologically controlled earthquake hazards of the region (surface faulting, strong shaking, ground failure, and tsunamis), summarizes methods for characterizing their extent and severity, and suggests opportunities for their reduction. Two systems of active faults generate earthquakes in the Los Angeles region: northwest-trending, chiefly horizontal-slip faults, such as the San Andreas, and west-trending, chiefly vertical-slip faults, such as those of the Transverse Ranges. Faults in these two systems have produced more than 40 damaging earthquakes since 1800. Ninety-five faults have slipped in late Quaternary time (approximately the past 750,000 yr) and are judged capable of generating future moderate to large earthquakes and displacing the ground surface. Average rates of late Quaternary slip or separation along these faults provide an index of their relative activity. The San Andreas and San Jacinto faults have slip rates measured in tens of millimeters per year, but most other faults have rates of about 1 mm/yr or less. Intermediate rates of as much as 6 mm/yr characterize a belt of Transverse Ranges faults that extends from near Santa Barbara to near San Bernardino. The dimensions of late Quaternary faults provide a basis for estimating the maximum sizes of likely future earthquakes in the Los Angeles region: moment magnitude (M) 8 for the San Andreas, M 7 for the other northwest-trending elements of that fault system, and M 7.5 for the Transverse Ranges faults. Geologic and seismologic evidence along these faults, however, suggests that, for planning and designing noncritical facilities, appropriate sizes would be M 8 for the San Andreas, M 7 for the San Jacinto, M 6.5 for other northwest-trending faults, and M 6.5 to 7 for the Transverse Ranges faults. The
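
    Magnitude estimates like those listed above are commonly obtained from empirical rupture-dimension regressions. The sketch below uses a relation of the form M = a + b·log10(L) with coefficients approximately those of the Wells and Coppersmith (1994) all-slip-type surface-rupture-length regression; treat the coefficients and the example lengths as illustrative assumptions rather than values taken from this report.

        import numpy as np

        def magnitude_from_rupture_length(length_km, a=5.08, b=1.16):
            # Empirical moment magnitude from surface rupture length L (km): M = a + b*log10(L).
            # Default coefficients approximate the Wells and Coppersmith (1994)
            # all-slip-type regression (an assumption here).
            return a + b * np.log10(length_km)

        # Illustrative rupture lengths only, broadly comparable to the faults discussed above.
        for L in (20, 75, 300):
            print(f"L = {L:3d} km  ->  M ~ {magnitude_from_rupture_length(L):.1f}")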

  2. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    Science.gov (United States)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. With the integration of strong ground-motion sensor networks, an earthquake data center and seismic wave propagation analysis over the gLite e-Science infrastructure, we can gain much better knowledge of the impact and vulnerability associated with potential earthquake hazards. On the other hand, this application also demonstrates the e-Science way to investigate unknown earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic wave propagation analysis implies the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and also to support seismology research by e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among five European and Asian partners and the data challenge of data center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of seismograms at any sensor point for a specific epicenter. To ease access to all the services based on user workflows and to retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflows, services and user communities will be implemented based on typical use cases. In the future, extension of the

  3. GEOS seismograms recorded for aftershocks of the earthquakes of December 7, 1988, near Spitak, Armenia SSR, during the time period 3 January 1989 through 2 February 1989 (UTC)

    Science.gov (United States)

    Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward

    1989-01-01

    The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world.The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible.This report provides seismologic and geologic data collected during the time period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but other regions of the world exposed to high seismic risk

  4. Observations and recommendations regarding landslide hazards related to the January 13, 2001 M-7.6 El Salvador earthquake

    Science.gov (United States)

    Jibson, Randall W.; Crone, Anthony J.

    2001-01-01

    The January 13, 2001 earthquake (M-7.6) off the coast of El Salvador triggered widespread damaging landslides in many parts of El Salvador. In the aftermath of the earthquake, the Salvadoran government requested technical assistance through the U.S. Agency for International Development (USAID); USAID, in turn, requested help from technical experts in landslide hazards at the U.S. Geological Survey. In response to that request, we arrived in El Salvador on January 31, 2001 and worked with USAID personnel and Salvadoran agency counterparts in visiting landslide sites and evaluating present and potential hazards. A preliminary, unofficial report was prepared at the end of our trip (February 9) to provide immediate information and assistance to interested agencies and parties. The current report is an updated and somewhat expanded version of that unofficial report. Because of the brief nature of this report, conclusions and recommendations contained herein should be considered tentative and may be revised in the future.

  5. Topographic changes and their driving factors after 2008 Wenchuan Earthquake

    Science.gov (United States)

    Li, C.; Wang, M.; Xie, J.; Liu, K.

    2017-12-01

    The Wenchuan Ms 8.0 earthquake caused topographic change in the stricken areas through the formation of numerous coseismic landslides. The emergence of new landslides and debris flows and the movement of loose materials under the driving force of heavy rainfall can further shape the local topography. Dynamic topographic changes in mountainous areas stricken by major earthquakes have a strong linkage to the development and occurrence of secondary disasters. However, little attention has been paid to continuously monitoring mountain environment change after such earthquakes. A digital elevation model (DEM) is the main representation of the terrain surface; in our research, we extracted DEMs for 2013 and 2015 of a typical mountainous area severely impacted by the 2008 Wenchuan earthquake from ZY-3 stereo pair images, with validation by field measurement. Combined with elevation datasets from 2002 and 2010, we quantitatively assessed elevation changes in different years and qualitatively analyzed the spatiotemporal variation of the terrain and mass movement across the study area. The results show that the earthquake-stricken area experienced substantial elevation changes caused by seismic forces and subsequent rainfall. Deposits after the earthquake have mainly accumulated in river channels, on mountain ridges and in deep gullies, which increases the risk of other geo-hazards. Heavy rainfall after the earthquake has become the biggest driver of elevation reduction, outweighing the elevation increase during the major earthquake. Our study provides a better understanding of the subsequent hazards and risks faced by residents and communities stricken by major earthquakes.
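
    A minimal sketch of the DEM differencing implied above, assuming the two epochs have already been co-registered and resampled to a common grid; the arrays below are random placeholders, not the ZY-3 derived surfaces.

        import numpy as np

        def elevation_change(dem_later, dem_earlier):
            # Cell-by-cell elevation difference between two co-registered DEMs.
            # Positive values indicate raising/deposition, negative values erosion/lowering.
            return np.asarray(dem_later, dtype=float) - np.asarray(dem_earlier, dtype=float)

        # Hypothetical 100 x 100 grids standing in for the 2013 and 2015 DEMs.
        rng = np.random.default_rng(5)
        dem_2013 = 1500.0 + 200.0 * rng.random((100, 100))
        dem_2015 = dem_2013 + rng.normal(-0.5, 2.0, (100, 100))   # slight net lowering

        dz = elevation_change(dem_2015, dem_2013)
        print("mean elevation change (m):", round(float(dz.mean()), 2))
        print("fraction of cells lowered:", round(float((dz < 0).mean()), 2))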

  6. Assessment of seismic hazard for NPP sites in France: analysis of several aftershocks of the November 8, 1983, Liege earthquake

    International Nuclear Information System (INIS)

    Mohammadioun, B.; Mohammadioun, G.; Bresson, A.

    1984-03-01

    Current French practice for assessing seismic hazard at the sites of nuclear facilities is outlined. The procedure calls for as rich and varied an assortment of actual earthquake recordings as can be procured, including earthquakes in France itself and in nearby countries, recorded by the CEA/IPSN's own staff. Following the November 8, 1983, Liege earthquake, suitably equipped temporary recording stations were set up in the epicentral area in order to record its aftershocks. Ground motion time histories and response spectra were computed for several of these, and a quality factor Q was derived from these data for the most superficial sedimentary layers of the area. The values obtained show reasonable agreement with those found for similar materials in other regions

  7. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f type, problems can be constrained from scaling relations and finite extents. [Figure caption: Record-breaking hazard map of southern California, 2012-08-06. "Warm" colors indicate local acceleration (elevated hazard]
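
    The Omori (temporal) aftershock decay mentioned above is usually written in modified Omori (Omori-Utsu) form, n(t) = K / (t + c)^p. A minimal sketch with assumed parameter values follows; tying the productivity K to the parent earthquake's magnitude, as the abstract describes, would require additional scaling assumptions that are not shown here.

        import numpy as np

        def omori_rate(t_days, K=100.0, c=0.05, p=1.1):
            # Modified Omori (Omori-Utsu) aftershock rate n(t) = K / (t + c)**p,
            # in events per day at time t (days) after the mainshock.
            return K / (np.asarray(t_days, dtype=float) + c) ** p

        def expected_aftershocks(t1, t2, K=100.0, c=0.05, p=1.1):
            # Expected number of aftershocks between t1 and t2 days (closed form, p != 1).
            return K / (1.0 - p) * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p))

        print("rate at 1 day      :", round(float(omori_rate(1.0)), 1), "events/day")
        print("expected, days 1-30:", round(expected_aftershocks(1.0, 30.0)), "events")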

  8. GEOS seismograms recorded for aftershocks of the earthquakes of December 7, 1988, near Spitak, Armenia SSR, during the time period 26 December 1988 14:00 through 29 December 1988 (UTC)

    Science.gov (United States)

    Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward

    1989-01-01

    The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world.The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible.This report provides seismologic and geologic data collected during the time period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but other regions of the world exposed to high seismic risk.

  9. Proceedings of the 11th United States-Japan natural resources panel for earthquake research, Napa Valley, California, November 16–18, 2016

    Science.gov (United States)

    Detweiler, Shane; Pollitz, Fred

    2017-10-18

    The UJNR Panel on Earthquake Research promotes advanced research toward a more fundamental understanding of the earthquake process and hazard estimation. The Eleventh Joint meeting was extremely beneficial in furthering cooperation and deepening understanding of problems common to both Japan and the United States.The meeting included productive exchanges of information on approaches to systematic observation and modeling of earthquake processes. Regarding the earthquake and tsunami of March 2011 off the Pacific coast of Tohoku and the 2016 Kumamoto earthquake sequence, the Panel recognizes that further efforts are necessary to achieve our common goal of reducing earthquake risk through close collaboration and focused discussions at the 12th UJNR meeting.

  10. Assessing Natural Hazard Vulnerability Through Marmara Region Using GIS

    Science.gov (United States)

    Sabuncu, A.; Garagon Dogru, A.; Ozener, H.

    2013-12-01

    Natural hazards are natural phenomena occurring in the Earth system and include geological and meteorological events such as earthquakes, floods, landslides, droughts, fires and tsunamis. Metropolitan cities are vulnerable to natural hazards because of their population densities, industrial facilities and properties. The urban layout of megacities is complex, since industrial facilities are interspersed with residential areas. The Marmara region, located in northwestern Turkey, has suffered from natural hazards (earthquakes, floods, etc.) for years. After the 1999 Kocaeli and Duzce earthquakes and the 2009 Istanbul flash floods, dramatic numbers of casualties and economic losses were reported by the authorities. Geographic information systems (GIS) have substantial capacity to support natural disaster management, as these systems provide more efficient and reliable analysis and evaluation of the data, as well as convenient and better solutions for decision making before, during and after natural hazards. Earth science data and socio-economic data can be integrated into a GIS as different layers. Additionally, satellite data are used to understand the changes before and after natural hazards. GIS is powerful software for combining different types of digital data. A natural hazard database for the Marmara region provides all of these different types of digital data to users. Proper data collection, processing and analysis are critical to evaluate and identify hazards. The natural hazard database allows users to monitor, analyze and query past and recent disasters in the Marmara region. The long-term aim of this study is to develop a geodatabase and identify the natural hazard vulnerabilities of the metropolitan cities.
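
    As a minimal sketch of the kind of GIS layer integration described above (file names, column names, and the 0.4 g threshold are hypothetical placeholders, not data from the study), one could overlay a seismic hazard layer with a socio-economic layer using geopandas:

        import geopandas as gpd

        # Hypothetical layers: hazard zones with a 'pga_g' attribute and districts with census attributes
        hazard = gpd.read_file("marmara_seismic_hazard.gpkg")
        districts = gpd.read_file("marmara_districts.gpkg")

        # Reproject to a common coordinate reference system before any spatial operation
        hazard = hazard.to_crs(districts.crs)

        # Intersect the two layers so each resulting polygon carries both hazard and census attributes
        exposure = gpd.overlay(districts, hazard, how="intersection")

        # List the districts that intersect high-hazard zones (illustrative threshold)
        high = exposure[exposure["pga_g"] >= 0.4]
        print(sorted(high["district_name"].unique()))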

  11. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system

    Science.gov (United States)

    Takarada, S.

    2012-12-01

    The first Workshop of the Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan, from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The accord consists of 10 recommendations, such as enhancing collaboration, sharing resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked with developing a next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high-quality volcanic eruption database, which contains compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which only represent a subset of possible future scenarios. Hence, different distributions from the previous deposits are mainly observed due to the differences in

  12. Time-dependent earthquake hazard evaluation in seismogenic systems using mixed Markov Chains: An application to the Japan area

    Science.gov (United States)

    Herrera, C.; Nava, F. A.; Lomnitz, C.

    2006-08-01

    A previous work introduced a new method for seismic hazard evaluation in a system (a geographic area with distinct, but related, seismogenic regions) based on modeling the transition probabilities of states (patterns of presence or absence of seismicity, with magnitude greater than or equal to a threshold magnitude Mr, in the regions of the system, during a time interval Δt) as a Markov chain. Application of this direct method to the Japan area gave very good results. Given that the most important limitation of the direct method is the relative scarcity of large-magnitude events, we decided to explore the possibility that seismicity with magnitude M ≥ Mmr contains information about the future occurrence of earthquakes with M ≥ MMr > Mmr. This mixed Markov chain method estimates the probabilities of occurrence of a system state for M ≥ MMr on the basis of the observed state for M ≥ Mmr in the previous Δt. Application of the mixed method to the area of Japan gives better hazard estimations than the direct method, in particular for large earthquakes. As part of this study, the problem of performance evaluation of hazard estimation methods is addressed, leading to the use of grading functions.
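
    A minimal sketch of the direct Markov-chain idea, assuming each system state is encoded as a tuple of presence/absence flags (one flag per region, M ≥ Mr within an interval Δt); the toy state sequence is illustrative only:

        from collections import Counter, defaultdict

        # Toy sequence of system states, one per time interval (three regions)
        states = [(0, 0, 1), (0, 1, 1), (0, 0, 0), (0, 1, 1), (0, 0, 1),
                  (0, 1, 1), (1, 0, 0), (0, 0, 1), (0, 1, 1), (0, 0, 0)]

        # Count observed transitions state_t -> state_{t+1}
        transitions = defaultdict(Counter)
        for current, following in zip(states[:-1], states[1:]):
            transitions[current][following] += 1

        # Maximum-likelihood transition probabilities, i.e. the estimated Markov chain
        for current, counts in transitions.items():
            total = sum(counts.values())
            for following, n in counts.items():
                print(f"P({following} | {current}) = {n / total:.2f}")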

  13. Earthquake hazard analysis for the different regions in and around Ağrı

    Energy Technology Data Exchange (ETDEWEB)

    Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    We investigated earthquake hazard parameters for the eastern part of Turkey by determining the a and b parameters in the Gutenberg–Richter magnitude–frequency relationship. For this purpose, the study area was divided into seven different source zones based on their tectonic and seismotectonic regimes. The database used in this work was taken from different sources and catalogues, such as TURKNET, the International Seismological Centre (ISC), Incorporated Research Institutions for Seismology (IRIS) and The Scientific and Technological Research Council of Turkey (TUBITAK), for the instrumental period. We calculated the a value and the b value, the slope of the Gutenberg–Richter frequency–magnitude relationship, using the maximum likelihood (ML) method. We also estimated the mean return periods, the most probable maximum magnitude in a time period of t years, and the probability of occurrence of an earthquake with magnitude ≥ M during a time span of t years. We used the Zmap software to calculate these parameters. The lowest b value was calculated in Region 1, which covers the Cobandede Fault Zone. We obtained the highest a value in Region 2, which covers the Kagizman Fault Zone. This conclusion is strongly supported by the probability value, which shows the largest value (87%) for an earthquake with magnitude greater than or equal to 6.0. The mean return period for such a magnitude is the lowest in this region (49 years). The most probable magnitude in the next 100 years was calculated, and we determined the highest value around the Cobandede Fault Zone. According to these parameters, Region 1, which covers the Cobandede Fault Zone, is the most hazardous area in the eastern part of Turkey.
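
    A minimal sketch of the Gutenberg-Richter parameter estimation and return-period calculation mentioned above, using Aki's maximum-likelihood b-value estimator (the catalog, completeness magnitude, and time span below are illustrative, not the study's data):

        import math

        def gr_parameters(mags, mc, years, dm=0.1):
            """Aki/Utsu maximum-likelihood b-value and an annual a-value."""
            m = [x for x in mags if x >= mc]
            b = math.log10(math.e) / (sum(m) / len(m) - (mc - dm / 2.0))
            # a-value from N(M >= mc) = 10**(a - b*mc), normalised to one year
            a = math.log10(len(m) / years) + b * mc
            return a, b

        def return_period(a, b, m):
            """Mean return period (years) of events with magnitude >= m."""
            return 1.0 / 10 ** (a - b * m)

        # Illustrative catalog above a completeness magnitude of 4.0, spanning 50 years
        mags = [4.1, 4.3, 4.0, 4.8, 5.2, 4.4, 4.1, 6.1, 4.6, 4.9, 4.2, 5.5, 4.0, 4.3, 4.7]
        a, b = gr_parameters(mags, mc=4.0, years=50.0)
        print(f"a = {a:.2f}, b = {b:.2f}, T(M>=6.0) = {return_period(a, b, 6.0):.0f} yr")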

  14. The «Natural Hazard WIKISAURUS»: explanation and understanding of natural hazards to build disaster resilience

    Science.gov (United States)

    Rapisardi, Elena; Di Franco, Sabina; Giardino, Marco

    2013-04-01

    In the Internet and Web 2.0 era, the need for information has increased. Moreover, recent major and minor disasters have highlighted that information is a crucial element in emergency management. Informing the population is now the focal point of any civil protection activity and program. Risk perception and social vulnerability become widely discussed issues "when a disaster occurs": a "day-after" approach that should be replaced by a "day-before" one. Is this a cultural problem? Is it a communication issue? As a matter of fact, academics, experts and institutions are nowadays called on to be more effective in transferring natural hazards knowledge (technical, operational, historical, social) to the public, in order to switch from «protection/passivity» (focused on the disaster event) to «resilience» (focused on vulnerability). However, this change requires abandoning the "Elites Knowledge" approach and supporting "Open Knowledge" and "Open Data" perspectives. Validated scientific information on natural hazards is not yet a common heritage: there are several cases of misleading or inaccurate information published by the media. During recent Italian national emergencies [the Liguria-Toscana flash floods of 2011, the Emilia-Romagna earthquake of 2012], social media registered people not only asking for news on the disaster event, but also talking trivially about scientific content on natural hazards. Considering these facts, in the framework of a PhD program in Earth Science, a joint UNITO-NatRisk and CNR-IIA team conceived the web project "Natural Hazards Wikisaurus" [NHW], combining two previous experiences: "HyperIspro" - a wiki on civil protection set up by Giuseppe Zamberletti, former Italian minister of Civil Protection - and "Earth Thesaurus", developed by CNR-IIA. The team decided to start from the «words», using both the collaborative nature of the wiki concept (open and participatory knowledge) and the explanatory power of a thesaurus. Why? Because a word is not enough, as a term has

  15. Development of direct multi-hazard susceptibility assessment method for post-earthquake reconstruction planning in Nepal

    Science.gov (United States)

    Mavrouli, Olga; Rana, Sohel; van Westen, Cees; Zhang, Jianqiang

    2017-04-01

    After the devastating 2015 Gorkha earthquake in Nepal, reconstruction activities were delayed considerably for many reasons of a political, organizational and technical nature. Due to the widespread occurrence of co-seismic landslides, and the expectation that these may be aggravated or re-activated in future years during the intense monsoon periods, there is a need to evaluate for thousands of sites whether they are suited for reconstruction. In this evaluation, multi-hazards such as rockfall, landslides, debris flows, and flash floods should be taken into account. The application of indirect knowledge-based, data-driven or physically-based approaches is not suitable for several reasons. Physically-based models generally require a large number of parameters for which data are not available. Data-driven, statistical methods depend on historical information, which is less useful after the occurrence of a major event, such as an earthquake. Besides, they would lead to unacceptable levels of generalization, as the analysis is based on rather general causal-factor maps. The same holds for indirect knowledge-driven methods. Instead, location-specific hazard analysis is required, using a simple method that can be applied by many people at the local level. In this research, a direct scientific method was developed with which local-level technical staff can easily and quickly assess post-earthquake multi-hazards following a decision-tree approach, using an app on a smartphone or tablet. The method assumes that a central organization, such as the Department of Soil Conservation and Watershed Management, generates spatial information beforehand that is used in the direct assessment at a certain location. Pre-earthquake, co-seismic and post-seismic landslide inventories are generated through the interpretation of Google Earth multi-temporal images, using anaglyph methods. Spatial data, such as Digital Elevation Models, land cover maps, and geological maps are
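
    A minimal sketch of what such a decision-tree field assessment could look like in code; the attributes, thresholds, and hazard classes are purely illustrative placeholders, not the criteria developed in this research:

        def multi_hazard_class(slope_deg, on_coseismic_landslide, dist_to_drainage_m, rock_outcrop):
            """Toy decision tree returning a qualitative hazard class for a candidate reconstruction site."""
            if on_coseismic_landslide:
                # Sites on or immediately next to mapped co-seismic landslides
                return "high (existing landslide)"
            if rock_outcrop and slope_deg > 45:
                return "high (potential rockfall source area)"
            if slope_deg > 30:
                return "moderate (steep slope)"
            if dist_to_drainage_m < 50:
                return "moderate (debris-flow / flash-flood path)"
            return "low"

        print(multi_hazard_class(slope_deg=35, on_coseismic_landslide=False,
                                 dist_to_drainage_m=200, rock_outcrop=False))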

  16. Seismic hazard analysis with PSHA method in four cities in Java

    International Nuclear Information System (INIS)

    Elistyawati, Y.; Palupi, I. R.; Suharsono

    2016-01-01

    In this study, tectonic earthquake hazard was assessed in terms of peak ground acceleration using the PSHA method, by dividing the area into earthquake source zones. The study applied earthquake data from 1965-2015 whose completeness had been analyzed; the study area was the whole of Java, with emphasis on four large earthquake-prone cities. The results are hazard maps for return periods of 500 years and 2500 years, together with hazard curves for four major cities (Jakarta, Bandung, Yogyakarta, and Banyuwangi). The 500-year PGA hazard map of Java shows peak ground accelerations ranging from 0 g to ≥ 0.5 g, while the 2500-year return period gives values from 0 to ≥ 0.8 g. The PGA hazard curves show that, for Jakarta, the most influential earthquake source is a background fault source such as the Cimandiri fault, and for the city of Bandung the controlling contribution likewise comes from a background fault source. For the city of Yogyakarta, the most influential source in the hazard curve is the background source of the Opak fault, and for Banyuwangi it is the Java and Sumba megathrust sources. (paper)
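
    A minimal sketch of the PSHA logic behind such hazard curves, assuming a single source zone with a truncated Gutenberg-Richter recurrence, one site-to-source distance, and a simple lognormal ground-motion model (all coefficients and rates are illustrative placeholders, not the values used for Java):

        import numpy as np
        from scipy.stats import norm

        # Toy source: annual rates in 0.1-magnitude bins from an illustrative GR relation
        mags = np.arange(5.0, 8.0, 0.1)
        a, b = 4.0, 1.0
        rates = 10 ** (a - b * mags) - 10 ** (a - b * (mags + 0.1))

        # Toy ground-motion model: ln(PGA[g]) = c0 + c1*M - c2*ln(R + 10), with aleatory sigma
        c0, c1, c2, sigma = -4.0, 1.0, 1.3, 0.6
        R = 50.0  # site-to-source distance in km

        def p_exceed(m, x):
            """P(PGA > x | M = m, R) under the lognormal ground-motion model."""
            mean_ln = c0 + c1 * m - c2 * np.log(R + 10.0)
            return 1.0 - norm.cdf((np.log(x) - mean_ln) / sigma)

        pga_levels = np.array([0.05, 0.1, 0.2, 0.3, 0.5])
        annual_rate = np.array([np.sum(rates * p_exceed(mags, x)) for x in pga_levels])
        prob_50yr = 1.0 - np.exp(-annual_rate * 50.0)  # Poisson assumption

        for x, p in zip(pga_levels, prob_50yr):
            print(f"PGA > {x:.2f} g: {100 * p:.1f}% probability of exceedance in 50 years")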

  17. Seismic hazard assessment of Iran

    Directory of Open Access Journals (Sweden)

    M. Ghafory-Ashtiany

    1999-06-01

    Full Text Available The development of the new seismic hazard map of Iran is based on probabilistic seismic hazard computation using the historical earthquake data, geology, tectonics, fault activity and seismic source models in Iran. These maps have been prepared to indicate the earthquake hazard of Iran in the form of iso-acceleration contour lines, and seismic hazard zoning, by using current probabilistic procedures. They display the probabilistic estimates of Peak Ground Acceleration (PGA) for the return periods of 75 and 475 years. The maps have been divided into intervals of 0.25 degrees in both latitudinal and longitudinal directions to calculate the peak ground acceleration values at each grid point and draw the seismic hazard curves. The results presented in this study will provide the basis for the preparation of seismic risk maps, the estimation of earthquake insurance premiums, and the preliminary site evaluation of critical facilities.

  18. Harmonizing seismic hazard assessments for nuclear power plants

    International Nuclear Information System (INIS)

    Mallard, D.J.

    1993-01-01

    Even a cursory comparison between maps of global seismicity and NPP earthquake design levels reveals many inconsistencies. While, in part, this situation reflects the evolution in understanding of seismic hazards, mismatches can also be due to ongoing differences in the way the hazards are assessed and in local regulatory requirements. So far, formal international consensus has only been able to encompass broad principles, such as those recently recommended by the International Atomic Energy Agency, and even these can raise many technical issues, particularly relating to zones of diffuse seismicity. In the future, greater harmonisation in hazard assessments and, to some extent, in earthquake design levels could emerge through the more widespread use of probabilistic methods. International collaborative ventures and joint projects will be important for resolving anomalies in the existing databases and their interpretations, and for acquiring new data, but to achieve their ideal objectives, they will need to proceed in clearly defined stages. (author)

  19. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  20. Analysis of earthquake parameters to generate hazard maps by integrating AHP and GIS for Küçükçekmece region

    Directory of Open Access Journals (Sweden)

    T. Erden

    2012-02-01

    Full Text Available The definition of an earthquake includes parameters specific to the region of interest. Each of these parameters has a different weight in the resulting earthquake ground motion and its effects. This study examines the weights of common parameters that influence the effects of earthquakes. The Analytic Hierarchy Process (AHP) is used for factor weighting of each parameter, and Geographic Information Systems (GIS) are used for simulating the results of the AHP in a spatial environment. The study aims to generate a hierarchical structure of the model for the simulation of an earthquake hazard map (EHM). The parameters of the EHM, which are selected by the criterion of non-correlated factors, are: topography, distance to epicenter, soil classification, liquefaction, and fault/focal mechanism. As a result of the study, the weights of the parameters that affect the earthquake ground motion in the study area are determined and compared with a selected attenuation relation map.
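
    A minimal sketch of the AHP factor-weighting step, assuming a hypothetical Saaty-scale pairwise comparison matrix for the five EHM parameters (the judgments in the matrix are placeholders, not the comparisons elicited in the study):

        import numpy as np

        criteria = ["topography", "distance to epicenter", "soil classification",
                    "liquefaction", "fault/focal mechanism"]

        # Hypothetical pairwise comparison matrix: A[i, j] = relative importance of criterion i over j
        A = np.array([
            [1.0, 1/3, 1/2, 1/4, 1/5],
            [3.0, 1.0, 2.0, 1/2, 1/3],
            [2.0, 1/2, 1.0, 1/3, 1/4],
            [4.0, 2.0, 3.0, 1.0, 1/2],
            [5.0, 3.0, 4.0, 2.0, 1.0],
        ])

        # Priority weights from the principal eigenvector
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Consistency ratio (Saaty's random index for n = 5 is 1.12)
        n = A.shape[0]
        consistency_index = (eigvals[k].real - n) / (n - 1)
        consistency_ratio = consistency_index / 1.12

        for name, weight in zip(criteria, w):
            print(f"{name:<25s} {weight:.3f}")
        print(f"consistency ratio = {consistency_ratio:.3f} (commonly accepted if < 0.10)")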

  1. Resilience to Interacting multi-natural hazards

    Science.gov (United States)

    Zhuo, Lu; Han, Dawei

    2016-04-01

    Conventional hazard assessments tend to focus on individual hazards in isolation. However, many parts of the world are affected by multiple natural hazards with the potential for interacting relationships. The understanding of such interactions, their impacts and the related uncertainties is an important and topical area of research. Interacting multi-hazards may appear in different forms, including 1) CASCADING HAZARDS (a primary hazard triggering one or more secondary hazards, such as an earthquake triggering landslides which may block river channels, with dammed lakes and ensuing floods), 2) CONCURRING HAZARDS (two or more primary hazards coinciding to trigger or exacerbate secondary hazards, such as an earthquake and a rainfall event simultaneously creating landslides), and 3) ALTERING HAZARDS (a primary hazard increasing the probability of a secondary hazard occurring, such as major earthquakes disturbing soil/rock materials by violent ground shaking, which alters the regional patterns of landslides and debris flows in subsequent years). All three types of interacting multi-hazards may occur in natural-hazard-prone regions, so it is important that research on hazard resilience covers all of them. In the past decades, great progress has been made in tackling disaster risk around the world. However, there are still many challenging issues to be solved, and the disasters of recent years have clearly demonstrated the inadequate resilience of our highly interconnected and interdependent systems. We have identified the following weaknesses and knowledge gaps in current disaster risk management: 1) although our understanding of individual hazards has been greatly improved, there is a lack of sound knowledge about the mechanisms and processes of interacting multi-hazards. Therefore, the resultant multi-hazard risk is often significantly underestimated, with severe consequences. Also poorly understood are the spatial and

  2. Tsunami hazard assessments with consideration of uncertain earthquakes characteristics

    Science.gov (United States)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, it must adopt an uncertainty propagation method that determines tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and the location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen-Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case. We study tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for
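
    A minimal sketch of a Karhunen-Loeve expansion for generating correlated slip samples, here reduced to a 1-D along-strike profile with an assumed exponential correlation model and a simple log-normal translation (grid, correlation length, mean slip, and coefficient of variation are illustrative; the study itself handles 2-D, non-rectangular rupture areas):

        import numpy as np

        rng = np.random.default_rng(1)

        # 1-D along-strike grid and an assumed exponential correlation structure
        x = np.linspace(0.0, 200.0, 101)   # km
        corr_len = 40.0                    # km, illustrative
        C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

        # K-L expansion = eigen-decomposition of the correlation matrix
        eigvals, eigvecs = np.linalg.eigh(C)
        order = np.argsort(eigvals)[::-1]
        eigvals, eigvecs = eigvals[order], eigvecs[:, order]

        # Keep the leading modes explaining ~95% of the variance
        n_modes = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95)) + 1

        def sample_slip(mean_slip=5.0, cv=0.4):
            """One slip realization: Gaussian K-L field translated to a positive log-normal slip."""
            xi = rng.standard_normal(n_modes)
            gaussian_field = eigvecs[:, :n_modes] @ (np.sqrt(eigvals[:n_modes]) * xi)
            sigma_ln = np.sqrt(np.log(1.0 + cv ** 2))
            return mean_slip * np.exp(sigma_ln * gaussian_field - 0.5 * sigma_ln ** 2)

        slip = sample_slip()
        print(f"{n_modes} K-L modes kept, sample mean slip = {slip.mean():.2f} m")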

  3. Seismic hazard map of the western hemisphere

    Science.gov (United States)

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($ 25 billion), and 1995 Kobe, Japan (> $ 100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the

  4. Seismic hazard map of the western hemisphere

    Directory of Open Access Journals (Sweden)

    J. G. Tanner

    1999-06-01

    Full Text Available Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($ 6 billion), 1994 Northridge, CA ($ 25 billion), and 1995 Kobe, Japan (> $ 100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions

  5. CE-PA: A user's manual for determination of controlling earthquakes and development of seismic hazard information data base for the central and eastern United States

    International Nuclear Information System (INIS)

    Short, C.

    1995-05-01

    The CE-PA (Controlling Earthquake(s) through Probabilistic Analysis) software package, developed at Lawrence Livermore National Laboratory (LLNL), is a research program used as part of a study performed for the US Office of Nuclear Regulatory Research, Division of Engineering, project on Geosciences Issues in the revision of geological siting criteria. The objectives of this study were to explore how to use results from probabilistic seismic hazard characterization (PSHC) to determine hazard-consistent scenario earthquakes and to develop design ground motion. The purpose of this document is to describe the CE-PA software to users. The software includes two operating-system and process controllers plus several Fortran routines and input decks. Section I of this manual gives an overview of the methodology used to estimate controlling earthquakes. A descriptive overview of the procedures and the organization of the program modules used in CE-PA is provided in Section II. Section III contains four example executions with comments and a graphical display of each execution path, plus an overview of the directory/file structure. Section IV provides some general observations regarding the model

  6. Introduction to Plate Boundaries and Natural Hazards

    NARCIS (Netherlands)

    Duarte, João C.; Schellart, Wouter P.

    2016-01-01

    A great variety of natural hazards occur on Earth, including earthquakes, volcanic eruptions, tsunamis, landslides, floods, fires, tornadoes, hurricanes, and avalanches. The most destructive of these hazards, earthquakes, tsunamis, and volcanic eruptions, are mostly associated with tectonic plate

  7. Review Article: A comparison of flood and earthquake vulnerability assessment indicators

    Science.gov (United States)

    de Ruiter, Marleen C.; Ward, Philip J.; Daniell, James E.; Aerts, Jeroen C. J. H.

    2017-07-01

    In a cross-disciplinary study, we carried out an extensive literature review to increase understanding of vulnerability indicators used in the disciplines of earthquake- and flood vulnerability assessments. We provide insights into potential improvements in both fields by identifying and comparing quantitative vulnerability indicators grouped into physical and social categories. Next, a selection of index- and curve-based vulnerability models that use these indicators are described, comparing several characteristics such as temporal and spatial aspects. Earthquake vulnerability methods traditionally have a strong focus on object-based physical attributes used in vulnerability curve-based models, while flood vulnerability studies focus more on indicators applied to aggregated land-use classes in curve-based models. In assessing the differences and similarities between indicators used in earthquake and flood vulnerability models, we only include models that separately assess either of the two hazard types. Flood vulnerability studies could be improved using approaches from earthquake studies, such as developing object-based physical vulnerability curve assessments and incorporating time-of-the-day-based building occupation patterns. Likewise, earthquake assessments could learn from flood studies by refining their selection of social vulnerability indicators. Based on the lessons obtained in this study, we recommend future studies for exploring risk assessment methodologies across different hazard types.

  8. Tsunami simulations of mega-thrust earthquakes in the Nankai–Tonankai Trough (Japan) based on stochastic rupture scenarios

    KAUST Repository

    Goda, Katsuichiro

    2017-02-23

    In this study, earthquake rupture models for future mega-thrust earthquakes in the Nankai–Tonankai subduction zone are developed by incorporating the main characteristics of inverted source models of the 2011 Tohoku earthquake. These scenario ruptures also account for key features of the national tsunami source model for the Nankai–Tonankai earthquake by the Central Disaster Management Council of the Japanese Government. The source models capture a wide range of realistic slip distributions and kinematic rupture processes, reflecting the current best understanding of what may happen due to a future mega-earthquake in the Nankai–Tonankai Trough, and therefore are useful for conducting probabilistic tsunami hazard and risk analysis. A large suite of scenario rupture models is then used to investigate the variability of tsunami effects in coastal areas, such as offshore tsunami wave heights and onshore inundation depths, due to realistic variations in source characteristics. Such investigations are particularly valuable for tsunami hazard mapping and evacuation planning in municipalities along the Nankai–Tonankai coast.

  9. Regulatory Activities to the Natural Hazard

    International Nuclear Information System (INIS)

    Choi, Kangryong; Jung, Raeyoung

    2008-01-01

    The safety of Nuclear Power Plants (NPPs) against natural hazards has been investigated, focusing on earthquakes and tsunamis. Since the mass media and the general public take a strong interest in nuclear safety whenever natural hazards occur, earthquakes and tsunamis are not only technical safety concerns, but also psychological issues in terms of public acceptance of nuclear energy. The Korean peninsula has been considered a safe zone against natural hazards compared to neighboring countries, but historical documents describing events of severe damage due to strong earthquakes compel us to pay careful attention to assuring safety against natural phenomena. The potential and characteristics of earthquakes and tsunamis have been examined, and the status of seismic and tsunami safety of the NPPs in Korea is described. The follow-up actions after the disastrous, huge earthquakes and tsunamis that occurred in neighboring countries are summarized as well. The assessment results show that the NPPs in Korea are well designed, constructed and maintained with a certain amount of safety margin against natural hazards, and the utility and the regulatory body are continuously making efforts to enhance safety in consideration of lessons learned from major events in other countries

  10. Bike Helmets and Black Riders: Experiential Approaches to Helping Students Understand Natural Hazard Assessment and Mitigation Issues

    Science.gov (United States)

    Stein, S. A.; Kley, J.; Hindle, D.; Friedrich, A. M.

    2014-12-01

    Defending society against natural hazards is a high-stakes game of chance against nature, involving tough decisions. How should a developing nation allocate its budget between building schools for towns without them or making existing schools earthquake-resistant? Does it make more sense to build levees to protect against floods, or to prevent development in the areas at risk? Would more lives be saved by making hospitals earthquake-resistant, or by using the funds for patient care? These topics are challenging because they are far from normal experience, in that they involve rare events and large sums. To help students in natural hazard classes conceptualize them, we pose tough and thought-provoking questions about the complex issues involved and explore them together via lectures, videos, field trips, and in-class and homework questions. We discuss analogous examples from the students' experiences, drawing on a new book, "Playing Against Nature: Integrating Science and Economics to Mitigate Natural Hazards in an Uncertain World". Asking whether they wear bicycle helmets, and why or why not, shows the cultural perception of risk. Individual students' responses vary, and the overall results vary dramatically between the US, UK, and Germany. Challenges in hazard assessment in an uncertain world are illustrated by asking German students whether they buy a ticket on public transportation - accepting a known cost - or "ride black" - not paying but risking a heavy fine if caught. We explore the challenge of balancing mitigation costs and benefits via the question "If you were a student in Los Angeles, how much more would you pay in rent each month to live in an earthquake-safe building?" Students learn that interdisciplinary thinking is needed, and that due to both uncertainties and sociocultural factors, no unique or right strategies exist for a particular community, much less for all communities. However, we can seek robust policies that give sensible results given

  11. Hazus® estimated annualized earthquake losses for the United States

    Science.gov (United States)

    Jaiswal, Kishor; Bausch, Doug; Rozelle, Jesse; Holub, John; McGowan, Sean

    2017-01-01

    Large earthquakes can cause social and economic disruption that can be unprecedented to any given community, and the full recovery from these impacts may or may not always be achievable. In the United States (U.S.), the 1994 M6.7 Northridge earthquake in California remains the third costliest disaster in U.S. history; and it was one of the most expensive disasters for the federal government. Internationally, earthquakes in the last decade alone have claimed tens of thousands of lives and caused hundreds of billions of dollars of economic impact throughout the globe (~90 billion U.S. dollars (USD) from 2008 M7.9 Wenchuan China, ~20 billion USD from 2010 M8.8 Maule earthquake in Chile, ~220 billion USD from 2011 M9.0 Tohoku Japan earthquake, ~25 billion USD from 2011 M6.3 Christchurch New Zealand, and ~22 billion USD from 2016 M7.0 Kumamoto Japan). Recent earthquakes show a pattern of steadily increasing damages and losses that are primarily due to three key factors: (1) significant growth in earthquake-prone urban areas, (2) vulnerability of the older building stock, including poorly engineered non-ductile concrete buildings, and (3) an increased interdependency in terms of supply and demand for the businesses that operate among different parts of the world. In the United States, earthquake risk continues to grow with increased exposure of population and development even though the earthquake hazard has remained relatively stable except for the regions of induced seismic activity. Understanding the seismic hazard requires studying earthquake characteristics and locales in which they occur, while understanding the risk requires an assessment of the potential damage from earthquake shaking to the built environment and to the welfare of people—especially in high-risk areas. Estimating the varying degree of earthquake risk throughout the United States is critical for informed decision-making on mitigation policies, priorities, strategies, and funding levels in the

  12. Success in transmitting hazard science

    Science.gov (United States)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in

  13. The Impact Hazard in the Context of Other Natural Hazards and Predictive Science

    Science.gov (United States)

    Chapman, C. R.

    1998-09-01

    The hazard due to impact of asteroids and comets has been recognized as analogous, in some ways, to other infrequent but consequential natural hazards (e.g. floods and earthquakes). Yet, until recently, astronomers and space agencies have felt no need to do what their colleagues and analogous agencies must do in order to assess, quantify, and communicate predictions to those with a practical interest in them (e.g. public officials who must assess the threats, prepare for mitigation, etc.). Recent heightened public interest in the impact hazard, combined with increasing numbers of "near misses" (certain to increase as Spaceguard is implemented), requires that astronomers accept the responsibility to place their predictions and assessments in terms that may be appropriately considered. I will report on preliminary results of a multi-year GSA/NCAR study of "Prediction in the Earth Sciences: Use and Misuse in Policy Making", in which I have represented the impact hazard while others have treated earthquakes, floods, weather, global climate change, nuclear waste disposal, acid rain, etc. The impact hazard presents an end-member example of a natural hazard, helping those dealing with more prosaic issues to learn from an extreme. On the other hand, I bring to the astronomical community some lessons long adopted in other cases: the need to understand the policy purposes of impact predictions, the need to assess potential societal impacts, the requirement to very carefully assess prediction uncertainties, considerations of potential public uses of the predictions, awareness of ethical considerations (e.g. conflicts of interest) that affect predictions and acceptance of predictions, awareness of appropriate means for publicly communicating predictions, and considerations of the international context (especially for a hazard that knows no national boundaries).

  14. Cascading elastic perturbation in Japan due to the 2012 M w 8.6 Indian Ocean earthquake.

    Science.gov (United States)

    Delorey, Andrew A; Chao, Kevin; Obara, Kazushige; Johnson, Paul A

    2015-10-01

    Since the discovery of extensive earthquake triggering occurring in response to the 1992 M w (moment magnitude) 7.3 Landers earthquake, it is now well established that seismic waves from earthquakes can trigger other earthquakes, tremor, slow slip, and pore pressure changes. Our contention is that earthquake triggering is one manifestation of a more widespread elastic disturbance that reveals information about Earth's stress state. Earth's stress state is central to our understanding of both natural and anthropogenic-induced crustal processes. We show that seismic waves from distant earthquakes may perturb stresses and frictional properties on faults and elastic moduli of the crust in cascading fashion. Transient dynamic stresses place crustal material into a metastable state during which the material recovers through a process termed slow dynamics. This observation of widespread, dynamically induced elastic perturbation, including systematic migration of offshore seismicity, strain transients, and velocity transients, presents a new characterization of Earth's elastic system that will advance our understanding of plate tectonics, seismicity, and seismic hazards.

  15. Engineering Applications Using Probabilistic Aftershock Hazard Analyses: Aftershock Hazard Map and Load Combination of Aftershocks and Tsunamis

    Directory of Open Access Journals (Sweden)

    Byunghyun Choi

    2017-12-01

    Full Text Available After the 2011 Tohoku earthquake, we observed that aftershocks tended to occur over a wide region following such a large earthquake. These aftershocks resulted in secondary damage or delayed rescue and recovery activities. In addition, it has been reported that there are regions where the intensity of shaking owing to the aftershocks was much stronger than that associated with the main shock. Therefore, it is necessary to consider the seismic risk associated with aftershocks. We used aftershock data obtained from the Tohoku earthquake and various other large historical earthquakes. We investigated the spatial and temporal distribution of the aftershocks using the Gutenberg–Richter law and the modified Omori law. On this basis, we previously proposed a probabilistic aftershock occurrence model that is expected to be useful for developing plans for recovery activities after future large earthquakes. In this study, the probabilistic aftershock hazard analysis is used to create aftershock hazard maps. We propose a hazard map focusing on the probability of aftershocks on the scale of the main shock, for use with a recovery activity plan. Following the lessons learned from the 2011 Tohoku earthquake, we focus on the simultaneous occurrence of tsunamis and aftershocks just after a great subduction earthquake. The probabilistic aftershock hazard analysis is used to derive load combination equations for load and resistance factor design. This design is intended to simultaneously consider tsunamis and aftershocks for tsunami-resistant designs of tsunami evacuation buildings.
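
    A minimal sketch of how the Gutenberg-Richter and modified Omori laws combine into an aftershock occurrence probability (a Reasenberg-Jones-type formulation with illustrative generic parameters, not the model calibrated in this study):

        import math
        from scipy.integrate import quad

        # Illustrative generic parameters (productivity a, GR slope b, Omori c and p)
        a, b = -1.67, 0.9
        c, p = 0.05, 1.1   # c in days

        def aftershock_rate(t_days, mag_main, mag_min):
            """Rate (events/day) of aftershocks with M >= mag_min at time t after the main shock."""
            return 10 ** (a + b * (mag_main - mag_min)) * (t_days + c) ** (-p)

        def prob_at_least_one(t1, t2, mag_main, mag_min):
            """Poisson probability of at least one aftershock with M >= mag_min in [t1, t2] days."""
            expected, _ = quad(aftershock_rate, t1, t2, args=(mag_main, mag_min))
            return 1.0 - math.exp(-expected)

        # Chance of an M >= 7 aftershock within one week of an M 8 main shock (illustrative)
        print(f"{prob_at_least_one(0.0, 7.0, mag_main=8.0, mag_min=7.0):.2f}")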

  16. Seismic Hazard Analysis based on Earthquake Vulnerability and Peak Ground Acceleration using Microseismic Method at Universitas Negeri Semarang

    Science.gov (United States)

    Sulistiawan, H.; Supriyadi; Yulianti, I.

    2017-02-01

    Microseismic noise is a harmonic vibration of the ground that occurs continuously at low frequency. Its characteristics represent the characteristics of the soil layer through the value of its natural frequency. This paper presents an analysis of seismic hazard at Universitas Negeri Semarang using the microseismic method. Data were acquired at 20 points, with a distance of 300 m between points, using a three-component seismometer. The data were processed using the Horizontal to Vertical Spectral Ratio (HVSR) method to obtain the natural frequency and amplification values. The natural frequency and amplification values were then used to determine the earthquake vulnerability index and peak ground acceleration (PGA). The results show that the earthquake vulnerability index ranges from 0.2 to 7.5, while the average peak ground acceleration (PGA) is in the range 10-24 gal. This average peak ground acceleration corresponds to an earthquake intensity of IV on the MMI scale.
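
    A minimal sketch of the HVSR processing step described above, assuming smoothed horizontal and vertical amplitude spectra are already available as arrays (the synthetic spectra below are placeholders with a resonance near 2 Hz):

        import numpy as np

        def hvsr_site_parameters(freq, h_spec, v_spec):
            """Peak frequency f0, amplification A0, and the vulnerability index Kg = A0**2 / f0."""
            ratio = h_spec / v_spec          # HVSR curve
            k = np.argmax(ratio)
            f0, A0 = freq[k], ratio[k]
            return f0, A0, A0 ** 2 / f0

        # Synthetic placeholder spectra
        freq = np.linspace(0.5, 20.0, 400)
        v_spec = np.ones_like(freq)
        h_spec = 1.0 + 3.0 * np.exp(-((freq - 2.0) / 0.5) ** 2)

        f0, A0, kg = hvsr_site_parameters(freq, h_spec, v_spec)
        print(f"f0 = {f0:.2f} Hz, A0 = {A0:.2f}, Kg = {kg:.2f}")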

  17. Earthquake induced landslide hazard: a multidisciplinary field observatory in the Marmara SUPERSITE

    Science.gov (United States)

    Bigarré, Pascal

    2014-05-01

    Earthquake-triggered landslides have an increasingly disastrous impact in seismic regions due to fast-growing urbanization and infrastructure. Considering only disasters from the last fifteen years, among them the 1999 Chi-Chi earthquake, the 2008 Wenchuan earthquake, and the 2011 Tohoku earthquake, these events generated tens of thousands of coseismic landslides. They resulted in a staggering death toll and considerable damage, affecting the regional landscape including its main hydrological features. Despite a strong impetus in research during past decades, knowledge of these geohazards is still fragmentary, while databases of high-quality observational data are lacking. These phenomena call for further collaborative research aiming eventually to enhance preparedness and crisis management. As one of the three SUPERSITE-concept FP7 projects dealing with long-term, high-level monitoring of major natural hazards at the European level, the MARSITE project gathers research groups in a comprehensive monitoring activity developed in the Sea of Marmara region, one of the most densely populated parts of Europe, rated at a high seismic risk level since the devastating 1999 Izmit and Duzce earthquakes. Besides the seismic threat, landslides in Turkey, and in this region in particular, constitute an important source of loss. The 1999 earthquake caused extensive landslides, while tsunami effects were observed during the post-event surveys in several places along the coasts of Izmit Bay. The 6th Work Package of the MARSITE project gathers 9 research groups to study earthquake-induced landslides, focusing on two sub-regional areas of high interest. First, the Cekmece-Avcilar peninsula, located west of Istanbul, is a highly urbanized, landslide-prone area that shows high susceptibility to rainfall while also being affected by very significant seismic site effects. Second, the off-shore entrance of the Izmit Gulf, close to the termination of the surface rupture of the 1999 earthquake

  18. The 2014 United States National Seismic Hazard Model

    Science.gov (United States)

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Mueller, Charles; Haller, Kathleen; Frankel, Arthur; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen; Boyd, Oliver; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nicolas; Wheeler, Russell; Williams, Robert; Olsen, Anna H.

    2015-01-01

    New seismic hazard maps have been developed for the conterminous United States using the latest data, models, and methods available for assessing earthquake hazard. The hazard models incorporate new information on earthquake rupture behavior observed in recent earthquakes; fault studies that use both geologic and geodetic strain rate data; earthquake catalogs through 2012 that include new assessments of locations and magnitudes; earthquake adaptive smoothing models that more fully account for the spatial clustering of earthquakes; and 22 ground motion models, some of which consider more than double the shaking data applied previously. Alternative input models account for larger earthquakes, more complicated ruptures, and more varied ground shaking estimates than assumed in earlier models. The ground motions, for levels applied in building codes, differ from the previous version by less than ±10% over 60% of the country, but can differ by ±50% in localized areas. The models are incorporated in insurance rates, risk assessments, and as input into the U.S. building code provisions for earthquake ground shaking.

  19. Estimated airborne release of plutonium from the 102 Building at the General Electric Vallecitos Nuclear Center, Vallecitos, California, as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.; Hays, I.D.

    1980-12-01

    This report estimates the potential airborne releases of plutonium as a consequence of various severities of earthquake and wind hazard postulated for the 102 Building at the General Electric Vallecitos Nuclear Center in California. The releases are based on damage scenarios developed by other specialists. The hazard severities presented range up to a nominal velocity of 230 mph for wind hazard and are in excess of 0.8 g linear acceleration for earthquakes. The consequences of thrust faulting are considered. The approaches and factors used to estimate the releases are discussed. Release estimates range from 0.003 to 3 g Pu

  20. Understanding Great Earthquakes in Japan's Kanto Region

    Science.gov (United States)

    Kobayashi, Reiji; Curewitz, Daniel

    2008-10-01

    Third International Workshop on the Kanto Asperity Project; Chiba, Japan, 16-19 February 2008; The 1703 (Genroku) and 1923 (Taisho) earthquakes in Japan's Kanto region (M 8.2 and M 7.9, respectively) caused severe damage in the Tokyo metropolitan area. These great earthquakes occurred along the Sagami Trough, where the Philippine Sea slab is subducting beneath Japan. Historical records, paleoseismological research, and geophysical/geodetic monitoring in the region indicate that such great earthquakes will repeat in the future.

  1. Report on the Aseismic Slip, Tremor, and Earthquakes Workshop

    Science.gov (United States)

    Gomberg, Joan; Roeloffs, Evelyn; Trehu, Anne; Dragert, Herb; Meertens, Charles

    2008-01-01

    This report summarizes the discussions and information presented during the workshop on Aseismic Slip, Tremor, and Earthquakes. Workshop goals included improving coordination among those involved in conducting research related to these phenomena, assessing the implications for earthquake hazard assessment, and identifying ways to capitalize on the education and outreach opportunities presented by these phenomena. Research activities of focus included making, disseminating, and analyzing relevant measurements; the relationships among tremor, aseismic or 'slow-slip', and earthquakes; and discovering the underlying causative physical processes. More than 52 participants contributed to the workshop, held February 25-28, 2008 in Sidney, British Columbia. The workshop was sponsored by the U.S. Geological Survey, the National Science Foundation's EarthScope Program and UNAVCO Consortium, and the Geological Survey of Canada. This report has five parts. In the first part, we integrate the information exchanged at the workshop as it relates to advancing our understanding of earthquake generation and hazard. In the second part, we summarize the ideas and concerns discussed in workshop working groups on Opportunities for Education and Outreach, Data and Instrumentation, User and Public Needs, and Research Coordination. The third part presents summaries of the oral presentations. The oral presentations are grouped, as they were at the workshop, in the categories of phenomenology, underlying physical processes, and implications for earthquake hazards. The fourth part contains the meeting program and the fifth part lists the workshop participants. References noted in parentheses refer to the authors of presentations made at the workshop, and published references are noted in square brackets and listed in the Reference section. Appendix A contains abstracts of all participant presentations and posters, which also have been posted online, along with presentations and author contact

  2. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    Energy Technology Data Exchange (ETDEWEB)

    Türker, Tuğba, E-mail: tturker@ktu.edu.tr [Karadeniz Technical University, Department of Geophysics, Trabzon/Turkey (Turkey); Bayrak, Yusuf, E-mail: ybayrak@agri.edu.tr [Ağrı İbrahim Çeçen University, Ağrı/Turkey (Turkey)

    2016-04-18

    The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and is located in one of the regions of highest seismic activity. The NAFZ has experienced very large earthquakes from the past to the present. In this study, the important parameters of the Gutenberg-Richter relationship (a and b values) were estimated and, taking these parameters into account, earthquakes between 1900 and 2015 were examined for 10 different seismic source regions in the NAFZ. The occurrence probabilities and return periods of earthquakes in the fault zone in the coming years were then estimated, and the earthquake hazard of the NAFZ was assessed with the Poisson method. Region 2 experienced its largest earthquakes only in the historical period; no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, M_S=7.3 and 1897, M_S=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 different seismic source regions, the cumulative number-magnitude relationships were determined and the a and b parameters estimated with the Gutenberg-Richter equation log N = a - bM. A homogeneous earthquake catalog with M_S ≥ 4.0 was used for the time period between 1900 and 2015. The catalog database used in the study was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI). The earthquake data from 1900 to 1974 were obtained from KOERI and ISC, and from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 different seismic source regions. Among the 10 seismic source regions, the highest earthquake occurrence probability in the coming years is estimated for the Tokat-Erzincan region (Region 9), at 99%

  3. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    International Nuclear Information System (INIS)

    Türker, Tuğba; Bayrak, Yusuf

    2016-01-01

    The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and lies within one of the most seismically active regions. The North Anatolian Fault Zone (NAFZ) has experienced very large earthquakes from the past to the present. In this study, the key parameters of the Gutenberg-Richter relationship (the a and b values) were estimated and, taking these parameters into account, earthquakes between 1900 and 2015 were examined for 10 different seismic source regions in the NAFZ. The occurrence probabilities and return periods of future earthquakes in the fault zone were then estimated, and the earthquake hazard of the NAFZ was assessed with the Poisson method. Region 2 experienced its largest earthquakes only in the historical period; no large earthquake has been observed there during the instrumental period. Two historical earthquakes (1766, M_S=7.3 and 1897, M_S=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 seismic source regions, the cumulative number-magnitude relationships were determined and the a and b parameters estimated from the Gutenberg-Richter equation log N = a - bM. A homogeneous earthquake catalog with M_S magnitudes equal to or larger than 4.0 is used for the period 1900-2015. The catalog was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI); the data for 1900-1974 were obtained from KOERI and the ISC, and the data for 1974-2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 seismic source regions. Among these regions, the highest estimated occurrence probability in the coming years is for the Tokat-Erzincan region (Region 9), at 99%, with an earthquake

  4. Comparison of Structurally Controlled Landslide Hazard Simulation to the Co-seismic Landslides Caused by the M 7.2 2013 Bohol Earthquake.

    Science.gov (United States)

    Galang, J. A. M. B.; Eco, R. C.; Lagmay, A. M. A.

    2014-12-01

    The M_w 7.2 October 15, 2013 Bohol earthquake is one of the more destructive earthquakes to hit the Philippines in the 21st century. The epicenter was located in Sagbayan municipality, central Bohol, and the event was generated by a previously unmapped reverse fault called the "Inabanga Fault". The earthquake resulted in 209 fatalities and over 57 million USD worth of damage. It also generated co-seismic landslides, most of which were related to fault structures. Unlike rainfall-induced landslides, co-seismic landslides are triggered without warning. Preparation for this type of landslide relies heavily on the identification of fracture-related slope instability. To mitigate the impacts of co-seismic landslide hazards, morpho-structural orientations of discontinuity sets were mapped using remote sensing techniques with the aid of a Digital Terrain Model (DTM) obtained in 2012. The DTM used is an IFSAR-derived image with a 5-meter pixel resolution and approximately 0.5-meter vertical accuracy. Coltop 3D software was then used to identify similar structures, including measurement of their dips and dip directions. The chosen discontinuity sets were then keyed into Matterocking software to identify potential rock slide zones due to planar or wedge discontinuities. After identifying the structurally controlled unstable slopes, the rock mass propagation extent of the possible rock slides was simulated using Conefall. Separately, a manually derived landslide inventory was compiled using post-earthquake satellite images and LIDAR. The simulation results were compared to this inventory, which identified at least 873 landslides. Of the 873 landslides identified through the inventory, 786, or 90%, intersect the simulated structurally controlled landslide hazard areas of Bohol. The results show the potential of this method to identify co-seismic landslide hazard areas for disaster mitigation. Along with computer methods to simulate shallow landslides, and debris flow
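
    The structurally controlled screening described above relies on Coltop 3D and Matterocking; as a rough sketch of the underlying idea (not those tools' implementations), the standard kinematic test for planar sliding checks that a discontinuity dips out of the slope face, more steeply than the friction angle but less steeply than the slope, and within a lateral tolerance of the slope orientation. All names and numeric values below are hypothetical.

      import numpy as np

      def planar_slide_possible(slope_dip, slope_dip_dir, joint_dip, joint_dip_dir,
                                friction_angle=30.0, lateral_limit=20.0):
          """Standard kinematic test for planar rock slides (angles in degrees).

          Conditions: the discontinuity daylights in the slope face
          (joint dip < slope dip), is steeper than the friction angle,
          and its dip direction lies within +/- lateral_limit of the slope's.
          """
          dip_dir_diff = abs((joint_dip_dir - slope_dip_dir + 180.0) % 360.0 - 180.0)
          return (joint_dip < slope_dip and
                  joint_dip > friction_angle and
                  dip_dir_diff <= lateral_limit)

      # Hypothetical discontinuity set and slope cell (illustrative values only)
      print(planar_slide_possible(slope_dip=55, slope_dip_dir=120,
                                  joint_dip=42, joint_dip_dir=128))  # True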

  5. Summary of November 2010 meeting to evaluate turbidite data for constraining the recurrence parameters of great Cascadia earthquakes for the update of national seismic hazard maps

    Science.gov (United States)

    Frankel, Arthur D.

    2011-01-01

    This report summarizes a meeting of geologists, marine sedimentologists, geophysicists, and seismologists that was held on November 18–19, 2010 at Oregon State University in Corvallis, Oregon. The overall goal of the meeting was to evaluate observations of turbidite deposits to provide constraints on the recurrence time and rupture extent of great Cascadia subduction zone (CSZ) earthquakes for the next update of the U.S. national seismic hazard maps (NSHM). The meeting was convened at Oregon State University because this is the major center for collecting and evaluating turbidite evidence of great Cascadia earthquakes by Chris Goldfinger and his colleagues. We especially wanted the participants to see some of the numerous deep sea cores this group has collected that contain the turbidite deposits. Great earthquakes on the CSZ pose a major tsunami, ground-shaking, and ground-failure hazard to the Pacific Northwest. Figure 1 shows a map of the Pacific Northwest with a model for the rupture zone of a moment magnitude Mw 9.0 earthquake on the CSZ and the ground shaking intensity (in ShakeMap format) expected from such an earthquake, based on empirical ground-motion prediction equations. The damaging effects of such an earthquake would occur over a wide swath of the Pacific Northwest and an accompanying tsunami would likely cause devastation along the Pacific Northwest coast and possibly cause damage and loss of life in other areas of the Pacific. A magnitude 8 earthquake on the CSZ would cause damaging ground shaking and ground failure over a substantial area and could also generate a destructive tsunami. The recent tragic occurrence of the 2011 Mw 9.0 Tohoku-Oki, Japan, earthquake highlights the importance of having accurate estimates of the recurrence times and magnitudes of great earthquakes on subduction zones. For the U.S. national seismic hazard maps, estimating the hazard from the Cascadia subduction zone has been based on coastal paleoseismic evidence of great

  6. A procedure for assessing seismic hazard generated by Vrancea earthquakes and its application. III. A method for developing isoseismal and isoacceleration maps. Applications

    International Nuclear Information System (INIS)

    Enescu, D.; Enescu, B.D.

    2007-01-01

    A method for developing isoseismal and isoacceleration maps assumed to be valid for future strong earthquakes (M_GR > 6.7) is described as constituting the third stage of a procedure for assessing the seismic hazard generated by Vrancea earthquakes. The method relies on the results of the former two stages given by Enescu et al., and on further developments that are presented in this paper. Moreover, it is based on instrumental recordings. Major earthquakes that took place in Vrancea (November 10, 1940 - M_GR = 7.4; March 4, 1977 - M_GR = 7.2; and the strongest possible) were examined as a way to test the method. The method is also applied to an earthquake of magnitude M_GR = 6.7. Given the successful results of the tests, the method can be used for predicting isoseismal and isoacceleration maps for future Vrancea earthquakes of various magnitudes M_GR ≥ 6.7. (authors)

  7. Natural Hazards, Second Edition

    Science.gov (United States)

    Rouhban, Badaoui

    Natural disaster loss is on the rise, and the vulnerability of the human and physical environment to the violent forces of nature is increasing. In many parts of the world, disasters caused by natural hazards such as earthquakes, floods, landslides, drought, wildfires, intense windstorms, tsunami, and volcanic eruptions have caused the loss of human lives, injury, homelessness, and the destruction of economic and social infrastructure. Over the last few years, there has been an increase in the occurrence, severity, and intensity of disasters, culminating with the devastating tsunami of 26 December 2004 in South East Asia. Natural hazards are often unexpected or uncontrollable natural events of varying magnitude. Understanding their mechanisms and assessing their distribution in time and space are necessary for refining risk mitigation measures. This second edition of Natural Hazards (following a first edition published in 1991 by Cambridge University Press), written by Edward Bryant, associate dean of science at Wollongong University, Australia, grapples with this crucial issue, with aspects of hazard prediction, and with other issues. The book presents a comprehensive analysis of different categories of hazards of climatic and geological origin.

  8. Development of seismic hazard analysis in Japan

    International Nuclear Information System (INIS)

    Itoh, T.; Ishii, K.; Ishikawa, Y.; Okumura, T.

    1987-01-01

    In recent years, seismic risk assessments of nuclear power plants have been conducted increasingly in various countries, particularly in the United States, to probabilistically evaluate the safety of existing plants under earthquake loading. The first step of a seismic risk assessment is the seismic hazard analysis, in which the relationship between the maximum earthquake ground motions at the plant site and their annual probability of exceedance, i.e. the seismic hazard curve, is estimated. In this paper, seismic hazard curves are evaluated and examined for several different sites in Japan based on a historical earthquake record model in which seismic sources are modeled as area sources. A new evaluation method is also proposed to compute the response spectra of the earthquake ground motions in connection with estimating the probabilistic structural response. Finally, the numerical results of a probabilistic risk assessment for a base-isolated three-story RC structure, in which the frequency of seismically induced structural failure is evaluated by combining it with the seismic hazard analysis, are described briefly

  9. Physics-Based Simulations of Natural Hazards

    Science.gov (United States)

    Schultz, Kasey William

    Earthquakes and tsunamis are some of the most damaging natural disasters that we face. Just two recent events, the 2004 Indian Ocean earthquake and tsunami and the 2010 Haiti earthquake, claimed more than 400,000 lives. Despite their catastrophic impacts on society, our ability to predict these natural disasters is still very limited. The main challenges in studying the earthquake cycle are the non-linear and multi-scale properties of fault networks. Earthquakes are governed by physics across many orders of magnitude of spatial and temporal scales: from the scale of tectonic plates and their evolution over millions of years, down to the scale of rock fracturing over milliseconds to minutes at the sub-centimeter scale during an earthquake. Despite these challenges, there are useful patterns in earthquake occurrence. One such pattern, the frequency-magnitude relation, relates the number of large earthquakes to small earthquakes and forms the basis for assessing earthquake hazard. However, the utility of these relations is proportional to the length of our earthquake records, and typical records span at most a few hundred years. Utilizing physics-based interactions and techniques from statistical physics, earthquake simulations provide rich earthquake catalogs, allowing us to measure otherwise unobservable statistics. In this dissertation I will discuss five applications of physics-based simulations of natural hazards, utilizing an earthquake simulator called Virtual Quake. The first is an overview of computing earthquake probabilities from simulations, focusing on the California fault system. The second uses simulations to help guide satellite-based earthquake monitoring methods. The third presents a new friction model for Virtual Quake and describes how we tune simulations to match reality. The fourth describes the process of turning Virtual Quake into an open source research tool. This section then focuses on a resulting collaboration using Virtual Quake for a detailed

  10. SHEAT: a computer code for probabilistic seismic hazard analysis, user's manual

    International Nuclear Information System (INIS)

    Ebisawa, Katsumi; Kondo, Masaaki; Abe, Kiyoharu; Tanaka, Toshiaki; Takani, Michio.

    1994-08-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, performs probabilistic seismic hazard analysis, one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. Seismic hazard is defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site. With the SHEAT code, seismic hazard is calculated in two steps: (1) Modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modelled based on the historical earthquake records, active fault data and expert judgement. (2) Calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model that takes its standard deviation into account. The seismic hazard at the site is then calculated by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT code. It includes: (1) outlines of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code; (2) functions of the subprograms and the analytical models in them; (3) guidance on input and output data; and (4) sample run results. The code has been widely used at JAERI to analyze seismic hazard at various nuclear power plant sites in Japan. (author)
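
    A minimal sketch of the two-step logic described above (not the SHEAT code itself): each postulated earthquake contributes its annual frequency times the probability, under a lognormal attenuation model, that its ground motion exceeds a given level, and summing over all postulated earthquakes yields the annual exceedance frequency. The rates, median ground motions, and scatter value below are hypothetical.

      import numpy as np
      from scipy.stats import norm

      def hazard_curve(events, pga_levels, sigma_ln=0.5):
          """Annual exceedance frequency at each PGA level.

          events: list of (annual_rate, median_pga_at_site) for postulated earthquakes,
          with medians assumed to come from some attenuation relation (hypothetical here).
          sigma_ln: lognormal standard deviation of the attenuation model.
          """
          pga_levels = np.asarray(pga_levels, dtype=float)
          freq = np.zeros_like(pga_levels)
          for rate, median in events:
              # Probability that this earthquake's ground motion exceeds each level
              p_exceed = 1.0 - norm.cdf(np.log(pga_levels / median) / sigma_ln)
              freq += rate * p_exceed
          return freq

      # Two hypothetical sources: (annual rate, median PGA in g at the site)
      events = [(0.01, 0.15), (0.002, 0.40)]
      levels = [0.1, 0.2, 0.4]
      print(dict(zip(levels, hazard_curve(events, levels).round(5))))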

  11. Mitigating mass movement caused by earthquakes and typhoons: a case study of central Taiwan

    Science.gov (United States)

    Lin, Jiun-Chuan

    2013-04-01

    Typhoons struck Taiwan an average of 3.8 times a year over the last 100 years and caused severe damage, according to Central Weather Bureau data. After the magnitude 7.3 (Richter scale) Chi-Chi earthquake of 1999, typhoons with heavy rainfall produced large debris flows and deposits in river channels. As a result of earthquakes, loose debris falls and flows became significant hazards in central Taiwan. Analysis of rainfall data and of slope-failure sites shows that damage from natural hazards was enhanced in the last 20 years as a result of the Chi-Chi earthquake. There are three main types of mass movement in central Taiwan: landslides, debris flows and gully erosion. Landslides occurred mainly along hillslopes and river channel banks. Many dams, check dams, housing structures and even river channels can be raised by as much as 60 meters by the accumulation of landslide material. Debris flows occurred mainly during typhoon periods and reactivated ancient debris deposits. New gullies thus developed in deposits loosened and shaken up by earthquakes. Extreme earthquake and typhoon events occurred frequently in the last 20 years. This paper analyzes the geological and geomorphological background of the precarious areas and typhoons in central Taiwan to build a systematic understanding of mass movement hazards. The mechanisms and relations between debris flows and rainfall in central Taiwan are analyzed, and ways of mitigating mass movement threats are also proposed. Keywords: mass movement, earthquakes, typhoons, hazard mitigation, central Taiwan

  12. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Di Giacomo, Domenico; Engdhal, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product to use for seismic hazard studies. The new catalogue was necessary because improved seismic hazard studies require that earthquake catalogues be homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue, both for earthquakes that occurred after 2009 and for earthquakes listed in the International Seismological Summary (ISS) that fell below the cut-off magnitude of 6.25. This extension is part of a four-year program that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as a basis for cross-checking the consistency in location and magnitude of those earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.

  13. Cascading elastic perturbation in Japan due to the 2012 Mw 8.6 Indian Ocean earthquake

    Science.gov (United States)

    Delorey, Andrew A.; Chao, Kevin; Obara, Kazushige; Johnson, Paul A.

    2015-01-01

    Since the discovery of extensive earthquake triggering occurring in response to the 1992 Mw (moment magnitude) 7.3 Landers earthquake, it is now well established that seismic waves from earthquakes can trigger other earthquakes, tremor, slow slip, and pore pressure changes. Our contention is that earthquake triggering is one manifestation of a more widespread elastic disturbance that reveals information about Earth’s stress state. Earth’s stress state is central to our understanding of both natural and anthropogenic-induced crustal processes. We show that seismic waves from distant earthquakes may perturb stresses and frictional properties on faults and elastic moduli of the crust in cascading fashion. Transient dynamic stresses place crustal material into a metastable state during which the material recovers through a process termed slow dynamics. This observation of widespread, dynamically induced elastic perturbation, including systematic migration of offshore seismicity, strain transients, and velocity transients, presents a new characterization of Earth’s elastic system that will advance our understanding of plate tectonics, seismicity, and seismic hazards. PMID:26601289

  14. The hazard map of ML6.6 0206 Meinong earthquake near Guanmiao and its Neotectonic implication

    Science.gov (United States)

    Chung, L. H.; Shyu, J. B. H.; Huang, M. H.; Yang, K. M.; Le Beon, M.; Lee, Y. H.; Chuang, R.; Yi, D.

    2016-12-01

    Serious damage occurred in SW Taiwan during the ML 6.6 0206 Meinong earthquake. Based on InSAR results, an oval-shaped area of about 10 cm of surface uplift lies 15 km away from the epicenter, with two distinct N-S-trending sharp phase changes near the Guanmiao area. Our field investigation shows that building damage and surface fractures are highly correlated with these two sharp phase changes. Here, we characterize the detailed shallow subsurface geometry using seismic reflection data, geologic data, and field hazard investigations. The N-S-trending surface deformation may be induced by local shallow folding, while the large uplift west of Guanmiao may be related to pure-shear deformation of the thick, clayey Gutingkeng (GTK) Formation. Our results imply not only that a moderate lower-crustal earthquake can trigger active structures at shallower depth, but also that such minor shallow active structures can cause serious damage and surface deformation.

  15. The Implications of Strike-Slip Earthquake Source Properties on the Transform Boundary Development Process

    Science.gov (United States)

    Neely, J. S.; Huang, Y.; Furlong, K.

    2017-12-01

    Subduction-Transform Edge Propagator (STEP) faults, produced by the tearing of a subducting plate, allow us to study the development of a transform plate boundary and improve our understanding of both long-term geologic processes and short-term seismic hazards. The 280 km long San Cristobal Trough (SCT), formed by the tearing of the Australia plate as it subducts under the Pacific plate near the Solomon and Vanuatu subduction zones, shows along-strike variations in earthquake behaviors. The segment of the SCT closest to the tear rarely hosts earthquakes > Mw 6, whereas the SCT sections more than 80-100 km from the tear experience Mw 7 earthquakes with repeated rupture along the same segments. To understand the effect of cumulative displacement on SCT seismicity, we analyze b-values, centroid time delays and corner frequencies of the SCT earthquakes. We use the spectral ratio method based on empirical Green's functions (eGfs) to isolate source effects from propagation and site effects. We find high b-values along the SCT closest to the tear, with values decreasing with distance before finally increasing again towards the far end of the SCT. Centroid time delays for the Mw 7 strike-slip earthquakes increase with distance from the tear, but corner frequency estimates for a recent sequence of Mw 7 earthquakes are approximately equal, indicating a growing complexity in earthquake behavior with distance from the tear due to a displacement-driven transform boundary development process (see figure). The increasing complexity possibly stems from the earthquakes along the eastern SCT rupturing through multiple asperities, resulting in multiple moment pulses. If not for the bounding Vanuatu subduction zone at the far end of the SCT, the eastern SCT section, which has experienced the most displacement, might be capable of hosting larger earthquakes. When assessing the seismic hazard of other STEP faults, cumulative fault displacement should be considered a key input in
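
    The b-value analysis mentioned above is commonly done with the Aki/Utsu maximum-likelihood estimator; the sketch below shows that standard estimator (not necessarily the authors' exact procedure), with an invented magnitude list and completeness magnitude for illustration.

      import numpy as np

      def b_value_mle(magnitudes, completeness_mag, bin_width=0.1):
          """Aki/Utsu maximum-likelihood b-value for magnitudes >= completeness_mag.

          The bin_width / 2 term is the usual correction for binned magnitudes.
          """
          m = np.asarray(magnitudes, dtype=float)
          m = m[m >= completeness_mag]
          mean_excess = m.mean() - (completeness_mag - bin_width / 2.0)
          return np.log10(np.e) / mean_excess

      # Hypothetical catalog segment (illustrative values only)
      mags = [4.1, 4.3, 4.0, 5.2, 4.6, 4.8, 4.2, 4.0, 4.4, 6.1]
      print(round(b_value_mle(mags, completeness_mag=4.0), 2))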

  16. Natural Hazards Science at the U.S. Geological Survey

    Science.gov (United States)

    Perry, Suzanne C.; Jones, Lucile M.; Holmes, Robert R.

    2013-01-01

    The mission of the USGS in natural hazards is to develop and apply hazard science to help protect the safety, security, and economic well-being of the Nation. The costs and consequences of natural hazards can be enormous, and each year more people and infrastructure are at risk. The USGS conducts hazard research and works closely with stakeholders and cooperators to inform a broad range of planning and response activities at individual, local, State, national, and international levels. It has critical statutory and nonstatutory roles regarding floods, earthquakes, tsunamis, landslides, coastal erosion, volcanic eruptions, wildfires, and magnetic storms. USGS science can help to understand and reduce risks from natural hazards by providing the information that decisionmakers need to determine which risk management activities are worthwhile.

  17. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
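
    A minimal sketch of the rate-spike idea described above (not the USGS implementation): count keyword-matching tweets per minute and flag minutes whose count far exceeds the quiet-time background of less than one tweet per hour. The timestamps and threshold below are hypothetical.

      from collections import Counter

      def detect_rate_spike(tweet_times_minutes, threshold_per_min=20):
          """Flag minutes whose 'earthquake' tweet count reaches threshold_per_min,
          a level far above a quiet-time background of roughly one tweet per hour."""
          counts = Counter(int(t) for t in tweet_times_minutes)
          return {minute: n for minute, n in sorted(counts.items())
                  if n >= threshold_per_min}

      # Hypothetical timestamps (minutes): sparse background chatter, then a burst
      # of ~150 keyword tweets in the minute after a felt event at t = 10 min.
      times = [2.5, 7.1] + [10 + i / 150.0 for i in range(150)]
      print(detect_rate_spike(times))  # {10: 150}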

  18. What caused a large number of fatalities in the Tohoku earthquake?

    Science.gov (United States)

    Ando, M.; Ishida, M.; Nishikawa, Y.; Mizuki, C.; Hayashi, Y.

    2012-04-01

    the 1960 Chile tsunami, which was significantly smaller than the 11 March tsunami. This sense of "knowing" put their lives at high risk. 5. Some local residents believed that with the presence of a breakwater, only slight flooding would occur. 6. Many people did not understand how a tsunami is generated under the sea, so the connection between earthquakes and tsunamis was not clear to them. These interviews made it clear that many deaths resulted because current technology and earthquake science underestimated tsunami heights, warning systems failed, and breakwaters were not strong or high enough. However, even if these problems recur in future earthquakes, better knowledge of earthquake and tsunami hazards could save more lives. The basic mechanism of tsunami generation should therefore be taught in elementary school, while children's minds are most receptive.

  19. Hazard assessment for Romania–Bulgaria crossborder region

    International Nuclear Information System (INIS)

    Solakov, Dimcho; Simeonova, Stela; Alexandrova, Irena; Trifonova, Petya; Ardeleanu, Luminita; Cioflan, Carmen

    2014-01-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic hazard and vulnerability to earthquakes are steadily increasing as urbanisation and development occupy more areas that are prone to the effects of strong earthquakes. The assessment of the seismic hazard is particularly important, because it provides valuable information for seismic safety and disaster mitigation, and it supports decision making for the benefit of society. The main objective of this study is to assess the seismic hazard for the Romania-Bulgaria cross-border region on the basis of integrated basic geo-datasets

  20. Sedimentary evidence of historical and prehistorical earthquakes along the Venta de Bravo Fault System, Acambay Graben (Central Mexico)

    Science.gov (United States)

    Lacan, Pierre; Ortuño, María; Audin, Laurence; Perea, Hector; Baize, Stephane; Aguirre-Díaz, Gerardo; Zúñiga, F. Ramón

    2018-03-01

    The Venta de Bravo normal fault is one of the longest structures in the intra-arc fault system of the Trans-Mexican Volcanic Belt. It defines, together with the Pastores Fault, the 80 km long southern margin of the Acambay Graben. We focus on the westernmost segment of the Venta de Bravo Fault and provide new paleoseismological information, evaluate its earthquake history, and assess the related seismic hazard. We analyzed five trenches, distributed at three different sites, in which Holocene surface faulting offsets interbedded volcanoclastic, fluvio-lacustrine and colluvial deposits. Despite the lack of known historical destructive earthquakes along this fault, we found evidence of at least eight earthquakes during the late Quaternary. Our results indicate that this is one of the major seismic sources of the Acambay Graben, capable of producing by itself earthquakes with magnitudes (MW) up to 6.9, with a slip rate of 0.22-0.24 mm/yr and a recurrence interval of 1,940-2,390 years. In addition, a possible multi-fault rupture of the Venta de Bravo Fault together with other faults of the Acambay Graben could result in a MW > 7 earthquake. These new slip rates, earthquake recurrence rates, and estimates of slip per event help advance our understanding of the seismic hazard posed by the Venta de Bravo Fault and provide new parameters for further hazard assessment.

  1. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro; Mai, Paul Martin; Yasuda, Tomohiro; Mori, Nobuhito

    2014-01-01

    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.

  2. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro

    2014-09-01

    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.

  3. Seismic hazard and seismic risk assessment based on the unified scaling law for earthquakes: Himalayas and adjacent regions

    Science.gov (United States)

    Nekrasova, A. K.; Kossobokov, V. G.; Parvez, I. A.

    2015-03-01

    For the Himalayas and neighboring regions, the maps of seismic hazard and seismic risk are constructed with the use of the estimates for the parameters of the unified scaling law for earthquakes (USLE), in which the Gutenberg-Richter law for magnitude distribution of seismic events within a given area is applied in the modified version with allowance for linear dimensions of the area, namely, log N(M, L) = A + B(5 - M) + C log L, where N(M, L) is the expected annual number of the earthquakes with magnitude M in the area with linear dimension L. The spatial variations in the parameters A, B, and C for the Himalayas and adjacent regions are studied on two time intervals from 1965 to 2011 and from 1980 to 2011. The difference in A, B, and C between these two time intervals indicates that seismic activity experiences significant variations on a scale of a few decades. With a global consideration of the seismic belts of the Earth overall, the estimates of coefficient A, which determines the logarithm of the annual average frequency of the earthquakes with a magnitude of 5.0 and higher in the zone with a linear dimension of 1 degree of the Earth's meridian, differ by a factor of 30 and more and mainly fall in the interval from -1.1 to 0.5. The values of coefficient B, which describes the balance between the number of earthquakes with different magnitudes, gravitate to 0.9 and range from less than 0.6 to 1.1 and higher. The values of coefficient C, which estimates the fractal dimension of the local distribution of epicenters, vary from 0.5 to 1.4 and higher. In the Himalayas and neighboring regions, the USLE coefficients mainly fall in the intervals of -1.1 to 0.3 for A, 0.8 to 1.3 for B, and 1.0 to 1.4 for C. The calculations of the local value of the expected peak ground acceleration (PGA) from the maximal expected magnitude provided the necessary basis for mapping the seismic hazards in the studied region. When doing this, we used the local estimates of the
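
    A minimal sketch evaluating the quoted USLE relation directly; the coefficients below are hypothetical values chosen within the ranges reported for the Himalayan region, not estimates from the study.

      import math

      def usle_annual_number(A, B, C, magnitude, linear_size_deg):
          """Expected annual number of earthquakes with magnitude >= `magnitude`
          in an area of linear dimension `linear_size_deg` (degrees), from
          log10 N(M, L) = A + B*(5 - M) + C*log10(L)."""
          return 10.0 ** (A + B * (5.0 - magnitude) + C * math.log10(linear_size_deg))

      # Hypothetical coefficients within the quoted Himalayan ranges
      A, B, C = -0.5, 1.0, 1.2
      print(usle_annual_number(A, B, C, magnitude=6.0, linear_size_deg=2.0))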

  4. Probabilistic Tsunami Hazard Analysis of the Pacific Coast of Mexico: Case Study Based on the 1995 Colima Earthquake Tsunami

    Directory of Open Access Journals (Sweden)

    Nobuhito Mori

    2017-06-01

    This study develops a novel computational framework to carry out probabilistic tsunami hazard assessment for the Pacific coast of Mexico. The new approach enables the consideration of stochastic tsunami source scenarios having variable fault geometry and heterogeneous slip that are constrained by an extensive database of rupture models for historical earthquakes around the world. The assessment focuses upon the 1995 Jalisco–Colima Earthquake Tsunami from a retrospective viewpoint. Numerous source scenarios of large subduction earthquakes are generated to assess the sensitivity and variability of tsunami inundation characteristics of the target region. Analyses of nine slip models along the Mexican Pacific coast are performed, and statistical characteristics of slips (e.g., coherent structures of slip spectra) are estimated. The source variability allows exploring a wide range of tsunami scenarios for a moment magnitude (Mw) 8 subduction earthquake in the Mexican Pacific region to conduct thorough sensitivity analyses and to quantify the tsunami height variability. The numerical results indicate a strong sensitivity of maximum tsunami height to major slip locations in the source and indicate major uncertainty at the first peak of tsunami waves.

  5. Documentation for Initial Seismic Hazard Maps for Haiti

    Science.gov (United States)

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2010-01-01

    In response to the urgent need for earthquake-hazard information after the tragic disaster caused by the moment magnitude (M) 7.0 January 12, 2010, earthquake, we have constructed initial probabilistic seismic hazard maps for Haiti. These maps are based on the current information we have on fault slip rates and historical and instrumental seismicity. These initial maps will be revised and improved as more data become available. In the short term, more extensive logic trees will be developed to better capture the uncertainty in key parameters. In the longer term, we will incorporate new information on fault parameters and previous large earthquakes obtained from geologic fieldwork. These seismic hazard maps are important for the management of the current crisis and the development of building codes and standards for the rebuilding effort. The boundary between the Caribbean and North American Plates in the Hispaniola region is a complex zone of deformation. The highly oblique ~20 mm/yr convergence between the two plates (DeMets and others, 2000) is partitioned between subduction zones off of the northern and southeastern coasts of Hispaniola and strike-slip faults that transect the northern and southern portions of the island. There are also thrust faults within the island that reflect the compressional component of motion caused by the geometry of the plate boundary. We follow the general methodology developed for the 1996 U.S. national seismic hazard maps and also as implemented in the 2002 and 2008 updates. This procedure consists of adding the seismic hazard calculated from crustal faults, subduction zones, and spatially smoothed seismicity for shallow earthquakes and Wadati-Benioff-zone earthquakes. Each one of these source classes will be described below. The lack of information on faults in Haiti requires many assumptions to be made. These assumptions will need to be revisited and reevaluated as more fieldwork and research are accomplished. We made two sets of

  6. Global risk of big earthquakes has not recently increased.

    Science.gov (United States)

    Shearer, Peter M; Stark, Philip B

    2012-01-17

    The recent elevated rate of large earthquakes has fueled concern that the underlying global rate of earthquake activity has increased, which would have important implications for assessments of seismic hazard and our understanding of how faults interact. We examine the timing of large (magnitude M≥7) earthquakes from 1900 to the present, after removing local clustering related to aftershocks. The global rate of M≥8 earthquakes has been at a record high roughly since 2004, but rates have been almost as high before, and the rate of smaller earthquakes is close to its historical average. Some features of the global catalog are improbable in retrospect, but so are some features of most random sequences--if the features are selected after looking at the data. For a variety of magnitude cutoffs and three statistical tests, the global catalog, with local clusters removed, is not distinguishable from a homogeneous Poisson process. Moreover, no plausible physical mechanism predicts real changes in the underlying global rate of large events. Together these facts suggest that the global risk of large earthquakes is no higher today than it has been in the past.
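
    One simple version of such a test (a sketch, not the authors' exact statistical procedure) compares declustered interevent times with the exponential distribution implied by a homogeneous Poisson process, here via a Kolmogorov-Smirnov test on a synthetic catalog.

      import numpy as np
      from scipy import stats

      def poisson_ks_test(event_times_years):
          """KS test of interevent times against the exponential distribution
          implied by a homogeneous Poisson process with the catalog's mean rate."""
          times = np.sort(np.asarray(event_times_years, dtype=float))
          gaps = np.diff(times)
          return stats.kstest(gaps, "expon", args=(0, gaps.mean()))

      # Hypothetical declustered catalog: synthetic Poisson occurrence times (~110 yr)
      rng = np.random.default_rng(0)
      synthetic_times = np.cumsum(rng.exponential(scale=1.3, size=80))
      print(poisson_ks_test(synthetic_times))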

  7. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  8. Multi-hazard approaches to civil infrastructure engineering

    CERN Document Server

    LaFave, James

    2016-01-01

    This collection focuses on the development of novel approaches to address one of the most pressing challenges of civil engineering, namely the mitigation of natural hazards. Numerous engineering books to date have focused on, and illustrate considerable progress toward, mitigation of individual hazards (earthquakes, wind, and so forth.). The current volume addresses concerns related to overall safety, sustainability and resilience of the built environment when subject to multiple hazards: natural disaster events that are concurrent and either correlated (e.g., wind and surge); uncorrelated (e.g., earthquake and flood); cascading (e.g., fire following earthquake); or uncorrelated and occurring at different times (e.g., wind and earthquake). The authors examine a range of specific topics including methodologies for vulnerability assessment of structures, new techniques to reduce the system demands through control systems; instrumentation, monitoring and condition assessment of structures and foundations; new te...

  9. Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Eisses, A.; Kell, A.; Kent, G. [UNR; Driscoll, N. [UCSD; Karlin, R.; Baskin, R. [USGS; Louie, J. [UNR; Pullammanappallil, S. [Optim

    2016-08-01

    Amy Eisses, Annie M. Kell, Graham Kent, Neal W. Driscoll, Robert E. Karlin, Robert L. Baskin, John N. Louie, Kenneth D. Smith, Sathish Pullammanappallil, 2011, Marine and land active-source seismic investigation of geothermal potential, tectonic structure, and earthquake hazards in Pyramid Lake, Nevada: presented at American Geophysical Union Fall Meeting, San Francisco, Dec. 5-9, abstract NS14A-08.

  10. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    Science.gov (United States)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, differences in methodology and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then incorporate the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on this database and the appropriate GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates the important fact that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony, then, is that in areas of low to moderate probability, where building codes usually require less seismic resilience, seismic risk is likely to be greater. Infrastructural damage in East Malaysia during the 2015

  11. 75 FR 66388 - Scientific Earthquake Studies Advisory Committee

    Science.gov (United States)

    2010-10-28

    ..., including the multi-hazards demonstration project and earthquake early warning prototype development. The... DEPARTMENT OF THE INTERIOR U.S. Geological Survey [USGS-GX11GG009950000] Scientific Earthquake... Public Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next...

  12. 75 FR 2159 - Scientific Earthquake Studies Advisory Committee

    Science.gov (United States)

    2010-01-14

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey Scientific Earthquake Studies Advisory Committee... Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting at the U.S. Geological... participation in the National Earthquake Hazards Reduction Program. The Committee will receive updates and...

  13. Temporal Variation of Tectonic Tremor Activity Associated with Nearby Earthquakes

    Science.gov (United States)

    Chao, K.; Van der Lee, S.; Hsu, Y. J.; Pu, H. C.

    2017-12-01

    Tectonic tremor and slow slip events, located downdip from the seismogenic zone, hold the key to recurring patterns of typical earthquakes. Several findings of slow aseismic slip during the prenucleation processes of nearby earthquakes have provided new insight into the stress transfer of slow earthquakes in fault zones prior to megathrust earthquakes. However, how tectonic tremor is associated with the occurrence of nearby earthquakes remains unclear. To enhance our understanding of the stress interaction between tremor and earthquakes, we developed an algorithm for the automatic detection and location of tectonic tremor in the collisional tectonic environment in Taiwan. Our analysis of a three-year data set indicates a short-term increase in the tremor rate starting 19 days before the 2010 ML 6.4 Jiashian main shock (Chao et al., JGR, 2017). Around the time when the tremor rate began to rise, one GPS station recorded a flip in its direction of motion. We hypothesize that tremor is driven by a slow-slip event that preceded the occurrence of the shallower nearby main shock, even though the inferred slip is too small to be observed by all GPS stations. To better quantify the necessary conditions for tremor to respond to nearby earthquakes, we obtained a 13-year ambient tremor catalog from 2004 to 2016 in the same region. We examine the spatiotemporal relationship between tremor and 37 ML>=5.0 (seven events with ML>=6.0) nearby earthquakes located within 0.5 degrees of the active tremor sources. The findings from this study can enhance our understanding of the interaction among tremor, slow slip, and nearby earthquakes in regions of high seismic hazard.
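
    A typical first step in envelope-based tremor detection (a generic sketch, not the authors' algorithm) is to band-pass the seismogram to the 2-8 Hz tremor band, take its envelope, and smooth it before applying station-coherence and duration criteria. The synthetic trace below is purely illustrative.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def tremor_envelope(trace, fs, fmin=2.0, fmax=8.0, smooth_sec=5.0):
          """Band-pass (2-8 Hz) envelope, smoothed with a moving average."""
          b, a = butter(4, [fmin / (fs / 2.0), fmax / (fs / 2.0)], btype="band")
          filtered = filtfilt(b, a, trace)
          env = np.abs(hilbert(filtered))
          win = int(smooth_sec * fs)
          return np.convolve(env, np.ones(win) / win, mode="same")

      # Hypothetical trace: noise plus an emergent, low-amplitude "tremor" burst
      fs = 100.0
      t = np.arange(0, 600, 1 / fs)
      trace = np.random.randn(t.size) * 0.1
      trace[30000:40000] += 0.3 * np.sin(2 * np.pi * 4.0 * t[30000:40000])
      print(tremor_envelope(trace, fs).max())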

  14. Promise and problems in using stress triggering models for time-dependent earthquake hazard assessment

    Science.gov (United States)

    Cocco, M.

    2001-12-01

    Earthquake stress changes can promote failures on favorably oriented faults and modify the seismicity pattern over broad regions around the causative faults. Because the induced stress perturbations modify the rate of production of earthquakes, they alter the probability of seismic events in a specified time window. Comparing the Coulomb stress changes with the seismicity rate changes and aftershock patterns can statistically test the role of stress transfer in earthquake occurrence. The interaction probability may represent a further tool to test the stress trigger or shadow model. The probability model, which incorporates stress transfer, has the main advantage of including the contributions of the induced stress perturbation (a static step in its present formulation), the loading rate and the fault constitutive properties. Because the mechanical conditions of the secondary faults at the time of application of the induced load are largely unknown, stress triggering can only be tested on fault populations and not on single earthquake pairs with a specified time delay. The interaction probability can represent the most suitable tool to test the interaction between large magnitude earthquakes. Despite these important implications and the stimulating perspectives, there exist problems in understanding earthquake interaction that should motivate future research but at the same time limit its immediate social applications. One major limitation is that we are unable to predict how and if the induced stress perturbations modify the ratio of small versus large magnitude earthquakes. In other words, we cannot distinguish between a change in this ratio in favor of small events or of large magnitude earthquakes, because the interaction probability is independent of magnitude. Another problem concerns the reconstruction of the stressing history. The interaction probability model is based on the response to a static step; however, we know that other processes contribute to
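
    For reference, the Coulomb stress change discussed above is conventionally written as dCFS = d_tau + mu' * d_sigma_n, with d_tau the shear stress change in the slip direction, d_sigma_n the normal stress change (positive for unclamping), and mu' an effective friction coefficient. The sketch below simply evaluates this expression with invented values, not the probability model described in the abstract.

      def coulomb_stress_change(delta_shear, delta_normal, effective_friction=0.4):
          """Coulomb failure stress change on a receiver fault (MPa).

          delta_shear: shear stress change in the slip direction (positive promotes slip).
          delta_normal: normal stress change (positive = unclamping in this convention).
          """
          return delta_shear + effective_friction * delta_normal

      # Hypothetical stress changes resolved on a receiver fault (MPa)
      print(coulomb_stress_change(delta_shear=0.05, delta_normal=-0.02))  # 0.042 MPa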

  15. Seismic methodology in determining basis earthquake for nuclear installation

    International Nuclear Information System (INIS)

    Ameli Zamani, Sh.

    2008-01-01

    Design basis earthquake ground motions for nuclear installations should be determined to assure the design purpose of reactor safety: that reactors should be built and operated to pose no undue risk to public health and safety from earthquakes and other hazards. Regarding the influence of seismic hazard on a site, large numbers of earthquake ground motions can be predicted considering possible variability among the source, path, and site parameters. However, seismic safety design using all predicted ground motions is practically impossible. In the determination of design basis earthquake ground motions it is therefore important to represent the influences of the large numbers of earthquake ground motions derived from the seismic ground motion prediction methods for the surrounding seismic sources. Viewing the relations between current design basis earthquake ground motion determination and modern earthquake ground motion estimation, the development of a risk-informed design basis earthquake ground motion methodology is discussed for insight into the ongoing modernization of the Examination Guide for Seismic Design on NPP

  16. The International Platform on Earthquake Early Warning Systems (IP-EEWS)

    Science.gov (United States)

    Torres, Jair; Fanchiotti, Margherita

    2017-04-01

    The Sendai Framework for Disaster Risk Reduction 2015-2030 recognizes the need to "substantially increase the availability of and access to multi-hazard early warning systems and disaster risk information and assessments to the people by 2030" as one of its global targets (target "g"). While considerable progress has been made in recent decades, early warning systems (EWSs) continue to be less developed for geo-hazards, and significant challenges remain in advancing the development of EWSs for specific hazards, particularly for fast-onset hazards such as earthquakes. An earthquake early warning system (EEWS) helps disseminate timely information about potentially catastrophic earthquake hazards to the public, emergency managers and the private sector, providing enough time to implement automated emergency measures. At the same time, these systems help to considerably reduce the CO2 emissions produced by the catastrophic impacts and subsequent effects of earthquakes, such as those generated by fires, collapses, and pollution (among others), as well as those produced in the recovery and reconstruction processes. In recent years, EEWSs have been developed independently in a few countries: EEWSs have proven operational in Japan and Mexico, while other regions in California (USA), Turkey, Italy, Canada, South Korea and China (including Taiwan) are in the development stages or operate under restricted applications. Many other countries in the Indian Subcontinent, Southeast Asia, Central Asia, the Middle East, Eastern Africa, Southeast Africa, as well as Central America, South America and the Caribbean, are located in some of the most seismically active regions in the world, or present moderate seismicity but high vulnerability, and would strongly benefit from the development of EEWSs. Given that, in many instances, the development of an EEWS still requires further testing, increased density coverage in seismic observation stations, regional coordination, and further scientific

  17. Approach to seismic hazard analysis for dam safety in the Sierra Nevada and Modoc Plateau of California

    International Nuclear Information System (INIS)

    Savage, W.U.; McLaren, M.K.; Edwards, W.D.; Page, W.D.

    1991-01-01

    Pacific Gas and Electric Company's hydroelectric generating system involves about 150 dams located in the Sierra Nevada and Modoc Plateau region of central and northern California. The utility's strategy for earthquake hazard assessment is described. The approach includes the following strategies: integrating regional tectonics, seismic geology, historical seismicity, microseismicity, and crustal structure to form a comprehensive regional understanding of the neotectonic setting; performing local studies to acquire data as needed to reduce uncertainties in geologic and seismic parameters of fault characteristics near specific dam sites; applying and extending recently developed geologic, seismologic, and earthquake engineering technologies to the current regional and site-specific information to evaluate fault characteristics, to estimate maximum earthquakes, and to characterize ground motion; and encouraging multiple independent reviews of earthquake hazard studies by conducting peer reviews, making field sites available to regulating agencies, and publishing results, methods and data in open literature. 46 refs., 8 tabs

  18. Chilean megathrust earthquake recurrence linked to frictional contrast at depth

    Science.gov (United States)

    Moreno, M.; Li, S.; Melnick, D.; Bedford, J. R.; Baez, J. C.; Motagh, M.; Metzger, S.; Vajedian, S.; Sippl, C.; Gutknecht, B. D.; Contreras-Reyes, E.; Deng, Z.; Tassara, A.; Oncken, O.

    2018-04-01

    Fundamental processes of the seismic cycle in subduction zones, including those controlling the recurrence and size of great earthquakes, are still poorly understood. Here, by studying the 2016 earthquake in southern Chile—the first large event within the rupture zone of the 1960 earthquake (moment magnitude (Mw) = 9.5)—we show that the frictional zonation of the plate interface fault at depth mechanically controls the timing of more frequent, moderate-size deep events (Mw < 8) and less frequent great shallow earthquakes (Mw > 8.5). We model the evolution of stress build-up for a seismogenic zone with heterogeneous friction to examine the link between the 2016 and 1960 earthquakes. Our results suggest that the deeper segments of the seismogenic megathrust are weaker and interseismically loaded by a more strongly coupled, shallower asperity. Deeper segments fail earlier (~60 yr recurrence), producing moderate-size events that precede the failure of the shallower region, which fails in a great earthquake (recurrence >110 yr). We interpret the contrasting frictional strength and lag time between deeper and shallower earthquakes to be controlled by variations in pore fluid pressure. Our integrated analysis strengthens understanding of the mechanics and timing of great megathrust earthquakes, and therefore could aid in the seismic hazard assessment of other subduction zones.

  19. 78 FR 19004 - Scientific Earthquake Studies Advisory Committee

    Science.gov (United States)

    2013-03-28

    ... Hazards Program. Focus topics for this meeting include induced seismicity, earthquake early warning and... DEPARTMENT OF THE INTERIOR U.S. Geological Survey [GX13GG009950000] Scientific Earthquake Studies... Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting...

  20. Analysis on Two Typical Landslide Hazard Phenomena in The Wenchuan Earthquake by Field Investigations and Shaking Table Tests

    Directory of Open Access Journals (Sweden)

    Changwei Yang

    2015-08-01

    Based on our field investigations of landslide hazards in the Wenchuan earthquake, some findings can be reported: (1) multi-aspect terrain, such as isolated mountains and thin ridges facing open ground, reacted intensely to the earthquake and was seriously damaged; (2) the slope angles of most landslides were larger than 45°. Considering these disaster phenomena, the causes are analyzed based on shaking table tests of one-sided, two-sided and four-sided slopes. The analysis results show that: (1) the amplification of peak accelerations on four-sided slopes is stronger than on two-sided slopes, while that on one-sided slopes is the weakest, which can indirectly explain why the damage to such terrain is most serious; (2) the amplification of peak accelerations gradually increases as the slope angle increases, with two inflection points at slope angles of 45° and 50°, respectively, which can explain why landslide hazards mainly occur on slopes steeper than 45°. The amplification along the slope strike direction is essentially uniform and varies smoothly.

  1. The analysis of historical seismograms: an important tool for seismic hazard assessment. Case histories from French and Italian earthquakes

    International Nuclear Information System (INIS)

    Pino, N.A.

    2011-01-01

    Seismic hazard assessment relies on the knowledge of the source characteristics of past earthquakes. Unfortunately, seismic waveform analysis, representing the most powerful tool for the investigation of earthquake source parameters, is only possible for events that occurred in the last 100-120 years, i.e., since seismographs with known response function were developed. Nevertheless, during this time significant earthquakes have been recorded by such instruments and today, also thanks to technological progress, these data can be recovered and analysed by means of modern techniques. In this paper, aiming at giving a general sketch of possible analyses and attainable results in historical seismogram studies, I briefly describe the major difficulties in processing the original waveforms and present a review of the results that I obtained from previous seismogram analysis of selected significant historical earthquakes that occurred during the first decades of the 20th century, including (A) the December 28, 1908, Messina straits (southern Italy), (B) the June 11, 1909, Lambesc (southern France) - both of which are the strongest ever recorded instrumentally in their respective countries - and (C) the July 13, 1930, Irpinia (southern Italy) events. For these earthquakes, the major achievements are represented by the assessment of the seismic moment (A, B, C), the geometry and kinematics of faulting (B, C), the fault length and an approximate slip distribution (A, C). The source characteristics of the studied events have also been interpreted in the frame of the tectonic environment active in the respective region of interest. In spite of the difficulties inherent in the investigation of old seismic data, these results demonstrate the invaluable and irreplaceable role of historical seismogram analysis in defining the local seismogenic potential and, ultimately, for assessing the seismic hazard. The retrieved information is crucial in areas where important civil engineering works

  2. Leveraging geodetic data to reduce losses from earthquakes

    Science.gov (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    Seismic hazard assessments that are based on a variety of data and the best available science, coupled with rapid synthesis of real-time information from continuous monitoring networks to guide post-earthquake response, form a solid foundation for effective earthquake loss reduction. With this in mind, the Earthquake Hazards Program (EHP) of the U.S. Geological Survey (USGS) Natural Hazards Mission Area (NHMA) engages in a variety of undertakings, both established and emergent, in order to provide high quality products that enable stakeholders to take action in advance of and in response to earthquakes. Examples include the National Seismic Hazard Model (NSHM), development of tools for improved situational awareness such as earthquake early warning (EEW) and operational earthquake forecasting (OEF), research about induced seismicity, and new efforts to advance comprehensive subduction zone science and monitoring. Geodetic observations provide unique and complementary information directly relevant to advancing many aspects of these efforts (fig. 1). EHP scientists have long leveraged geodetic data for a range of influential studies, and they continue to develop innovative observation and analysis methods that push the boundaries of the field of geodesy as applied to natural hazards research. Given the ongoing, rapid improvement in availability, variety, and precision of geodetic measurements, considering ways to fully utilize this observational resource for earthquake loss reduction is timely and essential. This report presents strategies, and the underlying scientific rationale, by which the EHP could achieve the following outcomes: The EHP is an authoritative source for the interpretation of geodetic data and its use for earthquake loss reduction throughout the United States and its territories. The USGS consistently provides timely, high quality geodetic data to stakeholders. Significant earthquakes are better characterized by incorporating geodetic data into USGS

  3. Comparing population exposure to multiple Washington earthquake scenarios for prioritizing loss estimation studies

    Science.gov (United States)

    Wood, Nathan J.; Ratliff, Jamie L.; Schelling, John; Weaver, Craig S.

    2014-01-01

    Scenario-based, loss-estimation studies are useful for gauging potential societal impacts from earthquakes but can be challenging to undertake in areas with multiple scenarios and jurisdictions. We present a geospatial approach using various population data for comparing earthquake scenarios and jurisdictions to help emergency managers prioritize where to focus limited resources on data development and loss-estimation studies. Using 20 earthquake scenarios developed for the State of Washington (USA), we demonstrate how a population-exposure analysis across multiple jurisdictions based on Modified Mercalli Intensity (MMI) classes helps emergency managers understand and communicate where potential loss of life may be concentrated and where impacts may be more related to quality of life. Results indicate that certain well-known scenarios may directly impact the greatest number of people, whereas other, potentially lesser-known, scenarios impact fewer people but consequences could be more severe. The use of economic data to profile each jurisdiction’s workforce in earthquake hazard zones also provides additional insight on at-risk populations. This approach can serve as a first step in understanding societal impacts of earthquakes and helping practitioners to efficiently use their limited risk-reduction resources.
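
    A minimal sketch of the kind of cross-jurisdiction exposure tabulation described above, assuming hypothetical jurisdiction names, MMI classes, and population counts (none taken from the study):

```python
import pandas as pd

# Hypothetical block-level exposure table: population assigned to an MMI class
# for one scenario, by jurisdiction. Names and numbers are invented.
exposure = pd.DataFrame({
    "jurisdiction": ["Seattle", "Seattle", "Tacoma", "Tacoma", "Everett"],
    "mmi_class":    ["VII-VIII", "IX+", "VII-VIII", "IX+", "VII-VIII"],
    "population":   [120_000, 15_000, 80_000, 5_000, 60_000],
})

# Cross-tabulate population by jurisdiction and shaking class, as a first-order
# view of where life-safety (IX+) versus quality-of-life impacts concentrate.
summary = exposure.pivot_table(index="jurisdiction", columns="mmi_class",
                               values="population", aggfunc="sum", fill_value=0)
print(summary)
```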

  4. Washington Tsunami Hazard Mitigation Program

    Science.gov (United States)

    Walsh, T. J.; Schelling, J.

    2012-12-01

    Washington State has participated in the National Tsunami Hazard Mitigation Program (NTHMP) since its inception in 1995. We have participated in the tsunami inundation hazard mapping, evacuation planning, education, and outreach efforts that generally characterize the NTHMP efforts. We have also investigated hazards of significant interest to the Pacific Northwest. The hazard from locally generated earthquakes on the Cascadia subduction zone, which threatens tsunami inundation in less than an hour following a magnitude 9 earthquake, creates special problems for low-lying accretionary shoreforms in Washington, such as the spits of Long Beach and Ocean Shores, where high ground is not accessible within the limited time available for evacuation. To ameliorate this problem, we convened a panel of the Applied Technology Council to develop guidelines for construction of facilities for vertical evacuation from tsunamis, published as FEMA 646, now incorporated in the International Building Code as Appendix M. We followed this with a program called Project Safe Haven (http://www.facebook.com/ProjectSafeHaven) to site such facilities along the Washington coast in appropriate locations and with appropriate designs to blend with the local communities, as chosen by the citizens. This has now been completed for the entire outer coast of Washington. In conjunction with this effort, we have evaluated the potential for earthquake-induced ground failures in and near tsunami hazard zones to help develop cost estimates for these structures and to establish appropriate tsunami evacuation routes and evacuation assembly areas that are likely to be available after a major subduction zone earthquake. We intend to continue these geotechnical evaluations for all tsunami hazard zones in Washington.

  5. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    Science.gov (United States)

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i. e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and
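
    A synthetic stand-in for the final step described above, combining per-rupture intensity measures with rupture rates to form an exceedance curve; the rates and intensities below are invented, whereas in CyberShake they come from UCERF2.0 and the 3D simulations.

```python
import numpy as np

# Combine peak intensity measures from simulated rupture variations with their
# annual rates to build a site hazard (exceedance) curve. All values synthetic.
rng = np.random.default_rng(0)
n_ruptures = 500
annual_rate = rng.uniform(1e-5, 1e-3, n_ruptures)             # assumed rupture rates (1/yr)
sa_2s = rng.lognormal(mean=-2.0, sigma=0.8, size=n_ruptures)  # simulated 2 s SA (g)

im_levels = np.logspace(-2, 0, 30)                            # 0.01 g to 1 g
exceed_rate = np.array([annual_rate[sa_2s > x].sum() for x in im_levels])
poe_50yr = 1.0 - np.exp(-exceed_rate * 50.0)                  # Poisson, 50-yr window
for x, p in zip(im_levels[::10], poe_50yr[::10]):
    print(f"SA(2 s) > {x:.2f} g : {p:.2%} probability of exceedance in 50 yr")
```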

  6. Scenario for a Short-Term Probabilistic Seismic Hazard Assessment (PSHA) in Chiayi, Taiwan

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan

    2013-01-01

    Full Text Available Using seismic activity and the Meishan earthquake sequence that occurred from 1904 to 1906, a scenario for short-term probabilistic seismic hazards in the Chiayi region of Taiwan is assessed. The long-term earthquake occurrence rate in Taiwan was evaluated using a smoothing kernel. The highest seismicity rate was calculated around the Chiayi region. To consider earthquake interactions, the rate-and-state friction model was introduced to estimate the seismicity rate evolution due to the Coulomb stress change. As imparted by the 1904 Touliu earthquake, stress changes near the 1906 Meishan and Yangshuigang epicenters were higher than the magnitude of tidal triggering. With regard to the impact of the Meishan earthquake, the region close to the Yangshuigang earthquake epicenter had a +0.75 bar stress increase. The results indicated significant interaction between the three damaging events. Considering the path and site effect using ground motion prediction equations, a probabilistic seismic hazard in the form of a hazard evolution and a hazard map was assessed. A significant elevation in hazard following the three earthquakes in the sequence was determined. The results illustrate a possible scenario for seismic hazards in the Chiayi region which may take place repeatedly in the future. Such a scenario provides essential information for earthquake preparation, devastation estimation, emergency sheltering, utility restoration, and structural reconstruction.
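
    A sketch of the rate-and-state (Dieterich-type) seismicity-rate response to a Coulomb stress step of the kind estimated above; the +0.75 bar value comes from the abstract, while A*sigma and the background stressing rate are assumed for illustration.

```python
import numpy as np

# Dieterich-type seismicity-rate evolution after a sudden Coulomb stress step.
r_background = 1.0        # reference seismicity rate (events/yr)
d_cfs = 0.75              # bar, stress step near the receiver source (from the abstract)
a_sigma = 0.4             # bar, assumed constitutive parameter A*sigma
stressing_rate = 0.01     # bar/yr, assumed background stressing rate
t_a = a_sigma / stressing_rate   # relaxation (aftershock-duration) time, yr

for t in (0.1, 1.0, 5.0, 20.0, 60.0):   # years after the stress step
    rate = r_background / (1.0 + (np.exp(-d_cfs / a_sigma) - 1.0) * np.exp(-t / t_a))
    print(f"t = {t:5.1f} yr : seismicity rate = {rate:.2f} x background")
```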

  7. Near real-time aftershock hazard maps for earthquakes

    Science.gov (United States)

    McCloskey, J.; Nalbant, S. S.

    2009-04-01

    Stress interaction modelling is routinely used to explain the spatial relationships between earthquakes and their aftershocks. On 28 October 2008 an M6.4 earthquake occurred near the Pakistan-Afghanistan border, killing several hundred people and causing widespread devastation. A second M6.4 event occurred 12 hours later 20 km to the southeast. By making some well-supported assumptions concerning the source event and the geometry of any likely triggered event it was possible to map those areas most likely to experience further activity. Using Google Earth, it would further have been possible to identify particular settlements in the source area which were particularly at risk and to publish their locations globally within about 3 hours of the first earthquake. Such actions could have significantly focused the initial emergency response management. We argue for routine prospective testing of such forecasts and dialogue between social and physical scientists and emergency response professionals around the practical application of these techniques.
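
    The maps referred to here rest on the Coulomb failure stress change resolved on receiver faults; a minimal sketch of that quantity, with illustrative numbers rather than values from the 2008 events:

```python
# Coulomb failure stress change on a receiver fault:
# dCFS = d_tau + mu_eff * d_sigma_n, with unclamping positive.
d_tau = 0.3        # bar, assumed shear stress change in the receiver slip direction
d_sigma_n = -0.1   # bar, assumed normal stress change (negative = clamping)
mu_eff = 0.4       # assumed effective friction coefficient

d_cfs = d_tau + mu_eff * d_sigma_n
print(f"dCFS = {d_cfs:+.2f} bar -> {'promotes' if d_cfs > 0 else 'inhibits'} failure")
```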

  8. Documentation for the Southeast Asia seismic hazard maps

    Science.gov (United States)

    Petersen, Mark; Harmsen, Stephen; Mueller, Charles; Haller, Kathleen; Dewey, James; Luco, Nicolas; Crone, Anthony; Lidke, David; Rukstales, Kenneth

    2007-01-01

    The U.S. Geological Survey (USGS) Southeast Asia Seismic Hazard Project originated in response to the 26 December 2004 Sumatra earthquake (M9.2) and the resulting tsunami that caused significant casualties and economic losses in Indonesia, Thailand, Malaysia, India, Sri Lanka, and the Maldives. During the course of this project, several great earthquakes ruptured subduction zones along the southern coast of Indonesia (fig. 1) causing additional structural damage and casualties in nearby communities. Future structural damage and societal losses from large earthquakes can be mitigated by providing an advance warning of tsunamis and introducing seismic hazard provisions in building codes that allow buildings and structures to withstand strong ground shaking associated with anticipated earthquakes. The Southeast Asia Seismic Hazard Project was funded through a United States Agency for International Development (USAID)—Indian Ocean Tsunami Warning System to develop seismic hazard maps that would assist engineers in designing buildings that will resist earthquake strong ground shaking. An important objective of this project was to discuss regional hazard issues with building code officials, scientists, and engineers in Thailand, Malaysia, and Indonesia. The code communities have been receptive to these discussions and are considering updating the Thailand and Indonesia building codes to incorporate new information (for example, see notes from Professor Panitan Lukkunaprasit, Chulalongkorn University in Appendix A).

  9. Coseismic Strain Steps of the 2008 Wenchuan Earthquake Indicate EW Extension of Tibetan Plateau and Increased Hazard South to Epicenter

    Science.gov (United States)

    Fu, G.; Shen, X.; Tang, J.; Fukuda, Y.

    2008-12-01

    The 2008 Wenchuan earthquake (Ms 8.0) occurred at the east edge of the Tibetan Plateau. It is the biggest seismic disaster in China since the 1976 Tangshan earthquake. To determine the effects of the earthquake on the deformation field of the Tibetan Plateau, we collect and analyze continuous strain data recorded before and after the earthquake at three stations in the Tibetan Plateau by capacitance-type borehole strainmeters (Chi, 1985). We collect strain data in the NS, EW, NE-SW and NW-SE directions at each borehole. We then deduce the co-seismic strain steps at 14:28 on May 12, 2008 (the origin time of the earthquake) from the data before and after the earthquake using the least squares method. Our observations show that significant co-seismic strain steps in the Tibetan Plateau accompanied the 2008 Wenchuan earthquake. Extension in the EW direction is observed in the interior and northern Tibetan Plateau, which indicates a rapid EW extension of the whole Plateau. Field investigation shows that the 2008 Wenchuan earthquake is a manifestation of eastward growth of the Tibetan Plateau (Dong et al., 2008). Eastward growth of the Tibetan Plateau results naturally in EW extension of the Plateau. Our co-seismic strain observations agree well with the conclusion from the surface rupture investigation. The magnitude of the co-seismic strain step equals five times the average annual extensional strain rate throughout the plateau interior. Shortening in the SE-NW direction is observed at the east edge of the Plateau. This hints that the eastward extension of the Tibetan Plateau is resisted by the rigid Sichuan basin, which increases the potential earthquake hazard around the observation station, consistent with co-seismic stress change calculations (Parsons et al., 2008). Our observed co-seismic strain steps are overall larger than theoretical calculations from dislocation theory, which indicates that the magnitude of the great earthquake should be larger than 7.9. Due
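
    A sketch of the least-squares step estimation described above, applied to a synthetic record that stands in for one strainmeter channel; the step size, trend, and noise level are assumed.

```python
import numpy as np

# Fit offset + linear trend + Heaviside step at the known origin time.
rng = np.random.default_rng(1)
t = np.linspace(-5.0, 5.0, 200)                  # days relative to the mainshock
true_step = 2.0e-8                               # assumed strain step
strain = 1e-9 * t + true_step * (t >= 0) + rng.normal(0.0, 5e-9, t.size)

G = np.column_stack([np.ones_like(t), t, (t >= 0).astype(float)])  # [offset, trend, step]
m, *_ = np.linalg.lstsq(G, strain, rcond=None)
print(f"estimated co-seismic strain step = {m[2]:.2e} (true value {true_step:.1e})")
```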

  10. Geological Deformations and Potential Hazards Triggered by the 01-12-2010 Haiti Earthquake: Insights from Google Earth Imagery

    Science.gov (United States)

    Doblas, M.; Benito, B.; Torres, Y.; Belizaire, D.; Dorfeuille, J.; Aretxabala, A.

    2013-05-01

    In this study we compare the different Google Earth imagery (GEI) available before and after the 01-12-2010 earthquake of Haiti and carry out a detailed analysis of the superficial seismic-related geological deformations in the following sites: 1) the capital Port-Au-Prince and other cities (Carrefour and Gresslier); 2) the mountainous area of the Massif de la Selle which is transected by the "Enriquillo-Plantain-Garden" (EPG) interplate boundary-fault (that supposedly triggered the seism); 3) some of the most important river channels and their corresponding deltas (Momanche, Grise and Frorse). The initial results of our research were published in March 2010 in a special web page created by the scientific community to try to mitigate the devastating effects of this catastrophe (http://supersites.earthobservations.org/haiti.php). Six types of superficial geological deformations triggered by the seismic event have been identified with the GEI: liquefaction structures, chaotic rupture zones, coastal and domal uplifts, river-delta turnovers, faults/ruptures and landslides. Potential geological hazards triggered by the Haiti earthquake include landslides, inundations, reactivation of active tectonic elements (e.g., fractures), river-delta turnovers, etc. We analyzed the GEI again after the rainy season and, as expected, most of the geological deformations that we initially identified had been erased and/or modified by the water washout or buried by the sediments. In this sense the GEI constitutes an invaluable instrument in the analysis of seismic geological hazards: we can still compare all the images before and after the seism that are recorded in its useful "time tool". These are in fact the only witnesses of most of the geological deformations triggered by the Haiti earthquake that remain stored in the virtual archives of the GEI. Indeed, a field trip to the area today would be useless as most of these structures have disappeared. We will show

  11. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    Science.gov (United States)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example of applying an optimization process to select earthquake scenarios that best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics in a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized with magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainties are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking into account probabilistic seismic hazard for Tehran city as the main constraints. The sensitivity of the selected scenarios to the user-specified site/return period error-weight is also assessed. The methodology could substantially reduce the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
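
    A sketch of the curve-matching objective behind the scenario reduction; the paper formulates it as a mixed-integer linear program, whereas this illustration uses plain non-negative least squares on invented data, so it shows only the idea of reproducing the full-catalogue hazard curve with a weighted subset.

```python
import numpy as np
from scipy.optimize import nnls

# Build a synthetic catalogue, compute its exceedance curve, then fit weights
# for a randomly chosen candidate subset so the weighted subset matches it.
rng = np.random.default_rng(2)
n_events = 2000
im = rng.lognormal(-1.5, 0.9, n_events)       # synthetic intensity per event
rate = np.full(n_events, 1e-3)                # equal annual rates of occurrence
levels = np.logspace(-2, 0, 25)               # intensity levels for the hazard curve

full_curve = np.array([rate[im > x].sum() for x in levels])

subset = rng.choice(n_events, size=200, replace=False)        # candidate scenarios
A = np.array([[1.0 if im[j] > x else 0.0 for j in subset] for x in levels])
weights, _ = nnls(A, full_curve)              # weighted rates for the reduced set
misfit = np.max(np.abs(A @ weights - full_curve) / full_curve)
print(f"{np.count_nonzero(weights)} scenarios retained, max relative misfit {misfit:.2%}")
```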

  12. Space-time behavior of continental intraplate earthquakes and implications for hazard assessment in China and the Central U.S.

    Science.gov (United States)

    Stein, Seth; Liu, Mian; Luo, Gang; Wang, Hui

    2014-05-01

    much faster than it accumulates today, suggesting that they result from recent fault activation that releases prestored strain energy in the crust. If so, this earthquake sequence is similar to aftershocks in that the rates of energy release should decay with time and the sequence of earthquakes will eventually end. We use simple physical analysis and numerical simulations to show that the current New Madrid earthquake sequence is likely ending or has ended. Recognizing that mid-continental earthquakes have long aftershock sequences and complex spatiotemporal occurrences is critical to improve hazard assessments

  13. Comprehensive understanding of a deep transition zone from an unstable- to stable-slip regime of the megathrust interplate earthquake

    Science.gov (United States)

    Kato, A.; Iidaka, T.; Ikuta, R.; Yoshida, Y.; Katsumata, K.; Iwasaki, T.; Sakai, S.; Yamaoka, K.; Watanabe, T.; Kunitomo, T.; Yamazaki, F.; Tsumura, N.; Nozaki, K.; Okubo, M.; Suzuki, S.; Hirata, N.; Zhang, H.; Thurber, C. H.

    2009-12-01

    Most slow slips have occurred in the deep transition zone from an unstable- to stable-slip regime. Detailed knowledge of the deep transition zone is essential for understanding the mechanism of slow slips and the process of stress concentration onto the source region of megathrust interplate earthquakes. We conducted a very dense seismic observation in the Tokai region, Japan, from April to August 2008 through a linear deployment of 75 portable stations. The array extended from the bottom part of the source region of the Tokai earthquake to the deep low-frequency earthquakes (LFEs, ~35 km depth), including the long-term slow-slip region (~25 km depth). Here we present high-resolution tomographic imaging of seismic velocities and highly accurate hypocenters, including LFEs, using first-arrival data from the dense seismograph deployment. We manually picked the first arrivals of P- and S-waves from each waveform for about 700 earthquakes, including about 20 LFEs, observed by the dense array. Then, we applied the TomoDD code [Zhang and Thurber, 2003] to the arrival data set, adding accurate double-difference data estimated by a waveform cross-correlation technique. A low-velocity (Vp, Vs) layer with high Poisson’s ratio is clearly imaged and tilts to the northwest at a low dip angle, corresponding to the subducting oceanic crust of the Philippine Sea slab. Seismicity within the oceanic crust is significantly low, with only a few earthquakes occurring there. The LFEs are linearly aligned along the top surface of the subducting oceanic crust at depths from 30 to 40 km. The Poisson’s ratio within the oceanic crust does not show a significant depth-dependent increase beneath the linear alignment of LFEs. This result argues against a depth section of Poisson’s ratio obtained in SW Japan [Shelly et al., 2006]. Beneath the LFEs, an active cluster of slab earthquakes is horizontally distributed. At the depths greater

  14. 78 FR 64973 - Scientific Earthquake Studies Advisory Committee (SESAC)

    Science.gov (United States)

    2013-10-30

    ... DEPARTMENT OF THE INTERIOR Geological Survey [GX14GG009950000] Scientific Earthquake Studies... Public Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next... Survey (USGS) on matters relating to the USGS's participation in the National Earthquake Hazards...

  15. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    Science.gov (United States)

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and

  16. NATURAL HAZARD ASSESSMENT OF SW MYANMAR - A CONTRIBUTION OF REMOTE SENSING AND GIS METHODS TO THE DETECTION OF AREAS VULNERABLE TO EARTHQUAKES AND TSUNAMI / CYCLONE FLOODING

    Directory of Open Access Journals (Sweden)

    George Pararas-Carayannis

    2009-01-01

    Full Text Available Myanmar, formerly Burma, is vulnerable to several natural hazards, such as earthquakes, cyclones, floods, tsunamis and landslides. The present study focuses on geomorphologic and geologic investigations of the south-western region of the country, based on satellite data (Shuttle Radar Topography Mission-SRTM, MODIS and LANDSAT). The main objective is to detect areas vulnerable to inundation by tsunami waves and cyclone surges. Since the region is also vulnerable to earthquake hazards, it is important to identify seismotectonic patterns, the location of major active faults, and local site conditions that may enhance ground motions and earthquake intensities. As illustrated by this study, linear topographic features related to subsurface tectonic features become clearly visible on SRTM-derived morphometric maps and on LANDSAT imagery. The GIS-integrated evaluation of LANDSAT and SRTM data helps identify areas most susceptible to flooding and inundation by tsunamis and storm surges. Additionally, land elevation maps help identify sites higher than 10 m in elevation that would be suitable for building protective tsunami/cyclone shelters.

  17. Clustered and transient earthquake sequences in mid-continents

    Science.gov (United States)

    Liu, M.; Stein, S. A.; Wang, H.; Luo, G.

    2012-12-01

    Earthquakes result from sudden release of strain energy on faults. On plate boundary faults, strain energy is constantly accumulating from steady and relatively rapid relative plate motion, so large earthquakes continue to occur so long as motion continues on the boundary. In contrast, such steady accumulation of strain energy does not occur on faults in mid-continents, because the far-field tectonic loading is not steadily distributed between faults, and because stress perturbations from complex fault interactions and other stress triggers can be significant relative to the slow tectonic stressing. Consequently, mid-continental earthquakes are often temporally clustered and transient, and spatially migrating. This behavior is well illustrated by large earthquakes in North China in the past two millennia, during which no single large earthquake repeated on the same fault segment, but moment release between large fault systems was complementary. Slow tectonic loading in mid-continents also causes long aftershock sequences. We show that the recent small earthquakes in the Tangshan region of North China are aftershocks of the 1976 Tangshan earthquake (M 7.5), rather than indicators of a new phase of seismic activity in North China, as many fear. Understanding the transient behavior of mid-continental earthquakes has important implications for assessing earthquake hazards. The sequence of large earthquakes in the New Madrid Seismic Zone (NMSZ) in the central US, which includes a cluster of M~7 events in 1811-1812 and perhaps a few similar ones in the past millennium, is likely a transient process, releasing previously accumulated elastic strain on recently activated faults. If so, this earthquake sequence will eventually end. Using simple analysis and numerical modeling, we show that the large NMSZ earthquakes may be ending now or in the near future.

  18. Investigating landslides caused by earthquakes - A historical review

    Science.gov (United States)

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  19. Seismic hazard analysis for Jayapura city, Papua

    International Nuclear Information System (INIS)

    Robiana, R.; Cipta, A.

    2015-01-01

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors estimated using geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds
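
    Probability-and-window statements like the one above map onto a return period through the Poisson relation; the quick check below shows the arithmetic only and does not assert which pairing the authors intended.

```python
import math

# Under a Poisson model, P = 1 - exp(-T / RP), so a quoted probability P over a
# window of T years fixes the return period RP of the mapped ground motion.
def return_period(prob, t_years):
    return -t_years / math.log(1.0 - prob)

print(f"10% in 50 yr  -> return period ~{return_period(0.10, 50):.0f} yr")
print(f"10% in 500 yr -> return period ~{return_period(0.10, 500):.0f} yr")
print(f" 2% in 50 yr  -> return period ~{return_period(0.02, 50):.0f} yr")
```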

  20. Seismic hazard analysis for Jayapura city, Papua

    Energy Technology Data Exchange (ETDEWEB)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A. [Geological Agency, Diponegoro Road No.57, Bandung, 40122 (Indonesia)

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors estimated using geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  1. Trimming the UCERF2 hazard logic tree

    Science.gov (United States)

    Porter, Keith A.; Field, Edward H.; Milner, Kevin

    2012-01-01

    The Uniform California Earthquake Rupture Forecast 2 (UCERF2) is a fully time‐dependent earthquake rupture forecast developed with sponsorship of the California Earthquake Authority (Working Group on California Earthquake Probabilities [WGCEP], 2007; Field et al., 2009). UCERF2 contains 480 logic‐tree branches reflecting choices among nine modeling uncertainties in the earthquake rate model shown in Figure 1. For seismic hazard analysis, it is also necessary to choose a ground‐motion‐prediction equation (GMPE) and set its parameters. Choosing among four next‐generation attenuation (NGA) relationships results in a total of 1920 hazard calculations per site. The present work is motivated by a desire to reduce the computational effort involved in a hazard analysis without understating uncertainty. We set out to assess which branching points of the UCERF2 logic tree contribute most to overall uncertainty, and which might be safely ignored (set to only one branch) without significantly biasing results or affecting some useful measure of uncertainty. The trimmed logic tree will have all of the original choices from the branching points that contribute significantly to uncertainty, but only one arbitrarily selected choice from the branching points that do not.
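
    A toy illustration of the trimming question posed above: which branching points contribute most to the spread in hazard, and which could be collapsed with little loss? The branch names, factors, and weights below are hypothetical, not the actual UCERF2 branches.

```python
import itertools
import numpy as np

# Three hypothetical binary branch points, each scaling a baseline hazard value.
branch_factors = {
    "deformation_model": [0.9, 1.1],
    "mag_area_relation": [0.8, 1.2],
    "gmpe":              [0.7, 1.3],
}
branch_weights = {name: [0.5, 0.5] for name in branch_factors}
baseline = 0.4   # hypothetical 2%-in-50-yr PGA, in g

names = list(branch_factors)
combos = list(itertools.product(*(range(len(branch_factors[n])) for n in names)))

def hazard(combo):
    value = baseline
    for name, choice in zip(names, combo):
        value *= branch_factors[name][choice]
    return value

def weight(combo):
    w = 1.0
    for name, choice in zip(names, combo):
        w *= branch_weights[name][choice]
    return w

values = np.array([hazard(c) for c in combos])
wts = np.array([weight(c) for c in combos])
full_var = np.average((values - np.average(values, weights=wts)) ** 2, weights=wts)

# Collapse one branching point at a time to its first option; points that leave
# the weighted variance nearly unchanged are candidates for trimming.
for i, name in enumerate(names):
    kept = [k for k, c in enumerate(combos) if c[i] == 0]
    v, w = values[kept], wts[kept] / wts[kept].sum()
    var = np.average((v - np.average(v, weights=w)) ** 2, weights=w)
    print(f"collapse {name:18s}: {var / full_var:.0%} of full-tree variance remains")
```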

  2. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not commonly understood in China when the M 7.9 Wenchuan earthquake struck Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue an imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save lives in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  3. Performance of USGS one-year earthquake hazard map for natural and induced seismicity in the central and eastern United States

    Science.gov (United States)

    Brooks, E. M.; Stein, S.; Spencer, B. D.; Salditch, L.; Petersen, M. D.; McNamara, D. E.

    2017-12-01

    Seismicity in the central United States has dramatically increased since 2008 due to the injection of wastewater produced by oil and gas extraction. In response, the USGS created a one-year probabilistic hazard model and map for 2016 to describe the increased hazard posed to the central and eastern United States. Using the intensity of shaking reported to the "Did You Feel It?" system during 2016, we assess the performance of this model. Assessing the performance of earthquake hazard maps for natural and induced seismicity is conceptually similar but has practical differences. Maps that have return periods of hundreds or thousands of years, as commonly used for natural seismicity, can be assessed using historical intensity data that also span hundreds or thousands of years. Several different features stand out when assessing the USGS 2016 seismic hazard model for the central and eastern United States from induced and natural earthquakes. First, the model can be assessed as a forecast in one year, because event rates are sufficiently high to permit evaluation with one year of data. Second, because these models are projections from the previous year and thus implicitly assume that fluid injection rates remain the same, misfit may reflect changes in human activity. Our results suggest that the model was very successful by the metric implicit in probabilistic seismic hazard assessment: namely, that the fraction of sites at which the maximum shaking exceeded the mapped value is comparable to that expected. The model also did well by a misfit metric that compares the spatial patterns of predicted and maximum observed shaking. This was true for both the central and eastern United States as a whole, and for the region within it with the highest amount of seismicity, Oklahoma and its surrounding area. The model performed least well in northern Texas, overstating hazard, presumably because lower oil and gas prices and regulatory action reduced the water injection volume
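
    A sketch of the exceedance-fraction metric described above, on synthetic data: for a map giving shaking with probability p of being exceeded in the year, roughly a fraction p of sites should observe maxima above the mapped value. The site distributions and variability below are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n_sites, p_map = 2000, 0.01

# Assume each site's annual-max shaking is lognormal; the mapped value is the
# (1 - p_map) quantile of that distribution at each site.
log_median = rng.normal(-1.0, 0.3, n_sites)   # site-specific median (log units)
sigma = 0.6                                   # assumed aleatory variability
mapped = np.exp(log_median + sigma * norm.ppf(1.0 - p_map))
observed = np.exp(log_median + sigma * rng.standard_normal(n_sites))  # one year of maxima

frac = np.mean(observed > mapped)
print(f"fraction of sites exceeding mapped value: {frac:.3f} (expected ~{p_map})")
```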

  4. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    Science.gov (United States)

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  5. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved the locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  6. Investigating Earthquake-induced Landslides - a Historical Review

    Science.gov (United States)

    Keefer, D. K.; Geological Survey, Us; Park, Menlo; Usa, Ca

    other anomalies in landslide occurrence. The documentation and synthesis of data on landslide occurrence in earthquakes have led to greatly increased understanding of the hazards associated with earthquake-induced landslides and to the development of models and methods for hazard mapping and evaluation. However, the number of earthquakes with relatively complete data on landslide occurrence is still small, and one of the most pressing research needs is for complete landslide inventories for many more events in a wider variety of environments. Such additional data, coupled with the increasing use of GIS and other current analytical tools, should lead to substantial additional refinements in models relating seismic shaking and geologic conditions to slope failure and thus to our ability to minimize damage and loss of life from seismically generated landslides.

  7. The California Hazards Institute

    Science.gov (United States)

    Rundle, J. B.; Kellogg, L. H.; Turcotte, D. L.

    2006-12-01

    California's abundant resources are linked with its natural hazards. Earthquakes, landslides, wildfires, floods, tsunamis, volcanic eruptions, severe storms, fires, and droughts afflict the state regularly. These events have the potential to become great disasters, like the San Francisco earthquake and fire of 1906, that overwhelm the capacity of society to respond. At such times, the fabric of civic life is frayed, political leadership is tested, economic losses can dwarf available resources, and full recovery can take decades. A patchwork of Federal, state and local programs is in place to address individual hazards, but California lacks effective coordination to forecast, prevent, prepare for, mitigate, respond to, and recover from the harmful effects of natural disasters. Moreover, we do not know enough about the frequency, size, time, or locations where they may strike, nor about how the natural environment and man-made structures would respond. As California's population grows and becomes more interdependent, even moderate events have the potential to trigger catastrophes. Natural hazards need not become natural disasters if they are addressed proactively and effectively, rather than reactively. The University of California, with 10 campuses distributed across the state, has world-class faculty and students engaged in research and education in all fields of direct relevance to hazards. For that reason, the UC can become a world leader in anticipating and managing natural hazards in order to prevent loss of life and property and degradation of environmental quality. The University of California, Office of the President, has therefore established a new system-wide Multicampus Research Project, the California Hazards Institute (CHI), as a mechanism to research innovative, effective solutions for California. The CHI will build on the rich intellectual capital and expertise of the Golden State to provide the best available science, knowledge and tools for

  8. Progress in Understanding the Pre-Earthquake Associated Events by Analyzing IR Satellite Data

    Science.gov (United States)

    Ouzounov, Dimitar; Taylor, Patrick; Bryant, Nevin

    2004-01-01

    We present the latest results in understanding the potential relationship between tectonic stress, electro-chemical and thermodynamic processes in the Earth's crust and atmosphere and an increase in IR flux as a potential signature of electromagnetic (EM) phenomena related to earthquake activity, whether pre-, co- or post-seismic. Thermal infra-red (TIR) surveys performed by the polar orbiting (NOAA/AVHRR, MODIS) and geosynchronous weather satellites (GOES, METEOSAT) gave an indication of the appearance (from days to weeks before the event) of "anomalous" space-time TIR transients that are associated with the location (epicenter and local tectonic structures) and time of a number of major earthquakes with M>5 and focal depths less than 50 km. We analyzed a broad category of associated pre-earthquake events, which provided evidence for changes in surface temperature, surface latent heat flux, chlorophyll concentration, soil moisture, brightness temperature, surface emissivity, and atmospheric water vapour prior to earthquakes that occurred in Algeria, India, Iran, Italy, Mexico and Japan. The cause of such anomalies has been mainly related to the change of near-surface thermal properties due to complex lithosphere-hydrosphere-atmosphere interactions. As final results, we present examples from the most recent (2000-2004) strong earthquakes worldwide, the techniques used to capture the signatures of EM-emission-related mid-IR anomalies, and a methodology for the practical future use of such phenomena in early warning systems.

  9. Subduction zone and crustal dynamics of western Washington; a tectonic model for earthquake hazards evaluation

    Science.gov (United States)

    Stanley, Dal; Villaseñor, Antonio; Benz, Harley

    1999-01-01

    The Cascadia subduction zone is extremely complex in the western Washington region, involving local deformation of the subducting Juan de Fuca plate and complicated block structures in the crust. It has been postulated that the Cascadia subduction zone could be the source for a large thrust earthquake, possibly as large as M9.0. Large intraplate earthquakes from within the subducting Juan de Fuca plate beneath the Puget Sound region have accounted for most of the energy release in this century and future such large earthquakes are expected. Added to these possible hazards is clear evidence for strong crustal deformation events in the Puget Sound region near faults such as the Seattle fault, which passes through the southern Seattle metropolitan area. In order to understand the nature of these individual earthquake sources and their possible interrelationship, we have conducted an extensive seismotectonic study of the region. We have employed P-wave velocity models developed using local earthquake tomography as a key tool in this research. Other information utilized includes geological, paleoseismic, gravity, magnetic, magnetotelluric, deformation, seismicity, focal mechanism and geodetic data. Neotectonic concepts were tested and augmented through use of anelastic (creep) deformation models based on thin-plate, finite-element techniques developed by Peter Bird, UCLA. These programs model anelastic strain rate, stress, and velocity fields for given rheological parameters, variable crust and lithosphere thicknesses, heat flow, and elevation. Known faults in western Washington and the main Cascadia subduction thrust were incorporated in the modeling process. Significant results from the velocity models include delineation of a previously studied arch in the subducting Juan de Fuca plate. The axis of the arch is oriented in the direction of current subduction and asymmetrically deformed due to the effects of a northern buttress mapped in the velocity models. This

  10. Seismic hazard, risk, and design for South America

    Science.gov (United States)

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 yrs. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. Resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from effects of future earthquake strong ground shaking. National models should be developed by scientists and engineers in each country using the best

  11. Investigating Landslides Caused by Earthquakes - A Historical Review

    Science.gov (United States)

    Keefer, David K.

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  12. Challenges to communicate risks of human-caused earthquakes

    Science.gov (United States)

    Klose, C. D.

    2014-12-01

    Awareness of natural hazards has been increasing in recent years. In particular, this is true for earthquakes, which are increasing in frequency and magnitude in regions that normally do not experience seismic activity. In fact, one of the major concerns for many communities and businesses is that humans today seem to cause earthquakes due to large-scale shale gas production, dewatering and flooding of mines and deep geothermal power production. Accordingly, without opposing any of these technologies, it should be a priority of earth scientists who are researching natural hazards to communicate earthquake risks. This presentation discusses the challenges that earth scientists face in properly communicating earthquake risks, in light of the fact that human-caused earthquakes are an environmental change affecting only some communities and businesses. Communication channels may range from research papers, books and classroom lectures to outreach events and programs, popular media events or even social media networks.

  13. Design and implementation of a voluntary collective earthquake insurance policy to cover low-income homeowners in a developing country

    OpenAIRE

    Marulanda, M.; Cardona, O.; Mora, Miguel; Barbat, Alex

    2018-01-01

    Understanding and evaluating disaster risk due to natural hazard events such as earthquakes creates powerful incentives for countries to develop planning options and tools to reduce potential damages. The use of models for earthquake risk evaluation allows obtaining outputs such as the loss exceedance curve, the expected annual loss and the probable maximum loss, which are probabilistic metrics useful for risk analyses, for designing strategies for risk reduction and mitigation, for emergency...
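
    As a rough illustration of one of these metrics, the sketch below computes an expected annual loss as the area under a tabulated loss exceedance curve; the curve values are hypothetical and are not taken from the study.

      import numpy as np

      # Hypothetical loss exceedance curve for a building portfolio: annual
      # exceedance rate versus loss (USD). Values are illustrative only.
      loss = np.array([1e6, 5e6, 1e7, 5e7, 1e8, 5e8])
      annual_exceedance_rate = np.array([0.2, 0.05, 0.02, 0.004, 0.001, 0.0001])

      # The expected annual loss (EAL) is the area under the exceedance curve,
      # i.e. the integral of the annual exceedance rate with respect to loss.
      eal = np.trapz(annual_exceedance_rate, loss)
      print(f"Expected annual loss: {eal:,.0f} USD")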

  14. A New Seismic Hazard Model for Mainland China

    Science.gov (United States)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.

    2017-12-01

    We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. The resulting new seismic hazard maps can be used for seismic risk analysis and management.
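
    For readers unfamiliar with the tapered Gutenberg-Richter (TGR) distribution mentioned above, the following sketch evaluates TGR cumulative rates in seismic moment space; the zone parameters (rate above the minimum magnitude, b-value, corner magnitude) are illustrative assumptions, not values from this model.

      import numpy as np

      def moment(mw):
          """Seismic moment in N*m from moment magnitude."""
          return 10.0 ** (1.5 * mw + 9.05)

      def tgr_cumulative_rate(mw, rate_min, mw_min, b_value, corner_mw):
          """Annual rate of events with magnitude >= mw for a tapered
          Gutenberg-Richter (tapered Pareto in seismic moment) distribution."""
          beta = 2.0 / 3.0 * b_value
          m, m_min, m_c = moment(mw), moment(mw_min), moment(corner_mw)
          return rate_min * (m / m_min) ** (-beta) * np.exp((m_min - m) / m_c)

      # Illustrative source-zone parameters (not the paper's values):
      # 5 events/yr above Mw 5, b = 0.9, corner magnitude Mw 8.0.
      for mw in (6.0, 7.0, 8.0, 8.5):
          r = tgr_cumulative_rate(mw, rate_min=5.0, mw_min=5.0, b_value=0.9, corner_mw=8.0)
          print(f"Mw >= {mw}: {r:.5f} events/yr (return period {1/r:,.0f} yr)")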

  15. Evaluation of Earthquake-Induced Effects on Neighbouring Faults and Volcanoes: Application to the 2016 Pedernales Earthquake

    Science.gov (United States)

    Bejar, M.; Alvarez Gomez, J. A.; Staller, A.; Luna, M. P.; Perez Lopez, R.; Monserrat, O.; Chunga, K.; Herrera, G.; Jordá, L.; Lima, A.; Martínez-Díaz, J. J.

    2017-12-01

    It has long been recognized that earthquakes change the stress in the upper crust around the fault rupture and can influence the short-term behaviour of neighbouring faults and volcanoes. Rapid estimates of these stress changes can provide the authorities managing the post-disaster situation with a useful tool to identify and monitor potential threats and to update the estimates of seismic and volcanic hazard in a region. Space geodesy is now routinely used following an earthquake to image the displacement of the ground and estimate the rupture geometry and the distribution of slip. Using the obtained source model, it is possible to evaluate the remaining moment deficit and to infer the stress changes on nearby faults and volcanoes produced by the earthquake, which can be used to identify which faults and volcanoes are brought closer to failure or activation. Although these procedures are commonly used today, the transfer of these results to the authorities managing the post-disaster situation is not straightforward, and thus their usefulness is reduced in practice. Here we propose a methodology to evaluate the potential influence of an earthquake on nearby faults and volcanoes and to create easy-to-understand maps for decision-making support after an earthquake. We apply this methodology to the Mw 7.8, 2016 Ecuador earthquake. Using Sentinel-1 SAR and continuous GPS data, we measure the coseismic ground deformation and estimate the distribution of slip. Then we use this model to evaluate the moment deficit on the subduction interface and the changes of stress on the surrounding faults and volcanoes. The results are compared with the seismic and volcanic events that have occurred after the earthquake. We discuss the potential and limits of the methodology and the lessons learnt from discussions with local authorities.
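
    A minimal sketch of the Coulomb failure stress change that underlies such assessments is given below. The sign convention (normal stress change positive for unclamping), the effective friction value, and the stress numbers are assumptions for illustration and are not from the Pedernales analysis.

      def coulomb_stress_change(d_shear_mpa, d_normal_mpa, effective_friction=0.4):
          """Coulomb failure stress change (MPa) on a receiver fault.

          d_shear_mpa : shear stress change resolved in the slip direction
          d_normal_mpa: normal stress change (positive = unclamping)
          A positive result means the fault is brought closer to failure.
          """
          return d_shear_mpa + effective_friction * d_normal_mpa

      # Illustrative stress changes resolved on two hypothetical receivers
      # (values are not from the Pedernales study).
      receivers = {"crustal fault A": (0.15, -0.05), "volcano feeder B": (-0.02, 0.10)}
      for name, (d_tau, d_sigma_n) in receivers.items():
          dcfs = coulomb_stress_change(d_tau, d_sigma_n)
          verdict = "closer to failure" if dcfs > 0 else "relaxed"
          print(f"{name}: dCFS = {dcfs:+.3f} MPa -> {verdict}")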

  16. Microzonation of Seismic Hazard Potential in Taipei, Taiwan

    Science.gov (United States)

    Liu, K. S.; Lin, Y. P.

    2017-12-01

    The island of Taiwan lies at the boundary between the Philippine Sea plate and the Eurasia plate. Accordingly, the majority of seismic energy release near Taiwan originates from the two subduction zones. It is therefore not surprising that Taiwan has repeatedly been struck by large earthquakes such as the 1986 Hualien, 1999 Chi-Chi, and 2002 Hualien earthquakes. Microzonation of seismic hazard potential has become necessary in Taipei City because the Central Geological Survey has designated the Sanchiao active fault as Category II. In this study, a catalog of more than 2000 shallow earthquakes that occurred from 1900 to 2015 with Mw magnitudes ranging from 5.0 to 8.2, together with 11 disastrous earthquakes that occurred from 1683 to 1899 and the nearby Sanchiao active fault, is used to estimate the seismic hazard potential in Taipei City for seismic microzonation. Furthermore, the probabilities of seismic intensity exceeding CWB intensity 5, 6, 7 and MMI VI, VII, VIII in 10-, 30-, and 50-year periods in the above areas are also analyzed for the seismic microzonation. Finally, by comparison with the seismic zoning map of Taiwan in the current building code, which was revised after the 1999 Chi-Chi (921) earthquake, the results of this study show which areas of Taipei City have higher earthquake hazard potential. They provide a valuable database for the seismic design of critical facilities, will help mitigate earthquake disaster losses in Taipei City in the future, and provide critical information for emergency response plans.
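
    The probabilities quoted above follow from the standard Poisson assumption; the snippet below shows the conversion from an annual exceedance rate to probabilities over 10-, 30-, and 50-year windows, with a purely illustrative rate.

      import numpy as np

      def poisson_exceedance_probability(annual_rate, years):
          """P(at least one exceedance in `years`) for a Poisson process."""
          return 1.0 - np.exp(-annual_rate * years)

      # Illustrative annual rate of exceeding CWB intensity 5 at a site
      # (once per 50 years on average; not a value from the study).
      annual_rate = 0.02
      for t in (10, 30, 50):
          p = poisson_exceedance_probability(annual_rate, t)
          print(f"P(exceedance in {t} yr) = {p:.2f}")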

  17. Natural hazards and risk reduction in Hawai'i: Chapter 10 in Characteristics of Hawaiian volcanoes

    Science.gov (United States)

    Kauahikaua, James P.; Tilling, Robert I.; Poland, Michael P.; Takahashi, T. Jane; Landowski, Claire M.

    2014-01-01

    Significant progress has been made over the past century in understanding, characterizing, and communicating the societal risks posed by volcanic, earthquake, and tsunami hazards in Hawai‘i. The work of the Hawaiian Volcano Observatory (HVO), with a century-long commitment to serving the public with credible hazards information, contributed substantially to this global progress. Thomas A. Jaggar, Jr., HVO’s founder, advocated that a scientific approach to understanding these hazards would result in strategies to mitigate their damaging effects. The resultant hazard-reduction methods range from prediction of eruptions and tsunamis, thereby providing early warnings for timely evacuation (if needed), to diversion of lava flows away from high-value infrastructure, such as hospitals. In addition to long-term volcano monitoring and multifaceted studies to better understand eruptive and seismic phenomena, HVO has continually and effectively communicated—through its publications, Web site, and public education/outreach programs—hazards information to emergency-management authorities, news media, and the public.

  18. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    Science.gov (United States)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto activated with the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probability calculations, often used for aftershock hazard assessment, which rest on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ≈ 1. Then, there is good agreement with the OU law with p ≈ 0.5, which indicates that the slow decay was notably significant. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
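
    A simplified sketch of how the GR and OU laws combine into such probability estimates is shown below; apart from b ≈ 1 and p ≈ 0.5, all parameters (productivity k, c-value, reference magnitude, time window) are hypothetical.

      import numpy as np

      def omori_expected_count(k, c, p, t1, t2):
          """Expected number of aftershocks between t1 and t2 (days after the
          mainshock) from the Omori-Utsu rate n(t) = k / (t + c)**p."""
          if np.isclose(p, 1.0):
              return k * (np.log(t2 + c) - np.log(t1 + c))
          return k / (1.0 - p) * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p))

      def prob_large_event(k, c, p, b, m_ref, m_target, t1, t2):
          """Probability of at least one M >= m_target event in [t1, t2], assuming
          the OU parameters describe M >= m_ref events and the GR law scales the
          expected count to larger magnitudes (Poisson probability)."""
          n_ref = omori_expected_count(k, c, p, t1, t2)
          n_target = n_ref * 10.0 ** (-b * (m_target - m_ref))
          return 1.0 - np.exp(-n_target)

      # Illustrative parameters (only b ~ 1 and p ~ 0.5 follow the abstract).
      for m_target in (6.0, 7.0):
          prob = prob_large_event(k=5.0, c=0.1, p=0.5, b=1.0, m_ref=4.0,
                                  m_target=m_target, t1=440.0, t2=440.0 + 3 * 365.0)
          print(f"P(M>={m_target:.0f} in the next 3 yr) = {prob:.2f}")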

  19. A Case Study of Geologic Hazards Affecting School Buildings: Evaluating Seismic Structural Vulnerability and Landslide Hazards at Schools in Aizawl, India

    Science.gov (United States)

    Perley, M. M.; Guo, J.

    2016-12-01

    India's National School Safety Program (NSSP) aims to assess all government schools in earthquake prone regions of the country. To supplement the Mizoram State Government's recent survey of 141 government schools, we screened an additional 16 private and 4 government schools for structural vulnerabilities due to earthquakes, as well as landslide hazards, in Mizoram's capital of Aizawl. We developed a geomorphologically derived landslide susceptibility matrix, which was cross-checked with Aizawl Municipal Corporation's landslide hazard map (provided by Lettis Consultants International), to determine the geologic hazards at each school. Our research indicates that only 7% of the 22 assessed school buildings are located within low landslide hazard zones; 64% of the school buildings, with approximately 9,500 students, are located within very high or high landslide hazard zones. Rapid Visual Screening (RVS) was used to determine the structural earthquake vulnerability of each school building. RVS is an initial vulnerability assessment procedure used to inventory and rank buildings that may be hazardous during an earthquake. Our study indicates that all of the 22 assessed school buildings have a damageability rating of Grade 3 or higher on the 5-grade EMS scale, suggesting a significant vulnerability and potential for damage in buildings, ranging from widespread cracking of columns and beam column joints to collapse. Additionally, 86% of the schools we visited had reinforced concrete buildings constructed before Aizawl's building regulations were passed in 2007, which can be assumed to lack appropriate seismic reinforcement. Using our findings, we will give recommendations to the Government of Mizoram to prevent unnecessary loss of life by minimizing each school's landslide risk and ensuring schools are earthquake-resistant.

  20. Tiechanshan-Tunghsiao anticline earthquake analysis: Implications for northwestern Taiwan potential carbon dioxide storage site seismic hazard

    Directory of Open Access Journals (Sweden)

    Ruey-Juin Rau

    2017-01-01

    Full Text Available We analyze the seismicity and earthquake focal mechanisms beneath the Tiechanshan-Tunghsiao (TCS-TH anticline over the last two decades for seismic hazard evaluation of a potential carbon dioxide storage site in northwestern Taiwan. Seismicity in the TCS-TH anticline indicates both spatial and temporal clustering at a depth range of 7 - 12 km. Thirteen 3.0 ≤ ML ≤ 5.2 earthquake focal mechanisms show a combination of thrust, strike-slip, and normal faulting mechanisms under the TCS-TH anticline. A 1992 ML 5.2 earthquake with a focal depth of ~10 km, the largest event ever recorded beneath the TCS-TH anticline during the last two decades, has a normal fault mechanism with the T-axis trending NNE-SSW and nodal planes oriented NNW-SSE, dipping either gently to the NNE or steeply to the SSW. Thrust fault mechanisms that occurred with mostly E-W or NWW-SEE striking P-axes and strike-slip faulting events indicate NWW-SEE striking P-axes and NNE-SSW trending T-axes, which are consistent with the regional plate convergence direction. For the strike-slip faulting events, if we take the N-S or NNW-SSE striking nodal planes as the fault planes, the strike-slip faults are sinistral motions and correspond to the Tapingting fault, which is a strike-slip fault reactivated from the inherited normal fault and intersects the Tiechanshan and Tunghsiao anticlines.

  1. Keeping pace with the science: Seismic hazard analysis in the western United States

    International Nuclear Information System (INIS)

    Youngs, R.R.; Coppersmith, K.J.

    1989-01-01

    Recent years have witnessed rapid advances in the understanding of the earthquake generation process in the western US, with particular emphasis on geologic studies of fault behavior and seismologic studies of the rupture process. The authors discuss how probabilistic seismic hazard analysis (PSHA) methodologies have been refined to keep pace with scientific understanding. Identified active faults are modeled as three-dimensional surfaces with the rupture shape and distribution of nucleation points estimated from physical constraints and seismicity. Active blind thrust ramps at depth and sources associated with subduction zones such as the Cascadia zone off Oregon and Washington can also be modeled. Maximum magnitudes are typically estimated from evaluations of possible rupture dimensions and empirical relations between these dimensions and earthquake magnitude. A rapidly evolving technique for estimating the length of future ruptures on a fault is termed segmentation, and incorporates behavior and geometric fault characteristics. To extend the short historical record, fault slip rate is now commonly used to constrain earthquake recurrence. Paleoseismic studies of fault behavior have led to the characteristic earthquake recurrence model specifying the relative frequency of earthquakes of various sizes. Recent studies have indicated the importance of faulting style and crustal structure on earthquake ground motions. For site-specific applications, empirical estimation techniques are being supplemented with numerical modeling approaches
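
    As an illustration of the empirical rupture-dimension approach mentioned above, the sketch below estimates a magnitude from rupture area using a regression of the Wells and Coppersmith (1994) form; the coefficients are approximate and the fault dimensions are hypothetical, so the published tables should be consulted for actual work.

      import math

      def magnitude_from_rupture_area(area_km2, a=4.07, b=0.98):
          """Moment magnitude from rupture area (km^2) using an empirical
          regression of the form M = a + b * log10(A). The coefficients shown
          are approximate values of the Wells & Coppersmith (1994) type."""
          return a + b * math.log10(area_km2)

      # A hypothetical fault segment: 60 km long, 15 km seismogenic width.
      area = 60.0 * 15.0
      print(f"Rupture area {area:.0f} km^2 -> Mw ~ {magnitude_from_rupture_area(area):.1f}")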

  2. Earthquakes, Cities, and Lifelines: lessons integrating tectonics, society, and engineering in middle school Earth Science

    Science.gov (United States)

    Toke, N.; Johnson, A.; Nelson, K.

    2010-12-01

    Earthquakes are one of the most widely covered geologic processes by the media. As a result students, even at the middle school level, arrive in the classroom with preconceptions about the importance and hazards posed by earthquakes. Therefore earthquakes represent not only an attractive topic to engage students when introducing tectonics, but also a means to help students understand the relationships between geologic processes, society, and engineering solutions. Facilitating understanding of the fundamental connections between science and society is important for the preparation of future scientists and engineers as well as informed citizens. Here, we present a week-long lesson designed to be implemented in five one hour sessions with classes of ~30 students. It consists of two inquiry-based mapping investigations, motivational presentations, and short readings that describe fundamental models of plate tectonics, faults, and earthquakes. The readings also provide examples of engineering solutions such as the Alaskan oil pipeline which withstood multi-meter surface offset in the 2002 Denali Earthquake. The first inquiry-based investigation is a lesson on tectonic plates. Working in small groups, each group receives a different world map plotting both topography and one of the following data sets: GPS plate motion vectors, the locations and types of volcanoes, the location of types of earthquakes. Using these maps and an accompanying explanation of the data each group’s task is to map plate boundary locations. Each group then presents a ~10 minute summary of the type of data they used and their interpretation of the tectonic plates with a poster and their mapping results. Finally, the instructor will facilitate a class discussion about how the data types could be combined to understand more about plate boundaries. Using student interpretations of real data allows student misconceptions to become apparent. Throughout the exercise we record student preconceptions

  3. Safety Aspects of Sustainable Storage Dams and Earthquake Safety of Existing Dams

    Directory of Open Access Journals (Sweden)

    Martin Wieland

    2016-09-01

    Full Text Available The basic element in any sustainable dam project is safety, which includes the following safety elements: ① structural safety, ② dam safety monitoring, ③ operational safety and maintenance, and ④ emergency planning. Long-term safety primarily includes the analysis of all hazards affecting the project; that is, hazards from the natural environment, hazards from the man-made environment, and project-specific and site-specific hazards. The special features of the seismic safety of dams are discussed. Large dams were the first structures to be systematically designed against earthquakes, starting in the 1930s. However, the seismic safety of older dams is unknown, as most were designed using seismic design criteria and methods of dynamic analysis that are considered obsolete today. Therefore, we need to reevaluate the seismic safety of existing dams based on current state-of-the-art practices and rehabilitate deficient dams. For large dams, a site-specific seismic hazard analysis is usually recommended. Today, large dams and the safety-relevant elements used for controlling the reservoir after a strong earthquake must be able to withstand the ground motions of a safety evaluation earthquake. The ground motion parameters can be determined either by a probabilistic or a deterministic seismic hazard analysis. During strong earthquakes, inelastic deformations may occur in a dam; therefore, the seismic analysis has to be carried out in the time domain. Furthermore, earthquakes create multiple seismic hazards for dams such as ground shaking, fault movements, mass movements, and others. The ground motions needed by the dam engineer are not real earthquake ground motions but models of the ground motion, which allow the safe design of dams. It must also be kept in mind that dam safety evaluations must be carried out several times during the long life of large storage dams. These features are discussed in this paper.

  4. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.

  5. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
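
    A minimal sketch of fitting a truncated exponential distribution to slip values is given below; it uses synthetic data, SciPy for the one-dimensional optimization, and a simple maximum-likelihood estimate, and is not the estimation procedure used in the paper.

      import numpy as np
      from scipy.optimize import minimize_scalar

      def truncated_exp_logpdf(x, scale, x_max):
          """Log-pdf of an exponential distribution truncated at x_max."""
          norm = 1.0 - np.exp(-x_max / scale)
          return -x / scale - np.log(scale * norm)

      def fit_truncated_exponential(slip, x_max):
          """Maximum-likelihood estimate of the scale parameter."""
          nll = lambda s: -np.sum(truncated_exp_logpdf(slip, s, x_max))
          res = minimize_scalar(nll, bounds=(1e-3, 10 * slip.max()), method="bounded")
          return res.x

      # Synthetic "slip" sample (metres) for illustration only.
      rng = np.random.default_rng(0)
      x_max = 8.0                               # physical upper bound on slip
      sample = rng.exponential(scale=1.5, size=2000)
      sample = sample[sample < x_max]           # truncate the synthetic data

      scale_hat = fit_truncated_exponential(sample, x_max)
      print(f"Fitted scale parameter: {scale_hat:.2f} m (true value 1.5 m)")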

  6. Sediment gravity flows triggered by remotely generated earthquake waves

    Science.gov (United States)

    Johnson, H. Paul; Gomberg, Joan S.; Hautala, Susan L.; Salmi, Marie S.

    2017-06-01

    Recent great earthquakes and tsunamis around the world have heightened awareness of the inevitability of similar events occurring within the Cascadia Subduction Zone of the Pacific Northwest. We analyzed seafloor temperature, pressure, and seismic signals, and video stills of sediment-enveloped instruments recorded during the 2011-2015 Cascadia Initiative experiment, and seafloor morphology. Our results led us to suggest that thick accretionary prism sediments amplified and extended seismic wave durations from the 11 April 2012 Mw8.6 Indian Ocean earthquake, located more than 13,500 km away. These waves triggered a sequence of small slope failures on the Cascadia margin that led to sediment gravity flows culminating in turbidity currents. Previous studies have related the triggering of sediment-laden gravity flows and turbidite deposition to local earthquakes, but this is the first study in which the originating seismic event is extremely distant (> 10,000 km). The possibility of remotely triggered slope failures that generate sediment-laden gravity flows should be considered in inferences of recurrence intervals of past great Cascadia earthquakes from turbidite sequences. Future similar studies may provide new understanding of submarine slope failures and turbidity currents and the hazards they pose to seafloor infrastructure and tsunami generation in regions both with and without local earthquakes.

  7. Earthquakes, detecting and understanding them

    International Nuclear Information System (INIS)

    2008-05-01

    The surface of the Earth is continually changing on a geological timescale. The tectonic plates, which make up this surface, are moving in relation to each other. On a human timescale, these movements take place through earthquakes, which suddenly release energy accumulated over a period of time. The vibrations they produce propagate through the interior of the Earth: these are seismic waves. However, other phenomena can generate seismic waves, such as volcanoes, quarry blasts, etc. The surf of ocean waves on the coasts, the wind in the trees and human activity (industry and road traffic) all contribute to the 'seismic background noise'. Sensors are able to detect signals from events, which are then discriminated, analyzed and located. Earthquakes and active volcanoes are not distributed randomly over the surface of the globe: they mainly coincide with mountain chains and ocean trenches and ridges. 'An earthquake results from the abrupt release of the energy accumulated by movements and rubbing of different plates'. The study of the propagation of seismic waves has made it possible to determine the outline of the plates inside the Earth and has highlighted their movements. There are seven major plates which are colliding, diverging or sliding past each other. Each year the continents move several centimeters with respect to one another. This process, known as 'continental drift', was finally explained by plate tectonics. The initial hypothesis for this science dates from the beginning of the 20th century, but it was not confirmed until the 1960s. It explains that convection inside the Earth is the source of the forces required for these movements. This science, as well as explaining these great movements, has provided a coherent, unifying and quantitative framework, which unites the explanations for all the geophysical phenomena under one mechanism. (authors)

  8. An Arduino project to record ground motion and to learn on earthquake hazard at high school

    Science.gov (United States)

    Saraò, Angela; Barnaba, Carla; Clocchiatti, Marco; Zuliani, David

    2015-04-01

    Through multidisciplinary work that integrates technology education with Earth sciences, we implemented an educational program to raise the students' awareness of seismic hazard and to disseminate good practices of earthquake safety. Using free software and low-cost open hardware, the students of a senior class of the high school Liceo Paschini in Tolmezzo (NE Italy) implemented a seismograph using the Arduino open-source electronics platform and ADXL345 sensors to emulate a low-cost seismometer (e.g. the O-NAVI sensor of the Quake-Catcher Network, http://qcn.stanford.edu). To accomplish their task, the students were directed to use web resources for technical support and troubleshooting. Shell scripts, running on local computers under Linux OS, controlled the process of recording and displaying data. The main part of the experiment was documented using the DokuWiki style. Some propaedeutic lessons in computer science and electronics were needed to build up the necessary skills of the students and to fill gaps in their background knowledge. In addition, lectures by seismologists and laboratory activities allowed the class to explore different aspects of earthquake physics, particularly seismic waves, and to become familiar with the topics of seismic hazard through inquiry-based learning. The resulting Arduino seismograph can be used for educational purposes and can display tremors on the school's local network. It can record the ground motion from a seismic event occurring in the area, but further improvements are necessary for a quantitative analysis of the recorded signals.
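
    On the computer side, a logging script along the lines sketched below could store the streamed samples; it assumes, purely for illustration, that the Arduino writes comma-separated x,y,z acceleration values over USB serial (the port name and data format are hypothetical, and the pyserial package is required). Stop the logger with Ctrl-C.

      import time
      import serial  # pyserial

      PORT, BAUD = "/dev/ttyACM0", 115200  # hypothetical port; adjust for your setup

      with serial.Serial(PORT, BAUD, timeout=1) as ser, open("quake_log.csv", "w") as log:
          log.write("unix_time,x,y,z\n")
          while True:
              line = ser.readline().decode("ascii", errors="ignore").strip()
              if line.count(",") == 2:                 # keep only well-formed samples
                  log.write(f"{time.time():.3f},{line}\n")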

  9. Integrating Caribbean Seismic and Tsunami Hazard into Public Policy and Action

    Science.gov (United States)

    von Hillebrandt-Andrade, C.

    2012-12-01

    processes. For example, earthquake and tsunami exercises are conducted separately, without taking into consideration the compounding effects. Recognizing this deficiency, the UNESCO IOC Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS), which was established in 2005, decided to include the tsunami and earthquake impacts for the upcoming March 20, 2013 regional CARIBE WAVE/LANTEX tsunami exercise. In addition to the tsunami wave heights predicted by the National Weather Service Tsunami Warning Centers in Alaska and Hawaii, the USGS PAGER and ShakeMap results for the M8.5 scenario earthquake in the southern Caribbean were also integrated into the manual. Additionally, in recent catastrophic planning for Puerto Rico, FEMA requested that local researchers determine both the earthquake and tsunami impacts for the same source. In the US, despite the fact that the lead for earthquakes and tsunamis lies within two different agencies, USGS and NOAA/NWS, it has been very beneficial that the National Tsunami Hazard Mitigation Program partnership includes both agencies. By working together, the seismic and tsunami communities can achieve an even better understanding of the hazards, but also foster more actions on behalf of government officials and the populations at risk.

  10. Imaging and Understanding Foreshock and Aftershock Behavior Around the 2014 Iquique, Northern Chile, Earthquake

    Science.gov (United States)

    Yang, H.; Meng, X.; Peng, Z.; Newman, A. V.; Hu, S.; Williamson, A.

    2014-12-01

    On April 1st, 2014, a moment magnitude (MW) 8.2 earthquake occurred offshore Iquique, Northern Chile. There were numerous smaller earthquakes preceding and following the mainshock, making it an ideal case to study the spatio-temporal relation among these events and their association with the mainshock. We applied a matched-filter technique to detect previously missing foreshocks and aftershocks of the 2014 Iquique earthquake. Using more than 900 template events recorded by 19 broadband seismic stations (network code CX) operated by the GEOFON Program of GFZ Potsdam, we found 4392 earthquakes between March 1st and April 3rd, 2014, including more than 30 earthquakes with magnitude larger than 4 that were previously missed in the catalog from the Chile National Seismological Center. Additionally, we found numerous small earthquakes with magnitudes between 1 and 2 preceding the largest foreshock, an MW 6.7 event occurring on March 16th, approximately 2 weeks before the Iquique mainshock. We observed that the foreshocks migrated northward at a speed of approximately 6 km/day. Using a finite fault slip model of the mainshock determined from teleseismic waveform inversion (Hayes, 2014), we calculated the Coulomb stress changes in regions near the mainshock. We found that there was a ~200% increase in seismicity in the areas with increased Coulomb stress. Our next step is to evaluate the Coulomb stress changes associated with earlier foreshocks and their roles in triggering later foreshocks, and possibly the mainshock. For this, we plan to create a fault model of the temporal evolution of the Coulomb stress along the interface, assuming Wells and Coppersmith (1994)-type fault parameters. These results will be compared with double-difference relocations (using HypoDD) to present a more accurate understanding of the spatio-temporal evolution of foreshocks and aftershocks of the 2014 Iquique earthquake.
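
    The core of a matched-filter detector of this kind is a sliding normalized cross-correlation of each template against the continuous data, with detections declared above a threshold (often a multiple of the median absolute deviation). The sketch below demonstrates the idea on synthetic data; it is not the processing code used in this study.

      import numpy as np

      def normalized_xcorr(template, data):
          """Sliding normalized cross-correlation of `template` against `data`."""
          n = len(template)
          t = (template - template.mean()) / (template.std() * n)
          cc = np.empty(len(data) - n + 1)
          for i in range(len(cc)):
              w = data[i:i + n]
              std = w.std()
              cc[i] = 0.0 if std == 0 else np.sum(t * (w - w.mean())) / std
          return cc

      # Synthetic example: bury two copies of a wavelet in noise, then detect them.
      rng = np.random.default_rng(1)
      wavelet = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
      data = rng.normal(0, 0.3, 5000)
      for start in (1200, 3700):
          data[start:start + 100] += wavelet

      cc = normalized_xcorr(wavelet, data)
      threshold = 8 * np.median(np.abs(cc - np.median(cc)))   # 8 * MAD
      detections = np.where(cc > threshold)[0]
      print("Detection indices:", detections)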

  11. Seismic hazard assessment of the Hanford region, Eastern Washington State

    International Nuclear Information System (INIS)

    Youngs, R.R.; Coppersmith, K.J.; Power, M.S.; Swan, F.H. III

    1985-01-01

    A probabilistic seismic hazard assessment was made for a site within the Hanford region of eastern Washington state, which is characterized as an intraplate region having a relatively low rate of seismic activity. Probabilistic procedures, such as logic trees, were utilized to account for the uncertainties in identifying and characterizing the potential seismic sources in the region. Logic trees provide a convenient, flexible means of assessing the values and relative likelihoods of input parameters to the hazard model that may be dependent upon each other. Uncertainties accounted for in this way include the tectonic model, segmentation, capability, fault geometry, maximum earthquake magnitude, and earthquake recurrence rate. The computed hazard results are expressed as a distribution from which confidence levels are assessed. Analysis of the results shows the contributions to the total hazard from various seismic sources and due to various earthquake magnitudes. In addition, the contributions of uncertainties in the various source parameters to the uncertainty in the computed hazard are assessed. For this study, the major contributions to uncertainty in the computed hazard are due to uncertainties in the applicable tectonic model and the earthquake recurrence rate. This analysis serves to illustrate some of the probabilistic tools that are available for conducting seismic hazard assessments and for analyzing the results of these studies. 5 references, 7 figures
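
    The way logic-tree branches combine into hazard estimates and confidence levels can be illustrated with a small weighted-average example; the branch curves and weights below are synthetic, not values from this assessment.

      import numpy as np

      # Annual exceedance rates for a set of ground-motion levels (g), computed
      # under three hypothetical logic-tree branches (e.g., alternative
      # recurrence models). Values are illustrative.
      pga_levels = np.array([0.1, 0.2, 0.4, 0.8])
      branch_curves = np.array([
          [2e-3, 6e-4, 1e-4, 1e-5],   # branch 1
          [4e-3, 1e-3, 2e-4, 2e-5],   # branch 2
          [1e-3, 3e-4, 5e-5, 4e-6],   # branch 3
      ])
      branch_weights = np.array([0.5, 0.3, 0.2])   # must sum to 1

      # Weighted-mean hazard curve across the logic tree.
      mean_curve = branch_weights @ branch_curves

      def weighted_fractile(values, weights, q):
          """Simple weighted fractile of the branch values at one level."""
          order = np.argsort(values)
          cum = np.cumsum(weights[order])
          return values[order][np.searchsorted(cum, q)]

      p85_curve = [weighted_fractile(branch_curves[:, j], branch_weights, 0.85)
                   for j in range(len(pga_levels))]

      for g, m, p85 in zip(pga_levels, mean_curve, p85_curve):
          print(f"PGA {g:.1f} g: mean rate {m:.1e}/yr, 85th percentile {p85:.1e}/yr")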

  12. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from >0 to <100 percent. The report then considers processes or conditions that could provide a basis for forewarning, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region

  13. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated
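
    The arithmetic behind the "low-probability environment" is easy to reproduce: even a hundredfold probability gain applied to a small background rate leaves the short-term probability at the percent level, as the illustrative numbers below show (they are not from the ICEF report).

      import math

      # Long-term (background) weekly probability of a large earthquake at a site,
      # and a short-term probability gain from clustering models. Illustrative values.
      background_annual_rate = 1.0 / 500.0     # one large event per 500 yr on average
      weekly_rate = background_annual_rate / 52.0
      probability_gain = 100.0                 # typical of strong short-term clustering

      p_background = 1.0 - math.exp(-weekly_rate)
      p_short_term = 1.0 - math.exp(-probability_gain * weekly_rate)
      print(f"Background weekly probability: {p_background:.5%}")
      print(f"With a 100x gain:              {p_short_term:.2%}")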

  14. Landslides in everyday life: An interdisciplinary approach to understanding vulnerability in the Himalayas

    Science.gov (United States)

    Sudmeier-Rieux, K.; Breguet, A.; Dubois, J.; Jaboyedoff, M.

    2009-04-01

    Several thousand landslides were triggered by the Kashmir earthquake, scarring the hillsides with cracks. Monsoon rains continue to trigger landslides, which have increased the exposure of populations because of lost agricultural lands, blocked roads and annual fatalities due to landslides. The great majority of these landslides are shallow and relatively small, but they greatly impact the population. In this region, landslides were a factor before the earthquake, mainly due to road construction and gravel excavation, but the several thousand landslides triggered by the earthquake have completely overwhelmed the local population and authorities. In Eastern Nepal, the last large earthquake to hit this region occurred in 1988, also triggering numerous landslides and cracks. Here, landslides can be considered a more common phenomenon, yet coping capacities amount to local observation of landslide movement and subsequent abandonment of houses and land as they become too dangerous. We present a comparative case study from Kashmir, Pakistan and Eastern Nepal, highlighting an interdisciplinary approach to understanding the complex interactions between land use, landslides and vulnerability. Our approach sets out to understand the underlying causes of the massive landslides triggered by the 2005 earthquake in Kashmir, Pakistan, and also the increasing number of landslides in Nepal. By approaching the issue of landslides from multiple angles (risk perceptions, land use, local coping capacities, geological assessment, risk mapping) and using multiple research techniques (remote sensing, GIS, geological assessment, participatory mapping, focus groups), we are better able to create a more complete picture of the "hazardscape". We find that by combining participatory social science research with hazard mapping, we obtain a more complete understanding of underlying causes, coping strategies and possible mitigation options, placing natural hazards in the context of everyday life. This method is

  15. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
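
    A toy version of an STA/LTA trigger applied to a per-minute tweet-count series is sketched below; the window lengths, threshold, and synthetic data are illustrative and do not reflect the USGS implementation.

      import numpy as np

      def sta_lta(counts, n_sta=2, n_lta=60):
          """Short-term-average / long-term-average ratio for a count time series
          (one sample per minute). The LTA window ends just before the STA window."""
          ratio = np.zeros_like(counts, dtype=float)
          for i in range(n_lta + n_sta, len(counts)):
              lta = counts[i - n_sta - n_lta:i - n_sta].mean()
              sta = counts[i - n_sta:i].mean()
              ratio[i] = sta / max(lta, 1e-3)
          return ratio

      # Synthetic tweet-frequency series: background chatter plus a burst of
      # "earthquake" tweets starting at minute 300 (values illustrative).
      rng = np.random.default_rng(2)
      counts = rng.poisson(3.0, size=600).astype(float)
      counts[300:315] += np.array([80, 60, 45, 35, 28, 22, 18, 14, 11, 9, 7, 6, 5, 4, 3])

      ratio = sta_lta(counts)
      triggers = np.where(ratio > 10.0)[0]
      print("Trigger minutes:", triggers)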

  16. Goce derived geoid changes before the Pisagua 2014 earthquake

    Directory of Open Access Journals (Sweden)

    Orlando Álvarez

    2018-01-01

    Full Text Available The analysis of space-time surface deformation during earthquakes reveals the variable state of stress that occurs at deep crustal levels, and this information can be used to better understand the seismic cycle. Understanding the possible mechanisms that produce earthquake precursors is a key issue for earthquake prediction. In recent years, modern geodesy has been able to map the degree of seismic coupling during the interseismic period, as well as the coseismic and postseismic slip for great earthquakes along subduction zones. Earthquakes are usually accompanied by mass transfer and consequent gravity variations, and such changes have been monitored for intraplate earthquakes by means of terrestrial gravity measurements. When stresses and the corresponding rupture areas are large, affecting hundreds of thousands of square kilometres (as occurs in some segments along plate interface zones), satellite gravimetry data become relevant. This is due to the higher spatial resolution of this type of data when compared to terrestrial data, and also due to their homogeneous precision and availability across the whole Earth. Satellite gravity missions such as GOCE can map the Earth's gravity field with unprecedented precision and resolution. We mapped geoid changes from two GOCE satellite models obtained by the direct approach, which combines data from other gravity missions such as GRACE and LAGEOS, taking advantage of the best characteristics of each. The results show that the geoid height diminished from a year to five months before the main seismic event in the region where maximum slip occurred after the Pisagua Mw = 8.2 great megathrust earthquake. This diminution is interpreted as accelerated inland-directed interseismic mass transfer before the earthquake, coinciding with the intermediate degree of seismic coupling reported in the region. We highlight the advantage of satellite data for modelling surficial deformation related to pre-seismic displacements. This deformation, combined to

  17. The evaluation of the earthquake hazard using the exponential distribution method for different seismic source regions in and around Ağrı

    Energy Technology Data Exchange (ETDEWEB)

    Bayrak, Yusuf, E-mail: ybayrak@agri.edu.tr [Ağrı İbrahim Çeçen University, Ağrı/Turkey (Turkey); Türker, Tuğba, E-mail: tturker@ktu.edu.tr [Karadeniz Technical University, Department of Geophysics, Trabzon/Turkey (Turkey)

    2016-04-18

    The aim of this study is to determine the earthquake hazard for different seismic source regions of Ağrı and its vicinity using the exponential distribution method. A homogeneous earthquake catalog covering 1900-2015 (the instrumental period), with 456 earthquakes, has been examined for Ağrı and its vicinity. The catalog was compiled from different sources, including the Bogazici University Kandilli Observatory and Earthquake Research Institute (KOERI), the National Earthquake Monitoring Center (NEMC), TUBITAK, TURKNET, the International Seismological Centre (ISC), and the Incorporated Research Institutions for Seismology (IRIS). Ağrı and its vicinity are divided into 7 different seismic source regions based on the epicenter distribution of earthquakes in the instrumental period, focal mechanism solutions, and existing tectonic structures. In the study, average magnitude values are calculated for the specified magnitude ranges in each of the 7 seismic source regions. For each of the 7 seismic source regions, the largest difference between the observed and expected cumulative probabilities across the determined magnitude classes is identified. The recurrence periods and annual numbers of occurring earthquakes are then estimated for Ağrı and its vicinity. As a result, occurrence probabilities are determined for the 7 seismic source regions for earthquakes of magnitude 3.2 and greater; the largest magnitudes considered are greater than 6.7 for Region 1, greater than 4.7 for Region 2, greater than 5.2 for Region 3, greater than 6.2 for Region 4, greater than 5.7 for Region 5, greater than 7.2 for Region 6, and greater than 6.2 for Region 7. The highest observed magnitude among the 7 seismic source regions of Ağrı and its vicinity is magnitude 7, in Region 6. For Region 6, the occurrence years of future earthquakes are estimated for the determined magnitudes; for a magnitude 7.2 event, the estimate is 158

  18. GPS Imaging of Time-Variable Earthquake Hazard: The Hilton Creek Fault, Long Valley California

    Science.gov (United States)

    Hammond, W. C.; Blewitt, G.

    2016-12-01

    The Hilton Creek Fault, in Long Valley, California is a down-to-the-east normal fault that bounds the eastern edge of the Sierra Nevada/Great Valley microplate, and lies half inside and half outside the magmatically active caldera. Despite the dense coverage with GPS networks, the rapid and time-variable surface deformation attributable to sporadic magmatic inflation beneath the resurgent dome makes it difficult to use traditional geodetic methods to estimate the slip rate of the fault. While geologic studies identify cumulative offset, constrain timing of past earthquakes, and constrain a Quaternary slip rate to within 1-5 mm/yr, it is not currently possible to use geologic data to evaluate how the potential for slip correlates with transient caldera inflation. To estimate time-variable seismic hazard of the fault we estimate its instantaneous slip rate from GPS data using a new set of algorithms for robust estimation of velocity and strain rate fields and fault slip rates. From the GPS time series, we use the robust MIDAS algorithm to obtain time series of velocity that are highly insensitive to the effects of seasonality, outliers and steps in the data. We then use robust imaging of the velocity field to estimate a gridded time variable velocity field. Then we estimate fault slip rate at each time using a new technique that forms ad-hoc block representations that honor fault geometries, network complexity, connectivity, but does not require labor-intensive drawing of block boundaries. The results are compared to other slip rate estimates that have implications for hazard over different time scales. Time invariant long term seismic hazard is proportional to the long term slip rate accessible from geologic data. Contemporary time-invariant hazard, however, may differ from the long term rate, and is estimated from the geodetic velocity field that has been corrected for the effects of magmatic inflation in the caldera using a published model of a dipping ellipsoidal
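
    The idea behind MIDAS-style robust velocity estimation can be conveyed with a much-simplified sketch: take the median of position differences over data pairs separated by one year, so that seasonal cycles largely cancel and outliers have little influence. The code below is an illustration on synthetic data, not the published MIDAS algorithm.

      import numpy as np

      def robust_annual_velocity(t_years, position_mm):
          """Median of position differences across pairs separated by ~1 year.
          A simplified, MIDAS-like estimator: seasonal cycles cancel in one-year
          pairs and the median resists outliers."""
          rates = []
          for ti, xi in zip(t_years, position_mm):
              j = np.argmin(np.abs(t_years - (ti + 1.0)))  # sample closest to one year later
              dt = t_years[j] - ti
              if 0.9 < dt < 1.1:
                  rates.append((position_mm[j] - xi) / dt)
          return np.median(rates)

      # Synthetic daily GPS east component: 3 mm/yr secular rate, an annual cycle,
      # noise, and a few outlier spikes. Illustrative only.
      t = np.arange(0, 5, 1 / 365.25)
      rng = np.random.default_rng(3)
      x = 3.0 * t + 2.0 * np.sin(2 * np.pi * t) + rng.normal(0, 1.0, t.size)
      spikes = rng.choice(t.size, size=15, replace=False)
      x[spikes] += 20.0

      print(f"Robust velocity estimate: {robust_annual_velocity(t, x):.2f} mm/yr (true 3.0)")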

  19. The 2014 update to the National Seismic Hazard Model in California

    Science.gov (United States)

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  20. Natural hazards science strategy

    Science.gov (United States)

    Holmes, Robert R.; Jones, Lucile M.; Eidenshink, Jeffery C.; Godt, Jonathan W.; Kirby, Stephen H.; Love, Jeffrey J.; Neal, Christina A.; Plant, Nathaniel G.; Plunkett, Michael L.; Weaver, Craig S.; Wein, Anne; Perry, Suzanne C.

    2012-01-01

    The mission of the U.S. Geological Survey (USGS) in natural hazards is to develop and apply hazard science to help protect the safety, security, and economic well-being of the Nation. The costs and consequences of natural hazards can be enormous, and each year more people and infrastructure are at risk. USGS scientific research—founded on detailed observations and improved understanding of the responsible physical processes—can help to understand and reduce natural hazard risks and to make and effectively communicate reliable statements about hazard characteristics, such as frequency, magnitude, extent, onset, consequences, and where possible, the time of future events. To accomplish its broad hazard mission, the USGS maintains an expert workforce of scientists and technicians in the earth sciences, hydrology, biology, geography, social and behavioral sciences, and other fields, and engages cooperatively with numerous agencies, research institutions, and organizations in the public and private sectors, across the Nation and around the world. The scientific expertise required to accomplish the USGS mission in natural hazards includes a wide range of disciplines that this report refers to, in aggregate, as hazard science. In October 2010, the Natural Hazards Science Strategy Planning Team (H–SSPT) was charged with developing a long-term (10-year) Science Strategy for the USGS mission in natural hazards. This report fulfills that charge, with a document hereinafter referred to as the Strategy, to provide scientific observations, analyses, and research that are critical for the Nation to become more resilient to natural hazards. Science provides the information that decisionmakers need to determine whether risk management activities are worthwhile. Moreover, as the agency with the perspective of geologic time, the USGS is uniquely positioned to extend the collective experience of society to prepare for events outside current memory. The USGS has critical statutory

  1. Cooperative earthquake research between the United States and the People's Republic of China

    Energy Technology Data Exchange (ETDEWEB)

    Russ, D.P.; Johnson, L.E.

    1986-01-01

    This paper describes cooperative research by scientists of the US and the People's Republic of China (PRC) which has resulted in important new findings concerning the fundamental characteristics of earthquakes and new insight into mitigating earthquake hazards. There have been over 35 projects cooperatively sponsored by the Earthquake Studies Protocol in the past 5 years. The projects are organized into seven annexes, including investigations in earthquake prediction, intraplate faults and earthquakes, earthquake engineering and hazards investigation, deep crustal structure, rock mechanics, seismology, and data exchange. Operational earthquake prediction experiments are currently being developed at two primary sites: western Yunnan Province near the town of Xiaguan, where there are several active faults, and the northeast China plain, where the devastating 1976 Tangshan earthquake occurred.

  2. Earthquake Loss Assessment for the Evaluation of the Sovereign Risk and Financial Sustainability of Countries and Cities

    Science.gov (United States)

    Cardona, O. D.

    2013-05-01

    Recently, earthquakes have struck cities in both developing and developed countries, revealing significant knowledge gaps and the need to improve the quality of input data and of the assumptions of the risk models. The earthquake and tsunami in Japan (2011) and the disasters due to earthquakes in Haiti (2010), Chile (2010), New Zealand (2011) and Spain (2011), to mention only some unexpected impacts in different regions, have raised several concerns regarding hazard assessment as well as the uncertainties associated with the estimation of future losses. Understanding probable losses and reconstruction costs due to earthquakes creates powerful incentives for countries to develop planning options and tools to cope with sovereign risk, including allocating the sustained budgetary resources necessary to reduce those potential damages and safeguard development. Therefore, robust risk models are needed to assess future economic impacts, countries' fiscal responsibilities and governments' contingent liabilities, and to formulate, justify and implement risk reduction measures and optimal financial strategies of risk retention and transfer. Special attention should be paid to the understanding of risk metrics such as the Loss Exceedance Curve (empirical and analytical) and the Expected Annual Loss in the context of conjoint and cascading hazards.
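
    As an illustration of how such metrics are read from a risk model's output, the sketch below interpolates a tabulated loss exceedance curve to obtain a probable maximum loss for a chosen return period; the curve values are hypothetical, not model results.

      import numpy as np

      # Hypothetical national loss exceedance curve (annual exceedance rate vs.
      # loss in USD millions); values are illustrative only.
      loss_musd = np.array([100, 500, 1_000, 5_000, 20_000, 80_000])
      rate_per_yr = np.array([0.5, 0.1, 0.04, 0.01, 0.002, 0.0002])

      def pml(return_period_yr):
          """Probable maximum loss at a given return period, by log-log
          interpolation of the exceedance curve."""
          target = 1.0 / return_period_yr
          logl = np.interp(np.log(target), np.log(rate_per_yr[::-1]), np.log(loss_musd[::-1]))
          return np.exp(logl)

      for rp in (100, 250, 500):
          print(f"PML({rp} yr) ~ {pml(rp):,.0f} million USD")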

  3. Probabilistic Seismic Hazard Assessment for Northeast India Region

    Science.gov (United States)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large-sized earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters, which are input variables for seismic hazard estimation, are estimated for each of these source zones. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
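
    The correspondence between probabilities of exceedance and return periods quoted above follows from the Poisson relation T = -t / ln(1 - P); the snippet below reproduces the approximately 225-, 475-, 2475-, and 10,000-year values for a 50-year exposure time.

      import math

      def return_period(prob_exceedance, exposure_years):
          """Return period implied by a Poisson probability of exceedance:
          P = 1 - exp(-exposure/T)  =>  T = -exposure / ln(1 - P)."""
          return -exposure_years / math.log(1.0 - prob_exceedance)

      for pe in (0.20, 0.10, 0.02, 0.005):
          print(f"{pe:.1%} in 50 yr -> return period ~ {return_period(pe, 50):,.0f} yr")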

  4. Disaster Risks Reduction for Extreme Natural Hazards

    Science.gov (United States)

    Plag, H.; Jules-Plag, S.

    2013-12-01

    Mega disasters associated with extreme natural hazards have the potential to escalate the global sustainability crisis and put us close to the boundaries of the safe operating space for humanity. Floods and droughts are major threats that could potentially reach planetary extent, particularly through secondary economic and social impacts. Earthquakes and tsunamis frequently cause disasters that could eventually exceed the immediate coping capacity of the global economy, particularly since we have built mega cities in hazardous areas that are now ready to be harvested by natural hazards. Unfortunately, the more we learn to cope with the relatively frequent hazards (50- to 100-year events), the less we worry about the low-probability, high-impact events (those with recurrence of a few hundred years or more). As a consequence, threats from the 500-year flood, drought or volcanic eruption are not appropriately accounted for in disaster risk reduction (DRR) discussions. Extreme geohazards have occurred regularly throughout the past, but mostly did not cause major disasters because the exposure of human assets to hazards was much lower than it is today. The most extreme events that occurred during the last 2,000 years would today cause unparalleled damage on a global scale and could worsen the sustainability crisis. Simulation of these extreme hazards under present conditions can help to assess the disaster risk. Recent extreme earthquakes have illustrated the destruction they can inflict, both directly and indirectly through tsunamis. Large volcanic eruptions have the potential to impact climate, anthropogenic infrastructure and resource supplies on a global scale. During the last 2,000 years several large volcanic eruptions occurred which, under today's conditions, would be associated with extreme disaster risk. The comparison of earthquakes and volcanic eruptions indicates that large volcanic eruptions are the low-probability geohazards with potentially the highest impact on our civilization

  5. Local amplification of seismic waves from the Denali earthquake and damaging seiches in Lake Union, Seattle, Washington

    Science.gov (United States)

    Barberopoulou, A.; Qamar, A.; Pratt, T.L.; Creager, K.C.; Steele, W.P.

    2004-01-01

    The Mw 7.9 Denali, Alaska earthquake of 3 November 2002 caused minor damage to at least 20 houseboats in Seattle, Washington, by initiating water waves in Lake Union. These water waves were likely initiated by the large-amplitude seismic surface waves from this earthquake. Maps of spectral amplification recorded during the Denali earthquake on the Pacific Northwest Seismic Network (PNSN) strong-motion instruments show substantially increased shear- and surface-wave amplitudes coincident with the Seattle sedimentary basin. Because Lake Union is situated on the Seattle basin, the size of the water waves may have been increased by local amplification of the seismic waves by the basin. Complete hazard assessments require understanding the causes of these water waves during future earthquakes. Copyright 2004 by the American Geophysical Union.

  6. The application of the geography census data in seismic hazard assessment

    Science.gov (United States)

    Yuan, Shen; Ying, Zhang

    2017-04-01

    Because of the limited timeliness of the basic data in the Sichuan province earthquake emergency database, there is a certain gap between earthquake disaster assessment results and the actual damage. In 2015, Sichuan completed its first provincial geographic census, covering topography, traffic, vegetation coverage, water areas, desert and bare ground, the road network, residents and facilities, geographical units and geological hazards, as well as town planning, construction and ecological restoration in the Lushan earthquake-stricken area. On this basis, combined with existing basic geographic information data and high-resolution imagery, and supplemented by remote sensing image interpretation and geological survey, a statistical analysis and information extraction of the distribution and changes of hazard-affected elements (such as surface coverage, roads and infrastructure) in Lushan County was carried out for the periods before 2013 and after 2015. At the same time, the conversion and updating from geographic census data to earthquake emergency basic data was achieved by studying their data types, structures and relationships. Finally, intensity control points were obtained through the fusion of multi-source disaster information, including the changed hazard-affected-body data and the coseismic displacement field of the Lushan magnitude 7.0 earthquake from the CORS network. The seismic influence field was then corrected and the earthquake disaster reassessed through the Sichuan earthquake relief headquarters technology platform. Comparison of the new assessment result, the original assessment result and the actual earthquake disaster loss shows that the revised evaluation result is closer to the actual loss. In the future, regular updates from geographic census data to earthquake emergency basic data can be realized, ensuring the timeliness of the earthquake emergency database while improving the

  7. Multi scenario seismic hazard assessment for Egypt

    Science.gov (United States)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-05-01

    Egypt is located in the northeastern corner of Africa within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions. It is therefore important to evaluate the seismic hazard in order to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic-tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic-tree formulation to compute the regional hazard and develop a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods for 100- and 475-year return periods for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the western to the eastern part of the country. The uniform hazard spectra are estimated at several important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.
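
    The logic-tree step mentioned above can be sketched in a few lines: each branch (a particular combination of seismotectonic model and attenuation relation) yields its own hazard curve, and the branch curves are merged with weights that sum to one. The branch names, weights and exceedance rates below are hypothetical placeholders, not values from this study.

```python
# Illustrative sketch of combining hazard estimates across logic-tree branches.
# Branch names, weights, and hazard curves are hypothetical assumptions.
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.4])  # ground-motion levels (g)

# Annual exceedance rates predicted by three alternative branch combinations
# (e.g., different seismotectonic models / attenuation relations).
branches = {
    "model_A": (0.5, np.array([2e-2, 8e-3, 2e-3, 3e-4])),
    "model_B": (0.3, np.array([3e-2, 1e-2, 3e-3, 5e-4])),
    "model_C": (0.2, np.array([1e-2, 5e-3, 1e-3, 1e-4])),
}

weights = np.array([w for w, _ in branches.values()])
curves = np.vstack([c for _, c in branches.values()])
assert abs(weights.sum() - 1.0) < 1e-9  # logic-tree weights must sum to 1

mean_curve = weights @ curves           # weighted-mean hazard curve
for g, rate in zip(pga, mean_curve):
    print(f"PGA >= {g:.2f} g: weighted mean annual exceedance rate = {rate:.2e}")
```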

  8. Satellite-based emergency mapping using optical imagery: experience and reflections from the 2015 Nepal earthquakes

    Science.gov (United States)

    Williams, Jack G.; Rosser, Nick J.; Kincey, Mark E.; Benjamin, Jessica; Oven, Katie J.; Densmore, Alexander L.; Milledge, David G.; Robinson, Tom R.; Jordan, Colm A.; Dijkstra, Tom A.

    2018-01-01

    Landslides triggered by large earthquakes in mountainous regions contribute significantly to overall earthquake losses and pose a major secondary hazard that can persist for months or years. While scientific investigations of coseismic landsliding are increasingly common, there is no protocol for rapid (hours-to-days) humanitarian-facing landslide assessment and no published recognition of what is possible and what is useful to compile immediately after the event. Drawing on the 2015 Mw 7.8 Gorkha earthquake in Nepal, we consider how quickly a landslide assessment based upon manual satellite-based emergency mapping (SEM) can be realistically achieved and review the decisions taken by analysts to ascertain the timeliness and type of useful information that can be generated. We find that, at present, many forms of landslide assessment are too slow to generate relative to the speed of a humanitarian response, despite increasingly rapid access to high-quality imagery. Importantly, the value of information on landslides evolves rapidly as a disaster response develops, so identifying the purpose, timescales, and end users of a post-earthquake landslide assessment is essential to inform the approach taken. It is clear that discussions are needed on the form and timing of landslide assessments, and how best to present and share this information, before rather than after an earthquake strikes. In this paper, we share the lessons learned from the Gorkha earthquake, with the aim of informing the approach taken by scientists to understand the evolving landslide hazard in future events and the expectations of the humanitarian community involved in disaster response.

  9. Updating the USGS seismic hazard maps for Alaska

    Science.gov (United States)

    Mueller, Charles; Briggs, Richard; Wesson, Robert L.; Petersen, Mark D.

    2015-01-01

    The U.S. Geological Survey makes probabilistic seismic hazard maps and engineering design maps for building codes, emergency planning, risk management, and many other applications. The methodology considers all known earthquake sources with their associated magnitude and rate distributions. Specific faults can be modeled if slip-rate or recurrence information is available. Otherwise, areal sources are developed from earthquake catalogs or GPS data. Sources are combined with ground-motion estimates to compute the hazard. The current maps for Alaska were developed in 2007, and included modeled sources for the Alaska-Aleutian megathrust, a few crustal faults, and areal seismicity sources. The megathrust was modeled as a segmented dipping plane with segmentation largely derived from the slip patches of past earthquakes. Some megathrust deformation is aseismic, so recurrence was estimated from seismic history rather than plate rates. Crustal faults included the Fairweather-Queen Charlotte system, the Denali–Totschunda system, the Castle Mountain fault, two faults on Kodiak Island, and the Transition fault, with recurrence estimated from geologic data. Areal seismicity sources were developed for Benioff-zone earthquakes and for crustal earthquakes not associated with modeled faults. We review the current state of knowledge in Alaska from a seismic-hazard perspective, in anticipation of future updates of the maps. Updated source models will consider revised seismicity catalogs, new information on crustal faults, new GPS data, and new thinking on megathrust recurrence, segmentation, and geometry. Revised ground-motion models will provide up-to-date shaking estimates for crustal earthquakes and subduction earthquakes in Alaska.
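
    The step of combining sources with ground-motion estimates can be illustrated with a deliberately simplified hazard integral for a single areal source: a truncated Gutenberg-Richter magnitude distribution at a fixed source-to-site distance, paired with a toy ground-motion model. None of the parameter values or the toy attenuation form below are taken from the USGS Alaska model; they only show the structure of the calculation.

```python
# Highly simplified sketch of the "combine sources with ground motions" step for one
# areal source. All numbers and the toy GMM are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rate_m_min, b_value = 0.2, 1.0          # annual rate of M >= m_min and G-R b-value
m_min, m_max, dist_km = 5.0, 7.5, 40.0  # magnitude range and fixed source-site distance

mags = np.linspace(m_min, m_max, 200)
beta = b_value * np.log(10.0)
# Truncated exponential magnitude PDF implied by the Gutenberg-Richter relation.
pdf = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

def toy_gmm_ln_pga(m, r_km):
    """Toy ground-motion model: mean ln(PGA in g). Not a published attenuation relation."""
    return -3.5 + 0.8 * m - 1.1 * np.log(r_km + 10.0)

sigma_ln = 0.6        # aleatory standard deviation of ln(PGA)
target_pga = 0.2      # exceedance level of interest (g)

# P(PGA > target | m, r) for each magnitude, then integrate over the magnitude PDF.
p_exceed = norm.sf((np.log(target_pga) - toy_gmm_ln_pga(mags, dist_km)) / sigma_ln)
dm = mags[1] - mags[0]
annual_rate = rate_m_min * np.sum(p_exceed * pdf) * dm
print(f"Annual rate of PGA > {target_pga} g from this single source: {annual_rate:.2e}")
```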

  10. Slope instabilities triggered by the 2011 Lorca earthquake (Mw 5.1): a comparison and revision of hazard assessments of earthquake-triggered landslides in Murcia

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez-Peces, M. J.; Garcia-Mayordomo, J.; Martinez-Diaz, J. J.; Tsige, M.

    2012-11-01

    The Lorca basin has been the object of recent research aimed at studying the phenomenon of earthquake-induced landslides and their assessment within the context of different seismic scenarios, bearing in mind the influence of soil and topographical amplification effects. Nevertheless, it was not until the Lorca earthquakes of 11 May 2011 that it became possible to adopt a systematic approach to the problem. We provide here an inventory of slope instabilities triggered by the Lorca earthquakes comprising 100 cases, mainly small rock and soil falls (1 to 100 m³). The distribution of these instabilities is compared to two different earthquake-triggered landslide hazard maps: one considering the occurrence of the most probable earthquake for a 475-yr return period in the Lorca basin (Mw = 5.0), which was previously published on the basis of a low-resolution digital elevation model (DEM), and a second one matching the occurrence of the Mw = 5.1 2011 Lorca earthquake, which was undertaken using a higher resolution DEM. The most frequent Newmark displacement values related to the slope failures triggered by the 2011 Lorca earthquakes are smaller than 2 cm in both hazard scenarios and coincide with areas where significant soil and topographical seismic amplification effects have occurred.
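
    The Newmark displacements referred to above come from a rigid sliding-block calculation: the block accumulates displacement whenever the driving ground acceleration exceeds the slope's critical (yield) acceleration, until the sliding velocity drops back to zero. A minimal sketch follows; the synthetic acceleration pulse and the critical acceleration are invented for illustration and are not taken from the Lorca studies.

```python
# Minimal sketch of a rigid-block Newmark displacement calculation (downslope sliding only).
# The sine-pulse "ground motion" and the critical acceleration are hypothetical values.
import numpy as np

dt = 0.005                                   # time step (s)
t = np.arange(0.0, 10.0, dt)
g = 9.81
acc = 0.30 * g * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)  # synthetic pulse (m/s^2)
a_c = 0.10 * g                               # critical (yield) acceleration of the slope

v, d = 0.0, 0.0                              # sliding velocity and cumulative displacement
for a in acc:
    if v > 0.0 or a > a_c:                   # block slides while accel. exceeds a_c,
        v += (a - a_c) * dt                  # or until the sliding velocity drops to zero
        if v < 0.0:
            v = 0.0
        d += v * dt

print(f"Newmark displacement ~ {d * 100:.1f} cm")
```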

  11. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
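
    A short-term-average / long-term-average (STA/LTA) detector of the kind described above can be sketched directly on a tweet-count time series. The synthetic counts, window lengths, and trigger threshold below are illustrative choices, not the USGS settings.

```python
# Sketch of an STA/LTA detector applied to a tweet-frequency time series.
# Synthetic background rate, burst, windows, and threshold are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(2.0, size=600).astype(float)   # background "earthquake"-tweet counts per bin
counts[300:330] += np.linspace(40, 5, 30)           # burst following a widely felt event

sta_win, lta_win, threshold = 5, 120, 5.0           # window lengths (bins) and trigger ratio

def sta_lta(x, ns, nl):
    """Ratio of short-term to long-term moving averages (LTA window precedes the STA window)."""
    ratio = np.zeros_like(x)
    for i in range(nl + ns, len(x)):
        sta = x[i - ns:i].mean()
        lta = x[i - ns - nl:i - ns].mean()
        ratio[i] = sta / max(lta, 1e-6)
    return ratio

r = sta_lta(counts, sta_win, lta_win)
triggers = np.flatnonzero((r[1:] >= threshold) & (r[:-1] < threshold)) + 1
print("trigger bins:", triggers)                    # typically a single trigger near bin 300
```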

  12. Variations in population vulnerability to tectonic and landslide-related tsunami hazards in Alaska

    Science.gov (United States)

    Wood, Nathan J.; Peters, Jeff

    2015-01-01

    Effective tsunami risk reduction requires an understanding of how at-risk populations are specifically vulnerable to tsunami threats. Vulnerability assessments have primarily been based on single hazard zones, even though a coastal community may be threatened by multiple tsunami sources that vary locally in terms of inundation extents and wave arrival times. We use the Alaskan coastal communities of Cordova, Kodiak, Seward, Valdez, and Whittier (USA) as a case study to explore population vulnerability to multiple tsunami threats. We use anisotropic pedestrian evacuation models to assess variations in population exposure as a function of travel time out of hazard zones associated with tectonic and landslide-related tsunamis (based on scenarios similar to the 1964 Mw 9.2 Good Friday earthquake and tsunami disaster). Results demonstrate that there are thousands of residents, employees, and business customers in tsunami hazard zones associated with tectonically generated waves, but that at-risk individuals will likely have sufficient time to evacuate to high ground before waves are estimated to arrive 30–60 min after generation. Tsunami hazard zones associated with submarine landslides initiated by a subduction zone earthquake are smaller and contain fewer people, but many at-risk individuals may not have enough time to evacuate as waves are estimated to arrive in 1–2 min and evacuations may need to occur during earthquake ground shaking. For all hazard zones, employees and customers at businesses far outnumber residents at their homes, and evacuation travel times are highest on docks and along waterfronts. Results suggest that population vulnerability studies related to tsunami hazards should recognize non-residential populations and differences in wave arrival times if emergency managers are to develop realistic preparedness and outreach efforts.

  13. Contribution of Satellite Gravimetry to Understanding Seismic Source Processes of the 2011 Tohoku-Oki Earthquake

    Science.gov (United States)

    Han, Shin-Chan; Sauber, Jeanne; Riva, Riccardo

    2011-01-01

    The 2011 great Tohoku-Oki earthquake, apart from shaking the ground, perturbed the motions of satellites orbiting some hundreds of kilometers above the ground, such as GRACE, due to the coseismic change in the gravity field. Significant changes in inter-satellite distance were observed after the earthquake. These unconventional satellite measurements were inverted to examine the earthquake source processes from a radically different perspective that complements the analyses of seismic and geodetic ground recordings. We found the average slip located up-dip of the hypocenter but within the lower crust, as characterized by a limited range of bulk and shear moduli. The GRACE data constrained a group of earthquake source parameters that yield increasing dip (7-16 degrees, plus or minus 2 degrees) and, simultaneously, decreasing moment magnitude (9.17-9.02, plus or minus 0.04) with increasing source depth (15-24 kilometers). The GRACE solution includes the cumulative moment released over a month and provides a unique view of the long-wavelength gravimetric response to all mass redistribution processes associated with the dynamic rupture and short-term postseismic mechanisms, improving our understanding of the physics of megathrusts.

  14. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  15. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and is undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event that occurred in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia, on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  16. Coulomb Stress Change and Seismic Hazard of Rift Zones in Southern Tibet after the 2015 Mw7.8 Nepal Earthquake and Its Mw7.3 Aftershock

    Science.gov (United States)

    Dai, Z.; Zha, X.; Lu, Z.

    2015-12-01

    In southern Tibet (30~34N, 80~95E), many north-trending rifts, such as the Yadong-Gulu and Lunggar rifts, are characterized by internally drained graben or half-graben basins bounded by active normal faults. Some developed rifts have become part of important transportation lines in Tibet, China. Since 1976, eighty-seven earthquakes with Mw > 5.0 have occurred in the rift regions, and fifty-five events have normal-faulting focal mechanisms according to the GCMT catalog. These rifts and normal faults are associated with both the EW-trending extension of southern Tibet and the convergence between India and Tibet. The 2015 Mw 7.8 Nepal great earthquake and its Mw 7.3 aftershock occurred on the Main Himalayan Thrust zone and caused tremendous damage in the Kathmandu region. Those earthquakes will lead to significant viscoelastic deformation and stress changes in southern Tibet in the future. To evaluate the seismic hazard in the active rift regions of southern Tibet, we modeled the slip distribution of the 2015 Nepal great earthquakes using the InSAR displacement field from ALOS-2 satellite SAR data, and calculated the Coulomb failure stress (CFS) change on the active normal faults in the rift zones. Because the estimated CFS change depends on the geometrical parameters of the receiver faults, it is necessary to obtain accurate fault parameters in the rift zones. Some historical earthquakes have been studied using field data, teleseismic data and InSAR observations, but the results are not in agreement with each other. In this study, we re-evaluated the geometrical parameters of seismogenic faults in the rift zones using high-quality coseismic InSAR observations and teleseismic body-wave data. Finally, we will evaluate the seismic hazard in the rift zones according to the value of the estimated CFS change and the aftershock distribution.
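
    The Coulomb failure stress change on a receiver fault is commonly written as dCFS = d_tau + mu' * d_sigma_n, with d_tau the shear stress change resolved in the slip direction, d_sigma_n the normal stress change (positive for unclamping) and mu' an effective friction coefficient. The sketch below resolves a stress-change tensor onto a normal-fault receiver using the Aki & Richards strike/dip/rake convention; the stress tensor, receiver geometry, and friction value are hypothetical numbers, not results from this study.

```python
# Sketch of dCFS = d_tau + mu' * d_sigma_n on a receiver fault. All inputs are hypothetical.
import numpy as np

def coulomb_stress_change(d_sigma, strike, dip, rake, mu_eff=0.4):
    """d_sigma: 3x3 stress-change tensor (Pa) in north-east-down coordinates, tension positive.
    strike, dip, rake: receiver-fault angles in degrees (Aki & Richards convention)."""
    phi, delta, lam = np.radians([strike, dip, rake])
    # Fault normal and slip direction (Aki & Richards, NED coordinates).
    n = np.array([-np.sin(delta) * np.sin(phi),
                   np.sin(delta) * np.cos(phi),
                  -np.cos(delta)])
    u = np.array([np.cos(lam) * np.cos(phi) + np.sin(lam) * np.cos(delta) * np.sin(phi),
                  np.cos(lam) * np.sin(phi) - np.sin(lam) * np.cos(delta) * np.cos(phi),
                 -np.sin(lam) * np.sin(delta)])
    traction = d_sigma @ n
    d_sigma_n = n @ traction      # normal stress change (positive = unclamping)
    d_tau = u @ traction          # shear stress change resolved in the rake direction
    return d_tau + mu_eff * d_sigma_n

# Hypothetical coseismic stress-change tensor (Pa) at a point in a rift zone.
d_sigma = np.array([[ 3.0e4, -1.0e4,  0.5e4],
                    [-1.0e4,  2.0e4,  1.0e4],
                    [ 0.5e4,  1.0e4, -2.0e4]])

# Receiver: a north-trending normal fault (strike 0, dip 60, rake -90).
dcfs = coulomb_stress_change(d_sigma, strike=0.0, dip=60.0, rake=-90.0)
print(f"dCFS = {dcfs / 1e6:+.3f} MPa")
```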

  17. An Atlas of ShakeMaps and population exposure catalog for earthquake loss modeling

    Science.gov (United States)

    Allen, T.I.; Wald, D.J.; Earle, P.S.; Marano, K.D.; Hotovec, A.J.; Lin, K.; Hearne, M.G.

    2009-01-01

    We present an Atlas of ShakeMaps and a catalog of human population exposures to moderate-to-strong ground shaking (EXPO-CAT) for recent historical earthquakes (1973-2007). The common purpose of the Atlas and exposure catalog is to calibrate earthquake loss models to be used in the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER). The full ShakeMap Atlas currently comprises over 5,600 earthquakes from January 1973 through December 2007, with almost 500 of these maps constrained, to varying degrees, by instrumental ground motions, macroseismic intensity data, community internet intensity observations, and published earthquake rupture models. The catalog of human exposures is derived using current PAGER methodologies. Exposure to discrete levels of shaking intensity is obtained by correlating Atlas ShakeMaps with a global population database. Combining this population exposure dataset with historical earthquake loss data, such as PAGER-CAT, provides a useful resource for calibrating loss methodologies against a systematically derived set of ShakeMap hazard outputs. We illustrate two example uses for EXPO-CAT: (1) simple objective ranking of country vulnerability to earthquakes, and (2) the influence of time of day on earthquake mortality. In general, we observe that countries in similar geographic regions with similar construction practices tend to cluster spatially in terms of relative vulnerability. We also find little quantitative evidence to suggest that time of day is a significant factor in earthquake mortality. Moreover, earthquake mortality appears to be more systematically linked to the population exposed to severe ground shaking (Modified Mercalli Intensity VIII+). Finally, equipped with the full Atlas of ShakeMaps, we merge each of these maps and find the maximum estimated peak ground acceleration at any grid point in the world for the past 35 years. We subsequently compare this "composite ShakeMap" with existing global
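
    The exposure step described above, correlating an intensity grid with a population grid, reduces to summing population per shaking-intensity level. The tiny grids below are hypothetical arrays standing in for a ShakeMap and a global population database.

```python
# Sketch of population exposure by Modified Mercalli Intensity (MMI) level.
# Both grids are small hypothetical arrays used only for illustration.
import numpy as np

mmi = np.array([[4.2, 5.1, 6.3],
                [5.8, 7.4, 8.1],
                [4.9, 6.7, 7.9]])            # intensity per grid cell
population = np.array([[1200,  300, 4500],
                       [ 800, 9000, 2500],
                       [ 150, 6000, 3200]])  # people per grid cell

exposure = {}
for level in range(4, 10):                   # MMI IV through IX
    in_bin = (mmi >= level - 0.5) & (mmi < level + 0.5)
    exposure[level] = int(population[in_bin].sum())

for level, people in exposure.items():
    print(f"MMI {level}: {people:,} people exposed")
```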

  18. Seismic hazard studies in Egypt

    Directory of Open Access Journals (Sweden)

    Abuo El-Ela A. Mohamed

    2012-12-01

    The study of earthquake activity and seismic hazard assessment of Egypt is very important due to the great and rapid spread of large investments in national projects, especially the nuclear power plant that will be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba–Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the Northern Red Sea triple junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez District should be considered. The seismic hazard for Egypt is calculated utilizing a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, the uniform hazard spectra for rock sites for 25 different periods, and the probabilistic hazard curves for the cities of Cairo and Alexandria, are graphed. The peak ground acceleration (PGA) values were found to be highest close to the Gulf of Aqaba, about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were found in the western part of the Western Desert.

  19. Input parameters for the statistical seismic hazard assessment in central part of Romania territory using crustal earthquakes

    International Nuclear Information System (INIS)

    Moldovan, A.I.; Bazacliu, O.; Popescu, E.

    2004-01-01

    The seismic hazard assessment of densely populated geographical regions, and subsequently the design of strategic objectives (dams, nuclear power plants, etc.), are based on knowledge of the seismicity parameters of the seismic sources which can generate ground motion amplitudes above the minimum level considered risky at the specific site, and of the way the seismic waves propagate between the focus and the site. The purpose of this paper is to provide the set of information required for a probabilistic assessment of the seismic hazard in the central Romanian territory relative to the following seismic sources: the Fagaras zone (FC), the Campulung zone (CP), and the Transilvania zone (TD), all of them in the crustal domain. Extremely vulnerable objectives are present in the central part of Romania, including the cities of Pitesti and Sibiu and the 'Vidraru' dam. The analysis that we propose implies: (1) geometrical definition of the seismic sources, (2) estimation of the maximum possible magnitude, (3) estimation of the frequency-magnitude relationship and (4) estimation of the attenuation laws. As an example, the obtained input parameters are used to evaluate the seismic hazard distribution due to crustal earthquakes applying McGuire's procedure (1976). These preliminary results are in good agreement with previous research based on a deterministic approach (Radulian et al., 2000). (authors)
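
    Step (3) above, estimating the Gutenberg-Richter frequency-magnitude parameters, is often done with the Aki (1965) maximum-likelihood b-value estimator. The sketch below applies it to a synthetic catalog; the catalog, completeness magnitude, bin width, and observation span are illustrative assumptions, not data for the Romanian source zones.

```python
# Sketch of a Gutenberg-Richter b-value estimate (Aki maximum likelihood) on a
# synthetic catalog. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
b_true, m_c, dm = 1.0, 3.0, 0.1              # "true" b-value, completeness magnitude, bin width
beta = b_true * np.log(10.0)
# Synthetic G-R catalog: continuous magnitudes above the completeness threshold,
# then rounded to 0.1-unit bins as in a real catalog.
mags = (m_c - dm / 2.0) + rng.exponential(1.0 / beta, size=2000)
mags = np.round(mags / dm) * dm

# Aki (1965) maximum-likelihood estimate with the standard half-bin correction.
b_hat = np.log10(np.e) / (mags.mean() - (m_c - dm / 2.0))
b_err = b_hat / np.sqrt(len(mags))           # approximate standard error

# Annual a-value of log10 N(M >= m) = a - b m, assuming a hypothetical 50-year span.
span_yr = 50.0
a_hat = np.log10(len(mags) / span_yr) + b_hat * m_c
print(f"b = {b_hat:.2f} +/- {b_err:.2f}, a = {a_hat:.2f} "
      f"(annual rate of M >= {m_c}: {10 ** (a_hat - b_hat * m_c):.1f})")
```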

  20. Distinguishing megathrust from intraplate earthquakes using lacustrine turbidites (Laguna Lo Encañado, Central Chile)

    Science.gov (United States)

    Van Daele, Maarten; Araya-Cornejo, Cristian; Pille, Thomas; Meyer, Inka; Kempf, Philipp; Moernaut, Jasper; Cisternas, Marco

    2017-04-01

    One of the main challenges in seismically active regions is differentiating paleo-earthquakes resulting from different fault systems, such as the megathrust versus intraplate faults in subduction settings. Such differentiation is, however, key for hazard assessments based on paleoseismic records. Laguna Lo Encañado (33.7°S; 70.3°W; 2492 m a.s.l.) is located in the Central Chilean Andes, 50 km east of Santiago de Chile, a metropolis with about 7,000,000 inhabitants. During the last century the study area experienced three large megathrust earthquakes (1906, 1985 and 2010) and two intraplate earthquakes (1945 and 1958) (Lomnitz, 1960). While the megathrust earthquakes cause Modified Mercalli Intensities (MMIs) of VI to VII at the lake (Van Daele et al., 2015), the intraplate earthquakes cause peak MMIs up to IX (Sepúlveda et al., 2008). Here we present a turbidite record of Laguna Lo Encañado going back to 1900 AD. While the geophysical data (3.5 kHz subbottom seismic profiles and side-scan sonar data) provide a bathymetry and an overview of the sedimentary environment, we study 15 short cores in order to understand the depositional processes resulting in the encountered lacustrine turbidites. All of the mentioned earthquakes triggered turbidites in the lake, which are all linked to slumps in proximal areas and thus result from mass wasting of the subaquatic slopes. However, turbidites linked to the intraplate earthquakes are additionally covered by turbidites of a finer-grained, more clastic nature. We link the latter to post-seismic erosion of onshore landslides, which need higher MMIs to be triggered than subaquatic mass movements (Howarth et al., 2014). While intraplate earthquakes can cause MMIs up to IX and higher, megathrust earthquakes do not cause sufficiently high MMIs at the lake to trigger voluminous onshore landslides. Hence, the presence of these post-seismic turbidites allows turbidites triggered by intraplate earthquakes to be distinguished from those

  1. Ground motion models used in the 2014 U.S. National Seismic Hazard Maps

    Science.gov (United States)

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.

    2015-01-01

    The National Seismic Hazard Maps (NSHMs) are an important component of seismic design regulations in the United States. This paper compares hazard using the new suite of ground motion models (GMMs) relative to hazard using the suite of GMMs applied in the previous version of the maps. The new source characterization models are used for both cases. A previous paper (Rezaeian et al. 2014) discussed the five NGA-West2 GMMs used for shallow crustal earthquakes in the Western United States (WUS), which are also summarized here. Our focus in this paper is on GMMs for earthquakes in stable continental regions in the Central and Eastern United States (CEUS), as well as subduction interface and deep intraslab earthquakes. We consider building code hazard levels for peak ground acceleration (PGA), 0.2-s, and 1.0-s spectral accelerations (SAs) on uniform firm-rock site conditions. The GMM modifications in the updated version of the maps created changes in hazard within 5% to 20% in WUS; decreases within 5% to 20% in CEUS; changes within 5% to 15% for subduction interface earthquakes; and changes involving decreases of up to 50% and increases of up to 30% for deep intraslab earthquakes for most U.S. sites. These modifications were combined with changes resulting from modifications in the source characterization models to obtain the new hazard maps.

  2. Statistical validation of earthquake related observations

    Science.gov (United States)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into the resulting estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, in prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate", whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions and, therefore, the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance by rejecting the null hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
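
    The Seismic Roulette test sketched above amounts to asking how likely a purely random strategy is to score at least as many hits as the prediction did, given the fraction of catalog locations covered by the alarms. A minimal Monte Carlo version follows; the catalog size, alarm coverage, number of targets, and hit count are hypothetical numbers chosen only to illustrate the test.

```python
# Sketch of the "Seismic Roulette" null-hypothesis test with hypothetical counts.
import numpy as np

rng = np.random.default_rng(7)
n_sectors = 5000          # earthquake locations in the sample catalog (= roulette sectors)
n_alarm = 800             # sectors that fall inside the declared alarm areas
n_targets = 20            # target earthquakes scored in the prospective test
n_hits = 9                # targets that occurred inside alarms

p_hit = n_alarm / n_sectors                              # single-spin hit probability
sims = rng.binomial(n_targets, p_hit, size=200_000)      # random spins of the wheel
p_value = np.mean(sims >= n_hits)                        # chance of doing this well at random

print(f"hit probability per target under the null: {p_hit:.3f}")
print(f"P(random coincidence gives >= {n_hits}/{n_targets} hits) ~ {p_value:.4f}")
# A small p-value rejects random coincidence; a large one leaves the "precursor" unproven.
```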

  3. Evidence of a Large Triggered Event in the Nepal Himalaya Following the Gorkha Earthquake: Implications Toward Enhanced Seismic Hazard

    Science.gov (United States)

    Mandal, Prantik

    2018-03-01

    A DC (double couple) constrained multiple point-source moment-tensor inversion is performed on the band-passed (0.008-0.10 Hz) displacement data of the 25 April 2015 (Mw 7.8) Nepal mainshock, from 17 broadband stations in India. Our results reveal that the 25 April event (strike = 324°, dip = 14°, rake = 88°) ruptured the north-dipping Main Himalayan Thrust (MHT) at 16 km depth. We modeled the Coulomb failure stress changes (ΔCFS) produced by the slip on the fault plane of the 25 April Nepal mainshock. A strong correlation between the occurrence of aftershocks and regions of increased positive ΔCFS is obtained below the aftershock zone of the 2015 Nepal mainshock. We notice that the predicted ΔCFS at 16 km depth shows a positive Coulomb stress of 0.06 MPa at the location of the 12 May 2015 event. Such small modeled stress changes can trigger events if the crust is already close to failure, and they can also advance the occurrence of future earthquakes. The main finding of our ΔCFS modeling is that the 25 April event increased the Coulomb stress by 0.06 MPa at 16 km depth below the site of the 12 May event, and thus this event can be termed triggered. We propose that the seismic hazard in the Himalaya is not only caused by the mainshock slip on the MHT; rather, the occurrence of a large triggered event on the MHT can also enhance our understanding of the seismic hazard in the Nepal Himalaya.

  4. Dynamic rupture scenarios from Sumatra to Iceland - High-resolution earthquake source physics on natural fault systems

    Science.gov (United States)

    Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie

    2017-04-01

    Capturing the observed complexity of earthquake sources in dynamic rupture simulations may require non-linear fault friction, thermal and fluid effects, heterogeneous initial conditions for fault stress and fault strength, fault curvature and roughness, and on- and off-fault non-elastic failure. All of these factors have been independently shown to alter dynamic rupture behavior and thus possibly influence the degree of realism attainable via simulated ground motions. In this presentation we show examples of high-resolution earthquake scenarios, e.g. based on the 2004 Sumatra-Andaman earthquake, the 1994 Northridge earthquake and a potential rupture of the Husavik-Flatey fault system in Northern Iceland. The simulations combine a multitude of representations of source complexity at the necessary spatio-temporal resolution, enabled by excellent scalability on modern HPC systems. Such simulations allow an analysis of the dominant factors impacting earthquake source physics and ground motions given distinct tectonic settings or distinct focuses of seismic hazard assessment. Across all simulations, we find that fault geometry, together with the regional background stress state, provides a first-order influence on source dynamics and the emanated seismic wave field. The dynamic rupture models are performed with SeisSol, a software package based on an ADER-Discontinuous Galerkin scheme for solving the spontaneous dynamic earthquake rupture problem with high-order accuracy in space and time. Use of unstructured tetrahedral meshes allows for a realistic representation of the non-planar fault geometry, subsurface structure and bathymetry. The results presented highlight the fact that modern numerical methods are essential to further our understanding of earthquake source physics and to complement both physics-based ground motion research and empirical approaches in seismic hazard analysis.

  5. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies and, last but not least, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  6. Effects of deep basins on structural collapse during large subduction earthquakes

    Science.gov (United States)

    Marafi, Nasser A.; Eberhard, Marc O.; Berman, Jeffrey W.; Wirth, Erin A.; Frankel, Arthur

    2017-01-01

    Deep sedimentary basins are known to increase the intensity of ground motions, but this effect is not explicitly considered in the seismic hazard maps used in U.S. building codes. The basin amplification of ground motions from subduction earthquakes is particularly important in the Pacific Northwest, where the hazard at long periods is dominated by such earthquakes. This paper evaluates the effects of basins on spectral accelerations, ground-motion duration, spectral shape, and structural collapse using subduction earthquake recordings from basins in Japan that have depths similar to the Puget Lowland basin. For three of the Japanese basins and the Puget Lowland basin, the spectral accelerations were amplified by a factor of 2 to 4 for periods above 2.0 s. Combined, the long duration of subduction earthquakes and the effects of basins on spectral shape lower the spectral accelerations at collapse for a set of building archetypes relative to other ground motions. For the hypothetical case in which these motions represent the entire hazard, the archetypes would need up to 3.3 times their strength to compensate for these effects.

  7. Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    Science.gov (United States)

    Necmioglu, Ocal; Meral Ozel, Nurcan

    2015-04-01

    Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes accurate a priori assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a "maximum earthquake" predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as for more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic tsunami hazard analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° × 0.5° bin for 0-40 km depth (a total of 310 bins) and for 40-100 km depth (a total of 92 bins) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with a coarse (2 arc-min) grid resolution have been run at EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source defined, using the shallow-water finite-difference SWAN code (Mader, 2004), for the magnitude range of 6.5 to Mwmax defined for that bin with an Mw increment of 0.1. Results show that not only the earthquakes resembling the

  8. Probabilistic seismic hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D. [New Brunswick Power Corp., Point Lepreau Generating Station, Lepreau, New Brunswick (Canada); Lavine, A. [AMEC Foster Wheeler Environment and Infrastructure Americas, Oakland, California (United States); Egan, J. [SAGE Engineers, Oakland, California (United States)

    2015-09-15

    A Probabilistic Seismic Hazard Assessment (PSHA) has been performed for the Point Lepreau Generating Station (PLGS). The objective is to provide characterization of the earthquake ground shaking that will be used to evaluate seismic safety. The assessment is based on the current state of knowledge of the informed scientific and engineering community regarding earthquake hazards in the site region, and includes two primary components: a seismic source model and a ground motion model. This paper provides the methodology and results of the PLGS PSHA. The implications of the updated hazard information for site safety are discussed in a separate paper. (author)

  10. The HayWired earthquake scenario—Engineering implications

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2018-04-18

    The HayWired Earthquake Scenario—Engineering Implications is the second volume of U.S. Geological Survey (USGS) Scientific Investigations Report 2017–5013, which describes the HayWired scenario, developed by USGS and its partners. The scenario is a hypothetical yet scientifically realistic earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after a magnitude-7 earthquake (mainshock) on the Hayward Fault and its aftershocks. Analyses in this volume suggest that (1) 800 deaths and 16,000 nonfatal injuries result from shaking alone, plus property and direct business interruption losses of more than $82 billion from shaking, liquefaction, and landslides; (2) the building code is designed to protect lives, but even if all buildings in the region complied with current building codes, 0.4 percent could collapse, 5 percent could be unsafe to occupy, and 19 percent could have restricted use; (3) people expect, prefer, and would be willing to pay for greater resilience of buildings; (4) more than 22,000 people could require extrication from stalled elevators, and more than 2,400 people could require rescue from collapsed buildings; (5) the average East Bay resident could lose water service for 6 weeks, some for as long as 6 months; (6) older steel-frame high-rise office buildings and new reinforced-concrete residential buildings in downtown San Francisco and Oakland could be unusable for as long as 10 months; (7) about 450 large fires could result in a loss of residential and commercial building floor area equivalent to more than 52,000 single-family homes and cause property (building and content) losses approaching $30 billion; and (8) combining earthquake early warning (ShakeAlert) with “drop, cover, and hold on” actions could prevent as many as 1,500 nonfatal injuries out of 18,000 total estimated nonfatal injuries from shaking and liquefaction hazards.

  11. Knowledge base about earthquakes as a tool to minimize strong events consequences

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of a knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used for the calibration of near-real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings and casualty estimates. Such calibration compensates for some of the factors that influence the reliability of expected damage and loss assessments in "emergency" mode. The knowledge base contains descriptions of the consequences of past earthquakes in the area under study. It also includes the distribution of the built environment and population at the time of each event. Computer simulation of the events recorded in the knowledge base allows the determination of sets of regional calibration coefficients, including the rating of seismological surveys, peculiarities of shaking intensity attenuation, and changes in the building stock and population distribution, in order to minimize the error of loss estimations for damaging earthquakes in "emergency" mode. References 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used in Worldwide Systems for Earthquake Loss Estimation in Emergency Mode: Wenchuan Earthquake. In: Proc. TIEMS 2010 Conference, Beijing, China, 2010. 3. Frolova, N. I., Larionov, V. I., Bonnin, J., Sushchev, S. P., Ugarov, A. N., Kozlov, M. A.: Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards, Journal of the International Society for the Prevention and Mitigation of Natural Hazards, vol. 84, ISSN 0921-030, DOI 10.1007/s11069-016-2653

  12. Basic earthquake engineering from seismology to analysis and design

    CERN Document Server

    Sucuoğlu, Halûk

    2014-01-01

    This book provides senior undergraduate students, master's students and structural engineers who do not have a background in the field with core knowledge of structural earthquake engineering that will be invaluable in their professional lives. The basics of seismotectonics, including the causes, magnitude, and intensity of earthquakes, are first explained. Then the book introduces the basic elements of seismic hazard analysis and presents the concept of a seismic hazard map for use in seismic design. Subsequent chapters cover key aspects of the response analysis of simple systems and building structures to earthquake ground motions, design spectrum, the adoption of seismic analysis procedures in seismic design codes, seismic design principles and seismic design of reinforced concrete structures. Helpful worked examples on seismic analysis of linear, nonlinear and base isolated buildings, earthquake-resistant design of frame and frame-shear wall systems are included, most of which can be solved using a hand calcu...

  13. Earthquake Probability Assessment for the Active Faults in Central Taiwan: A Case Study

    Directory of Open Access Journals (Sweden)

    Yi-Rui Lee

    2016-06-01

    High seismic activity occurs frequently in Taiwan due to fast plate motions. According to the historical records, the most destructive earthquakes in Taiwan were caused mainly by inland active faults. The Central Geological Survey (CGS) of Taiwan has published active fault maps of Taiwan since 1998. There are 33 active faults noted in the 2012 active fault map. After the Chi-Chi earthquake, the CGS launched a series of projects to investigate the active faults in detail and better understand each of them. This article collects these data to develop active fault parameters and draws on experience from Japan and the United States to establish a methodology for earthquake probability assessment based on active faults. We consider the active faults in Central Taiwan a good example with which to present the earthquake probability assessment process and results. An appropriate "probability model" was used to estimate the conditional probability of M ≥ 6.5 and M ≥ 7.0 earthquakes. Our results show that the fault with the highest probability of an M ≥ 6.5 earthquake occurring in 30, 50, and 100 years in Central Taiwan is the Tachia-Changhua fault system; conversely, the lowest probability is associated with the Chelungpu fault. The goal of our research is to calculate the earthquake probability of the 33 active faults in Taiwan. The active fault parameters are important information that can be applied in subsequent seismic hazard analysis and seismic simulation.
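
    The simplest "probability model" for a fault is the time-independent Poisson model, under which the probability of at least one event of a given size in an exposure time t is 1 - exp(-t / T_r), with T_r the mean recurrence interval. The sketch below evaluates this for 30-, 50-, and 100-year windows; the fault names and recurrence intervals are hypothetical placeholders, and the cited study may well use time-dependent renewal models instead.

```python
# Sketch of fault rupture probabilities under a time-independent Poisson model.
# Fault names and mean recurrence intervals are hypothetical placeholders.
import math

faults = {"fault_A": 150.0, "fault_B": 600.0, "fault_C": 2500.0}  # mean recurrence (yr)

for name, t_r in faults.items():
    probs = [1.0 - math.exp(-t / t_r) for t in (30, 50, 100)]
    print(f"{name}: P(30 yr) = {probs[0]:.2f}, P(50 yr) = {probs[1]:.2f}, "
          f"P(100 yr) = {probs[2]:.2f}")
```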

  14. Safety analysis of nuclear containment vessels subjected to strong earthquakes and subsequent tsunamis

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Feng; Li, Hong Zhi [Dept. Structural Engineering, Tongji University, Shanghai (China)

    2017-08-15

    Nuclear power plants under expansion and under construction in China are mostly located in coastal areas, which means they are at risk of suffering strong earthquakes and subsequent tsunamis. This paper presents a safety analysis of a new reinforced concrete containment vessel in such events. A finite element method-based model was built, verified, and first used to understand the seismic performance of the containment vessel under earthquakes of increasing intensity. The model was then used to assess the safety performance of the containment vessel subjected to an earthquake with a peak ground acceleration (PGA) of 0.56g and subsequent tsunamis with increasing inundation depths, similar to the 2011 Great East Japan earthquake and tsunami. Results indicated that the containment vessel reached Limit State I (concrete cracking) and Limit State II (concrete crushing) when the PGAs were in the ranges of 0.8–1.1g and 1.2–1.7g, respectively. The containment vessel reached Limit State I at a tsunami inundation depth of 10 m after suffering an earthquake with a PGA of 0.56g. A site-specific hazard assessment was conducted to consider the likelihood of tsunami sources.

  15. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the Messina-Reggio Calabria earthquake of 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since for the 1985 earthquake accelerograms were recorded at El Almendral soil conditions as well as on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended to use in the future more suitable instrumental parameters, such as the destructiveness potential factor, to describe earthquake demand

  16. Influence of behavioral biases on the assessment of multi-hazard risks and the implementation of multi-hazard risks mitigation measures: case study of multi-hazard cyclone shelters in Tamil Nadu, India

    Science.gov (United States)

    Komendantova, Nadejda; Patt, Anthony

    2013-04-01

    In December 2004, a multiple-hazard event devastated the Tamil Nadu province of India. The Sumatra-Andaman earthquake, with a magnitude of Mw = 9.1-9.3, caused the Indian Ocean tsunami, with wave heights up to 30 m and flooding that reached up to two kilometers inland in some locations. More than 7,790 persons were killed in the province of Tamil Nadu, 206 of them in its capital Chennai. The time lag between the earthquake and the tsunami's arrival in India was over an hour; therefore, if a suitable early warning system, a proper means of communicating the warning, and shelters for people had existed, several thousand human lives could have been saved, even though the destruction of infrastructure would not have been prevented. India has over forty years of experience in the construction of cyclone shelters. With additional effort and investment, these shelters could be adapted to other types of hazards such as tsunamis and flooding, alongside the construction of new multi-hazard cyclone shelters (MPCS). It would therefore be possible to mitigate one hazard, such as cyclones, by constructing a network of shelters while, with some additional investment, adapting these shelters to also deal with, for example, tsunamis. In this historical case, the failure to consider multiple hazards caused significant human losses. The current paper investigates the patterns of the national decision-making process with regard to multiple-hazard mitigation measures and how the presence of behavioral and cognitive biases influenced the perceptions of the probabilities of multiple hazards and the choices made for their mitigation by the national decision-makers. Our methodology was based on the analysis of existing reports from national and international organizations as well as available scientific literature on behavioral economics and natural hazards. The results identified several biases in the national decision-making process when the

  17. Demonstration of pb-PSHA with Ras-Elhekma earthquake, Egypt

    Directory of Open Access Journals (Sweden)

    Elsayed Fergany

    2017-06-01

    Full Text Available The main goals of this work are to: (1) argue for the importance of a physically-based probabilistic seismic hazard analysis (pb-PSHA) methodology and show examples from recent events to support the argument, and (2) demonstrate the methodology with ground motion simulations of the May 28, 1998, Mw = 5.5 Ras-Elhekma earthquake, north Egypt. The boundaries for the possible rupture parameters that could have been identified prior to the 1998 Ras-Elhekma earthquake were estimated. A range of simulated ground motions for the Ras-Elhekma earthquake was “predicted” for frequencies of 0.5–25 Hz at three sites where the earthquake was recorded, at average epicentral distances of 220 km. The best rupture model of the 1998 Ras-Elhekma earthquake was identified by calculating the goodness of fit between observed and synthesized records at sites FYM, HAG, and KOT. We used the best rupture scenario of the 1998 earthquake to synthesize the ground motions at sites of interest where the main shock was not recorded. Based on the good fit between simulated and observed seismograms, we conclude that this methodology can provide realistic ground motions for an earthquake and is highly recommended for engineering purposes in advance of large earthquakes at sites without records. We propose that this methodology is needed to better represent the true hazard while reducing uncertainties.
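    The goodness-of-fit step described above can be illustrated with a generic misfit measure between an observed and a synthetic seismogram; the zero-lag correlation and normalized RMS misfit used below are common, generic choices and are not the specific measure adopted in the study.

```python
# Hedged sketch: a simple goodness-of-fit measure between an observed and a
# synthetic seismogram (zero-lag correlation and normalized RMS misfit).
# The traces below are synthetic and purely illustrative.
import numpy as np


def goodness_of_fit(observed, synthetic):
    obs = observed - observed.mean()
    syn = synthetic - synthetic.mean()
    correlation = np.dot(obs, syn) / (np.linalg.norm(obs) * np.linalg.norm(syn))
    nrms = np.linalg.norm(obs - syn) / np.linalg.norm(obs)
    return correlation, nrms


if __name__ == "__main__":
    t = np.linspace(0.0, 10.0, 1000)
    obs = np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)
    syn = 0.9 * np.sin(2 * np.pi * 1.05 * t) * np.exp(-0.28 * t)  # imperfect synthetic
    cc, misfit = goodness_of_fit(obs, syn)
    print(f"zero-lag correlation = {cc:.2f}, normalized RMS misfit = {misfit:.2f}")
```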

  18. Iranian earthquakes, a uniform catalog with moment magnitudes

    Science.gov (United States)

    Karimiparidari, Sepideh; Zaré, Mehdi; Memarian, Hossein; Kijko, Andrzej

    2013-07-01

    A uniform earthquake catalog is an essential tool in any seismic hazard analysis. In this study, an earthquake catalog of Iran and adjacent areas was compiled, using international and national databanks. The following priorities were applied in selecting magnitude and earthquake location: (a) local catalogs were given higher priority for establishing the location of an earthquake and (b) global catalogs were preferred for determining earthquake magnitudes. Earthquakes that have occurred within the bounds between 23-42° N and 42-65° E, with a magnitude range of MW 3.5-7.9, from the third millennium BC until April 2010 were included. In an effort to avoid the "boundary effect," since the newly compiled catalog will be mainly used for seismic hazard assessment, the study area includes the areas adjacent to Iran. The standardization of the catalog in terms of magnitude was achieved by the conversion of all types of magnitude into moment magnitude, MW, by using the orthogonal regression technique. In the newly compiled catalog, all aftershocks were detected, based on the procedure described by Gardner and Knopoff (Bull Seismol Soc Am 64:1363-1367, 1974). The seismicity parameters were calculated for the six main tectonic seismic zones of Iran, i.e., the Zagros Mountain Range, the Alborz Mountain Range, Central Iran, Kope Dagh, Azerbaijan, and Makran.
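    The orthogonal regression used for magnitude conversion can be sketched as a total least-squares fit that allows for error in both magnitude types; the relation and the synthetic (mb, MW) pairs below are invented for illustration and are not the catalog's conversion equations.

```python
# Hedged sketch: orthogonal (total least squares) regression for converting
# one magnitude type to moment magnitude. Synthetic data, illustrative only.
import numpy as np


def orthogonal_regression(x, y):
    """Fit y = a + b*x minimizing orthogonal distances (errors in both variables)."""
    X = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    direction = vt[0]                      # first principal axis of the point cloud
    b = direction[1] / direction[0]
    a = y.mean() - b * x.mean()
    return a, b


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    mb_true = rng.uniform(4.0, 6.5, 200)
    mw_true = 0.85 * mb_true + 1.03            # hypothetical underlying relation
    mb = mb_true + rng.normal(0.0, 0.15, 200)  # both magnitudes carry measurement error
    mw = mw_true + rng.normal(0.0, 0.15, 200)
    a, b = orthogonal_regression(mb, mw)
    print(f"Mw = {a:.2f} + {b:.2f} * mb")
```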

  19. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    Full Text Available A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. Distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground motion-prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen’s seismic hazard are the events from the West Arabian Shield seismic zone.
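    The link between the quoted hazard level (10% probability of exceedance in 50 years) and the 475-year return period follows from the standard Poisson assumption; a minimal sketch of that conversion is given below.

```python
# Hedged sketch: the Poisson relation between probability of exceedance in an
# exposure time and the corresponding return period.
import math


def return_period(prob_exceedance, exposure_years):
    """Return period T such that P(at least one exceedance in t years) = p."""
    return -exposure_years / math.log(1.0 - prob_exceedance)


if __name__ == "__main__":
    for p in (0.10, 0.50):
        print(f"{p:.0%} in 50 yr  ->  return period ~ {return_period(p, 50.0):.0f} yr")
```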

  20. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and
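    A minimal sketch of how a nominal probability gain translates a long-term rate into a short-term forecast probability, illustrating why even hundredfold gains typically leave weekly probabilities of a large earthquake at no more than a few percent; the background rate and forecast window below are hypothetical.

```python
# Hedged sketch: scaling a long-term Poisson rate by a nominal probability
# gain to obtain a short-term probability. Numbers are illustrative only.
import math


def short_term_probability(annual_rate, gain, window_days):
    """Approximate probability of a target event in a short window, given a
    long-term annual Poisson rate scaled by a nominal probability gain."""
    rate = gain * annual_rate * window_days / 365.25
    return 1.0 - math.exp(-rate)


if __name__ == "__main__":
    annual_rate = 1.0 / 200.0   # hypothetical long-term rate: one M>=6 per 200 yr
    for gain in (1, 10, 100):
        p = short_term_probability(annual_rate, gain, window_days=7)
        print(f"gain {gain:>3d}:  P(M>=6 in next week) = {p:.4%}")
```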

  1. Seismic hazard assessment: Issues and alternatives

    Science.gov (United States)

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.

  2. Research on the spatial analysis method of seismic hazard for island

    International Nuclear Information System (INIS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-01-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: at the micro scale its results provide parameters for seismic design, and at the macro scale it is prerequisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS’s Model Builder platform. (paper)

  3. Research on the spatial analysis method of seismic hazard for island

    Science.gov (United States)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: at the micro scale its results provide parameters for seismic design, and at the macro scale it is prerequisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS’s Model Builder platform.
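    The fuzzy comprehensive evaluation step at the core of SAMSHI can be sketched as a weighted composition of index memberships over hazard grades; the example below uses three invented indices, weights, and memberships rather than the 11 indices of the actual model.

```python
# Hedged sketch of a fuzzy comprehensive evaluation step: combine index
# weights with a membership matrix to obtain hazard-grade memberships for a
# site. Indices, weights, and memberships are invented for illustration.
import numpy as np

# Membership of each index (rows) in each hazard grade low/medium/high (columns).
R = np.array([
    [0.1, 0.3, 0.6],   # e.g. proximity to nearest fault (hypothetical index)
    [0.2, 0.5, 0.3],   # e.g. historical earthquake density (hypothetical index)
    [0.4, 0.4, 0.2],   # e.g. Bouguer gravity anomaly gradient (hypothetical index)
])
weights = np.array([0.5, 0.3, 0.2])       # index weights, summing to 1

grade_membership = weights @ R            # weighted-average composition
grades = ["low", "medium", "high"]
print(dict(zip(grades, grade_membership.round(2))))
print("assessed hazard grade:", grades[int(np.argmax(grade_membership))])
```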

  4. Widespread seismicity excitation following the 2011 M=9.0 Tohoku, Japan, earthquake and its implications for seismic hazard

    Science.gov (United States)

    Toda, S.; Stein, R. S.; Lin, J.

    2011-12-01

    trench slope normal faults, the Kanto fragment beneath Tokyo, the Itoigawa-Shizuoka Tectonic Line, and several other major faults were brought significantly closer to failure. Elevated seismicity in these areas is evident and has remained higher than normal during the 4.5 months since the Tohoku earthquake. Since several faults are overdue and closer to their next failure, an urgent update of the probabilistic seismic hazard map incorporating the impact of the great Tohoku earthquake is required.
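    Studies of this kind judge whether receiver faults were brought closer to failure using the Coulomb failure stress change; a minimal sketch of that quantity is given below, with an assumed effective friction coefficient and invented stress changes rather than values from the study.

```python
# Hedged sketch: Coulomb failure stress change on a receiver fault,
# Delta CFS = d_shear + mu' * d_normal (d_normal > 0 means unclamping).
# Positive values bring the fault closer to failure. Values are illustrative.
def coulomb_stress_change(d_shear, d_normal, effective_friction=0.4):
    return d_shear + effective_friction * d_normal


if __name__ == "__main__":
    # Hypothetical receiver fault: 0.05 MPa added shear stress, 0.02 MPa unclamping.
    dcfs = coulomb_stress_change(d_shear=0.05, d_normal=0.02)
    verdict = "closer to failure" if dcfs > 0 else "farther from failure"
    print(f"Delta CFS = {dcfs:.3f} MPa -> {verdict}")
```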

  5. Earthquake research for the safer siting of critical facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cluff, J.L. (ed.)

    1980-01-01

    The task of providing the necessities for living, such as adequate electrical power, water, and fuel, is becoming more complicated with time. Some of the facilities that provide these necessities would present potential hazards to the population if serious damage were to occur to them during earthquakes. Other facilities must remain operable immediately after an earthquake to provide life-support services to people who have been affected. The purpose of this report is to recommend research that will improve the information available to those who must decide where to site these critical facilities, and thereby mitigate the effects of the earthquake hazard. The term critical facility is used in this report to describe facilities that could seriously affect the public well-being through loss of life, large financial loss, or degradation of the environment if they were to fail. The term critical facility also is used to refer to facilities that, although they pose a limited hazard to the public, are considered critical because they must continue to function in the event of a disaster so that they can provide vital services.

  6. A first-order second-moment calculation for seismic hazard assessment with the consideration of uncertain magnitude conversion

    Directory of Open Access Journals (Sweden)

    J. P. Wang

    2013-10-01

    Full Text Available Earthquake size can be described with different magnitudes for different purposes. For example, local magnitude ML is usually adopted to compile an earthquake catalog, and moment magnitude Mw is often prescribed by a ground motion model. Understandably, when inconsistent units are encountered in an earthquake analysis, magnitude conversion needs to be performed beforehand. However, the conversion is not expected at full certainty owing to the model error of empirical relationships. This paper introduces a novel first-order second-moment (FOSM) calculation to estimate the annual rate of earthquake motion (or seismic hazard) on a probabilistic basis, including the consideration of the uncertain magnitude conversion and three other sources of earthquake uncertainties. In addition to the methodology, this novel FOSM application to engineering seismology is demonstrated in this paper with a case study. With a local ground motion model, magnitude conversion relationship and earthquake catalog, the analysis shows that the best-estimate annual rate of peak ground acceleration (PGA) greater than 0.18 g (induced by earthquakes) is 0.002 per year at a site in Taipei, given the uncertainties of magnitude conversion, earthquake size, earthquake location, and motion attenuation.

  7. A first-order second-moment calculation for seismic hazard assessment with the consideration of uncertain magnitude conversion

    Science.gov (United States)

    Wang, J. P.; Yun, X.; Wu, Y.-M.

    2013-10-01

    Earthquake size can be described with different magnitudes for different purposes. For example, local magnitude ML is usually adopted to compile an earthquake catalog, and moment magnitude Mw is often prescribed by a ground motion model. Understandably, when inconsistent units are encountered in an earthquake analysis, magnitude conversion needs to be performed beforehand. However, the conversion is not expected at full certainty owing to the model error of empirical relationships. This paper introduces a novel first-order second-moment (FOSM) calculation to estimate the annual rate of earthquake motion (or seismic hazard) on a probabilistic basis, including the consideration of the uncertain magnitude conversion and three other sources of earthquake uncertainties. In addition to the methodology, this novel FOSM application to engineering seismology is demonstrated in this paper with a case study. With a local ground motion model, magnitude conversion relationship and earthquake catalog, the analysis shows that the best-estimate annual rate of peak ground acceleration (PGA) greater than 0.18 g (induced by earthquakes) is 0.002 per year at a site in Taipei, given the uncertainties of magnitude conversion, earthquake size, earthquake location, and motion attenuation.
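    A generic first-order second-moment propagation can be sketched as follows: the mean and standard deviation of a function of uncertain inputs are approximated from the input moments and numerical gradients. The toy ground-motion function and the input moments below are illustrative assumptions, not the paper's model.

```python
# Hedged sketch of a generic FOSM propagation with independent inputs:
# E[Y] ~ g(mu), Var[Y] ~ sum_i (dg/dx_i)^2 * sigma_i^2.
import numpy as np


def fosm(g, means, stds, eps=1e-5):
    means = np.asarray(means, dtype=float)
    mean_y = g(means)
    grad = np.zeros_like(means)
    for i in range(means.size):
        step = np.zeros_like(means)
        step[i] = eps
        grad[i] = (g(means + step) - g(means - step)) / (2.0 * eps)  # central difference
    var_y = np.sum((grad * np.asarray(stds)) ** 2)                   # independent inputs assumed
    return mean_y, np.sqrt(var_y)


if __name__ == "__main__":
    # Toy ground-motion function: ln(PGA) as a function of (Mw, ln R). Illustrative only.
    def ln_pga(x):
        mw, ln_r = x
        return -3.5 + 1.0 * mw - 1.2 * ln_r

    mean, std = fosm(ln_pga, means=[6.0, np.log(30.0)], stds=[0.2, 0.1])
    print(f"ln(PGA): mean = {mean:.2f}, std = {std:.2f}")
```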

  8. Understanding Animal Detection of Precursor Earthquake Sounds.

    Science.gov (United States)

    Garstang, Michael; Kelley, Michael C

    2017-08-31

    We use recent research to provide an explanation of how animals might detect earthquakes before they occur. While the intrinsic value of such warnings is immense, we show that the complexity of the process may result in inconsistent responses of animals to the possible precursor signal. Using the results of our research, we describe a logical but complex sequence of geophysical events triggered by precursor earthquake crustal movements that ultimately result in a sound signal detectable by animals. The sound heard by animals occurs only when metal or other surfaces (glass) respond to vibrations produced by electric currents induced by distortions of the earth's electric fields caused by the crustal movements. A combination of existing measurement systems combined with more careful monitoring of animal response could nevertheless be of value, particularly in remote locations.

  9. Time-decreasing hazard and increasing time until the next earthquake

    International Nuclear Information System (INIS)

    Corral, Alvaro

    2005-01-01

    The existence of a slowly but monotonically decreasing probability density for the recurrence times of earthquakes in the stationary case implies that the occurrence of an event at a given instant becomes more unlikely as the time since the previous event increases. Consequently, the expected waiting time to the next earthquake increases with the elapsed time; that is, the expected next event recedes quickly into the future. We have found direct empirical evidence of this counterintuitive behavior in two worldwide catalogs as well as in diverse regional catalogs. Universal scaling functions describe the phenomenon well.
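    The counterintuitive behavior described above can be reproduced with any recurrence-time distribution that has a decreasing hazard rate; the sketch below uses a gamma distribution with shape parameter below one, chosen purely for illustration, and shows the expected residual waiting time growing with elapsed time.

```python
# Hedged sketch: expected waiting time to the next event, conditional on the
# elapsed time since the last one, for a decreasing-hazard recurrence
# distribution (gamma with shape < 1). Parameters are illustrative only.
import numpy as np
from scipy.stats import gamma

shape, mean_recurrence = 0.7, 100.0                 # illustrative parameters (days)
dist = gamma(a=shape, scale=mean_recurrence / shape)

dt = 0.01
t_grid = np.arange(dt, 3000.0, dt)                  # start off zero (pdf diverges at 0)
pdf = dist.pdf(t_grid)

for elapsed in (0.0, 50.0, 200.0, 500.0):
    mask = t_grid > elapsed
    # E[T - t0 | T > t0] by numerical integration of the tail.
    residual = np.sum((t_grid[mask] - elapsed) * pdf[mask]) * dt / dist.sf(elapsed)
    print(f"elapsed {elapsed:6.0f} d -> expected wait to next event {residual:7.1f} d")
```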

  10. The Psychology of Hazard Risk Perception

    Science.gov (United States)

    Thompson, K. F.

    2012-12-01

    A critical step in preparing for natural hazards is understanding the risk: what is the hazard, its likelihood and range of impacts, and what are the vulnerabilities of the community? Any hazard forecast naturally includes a degree of uncertainty, and often these uncertainties are expressed in terms of probabilities. There is often a strong understanding of probability among the physical scientists and emergency managers who create hazard forecasts and issue watches, warnings, and evacuation orders, and often such experts expect similar levels of risk fluency among the general public—indeed, the Working Group on California Earthquake Probabilities (WGCEP) states in the introduction to its earthquake rupture forecast maps that "In daily living, people are used to making decisions based on probabilities—from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [1] However, cognitive psychologists have shown in numerous studies [see, e.g., 2-5] that the WGCEP's expectation of probability literacy is inaccurate. People neglect, distort, misjudge, or misuse probability information, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [6]. Even the most ubiquitous of probabilistic information—weather forecasts—are systematically misinterpreted [7]. So while disaster risk analysis and assessment is undoubtedly a critical step in public preparedness and hazard mitigation plans, it is equally important that scientists and practitioners understand the common psychological barriers to accurate probability perception before they attempt to communicate hazard risks to the public. This paper discusses several common, systematic distortions in probability perception and use, including: the influence of personal experience on use of statistical information; temporal discounting and construal level theory; the effect

  11. Fractal analysis of the spatial distribution of earthquakes along the Hellenic Subduction Zone

    Science.gov (United States)

    Papadakis, Giorgos; Vallianatos, Filippos; Sammonds, Peter

    2014-05-01

    The Hellenic Subduction Zone (HSZ) is the most seismically active region in Europe, and many destructive earthquakes have taken place along it in the past. The evolution of such active regions is expressed through seismicity and is characterized by complex phenomenology. Understanding the tectonic evolution process and the physical state of subducting regimes is crucial for earthquake prediction. In recent years there has been growing interest in an approach to seismicity based on the science of complex systems (Papadakis et al., 2013; Vallianatos et al., 2012). In this study we calculate the fractal dimension of the spatial distribution of earthquakes along the HSZ and aim to understand the significance of the obtained values for the tectonic and geodynamic evolution of this area. We use the external seismic sources provided by Papaioannou and Papazachos (2000) to create a dataset for the subduction zone; following these authors, we define five seismic zones. We then build an earthquake dataset based on the updated and extended earthquake catalogue for Greece and the adjacent areas by Makropoulos et al. (2012), covering the period 1976-2009. The fractal dimension of the spatial distribution of earthquakes is calculated for each seismic zone and for the HSZ as a unified system using the box-counting method (Turcotte, 1997; Robertson et al., 1995; Caneva and Smirnov, 2004). Moreover, the variation of the fractal dimension is examined in different time windows. These spatiotemporal variations could be used as an additional index of the physical state of each seismic zone, and the use of the fractal dimension as a precursor in earthquake forecasting appears to be a very interesting direction for future work. Acknowledgements Giorgos Papadakis wishes to acknowledge the Greek State Scholarships Foundation (IKY). References Caneva, A., Smirnov, V., 2004. Using the fractal dimension of earthquake distributions and the
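    A minimal sketch of the box-counting estimate of the fractal dimension of an epicenter distribution is given below; the synthetic epicenters and box sizes are illustrative and are not the HSZ dataset.

```python
# Hedged sketch: box-counting fractal dimension of an epicenter set. Count
# occupied boxes N(s) for several box sizes s and fit log N(s) vs log(1/s).
# The synthetic epicenters are illustrative only.
import numpy as np


def box_counting_dimension(points, sizes):
    counts = []
    for s in sizes:
        # Assign each point to a box of side s and count distinct occupied boxes.
        boxes = np.floor(points / s).astype(int)
        counts.append(len({tuple(b) for b in boxes}))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Synthetic epicenters scattered along a rough band (degrees of lon/lat).
    lon = rng.uniform(22.0, 30.0, 3000)
    lat = 34.5 + 0.15 * lon + rng.normal(0.0, 0.2, 3000)
    points = np.column_stack([lon, lat])
    sizes = np.array([0.1, 0.2, 0.4, 0.8, 1.6])
    print(f"box-counting dimension ~ {box_counting_dimension(points, sizes):.2f}")
```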

  12. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a version for large mainframe computers; in 2001, a personal computer version was provided to improve the operational efficiency and generality of the code. Earthquake hazard analysis, display, and print functions can be performed through the graphical user interface. With the SHEAT for PC code, seismic hazard, which is defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site, is calculated in the following two steps, as is done with the large-computer version. The first is the modeling of earthquake generation around a site: future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on historical earthquake records, active fault data and expert judgment. The second is the calculation of the probabilistic seismic hazard at the site: a ground motion is calculated for each postulated earthquake using an attenuation model, taking into account its standard deviation, and the seismic hazard at the site is then obtained by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) the functions of the subprograms and the analytical models in them, (3) guidance on input and output data, (4) a sample run result, and (5) an operational manual. (author)
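    The second step described above, summing over postulated earthquakes the annual rate times the probability that the attenuated ground motion exceeds a target level, can be sketched as below; the toy attenuation relation, its scatter, and the source list are invented for illustration and are not the models used in the SHEAT code.

```python
# Hedged sketch: annual exceedance frequency of a target PGA, summed over a
# list of postulated earthquakes with a lognormal attenuation scatter.
# Attenuation relation and sources are invented for illustration.
import numpy as np
from scipy.stats import norm


def ln_median_pga(mw, dist_km):
    """Toy attenuation relation (illustrative only), PGA in g."""
    return -3.5 + 1.0 * mw - 1.3 * np.log(dist_km + 10.0)


def annual_exceedance(target_pga_g, sources, sigma_ln=0.6):
    freq = 0.0
    for annual_rate, mw, dist_km in sources:
        ln_med = ln_median_pga(mw, dist_km)
        p_exceed = norm.sf(np.log(target_pga_g), loc=ln_med, scale=sigma_ln)
        freq += annual_rate * p_exceed
    return freq


if __name__ == "__main__":
    # (annual rate, magnitude, distance in km) of postulated earthquakes.
    sources = [(0.01, 7.0, 30.0), (0.05, 6.0, 20.0), (0.2, 5.0, 15.0)]
    for a in (0.1, 0.2, 0.4):
        print(f"annual frequency of PGA > {a:.1f} g : {annual_exceedance(a, sources):.2e}")
```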

  13. Numerical simulation of the 1976 Ms7.8 Tangshan Earthquake

    Science.gov (United States)

    Li, Zhengbo; Chen, Xiaofei

    2017-04-01

    An Ms 7.8 earthquake struck Tangshan in 1976, causing more than 240,000 deaths and almost destroying the whole city. Numerous studies have indicated that the surface rupture zone extends 8 to 11 km in the south of Tangshan City. The fault system is composed of more than ten NE-trending, right-lateral strike-slip, left-stepping echelon faults, with a general strike direction of N30°E. However, recent studies have proposed that surface ruptures appeared over a larger area. To simulate the rupture process closer to the real situation, the curvilinear grid finite difference method presented by Zhang et al. (2006, 2014), which can handle the free surface and complex geometry, was implemented to investigate the dynamic rupture and ground motion of the Tangshan earthquake. Using data from field surveys, seismic sections, boreholes and trenching results given by different studies, several fault geometry models were established. The intensity, seismic waveforms and displacements resulting from the simulations of the different models were compared with the observed data. The comparison of these models shows details of the rupture process of the Tangshan earthquake and implies that super-shear rupture may have occurred, which is important for a better understanding of this complicated rupture process and of the seismic hazard distribution of this earthquake.

  14. Catalog of Hawaiian earthquakes, 1823-1959

    Science.gov (United States)

    Klein, Fred W.; Wright, Thomas L.

    2000-01-01

    This catalog of more than 17,000 Hawaiian earthquakes (of magnitude greater than or equal to 5), principally located on the Island of Hawaii, from 1823 through the third quarter of 1959 is designed to expand our ability to evaluate seismic hazard in Hawaii, as well as our knowledge of Hawaiian seismic rhythms as they relate to eruption cycles at Kilauea and Mauna Loa volcanoes and to subcrustal earthquake patterns related to the tectonic evolution of the Hawaiian chain.

  15. Including foreshocks and aftershocks in time-independent probabilistic seismic hazard analyses

    Science.gov (United States)

    Boyd, Oliver S.

    2012-01-01

    Time‐independent probabilistic seismic‐hazard analysis treats each source as being temporally and spatially independent; hence foreshocks and aftershocks, which are both spatially and temporally dependent on the mainshock, are removed from earthquake catalogs. Yet, intuitively, these earthquakes should be considered part of the seismic hazard, capable of producing damaging ground motions. In this study, I consider the mainshock and its dependents as a time‐independent cluster, each cluster being temporally and spatially independent from any other. The cluster has a recurrence time of the mainshock; and, by considering the earthquakes in the cluster as a union of events, dependent events have an opportunity to contribute to seismic ground motions and hazard. Based on the methods of the U.S. Geological Survey for a high‐hazard site, the inclusion of dependent events causes ground motions that are exceeded at probability levels of engineering interest to increase by about 10% but could be as high as 20% if variations in aftershock productivity can be accounted for reliably.
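    Treating the mainshock and its dependent events as a union of events can be sketched with the elementary probability identity below; the per-event exceedance probabilities are invented for illustration and are not taken from the study.

```python
# Hedged sketch: exceedance probability of a ground-motion level from a
# cluster treated as a union of (assumed independent) events,
# P = 1 - prod(1 - P_i). Per-event probabilities are illustrative.
import numpy as np


def cluster_exceedance(per_event_probs):
    """P(at least one event in the cluster exceeds the ground-motion level)."""
    per_event_probs = np.asarray(per_event_probs)
    return 1.0 - np.prod(1.0 - per_event_probs)


if __name__ == "__main__":
    mainshock_only = cluster_exceedance([0.10])
    with_dependents = cluster_exceedance([0.10, 0.005, 0.003, 0.002])  # + 3 aftershocks
    increase = 100.0 * (with_dependents / mainshock_only - 1.0)
    print(f"mainshock only      : {mainshock_only:.3f}")
    print(f"mainshock + cluster : {with_dependents:.3f} ({increase:.0f}% increase)")
```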

  16. The Contribution of Palaeoseismology to Seismic Hazard Assessment in Site Evaluation for Nuclear Installations

    International Nuclear Information System (INIS)

    2015-06-01

    IAEA Safety Standards Series No. SSG-9, Seismic Hazards in Site Evaluation for Nuclear Installations, published in 2010, covers all aspects of site evaluation relating to seismic hazards and recommends the use of prehistoric, historical and instrumental earthquake data in seismic hazard assessments. Prehistoric data on earthquakes cover a much longer period than do historical and instrumental data. However, gathering such data is generally difficult in most regions of the world, owing to an absence of human records. Prehistoric data on earthquakes can be obtained through the use of palaeoseismic techniques. This publication describes the current status and practices of palaeoseismology, in order to support Member States in meeting the recommendations of SSG-9 and in establishing the necessary earthquake related database for seismic hazard assessment and reassessment. At a donors’ meeting of the International Seismic Safety Centre Extrabudgetary Project in January 2011, it was suggested to develop detailed guidelines on seismic hazards. Soon after the meeting, the disastrous Great East Japan Earthquake and Tsunami of 11 March 2011 and the consequent accident at the Fukushima Daiichi nuclear power plant occurred. The importance of palaeoseismology for seismic hazard assessment in site evaluation was highlighted by the lessons learned from the Fukushima Daiichi nuclear power plant accident. However, no methodology for performing investigations using palaeoseismic techniques has so far been available in an IAEA publication. The detailed guidelines and practical tools provided here will be of value to nuclear power plant operating organizations, regulatory bodies, vendors, technical support organizations and researchers in the area of seismic hazard assessment in site evaluation for nuclear installations, and the information will be of importance in support of hazard assessments in the future

  17. Maturity of nearby faults influences seismic hazard from hydraulic fracturing

    Science.gov (United States)

    Kozłowska, Maria; Brudzinski, Michael R.; Friberg, Paul; Skoumal, Robert J.; Baxter, Nicholas D.; Currie, Brian S.

    2018-02-01

    Understanding the causes of human-induced earthquakes is paramount to reducing societal risk. We investigated five cases of seismicity associated with hydraulic fracturing (HF) in Ohio since 2013 that, because of their isolation from other injection activities, provide an ideal setting for studying the relations between high-pressure injection and earthquakes. Our analysis revealed two distinct groups: (i) deeper earthquakes in the Precambrian basement, with larger magnitudes (M > 2), b-values of about 1, and many post–shut-in earthquakes, and (ii) shallower earthquakes in Paleozoic formations, with smaller magnitudes (M < 1), b-values of about 1.5, and few post–shut-in earthquakes. Based on geologic history, laboratory experiments, and fault modeling, we interpret the deep seismicity as slip on more mature faults in older crystalline rocks and the shallow seismicity as slip on immature faults in younger sedimentary rocks. This suggests that HF inducing deeper seismicity may pose higher seismic hazards. Wells inducing deeper seismicity produced more water than wells with shallow seismicity, indicating more extensive hydrologic connections outside the target formation, consistent with pore pressure diffusion influencing seismicity. However, for both groups, the 2 to 3 h between onset of HF and seismicity is too short for typical fluid pressure diffusion rates across distances of ~1 km and argues for poroelastic stress transfer also having a primary influence on seismicity.

  18. Maturity of nearby faults influences seismic hazard from hydraulic fracturing.

    Science.gov (United States)

    Kozłowska, Maria; Brudzinski, Michael R; Friberg, Paul; Skoumal, Robert J; Baxter, Nicholas D; Currie, Brian S

    2018-02-20

    Understanding the causes of human-induced earthquakes is paramount to reducing societal risk. We investigated five cases of seismicity associated with hydraulic fracturing (HF) in Ohio since 2013 that, because of their isolation from other injection activities, provide an ideal setting for studying the relations between high-pressure injection and earthquakes. Our analysis revealed two distinct groups: (i) deeper earthquakes in the Precambrian basement, with larger magnitudes (M > 2), b-values of about 1, and many post-shut-in earthquakes, and (ii) shallower earthquakes in Paleozoic formations, with smaller magnitudes (M < 1), b-values of about 1.5, and few post-shut-in earthquakes. Based on geologic history, laboratory experiments, and fault modeling, we interpret the deep seismicity as slip on more mature faults in older crystalline rocks and the shallow seismicity as slip on immature faults in younger sedimentary rocks. This suggests that HF inducing deeper seismicity may pose higher seismic hazards. Wells inducing deeper seismicity produced more water than wells with shallow seismicity, indicating more extensive hydrologic connections outside the target formation, consistent with pore pressure diffusion influencing seismicity. However, for both groups, the 2 to 3 h between onset of HF and seismicity is too short for typical fluid pressure diffusion rates across distances of ∼1 km and argues for poroelastic stress transfer also having a primary influence on seismicity.
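    The b-values used to characterize groups of induced earthquakes like those described above are typically estimated with the Aki maximum-likelihood formula; the sketch below applies it to a synthetic Gutenberg-Richter catalog and is illustrative only (for binned catalogs a half-bin correction is usually added).

```python
# Hedged sketch: Aki maximum-likelihood b-value estimate applied to a
# synthetic Gutenberg-Richter catalog. Data are illustrative only.
import numpy as np


def b_value(magnitudes, completeness):
    """Aki maximum-likelihood b-value for events at or above the completeness
    magnitude (a half-bin correction is usually added for binned catalogs)."""
    m = np.asarray(magnitudes)
    m = m[m >= completeness]
    return np.log10(np.e) / (m.mean() - completeness)


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    mc, b_true = 1.0, 1.5
    # Gutenberg-Richter magnitudes: exponential above Mc with rate b*ln(10).
    mags = mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=2000)
    print(f"estimated b-value ~ {b_value(mags, mc):.2f} (true {b_true})")
```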

  19. Luminescence dating of some historical/pre-historical natural hazards of India

    International Nuclear Information System (INIS)

    Gartia, R.K.

    2008-01-01

    The Indian sub-continent is characterized by a host of natural hazards such as earthquakes, tsunamis, cyclones, floods, and landslides/mudflows. It is necessary to build up a database of historical/pre-historical natural hazards for planning emergency response scenarios for them. In short, there is vast scope for providing a chronology of hazardous events by using known dating techniques, including luminescence dating, which has an excellent dating window spanning from a few hundred years up to one hundred thousand years. In this work we report the dates of some historical/pre-historical natural hazards of India. In particular we focus on three kinds of natural hazards, namely earthquakes, tsunamis, and mudflows. As an example of an earthquake, we cover a historical earthquake of Manipur that created two massive fissures at Kumbi, 25 km from the state capital, Imphal. For pre-historical ones, we cover the Assam-Shillong area, known for some of the highest levels of seismicity in India. We demonstrate the evidence of a paleo-tsunami that devastated Mahabalipuram near Chennai; incidentally, Mahabalipuram was badly affected by the great tsunami of 26th Dec 2004. Finally, the luminescence dating technique has been applied to some historical/pre-historical mudflows of Manipur. A recent mudflow on 10th July 2004 damaged more than 90 houses and blocked National Highway-39, the lifeline of Manipur, for more than a fortnight. (author)

  20. Tsunami hazard map in eastern Bali

    International Nuclear Information System (INIS)

    Afif, Haunan; Cipta, Athanasius

    2015-01-01

    Bali is a popular tourist destination for both Indonesian and foreign visitors. However, Bali is located close to the collision zone between the Indo-Australian Plate and the Eurasian Plate in the south and to the back-arc thrust off the northern coast of Bali, making Bali prone to earthquakes and tsunamis. A tsunami hazard map is needed for a better understanding of the hazard level in a particular area, and tsunami modeling is one of the most reliable techniques for producing such a map. Tsunami modeling was conducted using TUNAMI N2 and set up for two tsunami source scenarios: the subduction zone in the south of Bali and the back thrust in the north of Bali. The tsunami hazard is divided into three zones: the first is a high hazard zone with inundation heights of more than 3 m, the second is a moderate hazard zone with inundation heights of 1 to 3 m, and the third is a low hazard zone with inundation heights of less than 1 m. The two scenarios show that the southern region has a greater potential for tsunami impact than the northern areas. This is clearly seen in the distribution of the inundated area, which in the south of Bali, including the islands of Nusa Penida, Nusa Lembongan and Nusa Ceningan, is wider than on the northern coast of Bali, although the northern region of Nusa Penida Island is more inundated owing to the coastal topography.

  1. Tsunami hazard map in eastern Bali

    Energy Technology Data Exchange (ETDEWEB)

    Afif, Haunan, E-mail: afif@vsi.esdm.go.id [Geological Agency, Bandung (Indonesia); Cipta, Athanasius [Geological Agency, Bandung (Indonesia); Australian National University, Canberra (Australia)

    2015-04-24

    Bali is a popular tourist destination for both Indonesian and foreign visitors. However, Bali is located close to the collision zone between the Indo-Australian Plate and the Eurasian Plate in the south and to the back-arc thrust off the northern coast of Bali, making Bali prone to earthquakes and tsunamis. A tsunami hazard map is needed for a better understanding of the hazard level in a particular area, and tsunami modeling is one of the most reliable techniques for producing such a map. Tsunami modeling was conducted using TUNAMI N2 and set up for two tsunami source scenarios: the subduction zone in the south of Bali and the back thrust in the north of Bali. The tsunami hazard is divided into three zones: the first is a high hazard zone with inundation heights of more than 3 m, the second is a moderate hazard zone with inundation heights of 1 to 3 m, and the third is a low hazard zone with inundation heights of less than 1 m. The two scenarios show that the southern region has a greater potential for tsunami impact than the northern areas. This is clearly seen in the distribution of the inundated area, which in the south of Bali, including the islands of Nusa Penida, Nusa Lembongan and Nusa Ceningan, is wider than on the northern coast of Bali, although the northern region of Nusa Penida Island is more inundated owing to the coastal topography.

  2. Tsunami hazard map in eastern Bali

    Science.gov (United States)

    Afif, Haunan; Cipta, Athanasius

    2015-04-01

    Bali is a popular tourist destination for both Indonesian and foreign visitors. However, Bali is located close to the collision zone between the Indo-Australian Plate and the Eurasian Plate in the south and to the back-arc thrust off the northern coast of Bali, making Bali prone to earthquakes and tsunamis. A tsunami hazard map is needed for a better understanding of the hazard level in a particular area, and tsunami modeling is one of the most reliable techniques for producing such a map. Tsunami modeling was conducted using TUNAMI N2 and set up for two tsunami source scenarios: the subduction zone in the south of Bali and the back thrust in the north of Bali. The tsunami hazard is divided into three zones: the first is a high hazard zone with inundation heights of more than 3 m, the second is a moderate hazard zone with inundation heights of 1 to 3 m, and the third is a low hazard zone with inundation heights of less than 1 m. The two scenarios show that the southern region has a greater potential for tsunami impact than the northern areas. This is clearly seen in the distribution of the inundated area, which in the south of Bali, including the islands of Nusa Penida, Nusa Lembongan and Nusa Ceningan, is wider than on the northern coast of Bali, although the northern region of Nusa Penida Island is more inundated owing to the coastal topography.

  3. The Geological Susceptibility of Induced Earthquakes in the Duvernay Play

    Science.gov (United States)

    Pawley, Steven; Schultz, Ryan; Playter, Tiffany; Corlett, Hilary; Shipman, Todd; Lyster, Steven; Hauck, Tyler

    2018-02-01

    Presently, consensus on the incorporation of induced earthquakes into seismic hazard has yet to be established. For example, the nonstationary, spatiotemporal nature of induced earthquakes is not well understood. Specific to the Western Canada Sedimentary Basin, geological bias in seismogenic activation potential has been suggested to control the spatial distribution of induced earthquakes regionally. In this paper, we train a machine learning algorithm to systemically evaluate tectonic, geomechanical, and hydrological proxies suspected to control induced seismicity. Feature importance suggests that proximity to basement, in situ stress, proximity to fossil reef margins, lithium concentration, and rate of natural seismicity are among the strongest model predictors. Our derived seismogenic potential map faithfully reproduces the current distribution of induced seismicity and is suggestive of other regions which may be prone to induced earthquakes. The refinement of induced seismicity geological susceptibility may become an important technique to identify significant underlying geological features and address induced seismic hazard forecasting issues.
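    A sketch of the kind of supervised model the study describes is given below: a classifier trained on geological and hydrological proxies to predict whether a location hosts induced seismicity. The algorithm choice (a random forest), the three features, and the synthetic data are stand-ins, not the study's actual predictors or implementation.

```python
# Hedged sketch: a classifier trained on geological/hydrological proxies to
# predict induced-seismicity occurrence. Features, labels, and data are
# synthetic stand-ins for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 2000
X = np.column_stack([
    rng.uniform(0.0, 3.0, n),    # proximity to basement (km), hypothetical proxy
    rng.uniform(0.5, 1.0, n),    # in situ stress ratio, hypothetical proxy
    rng.uniform(0.0, 50.0, n),   # distance to reef margin (km), hypothetical proxy
])
# Synthetic labels: activation more likely near basement and reef margins.
logit = 2.0 - 1.5 * X[:, 0] - 0.05 * X[:, 2] + rng.normal(0.0, 0.5, n)
y = (logit > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

print("held-out accuracy:", round(model.score(X_test, y_test), 2))
print("feature importances:", model.feature_importances_.round(2))
```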

  4. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Strong Ground Motion

    Science.gov (United States)

    Borcherdt, Roger D.

    1994-01-01

    Strong ground motion generated by the Loma Prieta, Calif., earthquake (MS~7.1) of October 17, 1989, resulted in at least 63 deaths, more than 3,757 injuries, and damage estimated to exceed $5.9 billion. Strong ground motion severely damaged critical lifelines (freeway overpasses, bridges, and pipelines), caused severe damage to poorly constructed buildings, and induced a significant number of ground failures associated with liquefaction and landsliding. It also caused a significant proportion of the damage and loss of life at distances as far as 100 km from the epicenter. Consequently, understanding the characteristics of the strong ground motion associated with the earthquake is fundamental to understanding the earthquake's devastating impact on society. The papers assembled in this chapter address this problem. Damage to vulnerable structures from the earthquake varied substantially with the distance from the causative fault and the type of underlying geologic deposits. Most of the damage and loss of life occurred in areas underlain by 'soft soil'. Quantifying these effects is important for understanding the tragic concentrations of damage in such areas as Santa Cruz and the Marina and Embarcadero Districts of San Francisco, and the failures of the San Francisco-Oakland Bay Bridge and the Interstate Highway 880 overpass. Most importantly, understanding these effects is a necessary prerequisite for improving mitigation measures for larger earthquakes likely to occur much closer to densely urbanized areas in the San Francisco Bay region. The earthquake generated an especially important data set for understanding variations in the severity of strong ground motion. Instrumental strong-motion recordings were obtained at 131 sites located from about 6 to 175 km from the rupture zone. This set of recordings, the largest yet collected for an event of this size, was obtained from sites on various geologic deposits, including a unique set on 'soft soil' deposits

  5. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Gürpinar, Aybars; Serva, Leonello; Livio, Franz; Rizzo, Paul C.

    2017-01-01

    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  6. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Gürpinar, Aybars, E-mail: aybarsgurpinar2007@yahoo.com [Nuclear & Risk Consultancy, Anisgasse 4, 1221 Vienna (Austria); Serva, Leonello, E-mail: lserva@alice.it [Independent Consultant, Via dei Dauni 1, 00185 Rome (Italy); Livio, Franz, E-mail: franz.livio@uninsubria.it [Dipartimento di Scienza ed Alta Tecnologia, Università degli Studi dell’Insubria, Via Velleggio, 11, 22100 Como (Italy); Rizzo, Paul C., E-mail: paul.rizzo@rizzoasoc.com [RIZZO Associates, 500 Penn Center Blvd., Suite 100, Pittsburgh, PA 15235 (United States)

    2017-01-15

    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  7. Earthquake hazard in Northeast India – A seismic microzonation ...

    Indian Academy of Sciences (India)

    microzonation approach with typical case studies from .... the other hand, Guwahati city represents a case of well-formed basin with ... earthquake prone regions towards developing its ... tonic network and the observed seismicity has been.

  8. Probabilistic seismic hazard assessment for the two layer fault system of Antalya (SW Turkey) area

    Science.gov (United States)

    Dipova, Nihat; Cangir, Bülent

    2017-09-01

    Southwest Turkey, along the Mediterranean coast, is prone to large earthquakes resulting from the subduction of the African plate under the Eurasian plate and from shallow crustal faults. The maximum observed magnitude of subduction earthquakes is Mw = 6.5, whereas that of crustal earthquakes is Mw = 6.6. Crustal earthquakes originate from faults related to the Isparta Angle and Cyprus Arc tectonic structures. The primary goal of this study is to assess the seismic hazard for the Antalya area (SW Turkey) using a probabilistic approach. A new earthquake catalog for the Antalya area, with a unified moment magnitude scale, was prepared in the scope of the study. The seismicity of the area has been evaluated with the Gutenberg-Richter recurrence relationship. For the hazard computation, the CRISIS2007 software was used following the standard Cornell-McGuire methodology. The attenuation model developed by Youngs et al. (Seismol Res Lett 68(1):58-73, 1997) was used for deep subduction earthquakes, and the Chiou and Youngs (Earthq Spectra 24(1):173-215, 2008) model was used for shallow crustal earthquakes. A seismic hazard map was developed for peak ground acceleration on rock ground at a hazard level of a 10% probability of exceedance in 50 years. The results of the study show that peak ground acceleration values on bedrock range between 0.215 and 0.23 g in the center of Antalya.
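    The Gutenberg-Richter recurrence relationship used to characterize seismicity can be sketched as a least-squares fit of log cumulative annual rates against magnitude; the synthetic catalog below is illustrative and is not the Antalya data.

```python
# Hedged sketch: fitting log10 N(>=M) = a - b*M to a synthetic catalog by
# least squares on cumulative annual rates. Illustrative data only.
import numpy as np

rng = np.random.default_rng(5)
catalog_years, b_true, mc = 100.0, 1.0, 4.0
mags = mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=1500)

bins = np.arange(4.0, 6.6, 0.25)
cum_counts = np.array([(mags >= m).sum() for m in bins])
rates = cum_counts / catalog_years                 # annual rates N(>=M)

mask = cum_counts > 0
slope, intercept = np.polyfit(bins[mask], np.log10(rates[mask]), 1)
print(f"log10 N(>=M) = {intercept:.2f} - {-slope:.2f} M   (per year)")
```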

  9. Quasi-periodic recurrence of large earthquakes on the southern San Andreas fault

    Science.gov (United States)

    Scharer, Katherine M.; Biasi, Glenn P.; Weldon, Ray J.; Fumal, Tom E.

    2010-01-01

    It has been 153 yr since the last large earthquake on the southern San Andreas fault (California, United States), but the average interseismic interval is only ~100 yr. If the recurrence of large earthquakes is periodic, rather than random or clustered, the length of this period is notable and would generally increase the risk estimated in probabilistic seismic hazard analyses. Unfortunately, robust characterization of a distribution describing earthquake recurrence on a single fault is limited by the brevity of most earthquake records. Here we use statistical tests on a 3000 yr combined record of 29 ground-rupturing earthquakes from Wrightwood, California. We show that earthquake recurrence there is more regular than expected from a Poisson distribution and is not clustered, leading us to conclude that recurrence is quasi-periodic. The observation of unimodal time dependence is persistent across an observationally based sensitivity analysis that critically examines alternative interpretations of the geologic record. The results support formal forecast efforts that use renewal models to estimate probabilities of future earthquakes on the southern San Andreas fault. Only four intervals (15%) from the record are longer than the present open interval, highlighting the current hazard posed by this fault.
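    The distinction between quasi-periodic, Poisson-like, and clustered recurrence is often summarized with the coefficient of variation of inter-event times; the sketch below uses invented intervals, not the Wrightwood record.

```python
# Hedged sketch: coefficient of variation (COV) of recurrence intervals as a
# simple indicator of quasi-periodic (COV < 1), Poisson-like (COV ~ 1), or
# clustered (COV > 1) behavior. Interval data are invented for illustration.
import numpy as np

# Hypothetical inter-event times (years) between ground-rupturing earthquakes.
intervals = np.array([85, 110, 95, 140, 60, 105, 120, 90, 75, 130], dtype=float)

cov = intervals.std(ddof=1) / intervals.mean()
print(f"mean interval = {intervals.mean():.0f} yr, COV = {cov:.2f}")
print("behavior:", "quasi-periodic" if cov < 1.0 else "clustered or Poisson-like")
```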

  10. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
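    Empirical fatality models of the general kind described above scale a shaking-intensity-dependent fatality rate by the exposed population in each intensity bin; the lognormal-CDF rate function, its parameters, and the exposure table below are illustrative assumptions, not PAGER's calibrated values.

```python
# Hedged sketch: estimated fatalities as exposed population per intensity bin
# times an intensity-dependent fatality rate of lognormal-CDF form.
# theta/beta and the exposure table are hypothetical.
from math import log
from scipy.stats import norm

theta, beta = 12.0, 0.15          # hypothetical country-specific parameters


def fatality_rate(intensity):
    """Fraction of exposed population killed at a given macroseismic intensity."""
    return norm.cdf(log(intensity / theta) / beta)


# Hypothetical population exposure per shaking-intensity bin (MMI).
exposure = {6: 500_000, 7: 200_000, 8: 50_000, 9: 5_000}

estimated_fatalities = sum(pop * fatality_rate(mmi) for mmi, pop in exposure.items())
print(f"estimated fatalities ~ {estimated_fatalities:,.0f}")
```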

  11. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions.

    Science.gov (United States)

    Raccanello, Daniela; Burro, Roberto; Hall, Rob

    2017-01-01

    We explored whether and how the exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. The sample included two groups of children (n = 127) attending primary school: The experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. The data collection took place two years after the earthquake, when children were seven or ten-year-olds. Beyond assessing the children's understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and provide further knowledge on children's emotional resources related to natural disasters, as a basis for planning educational prevention programs.

  12. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments at the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  13. Interpretation of earthquake-induced landslides triggered by the 12 May 2008, M7.9 Wenchuan earthquake in the Beichuan area, Sichuan Province, China using satellite imagery and Google Earth

    Science.gov (United States)

    Sato, H.P.; Harp, E.L.

    2009-01-01

The 12 May 2008 M7.9 Wenchuan earthquake in the People's Republic of China represented a unique opportunity for the international community to use commonly available GIS (Geographic Information System) tools, like Google Earth (GE), to rapidly evaluate and assess landslide hazards triggered by the destructive earthquake and its aftershocks. In order to map earthquake-triggered landslides, we provide details on the applicability and limitations of publicly available pre-earthquake imagery and imagery acquired 3 days after the earthquake, provided in GE from the FORMOSAT-2 satellite (formerly ROCSAT-2; Republic of China Satellite 2). We interpreted landslides on the 8-m-resolution FORMOSAT-2 imagery in GE; as a result, 257 large landslides were mapped, with the highest concentration along the Beichuan fault. An estimated density of 0.3 landslides/km2 represents a minimum bound on density given the resolution of available imagery; higher-resolution data would have identified more landslides. This is a preliminary study, and further work is needed to understand the landslide characteristics in detail. Although it is best to obtain landslide locations and measurements from high-resolution satellite imagery, GE was found to be an effective and rapid reconnaissance tool. © 2009 Springer-Verlag.

  14. Seismic hazards: New trends in analysis using geologic data

    International Nuclear Information System (INIS)

    Schwartz, D.P.; Coppersmith, K.J.

    1986-01-01

In the late 1960s and early 1970s, largely in response to the expansion of nuclear power plant siting and the issuance of a code of federal regulations by the Nuclear Regulatory Commission referred to as Appendix A-10CFR100, the need to characterize the earthquake potential of individual faults for seismic design took on greater importance. Appendix A established deterministic procedures for assessing the seismic hazard at nuclear power plant sites. Bonilla and Buchanan, using data from historical surface-faulting earthquakes, developed a set of statistical correlations relating earthquake magnitude to surface rupture length and to surface displacement. These relationships have since been refined and updated and, together with relationships between fault area and magnitude and between seismic moment and moment magnitude, have served as the basis for selecting maximum earthquakes in a wide variety of design situations. In this paper, the authors discuss new trends in seismic hazard analysis using geologic data, with special emphasis on fault-zone segmentation and recurrence models and the way in which they provide a basis for evaluating long-term earthquake potential.

  15. Toward a comprehensive areal model of earthquake-induced landslides

    Science.gov (United States)

    Miles, S.B.; Keefer, D.K.

    2009-01-01

This paper provides a review of regional-scale modeling of earthquake-induced landslide hazard with respect to the needs for disaster risk reduction and sustainable development. Based on this review, it sets out important research themes and suggests computing with words (CW), a methodology that includes fuzzy logic systems, as a fruitful modeling methodology for addressing many of these research themes. A range of research, reviewed here, has been conducted applying CW to various aspects of earthquake-induced landslide hazard zonation, but none facilitate comprehensive modeling of all types of earthquake-induced landslides. A new comprehensive areal model of earthquake-induced landslides (CAMEL) is introduced here that was developed using fuzzy logic systems. CAMEL provides an integrated framework for modeling all types of earthquake-induced landslides using geographic information systems. CAMEL is designed to facilitate quantitative and qualitative representation of terrain conditions and knowledge about these conditions on the likely areal concentration of each landslide type. CAMEL is highly modifiable and adaptable; new knowledge can be easily added, while existing knowledge can be changed to better match local knowledge and conditions. As such, CAMEL should not be viewed as a complete alternative to other earthquake-induced landslide models. CAMEL provides an open framework for incorporating other models, such as Newmark's displacement method, together with previously incompatible empirical and local knowledge. © 2009 ASCE.
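
    The fuzzy-logic machinery behind an approach like CAMEL can be illustrated with a toy rule base. The sketch below (Python) is not CAMEL itself: the input variables, membership break points, and rules are illustrative assumptions, chosen only to show how qualitative terms such as "steep slope" and "strong shaking" can be combined and defuzzified into an estimate of areal landslide concentration.

      # Minimal fuzzy-rule sketch (illustrative only; not the CAMEL rule base).
      # Inputs: slope angle (deg) and shaking intensity (Arias intensity, m/s).
      # Output: areal landslide concentration, defuzzified to a percentage of area.
      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with break points a <= b <= c."""
          return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

      def concentration(slope_deg, arias_ms):
          # Fuzzify the inputs (break points are hypothetical).
          slope_steep = tri(slope_deg, 20, 40, 60)
          slope_gentle = tri(slope_deg, 0, 10, 25)
          shaking_strong = tri(arias_ms, 1.0, 3.0, 6.0)
          shaking_weak = tri(arias_ms, 0.0, 0.3, 1.2)

          # Rule strengths (Mamdani-style min for AND, max for OR).
          high = min(slope_steep, shaking_strong)    # steep AND strong -> high concentration
          low = max(slope_gentle, shaking_weak)      # gentle OR weak  -> low concentration

          # Output fuzzy sets defined on 0-30 % of area affected.
          y = np.linspace(0, 30, 301)
          mu_high = np.minimum(tri(y, 10, 20, 30), high)
          mu_low = np.minimum(tri(y, 0, 2, 8), low)
          mu = np.maximum(mu_high, mu_low)

          # Centroid defuzzification.
          return float(np.sum(y * mu) / (np.sum(mu) + 1e-9))

      print(concentration(slope_deg=45, arias_ms=4.0))  # steep slope, strong shaking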

  16. A Possible Paradigm for the Mitigation of the Adverse Impacts of Natural Hazards in the Developing Countries

    Science.gov (United States)

    Aswathanarayana, U.

    2001-05-01

The proneness of a country or region to a given natural hazard depends upon its geographical location, physiography, geological and structural setting, land-use/land-cover situation, and biophysical and socioeconomic environments (e.g. cyclones and floods in Bangladesh, earthquakes in Turkey, drought in Sub-Saharan Africa). While the natural hazards themselves cannot be prevented, it is possible to mitigate their adverse effects through a knowledge-based, environmentally sustainable approach involving the stakeholder communities: (i) by being prepared, on the basis of an understanding of the land conditions that are prone to a given hazard and the processes that could culminate in damage to life and property (e.g. planting of dense-rooted vegetation belts to protect against landslides in earthquake-prone areas); (ii) by avoiding improper anthropogenic activities that may exacerbate a hazard (e.g. deforestation accentuating floods and droughts); and (iii) by putting a hazard to beneficial use where possible (e.g. groundwater recharge from flood waters). Mitigation strategies need to be custom-made for each country or region by integrating the biophysical and socioeconomic components. The proposed paradigm is illustrated for Extreme Weather Events (EWEs) and is based on the adoption of three approaches: (i) a typology approach, involving the interpretation of remotely sensed data to predict (say) the temporal and spatial distribution of precipitation; (ii) a "black box" approach, whereby the potential environmental consequences of an EWE are projected on the basis of previously known case histories; and (iii) an information technology approach, to translate advanced technical information into "virtual" do-it-yourself steps understandable to the lay public.

  17. Geophysical surveying in the Sacramento Delta for earthquake hazard assessment and measurement of peat thickness

    Science.gov (United States)

    Craig, M. S.; Kundariya, N.; Hayashi, K.; Srinivas, A.; Burnham, M.; Oikawa, P.

    2017-12-01

Near-surface geophysical surveys were conducted in the Sacramento-San Joaquin Delta for earthquake hazard assessment and to provide estimates of peat thickness for use in carbon models. Delta islands have experienced 3-8 meters of subsidence during the past century due to oxidation and compaction of peat. Projected sea level rise over the next century will contribute to an ongoing landward shift of the freshwater-saltwater interface and increase the risk of flooding due to levee failure or overtopping. Seismic shear-wave velocity (VS) was measured in the upper 30 meters to determine the Uniform Building Code (UBC)/National Earthquake Hazard Reduction Program (NEHRP) site class. Both seismic and ground penetrating radar (GPR) methods were employed to estimate peat thickness. Seismic surface wave surveys were conducted at eight sites on three islands and GPR surveys were conducted at two of the sites. Combined with sites surveyed in 2015, the new work brings the total number of sites surveyed in the Delta to twenty. Soil boreholes were made at several locations using a hand auger, and peat thickness ranged from 2.1 to 5.5 meters. Seismic surveys were conducted using the multichannel analysis of surface waves (MASW) method and the microtremor array method (MAM). On Bouldin Island, VS of the surficial peat layer was 32 m/s at a site with pure peat and 63 m/s at a site where the peat had higher clay and silt content. Velocities at these sites reached a similar value, about 125 m/s, at a depth of 10 m. GPR surveys were performed at two sites on Sherman Island using 100 MHz antennas and indicated the base of the peat layer at a depth of about 4 meters, consistent with nearby auger holes. The results of this work include VS depth profiles and UBC/NEHRP site classifications. Seismic and GPR methods may be used in a complementary fashion to estimate peat thickness. The seismic surface wave method is relatively robust and more effective than GPR in many areas with high clay
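
    The UBC/NEHRP site classification mentioned above is based on the time-averaged shear-wave velocity over the top 30 m (Vs30). A minimal sketch of that calculation is shown below; the layer thicknesses and velocities are hypothetical, loosely echoing the peat velocities quoted in the abstract, and a class E assignment in practice also involves soft-soil checks beyond a simple velocity cutoff.

      # Sketch: time-averaged shear-wave velocity over the top 30 m (Vs30)
      # and the corresponding NEHRP site class. Layer values are hypothetical.
      def vs30(layers):
          """layers: list of (thickness_m, vs_m_per_s), ordered from the surface down."""
          depth, travel_time = 0.0, 0.0
          for h, vs in layers:
              use = min(h, 30.0 - depth)   # only count material above 30 m depth
              travel_time += use / vs
              depth += use
              if depth >= 30.0:
                  break
          return depth / travel_time       # also handles profiles shallower than 30 m

      def nehrp_class(v):
          if v > 1500: return "A"
          if v > 760:  return "B"
          if v > 360:  return "C"
          if v > 180:  return "D"
          return "E"                        # class E also requires soft-clay checks in practice

      profile = [(4.0, 32.0), (6.0, 90.0), (20.0, 180.0)]  # peat over soft sediments (illustrative)
      v = vs30(profile)
      print(round(v), nehrp_class(v))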

  18. Building the Southern California Earthquake Center

    Science.gov (United States)

    Jordan, T. H.; Henyey, T.; McRaney, J. K.

    2004-12-01

    Kei Aki was the founding director of the Southern California Earthquake Center (SCEC), a multi-institutional collaboration formed in 1991 as a Science and Technology Center (STC) under the National Science Foundation (NSF) and the U. S. Geological Survey (USGS). Aki and his colleagues articulated a system-level vision for the Center: investigations by disciplinary working groups would be woven together into a "Master Model" for Southern California. In this presentation, we will outline how the Master-Model concept has evolved and how SCEC's structure has adapted to meet scientific challenges of system-level earthquake science. In its first decade, SCEC conducted two regional imaging experiments (LARSE I & II); published the "Phase-N" reports on (1) the Landers earthquake, (2) a new earthquake rupture forecast for Southern California, and (3) new models for seismic attenuation and site effects; it developed two prototype "Community Models" (the Crustal Motion Map and Community Velocity Model) and, perhaps most important, sustained a long-term, multi-institutional, interdisciplinary collaboration. The latter fostered pioneering numerical simulations of earthquake ruptures, fault interactions, and wave propagation. These accomplishments provided the impetus for a successful proposal in 2000 to reestablish SCEC as a "stand alone" center under NSF/USGS auspices. SCEC remains consistent with the founders' vision: it continues to advance seismic hazard analysis through a system-level synthesis that is based on community models and an ever expanding array of information technology. SCEC now represents a fully articulated "collaboratory" for earthquake science, and many of its features are extensible to other active-fault systems and other system-level collaborations. We will discuss the implications of the SCEC experience for EarthScope, the USGS's program in seismic hazard analysis, NSF's nascent Cyberinfrastructure Initiative, and other large collaboratory programs.

  19. Investigation of tectonics and statistical analysis of earthquake hazard in Tange Sorkh dam

    OpenAIRE

    ZOLFAGHARI, Sayyed Yaghoub; RAFIEE, A.; HADI, S. M.R.; TAHERMANESH, R.

    2015-01-01

Abstract. Today, the importance of earthquake risk is widely understood, given the intensification of the country's development, rising urbanization, the concentration of population and of material and intellectual capital, and the increased vulnerability of that capital within Iran's seismic zones. Iran, one of the most seismically active countries in the world, has in recent years witnessed devastating earthquakes, for example the Rudbar-Manjil, Bojnoord, and Zir Kouh Ghaena...

  20. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    Science.gov (United States)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

Strong shaking during earthquakes causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local site characteristics that influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has experienced many small and large earthquakes in the past, e.g. the Bihar-Nepal earthquake of 1934 (Ms 8.2), the large Kangra earthquake of 1905 (Ms 7.8), and the Gorkha earthquake of 2015 (Mw 7.8). The Mw 7.9 Gorkha earthquake occurred on and around the Main Himalayan Thrust at a hypocentral depth of 15 km (GEER 2015) and was followed by a Mw 7.3 aftershock in Kodari, together causing 8,700+ deaths and leaving hundreds of thousands homeless. Most of the 3,000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow, 40 km-wide band at mid-crustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake are distinctive in the affected areas, including topographic effects, liquefaction, and land subsidence. More than 5,000 landslides were triggered by this earthquake (Earthquake without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides, and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the Lesser and Higher Himalaya. There are numerous cracks in

  1. Smartphone MEMS accelerometers and earthquake early warning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

The low-cost MEMS accelerometers in smartphones are attracting increasing attention from the science community because of their vast numbers and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. Shake table tests show that these accelerometers are also suitable for recording large shaking caused by earthquakes. We developed an Android app - MyShake - that can distinguish earthquake motion from everyday human activity in the recordings from the accelerometers in personal smartphones and upload trigger information and waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications such as earthquake early warning. In this talk I will lay out the method we use to recognize earthquake-like movement on a single smartphone, and give an overview of the whole system that harnesses information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insight into earthquake hazards.
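
    The onboard step of recognizing earthquake-like motion can be illustrated with a much simpler stand-in than the app's actual classifier. The sketch below applies a classic STA/LTA ratio to the accelerometer vector magnitude; it is not the MyShake algorithm, and the sampling rate, window lengths, threshold, and synthetic data are all assumptions.

      # Illustrative STA/LTA-style trigger on three-component accelerometer data.
      # This is NOT the MyShake classifier; it is a minimal stand-in for the idea
      # of flagging earthquake-like motion on a single phone before uploading a trigger.
      import numpy as np

      def sta_lta_trigger(acc_xyz, fs=50.0, sta_s=1.0, lta_s=20.0, threshold=4.0):
          """acc_xyz: (N, 3) array of accelerations in m/s^2, sampled at fs Hz."""
          # Work on the vector magnitude with its mean (static gravity offset) removed.
          mag = np.linalg.norm(acc_xyz, axis=1)
          mag = np.abs(mag - np.mean(mag))

          sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
          kernel = lambda n: np.ones(n) / n
          sta = np.convolve(mag, kernel(sta_n), mode="same")
          lta = np.convolve(mag, kernel(lta_n), mode="same") + 1e-9

          ratio = sta / lta
          triggered = np.where(ratio > threshold)[0]
          return triggered[0] / fs if triggered.size else None  # trigger time in seconds

      # Example: 60 s of sensor noise with a burst of shaking starting near 30 s.
      rng = np.random.default_rng(0)
      acc = rng.normal(0, 0.02, size=(3000, 3))
      acc[1500:1650] += rng.normal(0, 0.5, size=(150, 3))
      print(sta_lta_trigger(acc))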

  2. Seismic hazard map of North and Central America and the Caribbean

    Directory of Open Access Journals (Sweden)

    K. M. Shedlock

    1999-06-01

Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many other types of risk mitigation. The seismic hazard map of North and Central America and the Caribbean is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful regional seismic hazard framework and serve as a resource for any national or regional agency to carry out further detailed studies applicable to its needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of North and Central America and the Caribbean depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one- to two-story buildings). The highest seismic hazard values in the region generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes.
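
    The "10% chance of exceedance in 50 years" hazard level maps onto a mean return period through the usual Poisson assumption, as in the short sketch below (roughly a 475-year return period). The functions are generic illustrations, not part of the mapping project itself.

      # The "10% probability of exceedance in 50 years" hazard level maps to a mean
      # return period via the standard Poisson assumption: P = 1 - exp(-t / T).
      import math

      def return_period(p_exceed, t_years):
          return -t_years / math.log(1.0 - p_exceed)

      def prob_exceed(t_years, T_return):
          return 1.0 - math.exp(-t_years / T_return)

      print(return_period(0.10, 50))   # ~475 years
      print(prob_exceed(50, 475))      # ~0.10, i.e. back to 10% in 50 years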

  3. Safety analysis of nuclear containment vessels subjected to strong earthquakes and subsequent tsunamis

    Directory of Open Access Journals (Sweden)

    Feng Lin

    2017-08-01

Nuclear power plants under expansion and under construction in China are mostly located in coastal areas, which means they are at risk of suffering strong earthquakes and subsequent tsunamis. This paper presents a safety analysis for a new reinforced concrete containment vessel in such events. A finite element method-based model was built, verified, and first used to understand the seismic performance of the containment vessel under earthquakes of increasing intensity. The model was then used to assess the safety performance of the containment vessel subjected to an earthquake with a peak ground acceleration (PGA) of 0.56g and subsequent tsunamis with increasing inundation depths, similar to the 2011 Great East Japan earthquake and tsunami. Results indicated that the containment vessel reached Limit State I (concrete cracking) and Limit State II (concrete crushing) when the PGAs were in the ranges of 0.8-1.1g and 1.2-1.7g, respectively. The containment vessel reached Limit State I at a tsunami inundation depth of 10 m after suffering an earthquake with a PGA of 0.56g. A site-specific hazard assessment was conducted to consider the likelihood of tsunami sources.

  4. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

Twenty-five years after the Vrancea earthquake of March 4, 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (M_G-R = 7.2; M_w = 7.5) produced 1,570 casualties and more than 11,300 injured persons (90% of the victims in Bucharest); seismic losses were estimated at more than USD 2 billion. The 1977 earthquake represented a significant episode of the 20th century in the seismic zones of Romania and neighboring countries. The INCERC seismic record of March 4, 1977 put in evidence, for the first time, the spectral content of the long-period seismic motions of Vrancea earthquakes, their duration, the number of cycles, and the values of actual accelerations, with important overloading effects on flexible structures. The seismic coefficients k_s, the spectral curve (the dynamic coefficient β_r), the seismic zonation map, and the requirements in the antiseismic design norms were drastically changed, the microzonation maps of the time ceased to be used, and the specific Vrancea earthquake recurrence was reconsidered on the basis of hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements until 1977, as well as seismological and structural lessons learned since 1977; - recent aspects of the implementation of the Earthquake Code P.100/1992 and its harmonization with Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcement policies on strengthening of existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  5. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  6. Geo-Hazards and Mountain Road Development in Nepal: Understanding the Science-Policy-Governance Interface

    Science.gov (United States)

    Dugar, Sumit; Dahal, Vaskar

    2015-04-01

The foothills of the Nepalese Himalayas, located in a neotectonic mountain environment, are among the most unstable and geomorphologically dynamic landscapes in the world. The young fold mountains in this region are characterized by complex tectonics that influence the occurrence of earthquakes, while climatic processes such as intense orographic rainfall often dictate the occurrence of floods and landslides. Development of linear infrastructure, such as roads, in mountainous terrain characterized by high relief and active orogeny is considerably challenging: the complexity of the landscape, with steep and irregular topography, difficult ground conditions, and weak geology, presents engineers and planners with numerous difficulties in constructing and maintaining mountain roads. Whilst application of engineering geology, geomorphic interpretation of terrain in terms of physiography and hydrology, and identification of geo-hazards along the road corridor are critical for the long-term operation of mountain roads, low-cost arterial roads in the Himalayan foothills generally fail to incorporate standard road slope engineering structures. This research provides unique insights into policy and governance issues in developing mountainous countries such as Nepal, where achieving a sound balance between sustainability and affordability is a major challenge for road construction. Road development in Nepal is a complex issue in which socio-economic and political factors influence the budget allocated for road construction in rural hilly areas. Moreover, most mountain roads are constructed without any geological or geotechnical site investigations due to rampant corruption and a lack of adequate engineering supervision. Despite good examples of rural road construction practice, such as the Dharan-Dhankuta Road in Eastern Nepal, where comprehensive terrain-evaluation methods and geotechnical surveys led to an improved understanding of road construction, learnings from this project have not

  7. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    International Nuclear Information System (INIS)

    Payne, S. M.; Gorman, V. W.; Jensen, S. A.; Nitzel, M. E.; Russell, M. J.; Smith, R. P.

    2000-01-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process

  8. Projecting community changes in hazard exposure to support long-term risk reduction: A case study of tsunami hazards in the U.S. Pacific Northwest

    Science.gov (United States)

    Sleeter, Benjamin M.; Wood, Nathan J.; Soulard, Christopher E.; Wilson, Tamara

    2017-01-01

    Tsunamis have the potential to cause considerable damage to communities along the U.S. Pacific Northwest coastline. As coastal communities expand over time, the potential societal impact of tsunami inundation changes. To understand how community exposure to tsunami hazards may change in coming decades, we projected future development (i.e. urban, residential, and rural), households, and residents over a 50-year period (2011–2061) along the Washington, Oregon, and northern California coasts. We created a spatially explicit, land use/land cover, state-and-transition simulation model to project future developed land use based on historical development trends. We then compared our development projection results to tsunami-hazard zones associated with a Cascadia subduction zone (CSZ) earthquake. Changes in tsunami-hazard exposure by 2061 were estimated for 50 incorporated cities, 7 tribal reservations, and 17 counties relative to current (2011) estimates. Across the region, 2061 population exposure in tsunami-hazard zones was projected to increase by 3880 households and 6940 residents. The top ten communities with highest population exposure to CSZ-related tsunamis in 2011 are projected to remain the areas with the highest population exposure by 2061. The largest net population increases in tsunami-hazard zones were projected in the unincorporated portions of several counties, including Skagit, Coos, and Humboldt. Land-change simulation modeling of projected future development serves as an exploratory tool aimed at helping local governments understand the hazard-exposure implications of community growth and to include this knowledge in risk-reduction planning.

  9. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    Science.gov (United States)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

Iran is located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes have caused many fatalities and much damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the 1990 Manjil Earthquake (Ms = 7.7) buried the village of Fatalak, killed more than 130 people, and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial to reducing property damage and loss of life in future earthquakes. For this purpose, a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis for hazard estimates. This method consists in evaluating the recurrence of seismically induced slope failure conditions inferred from the Newmark model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for the Alborz-Central Iran and Zagros regions, well-established methods of seismic hazard assessment based on the Cornell (1968) method were employed to obtain the occurrence probabilities of different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae developed specifically for Alborz-Central Iran and Zagros were used to represent, according to the Newmark model, the relation linking the Newmark displacement Dn to the Arias intensity Ia and to the slope critical acceleration ac. These formulae were employed to evaluate
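
    The kind of regression referred to above can be evaluated in a few lines of code. The sketch below uses the common Jibson-type functional form log Dn = a·log Ia + b·log ac + c with placeholder coefficients; it does not reproduce the region-specific Alborz-Central Iran or Zagros formulae developed in the study.

      # Sketch of how a Newmark-displacement regression of the form used above is
      # evaluated. The coefficients are placeholders illustrating the functional form
      # log Dn = a*log(Ia) + b*log(ac) + c (Dn in cm, Ia in m/s, ac in g); the study
      # derived its own region-specific coefficients for Alborz-Central Iran and Zagros.
      import math

      A, B, C = 1.521, -1.993, -1.546   # placeholder coefficients (Jibson-type form)

      def newmark_displacement_cm(arias_ms, ac_g):
          """Predicted Newmark displacement (cm) for Arias intensity Ia and critical acceleration ac."""
          return 10 ** (A * math.log10(arias_ms) + B * math.log10(ac_g) + C)

      def exceeds_failure_threshold(arias_ms, ac_g, dn_crit_cm=10.0):
          """Flag slopes whose predicted displacement exceeds a chosen failure threshold."""
          return newmark_displacement_cm(arias_ms, ac_g) >= dn_crit_cm

      print(round(newmark_displacement_cm(arias_ms=3.0, ac_g=0.1), 1))   # strong shaking, weak slope
      print(exceeds_failure_threshold(arias_ms=0.5, ac_g=0.3))           # moderate shaking, stronger slope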

  10. From Seismic Scenarios to Earthquake Risk Assessment: A Case Study for Iquique, Chile.

    Science.gov (United States)

    Aguirre, P.; Fortuno, C.; Martin, J. C. D. L. L.; Vasquez, J.

    2015-12-01

Iquique is a strategic city and economic center in northern Chile, located in a large seismic gap where a megathrust earthquake and tsunami are expected. Although it was hit by a Mw 8.2 earthquake on April 1st 2014, which caused moderate damage, geophysical evidence still suggests that there is potential for a larger event, so a thorough risk assessment is key to understanding the physical, social, and economic effects of such a potential event and to devising appropriate mitigation plans. Hence, Iquique has been selected as a prime study case for the implementation of a risk assessment platform in Chile. Our study integrates research on three main elements of risk calculations: hazard evaluation, the exposure model, and physical vulnerabilities. To characterize the hazard field, a set of synthetic seismic scenarios has been developed based on plate interlocking and the residual slip potential that results from subtracting the slip that occurred during the April 1st 2014 rupture, whose fault mechanism was obtained using an InSAR+GPS inversion. Additional scenarios were developed based on the fault rupture model of the 2010 Mw 8.8 Maule earthquake and on local plate locking models in northern Chile. These rupture models define a collection of possible realizations of earthquake geometries parameterized in terms of critical variables such as slip magnitude, rise time, mean propagation velocity, directivity, and others, which are propagated to obtain a hazard map for Iquique (e.g. PGA, PGV, PGD). Furthermore, a large body of public and local data was used to construct a detailed exposure model for Iquique, including aggregated building counts, demographics, essential facilities, and lifelines. This model, together with the PGA maps for the April 1st 2014 earthquake, is used to calibrate HAZUS outputs against observed damage and to adjust the fragility curves of physical systems according to more detailed analyses of typical Chilean building types and their structural properties, plus historical

  11. Smartphone-Based Earthquake and Tsunami Early Warning in Chile

    Science.gov (United States)

    Brooks, B. A.; Baez, J. C.; Ericksen, T.; Barrientos, S. E.; Minson, S. E.; Duncan, C.; Guillemot, C.; Smith, D.; Boese, M.; Cochran, E. S.; Murray, J. R.; Langbein, J. O.; Glennie, C. L.; Dueitt, J.; Parra, H.

    2016-12-01

Many locations around the world face high seismic hazard but do not have the resources required to establish traditional earthquake and tsunami warning systems (E/TEW) that utilize scientific-grade seismological sensors. MEMS accelerometers and GPS chips embedded in, or added inexpensively to, smartphones are sensitive enough to provide robust E/TEW if they are deployed in sufficient numbers. We report on a pilot project in Chile, one of the most productive earthquake regions worldwide. There, magnitude 7.5+ earthquakes occur roughly every 1.5 years, and larger tsunamigenic events pose significant local and trans-Pacific hazard. The smartphone-based network described here is being deployed in parallel to the build-out of a scientific-grade network for E/TEW. Our sensor package comprises a smartphone with internal MEMS and an external GPS chipset that provides satellite-based augmented positioning and phase smoothing. Each station is independent of local infrastructure: stations are solar-powered and rely on cellular SIM cards for communications. An Android app performs initial onboard processing and transmits both accelerometer and GPS data to a server employing the FinDer-BEFORES algorithm to detect earthquakes, producing an acceleration-based line-source model for smaller-magnitude earthquakes or a joint seismic-geodetic finite-fault distributed slip model for sufficiently large earthquakes. Either source model provides accurate ground shaking forecasts, while distributed slip models for larger offshore earthquakes can be used to infer seafloor deformation for local tsunami warning. The network will comprise 50 stations by Sept. 2016 and 100 stations by Dec. 2016. Since Nov. 2015, batch processing has detected, located, and estimated magnitudes for Mw>5 earthquakes. Operational since June 2016, the system has successfully detected two earthquakes > M5 (M5.5, M5.1) that occurred within 100 km of our network, while producing zero false alarms.

  12. Discovering Coseismic Traveling Ionospheric Disturbances Generated by the 2016 Kaikoura Earthquake

    Science.gov (United States)

    Li, J. D.; Rude, C. M.; Gowanlock, M.; Pankratius, V.

    2017-12-01

Geophysical events and hazards, such as earthquakes, tsunamis, and volcanoes, have been shown to generate traveling ionospheric disturbances (TIDs). These disturbances can be measured by means of Total Electron Content (TEC) fluctuations obtained from a network of multifrequency GPS receivers in the MIT Haystack Observatory Madrigal database. Analyzing the response of the ionosphere to such hazards enhances our understanding of natural phenomena and augments our large-scale monitoring capabilities in conjunction with other ground-based sensors. However, it is currently challenging for human investigators to spot and characterize such signatures, or to determine whether a geophysical event has actually occurred, because the ionosphere can be noisy, with multiple phenomena taking place simultaneously. This work therefore explores a systematic pipeline for the ex-post discovery and characterization of TIDs. Our technique starts by geolocating the event and gathering the corresponding data, then checks for potentially conflicting TID sources, and processes the raw total electron content data to generate differential measurements. A Kolmogorov-Smirnov test is applied to evaluate the statistical significance of detected deviations in the differential measurements. We present results from the successful application of this pipeline to the Mw 7.8 Kaikoura earthquake that occurred in New Zealand on November 13th, 2016. We detect a coseismic TID occurring 8 minutes after the earthquake and propagating towards the equator at 1050 m/s, with a 0.22 TECu peak-to-peak amplitude. Furthermore, the observed waveform exhibits more complex behavior than the expected N-wave for a coseismic TID, which potentially results from the complex multi-fault structure of the earthquake. We acknowledge support from NSF ACI-1442997 (PI Pankratius), NASA AIST NNX15AG84G (PI Pankratius), NSF AGS-1343967 (PI Pankratius), and NSF AGS-1242204 (PI Erickson).
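
    A minimal version of the statistical check described above might compare differential TEC in a post-event window with a pre-event reference window using a two-sample Kolmogorov-Smirnov test, as sketched below. The window lengths, sampling interval, significance level, and the synthetic, deliberately exaggerated disturbance are all assumptions, not the pipeline's actual settings.

      # Minimal sketch: compare the distribution of differential TEC after the event
      # with a pre-event reference window using a two-sample Kolmogorov-Smirnov test.
      import numpy as np
      from scipy.stats import ks_2samp

      def detect_tid(tec_series, dt_s=30.0, event_index=None, window_min=20, alpha=0.01):
          """tec_series: 1-D array of TEC (TECu) from one receiver-satellite pair."""
          dtec = np.diff(tec_series)                  # differential TEC measurements
          n = int(window_min * 60 / dt_s)             # samples per comparison window
          pre = dtec[event_index - n:event_index]
          post = dtec[event_index:event_index + n]
          res = ks_2samp(pre, post)
          return res.pvalue < alpha, res.statistic, res.pvalue

      # Synthetic example: quiet background plus an exaggerated oscillatory
      # disturbance after the event (real coseismic TIDs are far smaller).
      rng = np.random.default_rng(1)
      t = np.arange(0, 4 * 3600, 30.0)
      tec = 20 + 0.05 * rng.standard_normal(t.size)
      event = t.size // 2
      tec[event:] += 1.0 * np.sin(2 * np.pi * (t[event:] - t[event]) / 600.0)  # ~10-min period wave
      print(detect_tid(tec, event_index=event))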

  13. Tsunami hazard and risk assessment in El Salvador

    Science.gov (United States)

    González, M.; González-Riancho, P.; Gutiérrez, O. Q.; García-Aguilar, O.; Aniel-Quiroga, I.; Aguirre, I.; Alvarez, J. A.; Gavidia, F.; Jaimes, I.; Larreynaga, J. A.

    2012-04-01

Tsunamis are relatively infrequent phenomena, but they represent a greater threat than earthquakes, hurricanes, and tornadoes, causing the loss of thousands of human lives and extensive damage to coastal infrastructure around the world. Several works have attempted to study these phenomena in order to understand their origin, causes, evolution, consequences, and the magnitude of their damage, and finally to propose mechanisms to protect coastal societies. Advances in the understanding and prediction of tsunami impacts allow the development of adaptation and mitigation strategies to reduce risk in coastal areas. This work - Tsunami Hazard and Risk Assessment in El Salvador -, funded by AECID during the period 2009-12, examines the state of the art and presents a comprehensive methodology for assessing tsunami risk at any coastal area worldwide and applies it to the coast of El Salvador. The conceptual framework is based on the definition of risk as the probability of harmful consequences or expected losses resulting from a given hazard to a given element at danger or peril, over a specified time period (European Commission, Schneiderbauer et al., 2004). The HAZARD assessment (Phase I of the project) is based on propagation models for earthquake-generated tsunamis, developed through the characterization of tsunamigenic sources - seismotectonic faults - and other dynamics under study - tsunami waves, sea level, etc. The study area is located in a zone of high seismic activity and has been hit by 11 tsunamis between 1859 and 1997, nine of them recorded in the twentieth century and all generated by earthquakes. Simulations of historical and potential tsunamis with greater or lesser impact on the country's coast have been performed, including distant, intermediate, and near sources. Deterministic analyses of the threats under study - coastal flooding - have been carried out, resulting in different hazard maps (maximum wave height elevation, maximum water depth, minimum tsunami

  14. Extreme seismicity and disaster risks: Hazard versus vulnerability (Invited)

    Science.gov (United States)

    Ismail-Zadeh, A.

    2013-12-01

Although the extreme nature of earthquakes has been known for millennia due to the resultant devastation from many of them, the vulnerability of our civilization to extreme seismic events is still growing. This is partly because of the increase in the number of high-risk objects and the clustering of populations and infrastructure in areas prone to seismic hazards. Today an earthquake may affect several hundred thousand lives and cause damage of up to a hundred billion dollars; it can trigger an ecological catastrophe if it occurs in close vicinity to a nuclear power plant. Two types of extreme natural events can be distinguished: (i) large-magnitude, low-probability events, and (ii) events leading to disasters. Although the first type may affect earthquake-prone countries directly or indirectly (through tsunamis, landslides, etc.), the second type occurs mainly in economically less-developed countries where vulnerability is high and resilience is low. Although earthquake hazards cannot be reduced, vulnerability to extreme events can be diminished by monitoring human systems and by relevant laws preventing an increase in vulnerability. Significant new knowledge should be gained on extreme seismicity through observations, monitoring, analysis, modeling, comprehensive hazard assessment, prediction, and interpretation to assist in disaster risk analysis. Advanced disaster risk communication skills should be developed to link scientists, emergency management authorities, and the public. Natural, social, economic, and political reasons leading to disasters due to earthquakes will be discussed.

  15. Tectonic feedback and the earthquake cycle

    Science.gov (United States)

    Lomnitz, Cinna

    1985-09-01

    The occurrence of cyclical instabilities along plate boundaries at regular intervals suggests that the process of earthquake causation differs in some respects from the model of elastic rebound in its simplest forms. The model of tectonic feedback modifies the concept of this original model in that it provides a physical interaction between the loading rate and the state of strain on the fault. Two examples are developed: (a) Central Chile, and (b) Mexico. The predictions of earthquake hazards for both types of models are compared.

  16. Evaluation of Seismic Hazards at California Department of Transportation (CALTRANS)Structures

    Science.gov (United States)

    Merriam, M. K.

    2005-12-01

The California Department of Transportation (CALTRANS) has responsibility for the design, construction, and maintenance of approximately 12,000 state bridges. CALTRANS also provides oversight of similar activities for 12,200 bridges owned by local agencies throughout the state. California is subjected to a M6 or greater seismic event every few years. Recent earthquakes include the 1971 Mw6.6 San Fernando earthquake, which struck north of Los Angeles and prompted engineers to begin retrofitting existing bridges and to re-examine the way bridges are detailed to improve their response to earthquakes; the 1989 Mw6.9 Loma Prieta earthquake, which destroyed the Cypress Freeway and damaged the San Francisco-Oakland Bay Bridge; and the 1994 Mw6.7 Northridge earthquake in the Los Angeles area, which heavily damaged four major freeways. Since CALTRANS' seismic performance goal is to ensure that life-safety needs are met for the traveling public during an earthquake, estimating earthquake magnitude and peak bedrock acceleration and determining whether special seismic considerations are needed at specific bridge sites are critical. CALTRANS is currently developing a fourth-generation seismic hazard map to be used for estimating these parameters. A deterministic approach has been used to develop this map. Late-Quaternary-age faults are defined as the expected seismic sources. CALTRANS requires site-specific studies to determine the potential for liquefaction, seismically induced landslides, and surface fault rupture. If the potential for one of these seismic hazards exists, the hazard is mitigated by avoidance or removal, or accommodated through design. The action taken, while complying with the Department's "no collapse" requirement, depends upon many factors, including cost.

  17. The 2016 Central Italy Earthquake: an Overview

    Science.gov (United States)

    Amato, A.

    2016-12-01

The M6 central Italy earthquake occurred on the seismic backbone of Italy, in the middle of the highest-hazard belt. The shock hit suddenly during the night of August 24, when people were asleep; no foreshocks occurred before the main event. The earthquake ruptured from 10 km depth to the surface and produced more than 17,000 aftershocks (as of Oct. 19) spread over a 40x20 km2 area elongated NW-SE. It is geologically very similar to previous recent events in the Apennines: both the 2009 L'Aquila earthquake to the south and the 1997 Colfiorito earthquake to the north were characterized by the activation of adjacent fault segments. Despite its moderate magnitude and the well-known seismic hazard of the region, the earthquake produced extensive damage and 297 fatalities. The town of Amatrice, which paid the highest toll, had been classified in zone 1 (the highest) since 1915, but the buildings in this and other villages proved highly vulnerable. In contrast, in the town of Norcia, which also experienced strong ground shaking, no collapses occurred, most likely due to the retrofitting carried out after an earthquake in 1979. Soon after the quake, the INGV Crisis Unit convened during the night at the Rome headquarters to coordinate activities. The first field teams reached the epicentral area at 7 am with portable seismic stations to be installed to monitor the aftershocks; other teams followed to map surface faulting and damage, to measure GPS sites, to install instruments for site-response studies, and so on. The INGV Crisis Unit includes the Press Office and the INGVterremoti team, which manage and coordinate communication with the Civil Protection Dept. (DPC), the media, and the web. Several tens of reports and updates were delivered to DPC in the first month of the sequence. Also because of the controversial situation arising from the L'Aquila earthquake and trials, particular attention was given to communication: continuous and timely information has been released to

  18. Seismic Hazard and risk assessment for Romania -Bulgaria cross-border region

    Science.gov (United States)

    Simeonova, Stela; Solakov, Dimcho; Alexandrova, Irena; Vaseva, Elena; Trifonova, Petya; Raykova, Plamena

    2016-04-01

Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic hazard and vulnerability to earthquakes are steadily increasing as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. The assessment of seismic hazard and risk is particularly important because it provides valuable information for seismic safety and disaster mitigation and supports decision making for the benefit of society. Romania and Bulgaria, situated in the Balkan Region as a part of the Alpine-Himalayan seismic belt, are characterized by high seismicity and are exposed to a high seismic risk. Over the centuries, both countries have experienced strong earthquakes. The cross-border region encompassing northern Bulgaria and southern Romania is a territory prone to the effects of strong earthquakes. The area is significantly affected by earthquakes that occur in both countries: on the one hand, the events generated by the Vrancea intermediate-depth seismic source in Romania, and on the other hand, the crustal seismicity originating in the Shabla (SHB), Dulovo, and Gorna Orjahovitza (GO) seismic sources in Bulgaria. The Vrancea seismogenic zone of Romania is a very peculiar seismic source, often described as unique in the world, and it represents a major concern for most of the northern part of Bulgaria as well. In the present study, the seismic hazard for the Romania-Bulgaria cross-border region is assessed on the basis of integrated basic geo-datasets. The hazard results are obtained by applying two alternative approaches - probabilistic and deterministic. The MSK-64 intensity (the MSK-64 scale is practically equivalent to the new EMS-98) is used as the output parameter for the hazard maps. We prefer to use macroseismic intensity here instead of PGA because it is directly related to the degree of damage and, moreover, the epicentral intensity is the original

  19. The 1964 Great Alaska Earthquake and tsunamis: a modern perspective and enduring legacies

    Science.gov (United States)

    Brocher, Thomas M.; Filson, John R.; Fuis, Gary S.; Haeussler, Peter J.; Holzer, Thomas L.; Plafker, George; Blair, J. Luke

    2014-01-01

The magnitude 9.2 Great Alaska Earthquake that struck south-central Alaska at 5:36 p.m. on Friday, March 27, 1964, is the largest recorded earthquake in U.S. history and the second-largest earthquake recorded with modern instruments. The earthquake was felt throughout most of mainland Alaska, as far west as Dutch Harbor in the Aleutian Islands some 480 miles away, and at Seattle, Washington, more than 1,200 miles to the southeast of the fault rupture, where the Space Needle swayed perceptibly. The earthquake caused rivers, lakes, and other waterways to slosh as far away as the coasts of Texas and Louisiana. Water-level recorders in 47 states—the entire Nation except for Connecticut, Delaware, and Rhode Island—registered the earthquake. It was so large that it caused the entire Earth to ring like a bell: vibrations that were among the first of their kind ever recorded by modern instruments. The Great Alaska Earthquake spawned thousands of lesser aftershocks and hundreds of damaging landslides, submarine slumps, and other ground failures. Alaska’s largest city, Anchorage, located west of the fault rupture, sustained heavy property damage. Tsunamis produced by the earthquake resulted in deaths and damage as far away as Oregon and California. Altogether the earthquake and subsequent tsunamis caused 129 fatalities and an estimated $2.3 billion in property losses (in 2013 dollars). Most of the population of Alaska and its major transportation routes, ports, and infrastructure lie near the eastern segment of the Aleutian Trench that ruptured in the 1964 earthquake. Although the Great Alaska Earthquake was tragic because of the loss of life and property, it provided a wealth of data about subduction-zone earthquakes and the hazards they pose. The leap in scientific understanding that followed the 1964 earthquake has led to major breakthroughs in earth science research worldwide over the past half century. This fact sheet commemorates the Great Alaska Earthquake and

  20. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    Science.gov (United States)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should
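
    The probability-gain arithmetic behind this "low-probability environment" argument can be made concrete with a small worked example; the long-term rate and the gain factor below are illustrative values, not numbers from the paper.

      # Worked example of the probability-gain arithmetic referred to above.
      # A long-term forecast rate is converted to a one-week probability and then
      # scaled by a nominal short-term gain; the numbers are purely illustrative.
      import math

      annual_rate = 0.01            # long-term rate of a large earthquake (events/year), illustrative
      t_week = 7.0 / 365.25         # forecasting interval in years

      p_longterm = 1.0 - math.exp(-annual_rate * t_week)   # Poisson one-week probability
      gain = 100.0                                         # short-term probability gain from clustering
      p_shortterm = min(1.0, gain * p_longterm)

      print(f"long-term weekly probability: {p_longterm:.5f}")   # ~0.0002
      print(f"after a 100x gain:            {p_shortterm:.3f}")  # ~0.02, still only ~2%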

  1. Earthquake and welded structures 5: Earthquake damages and anti-earthquake measures of oil storage tanks; 5 kikenbutsu chozo tank no jishin higai to taishin taisaku

    Energy Technology Data Exchange (ETDEWEB)

    Kawano, K. [Chiyoda Chemical Engineering and Construction Co. Ltd., Tokyo (Japan)

    1997-09-05

The results of a field investigation of the damage to 236 of 687 hazardous-material storage tanks caused by the 1995 Hyogoken Nambu Earthquake are introduced, together with damage case studies and a description of countermeasures. Tilting and settlement of tank bodies were confirmed in 44% of the tanks investigated, particularly tanks with capacities below 1,000 kl. Regarding foundations and ground settlement, sand boils resulting from liquefaction were observed at 81% of the sites investigated, and the area surrounding the tanks roughly coincided with the area where ground cracks appeared. Many other types of damage, such as cracking of rainwater seals and failure of oil containment dikes, were also confirmed. In the latter half of the report, the seismic standards of the old and new regulations, as well as the new criteria for outdoor storage tank bodies, foundations, and ground, are tabulated, and four anti-earthquake measures are enumerated in accordance with the report of the investigation into the earthquake resistance of hazardous-material storage facilities: a final structural check for earthquakes exceeding the design allowable stress, consolidation of the tank body structure on the basis of the revised seismic coefficient method, assurance of a steadfast foundation, and prevention of elevated platforms from falling together with strengthening of waterproof seals and oil containment dikes. 3 refs., 5 figs., 3 tabs.

  2. Real-time earthquake data feasible

    Science.gov (United States)

    Bush, Susan

Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Sciences' Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts—an early warning system that would give a few seconds warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  3. Migrating pattern of deformation prior to the Tohoku-Oki earthquake revealed by GRACE data

    Science.gov (United States)

    Panet, Isabelle; Bonvalot, Sylvain; Narteau, Clément; Remy, Dominique; Lemoine, Jean-Michel

    2018-05-01

    Understanding how and when far-field continuous motions lead to giant subduction earthquakes remains a challenge. An important limitation comes from an incomplete description of aseismic mass fluxes at depth along plate boundaries. Here we analyse Earth's gravity field variations derived from GRACE satellite data in a wide space-time domain surrounding the Mw 9.0 2011 Tohoku-Oki earthquake. We show that this earthquake is the extreme expression of initially silent deformation migrating from depth to the surface across the entire subduction system. Our analysis indeed reveals large-scale gravity and mass changes throughout three tectonic plates and connected slabs, starting a few months before March 2011. Before the Tohoku-Oki earthquake rupture, the gravity variations can be explained by aseismic extension of the Pacific plate slab at mid-upper mantle depth, concomitant with increasing seismicity in the shallower slab. For more than two years after the rupture, the deformation propagated far into the Pacific and Philippine Sea plate interiors, suggesting that subduction accelerated along 2,000 km of the plate boundaries in March 2011. This gravitational image of the earthquake's long-term dynamics provides unique information on deep and crustal processes over intermediate timescales, which could be used in seismic hazard assessment.

  4. 2014 Update of the Pacific Northwest portion of the U.S. National Seismic Hazard Maps

    Science.gov (United States)

    Frankel, Arthur; Chen, Rui; Petersen, Mark; Moschetti, Morgan P.; Sherrod, Brian

    2015-01-01

    Several aspects of the earthquake characterization were changed for the Pacific Northwest portion of the 2014 update of the national seismic hazard maps, reflecting recent scientific findings. New logic trees were developed for the recurrence parameters of M8-9 earthquakes on the Cascadia subduction zone (CSZ) and for the eastern edge of their rupture zones. These logic trees reflect recent findings of additional M8 CSZ earthquakes using offshore deposits of turbidity flows and onshore tsunami deposits and subsidence. These M8 earthquakes each rupture a portion of the CSZ and occur in the time periods between M9 earthquakes that have an average recurrence interval of about 500 years. The maximum magnitude was increased for deep intraslab earthquakes. An areal source zone to account for the possibility of deep earthquakes under western Oregon was expanded. The western portion of the Tacoma fault was added to the hazard maps.

  5. Natural hazard risk perception of Italian population: case studies along national territory.

    Science.gov (United States)

    Gravina, Teresita; Tupputi Schinosa, Francesca De Luca; Zuddas, Isabella; Preto, Mattia; Marengo, Angelo; Esposito, Alessandro; Figliozzi, Emanuele; Rapinatore, Matteo

    2015-04-01

    Risk perception refers to the judgments people make about the characteristics and severity of risks. In recent years, risk perception studies have focused on providing cognitive input to the communication experts responsible for designing appropriate information and awareness strategies for citizens. Several authors have used questionnaires to assess the perception of natural hazards (seismic, landslide, cyclone, flood, volcanic), as questionnaires provide reliable quantitative data and permit comparison with similar surveys. In Italy, survey-based risk perception studies have also been carried out for natural risks of national importance, in particular the Somma-Vesuvio and Phlegrean Fields volcanic risks, but risk perception studies of local situations distributed across the whole national territory have been lacking. Natural hazards of national importance are frequently reported by the national mass media, and there is debate about civil protection emergency plans, whereas it can be difficult to obtain information on localized, regional-scale natural hazards spread across the national territory. The Italian peninsula is a geologically young area subject to endogenous phenomena (volcanoes, earthquakes) and exogenous phenomena that drive landscape evolution and create natural hazards for the population (landslides, coastal erosion, hydrogeological instability, sinkholes). For this reason, we investigated the perception of natural risks in various Italian localities where natural hazards have occurred but were not reported by the mass media, being only of local or historical relevance. We carried out surveys in different Italian localities affected by different types of natural hazard (landslides, coastal erosion, hydrogeological instability, sinkholes, volcanic phenomena and earthquakes) and compared the results, in order to understand the population's level of perception, awareness and preparation through civil protection exercises. Our findings support that risks

  6. Tsunami evacuation plans for future megathrust earthquakes in Padang, Indonesia, considering stochastic earthquake scenarios

    Directory of Open Access Journals (Sweden)

    A. Muhammad

    2017-12-01

    Full Text Available This study develops tsunami evacuation plans in Padang, Indonesia, using a stochastic tsunami simulation method. The stochastic results are based on multiple earthquake scenarios for different magnitudes (Mw 8.5, 8.75, and 9.0) that reflect asperity characteristics of the 1797 historical event in the same region. The generation of the earthquake scenarios involves probabilistic models of earthquake source parameters and stochastic synthesis of earthquake slip distributions. In total, 300 source models are generated to produce comprehensive tsunami evacuation plans in Padang. The tsunami hazard assessment results show that Padang may face significant tsunamis causing the maximum tsunami inundation height and depth of 15 and 10 m, respectively. A comprehensive tsunami evacuation plan – including horizontal evacuation area maps, assessment of temporary shelters considering the impact due to ground shaking and tsunami, and integrated horizontal–vertical evacuation time maps – has been developed based on the stochastic tsunami simulation results. The developed evacuation plans highlight that comprehensive mitigation policies can be produced from the stochastic tsunami simulation for future tsunamigenic events.
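
    The "stochastic synthesis of earthquake slip distributions" mentioned above can be illustrated with a deliberately crude stand-in: smoothing white noise on a fault grid and rescaling it so that the total seismic moment matches a target magnitude. The fault dimensions, rigidity, smoothing length, and use of a Gaussian filter are all assumptions of this sketch, not the method or parameter values used in the study.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)
    scenario_magnitudes = [8.5, 8.75, 9.0]                 # magnitude levels named in the abstract
    length_km, width_km, cell_km = 500.0, 150.0, 10.0      # assumed fault dimensions
    rigidity = 3.0e10                                      # Pa, typical crustal value

    nx, ny = int(length_km / cell_km), int(width_km / cell_km)
    cell_area = (cell_km * 1e3) ** 2                       # m^2

    for mw in scenario_magnitudes:
        target_moment = 10.0 ** (1.5 * mw + 9.1)           # N m, standard moment-magnitude relation
        raw = gaussian_filter(rng.standard_normal((ny, nx)), sigma=3.0)
        slip = np.clip(raw - raw.min(), 0.0, None) ** 2            # non-negative, heterogeneous pattern
        slip *= target_moment / (rigidity * cell_area * slip.sum())  # rescale to the target moment
        print(f"Mw {mw:.2f}: mean slip {slip.mean():.1f} m, peak slip {slip.max():.1f} m")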

  7. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    Science.gov (United States)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and a growing globalisation of the world's economies, we are presently facing a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions on our globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are therefore asked to come up with new solutions that are no longer exclusively aimed at the best possible quantification of present risks but also keep an eye on how these risks change with time and allow them to be projected into the future. This applies not only to the vulnerability component of earthquake risk but also to its hazard component, which has been recognized to be time-dependent as well. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  8. Probabilistic aftershock hazard analysis, two case studies in West and Northwest Iran

    Science.gov (United States)

    Ommi, S.; Zafarani, H.

    2018-01-01

    Aftershock hazard maps contain essential information for search-and-rescue operations and for re-occupation after a mainshock. Accordingly, the main purposes of this article are to study the aftershock decay parameters and to estimate the expected high-frequency ground motions (i.e., Peak Ground Acceleration, PGA) for recent large earthquakes in the Iranian plateau. For this aim, the Ahar-Varzaghan doublet earthquake (August 11, 2012; MN 6.5 and MN 6.3) and the Ilam (Murmuri) earthquake (August 18, 2014; MN 6.2) have been selected. The earthquake catalogue has been compiled using the Gardner and Knopoff (Bull Seismol Soc Am 64(5), 1363-1367, 1974) temporal and spatial windowing technique. The magnitude of completeness, the seismicity parameters (a, b), and the modified Omori law parameters (p, K, c) have been determined for these two earthquakes for the 14, 30, and 60 days after the mainshocks, and the temporal changes of the parameters (a, b, p, K, c) have been studied. The aftershock hazard maps for a probability of exceedance of 33% have been computed for the time periods of 14, 30, and 60 days after the Ahar-Varzaghan and Ilam (Murmuri) earthquakes. For calculating the expected PGA of aftershocks, regional and global ground motion prediction equations have been utilized, and amplification factors based on site classes have been applied in the calculation of PGA. These aftershock hazard maps show agreement between the PGAs of large aftershocks and the forecasted PGAs. The significant role of the b parameter in the Ilam (Murmuri) probabilistic aftershock hazard maps has also been investigated.
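
    As a rough illustration of how the quantities listed above (Gutenberg-Richter a and b, modified Omori p, K, c) combine into an aftershock forecast, the sketch below evaluates a Reasenberg-Jones-style rate model and integrates it over the 14-, 30-, and 60-day windows mentioned in the abstract. All parameter values are placeholders, not the values estimated for the Ahar-Varzaghan or Ilam sequences.

    import math

    a, b = -1.67, 1.0       # placeholder productivity and Gutenberg-Richter slope
    p, c = 1.1, 0.05        # placeholder modified Omori decay parameters (days)
    M_main, M_min = 6.5, 5.0

    def expected_aftershocks(t1, t2):
        """Expected number of aftershocks with M >= M_min between t1 and t2 days."""
        rate_scale = 10.0 ** (a + b * (M_main - M_min))
        if abs(p - 1.0) < 1e-9:
            time_term = math.log((t2 + c) / (t1 + c))
        else:
            time_term = ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p)) / (1.0 - p)
        return rate_scale * time_term

    for days in (14, 30, 60):
        n = expected_aftershocks(0.0, float(days))
        prob = 1.0 - math.exp(-n)       # Poisson probability of at least one such event
        print(f"{days:2d} days: expected count {n:.2f}, P(>=1 aftershock with M>=5) = {prob:.0%}")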

  9. Deterministic Tectonic Origin Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    Science.gov (United States)

    Necmioglu, O.; Meral Ozel, N.

    2014-12-01

    Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes a priori accurate assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and the lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a ''maximum earthquake'' predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as for more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic tsunami hazard analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) in each 0.5° x 0.5° bin for 0-40 km depth (310 bins in total) and for 40-100 km depth (92 bins in total) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration, with coarse (2 arc-min) and medium (1 arc-min) grid resolutions, have been run at EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source defined, using the shallow-water finite-difference SWAN code (Mader, 2004), for the magnitude range from 6.5 to the Mwmax defined for that bin, with an Mw increment of 0.1. Results show that not only the

  10. A new way of telling earthquake stories: MOBEE - the MOBile Earthquake Exhibition

    Science.gov (United States)

    Tataru, Dragos; Toma-Danila, Dragos; Nastase, Eduard

    2016-04-01

    In the last decades, the demand for and acknowledged importance of science outreach, in general and in geophysics in particular, has grown, as demonstrated by many international and national projects and other activities performed by research institutes. The National Institute for Earth Physics (NIEP) in Romania is the leading national institution for earthquake monitoring and research, having at the same time a declared focus on informing and educating a wide audience about geosciences and especially seismology. This is more than welcome, since Romania is a very active country from a seismological point of view, but not so reactive when it comes to diminishing the possible effects of a major earthquake. Over the last few decades, the country has experienced several major earthquakes which have claimed thousands of lives and millions in property damage (the 1940, 1977, 1986 and 1990 Vrancea earthquakes). In this context, within a partnership started in 2014 with the National Art University and the Siveco IT company, a group of researchers from NIEP initiated the MOBile Earthquake Exhibition (MOBEE) project. The main goal was to design a portable museum to take educational activities focused on seismology, seismic hazard and Earth science on the road. The exhibition is aimed mainly at school students of all ages, as it explains the main topics of geophysics through a unique combination of posters, digital animations and apps, large maquettes, exciting hands-on experiments, and 3D printed models. This project is singular in Romania and aims to transmit properly reviewed, up-to-date information regarding the definition of earthquakes, the ways natural hazards can affect people, buildings and the environment, and the measures to be taken to mitigate their aftermath. Many of the presented concepts can be used by teachers as a complementary way of demonstrating physics facts and concepts and explaining the processes that shape the dynamic Earth. It also involves

  11. Reliability of lifeline networks under seismic hazard

    International Nuclear Information System (INIS)

    Selcuk, A. Sevtap; Yuecemen, M. Semih

    1999-01-01

    Lifelines, such as pipelines, transportation, communication and power transmission systems, are networks which extend spatially over large geographical regions. The quantification of the reliability (survival probability) of a lifeline under seismic threat requires attention, as the proper functioning of these systems during or after a destructive earthquake is vital. In this study, a lifeline is idealized as an equivalent network whose element capacities are random and spatially correlated, and a comprehensive probabilistic model for the assessment of the reliability of lifelines under earthquake loads is developed. The seismic hazard that the network is exposed to is described by a probability distribution derived from past earthquake occurrence data. The seismic hazard analysis is based on the 'classical' seismic hazard analysis model with some modifications. An efficient algorithm developed by Yoo and Deo (Yoo YB, Deo N. A comparison of algorithms for terminal pair reliability. IEEE Transactions on Reliability 1988; 37: 210-215) is utilized for the evaluation of the network reliability; this algorithm eliminates the CPU time and memory capacity problems for large networks. A comprehensive computer program, called LIFEPACK, is coded in Fortran to carry out the numerical computations. Two detailed case studies are presented to show the implementation of the proposed model.
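
    The exact terminal-pair algorithm of Yoo and Deo used in LIFEPACK is not reproduced here. As a stand-in, the sketch below estimates the two-terminal reliability of a small hypothetical network by Monte Carlo simulation, with independent (rather than spatially correlated) element survival probabilities assumed for simplicity; the topology and probabilities are invented for illustration.

    import random

    # Hypothetical network: edges with assumed survival probabilities.
    edges = {
        ("A", "B"): 0.95, ("B", "C"): 0.90, ("A", "D"): 0.85,
        ("D", "C"): 0.92, ("B", "D"): 0.80,
    }
    source, sink = "A", "C"

    def connected(surviving_edges, src, dst):
        """Simple graph traversal over the surviving edges."""
        adjacency = {}
        for u, v in surviving_edges:
            adjacency.setdefault(u, []).append(v)
            adjacency.setdefault(v, []).append(u)
        stack, seen = [src], {src}
        while stack:
            node = stack.pop()
            if node == dst:
                return True
            for nxt in adjacency.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return False

    def terminal_pair_reliability(n_trials=100_000, seed=1):
        random.seed(seed)
        successes = 0
        for _ in range(n_trials):
            surviving = [e for e, prob in edges.items() if random.random() < prob]
            if connected(surviving, source, sink):
                successes += 1
        return successes / n_trials

    print(f"Estimated reliability {source}-{sink}: {terminal_pair_reliability():.3f}")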

  12. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on the basic design processes important to both non-specialists and engineers, so that readers become suitably well informed without needing to deal with the details of specialist understanding. The content of this encyclopedia informs technically inclined readers about the ways in which earthquakes can affect our infrastructure and how engineers go about designing against, mitigating and remediating these effects. The coverage spans buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  13. Strong motion modeling at the Paducah Diffusion Facility for a large New Madrid earthquake

    International Nuclear Information System (INIS)

    Herrmann, R.B.

    1991-01-01

    The Paducah Diffusion Facility is within 80 kilometers of the location of the very large New Madrid earthquakes which occurred during the winter of 1811-1812. Because of their size (seismic moment of 2.0 x 10^27 dyne-cm, or moment magnitude Mw = 7.5), the possible recurrence of these earthquakes is a major element in the assessment of seismic hazard at the facility. Probabilistic hazard analysis can provide uniform hazard response spectra estimates for structure evaluation, but deterministic modeling of such a large earthquake can provide strong constraints on the expected duration of motion. The large earthquake is modeled by specifying the earthquake fault and its orientation with respect to the site, and by specifying the rupture process. Synthetic time histories, based on forward modeling of the wavefield from each subelement, are combined to yield a three-component time history at the site. Various simulations are performed to sufficiently exercise possible spatial and temporal distributions of energy release on the fault. Preliminary results demonstrate the sensitivity of the method to various assumptions, and also indicate strongly that the total duration of ground motion at the site is controlled primarily by the length of the rupture process on the fault.
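
    The subelement-summation idea described above can be sketched very simply: each subfault contributes a pulse delayed by rupture propagation plus wave travel time, and the contributions are stacked at the site, so the summed duration grows with rupture length. The geometry, velocities, and pulse shape below are placeholder assumptions, not the wavefield modeling used in the study.

    import numpy as np

    dt, duration = 0.05, 120.0
    t = np.arange(0.0, duration, dt)
    rupture_velocity, shear_velocity = 2.8, 3.5      # km/s (assumed)
    hypocenter = np.array([0.0, 0.0])
    site = np.array([40.0, 60.0])                    # km from the fault origin (assumed)

    def subfault_pulse(times, onset, amplitude=1.0, rise_time=2.0):
        """A simple one-sided pulse standing in for a subfault time history."""
        shifted = np.clip(times - onset, 0.0, None)
        return amplitude * shifted * np.exp(-shifted / rise_time)

    ground_motion = np.zeros_like(t)
    for along_strike in np.arange(0.0, 120.0, 10.0):             # 120-km-long fault, 10-km subfaults
        subfault = np.array([along_strike, 0.0])
        rupture_delay = np.linalg.norm(subfault - hypocenter) / rupture_velocity
        travel_time = np.linalg.norm(site - subfault) / shear_velocity
        ground_motion += subfault_pulse(t, rupture_delay + travel_time)

    peak = ground_motion.max()
    duration_above = dt * np.count_nonzero(ground_motion > 0.05 * peak)
    print(f"Duration above 5% of peak: {duration_above:.1f} s")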

  14. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    Science.gov (United States)

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul S.; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.

  15. Evaluation and cataloging of Korean historical earthquakes

    International Nuclear Information System (INIS)

    Lee, Kew Hwa; Han, Young Woo; Lee, Jun Hui; Park, Ji Eok; Na, Kwang Wooing; Shin, Byung Ju

    1999-03-01

    Historical earthquake data of the Korean Peninsula, which are very important in evaluating the seismicity and seismic hazard of the peninsula, were collected and analyzed by seismologists and historians. A preliminary catalog of Korean historical earthquake data, translated into English, was compiled. The felt localities of 528 events felt at more than two places were indicated on maps, and MMI III isoseismals were drawn for 52 events of MMI ≥ VII. The epicenters and intensities of these MMI ≥ VII events were estimated from the isoseismal maps.

  16. Evaluation and cataloging of Korean historical earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kew Hwa; Han, Young Woo; Lee, Jun Hui; Park, Ji Eok; Na, Kwang Wooing; Shin, Byung Ju [The Reaearch Institute of Basic Sciences, Seoul Nationl Univ., Seoul (Korea, Republic of)

    1999-03-15

    Historical earthquake data of the Korean Peninsula, which are very important in evaluating the seismicity and seismic hazard of the peninsula, were collected and analyzed by seismologists and historians. A preliminary catalog of Korean historical earthquake data, translated into English, was compiled. The felt localities of 528 events felt at more than two places were indicated on maps, and MMI III isoseismals were drawn for 52 events of MMI ≥ VII. The epicenters and intensities of these MMI ≥ VII events were estimated from the isoseismal maps.

  17. Varenna workshop report. Operational earthquake forecasting and decision making

    Directory of Open Access Journals (Sweden)

    Warner Marzocchi

    2015-09-01

    Full Text Available A workshop on Operational earthquake forecasting and decision making was convened in Varenna, Italy, on June 8-11, 2014, under the sponsorship of the EU FP 7 REAKT (Strategies and tools for Real-time EArthquake risK reducTion) project, the Seismic Hazard Center at the Istituto Nazionale di Geofisica e Vulcanologia (INGV), and the Southern California Earthquake Center (SCEC). The main goal was to survey the interdisciplinary issues of operational earthquake forecasting (OEF), including the problems that OEF raises for decision making and risk communication. The workshop was attended by 64 researchers from universities, research centers, and governmental institutions in 11 countries. Participants and the workshop agenda are listed in the appendix. The workshop comprised six topical sessions structured around three main themes: the science of operational earthquake forecasting, decision making in a low-probability environment, and communicating hazard and risk. Each topic was introduced by a moderator and surveyed by a few invited speakers, who were then empaneled for an open discussion. The presentations were followed by poster sessions. During a wrap-up session on the last day, the reporters for each topical session summarized the main points that they had gleaned from the talks and open discussions. This report attempts to distill this workshop record into a brief overview of the workshop themes and to describe the range of opinions expressed during the discussions.

  18. Source Spectra and Site Response for Two Indonesian Earthquakes: the Tasikmalaya and Kerinci Events of 2009

    Science.gov (United States)

    Gunawan, I.; Cummins, P. R.; Ghasemi, H.; Suhardjono, S.

    2012-12-01

    Indonesia is very prone to natural disasters, especially earthquakes, due to its location in a tectonically active region. In September-October 2009 alone, intraslab and crustal earthquakes caused the deaths of thousands of people, severe infrastructure destruction and considerable economic loss. Thus, both intraslab and crustal earthquakes are important sources of earthquake hazard in Indonesia. Analysis of response spectra for these intraslab and crustal earthquakes is needed to yield more detail about earthquake properties. For both types of earthquakes, we have analysed available Indonesian seismic waveform data to constrain source and path parameters - i.e., low-frequency spectral level, Q, and corner frequency - at reference stations that appear to be little influenced by site response. We have carried out these analyses for the mainshocks as well as several aftershocks, and we obtain corner frequencies that are reasonably consistent with the constant stress drop hypothesis. Using these results, we consider extracting information about site response from other stations of the Indonesian strong-motion network that appear to be strongly affected by it. Such site response data, as well as earthquake source parameters, are important for assessing earthquake hazard in Indonesia.
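
    The source and path parameters named above (low-frequency spectral level, corner frequency, and Q) are usually interpreted through an omega-squared source model with whole-path attenuation. The sketch below simply evaluates such a model; the spectral level, corner frequency, Q, and travel time are placeholders, not values estimated for the Tasikmalaya or Kerinci events.

    import numpy as np

    frequencies = np.logspace(-1, 1.5, 200)   # 0.1 to ~30 Hz
    omega0 = 1.0e-3                           # low-frequency spectral level (assumed units)
    corner_frequency = 0.8                    # Hz (assumed)
    quality_factor = 300.0                    # path Q (assumed)
    travel_time = 25.0                        # s, assumed source-to-station travel time

    source = omega0 / (1.0 + (frequencies / corner_frequency) ** 2)      # Brune-type spectral shape
    path = np.exp(-np.pi * frequencies * travel_time / quality_factor)   # anelastic attenuation
    displacement_spectrum = source * path

    # In practice the observed spectrum would be fit (e.g. by least squares in
    # log space) to recover omega0, fc, and Q; here the model is simply evaluated.
    for f_target in (0.1, 1.0, 10.0):
        idx = int(np.argmin(np.abs(frequencies - f_target)))
        print(f"f ~ {f_target:4.1f} Hz: amplitude {displacement_spectrum[idx]:.3e}")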

  19. Induced seismicity provides insight into why earthquake ruptures stop

    KAUST Repository

    Galis, Martin

    2017-12-21

    Injection-induced earthquakes pose a serious seismic hazard but also offer an opportunity to gain insight into earthquake physics. Currently used models relating the maximum magnitude of injection-induced earthquakes to injection parameters do not incorporate rupture physics. We develop theoretical estimates, validated by simulations, of the size of ruptures induced by localized pore-pressure perturbations and propagating on prestressed faults. Our model accounts for ruptures growing beyond the perturbed area and distinguishes self-arrested from runaway ruptures. We develop a theoretical scaling relation between the largest magnitude of self-arrested earthquakes and the injected volume and find it consistent with observed maximum magnitudes of injection-induced earthquakes over a broad range of injected volumes, suggesting that, although runaway ruptures are possible, most injection-induced events so far have been self-arrested ruptures.
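
    To give a feel for what a scaling relation between maximum magnitude and injected volume implies, the sketch below compares a linear-in-volume bound of the McGarr (2014) type, M0 <= G*dV, with a dV**1.5 scaling of the general kind derived in this study for self-arrested ruptures. The prefactor gamma is an arbitrary placeholder chosen for illustration, not the coefficient from the paper; the comparison only shows how the two assumptions diverge as injected volume grows.

    import math

    G = 3.0e10        # shear modulus, Pa (typical crustal value)
    gamma = 1.5e9     # placeholder prefactor for the dV**1.5 scaling (illustrative only)

    def moment_magnitude(m0):
        """Moment magnitude from seismic moment in N m."""
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)

    for injected_volume in (1e3, 1e4, 1e5, 1e6):   # m^3
        m0_linear = G * injected_volume
        m0_threehalves = gamma * injected_volume ** 1.5
        print(f"dV = {injected_volume:9.0f} m^3: "
              f"Mw(linear) = {moment_magnitude(m0_linear):.1f}, "
              f"Mw(dV^1.5) = {moment_magnitude(m0_threehalves):.1f}")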

  20. A preliminary regional assessment of earthquake-induced landslide susceptibility for Vrancea Seismic Region

    Science.gov (United States)

    Micu, Mihai; Balteanu, Dan; Ionescu, Constantin; Havenith, Hans; Radulian, Mircea; van Westen, Cees; Damen, Michiel; Jurchescu, Marta

    2015-04-01

    ) with head scarps near mountain tops and close to faults is similar to that of large mass movements for which a seismic origin has been proved (such as in the Tien Shan, Pamir, Longmenshan, etc.). Thus, correlations between landslide occurrence and combined seismotectonic and climatic factors are needed to support a regional multi-hazard risk assessment. The purpose of this paper is to harmonize, for the first time at a regional scale, the landslide predisposing factors and seismotectonic triggers, and to present a first qualitative insight into the earthquake-induced landslide susceptibility of the Vrancea Seismic Region in terms of a GIS-based analysis of Newmark displacement (ND). In this way, it aims at better defining the spatial and temporal distribution patterns of earthquake-triggered landslides. The Arias intensity calculation involved in the assessment considers both regional seismic hazard aspects and single-earthquake scenarios (adjusted by topographic amplification factors). The known distribution of landslides, mapped through digital stereographic interpretation of high-resolution aerial photos, is compared with digital active fault maps and the computed ND maps to statistically outline the seismotectonic influence on slope stability in the study area. The importance of this approach resides in two main outputs. The first, of a fundamental nature, provides an initial regional insight into the framework of seismically triggered landslides, allowing us to understand whether deep-focus earthquakes may trigger massive slope failures in an area with relatively smooth relief (compared to the high mountain regions of Central Asia and the Himalayas), considering possible geologic and topographic site effects. The second, more applied, will allow better accelerometer instrumentation and monitoring of slopes and will also provide a first correlation of different levels of seismic shaking with precipitation recurrences, an important relationship within a multi-hazard risk
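
    The Newmark displacement (ND) referred to above is, at its core, a rigid sliding-block calculation: whenever ground acceleration exceeds the critical (yield) acceleration of a slope, the block acquires relative velocity, which is integrated to displacement until sliding stops. The sketch below applies that logic to a synthetic acceleration trace; the input motion and critical acceleration are arbitrary illustrations, not data or parameters from the Vrancea analysis.

    import numpy as np

    dt = 0.01                                          # s
    t = np.arange(0.0, 20.0, dt)
    rng = np.random.default_rng(0)
    ground_acc = 0.3 * 9.81 * np.exp(-0.2 * t) * rng.standard_normal(t.size)   # m/s^2, synthetic
    critical_acc = 0.05 * 9.81                         # m/s^2, assumed yield acceleration

    rel_velocity = 0.0
    displacement = 0.0
    for acc in ground_acc:
        if acc > critical_acc or rel_velocity > 0.0:
            rel_velocity += (acc - critical_acc) * dt  # one-directional (downslope) sliding
            rel_velocity = max(rel_velocity, 0.0)      # the block cannot slide uphill
            displacement += rel_velocity * dt

    print(f"Newmark displacement for this synthetic trace: {100.0 * displacement:.1f} cm")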

  1. Accessing northern California earthquake data via Internet

    Science.gov (United States)

    Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David

    The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, and is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through the Internet since mid-1992. The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN) operated by the USGS and the Berkeley Digital Seismic Network (BDSN) operated by the Seismographic Station at the University of California, Berkeley.

  2. Estimating annualized earthquake losses for the conterminous United States

    Science.gov (United States)

    Jaiswal, Kishor S.; Bausch, Douglas; Chen, Rui; Bouabid, Jawhar; Seligson, Hope

    2015-01-01

    We make use of the most recent National Seismic Hazard Maps (the years 2008 and 2014 cycles), updated census data on population, and economic exposure estimates of general building stock to quantify annualized earthquake loss (AEL) for the conterminous United States. The AEL analyses were performed using the Federal Emergency Management Agency's (FEMA) Hazus software, which facilitated a systematic comparison of the influence of the 2014 National Seismic Hazard Maps in terms of annualized loss estimates in different parts of the country. The losses from an individual earthquake could easily exceed many tens of billions of dollars, and the long-term averaged value of losses from all earthquakes within the conterminous U.S. has been estimated to be a few billion dollars per year. This study estimated nationwide losses to be approximately $4.5 billion per year (in 2012$), roughly 80% of which can be attributed to the States of California, Oregon and Washington. We document the change in estimated AELs arising solely from the change in the assumed hazard map. The change from the 2008 map to the 2014 map results in a 10 to 20% reduction in AELs for the highly seismic States of the Western United States, whereas the reduction is even more significant for Central and Eastern United States.

  3. Seismic hazard analysis. Application of methodology, results, and sensitivity studies

    International Nuclear Information System (INIS)

    Bernreuter, D.L.

    1981-10-01

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. (author)

  4. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.

  5. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
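
    For reference, the hazard function of the generalized Pareto distribution itself (in the event-magnitude domain, with scale \sigma > 0 and shape \xi) follows directly from its density and survival function; this textbook result is stated here only as context for the two abstracts above, and is not necessarily the exact quantity derived in the paper, which couples the magnitude series X to the failure-time series T:

    \[
    f(x) = \frac{1}{\sigma}\left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi - 1}, \qquad
    S(x) = \left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi}, \qquad
    h(x) = \frac{f(x)}{S(x)} = \frac{1}{\sigma + \xi x},
    \]

    so the hazard decreases with x for \xi > 0, is constant (the exponential case) in the limit as \xi tends to 0, and increases for \xi < 0.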

  6. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  7. Earthquake statistics, spatiotemporal distribution of foci and source mechanisms - a key to understanding of the West Bohemia/Vogtland earthquake swarms

    Science.gov (United States)

    Horálek, Josef; Čermáková, Hana; Fischer, Tomáš

    2016-04-01

    Earthquake swarms are sequences of numerous events closely clustered in space and time that lack a single dominant mainshock. A few of the largest events in a swarm reach similar magnitudes and usually occur throughout the course of the sequence. These attributes differentiate earthquake swarms from ordinary mainshock-aftershock sequences. Earthquake swarms occur worldwide, in diverse geological units; they typically accompany volcanic activity at tectonic-plate margins but also occur in intracontinental areas where the strain from plate movement is small. West Bohemia-Vogtland represents one of the most active intraplate earthquake-swarm areas in Europe and is characterised by the frequent recurrence of earthquake swarms. The ML ≥ 2.8 swarm events are located in a few dense clusters, which implies step-by-step rupturing of one or a few asperities during the individual swarms. The source-mechanism patterns (moment-tensor description, MT) of the individual swarms indicate several families of mechanisms, which fit well the geometry of the respective fault segments. The MTs of most events signify pure shear, except for the 1997-swarm events, whose MTs indicate combined sources including both shear and tensile components. The origin of earthquake swarms is still unclear. Nevertheless, we infer that the individual earthquake swarms in West Bohemia-Vogtland are mixtures of mainshock-aftershock sequences corresponding to step-by-step rupturing of one or a few asperities. The swarms occur on short fault segments with heterogeneous stress and strength, which may be affected by pressurized crustal fluids that reduce the normal component of the tectonic stress and lower the friction. In this way, critically loaded faults are brought to failure and the swarm activity is driven by the differential local stress.

  8. Long-Term Fault Memory: A New Time-Dependent Recurrence Model for Large Earthquake Clusters on Plate Boundaries

    Science.gov (United States)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.; Campbell, M. R.

    2017-12-01

    A challenge for earthquake hazard assessment is that geologic records often show large earthquakes occurring in temporal clusters separated by periods of quiescence. For example, in Cascadia, a paleoseismic record going back 10,000 years shows four to five clusters separated by approximately 1,000 year gaps. If we are still in the cluster that began 1700 years ago, a large earthquake is likely to happen soon. If the cluster has ended, a great earthquake is less likely. For a Gaussian distribution of recurrence times, the probability of an earthquake in the next 50 years is six times larger if we are still in the most recent cluster. Earthquake hazard assessments typically employ one of two recurrence models, neither of which directly incorporate clustering. In one, earthquake probability is time-independent and modeled as Poissonian, so an earthquake is equally likely at any time. The fault has no "memory" because when a prior earthquake occurred has no bearing on when the next will occur. The other common model is a time-dependent earthquake cycle in which the probability of an earthquake increases with time until one happens, after which the probability resets to zero. Because the probability is reset after each earthquake, the fault "remembers" only the last earthquake. This approach can be used with any assumed probability density function for recurrence times. We propose an alternative, Long-Term Fault Memory (LTFM), a modified earthquake cycle model where the probability of an earthquake increases with time until one happens, after which it decreases, but not necessarily to zero. Hence the probability of the next earthquake depends on the fault's history over multiple cycles, giving "long-term memory". Physically, this reflects an earthquake releasing only part of the elastic strain stored on the fault. We use the LTFM to simulate earthquake clustering along the San Andreas Fault and Cascadia. In some portions of the simulated earthquake history, events would
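
    The "six times larger" figure quoted above comes from conditioning a recurrence-time distribution on the elapsed time since the last event. The sketch below performs that conditioning for a Gaussian recurrence model with assumed round-number parameters; the mean, standard deviation, and elapsed time are illustrative, not the values the authors use for Cascadia or the San Andreas.

    from math import erf, sqrt

    mean_recurrence = 500.0   # yr, assumed within-cluster mean recurrence interval
    sigma = 200.0             # yr, assumed standard deviation
    elapsed = 320.0           # yr since the last large earthquake (assumed)
    window = 50.0             # yr forecast window

    def normal_cdf(x, mu, sd):
        return 0.5 * (1.0 + erf((x - mu) / (sd * sqrt(2.0))))

    # P(T <= elapsed + window | T > elapsed) for recurrence time T ~ N(mu, sd)
    numerator = (normal_cdf(elapsed + window, mean_recurrence, sigma)
                 - normal_cdf(elapsed, mean_recurrence, sigma))
    denominator = 1.0 - normal_cdf(elapsed, mean_recurrence, sigma)
    print(f"Conditional probability of an event in the next {window:.0f} yr: "
          f"{numerator / denominator:.1%}")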

  9. The Global Earthquake Model and Disaster Risk Reduction

    Science.gov (United States)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  10. Empirical ground-motion relations for subduction-zone earthquakes and their application to Cascadia and other regions

    Science.gov (United States)

    Atkinson, G.M.; Boore, D.M.

    2003-01-01

    Ground-motion relations for earthquakes that occur in subduction zones are an important input to seismic-hazard analyses in many parts of the world. In the Cascadia region (Washington, Oregon, northern California, and British Columbia), for example, there is a significant hazard from megathrust earthquakes along the subduction interface and from large events within the subducting slab. These hazards are in addition to the hazard from shallow earthquakes in the overlying crust. We have compiled a response spectra database from thousands of strong-motion recordings from events of moment magnitude (M) 5-8.3 occurring in subduction zones around the world, including both interface and in-slab events. The 2001 M 6.8 Nisqually and 1999 M 5.9 Satsop earthquakes are included in the database, as are many records from subduction zones in Japan (Kyoshin-Net data), Mexico (Guerrero data), and Central America. The size of the database is four times larger than that available for previous empirical regressions to determine ground-motion relations for subduction-zone earthquakes. The large dataset enables improved determination of attenuation parameters and magnitude scaling, for both interface and in-slab events. Soil response parameters are also better determined by the data. We use the database to develop global ground-motion relations for interface and in-slab earthquakes, using a maximum likelihood regression method. We analyze regional variability of ground-motion amplitudes across the global database and find that there are significant regional differences. In particular, amplitudes in Cascadia differ by more than a factor of 2 from those in Japan for the same magnitude, distance, event type, and National Earthquake Hazards Reduction Program (NEHRP) soil class. This is believed to be due to regional differences in the depth of the soil profile, which are not captured by the NEHRP site classification scheme. Regional correction factors to account for these differences are
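
    The regression step described above can be sketched with a deliberately simple functional form, ln(Y) = c1 + c2*M + c3*ln(R) + c4*R, fit to synthetic data; this is not the functional form or coefficient set of Atkinson and Boore (2003), and an ordinary least-squares fit stands in for their maximum-likelihood regression. The final comment notes how a factor-of-2 regional amplitude difference appears in this ln-amplitude space.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 500
    mags = rng.uniform(5.0, 8.3, n)                 # moment magnitudes
    dists = rng.uniform(30.0, 300.0, n)             # distances, km
    true_c = np.array([1.2, 1.1, -1.0, -0.004])     # assumed "true" coefficients
    ln_y = (true_c[0] + true_c[1] * mags + true_c[2] * np.log(dists)
            + true_c[3] * dists + rng.normal(0.0, 0.6, n))   # aleatory scatter

    design = np.column_stack([np.ones(n), mags, np.log(dists), dists])
    coeffs, *_ = np.linalg.lstsq(design, ln_y, rcond=None)
    print("recovered coefficients:", np.round(coeffs, 3))

    # A regional amplitude difference of a factor of 2, as reported between
    # Cascadia and Japan, corresponds to a constant offset of ln(2) ~ 0.69 here.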

  11. Initiatives to Reduce Earthquake Risk of Developing Countries

    Science.gov (United States)

    Tucker, B. E.

    2008-12-01

    The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. One would oversee the design and construction of

  12. Living With Earthquakes in the Pacific Northwest: A Survivor's Guide, 2nd edition

    Science.gov (United States)

    Hutton, Kate

    In 1995, Robert S. Yeats found himself teaching a core curriculum class at Oregon State University for undergraduate non-science majors, linking recent discoveries on the earthquake hazard in the Pacific Northwest to societal response to those hazards. The notes for that course evolved into the first edition of this book, published in 1998. In 2001, he published a similar book, Living With Earthquakes in California: A Survivor's Guide (Oregon State University Press). Recent earthquakes such as the 2001 Mw 6.8 Nisqually event, new discoveries and techniques in paleoseismology, and changes in public policy quickly outdated the first Pacific Northwest edition. This is especially true for the Cascadia Subduction Zone and crustal faults, where our knowledge expands with every scientific meeting.

  13. The MeSO-net (Metropolitan Seismic Observation network) confronts the Pacific Coast of Tohoku Earthquake, Japan (Mw 9.0)

    Science.gov (United States)

    Kasahara, K.; Nakagawa, S.; Sakai, S.; Nanjo, K.; Panayotopoulos, Y.; Morita, Y.; Tsuruoka, H.; Kurashimo, E.; Obara, K.; Hirata, N.; Aketagawa, T.; Kimura, H.

    2011-12-01

    methods. The data contribute to solving the so-called "problem of the Long-Period Ground Motion Hazard", an engineering problem concerning earthquake disaster prevention in cities (Koketsu et al., 2008). Moreover, we could determine the detailed distributions of the dominant periods of H/V spectral ratios and the ground responses excited in the metropolitan area (Tsuno et al., 2011). The overall results obtained under our project will contribute directly to the next assessment of seismic hazard in the Tokyo metropolitan area.

  14. Seismic and tsunami hazard in Puerto Rico and the Virgin Islands

    Science.gov (United States)

    Dillon, William P.; Frankel, Arthur D.; Mueller, Charles S.; Rodriguez, Rafael W.; ten Brink, Uri S.

    1999-01-01

    first day of the workshop, participants from universities, federal institutions, and consulting firms in Puerto Rico, the Virgin Islands, the continental U.S., the Dominican Republic, and Europe reviewed the present state of knowledge, including a review and discussion of current plate models, recent GPS and seismic reflection data, seismicity, paleoseismology, and tsunamis. The state of earthquake and tsunami studies in Puerto Rico was presented by several faculty members from the University of Puerto Rico at Mayaguez. A preliminary seismic hazard map was presented by the USGS, and previous hazard maps and economic loss assessments were considered. During the second day, the participants divided into working groups and prepared specific recommendations for future activities in the region on the six topics below. Highlights of these recommended activities are: Marine geology and geophysics - Acquire deep-penetration seismic reflection and refraction data, deploy temporary ocean bottom seismometer arrays to record earthquakes, collect high-resolution multibeam bathymetry and side-scan sonar data of the region, in particular the nearshore region, and conduct focused high-resolution seismic studies around faults. Determine slip rates of specific offshore faults. Assemble a GIS database for available marine geological and geophysical data. Paleoseismology and active faults - Conduct field reconnaissance aimed at identifying Quaternary faults and determining their paleoseismic chronology and slip rates, as well as identifying and dating paleoliquefaction features from large earthquakes. Carry out Quaternary mapping of marine terraces, fluvial terraces and basins, beach ridges, etc., to establish a framework for understanding neotectonic deformation of the island. Interpret aerial photography to identify possible Quaternary faults. Earthquake seismology - Determine an empirical seismic attenuation function using observations from local seismic networks and recently

  15. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    Science.gov (United States)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We therefore developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real-time analysis of the traffic on our website in the first minutes following an earthquake, an approach named flashsourcing. We show, using the example of the Mineral, Virginia, earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves and, therefore, that eyewitnesses can be considered ground-motion sensors. Flashsourcing discriminates felt

  16. Scientific, Engineering, and Financial Factors of the 1989 Human-Triggered Newcastle Earthquake in Australia

    Science.gov (United States)

    Klose, C. D.

    2006-12-01

    This presentation emphasizes the dualism of natural-resource exploitation and economic growth versus geomechanical pollution and the risk of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes with seismic moment magnitudes of 4.5 and larger have been documented worldwide; the 1989 Newcastle earthquake in Australia exemplifies the losses such triggered earthquakes can cause. A hazard assessment, based on a geomechanical crust model, shows that only four deep coal mines were responsible for triggering this severe earthquake. A small-scale economic risk assessment indicates that the financial loss due to earthquake damage has reduced mining profits that had been re-invested in the Newcastle region for almost two centuries, beginning in 1801. Furthermore, a large-scale economic risk assessment reveals that the financial loss is equivalent to 26% of the Australian Gross Domestic Product (GDP) growth in 1988/89. These costs account for 13% of the total costs of all natural disasters (e.g., flooding, drought, wild fires) and 94% of the costs of all earthquakes recorded in Australia between 1967 and 1999. In conclusion, the increasing number and size of geoengineering activities, such as coal mining near Newcastle or planned carbon dioxide geosequestration initiatives, represent a growing hazard potential that can negatively affect socio-economic growth and sustainable development. Finally, hazard and risk levels based on geomechanical-mathematical models can be forecast in space and over time for urban planning, in order to prevent economic losses from human-triggered earthquakes in the future.

  17. Seismic hazard maps for Haiti

    Science.gov (United States)

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest hazard in Haiti along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.

  18. Reassessment of probabilistic seismic hazard in the Marmara region

    Science.gov (United States)

    Kalkan, Erol; Gulkan, Polat; Yilmaz, Nazan; Çelebi, Mehmet

    2009-01-01

    In 1999, the eastern coastline of the Marmara region (Turkey) witnessed increased seismic activity on the North Anatolian fault (NAF) system with two damaging earthquakes (M 7.4 Kocaeli and M 7.2 Düzce) that occurred almost three months apart. These events have reduced stress on the western segment of the NAF where it continues under the Marmara Sea. The undersea fault segments have been recently explored using bathymetric and reflection surveys. These recent findings helped scientists to understand the seismotectonic environment of the Marmara basin, which has remained a perplexing tectonic domain. On the basis of collected new data, seismic hazard of the Marmara region is reassessed using a probabilistic approach. Two different earthquake source models: (1) the smoothed-gridded seismicity model and (2) fault model and alternate magnitude-frequency relations, Gutenberg-Richter and characteristic, were used with local and imported ground-motion-prediction equations. Regional exposure is computed and quantified on a set of hazard maps that provide peak horizontal ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 sec on uniform firm-rock site condition (760 m/sec average shear wave velocity in the upper 30 m). These acceleration levels were computed for ground motions having 2% and 10% probabilities of exceedance in 50 yr, corresponding to return periods of about 2475 and 475 yr, respectively. The maximum PGA computed (at rock site) is 1.5g along the fault segments of the NAF zone extending into the Marmara Sea. The new maps generally show 10% to 15% increase for PGA, 0.2 and 1.0 sec spectral acceleration values across much of Marmara compared to previous regional hazard maps. Hazard curves and smooth design spectra for three site conditions: rock, soil, and soft-soil are provided for the Istanbul metropolitan area as possible tools in future risk estimates.

  19. Lessons of L'Aquila for Operational Earthquake Forecasting

    Science.gov (United States)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  20. Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems

    Science.gov (United States)

    Day, S. J.; Fearnley, C. J.

    2013-12-01

    Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven to be surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system interact destructively to reduce the effectiveness of the overall mitigation system. We classify mitigation strategies into permanent, responsive and anticipatory. Permanent mitigation strategies, such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies depend critically on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies, such as tsunami and lahar warning systems, rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at-risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in the mitigation of volcanic hazards. Their critical limitations are due to uncertainties in the time, space and magnitude relationships between precursors and hazard events. Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted

  1. Fostering the uptake of satellite Earth Observation data for landslide hazard understanding: the CEOS Landslide Pilot

    Science.gov (United States)

    Kirschbaum, Dalia; Malet, Jean-Philippe; Roessner, Sigrid

    2017-04-01

    Landslides occur around the world, on every continent, and play an important role in the evolution of landscapes. They also represent a serious hazard in many areas of the world. Despite their importance, it has been estimated that past landslide and landslide potential maps cover less than 1% of the slopes in these landmasses. Systematic information on the type, abundance, and distribution of existing landslides is lacking. Even in countries where landslide information is abundant (e.g. Italy), the vast majority of landslides caused by meteorological (intense or prolonged rainfall, rapid snowmelt) or geophysical (earthquake) triggers go undetected. This paucity of knowledge has consequences on the design of effective remedial and mitigation measures. Systematic use of Earth observation (EO) data and technologies can contribute effectively to detect, map, and monitor landslides, and landslide prone hillsides, in different physiographic and climatic regions. The CEOS (Committee on Earth Observation Satellites) Working Group on Disasters has recently launched a Landslide Pilot (period 2017-2019) with the aim to demonstrate the effective exploitation of satellite EO across the full cycle of landslide disaster risk management, including preparedness, response, and recovery at global, regional, and local scales, with a distinct multi-hazard focus on cascading impacts and risks. The Landslide Pilot is focusing efforts on three objectives: 1. Establish effective practices for merging different Earth Observation data (e.g. optical and radar) to better monitor and map landslide activity over time and space. 2. Demonstrate how landslide products, models, and services can support disaster risk management for multi-hazard and cascading landslide events. 3. Engage and partner with data brokers and end users to understand requirements and user expectations and get feedback through the activities described in objectives 1-2. The Landslide Pilot was endorsed in April 2016 and work

  2. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
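
    The abstract above refers to spreadsheets; as a stand-in, the following minimal Python sketch shows how the quoted 50-year probabilities depend on the assumed mean recurrence interval (cluster versus full record) and on the choice between a time-independent Poisson model and a time-dependent renewal model (here lognormal). The recurrence intervals and coefficient of variation are illustrative placeholders, not the study's paleoseismic values; the ~320 years elapsed corresponds to the AD 1700 Cascadia earthquake.

    import math

    def lognormal_cdf(t, mu, sigma):
        return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

    def poisson_prob(mean_interval, horizon=50.0):
        """Time-independent (Poisson) chance of at least one event in `horizon` years."""
        return 1.0 - math.exp(-horizon / mean_interval)

    def conditional_prob(mean_interval, elapsed, horizon=50.0, cov=0.5):
        """Time-dependent chance of an event within `horizon` years, given that
        `elapsed` years have passed since the last one (lognormal recurrence)."""
        sigma = math.sqrt(math.log(1.0 + cov ** 2))
        mu = math.log(mean_interval) - 0.5 * sigma ** 2        # so the mean equals mean_interval
        survivor = 1.0 - lognormal_cdf(elapsed, mu, sigma)
        if survivor <= 0.0:
            return 1.0
        return (lognormal_cdf(elapsed + horizon, mu, sigma) - lognormal_cdf(elapsed, mu, sigma)) / survivor

    # Placeholder mean recurrence intervals: "within the recent cluster" vs "full record".
    for label, mean_t in [("within a cluster", 300.0), ("full record", 500.0)]:
        print(label,
              "Poisson:", round(poisson_prob(mean_t), 2),
              "conditional (320 yr elapsed):", round(conditional_prob(mean_t, 320.0), 2))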

  3. Environmental Hazards and Mud Volcanoes in Romania

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Romania, an eastern European country, is severely affected by a variety of natural hazards. These include frequent earthquakes, floods, landslides, soil erosion, and...

  4. Recent Mega-Thrust Tsunamigenic Earthquakes and PTHA

    Science.gov (United States)

    Lorito, S.

    2013-05-01

    The occurrence of several mega-thrust tsunamigenic earthquakes in the last decade, including but not limited to the 2004 Sumatra-Andaman, the 2010 Maule, and the 2011 Tohoku earthquakes, has been a dramatic reminder of the limitations in our capability of assessing earthquake and tsunami hazard and risk. However, increasingly high-quality geophysical observational networks have allowed the retrieval of more accurate models of the rupture process of mega-thrust earthquakes than ever before, thus paving the way for improved future hazard assessments. Probabilistic Tsunami Hazard Analysis (PTHA) methodology, in particular, is less mature than its seismic counterpart, PSHA. Recent research efforts of the tsunami science community worldwide have started to fill this gap, and to define some best practices that are being progressively employed in PTHA for different regions and coasts at threat. In the first part of my talk, I will briefly review some rupture models of recent mega-thrust earthquakes, and highlight some of their surprising features that likely result in bigger error bars associated with PTHA results. More specifically, recent events of unexpected size at a given location, and with unexpected rupture process features, posed first-order open questions which prevent the definition of a heterogeneous rupture probability along a subduction zone, despite several recent promising results on the subduction zone seismic cycle. In the second part of the talk, I will dig a bit more into a specific ongoing effort for improving PTHA methods, in particular as regards the determination of epistemic and aleatory uncertainties, and the computational feasibility of PTHA when considering the full assumed source variability. Usually only logic trees are made explicit in PTHA studies, accounting for different possible assumptions on the source zone properties and behavior. The selection of the earthquakes to be actually modelled is then in general made on a qualitative basis or remains implicit

  5. Earthquake Preparedness and Education: A Collective Impact Approach to Improving Awareness and Resiliency

    Science.gov (United States)

    Benthien, M. L.; Wood, M. M.; Ballmann, J. E.; DeGroot, R. M.

    2017-12-01

    The Southern California Earthquake Center (SCEC), headquartered at the University of Southern California, is a collaboration of more than 1000 scientists and students from 70+ institutions. SCEC's Communication, Education, and Outreach (CEO) program translates earthquake science into products and activities in order to increase scientific literacy, develop a diverse scientific workforce, and reduce earthquake risk to life and property. SCEC CEO staff coordinate these efforts through partnership collaborations it has established to engage subject matter experts, reduce duplication of effort, and achieve greater results. Several of SCEC's collaborative networks began within Southern California and have since grown statewide (Earthquake Country Alliance, a public-private-grassroots partnership), national ("EPIcenter" Network of museums, parks, libraries, etc.), and international (Great ShakeOut Earthquake Drills with millions of participants each year). These networks have benefitted greatly from partnerships with national (FEMA), state, and local emergency managers. Other activities leverage SCEC's networks in new ways and with national earth science organizations, such as the EarthConnections Program (with IRIS, NAGT, and many others), Quake Catcher Network (with IRIS) and the GeoHazards Messaging Collaboratory (with IRIS, UNAVCO, and USGS). Each of these partnerships share a commitment to service, collaborative development, and the application of research (including social science theory for motivating preparedness behaviors). SCEC CEO is developing new evaluative structures and adapting the Collective Impact framework to better understand what has worked well or what can be improved, according to the framework's five key elements: create a common agenda; share common indicators and measurement; engage diverse stakeholders to coordinate mutually reinforcing activities; initiate continuous communication; and provide "backbone" support. This presentation will provide

  6. The plan to coordinate NEHRP post-earthquake investigations

    Science.gov (United States)

    Holzer, Thomas L.; Borcherdt, Roger D.; Comartin, Craig D.; Hanson, Robert D.; Scawthorn, Charles R.; Tierney, Kathleen; Youd, T. Leslie

    2003-01-01

    This is the plan to coordinate domestic and foreign post-earthquake investigations supported by the National Earthquake Hazards Reduction Program (NEHRP). The plan addresses coordination of both the NEHRP agencies—Federal Emergency Management Agency (FEMA), National Institute of Standards and Technology (NIST), National Science Foundation (NSF), and U. S. Geological Survey (USGS)—and their partners. The plan is a framework for both coordinating what is going to be done and identifying responsibilities for post-earthquake investigations. It does not specify what will be done. Coordination is addressed in various time frames ranging from hours to years after an earthquake. The plan includes measures for (1) gaining rapid and general agreement on high-priority research opportunities, and (2) conducting the data gathering and field studies in a coordinated manner. It deals with identification, collection, processing, documentation, archiving, and dissemination of the results of post-earthquake work in a timely manner and easily accessible format.

  7. Data base and seismicity studies for Fagaras, Romania crustal earthquakes

    International Nuclear Information System (INIS)

    Moldovan, I.-A.; Enescu, B. D.; Pantea, A.; Constantin, A.; Bazacliu, O.; Malita, Z.; Moldoveanu, T.

    2002-01-01

    Besides the major impact of the Vrancea seismic region, one of the most important intermediate-depth earthquake sources of Europe, the Romanian crustal earthquake sources, from the Fagaras, Banat, Crisana, Bucovina or Dobrogea regions, have to be taken into consideration for seismicity studies or seismic hazard assessment. To determine the characteristics of the seismicity of the Fagaras seismogenic region, a revised and updated catalogue of Romanian earthquakes, recently compiled by Oncescu et al. (1999), is used. The catalogue contains 471 tectonic earthquakes and 338 induced earthquakes and is homogeneous starting with 1471 for I>VIII and starting with 1801 for I>VII. The catalogue is complete for magnitudes larger than 3 starting with 1982. In the studied zone only normal earthquakes occur, related to intracrustal fractures situated at 5 to 30 km depth. Most of them are of low energy, but once in a century a large destructive event occurs with epicentral intensity larger than VIII. The maximum expected magnitude is M_GR = 6.5 and the epicenter distribution outlines significant clustering in the zones and on the lines mentioned in the tectonic studies. Taking into account the date of the last major earthquake (1916) and the return periods of severely damaging shocks of over 85 years, a large shock is to be expected in the area very soon. That is why a seismicity and hazard study for this zone is necessary. The paper studies the variation of the b parameter (mean value 0.69), the activity value and the return periods, and plots seismicity maps and different histograms. The explosions from the Campulung quarry are excluded from the catalogue. Because the catalogue contains the aftershocks of the 1916 earthquake, these shocks have also been excluded from the seismicity studies. (authors)
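
    A minimal sketch of the kind of catalogue statistics mentioned above: the maximum-likelihood b-value (Aki, 1965) of the Gutenberg-Richter relation and a simple extrapolation to a return period. The magnitude list, catalogue duration and completeness threshold are synthetic placeholders, not the Fagaras catalogue.

    import math

    def b_value_mle(mags, m_min):
        """Aki (1965): b = log10(e) / (mean(M) - Mmin), for a complete, unbinned catalogue."""
        usable = [m for m in mags if m >= m_min]
        return math.log10(math.e) / (sum(usable) / len(usable) - m_min)

    def return_period(mags, years, m_target, m_min):
        """Return period of events >= m_target, extrapolated with Gutenberg-Richter:
        log10 N(>=M) = a - b*M, with 'a' fixed by the observed rate at m_min."""
        usable = [m for m in mags if m >= m_min]
        b = b_value_mle(mags, m_min)
        rate_min = len(usable) / years                         # annual rate of M >= m_min
        rate_target = rate_min * 10 ** (-b * (m_target - m_min))
        return 1.0 / rate_target

    mags = [3.1, 3.3, 3.0, 3.8, 4.2, 3.5, 3.0, 4.9, 3.2, 3.6]  # synthetic magnitudes
    print(round(b_value_mle(mags, 3.0), 2))
    print(round(return_period(mags, years=30.0, m_target=6.5, m_min=3.0)))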

  8. Surface Rupture Effects on Earthquake Moment-Area Scaling Relations

    Science.gov (United States)

    Luo, Yingdi; Ampuero, Jean-Paul; Miyakoshi, Ken; Irikura, Kojiro

    2017-09-01

    Empirical earthquake scaling relations play a central role in fundamental studies of earthquake physics and in current practice of earthquake hazard assessment, and are being refined by advances in earthquake source analysis. A scaling relation between seismic moment (M0) and rupture area (A) currently in use for ground motion prediction in Japan features a transition regime of the form M0 ∝ A², between the well-recognized small (self-similar) and very large (W-model) earthquake regimes, which has counter-intuitive attributes and uncertain theoretical underpinnings. Here, we investigate the mechanical origin of this transition regime via earthquake cycle simulations, analytical dislocation models and numerical crack models on strike-slip faults. We find that, even if stress drop is assumed constant, the properties of the transition regime are controlled by surface rupture effects, comprising an effective rupture elongation along-dip due to a mirror effect and systematic changes of the shape factor relating slip to stress drop. Based on this physical insight, we propose a simplified formula to account for these effects in M0-A scaling relations for strike-slip earthquakes.
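
    The scaling regimes discussed above can be made concrete with a short sketch contrasting constant-stress-drop, self-similar scaling (a circular-crack estimate, M0 ∝ A^1.5) with an illustrative M0 ∝ A² transition regime. The stress drop and the A² prefactor are arbitrary placeholders, not values proposed by the authors.

    import math

    def moment_self_similar(area_km2, stress_drop_mpa=3.0):
        """Circular-crack estimate: M0 = (16/7) * dSigma * r^3 with A = pi * r^2.
        Returns seismic moment in N*m (area in km^2, stress drop in MPa)."""
        area_m2 = area_km2 * 1.0e6
        dsigma = stress_drop_mpa * 1.0e6
        return (16.0 / 7.0) * dsigma * (area_m2 / math.pi) ** 1.5

    def moment_transition(area_km2, prefactor=1.0e13):
        """Illustrative transition-regime scaling M0 = C * A^2 (A in km^2, C in N*m/km^4)."""
        return prefactor * area_km2 ** 2

    def moment_magnitude(m0):
        """Hanks & Kanamori (1979): Mw = (2/3) * (log10 M0 - 9.1), with M0 in N*m."""
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)

    for area in (100.0, 1000.0, 10000.0):                      # rupture areas in km^2
        print(area,
              round(moment_magnitude(moment_self_similar(area)), 2),
              round(moment_magnitude(moment_transition(area)), 2))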

  9. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes; not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design.

  10. Major earthquakes occur regularly on an isolated plate boundary fault.

    Science.gov (United States)

    Berryman, Kelvin R; Cochran, Ursula A; Clark, Kate J; Biasi, Glenn P; Langridge, Robert M; Villamor, Pilar

    2012-06-29

    The scarcity of long geological records of major earthquakes, on different types of faults, makes testing hypotheses of regular versus random or clustered earthquake recurrence behavior difficult. We provide a fault-proximal major earthquake record spanning 8000 years on the strike-slip Alpine Fault in New Zealand. Cyclic stratigraphy at Hokuri Creek suggests that the fault ruptured to the surface 24 times, and event ages yield a 0.33 coefficient of variation in recurrence interval. We associate this near-regular earthquake recurrence with a geometrically simple strike-slip fault, with high slip rate, accommodating a high proportion of plate boundary motion that works in isolation from other faults. We propose that it is valid to apply time-dependent earthquake recurrence models for seismic hazard estimation to similar faults worldwide.
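
    The coefficient of variation quoted above is simply the standard deviation of the recurrence intervals divided by their mean; a value near 0 indicates quasi-periodic recurrence while a value near 1 is consistent with a Poisson process. A tiny sketch with synthetic event ages (not the Hokuri Creek record):

    from statistics import mean, stdev

    def recurrence_cov(event_ages):
        """Coefficient of variation of recurrence intervals; `event_ages` in years BP."""
        ages = sorted(event_ages, reverse=True)                 # oldest first
        intervals = [older - younger for older, younger in zip(ages, ages[1:])]
        return stdev(intervals) / mean(intervals), intervals

    ages = [7800, 7450, 7150, 6900, 6500, 6200, 5850]           # synthetic event ages, yr BP
    cov, intervals = recurrence_cov(ages)
    print(intervals, round(cov, 2))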

  11. The distinction between risk and hazard: understanding and use in stakeholder communication.

    Science.gov (United States)

    Scheer, Dirk; Benighaus, Christina; Benighaus, Ludger; Renn, Ortwin; Gold, Stefan; Röder, Bettina; Böl, Gaby-Fleur

    2014-07-01

    A major issue in all risk communication efforts is the distinction between the terms "risk" and "hazard." The potential to harm a target such as human health or the environment is normally defined as a hazard, whereas risk also encompasses the probability of exposure and the extent of damage. What can be observed again and again in risk communication processes are misunderstandings and communication gaps related to these crucial terms. We asked a sample of 53 experts from public authorities, business and industry, and environmental and consumer organizations in Germany to outline their understanding and use of these terms using both the methods of expert interviews and focus groups. The empirical study made clear that the terms risk and hazard are perceived and used very differently in risk communication depending on the perspective of the stakeholders. Several factors can be identified, such as responsibility for hazard avoidance, economic interest, or a watchdog role. Thus, communication gaps can be reduced to a four-fold problem matrix comprising a semantic, conceptual, strategic, and control problem. The empirical study made clear that risks and hazards are perceived very differently depending on the stakeholders' perspective. Their own worldviews played a major role in their specific use of the two terms hazards and risks in communication. © 2014 Society for Risk Analysis.

  12. Sensitivity Analysis of Evacuation Speed in Hypothetical NPP Accident by Earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Effective emergency response in an emergency situation at a nuclear power plant (NPP) can substantially change the consequences; it is therefore regarded as important when establishing an emergency response plan and assessing the risk of a hypothetical NPP accident. The emergency response situation can change completely when an NPP accident caused by an earthquake or tsunami is considered, due to the failure of roads and buildings in the disaster. Among the various relevant factors, this study focuses on evacuation speed and investigates a reasonable evacuation speed for an earthquake scenario. Finally, a sensitivity analysis of evacuation speed in a hypothetical NPP accident caused by an earthquake has been performed. The evacuation scenario can be entirely different in a situation of seismic hazard, which motivates this sensitivity analysis. Various references were investigated and an earthquake evacuation model has been developed, considering that evacuees may switch from using a vehicle to walking when they face difficulty using a vehicle due to intense traffic jams, failure of buildings and roads, etc. The population dose within 5 km / 30 km was found to increase in the earthquake situation due to the decreased evacuation speed, becoming 1.5 - 2 times larger in the severest earthquake evacuation scenario set up in this study. It is not appropriate to use the same emergency response model as for normal evacuation situations when performing a Level 3 probabilistic safety assessment for earthquake and tsunami events. Investigation of data and a sensitivity analysis for constructing a differentiated emergency response model in the event of a seismic hazard have been carried out in this study.

  13. Sensitivity Analysis of Evacuation Speed in Hypothetical NPP Accident by Earthquake

    International Nuclear Information System (INIS)

    Kim, Sung-yeop; Lim, Ho-Gon

    2016-01-01

    Effective emergency response in an emergency situation at a nuclear power plant (NPP) can substantially change the consequences; it is therefore regarded as important when establishing an emergency response plan and assessing the risk of a hypothetical NPP accident. The emergency response situation can change completely when an NPP accident caused by an earthquake or tsunami is considered, due to the failure of roads and buildings in the disaster. Among the various relevant factors, this study focuses on evacuation speed and investigates a reasonable evacuation speed for an earthquake scenario. Finally, a sensitivity analysis of evacuation speed in a hypothetical NPP accident caused by an earthquake has been performed. The evacuation scenario can be entirely different in a situation of seismic hazard, which motivates this sensitivity analysis. Various references were investigated and an earthquake evacuation model has been developed, considering that evacuees may switch from using a vehicle to walking when they face difficulty using a vehicle due to intense traffic jams, failure of buildings and roads, etc. The population dose within 5 km / 30 km was found to increase in the earthquake situation due to the decreased evacuation speed, becoming 1.5 - 2 times larger in the severest earthquake evacuation scenario set up in this study. It is not appropriate to use the same emergency response model as for normal evacuation situations when performing a Level 3 probabilistic safety assessment for earthquake and tsunami events. Investigation of data and a sensitivity analysis for constructing a differentiated emergency response model in the event of a seismic hazard have been carried out in this study

  14. Fault roughness and strength heterogeneity control earthquake size and stress drop

    KAUST Repository

    Zielke, Olaf

    2017-01-13

    An earthquake's stress drop is related to the frictional breakdown during sliding and constitutes a fundamental quantity of the rupture process. High-speed laboratory friction experiments that emulate the rupture process imply stress drop values that greatly exceed those commonly reported for natural earthquakes. We hypothesize that this stress drop discrepancy is due to fault-surface roughness and strength heterogeneity: an earthquake's moment release and its recurrence probability depend not only on stress drop and rupture dimension but also on the geometric roughness of the ruptured fault and the location of failing strength asperities along it. Using large-scale numerical simulations of earthquake ruptures under varying roughness and strength conditions, we verify our hypothesis, showing that smoother faults may generate larger earthquakes than rougher faults under identical tectonic loading conditions. We further discuss the potential impact of fault roughness on earthquake recurrence probability. This finding also provides important information for seismic hazard analysis.

  15. Earthquake Activities Along the Strike-Slip Fault System on the Thailand-Myanmar Border

    Directory of Open Access Journals (Sweden)

    Santi Pailoplee

    2014-01-01

    Full Text Available This study investigates the present-day seismicity along the strike-slip fault system on the Thailand-Myanmar border. Using the earthquake catalogue, the earthquake parameters representing seismic activities were evaluated in terms of the possible maximum magnitude, return period and earthquake occurrence probabilities. Three different hazardous areas could be distinguished from the obtained results. The most seismic-prone area was located along the northern segment of the fault system and can generate earthquakes of magnitude 5.0, 5.8, and 6.8 mb in the next 5, 10, and 50 years, respectively. The second most-prone area was the southern segment, where earthquakes of magnitude 5.0, 6.0, and 7.0 mb might be generated every 18, 60, and 300 years, respectively. For the central segment, there was less than a 30 and 10% probability that 6.0- and 7.0-mb earthquakes will be generated in the next 50 years. With regard to the significant infrastructures (dams) in the vicinity, the operational Wachiralongkorn dam is situated in a low seismic hazard area with a return period of around 30 - 3000 years for a 5.0 - 7.0 mb earthquake. In contrast, the Hut Gyi, Srinakarin and Tha Thung Na dams are seismically at risk of earthquakes of mb 6.4 - 6.5 being generated in the next 50 years. Plans for a seismic retrofit should therefore be completed and implemented, while seismic monitoring in this region is indispensable.
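
    Return periods and time-window probabilities such as those quoted above are commonly related through a Poisson assumption, P = 1 - exp(-t/T). The sketch below applies this conversion to the southern-segment return periods quoted in the abstract; the Poisson assumption itself is ours, not necessarily the authors' model.

    import math

    def poisson_exceedance(return_period_yr, horizon_yr):
        """Probability of at least one event in `horizon_yr` for mean return period `return_period_yr`."""
        return 1.0 - math.exp(-horizon_yr / return_period_yr)

    # Quoted southern-segment return periods: 18, 60, 300 yr for mb 5.0, 6.0, 7.0.
    for magnitude, T in [(5.0, 18.0), (6.0, 60.0), (7.0, 300.0)]:
        p50 = poisson_exceedance(T, 50.0)
        print(f"mb {magnitude}: return period {T:.0f} yr -> {100 * p50:.0f}% chance in 50 yr")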

  16. Integrating population dynamics into mapping human exposure to seismic hazard

    Directory of Open Access Journals (Sweden)

    S. Freire

    2012-11-01

    Full Text Available Disaster risk is not fully characterized without taking into account vulnerability and population exposure. Assessment of earthquake risk in urban areas would benefit from considering the variation of population distribution at more detailed spatial and temporal scales, and from a more explicit integration of this improved demographic data with existing seismic hazard maps. In the present work, "intelligent" dasymetric mapping is used to model population dynamics at high spatial resolution in order to benefit the analysis of spatio-temporal exposure to earthquake hazard in a metropolitan area. These night- and daytime-specific population densities are then classified and combined with seismic intensity levels to derive new spatially-explicit four-class-composite maps of human exposure. The presented approach enables a more thorough assessment of population exposure to earthquake hazard. Results show that there are significantly more people potentially at risk in the daytime period, demonstrating the shifting nature of population exposure in the daily cycle and the need to move beyond conventional residence-based demographic data sources to improve risk analyses. The proposed fine-scale maps of human exposure to seismic intensity are mainly aimed at benefiting visualization and communication of earthquake risk, but can be valuable in all phases of the disaster management process where knowledge of population densities is relevant for decision-making.
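
    A minimal sketch of the kind of composite-class mapping described above: classified population density (day- or night-time) is combined with classified seismic intensity into a small number of exposure classes. The grids, class breaks and the combination rule are hypothetical placeholders, not the study's actual classification.

    import numpy as np

    def composite_exposure(pop_density, intensity, pop_breaks=(50, 500, 5000),
                           intensity_breaks=(6.5, 7.5)):
        """Return an integer exposure class per grid cell: higher = more exposed."""
        pop_class = np.digitize(pop_density, pop_breaks)        # 0..3
        int_class = np.digitize(intensity, intensity_breaks)    # 0..2
        combined = pop_class + int_class                         # 0..5
        return np.clip(combined - 1, 0, 3)                       # collapse to four classes

    daytime_pop = np.array([[10, 800], [6000, 200]], dtype=float)    # persons / km^2
    nighttime_pop = np.array([[400, 90], [1500, 2500]], dtype=float)
    intensity = np.array([[7.8, 7.0], [6.0, 7.6]])                   # macroseismic intensity

    print(composite_exposure(daytime_pop, intensity))
    print(composite_exposure(nighttime_pop, intensity))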

  17. Fossil landscapes and youthful seismogenic sources in the central Apennines: excerpts from the 24 August 2016, Amatrice earthquake and seismic hazard implications

    Directory of Open Access Journals (Sweden)

    Gianluca Valensise

    2016-11-01

    Full Text Available We show and discuss the similarities among the 2016 Amatrice (Mw 6.0), 1997 Colfiorito-Sellano (Mw 6.0-5.6) and 2009 L'Aquila (Mw 6.3) earthquakes. They all occurred along the crest of the central Apennines and were caused by shallow dipping faults between 3 and 10 km depth, as shown by their characteristic InSAR signature. We contend that these earthquakes delineate a seismogenic style that is characteristic of this portion of the central Apennines, where the upward propagation of seismogenic faults is hindered by the presence of pre-existing regional thrusts. This leads to an effective decoupling between the deeper seismogenic portion of the upper crust and its uppermost 3 km. The decoupling implies that active faults mapped at the surface do not connect with the seismogenic sources, and that their evolution may be controlled by passive readjustments to coseismic strains or even by purely gravitational motions. Seismic hazard analyses and estimates based on such faults should hence be considered with great caution, as they may not be representative of the true seismogenic potential.

  18. Global Omori law decay of triggered earthquakes: Large aftershocks outside the classical aftershock zone

    Science.gov (United States)

    Parsons, Tom

    2002-09-01

    Triggered earthquakes can be large, damaging, and lethal as evidenced by the 1999 shocks in Turkey and the 2001 earthquakes in El Salvador. In this study, earthquakes with Ms ≥ 7.0 from the Harvard centroid moment tensor (CMT) catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near (defined as having shear stress change ∣Δτ∣ ≥ 0.01 MPa) the Ms ≥ 7.0 shocks are associated with calculated shear stress increases, while ˜39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, these triggered earthquakes obey an Omori law rate decay that lasts between ˜7-11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main shock centroid. Omori's law is one of the few time-predictable patterns evident in the global occurrence of earthquakes. If large triggered earthquakes habitually obey Omori's law, then their hazard can be more readily assessed. The characteristic rate change with time and spatial distribution can be used to rapidly assess the likelihood of triggered earthquakes following events of Ms ≥ 7.0. I show an example application to the M = 7.7 13 January 2001 El Salvador earthquake where use of global statistics appears to provide a better rapid hazard estimate than Coulomb stress change calculations.
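
    The Omori-law decay referred to above is usually written as a rate n(t) = K / (t + c)^p; integrating this rate over a time window gives the expected number of triggered events. The parameter values below are illustrative placeholders, not the global estimates of the study.

    import math

    def omori_rate(t_days, K=10.0, c=0.5, p=1.1):
        """Modified Omori law: events per day at time t after the mainshock."""
        return K / (t_days + c) ** p

    def expected_events(t1_days, t2_days, K=10.0, c=0.5, p=1.1):
        """Closed-form integral of the modified Omori rate between t1 and t2 (days)."""
        if abs(p - 1.0) < 1e-9:
            return K * (math.log(t2_days + c) - math.log(t1_days + c))
        def antiderivative(t):
            return (t + c) ** (1.0 - p) / (1.0 - p)
        return K * (antiderivative(t2_days) - antiderivative(t1_days))

    decay_years = 9.0                      # mid-point of the ~7-11 yr decay window quoted above
    print(round(expected_events(0.0, 365.25 * decay_years), 1))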

  19. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    Science.gov (United States)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground

  20. Tsunami hazard in the Caribbean: Regional exposure derived from credible worst case scenarios

    Science.gov (United States)

    Harbitz, C. B.; Glimsdal, S.; Bazin, S.; Zamora, N.; Løvholt, F.; Bungum, H.; Smebye, H.; Gauer, P.; Kjekstad, O.

    2012-04-01

    The present study documents a high tsunami hazard in the Caribbean region, with several thousands of lives lost in tsunamis and associated earthquakes since the XIXth century. Since then, the coastal population of the Caribbean and the Central West Atlantic region has grown significantly and is still growing. Understanding this hazard is therefore essential for the development of efficient mitigation measures. To this end, we report a regional tsunami exposure assessment based on potential and credible seismic and non-seismic tsunamigenic sources. Regional tsunami databases have been compiled and reviewed, and on this basis five main scenarios have been selected to estimate the exposure. The scenarios comprise two Mw8 earthquake tsunamis (north of Hispaniola and east of Lesser Antilles), two subaerial/submarine volcano flank collapse tsunamis (Montserrat and Saint Lucia), and one tsunami resulting from a landslide on the flanks of the Kick'em Jenny submarine volcano (north of Grenada). Offshore tsunami water surface elevations as well as maximum water level distributions along the shore lines are computed and discussed for each of the scenarios. The number of exposed people has been estimated in each case, together with a summary of the tsunami exposure for the earthquake and the landslide tsunami scenarios. For the earthquake scenarios, the highest tsunami exposure relative to the population is found for Guadeloupe (6.5%) and Antigua (7.5%), while Saint Lucia (4.5%) and Antigua (5%) have been found to have the highest tsunami exposure relative to the population for the landslide scenarios. Such high exposure levels clearly warrant more attention on dedicated mitigation measures in the Caribbean region.

  1. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    Science.gov (United States)

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities. The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps called ShakeMaps calculated for the scenario earthquake sources defined in WGCEP. The study considers the effect of relaxing certain assumptions in the WG02 model, and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty. Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions

  2. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    Science.gov (United States)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. In this decade we have lost, on average, about 120 people a year to natural hazards. Earthquakes are especially noteworthy, since they may kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, fasten heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, and this results in the loss of many lives. Only the victims learn something from an earthquake, and the lessons have never become the common lore of the nation. One of the most essential ways to reduce the damage is to educate the general public so that they can make sound decisions about what to do at the moment an earthquake hits. This requires knowledge of the background of the ongoing phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore issued a public call to select several model areas in which to bring scientific education to local elementary schools. This presentation reports on a year and a half of courses that we held at the model elementary school in the Tokyo metropolitan area. The tectonic setting of this area is very complicated: the Pacific and Philippine Sea plates subduct beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to occur with a probability of 70% in 30 years. This is of immediate concern for the devastating loss of life and property, because the Tokyo urban region now has a population of 42 million and is the center of approximately 40% of the nation's activities, which may cause great global

  3. Vrancea earthquakes. Courses for specific actions to mitigate seismic risk

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru

    2005-01-01

    Earthquakes in the Carpathian-Pannonian region are confined to the crust, except in the Vrancea zone, where earthquakes with focal depths down to 200 km occur. For example, the ruptured area migrated in depth: 150-180 km (November 10, 1940, Mw = 7.7), 90-110 km (March 4, 1977, Mw = 7.4), 130-150 km (August 30, 1986, Mw = 7.1) and 70-90 km (May 30, 1990, Mw = 6.9). The depth interval between 110 km and 130 km has not ruptured since October 26, 1802, when the strongest earthquake in this part of Central Europe occurred. Its magnitude is assumed to be Mw = 7.9 - 8.0, and this depth interval is a natural candidate for the next strong Vrancea event. While no country in the world is entirely safe, the lack of capacity to limit the impact of seismic hazards remains a major burden for all countries; and while the world has witnessed an exponential increase in human and material losses due to earthquake disasters, there is a need to reverse these trends through mitigation of the seismic risk of future events. The main courses of specific action to mitigate the seismic risk posed by strong deep Vrancea earthquakes should be considered key development actions: - Early warning system for industrial facilities. Early warning is more than a technological instrument to detect, monitor and issue warnings. It should become part of a management information system for decision-making in the context of national institutional frameworks for disaster management, and part of national and local strategies and programmes for risk mitigation; - Short- and long-term prediction program for strong Vrancea earthquakes; - Seismic hazard map of Romania. A wrong assessment of the seismic hazard can lead to dramatic situations such as those in Bucharest or Kobe. Before the 1977 Vrancea earthquake, the city of Bucharest was designed for intensity I = VII (MMI), whereas the real intensity was I = IX1/2-X (MMI); - Seismic microzonation of large populated

  4. A reliable simultaneous representation of seismic hazard and of ground shaking recurrence

    Science.gov (United States)

    Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.

    2015-12-01

    Different earthquake hazard maps may be appropriate for different purposes - such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may allow the formulation of cost-effective policies in some specific applications, provided that statistically sound recurrence estimates are used, which is not typically the case in PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure to associate the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) with an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full waveform modeling. From the synthetic seismograms the estimates of peak ground acceleration, velocity and displacement, or any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) and it does not supply the frequency of occurrence of the expected ground shaking. A recently developed, enhanced variant of NDSHA reliably accounts for recurrence, and it is applied here to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by data (e.g. a multi-scale seismicity model), so that a recurrence estimate is associated with each of the pertinent sources. In this way a standard NDSHA map of ground shaking is obtained simultaneously with the map of the corresponding recurrences. The introduction of recurrence estimates in NDSHA naturally allows for the generation of ground shaking maps at specified return periods. This permits a straightforward comparison between NDSHA and PSHA maps.

  5. Understanding earthquakes: The key role of radar images

    International Nuclear Information System (INIS)

    Atzori, Simone

    2013-01-01

    The investigation of the fault rupture underlying earthquakes has greatly improved thanks to the spread of radar images. Following pioneering applications in the 1980s, Interferometry from Synthetic Aperture Radar (InSAR) gained a prominent role in geodesy. Its capability to measure millimetric deformations over wide areas, together with the increased data availability since the early 1990s, made InSAR a widespread and accepted analysis tool in tectonics, though several factors contribute to reducing data quality. With the introduction of analytical or numerical modeling, InSAR maps are used to infer the source of an earthquake by means of data inversion. Newly developed algorithms, known as InSAR time series, have further improved data accuracy and completeness, strengthening the InSAR contribution even to the study of the inter- and post-seismic phases. In this work we describe the rationale underlying the whole processing chain, showing its application to the 2010-2011 New Zealand seismic sequence

  6. Understanding earthquakes: The key role of radar images

    Energy Technology Data Exchange (ETDEWEB)

    Atzori, Simone, E-mail: simone.atzori@ingv.it [Istituto Nazionale di Geofisica e Vulcanologia, Rome (Italy)

    2013-08-21

    The investigation of the fault rupture underlying earthquakes has greatly improved thanks to the spread of radar images. Following pioneering applications in the 1980s, Interferometry from Synthetic Aperture Radar (InSAR) gained a prominent role in geodesy. Its capability to measure millimetric deformations over wide areas, together with the increased data availability since the early 1990s, made InSAR a widespread and accepted analysis tool in tectonics, though several factors contribute to reducing data quality. With the introduction of analytical or numerical modeling, InSAR maps are used to infer the source of an earthquake by means of data inversion. Newly developed algorithms, known as InSAR time series, have further improved data accuracy and completeness, strengthening the InSAR contribution even to the study of the inter- and post-seismic phases. In this work we describe the rationale underlying the whole processing chain, showing its application to the 2010-2011 New Zealand seismic sequence.

  7. Designing an Earthquake-Resistant Building

    Science.gov (United States)

    English, Lyn D.; King, Donna T.

    2016-01-01

    How do cross-bracing, geometry, and base isolation help buildings withstand earthquakes? These important structural design features involve fundamental geometry that elementary school students can readily model and understand. The problem activity, Designing an Earthquake-Resistant Building, was undertaken by several classes of sixth-grade…

  8. What Can Sounds Tell Us About Earthquake Interactions?

    Science.gov (United States)

    Aiken, C.; Peng, Z.

    2012-12-01

    It is important not only for seismologists but also for educators to effectively convey information about earthquakes and the influences earthquakes can have on each other. Recent studies using auditory display [e.g. Kilb et al., 2012; Peng et al. 2012] have depicted catastrophic earthquakes and the effects large earthquakes can have on other parts of the world. Auditory display of earthquakes, which combines static images with time-compressed sound of recorded seismic data, is a new approach to disseminating information to a general audience about earthquakes and earthquake interactions. Earthquake interactions are influential to understanding the underlying physics of earthquakes and other seismic phenomena such as tremors in addition to their source characteristics (e.g. frequency contents, amplitudes). Earthquake interactions can include, for example, a large, shallow earthquake followed by increased seismicity around the mainshock rupture (i.e. aftershocks) or even a large earthquake triggering earthquakes or tremors several hundreds to thousands of kilometers away [Hill and Prejean, 2007; Peng and Gomberg, 2010]. We use standard tools like MATLAB, QuickTime Pro, and Python to produce animations that illustrate earthquake interactions. Our efforts are focused on producing animations that depict cross-section (side) views of tremors triggered along the San Andreas Fault by distant earthquakes, as well as map (bird's eye) views of mainshock-aftershock sequences such as the 2011/08/23 Mw5.8 Virginia earthquake sequence. These examples of earthquake interactions include sonifying earthquake and tremor catalogs as musical notes (e.g. piano keys) as well as audifying seismic data using time-compression. Our overall goal is to use auditory display to invigorate a general interest in earthquake seismology that leads to the understanding of how earthquakes occur, how earthquakes influence one another as well as tremors, and what the musical properties of these
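
    A minimal sketch of the audification step mentioned above: a seismogram is time-compressed by writing it to a WAV file at a playback rate much higher than its sampling rate, so that hours of ground motion become seconds of sound. The waveform here is a synthetic stand-in for a real record; the file name and speed-up factor are arbitrary choices.

    import numpy as np
    from scipy.io import wavfile

    def audify(waveform, fs, speedup=200, out_path="quake.wav"):
        """Write `waveform` (1-D float array sampled at `fs` Hz) as a WAV file
        played back `speedup` times faster than real time."""
        data = waveform - np.mean(waveform)          # remove the offset
        peak = np.max(np.abs(data))
        if peak > 0:
            data = data / peak                        # normalize to [-1, 1]
        pcm = (data * 32767).astype(np.int16)         # 16-bit PCM samples
        wavfile.write(out_path, int(fs * speedup), pcm)
        return out_path

    # Synthetic "mainshock + aftershock" wiggle: decaying noise bursts sampled at 100 Hz.
    fs = 100.0
    t = np.arange(0.0, 600.0, 1.0 / fs)
    signal = np.exp(-t / 60.0) * np.random.randn(t.size)
    signal[30000:] += 0.3 * np.exp(-(t[30000:] - 300.0) / 30.0) * np.random.randn(t.size - 30000)
    print(audify(signal, fs))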

  9. Earthquake hypocenter relocation using double difference method in East Java and surrounding areas

    International Nuclear Information System (INIS)

    C, Aprilia Puspita; Nugraha, Andri Dian; Puspito, Nanang T

    2015-01-01

    Determination of precise hypocenter locations is very important in order to provide information about subsurface fault planes and for seismic hazard analysis. In this study, we have relocated earthquake hypocenters in the eastern part of Java and surrounding areas from the local earthquake data catalog compiled by the Meteorological, Climatological, and Geophysical Agency of Indonesia (MCGA) for the period 2009-2012, using the double-difference method. The results show significant changes in the positions and orientations of the earthquake hypocenters after relocation, which correlate with the geological setting of the region. We observed an indication of a double seismic zone at depths of 70-120 km within the subducting slab south of the eastern part of Java. Our results will provide useful information for advanced seismological studies and seismic hazard analysis in this region.

  10. Earthquake hypocenter relocation using double difference method in East Java and surrounding areas

    Energy Technology Data Exchange (ETDEWEB)

    C, Aprilia Puspita [Geophysical Engineering Program, Faculty of Mining and Petroleum Engineering, Institute of Technology Bandung (Indonesia); Meteorological, Climatological, and Geophysical Agency (MCGA) of Indonesian, Jakarta (Indonesia); Nugraha, Andri Dian, E-mail: nugraha@gf.itb.ac.id [Geophysical Engineering Program, Faculty of Mining and Petroleum Engineering, Institute of Technology Bandung (Indonesia); Puspito, Nanang T [Global Geophysical Research Group, Faculty of Mining and Petroleum Engineering, Institute of Technology Bandung (Indonesia)

    2015-04-24

    Determination of precise hypocenter locations is very important for providing information about subsurface fault planes and for seismic hazard analysis. In this study, we have relocated earthquake hypocenters in the eastern part of Java and surrounding areas, drawn from the local earthquake catalog compiled by the Meteorological, Climatological, and Geophysical Agency of Indonesia (MCGA) for the period 2009-2012, using the double-difference method. The results show that after relocation there are significant changes in the positions and orientations of the earthquake hypocenters, which correlate with the geological setting of the region. We observed an indication of a double seismic zone at depths of 70-120 km within the subducting slab south of the eastern part of Java. Our results will provide useful information for advanced seismological studies and seismic hazard analysis in this study area.

  11. A study of risk evaluation methodology selection for the external hazards

    International Nuclear Information System (INIS)

    Kuramoto, Takahiro; Yamaguchi, Akira; Narumiya, Yosiyuki

    2014-01-01

    Since the accident at the Fukushima Daiichi Nuclear Power Plant caused by the Great East Japan Earthquake in March 2011, there have been growing demands for assessing the effects of external hazards, including natural events such as earthquakes and tsunamis as well as external human actions, and for taking measures to address them. The newly established Japanese regulatory requirements call for design considerations associated with external hazards. The primary objective of risk assessment for external hazards is to establish countermeasures against such hazards rather than merely to quantify the risk. Therefore, applying detailed risk assessment methods, such as probabilistic risk assessment (PRA), to all external hazards is not always the most appropriate approach. Risk assessment methods vary in type, including qualitative evaluation, hazard analysis (analyzing hazard frequencies or their influence), and margin assessment. To resolve these issues, a process has been established that enables the comprehensive and systematic identification of external hazards with the potential to lead to core damage, and the selection of an appropriate evaluation method according to the risks associated with each external hazard. This paper discusses this comprehensive and systematic identification process and the approach to selecting an appropriate evaluation method for each external hazard. It also describes some applications of specific risk evaluation methods. (author)

  12. Stochastic finite-fault modelling of strong earthquakes in Narmada ...

    Indian Academy of Sciences (India)

    The prevailing hazard evidenced by the earthquake-related fatalities in the region imparts significance to the investigations .... tures and sudden fault movement due to stress concentration (Kayal 2008). ..... nificantly improved the present work.

  13. Development of the Global Earthquake Model’s neotectonic fault database

    Science.gov (United States)

    Christophersen, Annemarie; Litchfield, Nicola; Berryman, Kelvin; Thomas, Richard; Basili, Roberto; Wallace, Laura; Ries, William; Hayes, Gavin P.; Haller, Kathleen M.; Yoshioka, Toshikazu; Koehler, Richard D.; Clark, Dan; Wolfson-Schwehr, Monica; Boettcher, Margaret S.; Villamor, Pilar; Horspool, Nick; Ornthammarath, Teraphan; Zuñiga, Ramon; Langridge, Robert M.; Stirling, Mark W.; Goded, Tatiana; Costa, Carlos; Yeats, Robert

    2015-01-01

    The Global Earthquake Model (GEM) aims to develop uniform, openly available, standards, datasets and tools for worldwide seismic risk assessment through global collaboration, transparent communication and adapting state-of-the-art science. GEM Faulted Earth (GFE) is one of GEM’s global hazard module projects. This paper describes GFE’s development of a modern neotectonic fault database and a unique graphical interface for the compilation of new fault data. A key design principle is that of an electronic field notebook for capturing observations a geologist would make about a fault. The database is designed to accommodate abundant as well as sparse fault observations. It features two layers, one for capturing neotectonic faults and fold observations, and the other to calculate potential earthquake fault sources from the observations. In order to test the flexibility of the database structure and to start a global compilation, five preexisting databases have been uploaded to the first layer and two to the second. In addition, the GFE project has characterised the world’s approximately 55,000 km of subduction interfaces in a globally consistent manner as a basis for generating earthquake event sets for inclusion in earthquake hazard and risk modelling. Following the subduction interface fault schema and including the trace attributes of the GFE database schema, the 2500-km-long frontal thrust fault system of the Himalaya has also been characterised. We propose the database structure to be used widely, so that neotectonic fault data can make a more complete and beneficial contribution to seismic hazard and risk characterisation globally.

  14. Long-term impacts of tropical storms and earthquakes on human population growth in Haiti and Dominican Republic

    OpenAIRE

    Christian D. Klose; Christian Webersik

    2010-01-01

    The two Caribbean states, Haiti and the Dominican Republic, have experienced similar natural forces since the 18th century, including hurricanes and earthquakes. Although both countries appear to be among the most prone to natural hazard events of all Latin American and Caribbean countries, Haiti has historically tended to be more vulnerable to natural forces. The purpose of this article is to understand to what extent geohazards shape demographic changes. Research findings of this study show tha...

  15. A flatfile of ground motion intensity measurements from induced earthquakes in Oklahoma and Kansas

    Science.gov (United States)

    Rennolet, Steven B.; Moschetti, Morgan P.; Thompson, Eric M.; Yeck, William

    2018-01-01

    We have produced a uniformly processed database of orientation-independent (RotD50, RotD100) ground motion intensity measurements containing peak horizontal ground motions (accelerations and velocities) and 5-percent-damped pseudospectral accelerations (0.1–10 s) from more than 3,800 M ≥ 3 earthquakes in Oklahoma and Kansas that occurred between January 2009 and December 2016. Ground motion time series were collected from regional, national, and temporary seismic arrays out to 500 km. We relocated the majority of the earthquake hypocenters using a multiple-event relocation algorithm to produce a set of near-uniformly processed hypocentral locations. Ground motion processing followed standard methods, with the primary objective of reducing the effects of noise on the measurements. Regional wave-propagation features and the high seismicity rate required careful selection of signal windows to ensure that we captured the entire ground motion record and that contaminating signals from extraneous earthquakes did not contribute to the database. Processing was carried out with an automated scheme and resulted in a database comprising more than 174,000 records (https://dx.doi.org/10.5066/F73B5X8N). We anticipate that these results will be useful for improved understanding of earthquake ground motions and for seismic hazard applications.
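
    The orientation-independent measures named above (RotD50, RotD100) can be illustrated with a short calculation: rotate the two horizontal components through all non-redundant azimuths, take the peak of each rotated trace, and then take the median (RotD50) or maximum (RotD100) over azimuths. The sketch below is a generic illustration of that definition (after Boore, 2010), not the processing code behind this database; the synthetic accelerograms are placeholders.

        import numpy as np

        def rotd(h1, h2, percentiles=(50, 100)):
            """RotDnn peak amplitudes from two co-located horizontal traces."""
            h1, h2 = np.asarray(h1, float), np.asarray(h2, float)
            angles = np.deg2rad(np.arange(0, 180))
            # Peak absolute amplitude of each rotated combination of the components
            peaks = np.array([
                np.max(np.abs(h1 * np.cos(a) + h2 * np.sin(a))) for a in angles
            ])
            return {p: np.percentile(peaks, p) for p in percentiles}

        # Synthetic accelerograms standing in for processed records:
        t = np.linspace(0, 20, 2001)
        ew = 0.08 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 8)
        ns = 0.05 * np.cos(2 * np.pi * 1.3 * t) * np.exp(-t / 8)
        print(rotd(ew, ns))   # {50: RotD50 peak, 100: RotD100 peak} in input units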

  16. Exploring Earthquakes in Real-Time

    Science.gov (United States)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real-time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student-centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student driven investigations. For example, students can explore differences in the seismograms from different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.
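
    One of the analyses mentioned above, estimating the distance from the epicenter to the station, reduces to a short calculation once the P and S arrivals are picked. The snippet below is a rough classroom-style illustration, not part of jAmaSeis itself; the crustal velocities are assumed values chosen for illustration.

        # Epicentral distance from the S-minus-P time, assuming constant
        # crustal velocities Vp ~ 6.0 km/s and Vs ~ 3.5 km/s.
        def sp_distance_km(dt_sp_s, vp_km_s=6.0, vs_km_s=3.5):
            """Epicentral distance (km) from an S-P time in seconds."""
            return dt_sp_s * vp_km_s * vs_km_s / (vp_km_s - vs_km_s)

        print(sp_distance_km(10.0))  # ~84 km for a 10 s S-P time with these velocities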

  17. Understanding radiation and risk: the importance of primary and secondary education

    International Nuclear Information System (INIS)

    Tada, Junichiro

    1999-01-01

    In Japan's primary and secondary schools, radiation and radioactivity are taught as part of the curriculum dealing with social science subjects. Students learn much about the hazardous features of radiation, but lack the scientific understanding necessary to build a more balanced picture. Although the same point applies to education covering the harmful effects of volcanic eruptions, earthquakes, electrical storms and so on, public understanding of these events is relatively high and students are generally able to make informed judgments about the risks involved. By contrast, their limited understanding of radiation often contributes to fears that it is evil or even supernatural. To correct this distortion, it is important that primary and secondary education includes a scientific explanation of radiation. Like heat and light, radiation is fundamental to the history of the universe; and scientific education programs should give appropriate emphasis to this important subject. Students would then be able to make more objective judgments about the useful and hazardous aspects of radiation. (author)

  18. Understanding radiation and risk: the importance of primary and secondary education

    Energy Technology Data Exchange (ETDEWEB)

    Tada, Junichiro [Japan Synchrotron Radiation Research Institute (SPring-8), Mikaduki, Hyogo (Japan)

    1999-09-01

    In Japan's primary and secondary schools, radiation and radioactivity are taught as part of the curriculum dealing with social science subjects. Students learn much about the hazardous features of radiation, but lack the scientific understanding necessary to build a more balanced picture. Although the same point applies to education covering the harmful effects of volcanic eruptions, earthquakes, electrical storms and so on, public understanding of these events is relatively high and students are generally able to make informed judgments about the risks involved. By contrast, their limited understanding of radiation often contributes to fears that it is evil or even supernatural. To correct this distortion, it is important that primary and secondary education includes a scientific explanation of radiation. Like heat and light, radiation is fundamental to the history of the universe; and scientific education programs should give appropriate emphasis to this important subject. Students would then be able to make more objective judgments about the useful and hazardous aspects of radiation. (author)

  19. Permeability Changes Observed in the Arbuckle Group Coincident with Nearby Earthquake Occurrence

    Science.gov (United States)

    Kroll, K.; Cochran, E. S.; Richards-Dinger, K. B.; Murray, K.

    2017-12-01

    We investigate the temporal evolution of hydrologic properties of the 2 km deep Arbuckle Group, the principal target in Oklahoma for saltwater disposal resulting from oil and gas production. Specifically, we look for changes to the hydrologic system associated with local earthquakes at two monitoring wells (Payne07 and 08) near Cushing, Oklahoma. The wells were instrumented with pressure transducers starting in Aug. 2016, after injection was discontinued due to regulatory directives. The observation period includes the 3 Sep. 2016 Mw5.8 Pawnee and 7 Nov. 2016 Mw5.0 Cushing earthquakes, located 50 km and 5 km from the wells, respectively. Previous studies have suggested the Mw5.8 Pawnee earthquake affected both the shallow and deep hydrological systems, with an increase in stream discharge observed near the mainshock (Manga et al., 2016) and a change in poroelastic properties of the Arbuckle inferred from co-seismic water level offsets at Payne07 and 08 (Kroll et al., 2017). Here, we use the water level response to solid Earth tides to estimate permeability and specific storage through time during the observation period. We measure the phase lag between the solid Earth tide and the water level changes and find that it decreases at the time of the Mw5.0 Cushing earthquake in both wells. Our results suggest permeability in the Arbuckle Group increased by a factor of 5 after the earthquake. It is possible that in extreme cases there may be complex interaction between saltwater disposal, hydrologic systems, and earthquake rates that should be considered to better understand seismic hazard.
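
    The phase-lag measurement described above can be illustrated with a simple harmonic fit at the M2 tidal frequency: project both the tidal forcing and the water-level record onto sine and cosine terms and difference the resulting phases. This is a generic sketch of one such approach, not the authors' processing; the sampling interval and inputs are assumed, and real analyses would also handle gaps, trends, and other tidal constituents.

        import numpy as np

        M2_PERIOD_HOURS = 12.4206012  # period of the M2 tidal constituent

        def phase_deg(series, dt_hours):
            """Phase (degrees) of the M2 component via a least-squares sin/cos fit."""
            x = np.asarray(series, float) - np.mean(series)
            t = np.arange(len(x)) * dt_hours
            w = 2 * np.pi / M2_PERIOD_HOURS
            design = np.vstack([np.cos(w * t), np.sin(w * t)]).T
            (a, b), *_ = np.linalg.lstsq(design, x, rcond=None)
            return np.degrees(np.arctan2(b, a))

        def tidal_phase_lag(water_level, earth_tide, dt_hours=1.0):
            """Phase of the water-level response relative to the tidal forcing (deg)."""
            return phase_deg(water_level, dt_hours) - phase_deg(earth_tide, dt_hours)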

  20. Introduction: seismology and earthquake engineering in Mexico and Central and South America.

    Science.gov (United States)

    Espinosa, A.F.

    1982-01-01

    The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans and some of the cooperative programs with different international organizations are described by Latin American seismologists. The article describes the development of seismology in Latin America and the seismological interest of the OAS. - P.N. Chroston

  1. Earthquake Disaster of Yogyakarta and Central Java, and Disaster Reduction, Indonesia

    Directory of Open Access Journals (Sweden)

    Sutikno Sutikno

    2016-05-01

    Full Text Available This paper discusses the earthquake disaster in Yogyakarta and Central Java, Indonesia, and its reduction. The study area is located a relatively short distance from the subduction zone of the Indo-Australian and Eurasian plates. Geologically, the area is characterized by fault and graben structures; geomorphologically, it is composed of block mountains, karst topography and fluvio-volcanic plains. The aim of this paper is to evaluate the spatial distribution of the damaged area and the environmental impacts, and to discuss the reduction of earthquake disaster risk scientifically and practically. To determine the hazard susceptibility zones and their environmental impacts, this paper uses geologic, geomorphologic and land use maps, remote sensing image interpretation, and field observation. The discussion of earthquake disaster risk reduction is based on the hazard susceptibility and the characteristics of human settlements and facilities. The results of this study show that: (i) the areas of high damage are associated with the distribution of fault structures and the lithology; (ii) the environmental impacts include mass movement, lowering of groundwater, the rise of new springs, liquefaction, and cracking of rocks and the land surface; and (iii) both structural and non-structural efforts are used for earthquake disaster reduction.

  2. Site Specific Probabilistic Seismic Hazard and Risk Analysis for Surrounding Communities of The Geysers Geothermal Development Area

    Science.gov (United States)

    Miah, M.; Hutchings, L. J.; Savy, J. B.

    2014-12-01

    We conduct a probabilistic seismic hazard and risk analysis of induced and tectonic earthquakes for a 50 km radius area centered on The Geysers, California, for the next ten years. We calculate hazard with both a conventional and a physics-based approach, estimate site-specific hazard, convert hazard to annual risk of nuisance and of damage to structures, and map the risk. For the conventional PSHA we assume the past ten years is indicative of hazard for the next ten years from Mnoise. We then interpolate within each geologic unit on finely gridded points, weighting all grid points within a unit by distance from each data collection point. The process is repeated for all other types of geologic units until the entire area is gridded and every grid point is assigned a hazard value. We found that nuisance and damage risks calculated by the conventional and physics-based approaches gave almost identical results. This is surprising, since they were calculated by completely independent means: the conventional approach used the actual catalog of the past ten years of earthquakes to estimate the hazard for the next ten years, while the physics-based approach used geotechnical modeling to calculate the catalog for the next ten years. Similarly, for the conventional PSHA we utilized attenuation relations from past earthquakes recorded at The Geysers to translate the ground motion from the source to the site, while for the physics-based approach we calculated ground motion from simulations of earthquake rupture. Finally, the conventional PSHA used the actual earthquake sources, whereas the physics-based approach assumed random fractures. From all this, we consider the calculation from the conventional approach, based on actual data, to validate the physics-based approach.
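
    A stripped-down illustration of the catalog-based ("conventional") part of such a workflow: take a catalog of (magnitude, distance) pairs observed over a known time span, predict the ground motion each event produces at the site, and count exceedances of a target level per year. The attenuation relation, catalog values, and threshold below are invented placeholders for illustration, not the relations or data used in the study.

        import numpy as np

        def toy_median_pga_g(mag, dist_km):
            """Placeholder median PGA (g); NOT a published attenuation relation."""
            return np.exp(-4.0 + 1.0 * mag - 1.3 * np.log(dist_km + 10.0))

        def annual_exceedance_rate(catalog, pga_target_g, years_of_catalog):
            """catalog: iterable of (magnitude, distance_km) pairs observed at the site."""
            n_exceed = sum(toy_median_pga_g(m, r) > pga_target_g for m, r in catalog)
            return n_exceed / years_of_catalog

        catalog = [(1.8, 4.0), (2.5, 7.0), (3.1, 3.0), (2.2, 12.0)]  # made-up events
        # One exceedance of 0.01 g over a 10-year toy catalog -> 0.1 per year
        print(annual_exceedance_rate(catalog, pga_target_g=0.01, years_of_catalog=10))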

  3. Surface rupture and vertical deformation associated with 20 May 2016 M6 Petermann Ranges earthquake, Northern Territory, Australia

    Science.gov (United States)

    Gold, Ryan; Clark, Dan; King, Tamarah; Quigley, Mark

    2017-04-01

    Surface-rupturing earthquakes in stable continental regions (SCRs) occur infrequently, though when they occur in heavily populated regions the damage and loss of life can be severe (e.g., 2001 Bhuj earthquake). Quantifying the surface-rupture characteristics of these low-probability events is therefore important, both to improve understanding of the on- and off-fault deformation field near the rupture trace and to provide additional constraints on the scaling of earthquake magnitude with rupture length and displacement, which are critical inputs for seismic hazard calculations. This investigation focuses on the 20 May 2016 M6.0 Petermann Ranges earthquake, Northern Territory, Australia. We use 0.3-0.5 m high-resolution optical Worldview satellite imagery to map the trace of the surface rupture associated with the earthquake. From our mapping, we are able to trace the rupture over a length of 20 km, trending NW, and exhibiting apparent north-side-up motion. To quantify the magnitude of vertical surface deformation, we use stereo Worldview images processed using NASA Ames Stereo Pipeline software to generate pre- and post-earthquake digital terrain models with a spatial resolution of 1.5 to 2 m. The surface scarp is apparent in much of the post-event digital terrain model. Initial efforts to difference the pre- and post-event digital terrain models yield noisy results, though we detect vertical deformation of 0.2 to 0.6 m over length scales of 100 m to 1 km from the mapped trace of the rupture. Ongoing efforts to remove ramps and perform spatial smoothing will improve our understanding of the extent and pattern of vertical deformation. Additionally, we will compare our results with InSAR and field measurements obtained following the earthquake.

  4. Seismic Hazard Assessment at Esfaraen‒Bojnurd Railway, North‒East of Iran

    Science.gov (United States)

    Haerifard, S.; Jarahi, H.; Pourkermani, M.; Almasian, M.

    2018-01-01

    The objective of this study is to evaluate the seismic hazard along the Esfarayen-Bojnurd railway using the probabilistic seismic hazard assessment (PSHA) method. The assessment was carried out on a recent data set that takes into account the historical seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a seismic source model was proposed. Attenuation equations recently recommended by experts, developed from earthquake data obtained in tectonic environments similar to those in and around the study area, were weighted and used for the hazard assessment within a logic tree approach. Using a grid of 1.2 × 1.2 km covering the study area, the ground acceleration at every node was calculated. Hazard maps at bedrock conditions were produced for peak ground acceleration for return periods of 74, 475 and 2475 years.
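
    The return periods quoted above map onto exceedance probabilities through the usual Poisson assumption, P = 1 - exp(-t/T) for an exposure time t and return period T; the short calculation below shows, for example, that a 475-year return period corresponds to roughly a 10% chance of exceedance in 50 years and 2475 years to roughly 2%.

        import math

        def exceedance_probability(return_period_yr, exposure_yr=50):
            """Poisson probability of at least one exceedance during the exposure time."""
            return 1.0 - math.exp(-exposure_yr / return_period_yr)

        for T in (74, 475, 2475):
            print(T, f"{exceedance_probability(T):.1%} in 50 years")
        # 74 -> ~49%, 475 -> ~10%, 2475 -> ~2%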

  5. Surface latent heat flux as an earthquake precursor

    Directory of Open Access Journals (Sweden)

    S. Dey

    2003-01-01

    Full Text Available The analysis of surface latent heat flux (SLHF from the epicentral regions of five recent earthquakes that occurred in close proximity to the oceans has been found to show anomalous behavior. The maximum increase of SLHF is found 2–7 days prior to the main earthquake event. This increase is likely due to an ocean-land-atmosphere interaction. The increase of SLHF prior to the main earthquake event is attributed to the increase in infrared thermal (IR temperature in the epicentral and surrounding region. The anomalous increase in SLHF shows great potential in providing early warning of a disastrous earthquake, provided that there is a better understanding of the background noise due to the tides and monsoon in surface latent heat flux. Efforts have been made to understand the level of background noise in the epicentral regions of the five earthquakes considered in the present paper. A comparison of SLHF from the epicentral regions over the coastal earthquakes and the earthquakes that occurred far away from the coast has been made and it has been found that the anomalous behavior of SLHF prior to the main earthquake event is only associated with the coastal earthquakes.

  6. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started just about 15 years ago has already pushed global information storage capacity beyond 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data world provide unprecedented opportunities for enhancing studies of the Earth system. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task and implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regrettably, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), claims of a high potential of the method are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers. Self-testing must be done before claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies is evident, in particular the error diagram introduced by G.M. Molchan in the early 1990s and the Seismic Roulette null hypothesis as a metric of the alerted space. The set of errors, i.e. the rate of failures-to-predict and the fraction of alerted space-time volume, can easily be compared to random guessing, a comparison that permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable though not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
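
    The error diagram mentioned above is straightforward to compute once a prediction method's alarm score is defined over space-time cells: for each alarm threshold, record the fraction of target earthquakes missed against the fraction of space-time kept under alarm, and compare with the random-guessing diagonal. The sketch below is a generic illustration with made-up inputs, not Molchan's or the author's implementation.

        import numpy as np

        def molchan_curve(alarm_score, cell_weights, eq_cells, thresholds):
            """alarm_score: score per space-time cell (higher = more alarming);
            cell_weights: normalized measure of each cell (sums to 1);
            eq_cells: indices of cells containing target earthquakes."""
            taus, nus = [], []
            for thr in thresholds:
                alarmed = alarm_score >= thr
                taus.append(float(np.sum(cell_weights[alarmed])))   # alerted fraction
                hits = np.sum(alarmed[eq_cells])
                nus.append(1.0 - hits / len(eq_cells))              # miss rate
            return np.array(taus), np.array(nus)

        rng = np.random.default_rng(0)
        score = rng.random(1000)                      # made-up alarm scores per cell
        weights = np.full(1000, 1 / 1000)             # equal-measure cells
        eq_cells = rng.choice(1000, size=20, replace=False)
        tau, nu = molchan_curve(score, weights, eq_cells, np.linspace(0, 1, 21))
        # (tau, nu) traces the error diagram; random guessing follows nu = 1 - tau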

  7. Width of surface rupture zone for thrust earthquakes: implications for earthquake fault zoning

    Science.gov (United States)

    Boncio, Paolo; Liberi, Francesca; Caldarella, Martina; Nurminen, Fiia-Charlotta

    2018-01-01

    The criteria for zoning the surface fault rupture hazard (SFRH) along thrust faults are defined by analysing the characteristics of the areas of coseismic surface faulting in thrust earthquakes. Normal and strike-slip faults have been studied in depth by other authors concerning the SFRH, while thrust faults have not been studied with comparable attention. Surface faulting data were compiled for 11 well-studied historic thrust earthquakes that occurred globally (5.4 ≤ M ≤ 7.9). Several different types of coseismic fault scarps characterize the analysed earthquakes, depending on the topography, fault geometry and near-surface materials (simple and hanging wall collapse scarps, pressure ridges, fold scarps and thrust or pressure ridges with bending-moment or flexural-slip fault ruptures due to large-scale folding). For all the earthquakes, the distance of distributed ruptures from the principal fault rupture (r) and the width of the rupture zone (WRZ) were compiled directly from the literature or measured systematically in GIS-georeferenced published maps. Overall, surface ruptures can occur up to large distances from the main fault (~2150 m on the footwall and ~3100 m on the hanging wall). Most of the ruptures occur on the hanging wall, preferentially in the vicinity of the principal fault trace (> ~50 % at distances guidelines). In the absence of such a very detailed study (basic SM, i.e. Level 1 SM of Italian guidelines) a width of ~840 m (90 % probability from "simple thrust" database of distributed ruptures, excluding B-M, F-S and Sy fault ruptures) is suggested to be sufficiently precautionary. For more detailed SM, where the fault is carefully mapped, one must consider that the highest SFRH is concentrated in a narrow zone, ~60 m in width, that should be considered as a fault avoidance zone (more than one-third of the distributed ruptures are expected to occur within this zone). The fault rupture hazard zones should be asymmetric compared to the trace

  8. Expanding Horizons in Mitigating Earthquake Related Disasters in Urban Areas: Global Development of Real-Time Seismology

    OpenAIRE

    Utkucu, Murat; Küyük, Hüseyin Serdar; Demir, İsmail Hakkı

    2016-01-01

    Abstract Real-time seismology is a newly developing alternative approach in seismology to mitigating earthquake hazard. It exploits up-to-date advances in seismic instrument technology, data acquisition, digital communications and computer systems to quickly transform data into earthquake information in real time, in order to reduce earthquake losses and their impact on social and economic life in earthquake-prone, densely populated urban and industrial areas. Real-time seismology systems are not o...

  9. Connecting Hazard Analysts and Risk Managers to Sensor Information.

    Science.gov (United States)

    Le Cozannet, Gonéri; Hosford, Steven; Douglas, John; Serrano, Jean-Jacques; Coraboeuf, Damien; Comte, Jérémie

    2008-06-11

    Hazard analysts and risk managers of natural perils, such as earthquakes, landslides and floods, need access to information from the sensor networks surveying their regions of interest. Currently, however, information about these networks is difficult to obtain and is available in varying formats, which restricts access and can lead to decision-making based on limited information. In response to this issue, state-of-the-art interoperable catalogues are currently being developed within the framework of the Group on Earth Observations (GEO) workplan. This article provides an overview of a prototype catalogue developed to improve access to information about the sensor networks surveying geological hazards (geohazards), such as earthquakes, landslides and volcanoes.

  10. Broadband Ground Motion Simulation Recipe for Scenario Hazard Assessment in Japan

    Science.gov (United States)

    Koketsu, K.; Fujiwara, H.; Irikura, K.

    2014-12-01

    The National Seismic Hazard Maps for Japan, which consist of probabilistic seismic hazard maps (PSHMs) and scenario earthquake shaking maps (SESMs), have been published every year since 2005 by the Earthquake Research Committee (ERC) in the Headquarters for Earthquake Research Promotion, which was established in the Japanese government after the 1995 Kobe earthquake. The publication was interrupted due to problems in the PSHMs revealed by the 2011 Tohoku earthquake, and the Subcommittee for Evaluations of Strong Ground Motions ('Subcommittee') has been examining the problems for two and a half years (ERC, 2013; Fujiwara, 2014). However, the SESMs and the broadband ground motion simulation recipe used in them are still valid at least for crustal earthquakes. Here, we outline this recipe and show the results of validation tests for it. Irikura and Miyake (2001) and Irikura (2004) developed a recipe for simulating strong ground motions from future crustal earthquakes based on a characterization of their source models (Irikura recipe). The result of the characterization is called a characterized source model, where a rectangular fault includes a few rectangular asperities. Each asperity and the background area surrounding the asperities have their own uniform stress drops. The Irikura recipe defines the parameters of the fault and asperities, and how to simulate broadband ground motions from the characterized source model. The recipe for the SESMs was constructed following the Irikura recipe (ERC, 2005). The National Research Institute for Earth Science and Disaster Prevention (NIED) then developed simulation codes following this recipe to generate SESMs (Fujiwara et al., 2006; Morikawa et al., 2011). The Subcommittee in 2002 validated a preliminary version of the SESM recipe by comparing simulated and observed ground motions for the 2000 Tottori earthquake. In 2007 and 2008, the Subcommittee carried out detailed validations of the current version of the SESM recipe and the NIED
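
    The characterized source model is parameterized from fault geometry and slip; a minimal worked example of the underlying bookkeeping (not the recipe itself) is the standard conversion from fault dimensions and average slip to seismic moment and moment magnitude, M0 = mu * A * D and Mw = (log10 M0 - 9.1) / 1.5 with M0 in N·m. The fault dimensions and rigidity below are illustrative assumptions.

        import math

        def moment_magnitude(length_km, width_km, avg_slip_m, rigidity_pa=3.0e10):
            """Mw from rectangular fault dimensions and average slip (M0 = mu*A*D)."""
            area_m2 = (length_km * 1e3) * (width_km * 1e3)
            m0 = rigidity_pa * area_m2 * avg_slip_m      # seismic moment, N*m
            return (math.log10(m0) - 9.1) / 1.5

        print(round(moment_magnitude(40.0, 18.0, 1.2), 2))  # ~Mw 6.9 for these values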

  11. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  12. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    some understanding of their sources and the physical properties of the crust, which also vary from place to place and time to time. Anomalies are not necessarily due to stress or earthquake preparation, and separating the extraneous ones is a problem as daunting as understanding earthquake behavior itself. Fourth, the associations presented between anomalies and earthquakes are generally based on selected data. Validating a proposed association requires complete data on the earthquake record and the geophysical measurements over a large area and time, followed by prospective testing which allows no adjustment of parameters, criteria, etc. The Collaboratory for Study of Earthquake Predictability (CSEP) is dedicated to providing such prospective testing. Any serious proposal for prediction research should deal with the problems above, and anticipate the huge investment in time required to test hypotheses.

  13. Stress drops of induced and tectonic earthquakes in the central United States are indistinguishable.

    Science.gov (United States)

    Huang, Yihe; Ellsworth, William L; Beroza, Gregory C

    2017-08-01

    Induced earthquakes currently pose a significant hazard in the central United States, but there is considerable uncertainty about the severity of their ground motions. We measure stress drops of 39 moderate-magnitude induced and tectonic earthquakes in the central United States and eastern North America. Induced earthquakes, more than half of which are shallower than 5 km, show a comparable median stress drop to tectonic earthquakes in the central United States that are dominantly strike-slip, but a lower median stress drop than that of tectonic earthquakes in eastern North America that are dominantly reverse-faulting. This suggests that ground motion prediction equations developed for tectonic earthquakes can be applied to induced earthquakes if the effects of depth and faulting style are properly considered. Our observation leads to the notion that, similar to tectonic earthquakes, induced earthquakes are driven by tectonic stresses.
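
    Stress drops of the kind measured here are commonly derived from the seismic moment and the spectral corner frequency. The sketch below shows the widely used Brune-type circular-source estimate as a generic illustration, not this paper's exact processing; the shear-wave speed and the constant k ≈ 0.37 are conventional assumed values.

        # Brune-type stress drop: source radius r = k * beta / fc,
        # stress drop = 7 * M0 / (16 * r^3)
        def brune_stress_drop_mpa(m0_nm, fc_hz, beta_m_s=3500.0, k=0.37):
            """Stress drop in MPa from seismic moment (N*m) and corner frequency (Hz)."""
            radius_m = k * beta_m_s / fc_hz
            return 7.0 * m0_nm / (16.0 * radius_m ** 3) / 1.0e6

        # Example: an Mw ~4.5 event (M0 ~ 7.1e15 N*m) with a 2 Hz corner frequency
        print(round(brune_stress_drop_mpa(7.1e15, 2.0), 1))  # ~11 MPa for these values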

  14. Far field tsunami simulations of the 1755 Lisbon earthquake: Implications for tsunami hazard to the U.S. East Coast and the Caribbean

    Science.gov (United States)

    Barkan, R.; ten Brink, Uri S.; Lin, J.

    2009-01-01

    The great Lisbon earthquake of November 1st, 1755 with an estimated moment magnitude of 8.5-9.0 was the most destructive earthquake in European history. The associated tsunami run-up was reported to have reached 5-15 m along the Portuguese and Moroccan coasts and the run-up was significant at the Azores and Madeira Island. Run-up reports from a trans-oceanic tsunami were documented in the Caribbean, Brazil and Newfoundland (Canada). No reports were documented along the U.S. East Coast. Many attempts have been made to characterize the 1755 Lisbon earthquake source using geophysical surveys and modeling the near-field earthquake intensity and tsunami effects. Studying far field effects, as presented in this paper, is advantageous in establishing constraints on source location and strike orientation because trans-oceanic tsunamis are less influenced by near source bathymetry and are unaffected by triggered submarine landslides at the source. Source location, fault orientation and bathymetry are the main elements governing transatlantic tsunami propagation to sites along the U.S. East Coast, much more than distance from the source and continental shelf width. Results of our far and near-field tsunami simulations based on relative amplitude comparison limit the earthquake source area to a region located south of the Gorringe Bank in the center of the Horseshoe Plain. This is in contrast with previously suggested sources such as Marquês de Pombal Fault, and Gulf of Cádiz Fault, which are farther east of the Horseshoe Plain. The earthquake was likely to be a thrust event on a fault striking ~345° and dipping to the ENE as opposed to the suggested earthquake source of the Gorringe Bank Fault, which trends NE-SW. Gorringe Bank, the Madeira-Tore Rise (MTR), and the Azores appear to have acted as topographic scatterers for tsunami energy, shielding most of the U.S. East Coast from the 1755 Lisbon tsunami. Additional simulations to assess tsunami hazard to the U.S. East
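
    Trans-oceanic propagation of the kind modeled here is governed, in deep water, by the shallow-water long-wave speed c = sqrt(g*h). A rough travel-time estimate for an assumed average ocean depth illustrates the hours-scale arrival times involved; the depth and distance below are illustrative assumptions, not results from the paper.

        import math

        def travel_time_hours(distance_km, mean_depth_m=4000.0, g=9.81):
            """Long-wave tsunami travel time over a constant-depth ocean."""
            speed_m_s = math.sqrt(g * mean_depth_m)      # ~198 m/s for 4 km depth
            return distance_km * 1e3 / speed_m_s / 3600.0

        print(round(travel_time_hours(5500.0), 1))  # ~7.7 h to cross ~5500 km of ocean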

  15. Digging Our Own Holes: Institutional Perspectives on Seismic Hazards

    Science.gov (United States)

    Stein, S.; Tomasello, J.

    2005-12-01

    It has been observed that there are no true students of the earth; instead, we each dig our own holes and sit in them. A similar situation arises in attempts to assess the hazards of earthquakes and other natural disasters and to develop strategies to mitigate them. Ideally, we would like to look at the interests of society as a whole and develop strategies that best balance hazard mitigation with alternative uses of resources. Doing so, however, is difficult for several reasons. First, estimating seismic hazards requires assumptions about the size, recurrence, and shaking from future earthquakes, none of which are well known. Second, we have to choose a definition of seismic hazard, which is even more arbitrary and at least as significant as the assumptions about future earthquakes. Third, mitigating the risks involves economic and policy issues as well as the scientific one of estimating the hazard itself and the engineering one of designing safe structures. As a result, different public and private organizations with different institutional perspectives naturally adopt different approaches. Most organizations have a single focus. For example, those focusing on economic development tend to discount hazards, whereas emergency management groups tend to accentuate them. Organizations with quasi-regulatory duties (BSSC, FEMA, USGS) focus on reducing losses in future earthquakes without considering the cost of mitigation measures or how this use of resources should be balanced with alternative uses of resources that could mitigate other losses. Some organizations, however, must confront these tradeoffs directly because they allocate resources internally. Hence hospitals implicitly trade off more earthquake-resistant construction with treating uninsured patients, highway departments balance stronger bridges with other safety improvements, and schools balance safer buildings with after-school programs. These choices are complicated by the fact that such infrastructure typically has longer

  16. Protocols for geologic hazards response by the Yellowstone Volcano Observatory

    Science.gov (United States)

    ,

    2010-01-01

    The Yellowstone Plateau hosts an active volcanic system, with subterranean magma (molten rock), boiling, pressurized waters, and a variety of active faults with significant earthquake hazards. Within the next few decades, light-to-moderate earthquakes and steam explosions are certain to occur. Volcanic eruptions are less likely, but are ultimately inevitable in this active volcanic region. This document summarizes protocols, policies, and tools to be used by the Yellowstone Volcano Observatory (YVO) during earthquakes, hydrothermal explosions, or any geologic activity that could lead to a volcanic eruption.

  17. Has El Salvador Fault Zone produced M ≥ 7.0 earthquakes? The 1719 El Salvador earthquake

    Science.gov (United States)

    Canora, C.; Martínez-Díaz, J.; Álvarez-Gómez, J.; Villamor, P.; Ínsua-Arévalo, J.; Alonso-Henar, J.; Capote, R.

    2013-05-01

    Historically, large earthquakes, Mw ≥ 7.0, in the El Salvador area have been attributed to activity in the Cocos-Caribbean subduction zone. This is correct for most of the earthquakes of magnitude greater than 6.5. However, recent paleoseismic evidence points to the existence of large earthquakes associated with rupture of the El Salvador Fault Zone, an E-W oriented strike-slip fault system that extends for 150 km through central El Salvador. To calibrate our results from paleoseismic studies, we have analyzed the historical seismicity of the area. In particular, we suggest that the 1719 earthquake can be associated with paleoseismic activity evidenced in the El Salvador Fault Zone. A reinterpreted isoseismal map for this event suggests that the damage reported could have been a consequence of the rupture of the El Salvador Fault Zone, rather than rupture of the subduction zone. The isoseismal pattern is not different from that of other upper-crustal earthquakes in similar tectonovolcanic environments. We thus challenge the traditional assumption that only the subduction zone is capable of generating earthquakes of magnitude greater than 7.0 in this region. This result has broad implications for future risk management in the region. The potential occurrence of strong ground motion, significantly higher and closer to the Salvadoran population than assumed to date, must be considered in seismic hazard assessment studies in this area.

  18. Seismogenic structures of the central Apennines and its implication for seismic hazard

    Science.gov (United States)

    Zheng, Y.; Riaz, M. S.; Shan, B.

    2017-12-01

    The central Apennines belt formed during the Miocene-to-Pliocene epoch in a setting where the Adriatic Plate collides with and plunges beneath the Eurasian Plate, eventually producing a fold and thrust belt. This active fold and thrust belt has experienced relatively frequent moderate-magnitude earthquakes, as well as strong destructive earthquakes such as the 1997 Umbria-Marche sequence, the 2009 Mw 6.3 L'Aquila earthquake sequence, and three strong earthquakes that occurred in 2016. Such high seismicity makes it one of the most active tectonic zones in the world. Moreover, most of these earthquakes are normal-faulting events with shallow depths, and most earthquakes in the central Apennines have a low seismic energy-to-moment ratio. What seismogenic structures cause such seismic features, and what is the potential seismic hazard in the study region? To gain an in-depth understanding of the seismogenic structures in this region, we collected seismic data from INGV, Italy, to model the crustal structure and to relocate the earthquakes. To improve the spatial resolution of the tomographic images, we collected travel times from 27627 earthquakes with M > 1.7 recorded at 387 seismic stations. Double-difference tomography (hereafter DDT) is applied to build the velocity structures and earthquake locations. Checkerboard tests confirm that the spatial resolution at depths of 5-20 km is better than 10 km. The travel-time residual is significantly decreased, from 1208 ms to 70 ms, after the inversion. Horizontal Vp images show that most earthquakes occurred in high-velocity anomaly zones, especially between 5 and 10 km, whereas at greater depths some of the earthquakes occurred in low-Vp anomalies. In the Vs images, shallow earthquakes mainly occurred in low-velocity anomaly zones, while at depths of 10-15 km earthquakes are mainly concentrated in normal-velocity or relatively low-velocity zones. Moreover, most earthquakes occurred

  19. POST Earthquake Debris Management - AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by earthquakes is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, in order to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, one that takes into account the different criteria related to the execution of the operation, is proposed by highlighting the key issues concerning the handling of the construction

  20. Seismic hazard analysis for the NTS spent reactor fuel test site

    International Nuclear Information System (INIS)

    Campbell, K.W.

    1980-01-01

    An experiment is being conducted at the Nevada Test Site to test the feasibility of storing spent fuel from nuclear reactors in geologic media. As part of this project, an analysis of the earthquake hazard was prepared. This report presents the results of this seismic hazard assessment. Two distinct components of the seismic hazard were addressed: vibratory ground motion and surface displacement.