WorldWideScience

Sample records for earthquake engineers release

  1. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers, so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The encyclopedia informs technically inclined readers about the ways in which earthquakes can affect our infrastructure and how engineers go about designing against, mitigating and remediating these effects. The coverage spans buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  2. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  3. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  4. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the Messina-Reggio Calabria earthquake of 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some buildings still standing today that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile, 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study the only three centennial 3-story buildings that survived both earthquakes almost undamaged were identified. Since accelerograms of the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand

  5. Elastic energy release in great earthquakes and eruptions

    Directory of Open Access Journals (Sweden)

    Agust eGudmundsson

    2014-05-01

    Full Text Available The sizes of earthquakes are measured using well-defined, measurable quantities such as seismic moment and released (transformed) elastic energy. No similar measures exist for the sizes of volcanic eruptions, making it difficult to compare the energies released in earthquakes and eruptions. Here I provide a new measure of the elastic energy (the potential mechanical energy) associated with magma chamber rupture and contraction (shrinkage) during an eruption. For earthquakes and eruptions, elastic energy derives from two sources: (1) the strain energy stored in the volcano/fault zone before rupture, and (2) the external applied load (force, pressure, stress, displacement) on the volcano/fault zone. From thermodynamic considerations it follows that the elastic energy released or transformed (dU) during an eruption is directly proportional to the excess pressure (pe) in the magma chamber at the time of rupture multiplied by the volume decrease (-dVc) of the chamber, so that dU = -pe dVc. This formula can be used as a basis for a new eruption magnitude scale, based on elastic energy released, which can be related to the moment-magnitude scale for earthquakes. For very large eruptions (>100 km3), the volume of the feeder-dike is negligible, so that the decrease in chamber volume during an eruption corresponds roughly to the associated volume of erupted materials (Ve), so that the elastic energy is dU = pe Ve. Using a typical excess pressure of 5 MPa, it is shown that the largest known eruptions on Earth, such as the explosive La Garita Caldera eruption (27-28 million years ago) and the largest single (effusive) Colombia River basalt lava flows (15-16 million years ago), both of which have estimated volumes of about 5000 km3, released elastic energy of the order of 10 EJ. For comparison, the seismic moment of the largest earthquake ever recorded, the M9.5 1960 Chile earthquake, is estimated at 100 ZJ and the associated elastic energy release at 10 EJ.
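
    A quick numeric check of the scaling described in this abstract (a back-of-the-envelope sketch added here, not part of the article), taking dU ≈ pe·Ve with the quoted typical values of 5 MPa and 5000 km³:

```python
# Illustrative arithmetic only; values are the representative figures quoted in the abstract.
p_e = 5e6            # excess pressure at chamber rupture, Pa (5 MPa)
V_e = 5000 * 1e9     # erupted volume, m^3 (5000 km^3)

dU = p_e * V_e       # elastic energy released, J (dU ~ p_e * V_e for very large eruptions)
print(f"dU = {dU:.1e} J = {dU / 1e18:.0f} EJ")   # 2.5e19 J, i.e. of the order of 10 EJ
```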

  6. Computational methods in earthquake engineering

    CERN Document Server

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues in contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas on topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  7. Modern earthquake engineering offshore and land-based structures

    CERN Document Server

    Jia, Junbo

    2017-01-01

    This book addresses applications of earthquake engineering for both offshore and land-based structures. It is self-contained as a reference work and covers a wide range of topics, including engineering seismology, geotechnical earthquake engineering and structural engineering, as well as special content dedicated to design philosophy, determination of ground motions, shock waves, tsunamis, earthquake damage, seismic response of offshore and arctic structures, spatially varied ground motions, simplified and advanced seismic analysis methods, sudden subsidence of offshore platforms, tank liquid impacts during earthquakes, seismic resistance of non-structural elements, and various types of mitigation measures. The target readership includes professionals in offshore and civil engineering, officials and regulators, as well as researchers and students in this field.

  8. Real-time earthquake monitoring using a search engine method.

    Science.gov (United States)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, developed by applying a fast computer search method to a large seismogram database to find the waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
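
    For illustration only, a brute-force version of the waveform-matching idea described above; the paper's contribution is a fast indexed search that avoids this exhaustive scan, and the function name, database layout and data below are hypothetical placeholders rather than the authors' implementation:

```python
import numpy as np

def best_match(observed, database):
    """Return the source parameters whose synthetic waveform best fits the
    observed seismogram (normalized zero-lag correlation, exhaustive scan)."""
    obs = (observed - observed.mean()) / observed.std()
    best_params, best_score = None, -np.inf
    for params, synthetic in database:              # database: [(params_dict, waveform), ...]
        syn = (synthetic - synthetic.mean()) / synthetic.std()
        score = float(np.dot(obs, syn)) / len(obs)  # similarity in [-1, 1]
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical usage with random placeholder waveforms:
rng = np.random.default_rng(0)
db = [({"strike": s, "dip": 45, "rake": 90, "Mw": 5.9}, rng.standard_normal(512))
      for s in range(0, 360, 30)]
params, score = best_match(rng.standard_normal(512), db)
print(params, round(score, 3))
```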

  9. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

    Twenty-five years after the Vrancea earthquake of March 4, 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (MG-R = 7.2; Mw = 7.5) produced 1,570 casualties and more than 11,300 injured persons (90% of the victims in Bucharest), and seismic losses were estimated at more than USD 2 billion. The 1977 earthquake represented a significant episode of the 20th century in the seismic zones of Romania and neighboring countries. The INCERC seismic record of March 4, 1977 revealed, for the first time, the spectral content of long-period seismic motions of Vrancea earthquakes, the duration, the number of cycles and the values of actual accelerations, with important overloading effects upon flexible structures. The seismic coefficients ks, the spectral curve (the dynamic coefficient βr), the seismic zonation map and the requirements in the antiseismic design norms were drastically changed, the microzonation maps of the time ceased to be used, and the specific Vrancea earthquake recurrence was reconsidered based on hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements until 1977, as well as seismological and structural lessons since 1977; - recent aspects of implementing the Earthquake Code P.100/1992 and harmonization with Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcement policies on strengthening of existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  10. Elements of earthquake engineering and structural dynamics. 2. ed.

    International Nuclear Information System (INIS)

    Filiatrault, A.

    2002-01-01

    This book is written for practising engineers, senior undergraduate and junior structural-engineering students, and university educators. Its main goal is to provide basic knowledge to structural engineers who have no previous knowledge about earthquake engineering and structural dynamics. Earthquake engineering is a multidisciplinary science. This book is not limited to structural analysis and design. The basics of other relevant topics (such as geology, seismology, and geotechnical engineering) are also covered to ensure that structural engineers can interact efficiently with other specialists during a construction project in a seismic zone

  11. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake-resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to arrive at a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design is outlined in the context of seismic design of nuclear power plants vis-a-vis current state-of-the-art techniques. The extent to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to determines to a great extent the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  12. Building Infrastructure for Preservation and Publication of Earthquake Engineering Research Data

    Directory of Open Access Journals (Sweden)

    Stanislav Pejša

    2014-10-01

    Full Text Available The objective of this paper is to showcase the progress of the earthquake engineering community during a decade-long effort supported by the National Science Foundation in the George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES). During the four years that NEES network operations have been headquartered at Purdue University, the NEEScomm management team has facilitated an unprecedented cultural change in the ways research is performed in earthquake engineering. NEES has not only played a major role in advancing the cyberinfrastructure required for transformative engineering research, but NEES research outcomes are also making an impact by contributing to safer structures throughout the USA and abroad. This paper reflects on some of the developments and initiatives that helped instil change in the ways that the earthquake engineering and tsunami community share and reuse data and collaborate in general.

  13. Prevent recurrence of nuclear disaster (3). Agenda on nuclear safety from earthquake engineering

    International Nuclear Information System (INIS)

    Kameda, Hiroyuki; Takada, Tsuyoshi; Ebisawa, Katsumi; Nakamura, Susumu

    2012-01-01

    Based on the results of the activities of the committee on seismic safety of nuclear power plants (NPPs) of the Japan Association for Earthquake Engineering, which started its activities after the Chuetsu-oki earthquake and then experienced the Great East Japan Earthquake (working in close collaboration with the committee of the Atomic Energy Society of Japan, which started its activities at the same time), and taking account of further development of the concept, an agenda on nuclear safety was proposed from the standpoint of earthquake engineering. In order to prevent recurrence of a nuclear disaster, individual technical issues of earthquake engineering and comprehensive issues of integration technology, multidisciplinary collaboration and the establishment of technology governance based on them are of prime importance. This article describes important problems to be solved: (1) technical issues and the mission of seismic safety of NPPs, (2) decision making based on risk assessment - the basis of technical governance, (3) the framework of risk, design and regulation - the framework of required technology governance, (4) technical issues of earthquake engineering for nuclear safety, (5) the role of earthquake engineering in nuclear power risk communication, and (6) the importance of multidisciplinary collaboration. The responsibility of engineering lies in establishing technology governance, cultivating individual and integration technologies, and social communication. (T. Tanaka)

  14. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    Science.gov (United States)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and its parameters. The first is the probability that the seismic energy (e × 10²⁰ ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10²⁰ ergs). The second is the probability that the seismic energy (a × 10²⁰ ergs/year) expected to be released per year exceeds a certain level of seismic energy per year (A × 10²⁰ ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a worse one. The time of the future earthquake is forecasted by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), 13 April 2016 Myanmar earthquake (M 6.9) and 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area, showing that the proposed techniques and models yield good forecasting accuracy.
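
    A hedged sketch of the workflow the abstract describes (fit candidate distributions to released energies, compare log-likelihoods, then divide the energy expected in the next event by the annual energy rate). The catalog values and the use of the mean as the "expected event energy" are illustrative assumptions, not the authors' data or exact procedure:

```python
import numpy as np
from scipy import stats

# Hypothetical per-event released energies, in units of 1e20 ergs.
energies = np.array([1.2, 0.4, 3.5, 0.8, 2.1, 0.6, 5.0, 1.7])

# Fit the four candidate families and compare log-likelihoods (higher ln L = better fit).
for dist in (stats.gamma, stats.lognorm, stats.weibull_min, stats.fisk):  # fisk = log-logistic
    params = dist.fit(energies, floc=0)
    lnL = np.sum(dist.logpdf(energies, *params))
    print(f"{dist.name:12s} ln L = {lnL:6.2f}")

# Forecast waiting time ~ (energy expected in next event) / (energy expected per year).
expected_event_energy = energies.mean()        # stand-in for the model-based expectation
annual_energy_rate = energies.sum() / 279.0    # catalog span 1737-2015, ~279 years
print(f"forecast waiting time ~ {expected_event_energy / annual_energy_rate:.0f} yr")
```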

  15. Revolutionising engineering education in the Middle East region to promote earthquake-disaster mitigation

    Science.gov (United States)

    Baytiyeh, Hoda; Naja, Mohamad K.

    2014-09-01

    Due to the high market demand for professional engineers in the Arab oil-producing countries, the appetite of Middle Eastern students for high-paying jobs and challenging careers in engineering has sharply increased. As a result, engineering programmes are providing opportunities for more students to enrol on engineering courses through lenient admission policies that do not compromise academic standards. This strategy has generated an influx of students who must be carefully educated to enhance their professional knowledge and social capital to assist in future earthquake-disaster risk-reduction efforts. However, the majority of Middle Eastern engineering students are unaware of the value of the engineering skills and knowledge they acquire for building the resilience of their communities to earthquake disasters. As the majority of the countries in the Middle East are exposed to seismic hazards and are vulnerable to destructive earthquakes, engineers have become indispensable assets and the first line of defence against earthquake threats. This article highlights the contributions of some of the engineering innovations in advancing technologies and techniques for effective disaster mitigation, and it calls for the incorporation of earthquake-disaster-mitigation education into academic engineering programmes in the Eastern Mediterranean region.

  16. Basic earthquake engineering from seismology to analysis and design

    CERN Document Server

    Sucuoğlu, Halûk

    2014-01-01

    This book provides senior undergraduate students, master students and structural engineers who do not have a background in the field with core knowledge of structural earthquake engineering that will be invaluable in their professional lives. The basics of seismotectonics, including the causes, magnitude, and intensity of earthquakes, are first explained. Then the book introduces basic elements of seismic hazard analysis and presents the concept of a seismic hazard map for use in seismic design. Subsequent chapters cover key aspects of the response analysis of simple systems and building structures to earthquake ground motions, design spectrum, the adoption of seismic analysis procedures in seismic design codes, seismic design principles and seismic design of reinforced concrete structures. Helpful worked examples on seismic analysis of linear, nonlinear and base isolated buildings, earthquake-resistant design of frame and frame-shear wall systems are included, most of which can be solved using a hand calcu...

  17. Earthquake engineering and structural dynamics studies at Bhabha Atomic Research Centre

    International Nuclear Information System (INIS)

    Reddy, G.R.; Parulekar, Y.M.; Sharma, A.; Dubey, P.N.; Vaity, K.N.; Kukreja, Mukhesh; Vaze, K.K.; Ghosh, A.K.; Kushwaha, H.S.

    2007-01-01

    Earthquake engineering and structural dynamics have gained the attention of many researchers throughout the world, and extensive research work has been performed. The linear behaviour of structures, systems and components (SSCs) subjected to earthquake/dynamic loading is clearly understood. However, the nonlinear behaviour of SSCs subjected to earthquake/dynamic loading needs to be understood clearly and design methods need to be validated experimentally. In view of this, three major areas identified for research in earthquake engineering and structural dynamics include: design and development of passive devices to control the seismic/dynamic response of SSCs, nonlinear behaviour of piping systems subjected to earthquake loading, and nonlinear behaviour of RCC structures under seismic excitation or dynamic loading. BARC has performed extensive work in the above-identified areas and this work is continuing. The work performed is helping to develop a clearer understanding of the nonlinear behaviour of SSCs as well as new schemes, methodologies and devices to control the earthquake response of SSCs. (author)

  18. Engineering Seismic Base Layer for Defining Design Earthquake Motion

    International Nuclear Information System (INIS)

    Yoshida, Nozomu

    2008-01-01

    The engineer's common-sense assumption that the incident wave is common over a widespread area at the engineering seismic base layer is shown not to be correct. An illustrative example is first shown, which indicates that the earthquake motion at the ground surface evaluated by an analysis considering the ground from the seismic bedrock to the ground surface simultaneously (continuous analysis) is different from that obtained by an analysis in which the ground is separated at the engineering seismic base layer and analyzed separately (separate analysis). The reason is investigated by several approaches. Investigation based on the eigenvalue problem indicates that the first predominant period in the continuous analysis cannot be found in the separate analysis, and the predominant periods at higher order do not match between the upper and lower ground in the separate analysis. The earthquake response analysis indicates that the reflected wave at the engineering seismic base layer is not zero, which indicates that the conventional engineering seismic base layer does not work as expected from the term "base". All these results indicate that waves that travel down to depth after reflecting in the surface layer and reflect again at the seismic bedrock cannot be neglected in evaluating the response at the ground surface. In other words, interaction between the surface layer and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface

  19. Introduction: seismology and earthquake engineering in Mexico and Central and South America.

    Science.gov (United States)

    Espinosa, A.F.

    1982-01-01

    The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans and some of the cooperative programs with different international organizations are described by Latin-American seismologists. The article describes the development of seismology in Latin America and the seismological interest of the OAS. -P.N.Chroston

  20. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Science.gov (United States)

    2010-01-01

    10 CFR Part 50, Appendix S—Earthquake Engineering Criteria for Nuclear Power Plants (Energy, 2010-01-01): criteria for nuclear power plant structures, systems, and components important to safety to withstand the effects of...

  1. Does Modern Ideology of Earthquake Engineering Ensure the Declared Levels of Damage of Structures at Earthquakes?

    International Nuclear Information System (INIS)

    Gabrichidze, G.

    2011-01-01

    The basic position of the modern ideology of earthquake engineering is based on the idea that a structure should be designed so that it suffers almost no damage in an earthquake whose occurrence is most probable in the given area during the lifetime of the structure. This statement is essentially based on so-called Performance Based Design, the ideology of the 21st century. In the article, attention is focused on the fact that the modern ideology of earthquake engineering assigns structures to a dangerous zone in which their behavior is defined by processes of damage and destruction of materials, which are nonequilibrium processes and demand the application of special refined methods of research. In such conditions, the use of ratios that correspond to static conditions of loading to describe the process of damage of materials appears to be unfounded. The article raises the question of the necessity of working out a new mathematical model of the behavior of materials and structures under rapid intensive impact. (authors)

  2. A self-referential HOWTO on release engineering

    Energy Technology Data Exchange (ETDEWEB)

    Galassi, Mark C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-31

    Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.

  3. Engineering aspects of earthquake risk mitigation: Lessons from management of recent earthquakes, and consequential mudflows and landslides

    International Nuclear Information System (INIS)

    1992-01-01

    The Proceedings contain 30 selected presentations given at the Second and Third UNDRO/USSR Training Seminars: Engineering Aspects of Earthquake Risk Assessment and Mitigation of Losses, held in Dushanbe, October 1988; and Lessons from Management of Recent Earthquakes, and Consequential Mudflows and Landslides, held in Moscow, October 1989. The annexes to the document provide information on the participants, the work programme and the resolution adopted at each of the seminars. Refs, figs and tabs

  4. Controlled drug release for tissue engineering.

    Science.gov (United States)

    Rambhia, Kunal J; Ma, Peter X

    2015-12-10

    Tissue engineering is often referred to as a three-pronged discipline, with each prong corresponding to 1) a 3D material matrix (scaffold), 2) drugs that act on molecular signaling, and 3) regenerative living cells. Herein we focus on reviewing advances in controlled release of drugs from tissue engineering platforms. This review addresses advances in hydrogels and porous scaffolds that are synthesized from natural materials and synthetic polymers for the purposes of controlled release in tissue engineering. We pay special attention to efforts to reduce the burst release effect and to provide sustained and long-term release. Finally, novel approaches to controlled release are described, including devices that allow for pulsatile and sequential delivery. In addition to recent advances, limitations of current approaches and areas of further research are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Regional distribution of released earthquake energy in northern Egypt along with Inahass area

    International Nuclear Information System (INIS)

    El-hemamy, S.T.; Adel, A.A. Othman

    1999-01-01

    A review of the seismic history of Egypt indicates some areas of high activity concentrated along Oligocene-Miocene faults. These areas support the idea of recent activation of the Oligocene-Miocene stress cycle. There are similarities in the spatial distribution of recent and historical epicenters. From the tectonic map of Egypt, the distributions of intensity and magnitude show strong activity along the Nile Delta. This is due to the presence of thick layers of recent alluvial sediments. The energy released by earthquakes affects structures. The present study deals with the computed released energies of the reported earthquakes in Egypt and around the Inshas area, and their effect on the urban and nuclear facilities inside the Inshas site is considered. Special consideration is given to the old and new waste repository sites. The application of the determined released energy reveals that the Inshas site is affected by seismic activity from five seismo-tectonic source zones, namely the Red Sea, Nile Delta, El-Faiyum, Mediterranean Sea and Gulf of Aqaba seismo-tectonic zones. The El-Faiyum seismo-tectonic source zone has the maximum effect on the site and gave a high released energy reaching 5.4 × 10²¹ erg

  6. Advancing Integrated STEM Learning through Engineering Design: Sixth-Grade Students' Design and Construction of Earthquake Resistant Buildings

    Science.gov (United States)

    English, Lyn D.; King, Donna; Smeed, Joanna

    2017-01-01

    As part of a 3-year longitudinal study, 136 sixth-grade students completed an engineering-based problem on earthquakes involving integrated STEM learning. Students employed engineering design processes and STEM disciplinary knowledge to plan, sketch, then construct a building designed to withstand earthquake damage, taking into account a number of…

  7. Estimate of airborne release of plutonium from Babcock and Wilcox plant as a result of severe wind hazard and earthquake

    International Nuclear Information System (INIS)

    Mishima, J.; Schwendiman, L.C.; Ayer, J.E.

    1978-10-01

    As part of an interdisciplinary study to evaluate the potential radiological consequences of wind hazard and earthquake upon existing commercial mixed oxide fuel fabrication plants, the potential mass airborne releases of plutonium (source terms) from such events are estimated. The estimated source terms are based upon the fraction of enclosures damaged to three levels of severity (crush, puncture/penetrate, and loss of external filter, in order of decreasing severity), called the damage ratio, and the airborne release if all enclosures suffered that level of damage. The discussion of damage scenarios and source terms is divided into wind hazard and earthquake scenarios in order of increasing severity. The largest airborne releases from the building were for cases involving the catastrophic collapse of the roof over the major production areas - wind hazard at 110 mph and earthquakes with peak ground accelerations of 0.20 to 0.29 g. Wind hazards at higher air velocities and earthquakes with higher ground accelerations do not result in significantly greater source terms. The source terms were calculated as additional mass of respirable particles released with time up to 4 days; under these assumptions, approximately 98% of the mass of material of concern is made airborne from 2 h to 4 days after the event. The overall building source terms from the damage scenarios evaluated are shown in a table. The contribution of individual areas to the overall building source term is presented in order of increasing severity for wind hazard and earthquake
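
    The source-term bookkeeping described above amounts to multiplying, for each damage severity level, the damage ratio by the release that would occur if all enclosures suffered that damage, and summing. A minimal sketch with hypothetical numbers, not values from the report:

```python
# Illustrative only: all fractions and gram quantities below are made up.
damage_ratio = {"crush": 0.05, "puncture": 0.20, "filter_loss": 0.50}   # fraction of enclosures damaged
full_release_g = {"crush": 10.0, "puncture": 2.0, "filter_loss": 0.5}   # g Pu airborne if ALL enclosures hit

# Building source term = sum over severity levels of (damage ratio x full-damage release).
source_term_g = sum(damage_ratio[lvl] * full_release_g[lvl] for lvl in damage_ratio)
print(f"building source term ~ {source_term_g:.2f} g Pu airborne")
```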

  8. Electromagnetic Energy Released in the Subduction (Benioff) Zone in Weeks Previous to Earthquake Occurrence in Central Peru and the Estimation of Earthquake Magnitudes.

    Science.gov (United States)

    Heraud, J. A.; Centa, V. A.; Bleier, T.

    2017-12-01

    During the past four years, magnetometers deployed along the Peruvian coast have been providing evidence that the ULF pulses received are indeed generated at the subduction or Benioff zone and are connected with the occurrence of earthquakes within a few kilometers of the source of such pulses. This evidence was presented at the AGU 2015 Fall Meeting, showing the results of triangulation of pulses from two magnetometers located in the central area of Peru, using data collected during a two-year period. Additional work has been done and the method has now been expanded to provide the instantaneous energy released at the stress areas on the Benioff zone during the precursory stage, before an earthquake occurs. Data collected from several events and from other parts of the country will be shown in a sequential animated form that illustrates the way energy is released in the ULF part of the electromagnetic spectrum. The process has been extended in time and to other geographic areas. Only pulses associated with the occurrence of earthquakes are taken into account, in an area highly associated with subduction-zone seismic events, and several pulse parameters have been used to estimate a function relating the magnitude of the earthquake to the values of those parameters. The results shown, including the animated data video, constitute additional work towards the estimation of the magnitude of an earthquake about to occur, based on electromagnetic pulses that originated at the subduction zone. The method is providing clearer evidence that electromagnetic precursors in effect convey physical and useful information prior to the advent of a seismic event

  9. Structural performance of the DOE's Idaho National Engineering Laboratory during the 1983 Borah Peak Earthquake

    International Nuclear Information System (INIS)

    Guenzler, R.C.; Gorman, V.W.

    1985-01-01

    The 1983 Borah Peak Earthquake (7.3 Richter magnitude) was the largest earthquake ever experienced by the DOE's Idaho National Engineering Laboratory (INEL). Reactor and plant facilities are generally located about 90 to 110 km (60 miles) from the epicenter. Several reactors were operating normally at the time of the earthquake. Based on detailed inspections, comparisons of measured accelerations with design levels, and instrumental seismograph information, it was concluded that the 1983 Borah Peak Earthquake created no safety problems for INEL reactors or other facilities. 10 references, 16 figures, 2 tables

  10. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.

    1997-01-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release

  11. Introduction: seismology and earthquake engineering in Central and South America.

    Science.gov (United States)

    Espinosa, A.F.

    1983-01-01

    Reports the state-of-the-art in seismology and earthquake engineering that is being advanced in Central and South America. Provides basic information on seismological station locations in Latin America and some of the programmes in strong-motion seismology, as well as some of the organizations involved in these activities.-from Author

  12. The HayWired earthquake scenario—Engineering implications

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2018-04-18

    The HayWired Earthquake Scenario—Engineering Implications is the second volume of U.S. Geological Survey (USGS) Scientific Investigations Report 2017–5013, which describes the HayWired scenario, developed by USGS and its partners. The scenario is a hypothetical yet scientifically realistic earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after a magnitude-7 earthquake (mainshock) on the Hayward Fault and its aftershocks. Analyses in this volume suggest that (1) 800 deaths and 16,000 nonfatal injuries result from shaking alone, plus property and direct business interruption losses of more than $82 billion from shaking, liquefaction, and landslides; (2) the building code is designed to protect lives, but even if all buildings in the region complied with current building codes, 0.4 percent could collapse, 5 percent could be unsafe to occupy, and 19 percent could have restricted use; (3) people expect, prefer, and would be willing to pay for greater resilience of buildings; (4) more than 22,000 people could require extrication from stalled elevators, and more than 2,400 people could require rescue from collapsed buildings; (5) the average East Bay resident could lose water service for 6 weeks, some for as long as 6 months; (6) older steel-frame high-rise office buildings and new reinforced-concrete residential buildings in downtown San Francisco and Oakland could be unusable for as long as 10 months; (7) about 450 large fires could result in a loss of residential and commercial building floor area equivalent to more than 52,000 single-family homes and cause property (building and content) losses approaching $30 billion; and (8) combining earthquake early warning (ShakeAlert) with “drop, cover, and hold on” actions could prevent as many as 1,500 nonfatal injuries out of 18,000 total estimated nonfatal injuries from shaking and liquefaction hazards.

  13. Current earthquake engineering practice for Japanese nuclear power plants

    International Nuclear Information System (INIS)

    Hofmayer, C.H.; Park, Y.J.; Costello, J.F.

    1992-01-01

    This paper provides a brief overview of seismic research being conducted in Japan and describes USNRC efforts to understand Japanese seismic practice. Current earthquake engineering practice for Japanese nuclear power plants is described in JEAG 4601-1987, "Technical Guidelines for Aseismic Design of Nuclear Power Plants." The USNRC has sponsored BNL to translate this document into English. Efforts are underway to study and understand JEAG 4601-1987 and make the translation more readily available in the United States

  14. Load-Unload Response Ratio and Accelerating Moment/Energy Release Critical Region Scaling and Earthquake Prediction

    Science.gov (United States)

    Yin, X. C.; Mora, P.; Peng, K.; Wang, Y. C.; Weatherley, D.

    The main idea of the Load-Unload Response Ratio (LURR) is that when a system is stable, its response to loading corresponds to its response to unloading, whereas when the system is approaching an unstable state, the response to loading and unloading becomes quite different. High LURR values and observations of Accelerating Moment/Energy Release (AMR/AER) prior to large earthquakes have led different research groups to suggest intermediate-term earthquake prediction is possible and imply that the LURR and AMR/AER observations may have a similar physical origin. To study this possibility, we conducted a retrospective examination of several Australian and Chinese earthquakes with magnitudes ranging from 5.0 to 7.9, including Australia's deadly Newcastle earthquake and the devastating Tangshan earthquake. Both LURR values and best-fit power-law time-to-failure functions were computed using data within a range of distances from the epicenter. Like the best-fit power-law fits in AMR/AER, the LURR value was optimal using data within a certain epicentral distance implying a critical region for LURR. Furthermore, LURR critical region size scales with mainshock magnitude and is similar to the AMR/AER critical region size. These results suggest a common physical origin for both the AMR/AER and LURR observations. Further research may provide clues that yield an understanding of this mechanism and help lead to a solid foundation for intermediate-term earthquake prediction.
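
    A minimal sketch of how a LURR value can be formed, assuming per-event response values and loading/unloading labels are already available; this is illustrative only and not the authors' implementation (which derives loading and unloading phases from tidally and tectonically induced stress changes):

```python
import numpy as np

# Hypothetical per-event "response" values (e.g. a Benioff-strain-like quantity)
# and flags for whether each event fell in a loading (+) or unloading (-) phase.
response = np.array([2.1, 0.8, 1.5, 0.6, 3.0, 0.7, 2.4])
loading  = np.array([True, False, True, False, True, False, True])

X_plus = response[loading].sum()    # cumulative response during loading
X_minus = response[~loading].sum()  # cumulative response during unloading
lurr = X_plus / X_minus
print(f"LURR = {lurr:.2f}")         # ~1 for a stable region; high values suggest instability
```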

  15. An Ilustrative Nuclide Release Behavior from an HLW Repository due to an Earthquake Event

    International Nuclear Information System (INIS)

    Lee, Youn-Myoung; Hwang, Yong-Soo; Choi, Jong-Won

    2008-01-01

    Program for the evaluation of a high-level waste repository which is conceptually modeled. During the last few years, programs developed with the aid of AMBER and GoldSim, by which nuclide transport in the near- and far-field of a repository as well as transport through the biosphere under various normal and disruptive release scenarios can be modeled and evaluated, have been continuously demonstrated. To show its usability, as was similarly done for the natural groundwater flow scheme, the influence on nuclide release behavior from an HLW repository system of a possible disruptive event caused naturally by an earthquake has been investigated and illustrated with the newly developed GoldSim program

  16. Earthquake strong ground motion studies at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wong, Ivan; Silva, W.; Darragh, R.; Stark, C.; Wright, D.; Jackson, S.; Carpenter, G.; Smith, R.; Anderson, D.; Gilbert, H.; Scott, D.

    1989-01-01

    Site-specific strong earthquake ground motions have been estimated for the Idaho National Engineering Laboratory assuming that an event similar to the 1983 Ms 7.3 Borah Peak earthquake occurs at epicentral distances of 10 to 28 km. The strong ground motion parameters have been estimated based on a methodology incorporating the Band-Limited White-Noise ground motion model coupled with Random Vibration Theory. A 16-station seismic attenuation and site response survey utilizing three-component portable digital seismographs was also performed over a five-month period in 1989. Based on the recordings of regional earthquakes, the effects of seismic attenuation in the shallow crust and along the propagation path, and local site response, were evaluated. These data, combined with a detailed geologic profile developed for each site based principally on borehole data, were used in the estimation of the strong ground motion parameters. The preliminary peak horizontal ground accelerations for individual sites range from approximately 0.15 to 0.35 g. Based on the authors' analysis, the thick sedimentary interbeds (greater than 20 m) in the basalt section attenuate ground motions, as speculated upon in a number of previous studies
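
    For readers unfamiliar with the Band-Limited White-Noise approach, the sketch below shows the omega-squared point-source acceleration spectrum that such stochastic models are typically built on (Brune-type corner frequency, Hanks-Kanamori moment). The path, site and random-vibration-theory peak-motion steps used in the actual study are omitted, and the parameter values are illustrative, not those of the INEL analysis:

```python
import numpy as np

def brune_accel_spectrum(f, Mw, stress_drop_bars=100.0, beta_kms=3.5):
    """Omega-squared Fourier acceleration source spectrum (unscaled shape).

    Units follow the conventional Brune formulation: M0 in dyne*cm,
    stress drop in bars, shear-wave velocity in km/s, frequency in Hz.
    """
    M0 = 10 ** (1.5 * Mw + 16.05)                                   # Hanks-Kanamori moment, dyne*cm
    fc = 4.9e6 * beta_kms * (stress_drop_bars / M0) ** (1.0 / 3.0)  # Brune corner frequency, Hz
    return (2 * np.pi * f) ** 2 * M0 / (1.0 + (f / fc) ** 2)

# Illustrative use for a magnitude ~7 event (roughly Borah Peak sized):
f = np.logspace(-1, 1.5, 100)            # 0.1 to ~30 Hz
spec = brune_accel_spectrum(f, Mw=7.0)
print(f"spectrum peaks near the high-frequency plateau: {spec.max():.3e} (arbitrary units)")
```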

  17. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  18. Principles for selecting earthquake motions in engineering design of large dams

    Science.gov (United States)

    Krinitzsky, E.L.; Marcuson, William F.

    1983-01-01

    This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable to other facilities. Because no standard procedure exists for selecting earthquake motions in engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which of two types of engineering analyses is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong-motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and their spread bracketed in order to fill in the gaps and to assure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives the best results. Each part of the site investigation requires a number of decisions. In some cases, the decision to use a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful. For example, peak motions at

  19. Designing an Earthquake-Proof Art Museum: An Arts- and Engineering-Integrated Science Lesson

    Science.gov (United States)

    Carignan, Anastasia; Hussain, Mahjabeen

    2016-01-01

    In this practical arts-integrated science and engineering lesson, an inquiry-based approach was adopted to teach a class of fourth graders in a Midwest elementary school about the scientific concepts of plate tectonics and earthquakes. Lessons were prepared following the 5E instructional model. Next Generation Science Standards (4-ESS3-2) and the…

  20. Earthquakes, Cities, and Lifelines: lessons integrating tectonics, society, and engineering in middle school Earth Science

    Science.gov (United States)

    Toke, N.; Johnson, A.; Nelson, K.

    2010-12-01

    Earthquakes are one of the most widely covered geologic processes by the media. As a result students, even at the middle school level, arrive in the classroom with preconceptions about the importance and hazards posed by earthquakes. Therefore earthquakes represent not only an attractive topic to engage students when introducing tectonics, but also a means to help students understand the relationships between geologic processes, society, and engineering solutions. Facilitating understanding of the fundamental connections between science and society is important for the preparation of future scientists and engineers as well as informed citizens. Here, we present a week-long lesson designed to be implemented in five one-hour sessions with classes of ~30 students. It consists of two inquiry-based mapping investigations, motivational presentations, and short readings that describe fundamental models of plate tectonics, faults, and earthquakes. The readings also provide examples of engineering solutions such as the Alaskan oil pipeline which withstood multi-meter surface offset in the 2002 Denali Earthquake. The first inquiry-based investigation is a lesson on tectonic plates. Working in small groups, each group receives a different world map plotting both topography and one of the following data sets: GPS plate motion vectors, the locations and types of volcanoes, or the locations and types of earthquakes. Using these maps and an accompanying explanation of the data, each group’s task is to map plate boundary locations. Each group then presents a ~10-minute summary of the type of data they used and their interpretation of the tectonic plates with a poster and their mapping results. Finally, the instructor will facilitate a class discussion about how the data types could be combined to understand more about plate boundaries. Using student interpretations of real data allows student misconceptions to become apparent. Throughout the exercise we record student preconceptions

  1. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    Science.gov (United States)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

    Strong shaking by earthquakes causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics which influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several magnitudes. The Himalaya has experienced many small and large earthquakes in the past, e.g. the Bihar-Nepal earthquake of 1934 (Ms 8.2), the large Kangra earthquake of 1905 (Ms 7.8), and the Gorkha earthquake of 2015 (Mw 7.8). The Mw 7.9 Gorkha earthquake occurred on and around the Main Himalayan Thrust with a hypocentral depth of 15 km (GEER 2015), followed by the Mw 7.3 aftershock near Kodari, causing 8700+ deaths and leaving hundreds of thousands homeless. Most of the 3000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake were distinctive in the affected areas, showing topographic effects, liquefaction and land subsidence. More than 5000 landslides were triggered by this earthquake (Earthquake without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides and rock falls. The majority of landslides are limited to a zone which runs east-west, approximately parallel to the Lesser and Higher Himalaya. There are numerous cracks in

  2. Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering

    Science.gov (United States)

    Cavlazoglu, Baki; Stuessy, Carol

    2018-02-01

    The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. To begin and end the teacher workshop, teachers in small groups engaged in concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with distributed learning approaches used throughout the workshop, these results provide evidence to support distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, facilitate the presentation of argumentation, and as a research tool for providing evidence of teachers' argumentation skills.

  3. Addressing earthquakes strong ground motion issues at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wong, I.G.; Silva, W.J.; Stark, C.L.; Jackson, S.; Smith, R.P.

    1991-01-01

    In the course of reassessing seismic hazards at the Idaho National Engineering Laboratory (INEL), several key issues have been raised concerning the effects of the earthquake source and site geology on potential strong ground motions that might be generated by a large earthquake. The design earthquake for the INEL is an approximate moment magnitude (Mw) 7 event that may occur on the southern portion of the Lemhi fault, a Basin and Range normal fault that is located on the northwestern boundary of the eastern Snake River Plain and the INEL, within 10 to 27 km of several major facilities. Because the locations of these facilities place them at close distances to a large earthquake and generally along strike of the causative fault, the effects of source rupture dynamics (e.g., directivity) could be critical in enhancing potential ground shaking at the INEL. An additional source issue that has been addressed is the value of stress drop to use in ground motion predictions. In terms of site geology, it has been questioned whether the interbedded volcanic stratigraphy beneath the ESRP and the INEL attenuates ground motions to a greater degree than a typical rock site in the western US. These three issues have been investigated employing a stochastic ground motion methodology which incorporates the Band-Limited-White-Noise source model for both a point source and finite fault, random vibration theory and an equivalent linear approach to model soil response

  4. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  5. Estimated airborne release of plutonium from the 102 Building at the General Electric Vallecitos Nuclear Center, Vallecitos, California, as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.; Hays, I.D.

    1980-12-01

    This report estimates the potential airborne releases of plutonium as a consequence of various severities of earthquake and wind hazard postulated for the 102 Building at the General Electric Vallecitos Nuclear Center in California. The releases are based on damage scenarios developed by other specialists. The hazard severities presented range up to a nominal velocity of 230 mph for wind hazard and are in excess of 0.8 g linear acceleration for earthquakes. The consequences of thrust faulting are considered. The approaches and factors used to estimate the releases are discussed. Release estimates range from 0.003 to 3 g Pu

  6. Estimation of Recurrence Interval of Large Earthquakes on the Central Longmen Shan Fault Zone Based on Seismic Moment Accumulation/Release Model

    Directory of Open Access Journals (Sweden)

    Junjie Ren

    2013-01-01

    Full Text Available Recurrence interval of large earthquake on an active fault zone is an important parameter in assessing seismic hazard. The 2008 Wenchuan earthquake (Mw 7.9) occurred on the central Longmen Shan fault zone and ruptured the Yingxiu-Beichuan fault (YBF) and the Guanxian-Jiangyou fault (GJF). However, there is a considerable discrepancy among recurrence intervals of large earthquake in preseismic and postseismic estimates based on slip rate and paleoseismologic results. Post-seismic trenches showed that the central Longmen Shan fault zone probably undertakes an event similar to the 2008 quake, suggesting a characteristic earthquake model. In this paper, we use the published seismogenic model of the 2008 earthquake based on Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) data and construct a characteristic seismic moment accumulation/release model to estimate recurrence interval of large earthquakes on the central Longmen Shan fault zone. Our results show that the seismogenic zone accommodates a moment rate of (2.7 ± 0.3) × 10¹⁷ N m/yr, and a recurrence interval of 3900 ± 400 yrs is necessary for accumulation of strain energy equivalent to the 2008 earthquake. This study provides a preferred interval estimation of large earthquakes for seismic hazard analysis in the Longmen Shan region.

  7. Estimation of recurrence interval of large earthquakes on the central Longmen Shan fault zone based on seismic moment accumulation/release model.

    Science.gov (United States)

    Ren, Junjie; Zhang, Shimin

    2013-01-01

    Recurrence interval of large earthquake on an active fault zone is an important parameter in assessing seismic hazard. The 2008 Wenchuan earthquake (Mw 7.9) occurred on the central Longmen Shan fault zone and ruptured the Yingxiu-Beichuan fault (YBF) and the Guanxian-Jiangyou fault (GJF). However, there is a considerable discrepancy among recurrence intervals of large earthquake in preseismic and postseismic estimates based on slip rate and paleoseismologic results. Post-seismic trenches showed that the central Longmen Shan fault zone probably undertakes an event similar to the 2008 quake, suggesting a characteristic earthquake model. In this paper, we use the published seismogenic model of the 2008 earthquake based on Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) data and construct a characteristic seismic moment accumulation/release model to estimate recurrence interval of large earthquakes on the central Longmen Shan fault zone. Our results show that the seismogenic zone accommodates a moment rate of (2.7 ± 0.3) × 10¹⁷ N m/yr, and a recurrence interval of 3900 ± 400 yrs is necessary for accumulation of strain energy equivalent to the 2008 earthquake. This study provides a preferred interval estimation of large earthquakes for seismic hazard analysis in the Longmen Shan region.
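
    As a rough, independent check of the moment-balance reasoning summarized above (a sketch only: the moment rate and magnitude are taken from the abstract, and the moment-magnitude conversion is the standard Hanks-Kanamori relation, not the study's slip model):

    ```python
    def seismic_moment(mw: float) -> float:
        """Convert moment magnitude Mw to seismic moment M0 in N*m (Hanks & Kanamori, 1979)."""
        return 10 ** (1.5 * mw + 9.1)

    # Values quoted in the abstract (assumptions of this sketch).
    mw_wenchuan = 7.9        # 2008 Wenchuan earthquake
    moment_rate = 2.7e17     # N*m per year accumulated on the seismogenic zone

    m0 = seismic_moment(mw_wenchuan)
    recurrence_yr = m0 / moment_rate   # years needed to re-accumulate the released moment
    print(f"M0 ~ {m0:.2e} N*m, recurrence ~ {recurrence_yr:.0f} yr")
    # Gives a value of the same order as the ~3900 yr interval reported above;
    # the difference reflects the detailed seismogenic model used in the study.
    ```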

  8. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

    Science.gov (United States)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    Recently our understanding of tectonic faulting has been shaken by the discoveries of seismic tremor, low-frequency earthquakes, slow slip events, and other modes of fault slip. These phenomena represent modes of failure that were thought to be non-existent and theoretically impossible only a few years ago. Slow earthquakes are seismic phenomena in which the rupture of geological faults in the earth's crust occurs gradually without creating strong tremors. Despite the growing number of observations of slow earthquakes, their origin remains unresolved. Studies show that the duration of slow earthquakes ranges from a few seconds to a few hundred seconds. Whereas the regular earthquakes with which most people are familiar release a burst of built-up stress in seconds, slow earthquakes release energy in ways that do little damage. This study focuses on the characteristics of the Mw5.6 earthquake that occurred in the Sofia seismic zone on May 22nd, 2012. The Sofia area is the most populated, industrial and cultural region of Bulgaria that faces considerable earthquake risk. The Sofia seismic zone is located in South-western Bulgaria - an area with pronounced tectonic activity and proven crustal movement. In the 19th century the city of Sofia (situated in the centre of the Sofia seismic zone) experienced two strong earthquakes with epicentral intensity of 10 MSK. During the 20th century the strongest event that occurred in the vicinity of the city of Sofia is the 1917 earthquake with MS=5.3 (I0=7-8 MSK64). The 2012 quake occurred in an area characterized by a long quiescence (of 95 years) for moderate events. Moreover, a reduced number of small earthquakes have also been registered in the recent past. The Mw5.6 earthquake was felt widely on the territory of Bulgaria and in neighbouring countries. No casualties and severe injuries have been reported. Mostly moderate damage was observed in the cities of Pernik and Sofia and their surroundings. These observations could be assumed indicative for a

  9. 33 CFR 222.4 - Reporting earthquake effects.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Reporting earthquake effects. 222..., DEPARTMENT OF DEFENSE ENGINEERING AND DESIGN § 222.4 Reporting earthquake effects. (a) Purpose. This... significant earthquakes. It primarily concerns damage surveys following the occurrences of earthquakes. (b...

  10. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments
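
    The statement that magnitude tracks the seismic energy released at the hypocenter is commonly quantified with the Gutenberg-Richter energy-magnitude relation; the sketch below uses the standard textbook form of that relation, which is not quoted in this record.

    ```python
    def radiated_energy_joules(magnitude: float) -> float:
        """Gutenberg-Richter energy-magnitude relation: log10(E) = 1.5*M + 4.8, E in joules."""
        return 10 ** (1.5 * magnitude + 4.8)

    # Each unit step in magnitude corresponds to roughly a 32-fold increase in radiated energy.
    for m in (5.0, 6.0, 7.0):
        print(m, f"{radiated_energy_joules(m):.2e} J")
    ```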

  11. Clustered and transient earthquake sequences in mid-continents

    Science.gov (United States)

    Liu, M.; Stein, S. A.; Wang, H.; Luo, G.

    2012-12-01

    Earthquakes result from sudden release of strain energy on faults. On plate boundary faults, strain energy is constantly accumulating from steady and relatively rapid relative plate motion, so large earthquakes continue to occur so long as motion continues on the boundary. In contrast, such steady accumulation of strain energy does not occur on faults in mid-continents, because the far-field tectonic loading is not steadily distributed between faults, and because stress perturbations from complex fault interactions and other stress triggers can be significant relative to the slow tectonic stressing. Consequently, mid-continental earthquakes are often temporally clustered and transient, and spatially migrating. This behavior is well illustrated by large earthquakes in North China in the past two millennia, during which no single large earthquake repeated on the same fault segment, but moment release between large fault systems was complementary. Slow tectonic loading in mid-continents also causes long aftershock sequences. We show that the recent small earthquakes in the Tangshan region of North China are aftershocks of the 1976 Tangshan earthquake (M 7.5), rather than indicators of a new phase of seismic activity in North China, as many fear. Understanding the transient behavior of mid-continental earthquakes has important implications for assessing earthquake hazards. The sequence of large earthquakes in the New Madrid Seismic Zone (NMSZ) in the central US, which includes a cluster of M~7 events in 1811-1812 and perhaps a few similar ones in the past millennium, is likely a transient process, releasing previously accumulated elastic strain on recently activated faults. If so, this earthquake sequence will eventually end. Using simple analysis and numerical modeling, we show that the large NMSZ earthquakes may be ending now or in the near future.

  12. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
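
    A minimal sketch of the general construction described above (the distribution of the maximum magnitude derived from an assumed magnitude distribution), here using a Gutenberg-Richter exponential magnitude model and Poissonian occurrence as stand-ins; this is not the Adler-Lomnitz-type energy-release model fitted in the paper, and all parameter values are illustrative.

    ```python
    import numpy as np

    def max_magnitude_cdf(m, rate_per_yr, beta, m_min, t_years):
        """P(maximum magnitude <= m over t_years), assuming Poisson occurrence of events
        above m_min at rate_per_yr and an exponential (Gutenberg-Richter) magnitude
        distribution with parameter beta = b * ln(10)."""
        F = 1.0 - np.exp(-beta * (np.asarray(m) - m_min))   # CDF of a single magnitude
        return np.exp(-rate_per_yr * t_years * (1.0 - F))   # no event exceeds m in t_years

    # Illustrative parameters only (not fitted to the world catalog used in the study).
    magnitudes = np.linspace(6.0, 9.5, 8)
    print(max_magnitude_cdf(magnitudes, rate_per_yr=10.0, beta=2.3, m_min=6.0, t_years=50.0))
    ```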

  13. Seismic ground motion modelling and damage earthquake scenarios: A bridge between seismologists and seismic engineers

    International Nuclear Information System (INIS)

    Panza, G.F.; Romanelli, F.; Vaccari, F.; Decanini, L. (Luis.Decanini@uniroma1.it); Mollaioli, F. (Fabrizio.Mollaioli@uniroma1.it)

    2002-07-01

    The input for the seismic risk analysis can be expressed with a description of 'ground shaking scenarios', or with probabilistic maps of relevant parameters. The probabilistic approach, unavoidably based upon rough assumptions and models (e.g. recurrence and attenuation laws), can be misleading, as it cannot take into account, with satisfactory accuracy, some of the most important aspects like rupture process, directivity and site effects. This is evidenced by the comparison of recent recordings with the values predicted by the probabilistic methods. We prefer a scenario-based, deterministic approach in view of the limited seismological data, of the local irregularity of the occurrence of strong earthquakes, and of the multiscale seismicity model, which is capable of reconciling two apparently conflicting ideas: the Characteristic Earthquake concept and the Self Organized Criticality paradigm. Where the numerical modeling is successfully compared with records, the synthetic seismograms permit microzoning, based upon a set of possible scenario earthquakes. Where no recordings are available the synthetic signals can be used to estimate the ground motion without having to wait for a strong earthquake to occur (pre-disaster microzonation). In both cases the use of modeling is necessary since the so-called local site effects can be strongly dependent upon the properties of the seismic source and can be properly defined only by means of envelopes. The joint use of reliable synthetic signals and observations permits the computation of advanced hazard indicators (e.g. damaging potential) that take into account local soil properties. The envelope of synthetic elastic energy spectra reproduces the distribution of the energy demand in the most relevant frequency range for seismic engineering. The synthetic accelerograms can be fruitfully used for design and strengthening of structures, also when innovative techniques, like seismic isolation, are employed. For these

  14. Scaling and spatial complementarity of tectonic earthquake swarms

    KAUST Repository

    Passarelli, Luigi

    2017-11-10

    Tectonic earthquake swarms (TES) often coincide with aseismic slip and sometimes precede damaging earthquakes. In spite of recent progress in understanding the significance and properties of TES at plate boundaries, their mechanics and scaling are still largely uncertain. Here we evaluate several TES that occurred during the past 20 years on a transform plate boundary in North Iceland. We show that the swarms complement each other spatially with later swarms discouraged from fault segments activated by earlier swarms, which suggests efficient strain release and aseismic slip. The fault area illuminated by earthquakes during swarms may be more representative of the total moment release than the cumulative moment of the swarm earthquakes. We use these findings and other published results from a variety of tectonic settings to discuss general scaling properties for TES. The results indicate that the importance of TES in releasing tectonic strain at plate boundaries may have been underestimated.

  15. Engineering works for increasing earthquake resistance of Hamaoka nuclear power plant

    International Nuclear Information System (INIS)

    Oonishi, Yoshihiro; Kondou, Makoto; Hattori, Kazushi

    2007-01-01

    The ground-improvement works for the outdoor piping and duct system of Hamaoka-3, one of the engineering works for increasing the earthquake resistance of the plant, are reported. The movable outdoor piping systems were relocated. The SJ method, one of the high-pressure jet mixing methods, was used to improve the ground between the duct and the unmoved light oil tank on the western side, as well as the surrounding ground. The other places were improved by concrete replacement works. The ground treated by the SJ method showed high stiffness and continuity. The outline of the engineering works, the execution of the concrete replacement works, the high-pressure jet mixing (SJ) method, the quality control, and the treatment of the mud generated by the SJ method are reported. A seismic response analysis, execution facilities, construction planning, working diagrams, the improvement work conditions of the three methods, and the steps of the SJ method are illustrated. (S.Y.)

  16. A refined Frequency Domain Decomposition tool for structural modal monitoring in earthquake engineering

    Science.gov (United States)

    Pioldi, Fabio; Rizzi, Egidio

    2017-07-01

    Output-only structural identification is developed by a refined Frequency Domain Decomposition ( rFDD) approach, towards assessing current modal properties of heavy-damped buildings (in terms of identification challenge), under strong ground motions. Structural responses from earthquake excitations are taken as input signals for the identification algorithm. A new dedicated computational procedure, based on coupled Chebyshev Type II bandpass filters, is outlined for the effective estimation of natural frequencies, mode shapes and modal damping ratios. The identification technique is also coupled with a Gabor Wavelet Transform, resulting in an effective and self-contained time-frequency analysis framework. Simulated response signals generated by shear-type frames (with variable structural features) are used as a necessary validation condition. In this context use is made of a complete set of seismic records taken from the FEMA P695 database, i.e. all 44 "Far-Field" (22 NS, 22 WE) earthquake signals. The modal estimates are statistically compared to their target values, proving the accuracy of the developed algorithm in providing prompt and accurate estimates of all current strong ground motion modal parameters. At this stage, such analysis tool may be employed for convenient application in the realm of Earthquake Engineering, towards potential Structural Health Monitoring and damage detection purposes.
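
    A minimal sketch of the two ingredients named above, a Chebyshev Type II bandpass filter and the core Frequency Domain Decomposition step (singular value decomposition of the cross-spectral density matrix), built on SciPy; this is a plain FDD outline, not the authors' refined rFDD algorithm, and all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy import signal

    def cheby2_bandpass(data, fs, low, high, order=8, rs=40):
        """Zero-phase Chebyshev Type II bandpass filtering of a multi-channel record."""
        sos = signal.cheby2(order, rs, [low, high], btype="bandpass", fs=fs, output="sos")
        return signal.sosfiltfilt(sos, data, axis=-1)

    def fdd_first_singular_value(responses, fs, nperseg=1024):
        """Plain FDD: SVD of the cross-spectral density matrix at each frequency.
        Peaks of the first singular value indicate the structure's natural frequencies.
        responses has shape (n_channels, n_samples)."""
        n_ch = responses.shape[0]
        f, _ = signal.csd(responses[0], responses[0], fs=fs, nperseg=nperseg)
        G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
        for i in range(n_ch):
            for j in range(n_ch):
                _, G[:, i, j] = signal.csd(responses[i], responses[j], fs=fs, nperseg=nperseg)
        s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(len(f))])
        return f, s1
    ```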

  17. Underground water stress release models

    Science.gov (United States)

    Li, Yong; Dang, Shenjun; Lü, Shaochuan

    2011-08-01

    The accumulation of tectonic stress may cause earthquakes at some epochs. However, in most cases, it leads to crustal deformations. Underground water level is a sensitive indication of the crustal deformations. We incorporate the information of the underground water level into the stress release models (SRM), and obtain the underground water stress release model (USRM). We apply USRM to the earthquakes occurred at Tangshan region. The analysis shows that the underground water stress release model outperforms both Poisson model and stress release model. Monte Carlo simulation shows that the simulated seismicity by USRM is very close to the real seismicity.
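
    A hedged sketch of the conditional intensity of a classical stress release model of the kind extended above: lambda(t) = exp(a + b*X(t)), with X(t) the tectonic loading minus the stress released by past events; the water-level term is only a schematic stand-in for the USRM coupling, and all parameter names and values are illustrative.

    ```python
    import math

    def srm_intensity(t, quake_times, quake_stress_drops, a, b, loading_rate,
                      c=0.0, water_level=0.0):
        """Conditional earthquake-occurrence intensity of a stress release model:
        lambda(t) = exp(a + b * X(t) [+ c * water_level]), where
        X(t) = loading_rate * t - (stress released by earthquakes before t)."""
        released = sum(s for tq, s in zip(quake_times, quake_stress_drops) if tq < t)
        x = loading_rate * t - released
        return math.exp(a + b * x + c * water_level)

    # Illustrative values only.
    print(srm_intensity(t=50.0, quake_times=[10.0, 30.0], quake_stress_drops=[0.8, 0.5],
                        a=-2.0, b=0.1, loading_rate=0.05))
    ```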

  18. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  19. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    Science.gov (United States)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

    Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, Southern California Earthquake Center and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008, as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a reality that are regularly discussed. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels from individuals and families, to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake

  20. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  1. Moment-ratio imaging of seismic regions for earthquake prediction

    Science.gov (United States)

    Lomnitz, Cinna

    1993-10-01

    An algorithm for predicting large earthquakes is proposed. The reciprocal ratio (mri) of the residual seismic moment to the total moment release in a region is used for imaging seismic moment precursors. Peaks in mri predict recent major earthquakes, including the 1985 Michoacan, 1985 central Chile, and 1992 Eureka, California earthquakes.

  2. The ShakeOut scenario: A hypothetical Mw7.8 earthquake on the Southern San Andreas Fault

    Science.gov (United States)

    Porter, K.; Jones, L.; Cox, D.; Goltz, J.; Hudnut, K.; Mileti, D.; Perry, S.; Ponti, D.; Reichle, M.; Rose, A.Z.; Scawthorn, C.R.; Seligson, H.A.; Shoaf, K.I.; Treiman, J.; Wein, A.

    2011-01-01

    In 2008, an earthquake-planning scenario document was released by the U.S. Geological Survey (USGS) and California Geological Survey that hypothesizes the occurrence and effects of a Mw7.8 earthquake on the southern San Andreas Fault. It was created by more than 300 scientists and engineers. Fault offsets reach 13 m and up to 8 m at lifeline crossings. Physics-based modeling was used to generate maps of shaking intensity, with peak ground velocities of 3 m/sec near the fault and exceeding 0.5 m/sec over 10,000 km². A custom HAZUS-MH analysis and 18 special studies were performed to characterize the effects of the earthquake on the built environment. The scenario posits 1,800 deaths and 53,000 injuries requiring emergency room care. Approximately 1,600 fires are ignited, resulting in the destruction of 200 million square feet of the building stock, the equivalent of 133,000 single-family homes. Fire contributes $87 billion in property and business interruption loss, out of the total $191 billion in economic loss, with most of the rest coming from shake-related building and content damage ($46 billion) and business interruption loss from water outages ($24 billion). Emergency response activities are depicted in detail, in an innovative grid showing activities versus time, a new format introduced in this study. © 2011, Earthquake Engineering Research Institute.

  3. Thermal Radiation Anomalies Associated with Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulinets, Sergey; Kafatos, Menas C.; Taylor, Patrick

    2017-01-01

    Recent developments of remote sensing methods for Earth satellite data analysis contribute to our understanding of earthquake-related thermal anomalies. It was realized that the thermal heat fluxes over areas of earthquake preparation are a result of air ionization by radon (and other gases) and consequent water vapor condensation on newly formed ions. Latent heat (LH) is released as a result of this process and leads to the formation of local thermal radiation anomalies (TRA) known as OLR (outgoing longwave radiation; Ouzounov et al., 2007). We compare the LH energy, obtained by integrating surface latent heat flux (SLHF) over the area and time, with the released energies associated with these events. Extended studies of the TRA using the data from the most recent major earthquakes allowed establishing their main morphological features. It was also established that the TRA are part of a more complex chain of short-term pre-earthquake processes, which is explained within the framework of lithosphere-atmosphere coupling processes.

  4. Quantifying slip balance in the earthquake cycle: Coseismic slip model constrained by interseismic coupling

    KAUST Repository

    Wang, Lifeng; Hainzl, Sebastian; Mai, Paul Martin

    2015-01-01

    The long-term slip on faults has to follow, on average, the plate motion, while slip deficit is accumulated over shorter time scales (e.g., between the large earthquakes). Accumulated slip deficits eventually have to be released by earthquakes and aseismic processes. In this study, we propose a new inversion approach for coseismic slip, taking interseismic slip deficit as prior information. We assume a linear correlation between coseismic slip and interseismic slip deficit, and invert for the coefficients that link the coseismic displacements to the required strain accumulation time and seismic release level of the earthquake. We apply our approach to the 2011 M9 Tohoku-Oki earthquake and the 2004 M6 Parkfield earthquake. Under the assumption that the largest slip almost fully releases the local strain (as indicated by borehole measurements, Lin et al., 2013), our results suggest that the strain accumulated along the Tohoku-Oki earthquake segment has been almost fully released during the 2011 M9 rupture. The remaining slip deficit can be attributed to the postseismic processes. Similar conclusions can be drawn for the 2004 M6 Parkfield earthquake. We also estimate the required time of strain accumulation for the 2004 M6 Parkfield earthquake to be ~25 years (confidence interval of [17, 43] years), consistent with the observed average recurrence time of ~22 years for M6 earthquakes in Parkfield. For the Tohoku-Oki earthquake, we estimate the recurrence time of ~500-700 years. This new inversion approach for evaluating slip balance can be generally applied to any earthquake for which dense geodetic measurements are available.

  5. Quantifying slip balance in the earthquake cycle: Coseismic slip model constrained by interseismic coupling

    KAUST Repository

    Wang, Lifeng

    2015-11-11

    The long-term slip on faults has to follow, on average, the plate motion, while slip deficit is accumulated over shorter time scales (e.g., between the large earthquakes). Accumulated slip deficits eventually have to be released by earthquakes and aseismic processes. In this study, we propose a new inversion approach for coseismic slip, taking interseismic slip deficit as prior information. We assume a linear correlation between coseismic slip and interseismic slip deficit, and invert for the coefficients that link the coseismic displacements to the required strain accumulation time and seismic release level of the earthquake. We apply our approach to the 2011 M9 Tohoku-Oki earthquake and the 2004 M6 Parkfield earthquake. Under the assumption that the largest slip almost fully releases the local strain (as indicated by borehole measurements, Lin et al., 2013), our results suggest that the strain accumulated along the Tohoku-Oki earthquake segment has been almost fully released during the 2011 M9 rupture. The remaining slip deficit can be attributed to the postseismic processes. Similar conclusions can be drawn for the 2004 M6 Parkfield earthquake. We also estimate the required time of strain accumulation for the 2004 M6 Parkfield earthquake to be ~25 years (confidence interval of [17, 43] years), consistent with the observed average recurrence time of ~22 years for M6 earthquakes in Parkfield. For the Tohoku-Oki earthquake, we estimate the recurrence time of ~500-700 years. This new inversion approach for evaluating slip balance can be generally applied to any earthquake for which dense geodetic measurements are available.
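
    A back-of-the-envelope sketch of the slip-balance accounting discussed above: the interseismic accumulation time implied by a given coseismic slip, coupling fraction and plate rate. The numbers below are illustrative Parkfield-like values, not the inversion results of the study.

    ```python
    def accumulation_time_years(coseismic_slip_m, coupling, plate_rate_m_per_yr):
        """Years of interseismic loading needed to accumulate the given coseismic slip,
        assuming a constant coupling fraction of the plate rate builds slip deficit."""
        return coseismic_slip_m / (coupling * plate_rate_m_per_yr)

    # Illustrative assumptions: ~0.5 m of coseismic slip, full coupling, ~2 cm/yr loading.
    print(accumulation_time_years(0.5, coupling=1.0, plate_rate_m_per_yr=0.02))  # ~25 yr
    ```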

  6. The threat of silent earthquakes

    Science.gov (United States)

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  7. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    Science.gov (United States)

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  8. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    Science.gov (United States)

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have for over two decades stoked seismologists' hopes of successfully predicting an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample, which did release a significant portion of the stress along its fault segment and yields a substantial change in b-values.
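
    A minimal sketch of the Gutenberg-Richter extrapolation implied by forecasting the M6 rate from microseismicity b-values; the a- and b-values below are placeholders chosen for illustration, not the Parkfield estimates.

    ```python
    def annual_rate_at_or_above(m_target, a_value, b_value):
        """Gutenberg-Richter extrapolation: N(>=M) = 10**(a - b*M) events per year."""
        return 10 ** (a_value - b_value * m_target)

    def mean_recurrence_years(m_target, a_value, b_value):
        """Mean recurrence time implied by the extrapolated annual rate."""
        return 1.0 / annual_rate_at_or_above(m_target, a_value, b_value)

    # Placeholder a/b values chosen so that M>=6 recurs roughly every 25 years.
    print(mean_recurrence_years(6.0, a_value=4.6, b_value=1.0))
    ```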

  9. Refresher Course on Physics of Earthquakes -98 ...

    Indian Academy of Sciences (India)

    The objective of this course is to help teachers gain an understanding of the earthquake phenomenon and the physical processes involved in its genesis, as well as of the earthquake waves which propagate the energy released by the earthquake rupture outward from the source. The Course will begin with mathematical ...

  10. The HayWired Earthquake Scenario

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    ForewordThe 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done.With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk.The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  11. The 2007 Mentawai earthquake sequence on the Sumatra megathrust

    Science.gov (United States)

    Konca, A.; Avouac, J.; Sladen, A.; Meltzner, A. J.; Kositsky, A. P.; Sieh, K.; Fang, P.; Li, Z.; Galetzka, J.; Genrich, J.; Chlieh, M.; Natawidjaja, D. H.; Bock, Y.; Fielding, E. J.; Helmberger, D. V.

    2008-12-01

    The Sumatra Megathrust has recently produced a flurry of large interplate earthquakes starting with the giant Mw 9.15 Aceh earthquake of 2004. All of these earthquakes occurred within the area monitored by the Sumatra Geodetic Array (SuGAr), which provided exceptional records of near-field co-seismic and postseismic ground displacements. The most recent of these major earthquakes, an Mw 8.4 earthquake and an Mw 7.9 earthquake twelve hours later, occurred in the Mentawai islands area where devastating historical earthquakes had happened in 1797 and 1833. The 2007 earthquake sequence provides an exceptional opportunity to understand the variability of the earthquakes along megathrusts and their relation to interseismic coupling. The InSAR, GPS and teleseismic modeling shows that the 2007 earthquakes ruptured a fraction of the strongly coupled Mentawai patch of the megathrust, which is also only a fraction of the 1833 rupture area. It also released a much smaller moment than the one released in 1833, or than the deficit of moment that has accumulated since. Both earthquakes of 2007 consist of 2 sub-events which are 50 to 100 km apart from each other. On the other hand, the northernmost slip patch of the Mw 8.4 earthquake and the southern slip patch of the Mw 7.9 earthquake abut each other, yet they ruptured 12 hours apart. Sunda megathrust earthquakes of recent years include a rupture of a strongly coupled patch that closely mimics a prior rupture of that patch and which is well correlated with the interseismic coupling pattern (Nias-Simeulue section), as well as a rupture sequence of a strongly coupled patch that differs substantially in its details from its most recent predecessors (Mentawai section). We conclude that (1) seismic asperities are probably persistent features which arise from heterogeneous strain build-up in the interseismic period; and (2) the same portion of a megathrust can rupture in different ways depending on whether asperities break as isolated events or cooperate to produce

  12. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    Science.gov (United States)

    Bostenaru Dan, M.

    2009-04-01

    mitigation will be presented. The session includes contributions showing methodological and modelling approaches from scientists in the geophysical/seismological, hydrological, remote sensing, civil engineering, insurance, and urbanism fields, amongst others, as well as presentations from practitioners working on specific case studies, regarding analysis of recent events and their impact on cities as well as re-evaluation of past events from the point of view of long-term recovery. The 2005 call for papers stated: Most strategies for both preparedness and emergency management in disaster mitigation are related to urban planning. While the natural, engineering and social sciences contribute to the evaluation of the impact of earthquakes and their secondary events (including tsunamis, earthquake-triggered landslides, or fire), floods, landslides, high winds, and volcanic eruptions on urban areas, it is the instruments of urban planning that are to be employed both for visualisation and for the development and implementation of strategy concepts for pre- and post-disaster intervention. The evolution of natural systems towards extreme conditions is taken into consideration insofar as it concerns the damaging impact on urban areas and infrastructure and the impact on the natural environment of interventions to reduce such damaging impact.

  13. Failures and suggestions in Earthquake forecasting and prediction

    Science.gov (United States)

    Sacks, S. I.

    2013-12-01

    Seismologists have had poor success in earthquake prediction. However, wide-ranging observations from earlier great earthquakes show that precursory data can exist. In particular, two aspects seem promising. In agreement with simple physical modeling, b-values decrease in highly loaded fault zones for years before failure. Potentially more usefully, in high-stress regions the breakdown of dilatant patches leading to failure can yield observations related to expelled water. The volume increase (dilatancy) caused by high shear stresses decreases the pore pressure. Eventually, water flows back in, restoring the pore pressure, promoting failure and expelling the extra water. Of course, in a generally stressed region there may be many small patches that fail, such as observed before the 1975 Haicheng earthquake. Only a few days before the major event will most of the dilatancy breakdown occur in the fault zone itself, as it did before the destructive 1976 Tangshan event. 'Water release' effects have been observed before the 1923 great Kanto earthquake, the 1984 Yamasaki event, the 1975 Haicheng and the 1976 Tangshan earthquakes, and also the 1995 Kobe earthquake. While there are obvious difficulties in water release observations, not least because there is currently no observational network anywhere, historical data do suggest some promise if we broaden our approach to this difficult subject.

  14. Initiatives to Reduce Earthquake Risk of Developing Countries

    Science.gov (United States)

    Tucker, B. E.

    2008-12-01

    The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. One would oversee the design and construction of

  15. How fault geometry controls earthquake magnitude

    Science.gov (United States)

    Bletery, Q.; Thomas, A.; Karlstrom, L.; Rempel, A. W.; Sladen, A.; De Barros, L.

    2016-12-01

    Recent large megathrust earthquakes, such as the Mw9.3 Sumatra-Andaman earthquake in 2004 and the Mw9.0 Tohoku-Oki earthquake in 2011, astonished the scientific community. The first event occurred in a relatively low-convergence-rate subduction zone where events of its size were unexpected. The second event involved 60 m of shallow slip in a region thought to be aseismically creeping and hence incapable of hosting very large magnitude earthquakes. These earthquakes highlight gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution. Here we show that gradients in dip angle exert a primary control on mega-earthquake occurrence. We calculate the curvature along the major subduction zones of the world and show that past mega-earthquakes occurred on flat (low-curvature) interfaces. A simplified analytic model demonstrates that shear strength heterogeneity increases with curvature. Stress loading on flat megathrusts is more homogeneous and hence more likely to be released simultaneously over large areas than on highly-curved faults. Therefore, the absence of asperities on large faults might counter-intuitively be a source of higher hazard.

  16. A Decade of Giant Earthquakes - What does it mean?

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Terry C. Jr. [Los Alamos National Laboratory

    2012-07-16

    On December 26, 2004 the largest earthquake since 1964 occurred near Aceh, Indonesia. The magnitude 9.2 earthquake and subsequent tsunami killed a quarter of a million people; it also marked the beginning of a period of extraordinary seismicity. Since the Aceh earthquake there have been 16 magnitude 8 earthquakes globally, including 2 this last April. For the 100 years previous to 2004 there was an average of 1 magnitude 8 earthquake every 2.2 years; since 2004 there have been 2 per year. Since magnitude 8 earthquakes dominate global seismic energy release, this period of seismicity has seismologists rethinking what they understand about plate tectonics and the connectivity between giant earthquakes. This talk will explore this remarkable period of time and its possible implications.

  17. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has a moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the past in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, last but not least, as an example of a recently carried-out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  18. Release mechanisms from shallow engineered trenches used as repositories for radioactive wastes

    International Nuclear Information System (INIS)

    Locke, J.; Wood, E.

    1987-05-01

    This report has been written for the Department of the Environment as part of their radioactive waste management research programme. The aim has been to identify release mechanisms of radioactivity from fully engineered trenches of the LAND 2 type and to identify the data needed for their assessment. No direct experimental work has been involved. The report starts with a brief background to UK strategy and outlines a basic disposal system. It gives reviews of existing experience of low-level radioactive waste disposal from LAND 1 trenches and of UK experience of toxic waste disposal to provide a practical basis for the next section, which covers the implications of identified release mechanisms on the design requirements for an engineered trench. From these design requirements and their interaction with potential site conditions (both saturated and unsaturated zone sites are considered) an assessment of radionuclide release mechanisms is made. (author)

  19. Earthquake magnitude estimation using the τc and Pd method for earthquake early warning systems

    Science.gov (United States)

    Jin, Xing; Zhang, Hongcai; Li, Jun; Wei, Yongxiang; Ma, Qiang

    2013-10-01

    Earthquake early warning (EEW) systems are one of the most effective ways to reduce earthquake disasters. Earthquake magnitude estimation is one of the most important and also the most difficult parts of the entire EEW system. In this paper, based on 142 earthquake events and 253 seismic records that were recorded by the KiK-net in Japan, and aftershocks of the large Wenchuan earthquake in Sichuan, we obtained earthquake magnitude estimation relationships using the τc and Pd methods. The standard deviations of the magnitude calculations of these two formulas are ±0.65 and ±0.56, respectively. The Pd value can also be used to estimate the peak ground velocity, so that warning information can be released to the public rapidly, according to the estimation results. In order to ensure the stability and reliability of the magnitude estimation results, we propose a compatibility test according to the nature of these two parameters. The reliability of the early warning information is significantly improved through this test.
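
    A minimal sketch of the two P-wave parameters named above, computed from the first seconds of the vertical displacement and velocity records; the regression coefficients mapping τc to magnitude are placeholders, since the fitted values are not quoted in this record.

    ```python
    import numpy as np

    def tau_c(displacement, velocity, dt, window_s=3.0):
        """Kanamori's tau_c = 2*pi*sqrt(int(u^2 dt) / int(udot^2 dt)) over the first
        window_s seconds after the P arrival (u displacement, udot velocity)."""
        n = int(window_s / dt)
        u, v = displacement[:n], velocity[:n]
        return 2.0 * np.pi * np.sqrt(np.trapz(u**2, dx=dt) / np.trapz(v**2, dx=dt))

    def p_d(displacement, dt, window_s=3.0):
        """Pd: peak absolute displacement within the first window_s seconds after the P arrival."""
        n = int(window_s / dt)
        return np.max(np.abs(displacement[:n]))

    def magnitude_from_tau_c(tc, a=3.4, b=5.0):
        """Linear regression M = a*log10(tau_c) + b; a and b are placeholder coefficients."""
        return a * np.log10(tc) + b
    ```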

  20. Ionospheric Anomaly before Kyushu, Japan Earthquake

    Directory of Open Access Journals (Sweden)

    YANG Li

    2017-05-01

    Full Text Available GIM data released by IGS are used in this article, and a new method combining the Sliding Time Window Method and the ionospheric TEC correlation analysis method of adjacent grid points is proposed to study the relationship between pre-earthquake ionospheric anomalies and the earthquake. By analyzing the change of TEC at the 5 grid points around the seismic region, an abnormal change of ionospheric TEC is found before the earthquake, and the correlation between the TEC sequences of the grid points is significantly affected by the earthquake. Based on the analysis of the spatial distribution of the TEC anomaly, anomalies of 6 h, 12 h and 6 h were found near the epicenter three days before the earthquake. Finally, ionospheric tomography is used to perform a tomographic inversion of the electron density, and the distribution of the electron density in the ionospheric anomaly is further analyzed.
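
    A hedged sketch of a sliding-time-window anomaly detector of the kind described above: a TEC sample is flagged when it leaves the mean ± k·σ band of the preceding window, and adjacent grid points are compared through their correlation. The window length and threshold are illustrative, not the paper's settings.

    ```python
    import numpy as np

    def sliding_window_anomalies(tec, window=30, k=2.0):
        """Flag TEC samples falling outside mean +/- k*std of the preceding window."""
        tec = np.asarray(tec, dtype=float)
        flags = np.zeros(tec.shape, dtype=bool)
        for i in range(window, len(tec)):
            ref = tec[i - window:i]
            mu, sigma = ref.mean(), ref.std()
            flags[i] = abs(tec[i] - mu) > k * sigma
        return flags

    def grid_point_correlation(tec_a, tec_b):
        """Correlation between the TEC series of two adjacent grid points."""
        return np.corrcoef(tec_a, tec_b)[0, 1]
    ```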

  1. Earthquake: Game-based learning for 21st century STEM education

    Science.gov (United States)

    Perkins, Abigail Christine

    To play is to learn. A lack of empirical research within game-based learning literature, however, has hindered educational stakeholders to make informed decisions about game-based learning for 21st century STEM education. In this study, I modified a research and development (R&D) process to create a collaborative-competitive educational board game illuminating elements of earthquake engineering. I oriented instruction- and game-design principles around 21st century science education to adapt the R&D process to develop the educational game, Earthquake. As part of the R&D, I evaluated Earthquake for empirical evidence to support the claim that game-play results in student gains in critical thinking, scientific argumentation, metacognitive abilities, and earthquake engineering content knowledge. I developed Earthquake with the aid of eight focus groups with varying levels of expertise in science education research, teaching, administration, and game-design. After developing a functional prototype, I pilot-tested Earthquake with teacher-participants (n=14) who engaged in semi-structured interviews after their game-play. I analyzed teacher interviews with constant comparison methodology. I used teachers' comments and feedback from content knowledge experts to integrate game modifications, implementing results to improve Earthquake. I added player roles, simplified phrasing on cards, and produced an introductory video. I then administered the modified Earthquake game to two groups of high school student-participants (n = 6), who played twice. To seek evidence documenting support for my knowledge claim, I analyzed videotapes of students' game-play using a game-based learning checklist. My assessment of learning gains revealed increases in all categories of students' performance: critical thinking, metacognition, scientific argumentation, and earthquake engineering content knowledge acquisition. Players in both student-groups improved mostly in critical thinking, having

  2. Impact of Dissociation and Sensible Heat Release on Pulse Detonation and Gas Turbine Engine Performance

    Science.gov (United States)

    Povinelli, Louis A.

    2001-01-01

    A thermodynamic cycle analysis of the effect of sensible heat release on the relative performance of pulse detonation and gas turbine engines is presented. Dissociation losses in the PDE (Pulse Detonation Engine) are found to cause a substantial decrease in engine performance parameters.

  3. An Experimental Investigation on the Combustion and Heat Release Characteristics of an Opposed-Piston Folded-Cranktrain Diesel Engine

    Directory of Open Access Journals (Sweden)

    Fukang Ma

    2015-06-01

    Full Text Available In opposed-piston folded-cranktrain diesel engines, the relative motion of the opposed pistons, the combustion chamber components and the injector position differ from those of conventional diesel engines. The combustion and heat release characteristics of an opposed-piston folded-cranktrain diesel engine under different operating conditions were investigated. Four phases: ignition delay, premixed combustion, diffusion combustion and after-combustion are used to describe the heat release process of the engine. Load changes have a small effect on the premixed combustion duration, while they influence the diffusion combustion duration significantly. The heat release process shows more significant isochoric and isobaric combustion than in a conventional diesel engine, except at high exhaust pressure and temperature, owing to the engine's two-stroke and uniflow-scavenging characteristics. Meanwhile, relatively high-quality exhaust heat energy is produced in opposed-piston folded-cranktrain diesel engines.
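
    The phase decomposition above is usually based on an apparent heat-release-rate calculation from measured cylinder pressure; the single-zone formula below is the standard textbook form (a sketch only, not the instrumentation-specific procedure of the study).

    ```python
    import numpy as np

    def apparent_heat_release_rate(pressure, volume, gamma=1.35):
        """Single-zone apparent heat release rate per unit crank angle:
        dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta + 1/(gamma-1) * V * dp/dtheta,
        with pressure [Pa] and volume [m^3] sampled on a uniform crank-angle grid."""
        dp = np.gradient(pressure)   # dp/dtheta
        dv = np.gradient(volume)     # dV/dtheta
        return gamma / (gamma - 1.0) * pressure * dv + 1.0 / (gamma - 1.0) * volume * dp
    ```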

  4. Short presentation on some researches activities about near field earthquakes

    International Nuclear Information System (INIS)

    Donald, John

    2002-01-01

    The major hazard posed by earthquakes is often thought to be due to moderate to large magnitude events. However, there have been many cases where earthquakes of moderate and even small magnitude have caused very significant destruction when they have coincided with population centres. Even though the area of intense ground shaking caused by such events is generally small, the epicentral motions can be severe enough to cause damage even in well-engineered structures. Two issues are addressed here, the first being the identification of the minimum earthquake magnitude likely to cause damage to engineered structures and the limits of the near-field for small-to-moderate magnitude earthquakes. The second issue addressed is whether features of near-field ground motions such as directivity, which can significantly enhance the destructive potential, occur in small-to-moderate magnitude events. The accelerograms from the 1986 San Salvador (El Salvador) earthquake indicate that it may be non-conservative to assume that near-field directivity effects only need to be considered for earthquakes of moment magnitude M 6.5 and greater. (author)

  5. Dynamic Model for the Stocks and Release Flows of Engineered Nanomaterials.

    Science.gov (United States)

    Song, Runsheng; Qin, Yuwei; Suh, Sangwon; Keller, Arturo A

    2017-11-07

    Most existing life-cycle release models for engineered nanomaterials (ENM) are static, ignoring the dynamics of stocks and flows of ENMs. Our model, nanoRelease, estimates the annual releases of ENMs from manufacturing, use, and disposal of a product, explicitly taking stock and flow dynamics into account. Given the variabilities in key parameters (e.g., service life of products and annual release rate during use), nanoRelease is designed as a stochastic model. We apply nanoRelease to three ENMs (TiO2, SiO2 and FeOx) used in paints and coatings through seven product applications, including construction and building, household and furniture, and automotive, for the period from 2000 to 2020, using production volume and market projection information. We also consider model uncertainties using Monte Carlo simulation. Compared with 2016, the total annual releases of ENMs in 2020 will increase by 34-40%, and the stock will increase by 28-34%. The fraction of end-of-life release among total release flows will increase from 11% in 2002 to 43% in 2020. As compared to static models, our dynamic model predicts about an order of magnitude lower values for the amount of ENM released from this sector in the near term, while stock continues to build up in the system.
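
    A hedged sketch of a dynamic stock-and-flow release model in the spirit of nanoRelease: annual production enters the in-use stock, a fraction is released during use, and production cohorts retire after their service life with an end-of-life release fraction. In a Monte Carlo run the service life and release rates would be sampled from distributions; all values and names here are illustrative, not the model's actual parameters.

    ```python
    import numpy as np

    def dynamic_release(production_by_year, service_life_yr=10, use_release_rate=0.02,
                        eol_release_rate=0.1):
        """Track in-use stock, in-use releases and end-of-life releases year by year."""
        years = len(production_by_year)
        stock = np.zeros(years)
        use_release = np.zeros(years)
        eol_release = np.zeros(years)
        current_stock = 0.0
        for t in range(years):
            current_stock += production_by_year[t]          # new production enters the stock
            use_release[t] = use_release_rate * current_stock
            current_stock -= use_release[t]
            if t >= service_life_yr:                        # cohort from service_life_yr ago retires
                retired = min(production_by_year[t - service_life_yr], current_stock)
                eol_release[t] = eol_release_rate * retired
                current_stock -= retired
            stock[t] = current_stock
        return stock, use_release, eol_release
    ```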

  6. Open field release of genetically engineered sterile male Aedes aegypti in Malaysia.

    Directory of Open Access Journals (Sweden)

    Renaud Lacroix

    Full Text Available BACKGROUND: Dengue is the most important mosquito-borne viral disease. In the absence of specific drugs or vaccines, control focuses on suppressing the principal mosquito vector, Aedes aegypti, yet current methods have not proven adequate to control the disease. New methods are therefore urgently needed, for example genetics-based sterile-male-release methods. However, this requires that lab-reared, modified mosquitoes be able to survive and disperse adequately in the field. METHODOLOGY/PRINCIPAL FINDINGS: Adult male mosquitoes were released into an uninhabited forested area of Pahang, Malaysia. Their survival and dispersal were assessed by use of a network of traps. Two strains were used, an engineered 'genetically sterile' strain (OX513A) and a wild-type laboratory strain, to give both absolute and relative data about the performance of the modified mosquitoes. The two strains had similar maximum dispersal distances (220 m), but the mean distance travelled of the OX513A strain was lower (52 vs. 100 m). Life expectancy was similar (2.0 vs. 2.2 days). Recapture rates were high for both strains, possibly because of the uninhabited nature of the site. CONCLUSIONS/SIGNIFICANCE: After extensive contained studies and regulatory scrutiny, a field release of engineered mosquitoes was safely and successfully conducted in Malaysia. The engineered strain showed similar field longevity to an unmodified counterpart, though in this setting dispersal was reduced relative to the unmodified strain. These data are encouraging for the future testing and implementation of genetic control strategies and will help guide future field use of this and other engineered strains.

  7. Fuel effects on knock, heat releases and CARS temperatures in a spark ignition engine

    NARCIS (Netherlands)

    Kalghatgi, G.T.; Golombok, M.; Snowdon, P.

    1995-01-01

    Net heat release, knock characteristics and temperature were derived from in-cylinder pressure and end-gas CARS measurements for different fuels in a single-cylinder engine. The maximum net heat release rate resulting from the final phase of autoignition is closely associated with knock intensity.

  8. Earthquake data base for Romania

    International Nuclear Information System (INIS)

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.

    2002-01-01

    A new earthquake database for Romania is being constructed, comprising complete, up-to-date earthquake information in a user-friendly and rapidly accessible form. One main component of the database is the catalogue of earthquakes that have occurred in Romania from 984 up to the present. The catalogue contains information related to locations and other source parameters, when available, and links to waveforms of important earthquakes. The other very important component is the 'strong motion database', developed for strong intermediate-depth Vrancea earthquakes for which instrumental data were recorded. Different parameters characterizing strong motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response-spectrum-based intensities, were computed and entered into this database. Also included is information on the recording seismic stations, such as maps giving their positions, photographs of the instruments, and site conditions ('free-field' or on buildings). Through the volume and quality of the gathered data, as well as its user-friendly interface, the Romanian earthquake database provides a very useful tool for the geosciences and civil engineering in their effort towards reducing seismic risk in Romania. (authors)

  9. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base, which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  10. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
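
    In the spirit of the empirical model mentioned above, expected fatalities can be written as the population exposed at each shaking intensity multiplied by a country-specific, intensity-dependent fatality rate; the lognormal rate function and the parameter values in this sketch are hypothetical placeholders, not PAGER's calibrated coefficients.

```python
import numpy as np
from scipy.stats import norm

def expected_fatalities(exposure_by_mmi, theta=13.0, beta=0.25):
    """Empirical-style loss estimate: sum over intensities of
    (fatality rate at that intensity) x (population exposed at that intensity).
    theta, beta are hypothetical lognormal fatality-rate parameters."""
    total = 0.0
    for mmi, population in exposure_by_mmi.items():
        rate = norm.cdf(np.log(mmi / theta) / beta)
        total += rate * population
    return total

# Hypothetical exposure (population per MMI level) for a single event
print(f"{expected_fatalities({6: 2e6, 7: 5e5, 8: 1e5}):.0f}")
```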

  11. Surface-Engineered Nanocontainers Based on Molecular Self-Assembly and Their Release of Methenamine

    Directory of Open Access Journals (Sweden)

    Minghui Zhang

    2018-02-01

    Full Text Available The mixing of polymers and nanoparticles is opening pathways for engineering flexible composites that exhibit advantageous functional properties. To fabricate controllably assembling nanocomposites that efficiently encapsulate methenamine and release it on demand, we selectively functionalized the surface of natural halloysite nanotubes (HNTs) with a polymerizable gemini surfactant that has peculiar aggregation behavior, aiming to endow the nanomaterials with self-assembly and stimulus-responsive characteristics. The micromorphology, grafted components and functional groups were identified using transmission electron microscopy (TEM), thermogravimetric analysis (TGA), Fourier transform infrared (FTIR) spectroscopy, and X-ray photoelectron spectroscopy (XPS). The created nanocomposites presented various methenamine release characteristics depending on differences in surface composition. It is particularly worth mentioning that the controlled release became more efficient as the proportion of geminized monomer increased, which is reasonably attributed to the positively charged, strongly hydrophobic amphiphilic geminized moieties interacting with the outer and inner surfaces in different ways: they form a polymeric shell that acts as a release stopper at the nanotube ends and a polymer brush inside the nanotube lumen that immobilizes the guest. Meanwhile, the nanocomposites present temperature- and salinity-responsive characteristics for the release of methenamine. The combination of HNTs with conjugated functional polymers will open pathways for engineering flexible composites that are promising for application in controlled release fields.

  12. A 30-year history of earthquake crisis communication in California and lessons for the future

    Science.gov (United States)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, concerning the probability (approximately 5%) that an M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the following 30 years, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time needed to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording and the public response, and compare them with social science research about successful crisis communication, to create recommendations for future advisories.

  13. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security's Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story of earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  14. Mimicking Neurotransmitter Release in Chemical Synapses via Hysteresis Engineering in MoS2 Transistors.

    Science.gov (United States)

    Arnold, Andrew J; Razavieh, Ali; Nasr, Joseph R; Schulman, Daniel S; Eichfeld, Chad M; Das, Saptarshi

    2017-03-28

    Neurotransmitter release in chemical synapses is fundamental to diverse brain functions such as motor action, learning, cognition, emotion, perception, and consciousness. Moreover, improper functioning or abnormal release of neurotransmitter is associated with numerous neurological disorders such as epilepsy, sclerosis, schizophrenia, Alzheimer's disease, and Parkinson's disease. We have utilized hysteresis engineering in a back-gated MoS2 field effect transistor (FET) in order to mimic such neurotransmitter release dynamics in chemical synapses. All three essential features, i.e., quantal, stochastic, and excitatory or inhibitory nature of neurotransmitter release, were accurately captured in our experimental demonstration. We also mimicked an important phenomenon called long-term potentiation (LTP), which forms the basis of human memory. Finally, we demonstrated how to engineer the LTP time by operating the MoS2 FET in different regimes. Our findings could provide a critical component toward the design of next-generation smart and intelligent human-like machines and human-machine interfaces.

  15. Idaho National Engineering Laboratory release criteria for decontamination and decommissioning

    International Nuclear Information System (INIS)

    Dolenc, M.R.; Case, M.J.

    1986-01-01

    Criteria have been developed for release of Idaho National Engineering Laboratory (INEL) facilities and land areas following decontamination and decommissioning (D and D). Decommissioning release criteria in the form of dose guidelines were proposed by the US Nuclear Regulatory Commission as early as 1980. These criteria were used on an interim basis for INEL D and D projects. However, dose guidelines alone do not adequately cover the criteria necessary to release sites for unrestricted use. In actual practice, other parameters such as pathways analyses, sampling and instrumentation techniques, and implementation procedures are required to develop the basis for unrestricted release of a site. Thus, a rigorous approach for evaluating these other parameters is needed to develop acceptable D and D release criteria. Because of the complex and sensitive nature of the dose and pathways analyses work, a thorough review by experts in those respective fields was desired. Input and support in preparing or reviewing each part of the criteria development task were solicited from several DOE field offices. Experts were identified and contracted to assist in preparing portions of the release criteria, or to serve on a peer-review committee. Thus, the entire release criteria development task was thoroughly reviewed by recognized experts from each DOE field office, to validate the technical content of the INEL site-specific document.

  16. Real-time earthquake data feasible

    Science.gov (United States)

    Bush, Susan

    Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Sciences' Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts—an early warning system that would give a few seconds' warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  17. Physics of Earthquake Rupture Propagation

    Science.gov (United States)

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  18. Design basis earthquakes for critical industrial facilities and their characteristics, and the Southern Hyogo prefecture earthquake, 17 January 1995

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Heki

    1998-12-01

    This paper deals with how to establish the concept of the design basis earthquake (DBE) for critical industrial facilities, such as nuclear power plants, in consideration of disasters such as the Southern Hyogo prefecture earthquake, the so-called Kobe earthquake, of 1995. The author once discussed various DBEs at the 7th World Conference on Earthquake Engineering. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared the values of accelerations of a structure obtained by various codes in Japan and other countries. The maximum PGA recorded instrumentally during the Southern Hyogo prefecture earthquake in 1995 exceeded the author's previous assumption, even though the results of the previous paper had been considered pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third design earthquake, Ss, in addition to the S1 and S2 of previous DBEs.

  19. Fabrication and characterization of a rapid prototyped tissue engineering scaffold with embedded multicomponent matrix for controlled drug release

    DEFF Research Database (Denmark)

    Chen, Muwan; Le, Dang Q S; Hein, San

    2012-01-01

    Bone tissue engineering implants with sustained local drug delivery provide an opportunity for better postoperative care for bone tumor patients because these implants offer sustained drug release at the tumor site and reduce systemic side effects. A rapid prototyped macroporous polycaprolactone..., this scaffold can fulfill the requirements for both bone tissue engineering and local sustained release of an anticancer drug in vitro. These results suggest that the scaffold can be used clinically in reconstructive surgery after bone tumor resection. Moreover, by changing the composition and amount... of individual components, the scaffold can find application in other tissue engineering areas that need local sustained release of drug...

  20. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    Science.gov (United States)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope must write tedious application code to combine potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real-time.
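
    As a plain-Python stand-in for the field-style window query described above (not the authors' Spark implementation), the sketch below keeps a rolling buffer per GPS station and reports the maximum displacement over the latest window of samples; the station identifiers, units and threshold are hypothetical.

```python
from collections import defaultdict, deque

def window_max(samples, window=30):
    """samples : iterable of (station_id, time, displacement_mm).
    Yields (station_id, time, max displacement over the last `window` samples
    of that station's stream)."""
    buffers = defaultdict(lambda: deque(maxlen=window))
    for sid, t, disp in samples:
        buffers[sid].append(disp)
        yield sid, t, max(buffers[sid])

# Hypothetical usage: flag stations whose windowed maximum exceeds a threshold
stream = [("STA1", 0.0, 1.2), ("STA1", 1.0, 35.0), ("STA2", 1.0, 2.1)]
events = [(s, t, m) for s, t, m in window_max(stream, window=2) if m > 20.0]
```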

  1. The Alaska earthquake, March 27, 1964: lessons and conclusions

    Science.gov (United States)

    Eckel, Edwin B.

    1970-01-01

    One of the greatest earthquakes of all time struck south-central Alaska on March 27, 1964. Strong motion lasted longer than for most recorded earthquakes, and more land surface was dislocated, vertically and horizontally, than by any known previous temblor. Never before were so many effects on earth processes and on the works of man available for study by scientists and engineers over so great an area. The seismic vibrations, which directly or indirectly caused most of the damage, were but surface manifestations of a great geologic event-the dislocation of a huge segment of the crust along a deeply buried fault whose nature and even exact location are still subjects for speculation. Not only was the land surface tilted by the great tectonic event beneath it, with resultant seismic sea waves that traversed the entire Pacific, but an enormous mass of land and sea floor moved several tens of feet horizontally toward the Gulf of Alaska. Downslope mass movements of rock, earth, and snow were initiated. Subaqueous slides along lake shores and seacoasts, near-horizontal movements of mobilized soil (“landspreading”), and giant translatory slides in sensitive clay did the most damage and provided the most new knowledge as to the origin, mechanics, and possible means of control or avoidance of such movements. The slopes of most of the deltas that slid in 1964, and that produced destructive local waves, are still as steep or steeper than they were before the earthquake and hence would be unstable or metastable in the event of another great earthquake. Rockslide avalanches provided new evidence that such masses may travel on cushions of compressed air, but a widely held theory that glaciers surge after an earthquake has not been substantiated. Innumerable ground fissures, many of them marked by copious emissions of water, caused much damage in towns and along transportation routes. Vibration also consolidated loose granular materials. In some coastal areas, local

  2. Gas and Dust Phenomena of Mega-earthquakes and the Cause

    Science.gov (United States)

    Yue, Z.

    2013-12-01

    A mega-earthquake suddenly releases a large to extremely large amount of kinetic energy within a few tens to two hundred seconds, over distances of ten to hundreds of kilometres in the Earth's crust and on the ground surface. It also generates seismic waves that can be recorded globally and co-seismic ground damage such as co-seismic ruptures and landslides. However, such vast, dramatic and devastating kinetic actions in the Earth's crustal rocks and ground soils cannot be known or predicted weeks, days, hours, or minutes before they happen. Although seismologists can develop and use seismometers to report the locations and magnitudes of earthquakes within minutes of their occurrence, they cannot at present predict earthquakes. Therefore, damaging earthquakes have caused and will continue to cause huge disasters, fatalities and injuries. This problem may indicate that it is necessary to re-examine the cause of mega-earthquakes in addition to the conventional explanation of elastic rebound on active faults. In the last ten years, many mega-earthquakes occurred in China and around the Pacific Ocean and caused many casualties and devastating damage to the environment. The author will give a brief review of the impacts of the mega-earthquakes that happened in recent years. He will then present many gas- and dust-related phenomena associated with the sudden occurrence of these mega-earthquakes. They include the 2001 Kunlunshan Earthquake M8.1, 2008 Wenchuan Earthquake M8.0 and the 2010 Yushu Earthquake M7.1 in China, the 2010 Haiti Earthquake M7.0, the 2010 Mexicali Earthquake M7.2, the 2010 Chile Earthquake M8.8, the 2011 Christchurch earthquake M6.3 and the 2011 Japan Earthquake M9.0 around the Pacific Ocean. He will discuss the cause of these gas- and dust-related phenomena, and will use these phenomena and their common cause to show that the earthquakes were caused by the rapid migration and expansion of highly compressed and

  3. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    Science.gov (United States)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on a rapid estimate of the P-wave magnitude, which generally carries large uncertainties and suffers from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning for the event with M7.2 after 25 s. The subsequent magnitude updates even decreased to M6.3-6.6. The magnitude estimate finally stabilized at M8.1 after about two minutes, which consequently led to underestimated tsunami heights. Using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would have been theoretically possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique can help to reduce false and missed warnings, and should therefore play an important role in future tsunami early warning and earthquake rapid response systems.

  4. Effect of Engineered Nanoparticles on Exopolymeric Substances Release from Marine Phytoplankton

    Science.gov (United States)

    Chiu, Meng-Hsuen; Khan, Zafir A.; Garcia, Santiago G.; Le, Andre D.; Kagiri, Agnes; Ramos, Javier; Tsai, Shih-Ming; Drobenaire, Hunter W.; Santschi, Peter H.; Quigg, Antonietta; Chin, Wei-Chun

    2017-12-01

    Engineered nanoparticles (ENPs), products of modern nanotechnologies, can potentially impact the marine environment and pose serious threats to marine ecosystems. However, the cellular responses of marine phytoplankton to ENPs are still not well established. Here, we investigate four different diatom species (Odontella mobiliensis, Skeletonema grethae, Phaeodactylum tricornutum, Thalassiosira pseudonana) and one green alga (Dunaliella tertiolecta) for their extracellular polymeric substances (EPS) release under model ENP treatments: 25 nm titanium dioxide (TiO2), 10-20 nm silicon dioxide (SiO2), and 15-30 nm cerium dioxide (CeO2). We found that SiO2 ENPs can significantly stimulate EPS release from these algae (200-800%), while TiO2 ENP exposure induced the lowest release. Furthermore, an increase in intracellular Ca2+ concentration can be triggered by ENPs, suggesting that the EPS release process is mediated through Ca2+ signalling pathways. With a better understanding of the cellular mechanisms of ENP-induced EPS release, potential preventative and safety measures can be developed to mitigate negative impacts on the marine ecosystem.

  5. The January 17, 1994 Northridge Earthquake: Effects on selected industrial facilities and lifelines

    Energy Technology Data Exchange (ETDEWEB)

    Eli, M.W.; Sommer, S.C. [Lawrence Livermore National Lab., CA (United States); Roche, T.R.; Merz, K.L.

    1995-02-01

    Revision 0 of this report is being published in February 1995 to closely mark the one-year anniversary of the Northridge Earthquake. A September 1994 Draft version of the report was reviewed by DOE and NRC, and many of the review comments are incorporated into Revision 0. While this revision of the report is not entirely complete, it is being made available for comment, review, and evaluation. Since the report was written by several authors, sections of the report have slightly different styles. Several sections of Revision 0 are not complete, but are planned to be completed in Revision 1. The primary unfinished section is Section 3.3 on Electric Power Transmission. Other sections of Revision 0, such as Section 4.5.2 on the Energy Technology Engineering Center and 3.2 on Electric Power Generation, will be enhanced with further detailed information as it becomes available. In addition, further data, including processed response spectra for investigated facilities and cataloging of relay performance, will be added to Revision 1 depending upon investigation support. While Revision 0 of this report is being published by LLNL, Revision 1 is planned to be published by EPRI. The anticipated release date for Revision 1 is December 1995. Unfortunately, the one-year anniversary of the Northridge Earthquake was also marked by the devastating Hyogo-Ken Nanbu (or Hanshin-Awaji) Earthquake in Kobe, Japan. As compared to the Northridge Earthquake, there were many more deaths, collapsed structures, destroyed lifelines, and fires following the Kobe Earthquake. Lessons from the Kobe Earthquake will both reemphasize topics discussed in this report and provide further issues to be addressed when designing and retrofitting structures, systems, and components for seismic strong motion.

  6. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    International Nuclear Information System (INIS)

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and respective probabilities were calculated through quantification. By combining the sequences failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs
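
    The fragility evaluation step referred to above is commonly expressed as a lognormal capacity curve; the sketch below evaluates a component failure probability for a given seismic demand, with the median capacity and logarithmic standard deviations chosen purely for illustration, not taken from the PREPP analysis.

```python
from math import log, sqrt
from scipy.stats import norm

def failure_probability(demand_g, median_capacity_g, beta_r=0.3, beta_u=0.3):
    """Lognormal fragility: P(fail | demand) = Phi(ln(demand/median)/beta_c),
    with beta_c combining aleatory (beta_r) and epistemic (beta_u) variability."""
    beta_c = sqrt(beta_r**2 + beta_u**2)
    return norm.cdf(log(demand_g / median_capacity_g) / beta_c)

# Hypothetical component: 1.2 g median capacity under a 0.8 g DBE demand
print(f"{failure_probability(0.8, 1.2):.2f}")
```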

  7. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    Energy Technology Data Exchange (ETDEWEB)

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and respective probabilities were calculated through quantification. By combining the sequences failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.

  8. Lower bound earthquake magnitude for probabilistic seismic hazard evaluation

    International Nuclear Information System (INIS)

    McCann, M.W. Jr.; Reed, J.W.

    1990-01-01

    This paper presents the results of a study that develops an engineering and seismological basis for selecting a lower-bound magnitude (LBM) for use in seismic hazard assessment. As part of a seismic hazard analysis the range of earthquake magnitudes that are included in the assessment of the probability of exceedance of ground motion must be defined. The upper-bound magnitude is established by earth science experts based on their interpretation of the maximum size of earthquakes that can be generated by a seismic source. The lower-bound or smallest earthquake that is considered in the analysis must also be specified. The LBM limits the earthquakes that are considered in assessing the probability that specified ground motion levels are exceeded. In the past there has not been a direct consideration of the appropriate LBM value that should be used in a seismic hazard assessment. This study specifically looks at the selection of a LBM for use in seismic hazard analyses that are input to the evaluation/design of nuclear power plants (NPPs). Topics addressed in the evaluation of a LBM are earthquake experience data at heavy industrial facilities, engineering characteristics of ground motions associated with small-magnitude earthquakes, probabilistic seismic risk assessments (seismic PRAs), and seismic margin evaluations. The results of this study and the recommendations concerning a LBM for use in seismic hazard assessments are discussed. (orig.)
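
    To see why the choice of lower-bound magnitude matters numerically, the sketch below counts the annual rate of earthquakes entering the hazard integral from a truncated Gutenberg-Richter source; the a, b and maximum-magnitude values are hypothetical, not values from the study.

```python
def annual_rate_above(m_min, a=4.0, b=1.0, m_max=7.5):
    """Annual rate of events with magnitude >= m_min from a Gutenberg-Richter
    source, N(>=m) = 10**(a - b*m), truncated at m_max (hypothetical values)."""
    return 10 ** (a - b * m_min) - 10 ** (a - b * m_max)

for lbm in (4.0, 4.5, 5.0, 5.5):
    print(f"LBM {lbm}: {annual_rate_above(lbm):.3f} events/yr in the hazard integral")
```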

  9. Performance and heat release analysis of a pilot-ignited natural gas engine

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, S.R.; Biruduganti, M.; Mo, Y.; Bell, S.R.; Midkiff, K.C. [Alabama Univ., Dept. of Mechanical Engineering, Tuscaloosa, AL (United States)

    2002-09-01

    The influence of engine operating variables on the performance, emissions and heat release in a compression ignition engine operating in normal diesel and dual-fuel modes (with natural gas fuelling) was investigated. Substantial reductions in NOx emissions were obtained with dual-fuel engine operation. There was a corresponding increase in unburned hydrocarbon emissions as the substitution of natural gas was increased. Brake specific energy consumption decreased with natural gas substitution at high loads but increased at low loads. Experimental results at fixed pilot injection timing have also established the importance of intake manifold pressure and temperature in improving dual-fuel performance and emissions at part load. (Author)

  10. State of the art of earthquake engineering in nuclear power plant design

    International Nuclear Information System (INIS)

    Schildknecht, P.O.

    1976-12-01

    A brief outline of definitions based on the USNRC, Seismic and Geologic Siting Criteria for Nuclear Power Plants, and on the plate tectonics and earthquake terminology is given. An introduction into plate tectonics and the associated earthquake phenomena is then presented. Ground motion characteristics are described in connection with the selection of design earthquakes. Mathematical methods of dynamic structural analyses are discussed for linear and nonlinear systems. Response analysis techniques for nuclear power plants are explained considering soil-structure interaction effects. (Auth.)

  11. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi

    2000-06-01

    Full Text Available In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research and a longer time span, 461 B.C. to 1990 (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive in the whole world, and hence that our catalogue could be of interest to a wider international readership, Italian was clearly not the appropriate language in which to share this experience with colleagues from foreign countries. Three years after publication of the second release, therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology and its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  12. National Earthquake Hazards Program at a Crossroads

    Science.gov (United States)

    Showstack, Randy

    The U.S. National Earthquake Hazards Reduction Program, which turns 25 years old on 1 October 2003, is passing through two major transitions, which experts said either could weaken or strengthen the program. On 1 March, a federal government reorganization placed NEHRP's lead agency, the Federal Emergency Management Agency (FEMA), within the new Department of Homeland Security (DHS). A number of earthquake scientists and engineers expressed concern that NEHRP, which already faces budgetary and organizational challenges, and lacks visibility, could end up being marginalized in the bureaucratic shuffle. Some experts, though, as well as agency officials, said they hope DHS will recognize synergies between dealing with earthquakes and terrorist attacks.

  13. Precursory earthquakes of the 1943 eruption of Paricutin volcano, Michoacan, Mexico

    Science.gov (United States)

    Yokoyama, I.; de la Cruz-Reyna, S.

    1990-12-01

    Paricutin volcano is a monogenetic volcano whose birth and growth were observed by modern volcanological techniques. At the time of its birth in 1943, the seismic activity in central Mexico was mainly recorded by the Wiechert seismographs at the Tacubaya seismic station in Mexico City, about 320 km east of the volcano area. In this paper we aim to identify characteristics of the precursory earthquakes of this monogenetic eruption, although there are limits to the available information, such as imprecise hypocenter locations and the lack of earthquake data with magnitudes under 3.0. The available data show that the first precursory earthquake occurred on January 7, 1943, with a magnitude of 4.4. Subsequently, 21 earthquakes ranging from 3.2 to 4.5 in magnitude occurred before the outbreak of the eruption on February 20. The (S - P) durations of the precursory earthquakes do not show any systematic changes within the observational errors. The hypocenters were rather shallow and did not migrate. The precursory earthquakes had a characteristic tectonic signature, which was retained through the whole period of activity. However, the spectra of the P-waves of the Paricutin earthquakes show minor differences from those of tectonic earthquakes. This fact helped in the identification of Paricutin earthquakes. Except for the first shock, the maximum earthquake magnitudes show an increasing tendency with time towards the outbreak. The total seismic energy released by the precursory earthquakes amounted to 2 × 10^19 ergs. Considering that statistically there is a threshold of cumulative seismic energy release (10^17-10^18 ergs) by precursory earthquakes in polygenetic volcanoes erupting after long quiescence, the above cumulative energy is exceptionally large. This suggests that a monogenetic volcano may need much more energy to clear the way for magma passage to the earth's surface than a polygenetic one. The magma ascent before the outbreak of Paricutin volcano is interpretable by a model
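
    The cumulative energy figure quoted above follows from the Gutenberg-Richter energy-magnitude relation log10 E[erg] = 11.8 + 1.5 M summed over the precursory shocks; the magnitude list in the sketch below is a hypothetical stand-in for the 22 recorded events, chosen only to show the order of magnitude.

```python
def energy_erg(magnitude):
    """Gutenberg-Richter energy-magnitude relation: log10 E[erg] = 11.8 + 1.5*M."""
    return 10 ** (11.8 + 1.5 * magnitude)

# Hypothetical magnitudes standing in for the 22 precursory earthquakes (3.2-4.5)
mags = [4.4, 4.5, 4.2, 4.0] + [3.6] * 18
print(f"cumulative energy ~ {sum(energy_erg(m) for m in mags):.1e} erg")  # ~1e19 erg
```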

  14. Living with earthquakes - development and usage of earthquake-resistant construction methods in European and Asian Antiquity

    Science.gov (United States)

    Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes

    2010-05-01

    Earthquakes are among the most terrifying events of nature because of their unexpected occurrence, against which no spiritual means offer protection. The only way of preserving life and property is to apply earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps embedded in lead casing to hold together columns and masonry walls during the frequent earthquakes of the Aegean region. Elastic steel provided strength, while the plastic lead casing absorbed minor shifts of blocks without fracturing the rigid stone. The Romans invented concrete and built buildings of all sizes as single, inflexible units. The masonry surrounding and decorating the concrete core of a wall did not bear load. Concrete resisted minor shaking, yielding only to forces higher than its fracture limit. Roman building traditions survived the Dark Ages, and 12th century Crusader castles erected in earthquake-prone Syria survive until today in reasonably good condition. Concrete and steel clamping persisted side by side in the Roman Empire. Concrete was used for cheap construction compared with masonry. Applying lead-encased steel increased costs and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether the frequent recurrence of earthquakes in the Naples region was known to the inhabitants of Pompeii may be a matter of debate. Seemingly the shock of the AD 62 earthquake was not enough to prompt the application of well-known protective engineering methods throughout the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of the 8th-9th century Hindu masonry shrines around Yogyakarta would have allowed blocks to scatter during earthquakes. To prevent dilapidation, an intricate mortise-and-tenon system was carved into adjacent faces of blocks. Only the

  15. Fault roughness and strength heterogeneity control earthquake size and stress drop

    KAUST Repository

    Zielke, Olaf

    2017-01-13

    An earthquake's stress drop is related to the frictional breakdown during sliding and constitutes a fundamental quantity of the rupture process. High-speed laboratory friction experiments that emulate the rupture process imply stress drop values that greatly exceed those commonly reported for natural earthquakes. We hypothesize that this stress drop discrepancy is due to fault-surface roughness and strength heterogeneity: an earthquake's moment release and its recurrence probability depend not only on stress drop and rupture dimension but also on the geometric roughness of the ruptured fault and the location of failing strength asperities along it. Using large-scale numerical simulations for earthquake ruptures under varying roughness and strength conditions, we verify our hypothesis, showing that smoother faults may generate larger earthquakes than rougher faults under identical tectonic loading conditions. We further discuss the potential impact of fault roughness on earthquake recurrence probability. This finding provides important information, also for seismic hazard analysis.
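
    The dependence of moment release on stress drop and rupture dimension can be made concrete with the Eshelby circular-crack relation M0 = (16/7)·Δσ·r³ and the Hanks-Kanamori moment magnitude; the stress drop and rupture radii below are illustrative values only, not results from the simulations.

```python
from math import log10

def moment_circular_crack(stress_drop_pa, radius_m):
    """Eshelby circular crack: M0 = (16/7) * delta_sigma * r**3 [N*m]."""
    return 16.0 / 7.0 * stress_drop_pa * radius_m**3

def moment_magnitude(m0_nm):
    """Hanks-Kanamori: Mw = (2/3) * (log10(M0[N*m]) - 9.1)."""
    return (2.0 / 3.0) * (log10(m0_nm) - 9.1)

# The same 3 MPa stress drop on a 5 km versus a 10 km radius rupture
for r in (5e3, 10e3):
    m0 = moment_circular_crack(3e6, r)
    print(f"r = {r/1e3:.0f} km -> M0 = {m0:.2e} N*m, Mw = {moment_magnitude(m0):.1f}")
```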

  16. Earthquake Risk Mitigation in the Tokyo Metropolitan area

    Science.gov (United States)

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

    2010-12-01

    Seismic disaster risk mitigation in urban areas constitutes a challenge through collaboration of scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults, developing dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake resistant and proof structures; and cross-discipline infrastructure for effective risk mitigation just after catastrophic events. Risk mitigation strategy for the next greater earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern because it caused past mega-thrust earthquakes, such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9) which had 105,000 fatalities. A M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that the M7+ earthquake will cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) economic loss. This earthquake is evaluated to occur with a probability of 70% in 30 years by the Earthquake Research Committee of Japan. In order to mitigate disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social-scientists in nationwide institutions. The results that are obtained in the respective fields will be integrated until project termination to improve information on the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. Discussion is extended to our effort in progress and

  17. Fighting and preventing post-earthquake fires in nuclear power plant

    International Nuclear Information System (INIS)

    Lu Xuefeng; Zhang Xin

    2011-01-01

    Nuclear power plant post-earthquake fires cause not only personnel injury and severe economic loss, but also serious environmental pollution. At present, nuclear power is developing rapidly in China. Considering the earthquake-prone character of the country, it is of great engineering importance to investigate nuclear power plant post-earthquake fires. This article analyzes the causes, influential factors and development characteristics of nuclear power plant post-earthquake fires in detail, and summarizes three principles that should be followed in fighting and preventing such fires: solving problems in order of importance and urgency, isolation prior to prevention, and immediate repair with regular patrol. Three aspects that require particular attention in fighting and preventing post-earthquake fires are also pointed out. (authors)

  18. Performance and efficiency evaluation and heat release study of a direct-injection stratified-charge rotary engine

    Science.gov (United States)

    Nguyen, H. L.; Addy, H. E.; Bond, T. H.; Lee, C. M.; Chun, K. S.

    1987-01-01

    A computer simulation which models engine performance of the Direct Injection Stratified Charge (DISC) rotary engines was used to study the effect of variations in engine design and operating parameters on engine performance and efficiency of an Outboard Marine Corporation (OMC) experimental rotary combustion engine. Engine pressure data were used in a heat release analysis to study the effects of heat transfer, leakage, and crevice flows. Predicted engine data were compared with experimental test data over a range of engine speeds and loads. An examination of methods to improve the performance of the rotary engine using advanced heat engine concepts such as faster combustion, reduced leakage, and turbocharging is also presented.

  19. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    earthquakes and deep-focus earthquakes are the energy release caused by the slip or flow of rocks following a jamming-unjamming transition. (4) The energetics and impending precursors of earthquake: The energy of earthquake is the kinetic energy released from the jamming-unjamming transition. Calculation shows that the kinetic energy of seismic rock sliding is comparable with the total work demanded for rocks’ shear failure and overcoming of frictional resistance. There will be no heat flow paradox. Meanwhile, some valuable seismic precursors are likely to be identified by observing the accumulation of additional tectonic forces, local geological changes, as well as the effect of rock state changes, etc.

  20. Prediction of strong earthquake motions on rock surface using evolutionary process models

    International Nuclear Information System (INIS)

    Kameda, H.; Sugito, M.

    1984-01-01

    Stochastic process models are developed for the prediction of strong earthquake motions for engineering design purposes. Earthquake motions with nonstationary frequency content are modeled using the concept of evolutionary processes. Discussion focuses on earthquake motions on bedrock, which are important for the construction of nuclear power plants in seismic regions. On this basis, two earthquake motion prediction models are developed: one (the EMP-IB Model) for prediction with given magnitude and epicentral distance, and the other (the EMP-IIB Model) to account for successive fault ruptures and the site location relative to the fault of great earthquakes. (Author) [pt
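
    A crude surrogate for a nonstationary (evolutionary) ground-motion process, and not the EMP-IB/IIB models themselves, is envelope-modulated band-passed white noise; the envelope shape, band limits and time step below are arbitrary illustrative choices.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def simulated_motion(duration=20.0, dt=0.01, band=(0.5, 10.0), seed=0):
    """Return a time vector and an envelope-modulated, band-limited noise record.
    Note the frequency content here is stationary; true evolutionary models also
    let the spectrum change with time."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, dt)
    envelope = (t / 2.0) ** 2 * np.exp(-t / 2.0)     # build-up then decay
    envelope /= envelope.max()
    b, a = butter(4, band, btype="band", fs=1.0 / dt)
    noise = filtfilt(b, a, rng.standard_normal(t.size))
    return t, envelope * noise
```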

  1. Finite element simulation of earthquake cycle dynamics for continental listric fault system

    Science.gov (United States)

    Wei, T.; Shen, Z. K.

    2017-12-01

    We simulate stress/strain evolution through earthquake cycles for a continental listric fault system using the finite element method. A 2-D lithosphere model is developed, with the upper crust composed of plasto-elastic materials and the lower crust/upper mantle composed of visco-elastic materials. The medium is cut by a listric fault that roots into the visco-elastic lower crust at its downdip end. The system is driven laterally by constant tectonic loading, and slip on the fault is controlled by rate-and-state friction. We start with a simple static/dynamic friction law and drive the system through multiple earthquake cycles. Our preliminary results show that: (a) the periodicity of the earthquake cycles is strongly modulated by the static/dynamic friction, with longer periods correlated with higher static friction and lower dynamic friction; (b) the periodicity of earthquakes is a function of fault depth, with less frequent events of greater magnitude occurring at shallower depth; and (c) rupture on the fault cannot release all the tectonic stress in the system; residual stress accumulates in the hanging wall block at shallow depth close to the fault and has to be released either by conjugate faulting or by inelastic folding. We are in the process of exploring different rheologic structures and friction laws and examining their effects on earthquake behavior and deformation patterns. The results will be applied to specific earthquakes and fault zones such as the 2008 great Wenchuan earthquake on the Longmen Shan fault system.
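
    The loading/failure cycles produced by a static/dynamic friction law can be reduced to a one-dimensional spring-slider analogue; the sketch below is that minimal analogue (stiffness, friction coefficients and loading rate are arbitrary), not the finite element model itself.

```python
import numpy as np

def stick_slip(k=1.0, v_load=1.0, mu_s=0.6, mu_d=0.4, normal=1.0,
               dt=0.01, steps=5000):
    """1-D spring-slider with static/dynamic friction. The spring is loaded at a
    constant rate; when the shear force reaches the static strength it drops to
    the dynamic level, giving a stress drop of (mu_s - mu_d)*normal per event."""
    force = 0.0
    times, history = [], []
    for i in range(steps):
        force += k * v_load * dt              # steady tectonic loading
        if force >= mu_s * normal:            # failure ("earthquake")
            force = mu_d * normal
        times.append(i * dt)
        history.append(force)
    return np.array(times), np.array(history)
```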

  2. Estimation of Natural Frequencies During Earthquakes

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Rytter, A

    1997-01-01

    This paper presents two different recursive prediction error method (RPEM) implementations of multivariate Auto-Regressive Moving-Average (ARMAV) models for identification of a time-variant civil engineering structure subject to an earthquake. The two techniques are tested on measurements made
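
    A batch least-squares AR(2) fit, a simpler stand-in for the recursive ARMAV identification described above, already shows how a natural frequency is recovered from a measured response record; the sampling interval and the record itself are assumed inputs.

```python
import numpy as np

def natural_frequency_ar2(y, dt):
    """Fit y[t] = a1*y[t-1] + a2*y[t-2] + e[t] by least squares and convert the
    complex pole angle to a (damped) natural frequency in Hz."""
    rows = np.column_stack([y[1:-1], y[:-2]])
    a1, a2 = np.linalg.lstsq(rows, y[2:], rcond=None)[0]
    poles = np.roots([1.0, -a1, -a2])
    return float(np.abs(np.angle(poles[0])) / (2.0 * np.pi * dt))
```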

  3. Explanation of earthquake response spectra

    OpenAIRE

    Douglas, John

    2017-01-01

    This is a set of five slides explaining how earthquake response spectra are derived from strong-motion records and simple models of structures and their purpose within seismic design and assessment. It dates from about 2002 and I have used it in various introductory lectures on engineering seismology.
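
    The derivation the slides cover can be summarised in a few lines of code: a response spectrum is the peak response of a family of damped single-degree-of-freedom oscillators driven by the recorded ground acceleration. The sketch below uses constant-average-acceleration Newmark integration and assumes the accelerogram, time step and period grid as inputs.

```python
import numpy as np

def response_spectrum(acc, dt, periods, zeta=0.05):
    """Pseudo-acceleration response spectrum Sa(T) of a ground acceleration
    record `acc` [m/s^2] sampled at `dt` [s], for the given periods [s]."""
    sa = []
    for T in periods:
        wn = 2.0 * np.pi / T
        k, c = wn**2, 2.0 * zeta * wn          # unit-mass oscillator
        kh = k + 2.0 * c / dt + 4.0 / dt**2    # Newmark (gamma=1/2, beta=1/4)
        u = v = 0.0
        a = -acc[0]                            # initial relative acceleration
        umax = 0.0
        for ag in acc[1:]:
            rhs = -ag + (4.0 / dt**2 + 2.0 * c / dt) * u + (4.0 / dt + c) * v + a
            u_new = rhs / kh
            v_new = 2.0 * (u_new - u) / dt - v
            a = -(ag + c * v_new + k * u_new)
            u, v = u_new, v_new
            umax = max(umax, abs(u))
        sa.append(wn**2 * umax)                # pseudo-spectral acceleration
    return np.array(sa)
```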

  4. Coseismic deformation of the 2001 El Salvador and 2002 Denali fault earthquakes from GPS geodetic measurements

    Science.gov (United States)

    Hreinsdottir, Sigrun

    2005-07-01

    GPS geodetic measurements are used to study two major earthquakes, the 2001 MW 7.7 El Salvador and 2002 MW 7.9 Denali Fault earthquakes. The 2001 MW 7.7 earthquake was a normal fault event in the subducting Cocos plate offshore El Salvador. Coseismic displacements of up to 15 mm were measured at permanent GPS stations in Central America. The GPS data were used to constrain the location of and slip on the normal fault. One month later an MW 6.6 strike-slip earthquake occurred in the overriding Caribbean plate. Coulomb stress changes estimated from the MW 7.7 earthquake suggest that it triggered the MW 6.6 earthquake. Coseismic displacement from the MW 6.6 earthquake, about 40 mm at a GPS station in El Salvador, indicates that the earthquake triggered additional slip on a fault close to the GPS station. The MW 6.6 earthquake further changed the stress field in the overriding Caribbean plate, with triggered seismic activity occurring west and possibly also to the east of the rupture in the days to months following the earthquake. The MW 7.9 Denali Fault earthquake ruptured three faults in the interior of Alaska. It initiated with a thrust motion on the Susitna Glacier fault but then ruptured the Denali and Totschunda faults with predominantly right-lateral strike-slip motion unilaterally from west to east. GPS data measured in the two weeks following the earthquake suggest a complex coseismic rupture along the faults with two main regions of moment release along the Denali fault. A large amount of additional data were collected in the year following the earthquake, which greatly improved the resolution on the fault, revealing more details of the slip distribution. We estimate a total moment release of 6.81 x 10^20 Nm in the earthquake with an MW 7.2 thrust subevent on the Susitna Glacier fault. The slip on the Denali fault is highly variable, with 4 main pulses of moment release. The largest moment pulse corresponds to an MW 7.5 subevent, about 40 km west of the Denali

  5. What is a surprise earthquake? The example of the 2002, San Giuliano (Italy) event

    Directory of Open Access Journals (Sweden)

    M. Mucciarelli

    2005-06-01

    Full Text Available Both in the scientific literature and in the mass media, some earthquakes are defined as «surprise earthquakes». Based on his own judgment, probably any geologist, seismologist or engineer may have his own list of past «surprise earthquakes». This paper tries to quantify the underlying individual perception that may lead a scientist to apply such a definition to a seismic event. The meaning is different depending on the disciplinary approach. For geologists, the Italian database of seismogenic sources is still too incomplete to allow for a quantitative estimate of the subjective degree of belief. For seismologists, quantification is possible by defining the distance between an earthquake and its closest previous neighbour. Finally, for engineers, the San Giuliano quake could not be considered a surprise, since probabilistic site hazard estimates reveal that the change before and after the earthquake is just 4%.

  6. Estimated airborne release of radionuclides from the Battelle Memorial Institute Columbus Laboratories JN-1b building at the West Jefferson site as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.

    1981-11-01

    The potential airborne releases of radionuclides (source terms) that could result from wind and earthquake damage are estimated for the Battelle Memorial Institute Columbus Laboratories JN-1b Building at the West Jefferson site in Ohio. The estimated source terms are based on the damage to barriers containing the radionuclides, the inventory of radionuclides at risk, and the fraction of the inventory made airborne as a result of the loss of containment. In an attempt to provide a realistic range of potential source terms that includes most of the normal operating conditions, a best estimate bounded by upper and lower limits is calculated by combining the upper-bound, best-estimate, and lower-bound inventories-at-risk with an airborne release factor (upper-bound, best-estimate, and lower-bound if possible) for the situation. The factors used to evaluate the fractional airborne release of materials and the exchange rates between enclosed and exterior atmospheres are discussed. The postulated damage and source terms are discussed for wind and earthquake hazard scenarios in order of their increasing severity.
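
    The bounding arithmetic described above reduces to multiplying an inventory at risk by an airborne release factor for lower-bound, best-estimate and upper-bound inputs. The numbers in the sketch below are invented placeholders, not values from the report.

```python
# Source-term bounding arithmetic: airborne release = inventory at risk x
# airborne release factor, evaluated per bounding case. All numbers are made up.
inventory_at_risk = {"lower": 1.0e2, "best": 5.0e2, "upper": 1.0e3}        # e.g. Ci
airborne_release_factor = {"lower": 1.0e-4, "best": 1.0e-3, "upper": 1.0e-2}

for case in ("lower", "best", "upper"):
    source_term = inventory_at_risk[case] * airborne_release_factor[case]
    print(f"{case:5s} estimate: {source_term:.2e} Ci made airborne")
```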

  7. A new mechanistic and engineering fission gas release model for a uranium dioxide fuel

    International Nuclear Information System (INIS)

    Lee, Chan Bock; Yang, Yong Sik; Kim, Dae Ho; Kim, Sun Ki; Bang, Je Geun

    2008-01-01

    A mechanistic and engineering fission gas release model (MEGA) for uranium dioxide (UO2) fuel was developed. It is based upon the diffusional release of fission gases from inside the grain to the grain boundary and the release of fission gases from the grain boundary to the external surface through the interconnection of the fission gas bubbles in the grain boundary. The capability of the MEGA model was validated by comparison with the fission gas release database and by sensitivity analyses of the parameters. It was found that the MEGA model correctly predicts the fission gas release over a broad range of fuel burnups up to 98 MWd/kgU. In particular, the enhancement of fission gas release in high-burnup fuel, and the reduction of fission gas release at high burnup obtained by increasing the UO2 grain size, were found to be correctly predicted by the MEGA model without using any artificial factor. (author)

  8. Approaches to evaluating weathering effects on release of engineered nanomaterials from solid matrices

    Science.gov (United States)

    Increased production and use of engineered nanomaterials (ENMs) over the past decade has increased the potential for the transport and release of these materials into the environment. Here we present results of two separate studies designed to simulate the effects of weathering o...

  9. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  10. Proceedings of the Regional Seminar on Earthquake Engineering (13th) Held in Istanbul, Turkey on 14-24 September 1987.

    Science.gov (United States)

    1987-09-01

    Earthquake Engineering Conference held in San Francisco in July 198-. It is an international collaboration programme designed to mitigate the damage... [The remainder of this record is OCR residue from figures; the recoverable captions refer to field data showing restoring processes of lifeline systems, and to statistics on kitchen fire sources (kerosene) and room layouts.]

  11. Source Release Modeling for the Idaho National Engineering and Environmental Laboratory's Subsurface Disposal Area

    International Nuclear Information System (INIS)

    Becker, B.H.

    2002-01-01

    A source release model was developed to determine the release of contaminants into the shallow subsurface, as part of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) evaluation at the Idaho National Engineering and Environmental Laboratory's (INEEL) Subsurface Disposal Area (SDA). The output of the source release model is used as input to the subsurface transport and biotic uptake models. The model allowed separating the waste into areas that match the actual disposal units. This allows quantitative evaluation of the relative contribution to the total risk and allows evaluation of selective remediation of the disposal units within the SDA

  12. Seismic experience in power and industrial facilities as it relates to small magnitude earthquakes

    International Nuclear Information System (INIS)

    Swan, S.W.; Horstman, N.G.

    1987-01-01

    The data base on the performance of power and industrial facilities in small magnitude earthquakes (M = 4.0 - 5.5) is potentially very large. In California alone many earthquakes in this magnitude range occur every year, often near industrial areas. In 1986 for example, in northern California alone, there were 76 earthquakes between Richter magnitude 4.0 and 5.5. Experience has shown that the effects of small magnitude earthquakes are seldom significant to well-engineered facilities. (The term well-engineered is here defined to include most modern industrial installations, as well as power plants and substations.) Therefore detailed investigations of small magnitude earthquakes are normally not considered worthwhile. The purpose of this paper is to review the tendency toward seismic damage of equipment installations representative of nuclear power plant safety systems. Estimates are made of the thresholds of seismic damage to certain types of equipment in terms of conventional means of measuring the damage potential of an earthquake. The objective is to define thresholds of damage that can be correlated with Richter magnitude. In this manner an earthquake magnitude might be chosen below which damage to nuclear plant safety systems is not considered credible

  13. The origin of high frequency radiation in earthquakes and the geometry of faulting

    Science.gov (United States)

    Madariaga, R.

    2004-12-01

    In a seminal paper of 1967 Kei Aki discovered the scaling law of earthquake spectra and showed that, among other things, the high-frequency decay was of type omega-squared. This implies that high-frequency displacement amplitudes are proportional to a characteristic length of the fault, and radiated energy scales with the cube of the fault dimension, just like seismic moment. Later in the seventies, it was found that a simple explanation for this frequency dependence of spectra was that high frequencies were generated by stopping phases, waves emitted by changes in speed of the rupture front as it propagates along the fault, but this did not explain the scaling of high-frequency waves with fault length. Earthquake energy balance is such that, ignoring attenuation, radiated energy is the change in strain energy minus the energy released in overcoming friction. Until recently the latter was considered to be a material property that did not scale with fault size. Yet, in another classical paper, Aki and Das estimated in the late 70s that the energy release rate also scales with earthquake size, because earthquakes are often stopped by barriers or change rupture speed at them. This observation was independently confirmed in the late 90s by Ide and Takeo and by Olsen et al., who found that energy release rates for Kobe and Landers were on the order of 1 MJ/m2, implying that Gc necessarily scales with earthquake size, because if it were a material property, small earthquakes would never occur. Using both simple analytical and numerical models developed by Adda-Bedia and by Aochi and Madariaga, we examine the consequences of these observations for the scaling of high-frequency waves with fault size. We demonstrate, using classical results by Kostrov, Husseiny and Freund, that high-frequency energy flow measures the energy release rate and is generated when ruptures change velocity (both direction and speed) at fault kinks or jogs. Our results explain why supershear ruptures are
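
    For reference, a Brune-type omega-squared source spectrum (a standard model consistent with the scaling law discussed above, not the authors' own code) has a flat low-frequency level proportional to seismic moment and an f^-2 fall-off above a corner frequency that decreases with fault dimension. The parameter values below (shear-wave speed, stress drop) are generic assumptions.

```python
import numpy as np

def omega_squared_spectrum(f, moment, beta=3500.0, dsigma=3.0e6):
    """Brune-type omega-squared displacement source spectrum.
    moment in N m, beta in m/s, stress drop dsigma in Pa."""
    fc = 0.49 * beta * (dsigma / moment) ** (1.0 / 3.0)   # corner frequency, Hz
    return moment / (1.0 + (f / fc) ** 2), fc

f = np.logspace(-2, 1, 7)                  # 0.01 to 10 Hz
for m0 in (1e15, 1e18):                    # small vs. larger earthquake
    spec, fc = omega_squared_spectrum(f, m0)
    print(f"M0 = {m0:.0e} N m: corner frequency ~ {fc:.2f} Hz, "
          f"low-frequency level ~ {spec[0]:.1e}")
```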

  14. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  15. Earthquake in Japan: The IAEA mission gives its report

    International Nuclear Information System (INIS)

    Anon.

    2007-01-01

    Following the earthquake that occurred on 16 July 2007 in Japan (magnitude 6.6 on the Richter scale), an IAEA mission inspected the Kashiwazaki-Kariwa nuclear power plant at the beginning of August. The mission concluded that the safety of the installation was ensured during and after the earthquake, even though the shaking exceeded the seismic level taken as the design basis for the facility. The systems and components were in a better state than could have been expected after such an earthquake. Releases remained below the authorised thresholds. At the moment of the earthquake, three of the plant's seven reactors were running and shut down automatically. Unit 2, which was starting up, also shut down automatically. Reactors 1, 5 and 6 were shut down for maintenance. Water spilled from the spent fuel storage pool because of the ground shaking; it was collected and discharged to the sea through the release pipe without notable impact on the environment (volume 1.2 m3). About one hundred containers were overturned. Traces of iodine, chromium-51 and cobalt-60 were found in the ventilation filters (reactor 7); these elements were released to the atmosphere in very low quantities. (N.C.)

  16. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  17. The role of post-earthquake structural safety in pre-earthquake retrofit decisions: guidelines and applications

    International Nuclear Information System (INIS)

    Bazzurro, P.; Telleen, K.; Maffei, J.; Yin, J.; Cornell, C.A.

    2009-01-01

    Critical structures such as hospitals, police stations, local administrative office buildings, and critical lifeline facilities are expected to be operational immediately after earthquakes. Any rational decision about whether these structures are strong enough to meet this goal or whether pre-emptive retrofitting is needed cannot be made without an explicit consideration of post-earthquake safety and functionality with respect to aftershocks. The Advanced Seismic Assessment Guidelines offer an improvement over previous methods for seismic evaluation of buildings where post-earthquake safety and usability is a concern. This new method allows engineers to evaluate the likelihood that a structure may have restricted access or no access after an earthquake. The building performance is measured in terms of the post-earthquake occupancy classifications Green Tag, Yellow Tag, and Red Tag, defining these performance levels quantitatively, based on the structure's remaining capacity to withstand aftershocks. These color-coded placards, which constitute an established practice in the US, could be replaced by the standard results of inspections (A to E) performed by the Italian Dept. of Civil Protection after an event. The article also shows some applications of these Guidelines to buildings of the largest utility company in California, Pacific Gas and Electric Company (PGE). [it

  18. Earthquake Energy Distribution along the Earth Surface and Radius

    International Nuclear Information System (INIS)

    Varga, P.; Krumm, F.; Riguzzi, F.; Doglioni, C.; Suele, B.; Wang, K.; Panza, G.F.

    2010-07-01

    The global earthquake catalog of seismic events with MW ≥ 7.0, for the time interval from 1950 to 2007, shows that the depth distribution of earthquake energy release is not uniform. 90% of the total earthquake energy budget is dissipated in the first ∼30 km, whereas most of the residual budget is radiated at the lower boundary of the transition zone (410 km - 660 km), above the upper-lower mantle boundary. The upper border of the transition zone, at around 410 km depth, is not marked by significant seismic energy release. This points to a non-dominant role of the slabs in the energy budget of plate tectonics. Earthquake number and energy release, although not well correlated, decrease toward the polar areas when analysed with respect to latitude. Moreover, the radiated energy has its highest peak close to (±5°) the so-called tectonic equator defined by Crespi et al. (2007), which is inclined about 30° with respect to the geographic equator. At the same time, a clear axial coordination of the radiated seismic energy is demonstrated, with maxima at latitudes close to the critical values (±45°). This indicates the presence of external forces that influence seismicity and is consistent with the fact that the Gutenberg-Richter law is linear, for events with M>5, only when the whole Earth's seismicity is considered. These data are consistent with an astronomical control on plate tectonics, i.e., the despinning (slowing of the Earth's angular rotation) caused primarily by tidal friction due to the Moon. The mutual position of the shallow and ∼660 km deep earthquake energy sources along subduction zones allows us to conclude that they are connected with the same slab along the W-directed subduction zones, but may be disconnected along the opposite E-NE-directed slabs, with the deep seismicity there controlled by other mechanisms. (author)
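
    A quick sense of how strongly such an energy budget is dominated by the largest events comes from the classical Gutenberg-Richter energy-magnitude relation, log10 E = 1.5 M + 4.8 with E in joules. The sketch below only illustrates that standard relation and is not part of the study.

```python
def radiated_energy_joules(m):
    """Gutenberg-Richter energy-magnitude relation: log10 E = 1.5 M + 4.8 (E in J)."""
    return 10.0 ** (1.5 * m + 4.8)

for m in (7.0, 8.0, 9.0):
    ratio = radiated_energy_joules(m) / radiated_energy_joules(7.0)
    print(f"M {m:.1f}: ~{radiated_energy_joules(m):.2e} J  ({ratio:.0f}x an M 7.0)")
```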

  19. Lessons learned from the 1994 Northridge Earthquake

    International Nuclear Information System (INIS)

    Eli, M.W.; Sommer, S.C.

    1995-01-01

    Southern California has a history of major earthquakes and also has one of the largest metropolitan areas in the United States. The 1994 Northridge Earthquake challenged the industrial facilities and lifeline infrastructure in the northern Los Angeles (LA) area. Lawrence Livermore National Laboratory (LLNL) sent a team of engineers to conduct an earthquake damage investigation in the Northridge area, on a project funded jointly by the United States Nuclear Regulatory Commission (USNRC) and the United States Department of Energy (USDOE). Many of the structures, systems, and components (SSCs) and lifelines that suffered damage are similar to those found in nuclear power plants and in USDOE facilities. Lessons learned from these experiences can have some applicability at commercial nuclear power plants.

  20. Injection-induced moment release can also be aseismic

    Science.gov (United States)

    McGarr, Arthur; Barbour, Andrew J.

    2018-01-01

    The cumulative seismic moment is a robust measure of the earthquake response to fluid injection for injection volumes ranging from 3100 to about 12 million m3. Over this range, the moment release is limited to twice the product of the shear modulus and the volume of injected fluid. This relation also applies at the much smaller injection volumes of the field experiment in France reported by Guglielmi, et al. (2015) and laboratory experiments to simulate hydraulic fracturing described by Goodfellow, et al. (2015). In both of these studies, the relevant moment release for comparison with the fluid injection was aseismic and consistent with the scaling that applies to the much larger volumes associated with injection-induced earthquakes with magnitudes extending up to 5.8. Neither the micro-earthquakes, at the site in France, nor the acoustic emission in the laboratory samples contributed significantly to the deformation due to fluid injection.
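
    The scaling quoted above translates directly into an upper bound on magnitude. A small sketch, assuming a generic crustal shear modulus of 3 x 10^10 Pa and the Hanks-Kanamori moment-magnitude relation; the bound itself (M0 ≤ 2·G·ΔV) is the one stated in the abstract.

```python
import math

def max_moment_and_magnitude(injected_volume_m3, shear_modulus_pa=3.0e10):
    """Upper bound on cumulative (seismic + aseismic) moment release,
    M0 <= 2 * G * dV, converted to moment magnitude with
    Mw = (log10 M0 - 9.05) / 1.5 (M0 in N m)."""
    m0 = 2.0 * shear_modulus_pa * injected_volume_m3
    mw = (math.log10(m0) - 9.05) / 1.5
    return m0, mw

for v in (3.1e3, 1.2e7):                  # range of injection volumes quoted above
    m0, mw = max_moment_and_magnitude(v)
    print(f"dV = {v:.1e} m3 -> M0 <= {m0:.2e} N m (Mw <= {mw:.1f})")
```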

  1. POST Earthquake Debris Management - AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, taking into account the different criteria related to the execution of the operation, is highlighted through the key issues concerning the handling of the construction

  2. Release of engineered nanomaterials from polymer nanocomposites: the effect of matrix degradation.

    Science.gov (United States)

    Duncan, Timothy V

    2015-01-14

    Polymer nanocomposites-polymer-based materials that incorporate filler elements possessing at least one dimension in the nanometer range-are increasingly being developed for commercial applications ranging from building infrastructure to food packaging to biomedical devices and implants. Despite a wide range of intended applications, it is also important to understand the potential for exposure to these nanofillers, which could be released during routine use or abuse of these materials so that it can be determined whether they pose a risk to human health or the environment. This article is the second of a pair that review what is known about the release of engineered nanomaterials (ENMs) from polymer nanocomposites. Two roughly separate ENM release paradigms are considered in this series: the release of ENMs via passive diffusion, desorption, and dissolution into external liquid media and the release of ENMs assisted by matrix degradation. The present article is focused primarily on the second paradigm and includes a thorough, critical review of the associated body of peer-reviewed literature on ENM release by matrix degradation mechanisms, including photodegradation, thermal decomposition, mechanical wear, and hydrolysis. These release mechanisms may be especially relevant to nanocomposites that are likely to be subjected to weathering, including construction and infrastructural materials, sporting equipment, and materials that might potentially end up in landfills. This review pays particular attention to studies that shed light on specific release mechanisms and synergistic mechanistic relationships. The review concludes with a short section on knowledge gaps and future research needs.

  3. Cylinder pressure, performance parameters, heat release, specific heats ratio and duration of combustion for spark ignition engine

    Energy Technology Data Exchange (ETDEWEB)

    Shehata, M.S. [Mechanical Engineering Technology Department, Higher Institute of Technology, Banha University, 4Zagalol Street, Benha, Galubia 1235 Z (Egypt)

    2010-12-15

    Experimental work was conducted to investigate cylinder pressure, performance parameters, heat release, specific heat ratio and duration of combustion for a multi-cylinder spark ignition engine (SIE). Cylinder pressure was measured for gasoline, kerosene and Liquefied Petroleum Gases (LPG) separately as fuels for the SIE. Fast Fourier Transformation (FFT) was used to transform the cylinder pressure data from the time domain into the frequency domain in order to develop an empirical correlation for calculating cylinder pressures at different engine speeds and with different fuels. In addition, Inverse Fast Fourier Transformation (IFFT) was used to reconstruct the cylinder pressure in the time domain. The results gave good agreement between the measured cylinder pressure and the reconstructed cylinder pressure in the time domain for different engine speeds and fuels. The measured cylinder pressure and a hydraulic dynamometer were the sources of data for calculating engine performance parameters. The first law of thermodynamics and a single-zone heat release model with temperature-dependent specific heat ratio γ(T) were the main tools for calculating heat release and heat transfer to the cylinder walls. A third-order empirical correlation for calculating γ(T) was one of the main gains of the present study. The correlation gave good agreement with other researchers over a wide temperature range. For kerosene, cylinder pressure is higher than for gasoline and LPG due to high volumetric efficiency, since kerosene density (mass/volume ratio) is higher than that of gasoline and LPG. In addition, the kerosene heating value is higher than that of gasoline, which contributes to the heat release rate and pressure increase. The duration of combustion at different engine speeds was determined using four different methods: (I) mass of fuel burnt, (II) entropy change, (III) temperature-dependent specific heat ratio γ(T), and (IV) logarithmic scale of (P and V). The duration of combustion for kerosene is smaller than for gasoline and
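
    The FFT/IFFT round trip at the core of the empirical correlation can be illustrated on a synthetic pressure trace: transform to the frequency domain, truncate to a few harmonics, and reconstruct. The trace shape and the number of retained harmonics below are arbitrary assumptions, not the engine data or correlation of the study.

```python
import numpy as np

# Synthetic cylinder-pressure trace over one engine cycle (crank-angle domain).
theta = np.linspace(0.0, 720.0, 1440, endpoint=False)        # crank angle, deg
p = 1.0 + 0.8 * np.exp(-((theta - 370.0) / 40.0) ** 2)       # fake pressure, MPa

spectrum = np.fft.rfft(p)                   # crank-angle domain -> frequency domain
truncated = spectrum.copy()
truncated[30:] = 0.0                        # keep only the first 30 harmonics
p_rec = np.fft.irfft(truncated, n=len(p))   # frequency domain -> crank-angle domain

err = np.max(np.abs(p - p_rec)) / np.max(p)
print(f"max reconstruction error with 30 harmonics: {err:.2%}")
```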

  4. Cylinder pressure, performance parameters, heat release, specific heats ratio and duration of combustion for spark ignition engine

    International Nuclear Information System (INIS)

    Shehata, M.S.

    2010-01-01

    Experimental work was conducted to investigate cylinder pressure, performance parameters, heat release, specific heat ratio and duration of combustion for a multi-cylinder spark ignition engine (SIE). Cylinder pressure was measured for gasoline, kerosene and Liquefied Petroleum Gases (LPG) separately as fuels for the SIE. Fast Fourier Transformation (FFT) was used to transform the cylinder pressure data from the time domain into the frequency domain in order to develop an empirical correlation for calculating cylinder pressures at different engine speeds and with different fuels. In addition, Inverse Fast Fourier Transformation (IFFT) was used to reconstruct the cylinder pressure in the time domain. The results gave good agreement between the measured cylinder pressure and the reconstructed cylinder pressure in the time domain for different engine speeds and fuels. The measured cylinder pressure and a hydraulic dynamometer were the sources of data for calculating engine performance parameters. The first law of thermodynamics and a single-zone heat release model with temperature-dependent specific heat ratio γ(T) were the main tools for calculating heat release and heat transfer to the cylinder walls. A third-order empirical correlation for calculating γ(T) was one of the main gains of the present study. The correlation gave good agreement with other researchers over a wide temperature range. For kerosene, cylinder pressure is higher than for gasoline and LPG due to high volumetric efficiency, since kerosene density (mass/volume ratio) is higher than that of gasoline and LPG. In addition, the kerosene heating value is higher than that of gasoline, which contributes to the heat release rate and pressure increase. The duration of combustion at different engine speeds was determined using four different methods: (I) mass of fuel burnt, (II) entropy change, (III) temperature-dependent specific heat ratio γ(T), and (IV) logarithmic scale of (P and V). The duration of combustion for kerosene is smaller than for gasoline and LPG due to high

  5. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of size and recurrence interval of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence interval of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).
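
    The committee-style forecast mentioned above amounts to a conditional recurrence probability. A crude, purely illustrative sketch comparing a time-independent (Poisson) estimate with a quasi-periodic renewal (lognormal) estimate follows; the 100-year mean interval, coefficient of variation and elapsed time are invented numbers, not values from any published forecast.

```python
import math
import random

def poisson_prob(mean_interval, window):
    """Time-independent probability of at least one event in the next
    `window` years, given a mean recurrence interval."""
    return 1.0 - math.exp(-window / mean_interval)

def renewal_prob(mean_interval, elapsed, window, cov=0.5, n=100000):
    """Monte Carlo conditional probability for a quasi-periodic renewal model
    with lognormally distributed recurrence intervals."""
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(mean_interval) - 0.5 * sigma ** 2
    hits = total = 0
    for _ in range(n):
        t = random.lognormvariate(mu, sigma)
        if t > elapsed:                       # condition on survival so far
            total += 1
            if t <= elapsed + window:
                hits += 1
    return hits / total if total else float("nan")

print("Poisson, 30 yr window, 100 yr mean interval :", round(poisson_prob(100, 30), 2))
print("Renewal, 90 yr elapsed, same window         :", round(renewal_prob(100, 90, 30), 2))
```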

  6. Improving the Earthquake Resilience of Buildings The worst case approach

    CERN Document Server

    Takewaki, Izuru; Fujita, Kohei

    2013-01-01

    Engineers are always interested in the worst-case scenario. One of the most important and challenging missions of structural engineers may be to narrow the range of unexpected incidents in building structural design. Redundancy, robustness and resilience play an important role in such circumstances. Improving the Earthquake Resilience of Buildings: The worst case approach discusses the importance of the worst-case scenario approach for improved earthquake resilience of buildings and nuclear reactor facilities. Improving the Earthquake Resilience of Buildings: The worst case approach consists of two parts. The first part deals with the characterization and modeling of worst or critical ground motions on inelastic structures and the related worst-case scenario in the structural design of ordinary simple building structures. The second part of the book focuses on investigating the worst-case scenario for passively controlled and base-isolated buildings. This allows for detailed consideration of a range of topics includin...

  7. Modelling the effect of injection pressure on heat release parameters and nitrogen oxides in direct injection diesel engines

    Directory of Open Access Journals (Sweden)

    Yüksek Levent

    2014-01-01

    Full Text Available Investigating and modelling the effect of injection pressure on heat release parameters and engine-out nitrogen oxides is the main aim of this study. A zero-dimensional, multi-zone cylinder model was developed to estimate the effect of an injection pressure rise on the performance parameters of a diesel engine. A double-Wiebe global rate-of-heat-release model was used to describe fuel combustion. The extended Zeldovich mechanism and a partial equilibrium approach were used for modelling the formation of nitrogen oxides. A single-cylinder, high-pressure direct-injection, electronically controlled research engine bench was used for model calibration. Fuel injection pressures of 1000 and 1200 bar were investigated while injection advance, injected fuel quantity and engine speed were kept constant. The ignition delay of the injected fuel was reduced by 0.4 crank angle degrees at 1200 bar injection pressure, and a similar effect was observed in the premixed combustion phase duration, which was reduced by 0.2 crank angle degrees. The rate of heat release of the premixed combustion phase increased by 1.75% at 1200 bar injection pressure. The multi-zone cylinder model showed good agreement with the experimental in-cylinder pressure data. It was also seen that the NOx formation model predicted the engine-out NOx emissions well for both operation modes.
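
    The double-Wiebe description of heat release used above can be sketched as the sum of a fast premixed term and a slower diffusion term. The shape and duration parameters below are illustrative, not the calibrated values of the study.

```python
import numpy as np

def double_wiebe(theta, soc=-5.0, dur_p=15.0, dur_d=45.0, frac_p=0.3,
                 a=6.9, m_p=2.0, m_d=1.0):
    """Cumulative burned-fuel fraction from a double-Wiebe function:
    premixed term (short duration) plus diffusion term (long duration)."""
    def wiebe(angle, start, dur, m):
        x = np.clip((angle - start) / dur, 0.0, None)
        return 1.0 - np.exp(-a * x ** (m + 1.0))
    return (frac_p * wiebe(theta, soc, dur_p, m_p)
            + (1.0 - frac_p) * wiebe(theta, soc, dur_d, m_d))

theta = np.linspace(-20.0, 80.0, 201)          # crank angle after TDC, deg
xb = double_wiebe(theta)
rohr = np.gradient(xb, theta)                  # normalized rate of heat release
print("crank angle of peak heat-release rate:", theta[np.argmax(rohr)])
```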

  8. Cadmium release from a reprocessing electrorefiner falling over

    Energy Technology Data Exchange (ETDEWEB)

    Solbrig, Charles W., E-mail: Charles.solbrig@inl.gov [Batelle Energy Alliance, Idaho National Laboratory, PO Box 2528, Idaho Falls, ID 83404 (United States); Pope, Chad L. [Batelle Energy Alliance, Idaho National Laboratory, PO Box 2528, Idaho Falls, ID 83404 (United States)

    2013-02-15

    Highlights: ► We model an accident in a nuclear fuel processing facility caused by an earthquake. ► The earthquake causes the argon cell to breach and the electrorefiner to tip over. ► Cadmium is spilled and a cathode falls on the cadmium and starts to burn. ► Cadmium can be transported to people in the building, the site, and the public. ► The results show negligible doses to all persons except in one low probability case. -- Abstract: The possible biological consequences of a release of cadmium due to a design basis earthquake in the Idaho Nuclear Laboratory's nuclear fuel reprocessing cell are evaluated. The facility is designed to withstand the design basis earthquake except for some non-seismically qualified feedthroughs. The earthquake is hypothesized to breach these feedthroughs (allowing air into the argon atmosphere processing cell) and cause the MK-IV electrorefiner (ER) in the cell to tip over or split and spill its contents of fission product laden salt and cadmium. In addition, the uranium dendrite product cathode is assumed to fall on the cadmium and burn. The heat from the burning cathode results in release of cadmium vapor into the cell atmosphere. Ingestion and inhalation of a sufficient concentration of cadmium for a critical time period can cause irreversible health effects or death. The release of the small quantity of fission products, analyzed elsewhere, results in negligible doses. Analysis reported here shows there is no danger to the general public by the cadmium release or to on-site workers except in one low probability case. This one case requires a fivefold failure where the safety exhaust system fails just after the 4% oxygen concentration combustion limit in the cell is reached. Failure of the SES allows oscillatory inflow and outflow (and hence cadmium outflow) from the cell due to gravity. The dose to a worker in the basement exceeds the mortality limit in this one event if the worker does not leave the basement.

  9. Cadmium release from a reprocessing electrorefiner falling over

    International Nuclear Information System (INIS)

    Solbrig, Charles W.; Pope, Chad L.

    2013-01-01

    Highlights: ► We model an accident in a nuclear fuel processing facility caused by an earthquake. ► The earthquake causes the argon cell to breach and the electrorefiner to tip over. ► Cadmium is spilled and a cathode falls on the cadmium and starts to burn. ► Cadmium can be transported to people in the building, the site, and the public. ► The results show negligible doses to all persons except in one low probability case. -- Abstract: The possible biological consequences of a release of cadmium due to a design basis earthquake in the Idaho Nuclear Laboratory's nuclear fuel reprocessing cell are evaluated. The facility is designed to withstand the design basis earthquake except for some non-seismically qualified feedthroughs. The earthquake is hypothesized to breach these feedthroughs (allowing air into the argon atmosphere processing cell) and cause the MK-IV electrorefiner (ER) in the cell to tip over or split and spill its contents of fission product laden salt and cadmium. In addition, the uranium dendrite product cathode is assumed to fall on the cadmium and burn. The heat from the burning cathode results in release of cadmium vapor into the cell atmosphere. Ingestion and inhalation of a sufficient concentration of cadmium for a critical time period can cause irreversible health effects or death. The release of the small quantity of fission products, analyzed elsewhere, results in negligible doses. Analysis reported here shows there is no danger to the general public by the cadmium release or to on-site workers except in one low probability case. This one case requires a fivefold failure where the safety exhaust system fails just after the 4% oxygen concentration combustion limit in the cell is reached. Failure of the SES allows oscillatory inflow and outflow (and hence cadmium outflow) from the cell due to gravity. The dose to a worker in the basement exceeds the mortality limit in this one event if the worker does not leave the basement

  10. Ground Motion Prediction for Great Interplate Earthquakes in Kanto Basin Considering Variation of Source Parameters

    Science.gov (United States)

    Sekiguchi, H.; Yoshimi, M.; Horikawa, H.

    2011-12-01

    Broadband ground motions are estimated in the Kanto sedimentary basin, which contains the Tokyo metropolitan area, for anticipated great interplate earthquakes along the surrounding plate boundaries. Possible scenarios of great earthquakes along the Sagami trough are modeled by combining characteristic properties of the source area with adequate variation in source parameters, in order to evaluate the possible ground motion variation due to the next Kanto earthquake. South of the rupture area of the 2011 Tohoku earthquake along the Japan trench, we consider a possible M8 earthquake. The ground motions are computed with a four-step hybrid technique. We first calculate low-frequency ground motions at the engineering basement. We then calculate higher-frequency ground motions at the same position, and combine the lower- and higher-frequency motions using a matched filter. We finally calculate ground motions at the surface by computing the response of the alluvium-diluvium layers to the combined motions at the engineering basement.
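
    The matched-filter combination step can be sketched generically: low-pass the long-period motion and high-pass the short-period motion at a common matching frequency, then sum. This is a generic illustration using SciPy Butterworth filters, not the authors' implementation, and both input traces are synthetic.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def combine_broadband(low_acc, high_acc, dt, f_match=1.0, order=4):
    """Sum a low-passed long-period trace and a high-passed short-period
    trace, matched at f_match (Hz), to form a broadband accelerogram."""
    nyq = 0.5 / dt
    b_lo, a_lo = butter(order, f_match / nyq, btype="low")
    b_hi, a_hi = butter(order, f_match / nyq, btype="high")
    return filtfilt(b_lo, a_lo, low_acc) + filtfilt(b_hi, a_hi, high_acc)

dt = 0.01
t = np.arange(0.0, 40.0, dt)
low = 0.05 * np.sin(2 * np.pi * 0.3 * t) * np.exp(-0.05 * t)   # long-period part
high = 0.02 * np.random.randn(len(t))                          # short-period part
broadband = combine_broadband(low, high, dt, f_match=1.0)
print("peak broadband acceleration:", np.abs(broadband).max())
```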

  11. Understanding dynamic friction through spontaneously evolving laboratory earthquakes.

    Science.gov (United States)

    Rubino, V; Rosakis, A J; Lapusta, N

    2017-06-29

    Friction plays a key role in how ruptures unzip faults in the Earth's crust and release waves that cause destructive shaking. Yet dynamic friction evolution is one of the biggest uncertainties in earthquake science. Here we report on novel measurements of evolving local friction during spontaneously developing mini-earthquakes in the laboratory, enabled by our ultrahigh speed full-field imaging technique. The technique captures the evolution of displacements, velocities and stresses of dynamic ruptures, whose rupture speeds range from sub-Rayleigh to supershear. The observed friction has complex evolution, featuring initial velocity strengthening followed by substantial velocity weakening. Our measurements are consistent with rate-and-state friction formulations supplemented with flash heating but not with widely used slip-weakening friction laws. This study develops a new approach for measuring local evolution of dynamic friction and has important implications for understanding earthquake hazard since laws governing frictional resistance of faults are vital ingredients in physically-based predictive models of the earthquake source.
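
    For context, the standard rate-and-state (Dieterich-Ruina) friction law the measurements support has the form mu = mu0 + a·ln(V/V0) + b·ln(V0·theta/Dc). The sketch below evaluates its steady state (velocity weakening when b > a) with generic laboratory-scale parameter values, not the values inferred in the paper.

```python
import numpy as np

def rate_state_friction(v, theta, mu0=0.6, a=0.010, b=0.015, v0=1e-6, dc=1e-5):
    """Rate-and-state friction coefficient:
    mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc)."""
    return mu0 + a * np.log(v / v0) + b * np.log(v0 * theta / dc)

# steady state (theta = Dc / V): friction decreases with slip rate when b > a
for v in (1e-6, 1e-4, 1e-2):
    theta_ss = 1e-5 / v
    print(f"V = {v:.0e} m/s  ->  mu_ss = {rate_state_friction(v, theta_ss):.4f}")
```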

  12. Defence against earthquakes: a red thread of history

    International Nuclear Information System (INIS)

    Guidoboni, Emanuela

    2015-01-01

    This note gives a short overview from the ancient world down to the end of the eighteenth century (before engineering began as a science, that is) on the idea of “housing safety” and earthquakes. The idea varies, but persists throughout the cultural and economic contexts of history’s changing societies, and in relation to class and lifestyle. Historical research into earthquakes in Italy from the ancient world to the twentieth century has shown how variable the idea actually is, as emerges from theoretical treatises, practical wisdom and projects drawn up in the wake of destructive events. In the seventeenth century the theoretical interpretation of earthquakes began to swing towards a mechanistic view of the Earth, affecting how the effects and propagation of earthquakes were observed. Strong earthquakes continued to occur and cause damage, and after yet another seismic disaster – Umbria 1751 – new building techniques were advocated. The attempt was to make house walls bind more solidly by special linking of the wooden structure of floors and roof beams. Following the massive seismic crisis of February-March 1783, which left central and southern Calabria in ruins, a new house was proposed, called 'baraccata': a wooden structure filled in with light materials. This was actually already to be found in the ancient Mediterranean basin (including Pompeii), but only at that time was it perfected, proposed by engineers and circulated as an important building innovation. At the end of the eighteenth century town planners came to the fore in the search for safe housing. They suggested new regular shapes and broad grid-plan streets with a specific view to achieving housing safety and ensuring an escape route in case of earthquake. Such rules and regulations were then abandoned or lost, proving that it is not enough to try out [it

  13. Earthquakes trigger the loss of groundwater biodiversity

    Science.gov (United States)

    Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero

    2014-09-01

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and “ecosystem engineers”, we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  14. Incorporating human-triggered earthquake risks into energy and water policies

    Science.gov (United States)

    Klose, C. D.; Seeber, L.; Jacob, K. H.

    2010-12-01

    A comprehensive understanding of earthquake risks in urbanized regions requires an accurate assessment of both urban vulnerabilities and hazards from earthquakes, including ones whose timing might be affected by human activities. Socioeconomic risks associated with human-triggered earthquakes are often misconstrued and receive little scientific, legal, and public attention. Worldwide, more than 200 damaging earthquakes associated with industrialization and urbanization have been documented since the 20th century. Geomechanical pollution due to large-scale geoengineering activities can advance the clock of earthquakes, trigger new seismic events or even shut down natural background seismicity. Activities include mining, hydrocarbon production, fluid injections, water reservoir impoundments and deep-well geothermal energy production. This type of geohazard has impacts on human security at a regional and national level. Some planned or considered future engineering projects raise particularly strong concerns about triggered earthquakes, such as, for instance, sequestration of carbon dioxide by injecting it deep underground and large-scale natural gas production in the Marcellus shale in the Appalachian basin. Worldwide examples of earthquakes are discussed, including their associated losses of human life and monetary losses (e.g., the 1989 Newcastle and Volkershausen earthquakes, the 2001 Killari earthquake, the 2006 Basel earthquake, the 2008 Wenchuan earthquake). An overview is given of global statistics of human-triggered earthquakes, including depths and time delay of triggering. Lastly, strategies are described, including risk mitigation measures such as urban planning adaptations and seismic hazard mapping.

  15. Fabrication and characterization of a rapid prototyped tissue engineering scaffold with embedded multicomponent matrix for controlled drug release

    Science.gov (United States)

    Chen, Muwan; Le, Dang QS; Hein, San; Li, Pengcheng; Nygaard, Jens V; Kassem, Moustapha; Kjems, Jørgen; Besenbacher, Flemming; Bünger, Cody

    2012-01-01

    Bone tissue engineering implants with sustained local drug delivery provide an opportunity for better postoperative care for bone tumor patients because these implants offer sustained drug release at the tumor site and reduce systemic side effects. A rapid prototyped macroporous polycaprolactone scaffold was embedded with a porous matrix composed of chitosan, nanoclay, and β-tricalcium phosphate by freeze-drying. This composite scaffold was evaluated on its ability to deliver an anthracycline antibiotic and to promote formation of mineralized matrix in vitro. Scanning electronic microscopy, confocal imaging, and DNA quantification confirmed that immortalized human bone marrow-derived mesenchymal stem cells (hMSC-TERT) cultured in the scaffold showed high cell viability and growth, and good cell infiltration to the pores of the scaffold. Alkaline phosphatase activity and osteocalcin staining showed that the scaffold was osteoinductive. The drug-release kinetics was investigated by loading doxorubicin into the scaffold. The scaffolds comprising nanoclay released up to 45% of the drug for up to 2 months, while the scaffold without nanoclay released 95% of the drug within 4 days. Therefore, this scaffold can fulfill the requirements for both bone tissue engineering and local sustained release of an anticancer drug in vitro. These results suggest that the scaffold can be used clinically in reconstructive surgery after bone tumor resection. Moreover, by changing the composition and amount of individual components, the scaffold can find application in other tissue engineering areas that need local sustained release of drug. PMID:22904634

  16. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    Science.gov (United States)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  17. Scientific, Engineering, and Financial Factors of the 1989 Human-Triggered Newcastle Earthquake in Australia

    Science.gov (United States)

    Klose, C. D.

    2006-12-01

    This presentation emphasizes the dualism of natural resources exploitation and economic growth versus geomechanical pollution and risks of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes have been documented worldwide with a seismic moment magnitude of 4.5 [...] losses of triggered earthquakes. A hazard assessment, based on a geomechanical crust model, shows that only four deep coal mines were responsible for triggering this severe earthquake. A small-scale economic risk assessment identifies that the financial loss due to earthquake damage has reduced mining profits that had been re-invested in the Newcastle region for over two centuries beginning in 1801. Furthermore, a large-scale economic risk assessment reveals that the financial loss is equivalent to 26% of the Australian Gross Domestic Product (GDP) growth in 1988/89. These costs account for 13% of the total costs of all natural disasters (e.g., flooding, drought, wild fires) and 94% of the costs of all earthquakes recorded in Australia between 1967 and 1999. In conclusion, the increasing number and size of geoengineering activities, such as coal mining near Newcastle or planned carbon dioxide geosequestration initiatives, represent a growing hazard potential, which can negatively affect socio-economic growth and sustainable development. Finally, hazard and risk degrees, based on geomechanical-mathematical models, can be forecast in space and over time for urban planning, in order to prevent economic losses from human-triggered earthquakes in the future.

  18. Drug-releasing nano-engineered titanium implants: therapeutic efficacy in 3D cell culture model, controlled release and stability

    Energy Technology Data Exchange (ETDEWEB)

    Gulati, Karan [School of Chemical Engineering, The University of Adelaide, SA 5005 (Australia); Kogawa, Masakazu; Prideaux, Matthew; Findlay, David M. [Discipline of Orthopaedics and Trauma, The University of Adelaide, SA 5005 (Australia); Atkins, Gerald J., E-mail: gerald.atkins@adelaide.edu.au [Discipline of Orthopaedics and Trauma, The University of Adelaide, SA 5005 (Australia); Losic, Dusan, E-mail: dusan.losic@adelaide.edu.au [School of Chemical Engineering, The University of Adelaide, SA 5005 (Australia)

    2016-12-01

    There is an ongoing demand for new approaches for treating localized bone pathologies. Here we propose a new strategy for treatment of such conditions, via local delivery of hormones/drugs to the trauma site using drug-releasing nano-engineered implants. The proposed implants were prepared in the form of small Ti wires/needles with a nano-engineered oxide layer composed of an array of titania nanotubes (TNTs). TNTs implants were inserted into a 3D collagen gel matrix containing human osteoblast-like cells, and the results confirmed cell migration onto the implants and their attachment and spread. To investigate therapeutic efficacy, TNTs/Ti wires loaded with parathyroid hormone (PTH), an approved anabolic therapeutic for the treatment of severe bone fractures, were inserted into 3D gels containing osteoblast-like cells. Gene expression studies revealed a suppression of SOST (sclerostin) and an increase in RANKL (receptor activator of nuclear factor kappa-B ligand) mRNA expression, confirming the release of PTH from the TNTs at concentrations sufficient to alter cell function. The performance of the TNTs wire implants using an example of a drug needed at relatively higher concentrations, the anti-inflammatory drug indomethacin, is also demonstrated. Finally, the mechanical stability of the prepared implants was tested by their insertion into bovine trabecular bone cores ex vivo followed by retrieval, which confirmed the robustness of the TNT structures. This study provides proof of principle for the suitability of the TNT/Ti wire implants for localized bone therapy, which can be customized to cater for specific therapeutic requirements. - Highlights: • Ti wires with titania nanotubes (TNTs) are proposed as ‘in-bone’ therapeutic implants. • 3D cell culture model is used to confirm therapeutic efficacy of drug-releasing implants. • Osteoblasts migrated and firmly attached to the TNTs and the micro-scale cracks. • Tailorable drug loading from few nanograms to several hundred

  19. Earthquakes: no danger for deep underground nuclear waste repositories

    International Nuclear Information System (INIS)

    2010-03-01

    On the Earth, the continental plates are steadily moving. Principally at the plate boundaries, such shifts produce stresses which are released in the form of earthquakes. The higher the built-up energy, the more violent the shaking. Earthquakes have accompanied mankind since very ancient times and they disturb the population. To date, nobody is able to predict where and when they will take place. But on the Earth there are regions where, due to their geological situation, the occurrence of earthquakes is more probable than elsewhere. The impact of a very strong earthquake on structures at the Earth's surface depends on several factors. Besides the ground structure, the density of buildings, the construction style and the materials used play an important role. Construction-related technical measures can improve the safety of buildings and, together with correct behaviour of the people concerned, save many lives. Earthquakes are well known in Switzerland. Here, the stresses are due to the collision of the African and European continental plates that created the Alps. The impact of an earthquake is more limited underground than at the Earth's surface. There is no danger for deep underground repositories.

  20. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth's surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  1. Combustion Heat Release Rate Comparison of Algae Hydroprocessed Renewable Diesel to F-76 in a Two-Stroke Diesel Engine

    Science.gov (United States)

    2013-06-01

    Thesis front matter and figure residue; recoverable information: "Combustion Heat Release Rate Comparison of Algae Hydroprocessed Renewable Diesel to F-76 in a Two-Stroke Diesel Engine," thesis by John H. Petersen, June 2013. Figure 14, "Mechanical Injector Rocker Arm Strain Gauge," shows the strain gauge mounted on the mechanical injector rocker arm during calibration.

  2. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    level of shaking intensity with empirical models of fatality losses calibrated on past earthquakes in each country. Non-seismic detections and macroseismic questionnaires collected online are combined to identify as many as possible of the felt earthquakes, regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the US Geological Survey, where the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking that they have just felt. Altogether, we estimate that the number of detected felt earthquakes is around 1,000 per year, compared with the 35,000 earthquakes annually reported by the EMSC! Felt events are already the subject of the web page "Latest significant earthquakes" on the EMSC website (http://www.emsc-csem.org/Earthquake/significant_earthquakes.php) and of a dedicated Twitter service @LastQuake. We will present the identification process of the earthquakes that matter, the smartphone application itself (to be released in May) and its future evolutions.
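
    The flashsourcing detection described above relies on spotting abrupt surges in website traffic. The short Python sketch below illustrates the general idea only; it is not the EMSC implementation, and the 60-minute baseline window, the minimum baseline length and the 5-sigma threshold are arbitrary assumptions chosen for illustration.

      from collections import deque
      import statistics

      class SurgeDetector:
          """Flag a minute whose hit count jumps far above the recent baseline."""

          def __init__(self, window_minutes=60, n_sigma=5.0):
              self.history = deque(maxlen=window_minutes)  # recent per-minute hit counts
              self.n_sigma = n_sigma

          def update(self, hits_this_minute):
              """Return True if this minute looks like an eyewitness-driven surge."""
              surge = False
              if len(self.history) >= 10:                  # wait for some baseline first
                  mean = statistics.mean(self.history)
                  spread = statistics.pstdev(self.history) or 1.0
                  surge = hits_this_minute > mean + self.n_sigma * spread
              self.history.append(hits_this_minute)
              return surge

      # usage: a quiet baseline followed by a sudden jump in visits
      detector = SurgeDetector()
      per_minute_hits = [20, 22, 19, 21, 18, 23, 20, 22, 21, 19, 20, 250]
      print([detector.update(n) for n in per_minute_hits][-1])  # True for the final jump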

  3. The 2016 Kumamoto Earthquakes: Cascading Geological Hazards and Compounding Risks

    Directory of Open Access Journals (Sweden)

    Katsuichiro Goda

    2016-08-01

    Full Text Available A sequence of two strike-slip earthquakes occurred on 14 and 16 April 2016 in the intraplate region of Kyushu Island, Japan, away from subduction zones, and caused significant damage and disruption to the Kumamoto region. Analyses of the regional seismic catalog and available strong-motion recordings reveal striking characteristics of the events, such as migrating seismicity, earthquake surface rupture, and major foreshock-mainshock earthquake sequences. To gain valuable lessons from the events, a UK Earthquake Engineering Field Investigation Team (EEFIT) was dispatched to Kumamoto, and earthquake damage surveys were conducted to relate observed earthquake characteristics to building and infrastructure damage caused by the earthquakes. The lessons learnt from the reconnaissance mission have important implications for current seismic design practice regarding the required seismic resistance of structures under multiple shocks and the seismic design of infrastructure subject to large ground deformation. The observations also highlight the consequences of cascading geological hazards on community resilience. To share the gathered damage data widely, geo-tagged photos are organized using Google Earth and the kmz file is made publicly available.

  4. Earthquake recurrence and magnitude and seismic deformation of the northwestern Okhotsk plate, northeast Russia

    Science.gov (United States)

    Hindle, D.; Mackey, K.

    2011-02-01

    Recorded seismicity from the northwestern Okhotsk plate, northeast Asia, is currently insufficient to account for the predicted slip rates along its boundaries due to plate tectonics. However, the magnitude-frequency relationship for earthquakes from the region suggests that larger earthquakes are possible in the future and that events of ˜Mw 7.5 which should occur every ˜100-350 years would account for almost all the slip of the plate along its boundaries due to Eurasia-North America convergence. We use models for seismic slip distribution along the bounding faults of Okhotsk to conclude that relatively little aseismic strain release is occurring and that larger future earthquakes are likely in the region. Our models broadly support the idea of a single Okhotsk plate, with the large majority of tectonic strain released along its boundaries.
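
    The slip-rate accounting described above can be illustrated with a back-of-the-envelope seismic moment budget: convert a boundary slip rate into a moment accumulation rate and ask how often an Mw 7.5 earthquake would be needed to release it. The sketch below is not the authors' model; the rigidity, fault length, seismogenic width and slip rate are illustrative assumptions rather than values from the study.

      # Moment-budget sketch; all numerical inputs are assumed, not from the paper.
      RIGIDITY = 3.0e10          # shear modulus [Pa]
      FAULT_LENGTH = 1.5e6       # assumed plate-boundary length [m]
      SEISMOGENIC_WIDTH = 2.0e4  # assumed seismogenic width [m]
      SLIP_RATE = 0.002          # assumed relative plate motion [m/yr], i.e. 2 mm/yr

      moment_rate = RIGIDITY * FAULT_LENGTH * SEISMOGENIC_WIDTH * SLIP_RATE  # N*m/yr

      def seismic_moment(mw):
          """Seismic moment [N*m] from moment magnitude (Hanks-Kanamori scale)."""
          return 10.0 ** (1.5 * mw + 9.1)

      recurrence_years = seismic_moment(7.5) / moment_rate
      print(f"moment accumulation rate: {moment_rate:.2e} N*m/yr")
      print(f"one Mw 7.5 event every ~{recurrence_years:.0f} yr balances the budget")

    With these placeholder numbers the budget is balanced by an Mw 7.5 event roughly every hundred years, the same order as the recurrence interval quoted in the record.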

  5. Retrospective stress-forecasting of earthquakes

    Science.gov (United States)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, which are the only place worldwide where SWS can be routinely monitored. Elsewhere SWS must be monitored above temporally active occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect ratios, allowing stress accumulation and stress relaxation to be identified. Monitoring SWS in SW Iceland in 1988, stress accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  6. The 2016 Central Italy Earthquake: an Overview

    Science.gov (United States)

    Amato, A.

    2016-12-01

    The M6 central Italy earthquake occurred on the seismic backbone of Italy, in the middle of the highest hazard belt. The shock hit suddenly during the night of August 24, when people were asleep; no foreshocks occurred before the main event. The earthquake ruptured from 10 km depth to the surface and produced more than 17,000 aftershocks (as of Oct. 19) spread over a 40x20 km2 area elongated NW-SE. It is geologically very similar to previous recent events of the Apennines. Both the 2009 L'Aquila earthquake to the south and the 1997 Colfiorito earthquake to the north were characterized by the activation of adjacent fault segments. Despite its magnitude and the well-known seismic hazard of the region, the earthquake produced extensive damage and 297 fatalities. The town of Amatrice, which paid the highest toll, had been classified in zone 1 (the highest) since 1915, but the buildings in this and other villages proved highly vulnerable. In contrast, in the town of Norcia, which also experienced strong ground shaking, no collapses occurred, most likely due to the retrofitting carried out after an earthquake in 1979. Soon after the quake, the INGV Crisis Unit convened at night in the Rome headquarters in order to coordinate the activities. The first field teams reached the epicentral area at 7 am with portable seismic stations to be installed to monitor the aftershocks; other teams followed to map surface faults and damage, to measure GPS sites, to install instruments for site response studies, and so on. The INGV Crisis Unit includes the Press office and the INGVterremoti team, in order to manage and coordinate communication towards the Civil Protection Dept. (DPC), the media and the web. Several tens of reports and updates have been delivered to DPC in the first month of the sequence. Also due to the controversial situation arising from the L'Aquila earthquake and trials, particular attention was given to communication: continuous and timely information has been released to

  7. Cyclic characteristics of earthquake time histories

    International Nuclear Information System (INIS)

    Hall, J.R. Jr; Shukla, D.K.; Kissenpfennig, J.F.

    1977-01-01

    From an engineering standpoint, an earthquake record may be characterized by a number of parameters, one of which is its 'cyclic characteristics'. The cyclic characteristics are most significant in fatigue analysis of structures and liquefaction analysis of soils where, in addition to the peak motion, cyclic buildup is significant. Whereas the duration, peak amplitude and response spectra of earthquakes have been studied extensively, the cyclic characteristics of earthquake records have not received equivalent attention. Present procedures to define the cyclic characteristics are generally based upon counting the number of peaks in various amplitude ranges on a record. This paper presents a computer approach which describes a time history by an amplitude envelope and a phase curve. Using Fast Fourier Transform techniques, an earthquake time history is represented as the projection along the x-axis of a rotating vector: the length of the vector is given by the amplitude envelope, and the angle between the vector and the x-axis is given by the phase curve. Thus one cycle is completed when the vector makes a full rotation. Based upon Miner's cumulative damage concept, the computer code automatically combines the cycles of various amplitudes to obtain the equivalent number of cycles of a given amplitude. To illustrate the overall results, the cyclic characteristics of several real and synthetic earthquake time histories have been studied and are presented in the paper, with the conclusion that this procedure provides a physical interpretation of the cyclic characteristics of earthquakes. (Auth.)
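
    A minimal Python sketch of the envelope-and-phase representation described above is given below. It is not the original computer code: it uses the analytic signal (Hilbert transform) to obtain an amplitude envelope and phase curve, counts one cycle per full rotation of the phase, and folds cycles of different amplitude into an equivalent number of cycles at a reference amplitude with a Miner-type weighting. The damage exponent b and the choice of the peak motion as reference amplitude are assumptions.

      import numpy as np
      from scipy.signal import hilbert

      def equivalent_cycles(acc, a_ref=None, b=2.0):
          """Return (n_cycles, n_equivalent) for an acceleration time history."""
          z = hilbert(acc)                    # analytic signal a(t)*exp(i*phi(t))
          envelope = np.abs(z)                # amplitude envelope a(t)
          phase = np.unwrap(np.angle(z))      # phase curve phi(t)

          # one cycle is completed each time the rotating vector turns through 2*pi
          turns = (phase - phase[0]) // (2.0 * np.pi)
          n_cycles = int(turns[-1])

          # amplitude assigned to each cycle = mean envelope over that turn
          amps = np.array([envelope[turns == k].mean() for k in range(n_cycles)])

          if a_ref is None:
              a_ref = np.abs(acc).max()       # reference amplitude = peak motion
          # Miner-type combination: each cycle contributes (a_i / a_ref)**b
          n_equivalent = float(np.sum((amps / a_ref) ** b))
          return n_cycles, n_equivalent

      # usage on a synthetic record: 20 s of decaying 2 Hz motion plus noise at 100 Hz
      t = np.linspace(0.0, 20.0, 2001)
      acc = np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)
      print(equivalent_cycles(acc))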

  8. Earthquake-enhanced permeability – evidence from carbon dioxide release following the ML3.5 earthquake in West Bohemia

    Czech Academy of Sciences Publication Activity Database

    Fischer, Tomáš; Matyska, C.; Heinicke, J.

    2017-01-01

    Roč. 460, February (2017), s. 60-67 ISSN 0012-821X R&D Projects: GA MŠk LM2010008 Institutional support: RVO:67985530 Keywords: earthquake swarms * fluid triggering * crustal CO2 * fault valve Subject RIV: DC - Seismology, Volcanology, Earth Structure OBOR OECD: Volcanology Impact factor: 4.409, year: 2016

  9. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip

    Science.gov (United States)

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A.; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne

    2018-01-01

    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 Mw (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes. PMID:29404404

  10. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    Science.gov (United States)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software package that was prototyped by interns the previous year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  11. New characteristics of intensity assessment of Sichuan Lushan "4.20" Ms 7.0 earthquake

    Science.gov (United States)

    Sun, Baitao; Yan, Peilei; Chen, Xiangzhao

    2014-08-01

    The rapid and accurate post-earthquake assessment of the macroscopic influence of seismic ground motion is of significance for earthquake emergency relief, post-earthquake reconstruction and scientific research. The seismic intensity distribution map released by the Lushan earthquake field team of the China Earthquake Administration (CEA) five days after the strong earthquake (M7.0) that occurred in Lushan County of Ya'an City, Sichuan, at 8:02 on April 20, 2013, provides a scientific basis for emergency relief, economic loss assessment and post-earthquake reconstruction. In this paper, the means for blind estimation of macroscopic intensity, field estimation of macroscopic intensity, and intensity review, as well as the corresponding problems, are discussed in detail, and the intensity distribution characteristics of the Lushan "4.20" M7.0 earthquake and its influential factors are analyzed, providing a reference for future seismic intensity assessments.

  12. 2001 Bhuj, India, earthquake engineering seismoscope recordings and Eastern North America ground-motion attenuation relations

    Science.gov (United States)

    Cramer, C.H.; Kumar, A.

    2003-01-01

    Engineering seismoscope data collected at distances less than 300 km for the M 7.7 Bhuj, India, mainshock are compatible with ground-motion attenuation in eastern North America (ENA). The mainshock ground-motion data have been corrected to a common geological site condition using the factors of Joyner and Boore (2000) and a classification scheme of Quaternary or Tertiary sediments or rock. We then compare these data to ENA ground-motion attenuation relations. Despite uncertainties in recording method, geological site corrections, common tectonic setting, and the amount of regional seismic attenuation, the corrected Bhuj dataset agrees with the collective predictions by ENA ground-motion attenuation relations within a factor of 2. This level of agreement is within the dataset uncertainties and the normal variance for recorded earthquake ground motions.

  13. Prediction and Validation of Heat Release Direct Injection Diesel Engine Using Multi-Zone Model

    Science.gov (United States)

    Anang Nugroho, Bagus; Sugiarto, Bambang; Prawoto; Shalahuddin, Lukman

    2014-04-01

    The objective of this study is to develop a simulation model capable of predicting the heat release of diesel combustion accurately and in efficient computation time. A multi-zone packet model has been applied to solve the combustion phenomena inside the diesel cylinder. The model formulations are presented first, and the numerical results are then validated against a single-cylinder direct-injection diesel engine at various engine speeds and injection timings. The model was found to be promising in fulfilling the objective above.
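
    For context on the validation side, the apparent heat-release rate is commonly recovered from a measured cylinder-pressure trace with the single-zone first-law expression dQ/dθ = γ/(γ-1)·p·dV/dθ + 1/(γ-1)·V·dp/dθ. The sketch below shows that standard analysis, not the multi-zone packet model of the record; the ratio of specific heats and the engine geometry numbers are placeholder assumptions.

      import numpy as np

      def cylinder_volume(theta_deg, bore=0.10, stroke=0.11, conrod=0.18, comp_ratio=17.0):
          """Slider-crank cylinder volume [m^3] versus crank angle [deg after TDC]."""
          theta = np.radians(theta_deg)
          a = stroke / 2.0                              # crank radius
          v_disp = np.pi / 4.0 * bore**2 * stroke       # displaced volume
          v_clear = v_disp / (comp_ratio - 1.0)         # clearance volume
          s = a * np.cos(theta) + np.sqrt(conrod**2 - (a * np.sin(theta))**2)
          return v_clear + np.pi / 4.0 * bore**2 * (conrod + a - s)

      def apparent_heat_release_rate(theta_deg, pressure_pa, gamma=1.35):
          """Net apparent heat-release rate dQ/dtheta [J/deg] from cylinder pressure."""
          volume = cylinder_volume(theta_deg)
          dv = np.gradient(volume, theta_deg)
          dp = np.gradient(pressure_pa, theta_deg)
          return (gamma / (gamma - 1.0)) * pressure_pa * dv \
              + (1.0 / (gamma - 1.0)) * volume * dp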

  14. Cooperative earthquake research between the United States and the People's Republic of China

    Energy Technology Data Exchange (ETDEWEB)

    Russ, D.P.; Johnson, L.E.

    1986-01-01

    This paper describes cooperative research by scientists of the US and the People's Republic of China (PRC) which has resulted in important new findings concerning the fundamental characteristics of earthquakes and new insight into mitigating earthquake hazards. There have been over 35 projects cooperatively sponsored by the Earthquake Studies Protocol in the past 5 years. The projects are organized into seven annexes, including investigations in earthquake prediction, intraplate faults and earthquakes, earthquake engineering and hazards investigation, deep crustal structure, rock mechanics, seismology, and data exchange. Operational earthquake prediction experiments are currently being developed at two primary sites: western Yunnan Province near the town of Xiaguan, where there are several active faults, and the northeast China plain, where the devastating 1976 Tangshan earthquake occurred.

  15. Regional Seismic Amplitude Modeling and Tomography for Earthquake-Explosion Discrimination

    Science.gov (United States)

    Walter, W. R.; Pasyanos, M. E.; Matzel, E.; Gok, R.; Sweeney, J.; Ford, S. R.; Rodgers, A. J.

    2008-12-01

    Empirically explosions have been discriminated from natural earthquakes using regional amplitude ratio techniques such as P/S in a variety of frequency bands. We demonstrate that such ratios discriminate nuclear tests from earthquakes using closely located pairs of earthquakes and explosions recorded on common, publicly available stations at test sites around the world (e.g. Nevada, Novaya Zemlya, Semipalatinsk, Lop Nor, India, Pakistan, and North Korea). We are examining if there is any relationship between the observed P/S and the point source variability revealed by longer period full waveform modeling. For example, regional waveform modeling shows strong tectonic release from the May 1998 India test, in contrast with very little tectonic release in the October 2006 North Korea test, but the P/S discrimination behavior appears similar in both events using the limited regional data available. While regional amplitude ratios such as P/S can separate events in close proximity, it is also empirically well known that path effects can greatly distort observed amplitudes and make earthquakes appear very explosion-like. Previously we have shown that the MDAC (Magnitude Distance Amplitude Correction, Walter and Taylor, 2001) technique can account for simple 1-D attenuation and geometrical spreading corrections, as well as magnitude and site effects. However in some regions 1-D path corrections are a poor approximation and we need to develop 2-D path corrections. Here we demonstrate a new 2-D attenuation tomography technique using the MDAC earthquake source model applied to a set of events and stations in both the Middle East and the Yellow Sea Korean Peninsula regions. We believe this new 2-D MDAC tomography has the potential to greatly improve earthquake-explosion discrimination, particularly in tectonically complex regions such as the Middle East.
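
    An illustrative sketch of the basic amplitude-ratio measurement behind this discriminant is given below. It is not the MDAC code: it simply band-passes a record, takes RMS amplitudes in analyst-supplied P and S windows, and forms log10(P/S). The 6-8 Hz band, the window times and the filter order are assumptions; in practice the amplitudes are first corrected for magnitude, distance, path attenuation and site effects as described above.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def log_p_over_s(waveform, fs, p_window, s_window, band=(6.0, 8.0)):
          """log10 of the band-passed P/S RMS amplitude ratio.

          waveform: 1-D array of ground-motion samples
          fs: sampling rate [Hz]
          p_window, s_window: (start_s, end_s) measured from the trace start
          """
          b, a = butter(4, band, btype="bandpass", fs=fs)
          filtered = filtfilt(b, a, waveform)

          def rms(window):
              i0, i1 = int(window[0] * fs), int(window[1] * fs)
              segment = filtered[i0:i1]
              return np.sqrt(np.mean(segment**2))

          return np.log10(rms(p_window) / rms(s_window))

      # After path and site corrections, larger log10(P/S) values fall toward the
      # explosion-like end of the discriminant; smaller values are earthquake-like.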

  16. An interdisciplinary approach to study Pre-Earthquake processes

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S. A.; Hattori, K.; Taylor, P. T.

    2017-12-01

    We will summarize a multi-year research effort on wide-ranging observations of pre-earthquake processes. Based on space and ground data, we present some new results relevant to the existence of pre-earthquake signals. Over the past 15-20 years there has been a major revival of interest in pre-earthquake studies in Japan, Russia, China, the EU, Taiwan and elsewhere. Recent large-magnitude earthquakes in Asia and Europe have shown the importance of these various studies in the search for earthquake precursors, either for forecasting or prediction. Some new results were obtained from modeling of the atmosphere-ionosphere connection and analyses of seismic records (foreshocks/aftershocks), geochemical, electromagnetic, and thermodynamic processes related to stress changes in the lithosphere, along with their statistical and physical validation. This cross-disciplinary approach could make an impact on our further understanding of the physics of earthquakes and the phenomena that precede their energy release. We also present the potential impact of these interdisciplinary studies on earthquake predictability. A detailed summary of our approach and that of several international researchers will be part of this session and will subsequently be published in a new AGU/Wiley volume. This book is part of the Geophysical Monograph series and is intended to show the variety of parameters (seismic, atmospheric, geochemical and historical) involved in this important field of research, and will bring this knowledge and awareness to a broader geosciences community.

  17. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    Science.gov (United States)

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467
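
    As a reference point for the Gutenberg-Richter behaviour mentioned above, the b-value of an earthquake catalog is routinely estimated with the Aki (1965) maximum-likelihood formula, b = log10(e) / (mean(M) - Mc). The sketch below is a generic illustration, not part of the authors' fusion-fission model; the completeness magnitude and the synthetic catalog are assumptions.

      import numpy as np

      def gr_b_value(magnitudes, m_c):
          """Maximum-likelihood Gutenberg-Richter b-value for events with M >= m_c."""
          m = np.asarray(magnitudes, dtype=float)
          m = m[m >= m_c]
          return np.log10(np.e) / (m.mean() - m_c)
      # (for magnitudes reported in bins of width dm, replace m_c with m_c - dm/2)

      # usage: a synthetic catalog drawn with a true b-value of 1.0 above Mc = 2.0
      rng = np.random.default_rng(0)
      m_c_demo = 2.0
      mags = m_c_demo + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)
      print(round(gr_b_value(mags, m_c_demo), 2))   # close to 1.0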

  18. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator is described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness deterioration mechanism which simulates the cracking and other damage suffered by the structure

  19. Effects of Strike-Slip Fault Segmentation on Earthquake Energy and Seismic Hazard

    Science.gov (United States)

    Madden, E. H.; Cooke, M. L.; Savage, H. M.; McBeck, J.

    2014-12-01

    Many major strike-slip faults are segmented along strike, including those along plate boundaries in California and Turkey. Failure of distinct fault segments at depth may be the source of multiple pulses of seismic radiation observed for single earthquakes. However, how and when segmentation affects fault behavior and energy release is the basis of many outstanding questions related to the physics of faulting and seismic hazard. These include the probability for a single earthquake to rupture multiple fault segments and the effects of segmentation on earthquake magnitude, radiated seismic energy, and ground motions. Using numerical models, we quantify components of the earthquake energy budget, including the tectonic work acting externally on the system, the energy of internal rock strain, the energy required to overcome fault strength and initiate slip, the energy required to overcome frictional resistance during slip, and the radiated seismic energy. We compare the energy budgets of systems of two en echelon fault segments with various spacing that include both releasing and restraining steps. First, we allow the fault segments to fail simultaneously and capture the effects of segmentation geometry on the earthquake energy budget and on the efficiency with which applied displacement is accommodated. Assuming that higher efficiency correlates with higher probability for a single, larger earthquake, this approach has utility for assessing the seismic hazard of segmented faults. Second, we nucleate slip along a weak portion of one fault segment and let the quasi-static rupture propagate across the system. Allowing fractures to form near faults in these models shows that damage develops within releasing steps and promotes slip along the second fault, while damage develops outside of restraining steps and can prohibit slip along the second fault. Work is consumed in both the propagation of and frictional slip along these new fractures, impacting the energy available

  20. Strain Anomalies during an Earthquake Sequence in the South Iceland Seismic Zone

    Science.gov (United States)

    Arnadottir, T.; Haines, A. J.; Geirsson, H.; Hreinsdottir, S.

    2017-12-01

    The South Iceland Seismic Zone (SISZ) accommodates E-W translation due to oblique spreading between the North American/Hreppar microplate and Eurasian plate, in South Iceland. Strain is released in the SISZ during earthquake sequences that last days to years, at average intervals of 80-100 years. The SISZ is currently in the midst of an earthquake sequence that started with two M6.5 earthquakes in June 2000, and continued with two M6 earthquakes in May 2008. Estimates of geometric strain accumulation, and seismic strain release in these events indicate that they released at most only half of the strain accumulated since the last earthquake cycle in 1896-1912. Annual GPS campaigns and continuous measurements during 2001-2015 were used to calculate station velocities and strain rates from a new method using the vertical derivatives of horizontal stress (VDoHS). This new method allows higher resolution of strain rates than other (older) approaches, as the strain rates are estimated by integrating VDoHS rates obtained by inversion rather than differentiating interpolated GPS velocities. Estimating the strain rates for eight 1-2 year intervals indicates temporal and spatial variation of strain rates in the SISZ. In addition to earthquake faulting, the strain rates in the SISZ are influenced by anthropogenic signals due to geothermal exploitation, and magma movements in neighboring volcanoes - Hekla and Eyjafjallajökull. Subtle signals of post-seismic strain rate changes are seen following the June 2000 M6.5 main shocks, but interestingly, much larger strain rate variations are observed after the two May 2008 M6 main shocks. A prominent strain anomaly is evident in the epicentral area prior to the May 2008 earthquake sequence. The strain signal persists over at least 4 years in the epicentral area, leading up to the M6 main shocks. The strain is primarily extension in ESE-WNW direction (sub-parallel to the direction of plate spreading), but overall shear across the N

  1. Housing Damage Following Earthquake

    Science.gov (United States)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grained materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  2. In vivo drug release behavior and osseointegration of a doxorubicin-loaded tissue-engineered scaffold

    DEFF Research Database (Denmark)

    Sun, Ming; Chen, Muwan; Wang, Miao

    2016-01-01

    Bone tissue-engineered scaffolds with therapeutic effects must meet two basic requirements: to support bone healing at the defect site and to release an effective drug dose within the therapeutic window. Here, a rapid prototyped PCL scaffold embedded with chitosan/nanoclay/β-tricalcium phosphate

  3. Sensitivity of the engineered barrier system (EBS) release rate to alternative conceptual models of advective release from waste packages under dripping fractures

    International Nuclear Information System (INIS)

    Lee, J.H.; Atkins, J.E.; McNeish, J.A.; Vallikat, V.

    1996-01-01

    Simulations were conducted to analyze the sensitivity of the engineered barrier system (EBS) release rate to alternative conceptual models of the advective release from waste packages under dripping fractures. The first conceptual model assumed that dripping water directly contacts the waste form inside the 'failed' waste package, and radionuclides are released from the EBS by advection. The second conceptual model assumed that dripping water is diverted around the 'failed' waste package (because of the presence of corrosion products plugging the perforations) and dripping water is prevented from directly contacting the waste form. In the second model, radionuclides were assumed to transport through the perforations by diffusion, and, once outside the waste package, to be released from the EBS by advection. The second model was to incorporate more realism into the EBS release calculations. For the case with the second EBS release model, most radionuclides had significantly lower peak EBS release rates (from at least one to several orders of magnitude) than with the first EBS release model. The impacts of the alternative EBS release models were greater for the radionuclides with a low solubility (or solubility-limited radionuclides) than for the radionuclides with a high solubility (or waste form dissolution-limited radionuclides). The analyses indicated that the EBS release model representing advection through a 'failed' waste package (the first EBS release model) may be too conservative in predicting the EBS performance. One major implication from this sensitivity study was that a 'failed' waste package container with multiple perforations may still be able to perform effectively as an important barrier to radionuclide release. (author)
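
    The difference between the two conceptual models can be illustrated with a toy calculation: advective release of a solubility-limited radionuclide carried by drip water through a failed package, versus diffusive release through corrosion perforations. The sketch below is not the performance-assessment model used in the study, and every parameter value is an illustrative assumption.

      # Toy comparison of the two EBS release pathways; all values are assumed.
      SOLUBILITY = 1.0e-3        # dissolved concentration at the waste form [g/m^3]
      DRIP_RATE = 0.1            # water flux through the failed package [m^3/yr]
      N_PERFORATIONS = 100       # number of corrosion perforations
      PERFORATION_AREA = 1.0e-6  # area of one perforation [m^2]
      DIFF_COEFF = 1.0e-2        # effective diffusion coefficient [m^2/yr]
      WALL_THICKNESS = 0.02      # diffusion path length across the container wall [m]

      # model 1: dripping water contacts the waste form and advects dissolved mass out
      advective_release = SOLUBILITY * DRIP_RATE                                   # g/yr

      # model 2: water is diverted; mass leaves only by diffusion through perforations
      diffusive_release = (N_PERFORATIONS * PERFORATION_AREA * DIFF_COEFF
                           * SOLUBILITY / WALL_THICKNESS)                          # g/yr

      print(f"advective release:  {advective_release:.1e} g/yr")
      print(f"diffusive release:  {diffusive_release:.1e} g/yr")
      print(f"ratio (advective/diffusive): {advective_release / diffusive_release:.0f}")

    With these placeholder numbers the diffusive pathway is several orders of magnitude slower, which is the qualitative behaviour the record reports for the solubility-limited radionuclides.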

  4. Performance of HEPA filters at LLNL following the 1980 and 1989 earthquakes

    International Nuclear Information System (INIS)

    Bergman, W.; Elliott, J.; Wilson, K.

    1995-01-01

    The Lawrence Livermore National Laboratory has experienced two significant earthquakes for which data are available to assess the ability of HEPA filters to withstand seismic conditions. A 5.9 magnitude earthquake with an epicenter 10 miles from LLNL struck on January 24, 1980. Estimates of the peak ground accelerations ranged from 0.2 to 0.3 g. A 7.0 magnitude earthquake with an epicenter about 50 miles from LLNL struck on October 17, 1989. Measurements of the ground accelerations at LLNL averaged 0.1 g. The results from the in-place filter tests obtained after each of the earthquakes were compiled and studied to determine if the earthquakes had caused filter leakage. Our study showed that only the 1980 earthquake resulted in a small increase in the number of HEPA filters developing leaks. In the 12 months following the 1980 and 1989 earthquakes, the in-place filter tests showed that 8.0% and 4.1% of all filters, respectively, developed leaks. The average percentage of filters developing leaks from 1980 to 1993 was 3.3%+/-1.7%. The increase in filter leaks is significant for the 1980 earthquake, but not for the 1989 earthquake. No contamination was detected following the earthquakes that would suggest transient releases from the filtration system

  5. Performance of HEPA filters at LLNL following the 1980 and 1989 earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Elliott, J.; Wilson, K. [Lawrence Livermore National Laboratory, CA (United States)

    1995-02-01

    The Lawrence Livermore National Laboratory has experienced two significant earthquakes for which data are available to assess the ability of HEPA filters to withstand seismic conditions. A 5.9 magnitude earthquake with an epicenter 10 miles from LLNL struck on January 24, 1980. Estimates of the peak ground accelerations ranged from 0.2 to 0.3 g. A 7.0 magnitude earthquake with an epicenter about 50 miles from LLNL struck on October 17, 1989. Measurements of the ground accelerations at LLNL averaged 0.1 g. The results from the in-place filter tests obtained after each of the earthquakes were compiled and studied to determine if the earthquakes had caused filter leakage. Our study showed that only the 1980 earthquake resulted in a small increase in the number of HEPA filters developing leaks. In the 12 months following the 1980 and 1989 earthquakes, the in-place filter tests showed that 8.0% and 4.1% of all filters, respectively, developed leaks. The average percentage of filters developing leaks from 1980 to 1993 was 3.3%+/-1.7%. The increase in filter leaks is significant for the 1980 earthquake, but not for the 1989 earthquake. No contamination was detected following the earthquakes that would suggest transient releases from the filtration system.
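
    The significance statements in the two records above can be checked with a back-of-the-envelope comparison of each post-earthquake leak rate against the 1980-1993 baseline of 3.3% +/- 1.7%. The sketch below treats the quoted +/- value as one standard deviation and uses a ~2-sigma screening level; both choices are assumptions, not taken from the reports.

      BASELINE_MEAN = 3.3   # percent of filters developing leaks per year, 1980-1993
      BASELINE_STD = 1.7    # treated here as one standard deviation

      for year, leak_rate in [(1980, 8.0), (1989, 4.1)]:
          z = (leak_rate - BASELINE_MEAN) / BASELINE_STD
          verdict = "outside" if abs(z) > 2.0 else "within"
          print(f"{year}: {leak_rate}% leaking filters, z = {z:+.1f} ({verdict} ~2 sigma)")

      # 1980 comes out near z = +2.8 (a notable excess) while 1989 is about z = +0.5,
      # consistent with the reports' conclusion.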

  6. United States earthquake early warning system: how theory and analysis can save America before the big one happens

    OpenAIRE

    Rockabrand, Ryan

    2017-01-01

    Approved for public release; distribution is unlimited The United States is extremely vulnerable to catastrophic earthquakes. More than 143 million Americans may be threatened by damaging earthquakes in the next 50 years. This thesis argues that the United States is unprepared for the most catastrophic earthquakes the country faces today. Earthquake early warning systems are a major solution in practice to reduce economic risk, to protect property and the environment, and to save lives. Ot...

  7. Experimental evidence that thrust earthquake ruptures might open faults.

    Science.gov (United States)

    Gabuchian, Vahe; Rosakis, Ares J; Bhat, Harsha S; Madariaga, Raúl; Kanamori, Hiroo

    2017-05-18

    Many of Earth's great earthquakes occur on thrust faults. These earthquakes predominantly occur within subduction zones, such as the 2011 moment magnitude 9.0 earthquake in Tohoku-Oki, Japan, or along large collision zones, such as the 1999 moment magnitude 7.7 earthquake in Chi-Chi, Taiwan. Notably, these two earthquakes had a maximum slip that was very close to the surface. This contributed to the destructive tsunami that occurred during the Tohoku-Oki event and to the large amount of structural damage caused by the Chi-Chi event. The mechanism that results in such large slip near the surface is poorly understood, as shallow parts of thrust faults are considered to be frictionally stable. Here we use earthquake rupture experiments to reveal the existence of a torquing mechanism of thrust fault ruptures near the free surface that causes them to unclamp and slip large distances. Complementary numerical modelling of the experiments confirms that the hanging-wall wedge undergoes pronounced rotation in one direction as the earthquake rupture approaches the free surface, and this torque is released as soon as the rupture breaks the free surface, resulting in the unclamping and violent 'flapping' of the hanging-wall wedge. Our results imply that the shallow extent of the seismogenic zone of a subducting interface is not fixed and can extend up to the trench during great earthquakes through a torquing mechanism.

  8. POST Earthquake Debris Management — AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as material for construction or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  9. Rapid acceleration leads to rapid weakening in earthquake-like laboratory experiments

    Science.gov (United States)

    Chang, Jefferson C.; Lockner, David A.; Reches, Z.

    2012-01-01

    After nucleation, a large earthquake propagates as an expanding rupture front along a fault. This front activates countless fault patches that slip by consuming energy stored in Earth’s crust. We simulated the slip of a fault patch by rapidly loading an experimental fault with energy stored in a spinning flywheel. The spontaneous evolution of strength, acceleration, and velocity indicates that our experiments are proxies of fault-patch behavior during earthquakes of moment magnitude (Mw) = 4 to 8. We show that seismically determined earthquake parameters (e.g., displacement, velocity, magnitude, or fracture energy) can be used to estimate the intensity of the energy release during an earthquake. Our experiments further indicate that high acceleration imposed by the earthquake’s rupture front quickens dynamic weakening by intense wear of the fault zone.

  10. [Engineering aspects of seismic behavior of health-care facilities: lessons from California earthquakes].

    Science.gov (United States)

    Rutenberg, A

    1995-03-15

    The construction of health-care facilities is similar to that of other buildings. Yet the need to function immediately after an earthquake, the helplessness of the many patients and the high and continuous occupancy of these buildings, require that special attention be paid to their seismic performance. Here the lessons from the California experience are invaluable. In this paper the behavior of California hospitals during destructive earthquakes is briefly described. Adequate structural design and execution, and securing of nonstructural elements are required to ensure both safety of occupants, and practically uninterrupted functioning of equipment, mechanical and electrical services and other vital systems. Criteria for post-earthquake functioning are listed. In view of the hazards to Israeli hospitals, in particular those located along the Jordan Valley and the Arava, a program for the seismic evaluation of medical facilities should be initiated. This evaluation should consider the hazards from nonstructural elements, the safety of equipment and systems, and their ability to function after a severe earthquake. It should not merely concentrate on safety-related structural behavior.

  11. Correlation between Earthquakes and AE Monitoring of Historical Buildings in Seismic Areas

    Directory of Open Access Journals (Sweden)

    Giuseppe Lacidogna

    2015-12-01

    Full Text Available In this contribution a new method for evaluating seismic risk in regional areas based on the acoustic emission (AE) technique is proposed. Most earthquakes have precursors, i.e., phenomena of changes in the Earth’s physical-chemical properties that take place prior to an earthquake. Acoustic emissions in materials and earthquakes in the Earth’s crust, despite the fact that they take place on very different scales, are very similar phenomena; both are caused by a release of elastic energy from a source located in a medium. For the AE monitoring, two important constructions of Italian cultural heritage are considered: the chapel of the “Sacred Mountain of Varallo” and the “Asinelli Tower” of Bologna. They were monitored during earthquake sequences in their respective areas. By using the Grassberger-Procaccia algorithm, a statistical method of analysis was developed that detects AEs as earthquake precursors or aftershocks. Under certain conditions it was observed that AEs precede earthquakes. These considerations reinforce the idea that AE monitoring can be considered an effective tool for earthquake risk evaluation.
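
    The Grassberger-Procaccia analysis mentioned above is built on the correlation integral C(r), the fraction of event pairs separated by less than r, whose log-log slope gives the correlation dimension. The sketch below illustrates the computation for a one-dimensional series of event times only; it is a simplification, not the authors' implementation, and the radii and synthetic data are assumptions.

      import numpy as np

      def correlation_integral(event_times, radii):
          """C(r): fraction of event pairs closer in time than r, for each r."""
          t = np.sort(np.asarray(event_times, dtype=float))
          n = t.size
          # pairwise separations |t_i - t_j| for i < j
          separations = np.abs(t[:, None] - t[None, :])[np.triu_indices(n, k=1)]
          return np.array([np.mean(separations < r) for r in radii])

      def correlation_dimension(event_times, radii):
          """Slope of log C(r) versus log r over the supplied scaling range."""
          c = correlation_integral(event_times, radii)
          valid = c > 0
          slope, _ = np.polyfit(np.log(radii[valid]), np.log(c[valid]), 1)
          return slope

      # usage: uniformly random times give a dimension near 1; clustered times give less
      rng = np.random.default_rng(1)
      uniform_times = rng.uniform(0.0, 100.0, 300)
      radii = np.logspace(-1, 1, 20)
      print(correlation_dimension(uniform_times, radii))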

  12. Leveraging geodetic data to reduce losses from earthquakes

    Science.gov (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    event response products and by expanded use of geodetic imaging data to assess fault rupture and source parameters. Uncertainties in the NSHM, and in regional earthquake models, are reduced by fully incorporating geodetic data into earthquake probability calculations. Geodetic networks and data are integrated into the operations and earthquake information products of the Advanced National Seismic System (ANSS). Earthquake early warnings are improved by more rapidly assessing ground displacement and the dynamic faulting process for the largest earthquakes using real-time geodetic data. Methodology for probabilistic earthquake forecasting is refined by including geodetic data when calculating evolving moment release during aftershock sequences and by better understanding the implications of transient deformation for earthquake likelihood. A geodesy program that encompasses a balanced mix of activities to sustain mission-critical capabilities, grows new competencies through the continuum of fundamental to applied research, and ensures sufficient resources for these endeavors provides a foundation by which the EHP can be a leader in the application of geodesy to earthquake science. With this in mind, the following objectives provide a framework to guide EHP efforts: • Fully utilize geodetic information to improve key products, such as the NSHM and EEW, and to address new ventures like the USGS Subduction Zone Science Plan. • Expand the variety, accuracy, and timeliness of post-earthquake information products, such as PAGER (Prompt Assessment of Global Earthquakes for Response), through incorporation of geodetic observations. • Determine if geodetic measurements of transient deformation can significantly improve estimates of earthquake probability. • Maintain an observational strategy aligned with the target outcomes of this document that includes continuous monitoring, recording of ephemeral observations, focused data collection for use in research, and application-driven data processing and

  13. Fabrication and characterization of a rapid prototyped tissue engineering scaffold with embedded multicomponent matrix for controlled drug release

    Directory of Open Access Journals (Sweden)

    Chen M

    2012-08-01

    Full Text Available Muwan Chen,1,2 Dang QS Le,1,2 San Hein,2 Pengcheng Li,1 Jens V Nygaard,2 Moustapha Kassem,3 Jørgen Kjems,2 Flemming Besenbacher,2 Cody Bünger1; 1Orthopaedic Research Lab, Aarhus University Hospital, Aarhus C, Denmark; 2Interdisciplinary Nanoscience Center (iNANO), Aarhus University, Aarhus C, Denmark; 3Department of Endocrinology and Metabolism, Odense University Hospital, Odense C, Denmark. Abstract: Bone tissue engineering implants with sustained local drug delivery provide an opportunity for better postoperative care for bone tumor patients because these implants offer sustained drug release at the tumor site and reduce systemic side effects. A rapid prototyped macroporous polycaprolactone scaffold was embedded with a porous matrix composed of chitosan, nanoclay, and β-tricalcium phosphate by freeze-drying. This composite scaffold was evaluated on its ability to deliver an anthracycline antibiotic and to promote formation of mineralized matrix in vitro. Scanning electron microscopy, confocal imaging, and DNA quantification confirmed that immortalized human bone marrow-derived mesenchymal stem cells (hMSC-TERT) cultured in the scaffold showed high cell viability and growth, and good cell infiltration into the pores of the scaffold. Alkaline phosphatase activity and osteocalcin staining showed that the scaffold was osteoinductive. The drug-release kinetics were investigated by loading doxorubicin into the scaffold. The scaffolds comprising nanoclay released up to 45% of the drug for up to 2 months, while the scaffold without nanoclay released 95% of the drug within 4 days. Therefore, this scaffold can fulfill the requirements for both bone tissue engineering and local sustained release of an anticancer drug in vitro. These results suggest that the scaffold can be used clinically in reconstructive surgery after bone tumor resection. Moreover, by changing the composition and amount of individual components, the scaffold can find application in other
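
    The two release profiles quoted above (roughly 95% of the doxorubicin within 4 days without nanoclay versus about 45% over 2 months with nanoclay) can be compared with a simple first-order release model, M(t)/M_inf = 1 - exp(-k t). The sketch below is only an illustration of that comparison, not an analysis performed in the paper, and fitting a single point per profile is a deliberate simplification.

      import numpy as np

      def first_order_rate_constant(fraction_released, t_days):
          """Rate constant k [1/day] implied by one (fraction released, time) point."""
          return -np.log(1.0 - fraction_released) / t_days

      k_without_nanoclay = first_order_rate_constant(0.95, 4.0)    # fast burst release
      k_with_nanoclay = first_order_rate_constant(0.45, 60.0)      # sustained release

      print(f"without nanoclay: k ~ {k_without_nanoclay:.3f} per day")
      print(f"with nanoclay:    k ~ {k_with_nanoclay:.4f} per day")
      print(f"apparent slow-down factor: ~{k_without_nanoclay / k_with_nanoclay:.0f}x")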

  14. Composite microsphere-functionalized scaffold for the controlled release of small molecules in tissue engineering

    Directory of Open Access Journals (Sweden)

    Laura Pandolfi

    2016-01-01

    Full Text Available Current tissue engineering strategies focus on restoring damaged tissue architectures using biologically active scaffolds. The ideal scaffold would mimic the extracellular matrix of any tissue of interest, promoting cell proliferation and de novo extracellular matrix deposition. A plethora of techniques have been evaluated to engineer scaffolds for the controlled and targeted release of bioactive molecules, to provide a functional structure for tissue growth and remodeling, as well as to enhance recruitment and proliferation of autologous cells within the implant. Recently, novel approaches using small molecules, instead of growth factors, have been exploited to regulate tissue regeneration. The use of small synthetic molecules could be very advantageous because of their stability, tunability, and low cost. Herein, we propose a chitosan–gelatin scaffold functionalized with composite microspheres consisting of mesoporous silicon microparticles and poly(dl-lactic-co-glycolic acid) for the controlled release of sphingosine-1-phosphate, a small molecule of interest. We characterized the platform with scanning electron microscopy, Fourier transform infrared spectroscopy, and confocal microscopy. Finally, the biocompatibility of this multiscale system was analyzed by culturing human mesenchymal stem cells onto the scaffold. The presented strategy establishes the basis of a versatile scaffold for the controlled release of small molecules and for culturing mesenchymal stem cells for regenerative medicine applications.

  15. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Di Giacomo, Domenico; Engdhal, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product to use for seismic hazard studies. The new catalogue was necessary because improved seismic hazard studies require that earthquake catalogues are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue both for earthquakes that occurred beyond 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year program that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as a basis for cross-checking the consistency in location and magnitude of those earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.
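
    The time-variable cut-off described above can be expressed as a simple selection rule; the sketch below applies the quoted thresholds to a toy list of (year, Ms) pairs. The catalogue representation is an assumption made only for illustration.

      def passes_cutoff(year, ms):
          """ISC-GEM style time-variable magnitude cut-off quoted in the record."""
          if year < 1918:
              return ms >= 7.5
          if year < 1964:        # 1918-1963
              return ms >= 6.25
          return ms >= 5.5       # 1964 onwards

      catalogue = [(1906, 8.2), (1950, 6.0), (1950, 6.8), (1995, 5.6)]  # (year, Ms)
      selected = [event for event in catalogue if passes_cutoff(*event)]
      print(selected)   # [(1906, 8.2), (1950, 6.8), (1995, 5.6)]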

  16. Intermediate temperature heat release in an HCCI engine fueled by ethanol/n-heptane mixtures: An experimental and modeling study

    KAUST Repository

    Vuilleumier, David

    2014-03-01

    This study examines intermediate temperature heat release (ITHR) in homogeneous charge compression ignition (HCCI) engines using blends of ethanol and n-heptane. Experiments were performed over the range of 0-50% n-heptane liquid volume fractions, at equivalence ratios of 0.4 and 0.5, and intake pressures from 1.4 bar to 2.2 bar. ITHR was induced in the mixtures containing predominantly ethanol through the addition of small amounts of n-heptane. After a critical threshold, additional n-heptane content yielded low temperature heat release (LTHR). A method for quantifying the amount of heat released during ITHR was developed by examining the second derivative of heat release, and this method was then used to identify trends in the engine data. The combustion process inside the engine was modeled using a single-zone HCCI model, and good qualitative agreement of pre-ignition pressure rise and heat release rate was found between experimental and modeling results using a detailed n-heptane/ethanol chemical kinetic model. The simulation results were used to identify the dominant reaction pathways contributing to ITHR, as well as to verify the chemical basis behind the quantification of the amount of ITHR in the experimental analysis. The dominant reaction pathways contributing to ITHR were found to be H-atom abstraction from n-heptane by OH and the addition of fuel radicals to O2. © 2013 The Combustion Institute.
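
    The abstract quantifies ITHR by inspecting the second derivative of the heat release; a minimal sketch of one way this could be done numerically is given below. The crank-angle grid, the synthetic heat-release trace, and the thresholding rule are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np

# Minimal sketch: locate an ITHR window from the curvature (second derivative)
# of a cumulative heat-release trace. The synthetic trace and the thresholds
# are illustrative assumptions, not the study's procedure.

theta = np.linspace(-30.0, 10.0, 401)            # crank angle, deg
# Synthetic cumulative heat release: slow pre-ignition rise + main ignition
q = 5.0 * np.exp(0.12 * (theta + 30.0)) + 200.0 / (1.0 + np.exp(-(theta - 5.0)))

dq = np.gradient(q, theta)                       # heat release rate
d2q = np.gradient(dq, theta)                     # second derivative

# Flag a pre-ignition region where the curvature exceeds a small threshold
# but the main ignition has not yet started (heat release rate still low).
ithr_mask = (d2q > 0.05) & (dq < 0.2 * dq.max()) & (theta < 0.0)
ithr_heat = np.trapz(dq[ithr_mask], theta[ithr_mask]) if ithr_mask.any() else 0.0
print(f"approximate heat released during ITHR window: {ithr_heat:.1f} (arb. units)")
```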

  17. Intermediate temperature heat release in an HCCI engine fueled by ethanol/n-heptane mixtures: An experimental and modeling study

    KAUST Repository

    Vuilleumier, David; Kozarac, Darko; Mehl, Marco; Saxena, Samveg; Pitz, William J.; Dibble, Robert W.; Chen, Jyhyuan; Sarathy, Mani

    2014-01-01

    This study examines intermediate temperature heat release (ITHR) in homogeneous charge compression ignition (HCCI) engines using blends of ethanol and n-heptane. Experiments were performed over the range of 0-50% n-heptane liquid volume fractions, at equivalence ratios of 0.4 and 0.5, and intake pressures from 1.4 bar to 2.2 bar. ITHR was induced in the mixtures containing predominantly ethanol through the addition of small amounts of n-heptane. After a critical threshold, additional n-heptane content yielded low temperature heat release (LTHR). A method for quantifying the amount of heat released during ITHR was developed by examining the second derivative of heat release, and this method was then used to identify trends in the engine data. The combustion process inside the engine was modeled using a single-zone HCCI model, and good qualitative agreement of pre-ignition pressure rise and heat release rate was found between experimental and modeling results using a detailed n-heptane/ethanol chemical kinetic model. The simulation results were used to identify the dominant reaction pathways contributing to ITHR, as well as to verify the chemical basis behind the quantification of the amount of ITHR in the experimental analysis. The dominant reaction pathways contributing to ITHR were found to be H-atom abstraction from n-heptane by OH and the addition of fuel radicals to O2. © 2013 The Combustion Institute.

  18. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    source and propagation of seismic waves. In many cases, active faults are capable of building up and suddenly releasing tectonic stress. Hence, monitoring the active fault systems near the epicentral regions of past earthquakes would be a necessity. In this paper, we try to detect possible anomalies in surface latent heat flux (SLHF) and air temperature (AT) during two moderate earthquakes of M 6 - 6.5 in Iran and explain the relationships between the seismic activity prior to these earthquakes and active faulting in the area. Our analysis shows abnormal SLHF 5-10 days before these earthquakes. Meaningful anomalous concentrations usually occurred in the epicentral area. On the other hand, the spatial distributions of these variations were in accordance with the local active faults. It is concluded that the anomalous increase in SLHF shows great potential in providing early warning of a disastrous earthquake, provided that there is a better understanding of the background noise due to the seasonal effects and climatic factors involved. Changes in near-surface air temperature along nearby active faults, one or two weeks before the earthquakes, although not as significant as the SLHF changes, can be considered as another earthquake indicator.
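
    One common way to flag the kind of pre-earthquake SLHF anomaly described here is to compare daily values against a multi-year background for the same calendar day; the sketch below uses a simple z-score test. The synthetic data, window, and threshold are assumptions for illustration only and do not reproduce the authors' analysis.

```python
import numpy as np

# Minimal sketch: flag surface latent heat flux (SLHF) anomalies as days whose
# value exceeds the long-term mean for that calendar day by more than k
# standard deviations. Data, seasonal model and threshold are illustrative.

rng = np.random.default_rng(0)
years, days = 10, 365
background = 80 + 30 * np.sin(2 * np.pi * np.arange(days) / 365)   # seasonal cycle
slhf = background + rng.normal(0, 8, size=(years, days))           # W/m^2, synthetic

mean = slhf.mean(axis=0)
std = slhf.std(axis=0)

# A "current year" with an injected anomaly ~7 days before a hypothetical event
current = background + rng.normal(0, 8, size=days)
current[200:203] += 40.0

k = 2.5
anomalous_days = np.where(current > mean + k * std)[0]
print("anomalous days of year:", anomalous_days)
```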

  19. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, as well as testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  20. Earthquakes of Garhwal Himalaya region of NW Himalaya, India: A study of relocated earthquakes and their seismogenic source and stress

    Science.gov (United States)

    R, A. P.; Paul, A.; Singh, S.

    2017-12-01

    Since the continent-continent collision began 55 Ma, the Himalaya has accommodated 2000 km of convergence along its arc. Strain energy accumulates as the plates converge at 37-44 mm/yr and is released from time to time as earthquakes. The Garhwal Himalaya is located at the western side of a seismic gap, where a great earthquake has been overdue for at least 200 years. This seismic gap (the Central Seismic Gap: CSG), with a 52% probability of a future great earthquake, is located between the rupture zones of two significant/great earthquakes, viz. the 1905 Kangra earthquake of M 7.8 and the 1934 Bihar-Nepal earthquake of M 8.0; the most recent one, the 2015 Gorkha earthquake of M 7.8, lies on the eastern side of this seismic gap (CSG). The Garhwal Himalaya is one of the ideal locations in the Himalaya where all the major Himalayan structures and the Himalayan Seismicity Belt (HSB) can be clearly described and studied. In the present study, we present a spatio-temporal analysis of relocated local micro- to moderate earthquakes recorded by a seismicity monitoring network that has been operational since 2007. The earthquake locations are relocated using the HypoDD (double-difference hypocenter method for earthquake relocation) program. The dataset from July 2007 to September 2015 has been used in this study to estimate the spatio-temporal relationships, moment tensor (MT) solutions for earthquakes of M>3.0, stress tensors and their interactions. We have also used composite focal mechanism solutions for small earthquakes. The majority of the MT solutions show a thrust-type mechanism and are located near the mid-crustal ramp (MCR) structure of the detachment surface at 8-15 km depth beneath the outer Lesser Himalaya and Higher Himalaya regions. The prevailing stress has been identified to be compressional towards NNE-SSW, which is the direction of relative plate motion between the Indian and Eurasian continental plates. The low friction coefficient estimated along with the stress inversions

  1. Characteristics of broadband slow earthquakes explained by a Brownian model

    Science.gov (United States)

    Ide, S.; Takeo, A.

    2017-12-01

    The Brownian slow earthquake (BSE) model (Ide, 2008; 2010) is a stochastic model for the temporal change of seismic moment release by slow earthquakes, which can be considered a broadband phenomenon including tectonic tremors, low frequency earthquakes, and very low frequency (VLF) earthquakes in the seismological frequency range, and slow slip events in the geodetic range. Although the concept of the broadband slow earthquake may not yet be widely accepted, most recent observations are consistent with this concept. Here, we review the characteristics of slow earthquakes and how they are explained by the BSE model. In the BSE model, the characteristic size of the slow earthquake source is represented by a random variable, changed by a Gaussian fluctuation added at every time step. The model also includes a time constant, which divides the model behavior into short- and long-time regimes. In nature, the time constant corresponds to the spatial limit of the tremor/SSE zone. In the long-time regime, the seismic moment rate is constant, which explains the moment-duration scaling law (Ide et al., 2007). For shorter durations, the moment rate increases with size, as often observed for VLF earthquakes (Ide et al., 2008). The ratio between seismic energy and seismic moment is constant, as shown in Japan, Cascadia, and Mexico (Maury et al., 2017). The moment rate spectrum has a section of -1 slope, limited by two frequencies corresponding to the above time constant and the time increment of the stochastic process. Such broadband spectra have been observed for slow earthquakes near the trench axis (Kaneko et al., 2017). This spectrum also explains why we can obtain VLF signals by stacking broadband seismograms relative to tremor occurrence (e.g., Takeo et al., 2010; Ide and Yabe, 2014). The fluctuation in the BSE model can be non-Gaussian, as long as the variance is finite, as supported by the central limit theorem. Recent observations suggest that tremors and LFEs are spatially characteristic
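
    To make the stochastic picture concrete, the sketch below integrates a random-walk source-size variable with Gaussian increments bounded by a spatial limit and reports a moment-rate proxy; the bounding rule, the size-squared proxy, and all parameter values are assumptions for illustration, not the published BSE formulation.

```python
import numpy as np

# Minimal sketch of a Brownian-type slow-earthquake source: the characteristic
# source size performs a random walk with Gaussian increments, bounded below by
# zero and above by a spatial limit (the regime boundary set by the time
# constant). The moment-rate proxy (size squared) and all parameter values are
# illustrative assumptions, not the published BSE formulation.

rng = np.random.default_rng(1)
n_steps = 10_000
dt = 1.0                 # time increment (arbitrary units)
sigma = 0.05             # fluctuation amplitude per step
size_max = 10.0          # spatial limit of the tremor/SSE zone

size = np.empty(n_steps)
size[0] = 1.0
for i in range(1, n_steps):
    step = size[i - 1] + sigma * np.sqrt(dt) * rng.normal()
    size[i] = min(max(step, 0.0), size_max)   # keep the source inside its zone

moment_rate = size ** 2                        # illustrative proxy
print("mean moment-rate proxy:", moment_rate.mean())
```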

  2. Comparison of the November 2002 Denali and November 2001 Kunlun Earthquakes

    Science.gov (United States)

    Bufe, C. G.

    2002-12-01

    Major earthquakes occurred in Tibet on the central Kunlun fault (M 7.8) on November 14, 2001 (Lin and others, 2002) and in Alaska on the central Denali fault (M 7.9) on November 3, 2002. Both earthquakes generated large surface waves (Kunlun Ms 8.0 (USGS) and Denali Ms 8.5). Each event occurred on east-west-trending strike-slip faults and exhibited nearly unilateral rupture propagating several hundred kilometers from west to east. Surface rupture length estimates were about 400 km for Kunlun, 300 km for Denali. Maximum surface faulting and moment release were observed far to the east of the points of rupture initiation. Harvard moment centroids were located east of USGS epicenters by 182 km (Kunlun) and by 126 km (Denali). Maximum surface faulting was observed near 240 km (Kunlun, 16 m left lateral) and near 175 km (Denali, 9 m right lateral) east of the USGS epicenters. Significant thrust components were observed in the initiation of the Denali event (ERI analysis and mapped thrust) and in the termination of the Kunlun rupture, as evidenced by thrust mechanisms of the largest aftershocks which occurred near the eastern part of the Kunlun rupture. In each sequence the largest aftershock was about 2 orders of magnitude smaller than the mainshock. Moment release along the ruptured segments was examined for the 25-year periods preceding the main shocks. The Denali zone shows precursory accelerating moment release with the dominant events occurring on October 22, 1996 (M 5.8) and October 23, 2002 (M 6.7). The Kunlun zone shows nearly constant moment release over time with the last significant event before the main shock occurring on November 26, 2000 (M 5.4). Moment release data are consistent with previous observations of annual periodicity preceding major earthquakes, possibly due to the evolution of a critical state with seasonal and tidal triggering (Varnes and Bufe, 2001). Annual periodicity is also evident for the larger events in the greater San Francisco Bay

  3. Road Surfaces And Earthquake Engineering: A Theoretical And Experimental Study

    International Nuclear Information System (INIS)

    Pratico, Filippo Giammaria

    2008-01-01

    As is well known, road surfaces greatly affect vehicle-road interaction. As a consequence, road surfaces have a paramount influence on road safety and pavement management systems. On the other hand, earthquakes produce deformations able to modify road surface structure, properties and performance. In the light of these facts, the main goal of this paper is confined to modelling the road surface before, during and after a seismic event. The fundamentals of road surface texture theory are stated in a general formulation. Models for road profile generation and for the theoretical properties of the surface, before, during and after the earthquake, are formulated and discussed. Practical applications can be envisaged in the field of vehicle-road interaction, where road surface texture results from the deformations and accelerations caused by seismic or similar events

  4. Multi-hazard approaches to civil infrastructure engineering

    CERN Document Server

    LaFave, James

    2016-01-01

    This collection focuses on the development of novel approaches to address one of the most pressing challenges of civil engineering, namely the mitigation of natural hazards. Numerous engineering books to date have focused on, and illustrate considerable progress toward, mitigation of individual hazards (earthquakes, wind, and so forth). The current volume addresses concerns related to overall safety, sustainability and resilience of the built environment when subject to multiple hazards: natural disaster events that are concurrent and either correlated (e.g., wind and surge); uncorrelated (e.g., earthquake and flood); cascading (e.g., fire following earthquake); or uncorrelated and occurring at different times (e.g., wind and earthquake). The authors examine a range of specific topics including methodologies for vulnerability assessment of structures, new techniques to reduce the system demands through control systems; instrumentation, monitoring and condition assessment of structures and foundations; new te...

  5. Earthquakes, detecting and understanding them

    International Nuclear Information System (INIS)

    2008-05-01

    The surface of the Earth is continually changing on a geological timescale. The tectonic plates, which make up this surface, are moving in relation to each other. On a human timescale, these movements take place through earthquakes, which suddenly release energy accumulated over a period of time. The vibrations they produce propagate through the interior of the Earth: these are seismic waves. However, other phenomena can generate seismic waves, such as volcanoes, quarry blasts, etc. The surf of the ocean waves on the coasts, the wind in the trees and human activity (industry and road traffic) all contribute to the 'seismic background noise'. Sensors are able to detect signals from events, which are then discriminated, analyzed and located. Earthquakes and active volcanoes are not distributed randomly over the surface of the globe: they mainly coincide with mountain chains and ocean trenches and ridges. 'An earthquake results from the abrupt release of the energy accumulated by movements and rubbing of different plates'. The study of the propagation of seismic waves has made it possible to determine the outline of the plates inside the Earth and has highlighted their movements. There are seven major plates which are colliding, diverging or sliding past each other. Each year the continents move several centimeters with respect to one another. This process, known as 'continental drift', was finally explained by plate tectonics. The initial hypothesis for this science dates from the beginning of the 20th century, but it was not confirmed until the 1960s. It explains that convection inside the Earth is the source of the forces required for these movements. This science, as well as explaining these great movements, has provided a coherent, unifying and quantitative framework, which unites the explanations for all geophysical phenomena under one mechanism. (authors)

  6. Crowdsourcing earthquake damage assessment using remote sensing imagery

    Directory of Open Access Journals (Sweden)

    Stuart Gill

    2011-06-01

    Full Text Available This paper describes the evolution of recent work on using crowdsourced analysis of remote sensing imagery, particularly high-resolution aerial imagery, to provide rapid, reliable assessments of damage caused by earthquakes and potentially other disasters. The initial effort examined online imagery taken after the 2008 Wenchuan, China, earthquake. A more recent response to the 2010 Haiti earthquake led to the formation of an international consortium: the Global Earth Observation Catastrophe Assessment Network (GEO-CAN). The success of GEO-CAN in contributing to the official damage assessments made by the Government of Haiti, the United Nations, and the World Bank led to further development of a web-based interface. A current initiative in Christchurch, New Zealand, is underway where remote sensing experts are analyzing satellite imagery, geotechnical engineers are marking liquefaction areas, and structural engineers are identifying building damage. The current site includes online training to improve the accuracy of the assessments and make it possible for even novice users to contribute to the crowdsourced solution. The paper discusses lessons learned from these initiatives and presents a way forward for using crowdsourced remote sensing as a tool for rapid assessment of damage caused by natural disasters around the world.

  7. Evaluation of earthquake resistance design for underground structures of nuclear power plant, (1)

    International Nuclear Information System (INIS)

    Tohma, Junichi; Kokusho, Kenji; Iwatate, Takahiro; Ohtomo, Keizo

    1986-01-01

    In the earthquake-resistant design of underground civil engineering structures related to the emergency cooling water system of a nuclear power plant, these structures are required to maintain the function of their highly important facilities during earthquakes, especially under the design earthquake motion. In this study, the shaft, pipeline, pit and duct of the cooling sea water facilities were chosen as typical underground structures, and the authors deal with the seismic design method for calculating the principal sectional forces generated in these structures by the design earthquake motion. In particular, comparative investigations of the response displacement method versus dynamic analysis methods (lumped mass analysis and finite element analysis) are discussed. (author)
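
    As background to the lumped-mass dynamic analysis mentioned above, the following sketch integrates a single lumped-mass oscillator under a synthetic base acceleration with the average-acceleration Newmark method; the structural parameters and input motion are assumptions for illustration, not values from the study.

```python
import numpy as np

# Minimal sketch: Newmark average-acceleration integration of a single
# lumped-mass oscillator under base excitation. Parameters and the input
# motion are illustrative assumptions, not values from the study.

m, zeta, f_n = 1.0e4, 0.05, 2.0                  # mass [kg], damping ratio, natural freq [Hz]
k = (2 * np.pi * f_n) ** 2 * m                   # stiffness [N/m]
c = 2 * zeta * np.sqrt(k * m)                    # damping coefficient [N s/m]

dt, t_end = 0.01, 10.0
t = np.arange(0.0, t_end, dt)
ag = 2.0 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.3 * t)   # synthetic ground accel [m/s^2]

beta, gamma = 0.25, 0.5                          # average-acceleration method
u = v = 0.0
a = (-m * ag[0] - c * v - k * u) / m
u_max = 0.0
k_eff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
for i in range(1, len(t)):
    p_eff = (-m * ag[i]
             + m * (u / (beta * dt ** 2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
             + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                    + dt * (gamma / (2 * beta) - 1) * a))
    u_new = p_eff / k_eff
    v_new = (gamma / (beta * dt) * (u_new - u)
             + (1 - gamma / beta) * v + dt * (1 - gamma / (2 * beta)) * a)
    a_new = (u_new - u) / (beta * dt ** 2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
    u, v, a = u_new, v_new, a_new
    u_max = max(u_max, abs(u))

print(f"peak relative displacement: {u_max * 1000:.2f} mm")
```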

  8. Demonstration of pb-PSHA with Ras-Elhekma earthquake, Egypt

    Directory of Open Access Journals (Sweden)

    Elsayed Fergany

    2017-06-01

    Full Text Available The main goals of this work are to: (1) argue for the importance of a physically-based probabilistic seismic hazard analysis (pb-PSHA) methodology and show examples from recent events to support the argument, and (2) demonstrate the methodology with ground motion simulations of the May 28, 1998, Mw = 5.5 Ras-Elhekma earthquake, north Egypt. The boundaries for the possible rupture parameters that could have been identified prior to the 1998 Ras-Elhekma earthquake were estimated. A range of simulated ground motions for the Ras-Elhekma earthquake was “predicted” for frequencies of 0.5–25 Hz at three sites where the large earthquake was recorded, with average epicentral distances of 220 km. The best rupture model of the 1998 Ras-Elhekma earthquake was identified by calculating the goodness of fit between observed and synthesized records at sites FYM, HAG, and KOT. We used the best rupture scenario of the 1998 earthquake to synthesize the ground motions at sites of interest where the main shock was not recorded. Based on the good fit of simulated and observed seismograms, we conclude that this methodology can provide realistic ground motions for an earthquake and is highly recommended for engineering purposes in advance of large earthquakes at sites without records. We propose that this methodology is needed to better represent the true hazard while reducing uncertainties.
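
    The goodness of fit between observed and synthesized records mentioned above can be scored in many ways; the sketch below uses one simple measure based on the misfit of Fourier amplitude spectra in the study's frequency band. The synthetic traces and the specific misfit definition are assumptions for illustration, not the authors' metric.

```python
import numpy as np

# Minimal sketch: score agreement between an observed and a simulated record
# by the mean absolute log-ratio of their Fourier amplitude spectra in a
# chosen band (0.5-25 Hz here). The traces and the misfit definition are
# illustrative assumptions, not the study's actual metric.

def spectral_misfit(obs, sim, dt, f_lo=0.5, f_hi=25.0):
    freqs = np.fft.rfftfreq(len(obs), dt)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    a_obs = np.abs(np.fft.rfft(obs))[band] + 1e-12
    a_sim = np.abs(np.fft.rfft(sim))[band] + 1e-12
    return np.mean(np.abs(np.log10(a_sim / a_obs)))

rng = np.random.default_rng(2)
dt = 0.01
t = np.arange(0, 40, dt)
observed = np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.1 * t) + 0.05 * rng.normal(size=t.size)
simulated = 0.9 * np.sin(2 * np.pi * 1.1 * t) * np.exp(-0.1 * t)

print("spectral misfit (lower is better):", spectral_misfit(observed, simulated, dt))
```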

  9. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Full Text Available Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to the confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to the elastic energy released in an earthquake is no larger than P/µ, where P is the confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the
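
    The dimensionless ratio quoted at the end, P/µ ≈ 0.01 at 10 km depth, can be reproduced with a rough lithostatic estimate; the sketch below uses typical crustal values, which are assumptions rather than figures taken from the paper.

```python
# Rough check of the quoted ratio P/mu at 10 km depth using typical crustal
# values (assumptions, not figures taken from the paper).

rho = 2700.0          # crustal density, kg/m^3
g = 9.81              # gravity, m/s^2
depth = 10_000.0      # depth, m
mu = 3.0e10           # shear modulus, Pa

P = rho * g * depth   # lithostatic confining pressure, ~2.6e8 Pa
print(f"P = {P:.2e} Pa, P/mu = {P / mu:.3f}")   # ~0.009, consistent with ~0.01
```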

  10. Streamflow responses in Chile to megathrust earthquakes in the 20th and 21st centuries

    Science.gov (United States)

    Mohr, Christian; Manga, Michael; Wang, Chi-yuen; Korup, Oliver

    2016-04-01

    Both coseismic static stress and dynamic stresses associated with seismic waves may cause responses in hydrological systems. Such responses include changes in water level, hydrochemistry and streamflow discharge. Earthquake effects on hydrological systems provide a means to study the interaction between stress changes and regional hydrology, which is otherwise rarely possible. Chile is a country of frequent and large earthquakes and thus provides abundant opportunities to study such interactions and processes. We analyze streamflow responses in Chile to several megathrust earthquakes, including the 1943 Mw 8.1 Coquimbo, 1950 Mw 8.2 Antofagasta, 1960 Mw 9.5 Valdivia, 1985 Mw 8.0 Valparaiso, 1995 Mw 8.0 Antofagasta, 2010 Mw 8.8 Maule, and the 2014 Mw 8.2 Iquique earthquakes. We use data from 716 stream gauges distributed from the Altiplano in the north to Tierra del Fuego in the south. This network covers the Andes mountain ranges, the central valley, the coastal mountain ranges and (mainly in the more southern parts) the coastal flats. We combine empirical magnitude-distance relationships, machine learning tools, and process-based modeling to characterize responses. We first assess the streamflow anomalies and relate these to topographical, hydro-climatic, geological and earthquake-related (volumetric and dynamic strain) factors using various classifiers. We then apply 1D groundwater flow modeling to selected catchments in order to test competing hypotheses for the origin of streamflow changes. We show that the co-seismic responses of streamflow mostly involved increasing discharges. We conclude that enhanced vertical permeability can explain most streamflow responses at the regional scale. The total excess water released by a single earthquake, namely the Maule earthquake, amounted to up to 1 km³. Against the background of megathrust earthquakes frequently hitting Chile, the amount of water released by earthquakes is substantial, particularly for the arid northern

  11. Remotely Triggered Earthquakes Recorded by EarthScope's Transportable Array and Regional Seismic Networks: A Case Study Of Four Large Earthquakes

    Science.gov (United States)

    Velasco, A. A.; Cerda, I.; Linville, L.; Kilb, D. L.; Pankow, K. L.

    2013-05-01

    Changes in the field stress required to trigger earthquakes have been classified in two basic ways: static and dynamic triggering. Static triggering occurs when an earthquake that releases accumulated strain along a fault stress-loads a nearby fault. Dynamic triggering occurs when an earthquake is induced by the passing of seismic waves from a large mainshock located at least two or more fault lengths from the epicenter of the main shock. We investigate details of dynamic triggering using data collected from EarthScope's USArray and regional seismic networks located in the United States. Triggered events are identified using an optimized automated detector based on the ratio of short-term to long-term average (Antelope software). Following the automated processing, the flagged waveforms are individually analyzed, in both the time and frequency domains, to determine if the increased detection rates correspond to local earthquakes (i.e., potentially remotely triggered aftershocks). Here, we show results using this automated scheme applied to data from four large, but characteristically different, earthquakes -- Chile (Mw 8.8, 2010), Tohoku-Oki (Mw 9.0, 2011), Baja California (Mw 7.2, 2010) and Wells, Nevada (Mw 6.0, 2008). For each of our four mainshocks, the number of detections within the 10-hour time windows spans a large range (1 to over 200) and statistically >20% of the waveforms show evidence of anomalous signals following the mainshock. The results will help provide a better understanding of the physical mechanisms involved in dynamic earthquake triggering and will help identify zones in the continental U.S. that may be more susceptible to dynamic earthquake triggering.
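
    The automated detector described above is based on the ratio of a short-term to a long-term average (STA/LTA); a minimal generic sketch of that ratio on a synthetic trace is given below. The window lengths, threshold, and data are assumptions and do not reproduce the Antelope implementation.

```python
import numpy as np

# Minimal sketch of an STA/LTA detector: compute short-term and long-term
# averages of the squared signal and flag samples where their ratio exceeds a
# threshold. Window lengths, threshold and the synthetic trace are
# illustrative assumptions (this is not the Antelope implementation).

def sta_lta(signal, fs, sta_sec=1.0, lta_sec=30.0):
    energy = signal ** 2
    sta_n, lta_n = int(sta_sec * fs), int(lta_sec * fs)
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same") + 1e-12
    return sta / lta

fs = 100.0                                        # samples per second
rng = np.random.default_rng(3)
trace = rng.normal(0, 1.0, 60_000)                # 10 minutes of noise
trace[30_000:30_500] += rng.normal(0, 8.0, 500)   # injected "event"

ratio = sta_lta(trace, fs)
triggers = np.where(ratio > 5.0)[0]
print("first trigger at t =", triggers[0] / fs if triggers.size else None, "s")
```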

  12. Engineering models for catastrophe risk and their application to insurance

    Science.gov (United States)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like all other insurance (fire, auto), adopted an actuarial approach in the past, that is, insurance rates were determined from historical loss experience. Because an earthquake is a rare event with severe consequences, irrational determination of premium rates and a poor understanding of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  13. The 2016 Kumamoto earthquake sequence.

    Science.gov (United States)

    Kato, Aitaro; Nakamura, Kouji; Hiyama, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest.

  14. The 2016 Kumamoto earthquake sequence

    Science.gov (United States)

    KATO, Aitaro; NAKAMURA, Kouji; HIYAMA, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest. PMID:27725474

  15. A global building inventory for earthquake loss estimation and risk management

    Science.gov (United States)

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  16. Earthquake clustering in modern seismicity and its relationship with strong historical earthquakes around Beijing, China

    Science.gov (United States)

    Wang, Jian; Main, Ian G.; Musson, Roger M. W.

    2017-11-01

    Beijing, China's capital city, is located in a typical intraplate seismic belt, with relatively high-quality instrumental catalogue data available since 1970. The Chinese historical earthquake catalogue contains six strong historical earthquakes of Ms ≥ 6 around Beijing, the earliest in 294 AD. This poses a significant potential hazard to one of the most densely populated and economically active parts of China. In some intraplate areas, persistent clusters of events associated with historical events can occur over centuries, for example, the ongoing sequence in the New Madrid zone of the eastern US. Here we examine the evidence for such persistent clusters around Beijing. We introduce a metric known as the 'seismic density index' that quantifies the degree of clustering of seismic energy release. For a given map location, this multi-dimensional index depends on the number of events, their magnitudes, and the distances to the locations of the surrounding population of earthquakes. We apply the index to modern instrumental catalogue data between 1970 and 2014, and identify six clear candidate zones. We then compare these locations to earthquake epicentre and seismic intensity data for the six largest historical earthquakes. Each candidate zone contains one of the six historical events, and the location of peak intensity is within 5 km or so of the reported epicentre in five of these cases. In one case—the great Ms 8 earthquake of 1679—the peak is closer to the area of strongest shaking (Intensity XI or more) than the reported epicentre. The present-day event rates are similar to those predicted by the modified Omori law, but there is no evidence of ongoing decay in event rates. Accordingly, the index is more likely to be picking out the location of persistent weaknesses in the lithosphere. Our results imply that zones of high seismic density index could in principle be used to indicate the location of unrecorded historical or palaeoseismic events, in China and
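
    The abstract states that the seismic density index combines event counts, magnitudes and distances, but does not give its exact definition; the sketch below therefore implements only one plausible distance-weighted energy sum as an illustration, not the authors' formula, and all parameter values and sample events are assumptions.

```python
import numpy as np

# Illustrative sketch only: one plausible distance-weighted measure of
# clustered seismic energy release at a map location. The abstract does not
# give the actual definition of the "seismic density index", so this formula,
# the smoothing distance and the sample events are assumptions.

def energy_proxy(magnitude):
    """Gutenberg-Richter style energy proxy, log10 E ~ 1.5 M + const."""
    return 10.0 ** (1.5 * magnitude)

def density_index(site, events, r0_km=10.0):
    """Sum event energy proxies, down-weighted by distance from the site."""
    total = 0.0
    for x, y, mag in events:
        r = np.hypot(site[0] - x, site[1] - y)
        total += energy_proxy(mag) / (1.0 + (r / r0_km) ** 2)
    return np.log10(total)

# hypothetical events: (x_km, y_km, magnitude)
events = [(0, 0, 4.2), (3, 1, 3.5), (2, -2, 3.8), (60, 40, 5.0)]
print("index at (0, 0):", round(density_index((0.0, 0.0), events), 2))
print("index at (50, 50):", round(density_index((50.0, 50.0), events), 2))
```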

  17. USGS response to an urban earthquake, Northridge '94

    Science.gov (United States)

    Updike, Randall G.; Brown, William M.; Johnson, Margo L.; Omdahl, Eleanor M.; Powers, Philip S.; Rhea, Susan; Tarr, Arthur C.

    1996-01-01

    The urban centers of our Nation provide our people with seemingly unlimited employment, social, and cultural opportunities as a result of the complex interactions of a diverse population embedded in a highly engineered environment. Catastrophic events in one or more of the natural earth systems which underlie or envelop the urban environment can have radical effects on the integrity and survivability of that environment. Earthquakes have for centuries been the source of cataclysmic events in cities throughout the world. Unlike many other earth processes, the effects of major earthquakes transcend all political, social, and geomorphic boundaries and can have a decided impact on cities tens to hundreds of kilometers from the epicenter. In modern cities, where buildings, transportation corridors, and lifelines are complexly interrelated, the life, economic, and social vulnerabilities in the face of a major earthquake can be particularly acute.

  18. Damages of industrial equipments in the 1995 Hyougoken-Nanbu Earthquake

    International Nuclear Information System (INIS)

    Iwatsubo, Takuzo

    1997-01-01

    The Hanshin-Awaji area has a population of approximately 3 million and many industries, including heavy industry, harbor facilities and international trading companies. The 1995 Hyougoken-Nanbu Earthquake occurred in just this area, a 25 km x 2 km oblong zone containing Kobe city. About 5,500 people were killed and 250,000 people lost their houses. The Japan Society of Mechanical Engineers organized an investigative committee on the earthquake disaster of industrial equipment after the earthquake, in order to investigate the damage to industrial equipment and to provide data for a design manual for mechanical equipment against earthquake excitation. This is an investigation report on the damage to industrial machine equipment. Damage to the machine equipment of industries in the high-intensity region of the earthquake is illustrated. The mechanisms of the damage, countermeasures against earthquakes, and the safety of nuclear power plant design are discussed. It is shown that the design of nuclear power plants differs from that of general industrial facilities, and that the damage suffered by general industrial facilities would not occur in a nuclear power plant. (J.P.N.)

  19. Damages of industrial equipments in the 1995 Hyougoken-Nanbu Earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Iwatsubo, Takuzo [Kobe Univ. (Japan). Faculty of Engineering

    1997-03-01

    The Hanshin-Awaji area has a population of approximately 3 million and many industries, including heavy industry, harbor facilities and international trading companies. The 1995 Hyougoken-Nanbu Earthquake occurred in just this area, a 25 km x 2 km oblong zone containing Kobe city. About 5,500 people were killed and 250,000 people lost their houses. The Japan Society of Mechanical Engineers organized an investigative committee on the earthquake disaster of industrial equipment after the earthquake, in order to investigate the damage to industrial equipment and to provide data for a design manual for mechanical equipment against earthquake excitation. This is an investigation report on the damage to industrial machine equipment. Damage to the machine equipment of industries in the high-intensity region of the earthquake is illustrated. The mechanisms of the damage, countermeasures against earthquakes, and the safety of nuclear power plant design are discussed. It is shown that the design of nuclear power plants differs from that of general industrial facilities, and that the damage suffered by general industrial facilities would not occur in a nuclear power plant. (J.P.N.)

  20. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    problems which have begun to be discussed only recently. Earthquakes often precede volcanic eruptions. According to Darwin, the earthquake-induced shock may be a common mechanism of simultaneous eruptions of volcanoes separated by long distances. In particular, Darwin wrote that '… the elevation of many hundred square miles of territory near Concepcion is part of the same phenomenon, with that splashing up, if I may so call it, of volcanic matter through the orifices in the Cordillera at the moment of the shock;…'. According to Darwin, the crust is a system where fractured zones and zones of seismic and volcanic activity interact. Darwin thus formulated the task of considering together the processes now studied as seismology and volcanology. However, the difficulties are such that the study of interactions between earthquakes and volcanoes began only recently, and his works on this had relatively little impact on the development of the geosciences. In this report, we discuss how the latest data on seismic and volcanic events support Darwin's observations and ideas about the 1835 Chilean earthquake. Material from researchspace.auckland.ac.nz/handle/2292/4474 is used. We show how modern mechanical tests from impact engineering and simple experiments with weakly-cohesive materials also support his observations and ideas. On the other hand, we have developed a mathematical theory of earthquake-induced catastrophic wave phenomena. This theory explains the most important aspects of Darwin's earthquake reports. This is achieved through the simplification of the fundamental governing equations of the problems considered into strongly nonlinear wave equations. Solutions of these equations are constructed with the help of analytic and numerical techniques. The solutions can model different strongly nonlinear wave phenomena which arise in a variety of physical contexts. A comparison with relevant experimental observations is also presented.

  1. Airborne engineered nanomaterials in the workplace—a review of release and worker exposure during nanomaterial production and handling processes

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yaobo [Institute for Work and Health (IST), Universities of Lausanne and Geneva, Route de la Corniche 2, 1066, Epalinges (Switzerland); Kuhlbusch, Thomas A.J. [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Centre for Nanointegration (CENIDE), University Duisburg-Essen, Duisburg (Germany); Van Tongeren, Martie; Jiménez, Araceli Sánchez [Centre for Human Exposure Science, Institute of Occupational Medicine (IOM), Research Avenue North, Edinburgh EH14 4AP (United Kingdom); Tuinman, Ilse [TNO, Lange Kleiweg 137, Rijswijk (Netherlands); Chen, Rui [CAS Key Laboratory for Biomedical Effects of Nanomaterials and Nanosafety & CAS Center for Excellence in Nanoscience, National Center for Nanoscience and Technology of China, Beijing 100190 (China); Alvarez, Iñigo Larraza [ACCIONA Infrastructure, Materials Area, Innovation Division, C/Valportillo II 8, 28108, Alcobendas (Spain); Mikolajczyk, Urszula [Nofer Institute of Occupational Medicine, Lodz (Poland); Nickel, Carmen; Meyer, Jessica; Kaminski, Heinz [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Wohlleben, Wendel [Dept. Material Physics, BASF SE, Advanced Materials Research, Ludwigshafen (Germany); Stahlmecke, Burkhard [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Clavaguera, Simon [NanoSafety Platform, Commissariat à l’Energie Atomique et aux Energies Alternatives (CEA), Univ. Grenoble Alpes, Grenoble, 38054 (France); and others

    2017-01-15

    Highlights: • Release characteristics can be grouped by the type of occupational activities. • Release levels may be linked to process energy. • A better data reporting practice will facilitate exposure assessment. • The results help prioritize industrial processes for human risk assessment. - Abstract: For exposure and risk assessment in occupational settings involving engineered nanomaterials (ENMs), it is important to understand the mechanisms of release and how they are influenced by the ENM, the matrix material, and process characteristics. This review summarizes studies providing ENM release information in occupational settings, during different industrial activities and using various nanomaterials. It also assesses the contextual information — such as the amounts of materials handled, protective measures, and measurement strategies — to understand which release scenarios can result in exposure. High-energy processes such as synthesis, spraying, and machining were associated with the release of large numbers of predominantly small-sized particles. Low-energy processes, including laboratory handling, cleaning, and industrial bagging activities, usually resulted in slight or moderate releases of relatively large agglomerates. The present analysis suggests that process-based release potential can be ranked, thus helping to prioritize release assessments, which is useful for tiered exposure assessment approaches and for guiding the implementation of workplace safety strategies. The contextual information provided in the literature was often insufficient to directly link release to exposure. The studies that did allow an analysis suggested that significant worker exposure might mainly occur when engineering safeguards and personal protection strategies were not carried out as recommended.

  2. Airborne engineered nanomaterials in the workplace—a review of release and worker exposure during nanomaterial production and handling processes

    International Nuclear Information System (INIS)

    Ding, Yaobo; Kuhlbusch, Thomas A.J.; Van Tongeren, Martie; Jiménez, Araceli Sánchez; Tuinman, Ilse; Chen, Rui; Alvarez, Iñigo Larraza; Mikolajczyk, Urszula; Nickel, Carmen; Meyer, Jessica; Kaminski, Heinz; Wohlleben, Wendel; Stahlmecke, Burkhard; Clavaguera, Simon

    2017-01-01

    Highlights: • Release characteristics can be grouped by the type of occupational activities. • Release levels may be linked to process energy. • A better data reporting practice will facilitate exposure assessment. • The results help prioritize industrial processes for human risk assessment. - Abstract: For exposure and risk assessment in occupational settings involving engineered nanomaterials (ENMs), it is important to understand the mechanisms of release and how they are influenced by the ENM, the matrix material, and process characteristics. This review summarizes studies providing ENM release information in occupational settings, during different industrial activities and using various nanomaterials. It also assesses the contextual information — such as the amounts of materials handled, protective measures, and measurement strategies — to understand which release scenarios can result in exposure. High-energy processes such as synthesis, spraying, and machining were associated with the release of large numbers of predominantly small-sized particles. Low-energy processes, including laboratory handling, cleaning, and industrial bagging activities, usually resulted in slight or moderate releases of relatively large agglomerates. The present analysis suggests that process-based release potential can be ranked, thus helping to prioritize release assessments, which is useful for tiered exposure assessment approaches and for guiding the implementation of workplace safety strategies. The contextual information provided in the literature was often insufficient to directly link release to exposure. The studies that did allow an analysis suggested that significant worker exposure might mainly occur when engineering safeguards and personal protection strategies were not carried out as recommended.

  3. The Strain Energy, Seismic Moment and Magnitudes of Large Earthquakes

    Science.gov (United States)

    Purcaru, G.

    2004-12-01

    The strain energy Est, as potential energy, released by an earthquake and the seismic moment Mo are two fundamental physical earthquake parameters. The earthquake rupture process "represents" the release of the accumulated Est. The moment Mo, first obtained in 1966 by Aki, revolutionized the quantification of earthquake size and led to the elimination of the limitations of the conventional magnitudes (originally ML, Richter, 1930) mb, Ms, m, MGR. Both Mo and Est, though not in a 1-to-1 correspondence, are uniform measures of size, although Est is presently less accurate than Mo. Est is partitioned into seismic energy Es, fracture energy Eg and frictional energy Ef, where Ef is lost as frictional heat. The available energy is Est = Es + Eg (see Aki and Richards, 1980, and Kostrov and Das, 1988, for fundamentals on Mo and Est). Related to Mo, Est and Es, several modern magnitudes were defined under various assumptions: the moment magnitude Mw (Kanamori, 1977), strain energy magnitude ME (Purcaru and Berckhemer, 1978), tsunami magnitude Mt (Abe, 1979), mantle magnitude Mm (Okal and Talandier, 1987), seismic energy magnitude Me (Choy and Boatwright, 1995; Yanovskaya et al., 1996), and body-wave magnitude Mpw (Tsuboi et al., 1998). The available strain energy is Est = (1/(2μ)) Δσ Mo, with Δσ the average stress drop, and ME is \[ M_E = \tfrac{2}{3}\bigl(\log M_o + \log(\Delta\sigma/\mu) - 12.1\bigr), \] with log Est = 11.8 + 1.5 ME. The estimation of Est was modified to include the Mo, Δσ and μ of predominant high-slip zones (asperities) to account for multiple events (Purcaru, 1997): \[ E_{st} = \frac{1}{2} \sum_i \frac{1}{\mu_i} M_{o,i}\,\Delta\sigma_i, \qquad \sum_i M_{o,i} = M_o. \] We derived the energy balance of Est, Es and Eg as \[ E_{st}/M_o = \bigl(1 + e(g,s)\bigr)\,E_s/M_o, \qquad e(g,s) = E_g/E_s. \] We analyzed a set of about 90 large earthquakes and found that, depending on the goal, these magnitudes quantify the rupture process differently, thus providing complementary means of earthquake characterization. Results for some
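
    The relations quoted above lend themselves to a direct numerical check; the sketch below evaluates the strain-energy magnitude and the strain energy for a single event in CGS units, with an assumed stress drop and rigidity (the values are illustrative, not taken from the paper).

```python
import math

# Numerical check of the relations quoted above, in CGS units (Mo in dyne-cm,
# E_st in erg). The stress drop and rigidity are illustrative assumptions.

M_o = 1.0e27          # seismic moment, dyne-cm (roughly Mw 7.3)
d_sigma = 3.0e7       # stress drop, dyne/cm^2 (3 MPa)
mu = 3.0e11           # rigidity, dyne/cm^2 (30 GPa)

E_st = 0.5 * (d_sigma / mu) * M_o                     # available strain energy, erg
M_E = (2.0 / 3.0) * (math.log10(M_o) + math.log10(d_sigma / mu) - 12.1)
E_st_from_ME = 10.0 ** (11.8 + 1.5 * M_E)

print(f"E_st  = {E_st:.3e} erg")
print(f"M_E   = {M_E:.2f}")
print(f"check = {E_st_from_ME:.3e} erg (should match E_st)")
```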

  4. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  5. Influences of use activities and waste management on environmental releases of engineered nanomaterials

    International Nuclear Information System (INIS)

    Wigger, Henning; Hackmann, Stephan; Zimmermann, Till; Köser, Jan; Thöming, Jorg; Gleich, Arnim von

    2015-01-01

    Engineered nanomaterials (ENM) offer enhanced or new functionalities and properties that are used in various products. This also entails potential environmental risks in terms of hazard and exposure. However, hazard and exposure assessment for ENM still suffer from insufficient knowledge particularly for product-related releases and environmental fate and behavior. This study therefore analyzes the multiple impacts of the product use, the properties of the matrix material, and the related waste management system (WMS) on the predicted environmental concentration (PEC) by applying nine prospective life cycle release scenarios based on reasonable assumptions. The products studied here are clothing textiles treated with silver nanoparticles (AgNPs), since they constitute a controversial application. Surprisingly, the results show counter-intuitive increases by a factor of 2.6 in PEC values for the air compartment in minimal AgNP release scenarios. Also, air releases can shift from washing to wearing activity; their associated release points may shift accordingly, potentially altering release hot spots. Additionally, at end-of-life, the fraction of AgNP-residues contained on exported textiles can be increased by 350% when assuming short product lifespans and globalized WMS. It becomes evident that certain combinations of use activities, matrix material characteristics, and WMS can influence the regional PEC by several orders of magnitude. Thus, in the light of the findings and expected ENM market potential, future assessments should consider these aspects to derive precautionary design alternatives and to enable prospective global and regional risk assessments. - Highlights: • Textile use activities and two waste management systems (WMSs) are investigated. • Matrix material and use activities determine the ENM release. • Counter-intuitive shifts of releases to air can happen during usage. • WMS export can increase by 350% in case of short service life and

  6. Influences of use activities and waste management on environmental releases of engineered nanomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Wigger, Henning, E-mail: hwigger@uni-bremen.de [Faculty of Production Engineering, Department of Technological Design and Development, University of Bremen, Badgasteiner Str. 1, 28359 Bremen (Germany); Hackmann, Stephan [UFT Center for Environmental Research and Sustainable Technology, Department of General and Theoretical Ecology, University of Bremen, Leobener Str., 28359 Bremen (Germany); Zimmermann, Till [Faculty of Production Engineering, Department of Technological Design and Development, University of Bremen, Badgasteiner Str. 1, 28359 Bremen (Germany); ARTEC — Research Center for Sustainability Studies, Enrique-Schmidt-Str. 7, 28359 Bremen (Germany); Köser, Jan [UFT Center for Environmental Research and Sustainable Technology, Department of Sustainable Chemistry, University of Bremen, Leobener Str., 28359 Bremen (Germany); Thöming, Jorg [UFT Center for Environmental Research and Sustainable Technology, Department of Sustainable Chemical Engineering, University of Bremen, Leobener Str., 28359 Bremen (Germany); Gleich, Arnim von [Faculty of Production Engineering, Department of Technological Design and Development, University of Bremen, Badgasteiner Str. 1, 28359 Bremen (Germany); ARTEC — Research Center for Sustainability Studies, Enrique-Schmidt-Str. 7, 28359 Bremen (Germany)

    2015-12-01

    Engineered nanomaterials (ENM) offer enhanced or new functionalities and properties that are used in various products. This also entails potential environmental risks in terms of hazard and exposure. However, hazard and exposure assessment for ENM still suffer from insufficient knowledge particularly for product-related releases and environmental fate and behavior. This study therefore analyzes the multiple impacts of the product use, the properties of the matrix material, and the related waste management system (WMS) on the predicted environmental concentration (PEC) by applying nine prospective life cycle release scenarios based on reasonable assumptions. The products studied here are clothing textiles treated with silver nanoparticles (AgNPs), since they constitute a controversial application. Surprisingly, the results show counter-intuitive increases by a factor of 2.6 in PEC values for the air compartment in minimal AgNP release scenarios. Also, air releases can shift from washing to wearing activity; their associated release points may shift accordingly, potentially altering release hot spots. Additionally, at end-of-life, the fraction of AgNP-residues contained on exported textiles can be increased by 350% when assuming short product lifespans and globalized WMS. It becomes evident that certain combinations of use activities, matrix material characteristics, and WMS can influence the regional PEC by several orders of magnitude. Thus, in the light of the findings and expected ENM market potential, future assessments should consider these aspects to derive precautionary design alternatives and to enable prospective global and regional risk assessments. - Highlights: • Textile use activities and two waste management systems (WMSs) are investigated. • Matrix material and use activities determine the ENM release. • Counter-intuitive shifts of releases to air can happen during usage. • WMS export can increase by 350% in case of short service life and

  7. Study on vibration behaviors of engineered barrier system

    International Nuclear Information System (INIS)

    Mikoshiba, Tadashi; Ogawa, Nobuyuki; Minowa, Chikahiro

    1998-01-01

    High-level radioactive wastes are buried underground, packed into a strong sealed container made from carbon steel (the overpack) and surrounded by buffer material (bentonite). The engineered barrier system constructed with an overpack and buffer materials must be resistant to earthquakes as well as to the invasion of groundwater over a long period. Therefore, seismic evaluation of the barrier system is indispensable, especially in Japan, to ensure its structural safety. Here, the effects of earthquake vibration on engineered barrier systems were investigated experimentally. Random-wave excitation and actual seismic-wave excitation were applied to the systems, and fundamental data were obtained. Under the former excitation, the response characteristics of both engineered barrier models, constructed with an overpack and bentonite, were non-linear. Under the latter, the stress in the bentonite increased in proportion to the vibration level. (M.N.)

  8. Heterogeneous rupture in the great Cascadia earthquake of 1700 inferred from coastal subsidence estimates

    Science.gov (United States)

    Wang, Pei-Ling; Engelhart, Simon E.; Wang, Kelin; Hawkes, Andrea D.; Horton, Benjamin P.; Nelson, Alan R.; Witter, Robert C.

    2013-01-01

    Past earthquake rupture models used to explain paleoseismic estimates of coastal subsidence during the great A.D. 1700 Cascadia earthquake have assumed a uniform slip distribution along the megathrust. Here we infer heterogeneous slip for the Cascadia margin in A.D. 1700 that is analogous to slip distributions during instrumentally recorded great subduction earthquakes worldwide. The assumption of uniform distribution in previous rupture models was due partly to the large uncertainties of then available paleoseismic data used to constrain the models. In this work, we use more precise estimates of subsidence in 1700 from detailed tidal microfossil studies. We develop a 3-D elastic dislocation model that allows the slip to vary both along strike and in the dip direction. Despite uncertainties in the updip and downdip slip extensions, the more precise subsidence estimates are best explained by a model with along-strike slip heterogeneity, with multiple patches of high-moment release separated by areas of low-moment release. For example, in A.D. 1700, there was very little slip near Alsea Bay, Oregon (~44.4°N), an area that coincides with a segment boundary previously suggested on the basis of gravity anomalies. A probable subducting seamount in this area may be responsible for impeding rupture during great earthquakes. Our results highlight the need for more precise, high-quality estimates of subsidence or uplift during prehistoric earthquakes from the coasts of southern British Columbia, northern Washington (north of 47°N), southernmost Oregon, and northern California (south of 43°N), where slip distributions of prehistoric earthquakes are poorly constrained.

  9. The earthquakes of the Baltic shield

    International Nuclear Information System (INIS)

    Slunga, R.

    1990-06-01

    More than 200 earthquakes in the Baltic Shield area in the size range ML 0.6-4.5 have been studied by dense regional seismic networks. The analysis includes focal depths, dynamic source parameters, and fault plane solutions. In southern Sweden a long part of the Protogene zone marks a change in the seismic activity. The focal depths indicate three crustal layers: upper crust (0-18 km in southern Sweden, 0-13 km in northern Sweden), middle crust down to 35 km, and the quiet lower crust. The fault plane solutions show that strike-slip faulting dominates. Along the Tornquist line significant normal faulting occurs. The stresses released by the earthquakes show a remarkable consistency with a regional principal compression oriented N60W. This indicates that plate-tectonic processes are more important than the land uplift. The spatial distribution is consistent with a model in which the earthquakes are breakdowns of asperities on normally stably sliding faults. The aseismic sliding is estimated to be 2000 times more extensive than the seismic sliding. Southern Sweden is estimated to deform horizontally at a rate of 1 mm/year or more. (orig.)

  10. Statistical physics approach to earthquake occurrence and forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Arcangelis, Lucilla de [Department of Industrial and Information Engineering, Second University of Naples, Aversa (CE) (Italy); Godano, Cataldo [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy); Grasso, Jean Robert [ISTerre, IRD-CNRS-OSUG, University of Grenoble, Saint Martin d’Héres (France); Lippiello, Eugenio, E-mail: eugenio.lippiello@unina2.it [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy)

    2016-04-25

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in the size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space–time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for
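    As a minimal numerical illustration of two of the phenomenological laws referred to above (not the authors' models), the sketch below evaluates the Gutenberg-Richter frequency-magnitude law and the Omori-Utsu aftershock decay for assumed parameter values.

    ```python
    # Minimal sketch of two phenomenological laws of earthquake occurrence
    # discussed in the review: Gutenberg-Richter and Omori-Utsu.
    # All parameter values below are illustrative assumptions, not fitted data.
    def gutenberg_richter_count(m, a=5.0, b=1.0):
        """Expected number of events with magnitude >= m: log10 N = a - b*m."""
        return 10.0 ** (a - b * m)

    def omori_utsu_rate(t_days, K=100.0, c=0.05, p=1.1):
        """Aftershock rate n(t) = K / (t + c)^p, in events per day."""
        return K / (t_days + c) ** p

    if __name__ == "__main__":
        for m in (3.0, 4.0, 5.0, 6.0):
            print(f"N(M>={m}) = {gutenberg_richter_count(m):.1f}")
        for t in (0.1, 1.0, 10.0, 100.0):
            print(f"aftershock rate at t={t} d: {omori_utsu_rate(t):.2f} /day")
    ```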

  11. An Fc engineering approach that modulates antibody-dependent cytokine release without altering cell-killing functions.

    Science.gov (United States)

    Kinder, Michelle; Greenplate, Allison R; Strohl, William R; Jordan, Robert E; Brezski, Randall J

    2015-01-01

    Cytotoxic therapeutic monoclonal antibodies (mAbs) often mediate target cell-killing by eliciting immune effector functions via Fc region interactions with cellular and humoral components of the immune system. Key functions include antibody-dependent cell-mediated cytotoxicity (ADCC), antibody-dependent cellular phagocytosis (ADCP), and complement-dependent cytotoxicity (CDC). However, there has been increased appreciation that along with cell-killing functions, the induction of antibody-dependent cytokine release (ADCR) can also influence disease microenvironments and therapeutic outcomes. Historically, most Fc engineering approaches have been aimed toward modulating ADCC, ADCP, or CDC. In the present study, we describe an Fc engineering approach that, while not resulting in impaired ADCC or ADCP, profoundly affects ADCR. As such, when peripheral blood mononuclear cells are used as effector cells against mAb-opsonized tumor cells, the described mAb variants elicit a similar profile and quantity of cytokines as IgG1. In contrast, although the variants elicit similar levels of tumor cell-killing as IgG1 with macrophage effector cells, the variants do not elicit macrophage-mediated ADCR against mAb-opsonized tumor cells. This study demonstrates that Fc engineering approaches can be employed to uncouple macrophage-mediated phagocytic and subsequent cell-killing functions from cytokine release.

  12. The current situation of the NDL Great East Japan Earthquake Archive 'HINAGIKU'

    International Nuclear Information System (INIS)

    Suwa, Yasuko

    2014-01-01

    On March 7, 2013, the National Diet Library (NDL) started full-scale operation of the NDL Great East Japan Earthquake Archive 'HINAGIKU'. Hinagiku is a search portal that enables integrated search and use of sounds, videos, pictures, websites, etc., related to the Great East Japan Earthquake. Its aim is to hand down all records and lessons to future generations and to utilize them for the restoration and reconstruction of the affected areas and for disaster prevention measures. Since its release last year, Hinagiku has been enlarging its search targets in cooperation with related institutions. In this article, I give an overview of the NDL Great East Japan Earthquake Archive and discuss its challenges for the future. (author)

  13. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed and non-engineered dwellings which have high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle, and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
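    The three-step casualty estimate outlined in the abstract (shaking hazard, exposure aggregation by building type, collapse-driven casualties) can be illustrated with a toy calculation; the building mix, collapse probabilities, and fatality rates below are invented placeholders, not PAGER's actual inventory or loss model.

    ```python
    # Toy illustration of exposure-based casualty estimation:
    # fatalities ~ sum over building types of
    #   exposed population * P(collapse | intensity) * P(fatality | collapse).
    # All numbers are hypothetical placeholders, not PAGER values.
    exposed_population = 100_000          # people exposed to a given shaking intensity
    building_mix = {                      # fraction of population per building type
        "unreinforced_masonry": 0.4,
        "reinforced_concrete": 0.5,
        "timber_frame": 0.1,
    }
    collapse_probability = {              # P(collapse) at the assumed intensity
        "unreinforced_masonry": 0.10,
        "reinforced_concrete": 0.02,
        "timber_frame": 0.01,
    }
    fatality_rate_given_collapse = {      # P(fatality | collapse)
        "unreinforced_masonry": 0.15,
        "reinforced_concrete": 0.10,
        "timber_frame": 0.05,
    }

    fatalities = sum(
        exposed_population * building_mix[bt]
        * collapse_probability[bt] * fatality_rate_given_collapse[bt]
        for bt in building_mix
    )
    print(f"estimated fatalities: {fatalities:.0f}")
    ```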

  14. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  15. IAEA safety guides in the light of recent developments in earthquake engineering

    International Nuclear Information System (INIS)

    Gurpinar, A.

    1988-11-01

    The IAEA safety guides 50-SG-S1 and 50-SG-S2 emphasize the determination of the design basis earthquake ground motion and earthquake-resistant design considerations for nuclear power plants, respectively. Years have elapsed since the elaboration of these safety guides, and a review of some of their concepts is necessary, taking into account the information collected and the technical developments since then. In this article, topics within the scope of these safety guides are discussed. In particular, the results of some recent research that may have a bearing on the nuclear industry are highlighted. Conclusions and recommendations are presented. 6 fig., 19 refs. (F.M.)

  16. Disaster mitigation science for Earthquakes and Tsunamis -For resilience society against natural disasters-

    Science.gov (United States)

    Kaneda, Y.; Takahashi, N.; Hori, T.; Kawaguchi, K.; Isouchi, C.; Fujisawa, K.

    2017-12-01

    Destructive natural disasters such as earthquakes and tsunamis have occurred frequently around the world. For instance, the 2004 Sumatra earthquake in Indonesia, the 2008 Wenchuan earthquake in China, the 2010 Chile earthquake, and the 2011 Tohoku earthquake in Japan all caused very severe damage. To reduce and mitigate the damage from destructive natural disasters, early detection and speedy, proper evacuation are indispensable, and hardware and software developments and preparations for disaster reduction and mitigation are quite important. In Japan, DONET, a real-time ocean-floor monitoring system, has been developed and deployed around the Nankai Trough seismogenic zone in southwestern Japan, so early detection of earthquakes and tsunamis around the Nankai Trough seismogenic zone is expected from DONET. The integration of real-time data with advanced simulation research will help reduce damage; however, a resilient society also requires methods for the period after disasters, namely restoration and revival. We therefore propose a natural disaster mitigation science covering early detection, evacuation, and restoration against destructive natural disasters; this is what we mean by a resilient society. Natural disaster mitigation science spans many research fields, such as natural science, engineering, medical treatment, social science, and literature/art. Natural science, engineering, and medical treatment are fundamental research fields for natural disaster mitigation, but social sciences such as sociology, geography, and psychology are very important for restoration after natural disasters. Finally, to realize and advance disaster mitigation science, the cultivation of human resources is indispensable. We have already carried out disaster mitigation science under the `new disaster mitigation research project on Mega

  17. Earthquake Early Warning: User Education and Designing Effective Messages

    Science.gov (United States)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

    2014-12-01

    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor to promote a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental

  18. The numerical simulation study of the dynamic evolutionary processes in an earthquake cycle on the Longmen Shan Fault

    Science.gov (United States)

    Tao, Wei; Shen, Zheng-Kang; Zhang, Yong

    2016-04-01

    concentration areas in the model: one is located in the mid and upper crust on the hanging wall, where the strain energy could be released by permanent deformation such as folding, and the other lies in the deep part of the fault, where the strain energy could be released by earthquakes. (5) The whole earthquake dynamic process is clearly reflected by the evolution of the strain energy increments over the stages of the earthquake cycle. In the interseismic period, the strain energy accumulates relatively slowly; prior to the earthquake, the fault is locked and the strain energy accumulates fast, and some of the strain energy is released in the upper crust on the hanging wall of the fault. In the coseismic stage, the strain energy is released rapidly along the fault. In the postseismic stage, the slow accumulation of strain recovers to the interseismic rate within around one hundred years. The simulation study in this thesis helps to better understand the earthquake dynamic process.

  19. Space-time behavior of continental intraplate earthquakes and implications for hazard assessment in China and the Central U.S.

    Science.gov (United States)

    Stein, Seth; Liu, Mian; Luo, Gang; Wang, Hui

    2014-05-01

    Earthquakes in midcontinents and those at plate boundaries behave quite differently in space and time, owing to the geometry of faults and the rate at which they are loaded. Faults at plate boundaries are loaded at constant rates by steady relative plate motion. Consequently, earthquakes concentrate along the plate boundary faults, and show quasi-periodic occurrences, although the actual temporal patterns are often complicated. However, in midcontinents, the tectonic loading is shared by a complex system of interacting faults spread over a large region, such that a large earthquake on one fault could increase the loading rates on remote faults in the system. Because the low tectonic loading rate is shared by many faults in midcontinents, individual faults may remain dormant for a long time and then become active for a short period. The resulting earthquakes are therefore episodic and spatially migrating. These effects can be seen in many areas, with a prime example being a 2000-year record from North China, which shows migration of large earthquakes between fault systems spread over a large region such that no large earthquakes rupture the same fault segment twice. Because seismic activity within mid-continents is usually much lower than that along plate boundary zones, even small earthquakes can cause widespread concerns, especially when these events occur in the source regions of previous large earthquakes. However, these small earthquakes may be aftershocks that continue for decades or even longer, because aftershock sequences often last much longer in midcontinents where tectonic loading is slow, than at plate boundaries. The recent seismicity in the Tangshan region in North China is likely aftershocks of the 1976 M7.8 Tangshan earthquake. Similarly, current seismicity in the New Madrid seismic zone in central U.S. appears to be aftershocks of a cluster of M ~7.0 events in 1811-1812. These large events and similar events in the past millennium release strain

  20. Experimental Study of Thermal Field Evolution in the Short-Impending Stage Before Earthquakes

    Science.gov (United States)

    Ren, Yaqiong; Ma, Jin; Liu, Peixun; Chen, Shunyun

    2017-08-01

    Phenomena at critical points are vital for identifying the short-impending stage prior to earthquakes. The peak stress is a critical point at which stress is converted from predominantly accumulation to predominantly release. We call the duration between the peak stress and instability "the meta-instability stage", which refers to the short-impending stage of earthquakes. The meta-instability stage consists of a steadily releasing quasi-static stage and an accelerated releasing quasi-dynamic stage. The turning point between these two stages is the remaining critical point. To identify the two critical points in the field, it is necessary to study the characteristic phenomena of various physical fields in the meta-instability stage in the laboratory, and strain and displacement variations were studied accordingly. Considering that stress and relative displacement can be detected through thermal variations and their peculiarities in full-field observations, we employed a cooled thermal infrared imaging system to record thermal variations in the meta-instability stage of stick-slip events generated along a simulated, precut planar strike-slip fault in a granodiorite block on a horizontally bilateral servo-controlled press machine. The experimental results demonstrate the following: (1) a large area of decreasing temperatures in the wall rocks and increasing temperatures in sporadic sections of the fault indicates entrance into the meta-instability stage. (2) The rapid expansion of regions of increasing temperature on the fault and the enhancement of the temperature-increase amplitude correspond to the turning point from the quasi-static stage to the quasi-dynamic stage. Our results reveal thermal indicators for the critical points prior to earthquakes that provide clues for identifying the short-impending stage of earthquakes.

  1. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    Science.gov (United States)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project "Network of Research Infrastructures for European Seismology, NERIES". This work consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters on the basis of rapid inversion of data from on-line regional broadband stations. It also includes an estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. By using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships.
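    A ground motion prediction equation of the generic form log10(PGA) = c1 + c2·M − c3·log10(R + c4) underlies this kind of rapid shaking estimate; the sketch below evaluates such a generic relation on a small grid of sites. The functional form and coefficients are illustrative placeholders, not the region-specific GMPEs used in ELER.

    ```python
    # Sketch of evaluating a generic GMPE over a grid of sites around an epicenter.
    # Functional form and coefficients are illustrative assumptions only.
    import numpy as np

    def generic_gmpe_pga(magnitude, distance_km, c1=-1.5, c2=0.5, c3=1.0, c4=10.0):
        """Median PGA in g from log10(PGA) = c1 + c2*M - c3*log10(R + c4)."""
        return 10.0 ** (c1 + c2 * magnitude - c3 * np.log10(distance_km + c4))

    # Hypothetical event and a coarse 5 x 5 grid of site distances (km).
    magnitude = 6.5
    x = np.linspace(-100, 100, 5)
    xx, yy = np.meshgrid(x, x)
    distances = np.sqrt(xx**2 + yy**2)

    pga_field = generic_gmpe_pga(magnitude, distances)
    print(np.round(pga_field, 3))   # spatial distribution of median PGA (g)
    ```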

  2. Impact-based earthquake alerts with the U.S. Geological Survey's PAGER system: what's next?

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Garcia, D.; So, E.; Hearne, M.

    2012-01-01

    In September 2010, the USGS began publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses with its Prompt Assessment of Global Earthquakes for Response (PAGER) system. These estimates significantly enhanced the utility of the USGS PAGER system which had been, since 2006, providing estimated population exposures to specific shaking intensities. Quantifying earthquake impacts and communicating estimated losses (and their uncertainties) to the public, the media, humanitarian, and response communities required a new protocol—necessitating the development of an Earthquake Impact Scale—described herein and now deployed with the PAGER system. After two years of PAGER-based impact alerting, we now review operations, hazard calculations, loss models, alerting protocols, and our success rate for recent (2010-2011) events. This review prompts analyses of the strengths, limitations, opportunities, and pressures, allowing clearer definition of future research and development priorities for the PAGER system.

  3. Current problems and subjects on numerical analysis of earthquake geotechnical engineering. For seamless analysis

    International Nuclear Information System (INIS)

    Yoshida, Taiki

    2016-01-01

    Both continuum and discontinuum analyses are used to evaluate the seismic stability of the slopes surrounding nuclear power plant facilities. However, neither alone allows a rational evaluation of seismic stability, because the results of each carry an excessively conservative margin. If the behavior from small to large deformation can be simulated by hybridizing the two, this would contribute not only to rationalizing slope stability evaluation but also to enhancing the precision of the numerical analysis. In this review, previous numerical analyses and their applications in earthquake geotechnical engineering were classified into three categories, namely continuum analysis, discontinuum analysis, and the hybridization process, in order to identify their research themes. The review reveals that these themes are the standardization of the conditions for conversion, the construction of a technique to determine the parameters related to conversion, and a reasonable set of physical properties for the DEM (Distinct Element Method) after conversion. Our future work will be the development of a numerical analysis code hybridizing continuum and discontinuum analyses based on the identified research themes. (author)

  4. Triggered seismicity and deformation between the Landers, California, and Little Skull Mountain, Nevada, earthquakes

    Science.gov (United States)

    Bodin, Paul; Gomberg, Joan

    1994-01-01

    This article presents evidence for the channeling of strain energy released by the Ms = 7.4 Landers, California, earthquake within the eastern California shear zone (ECSZ). We document an increase in seismicity levels during the 22-hr period starting with the Landers earthquake and culminating 22 hr later with the Ms = 5.4 Little Skull Mountain (LSM), Nevada, earthquake. We evaluate the completeness of regional seismicity catalogs during this period and find that the continuity of post-Landers strain release within the ECSZ is even more pronounced than is evident from the catalog data. We hypothesize that regional-scale connectivity of faults within the ECSZ and LSM region is a critical ingredient in the unprecedented scale and distribution of remotely triggered earthquakes and geodetically manifest strain changes that followed the Landers earthquake. The viability of static strain changes as triggering agents is tested using numerical models. Modeling results illustrate that regional-scale fault connectivity can increase the static strain changes by approximately an order of magnitude at distances of at least 280 km, the distance between the Landers and LSM epicenters. This is possible for models that include both a network of connected faults that slip “sympathetically” and realistic levels of tectonic prestrain. Alternatively, if dynamic strains are a more significant triggering agent than static strains, ECSZ structure may still be important in determining the distribution of triggered seismic and aseismic deformation.

  5. Teleseismic analysis of the 1990 and 1991 earthquakes near Potenza

    Directory of Open Access Journals (Sweden)

    G. Ekstrom

    1994-06-01

    Analysis of the available teleseismic data for two moderate earthquakes near the town of Potenza in the Southern Apennines shows that both involve strike-slip faulting on a plane oriented approximately east-west. Only the larger, 5 May 1990, earthquake is sufficiently large for analysis by conventional teleseismic waveform inversion methods, and is seen to consist of a foreshock followed 11 seconds later by the main release of moment. The focal mechanism and seismic moment of the 26 May 1991 earthquake are determined by quantitative comparison of its 15-60 s period surface waves with those generated by the 5 May 1990 event. The focal mechanisms for the two events are found to be very similar. The 1991 earthquake has a scalar moment that is approximately 18% of that of the 1990 mainshock. Comparison of higher-frequency P waves for the two events, recorded at regional distance, shows that the ratio of trace amplitudes is smaller than the ratio of scalar moments, suggesting that the stress drop for the 1991 event is distinctly smaller than for the 1990 mainshock.

  6. Modeling of fission product release in integral codes

    International Nuclear Information System (INIS)

    Obaidurrahman, K.; Raman, Rupak K.; Gaikwad, Avinash J.

    2014-01-01

    The Great Tohoku earthquake and tsunami that struck the Fukushima-Daiichi nuclear power station on March 11, 2011 has intensified the need for detailed nuclear safety research, and with this objective all streams associated with severe accident phenomenology are being revisited thoroughly. The present paper covers an overview of the state-of-the-art FP release models in use, the important phenomena considered in semi-mechanistic models, and the knowledge gaps in present FP release modeling. The capability of the FP release module ELSA of the ASTEC integral code to appropriately predict FP release under several diversified core-degradation conditions is also demonstrated. The use of semi-mechanistic fission product release models at AERB for source-term estimation is briefly described. (author)

  7. Aspect of the 2011 off the Pacific coast Tohoku Earthquake, Japan

    International Nuclear Information System (INIS)

    Kato, Aitaro

    2012-01-01

    The 2011 off the Pacific coast of Tohoku Earthquake (Tohoku-Oki), Japan, was the first magnitude (M) 9 subduction megathrust event to be recorded by a dense network of seismic, geodetic, and tsunami observations. I here review the Tohoku-Oki earthquake in terms of 1) the asperity model, 2) earthquake source observations, 3) precedent processes, and 4) postseismic slip (afterslip). Based on finite source models of the Tohoku-Oki mainshock, the coseismic fault slip exceeded 30 m at the shallow part of the subduction zone offshore of Miyagi. The rupture reached the trench axis, producing a large uplift there, which was likely an important factor in generating the devastating tsunami waves. The mainshock was preceded by slow-slip transients propagating toward the initial rupture point, which may have caused substantial stress loading, prompting the unstable dynamic rupture of the mainshock. Furthermore, a sequence of M 7-class interplate earthquakes and subsequent large afterslip events, which occurred before the mainshock rupture, might be interpreted as a preparation stage of earthquake generation. Most of the slip released by the postseismic deformation following the Tohoku-Oki mainshock is located in the region peripheral to the large coseismic slip area. (author)

  8. Fracture analysis of concrete gravity dam under earthquake induced ...

    African Journals Online (AJOL)

    Michael Horsfall

    Fracture analysis of concrete gravity dam under earthquake induced loads. Abbas Mansouri; Civil Engineering, Islamic Azad University (South Branch of Tehran), Tehran, Iran. ... parameter has on the results of numerical calculations. In this analysis ... with the help of Abaqus software (Abaqus theory manual) ...

  9. Probabilistic risk assessment of earthquakes at the Rocky Flats Plant and subsequent upgrade to reduce risk

    International Nuclear Information System (INIS)

    Day, S.A.

    1989-01-01

    An analysis to determine the risk associated with earthquakes at the Rocky Flats Plant was performed. Seismic analyses and structural evaluations were used to postulate building and equipment damage and radiological releases to the environment for various magnitudes of earthquakes. Dispersion modeling and dose assessments to the public were then calculated. The frequencies of occurrence of the various earthquake magnitudes were determined from the Department of Energy Natural Phenomena Hazards Modeling Project. Risk to the public was probabilistically assessed for each earthquake magnitude and for the overall seismic risk. Based on the results of this probabilistic risk assessment and a cost/benefit analysis, seismic upgrades are being implemented for several plutonium-handling facilities for the purpose of risk reduction
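    The aggregation behind such an assessment can be sketched as a sum of frequency-weighted consequences over earthquake magnitude bins; the frequencies and dose consequences below are invented for illustration and are not the Rocky Flats values.

    ```python
    # Sketch of probabilistic risk aggregation over earthquake magnitude bins:
    #   risk = sum_i (annual frequency_i * consequence_i).
    # Frequencies and consequences below are invented for illustration only.
    magnitude_bins = [
        # (label, annual frequency of the bin, public dose consequence [person-Sv])
        ("moderate", 1e-2, 0.1),
        ("large",    1e-3, 5.0),
        ("severe",   1e-4, 50.0),
    ]

    annual_risk = sum(freq * dose for _, freq, dose in magnitude_bins)
    print(f"aggregated annual risk: {annual_risk:.4f} person-Sv/yr")
    for label, freq, dose in magnitude_bins:
        print(f"  {label:8s} contribution: {freq * dose:.4f}")
    ```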

  10. Future Developments for the Earthquake Early Warning System following the 2011 off the Pacific Coast of Tohoku Earthquake

    Science.gov (United States)

    Yamada, M.; Mori, J. J.

    2011-12-01

    The 2011 off the Pacific Coast of Tohoku Earthquake (Mw 9.0) caused significant damage over a large area of northeastern Honshu. An earthquake early warning was issued to the public in the Tohoku region about 8 seconds after the first P-arrival, which was 31 seconds after the origin time. There was no 'blind zone', and warnings were received at all locations before the S-wave arrivals, since the earthquake was fairly far offshore. Although the early warning message was properly reported in the Tohoku region, which was the most severely affected area, a message was not sent to the more distant Tokyo region because the intensity was underestimated. This underestimation occurred because the magnitude determined in the first few seconds was relatively small (Mj 8.1) and because there was no consideration of a finite fault with a long rupture length. Another significant issue is that warnings were sometimes not properly provided for aftershocks. Immediately following the earthquake, the waveforms of some large aftershocks were contaminated by long-period surface waves from the mainshock, which made it difficult to pick P-wave arrivals. Also, correctly distinguishing and locating later aftershocks was sometimes difficult when multiple events occurred within a short period of time. The mainshock began with relatively small moment release for the first 10 s. Since the amplitude of the initial waveforms was small, most methods that use amplitudes and periods of the P-wave (e.g., Wu and Kanamori, 2005) cannot correctly determine the size of the earthquake in the first several seconds. The current JMA system uses the peak displacement amplitude for the magnitude estimation, and the magnitude saturated at about M8 one minute after the first P-wave arrival. Magnitudes of smaller earthquakes can be correctly identified from the first few seconds of P- or S-wave arrivals, but this M9 event could not be characterized in such a short time. The only way to correctly characterize the size of the Tohoku

  11. Monitoring the West Bohemian earthquake swarm in 2008/2009 by a temporary small-aperture seismic array

    Science.gov (United States)

    Hiemer, Stefan; Roessler, Dirk; Scherbaum, Frank

    2012-04-01

    The most recent intense earthquake swarm in West Bohemia lasted from 6 October 2008 to January 2009. Starting 12 days after the onset, the University of Potsdam monitored the swarm with a temporary small-aperture seismic array at 10 km epicentral distance. The purpose of the installation was complete monitoring of the swarm, including micro-earthquakes (ML 0.0). In the course of this work, the main temporal features (frequency-magnitude distribution, propagation of back azimuth and horizontal slowness, occurrence rate of aftershock sequences, and interevent-time distribution) of the 2008/2009 earthquake swarm are presented and discussed. Temporal changes of the coefficient of variation (based on interevent times) suggest that the swarm earthquake activity of the 2008/2009 swarm terminated by 12 January 2009. During the main phase of the studied swarm period, after 19 October, the b value of the Gutenberg-Richter relation decreased from 1.2 to 0.8. This trend is also reflected in the power-law behavior of the seismic moment release. The corresponding total seismic moment release of 1.02×10^17 Nm is equivalent to ML,max = 5.4.
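    The quoted equivalence between cumulative moment and magnitude can be checked with the standard moment magnitude relation Mw = (2/3)(log10 M0 − 9.1), with M0 in N·m; this gives roughly Mw 5.3 for 1.02×10^17 N·m (the ML 5.4 figure in the abstract relies on a regional local-magnitude relation, which differs slightly). A sketch:

    ```python
    # Moment magnitude from cumulative seismic moment (Hanks & Kanamori relation).
    # The abstract's ML-equivalent value uses a regional relation and differs slightly.
    import math

    def moment_magnitude(m0_newton_meters):
        """Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
        return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

    total_moment = 1.02e17  # N*m, cumulative moment release of the 2008/2009 swarm
    print(f"Mw equivalent: {moment_magnitude(total_moment):.2f}")  # ~5.3
    ```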

  12. Geotechnical hazards from large earthquakes and heavy rainfalls

    CERN Document Server

    Kazama, Motoki; Lee, Wei

    2017-01-01

    This book is a collection of papers presented at the International Workshop on Geotechnical Natural Hazards held July 12–15, 2014, in Kitakyushu, Japan. The workshop was the sixth in the series of Japan–Taiwan Joint Workshops on Geotechnical Hazards from Large Earthquakes and Heavy Rainfalls, held under the auspices of the Asian Technical Committee No. 3 on Geotechnology for Natural Hazards of the International Society for Soil Mechanics and Geotechnical Engineering. It was co-organized by the Japanese Geotechnical Society and the Taiwanese Geotechnical Society. The contents of this book focus on geotechnical and natural hazard-related issues in Asia such as earthquakes, tsunami, rainfall-induced debris flows, slope failures, and landslides. The book contains the latest information and mitigation technology on earthquake- and rainfall-induced geotechnical natural hazards. By dissemination of the latest state-of-the-art research in the area, the information contained in this book will help researchers, des...

  13. Analysis of earthquake clustering and source spectra in the Salton Sea Geothermal Field

    Science.gov (United States)

    Cheng, Y.; Chen, X.

    2015-12-01

    The Salton Sea Geothermal Field is located within the tectonic step-over between the San Andreas Fault and the Imperial Fault. Since the 1980s, geothermal energy exploration has resulted in a step-like increase of microearthquake activity, which mirrors the expansion of the geothermal field. Distinguishing naturally occurring and induced seismicity, and their corresponding characteristics (e.g., energy release), is important for hazard assessment. Between 2008 and 2014, seismic data recorded by a local borehole array were made publicly accessible by CalEnergy through the SCEC data center; the high-quality local recordings of over 7000 microearthquakes provide a unique opportunity to sort out the characteristics of induced versus natural activity. We obtain high-resolution earthquake locations using improved S-wave picks, waveform cross-correlation, and a new 3D velocity model. We then develop a method to identify spatially and temporally isolated earthquake clusters. These clusters are classified into aftershock-type, swarm-type, and mixed-type (aftershock-like, with low skew, low magnitude, and shorter duration) clusters, based on the relative timing of the largest earthquakes and the moment release. The mixed-type clusters are mostly located at 3-4 km depth near the injection well, while aftershock-type and swarm-type clusters also occur farther from the injection well. By counting the number of aftershocks within 1 day following the mainshock in each cluster, we find that the mixed-type clusters have much higher aftershock productivity compared with the other types and with historic M4 earthquakes. We analyze the detailed spatial variation of the b-value. We find that the mixed-type clusters are mostly located within high b-value patches, while large (M>3) earthquakes and the other types of clusters are located within low b-value patches. We are currently processing P- and S-wave spectra to analyze the spatial-temporal correlation of earthquake stress parameters and seismicity characteristics. Preliminary results suggest that the
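    b-value maps of the kind referred to here are typically computed with Aki's maximum-likelihood estimator from magnitudes above a completeness threshold; the sketch below shows that estimator under an assumed 0.1-magnitude binning, using an invented catalog rather than the Salton Sea data.

    ```python
    # Aki (1965) maximum-likelihood b-value estimate, with Utsu's correction
    # for binned magnitudes: b = log10(e) / (mean(M) - (Mc - dM/2)).
    # The magnitude list below is an invented example, not the Salton Sea catalog.
    import math

    def b_value_mle(magnitudes, completeness_mc, bin_width=0.1):
        above = [m for m in magnitudes if m >= completeness_mc]
        mean_mag = sum(above) / len(above)
        return math.log10(math.e) / (mean_mag - (completeness_mc - bin_width / 2.0))

    example_catalog = [1.2, 1.5, 1.3, 2.0, 1.1, 1.8, 2.4, 1.6, 1.4, 3.1, 1.9, 1.3]
    print(f"b-value: {b_value_mle(example_catalog, completeness_mc=1.1):.2f}")
    ```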

  14. Fast rise times and the physical mechanism of deep earthquakes

    Science.gov (United States)

    Houston, H.; Williams, Q.

    1991-01-01

    A systematic global survey of the rise times and stress drops of deep and intermediate earthquakes is reported. When the rise times are scaled to the seismic moment release of the events, their average is nearly twice as fast for events deeper than about 450 km as for shallower events.

  15. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    Science.gov (United States)

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.
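    The 30-year probabilities of exceedance shown on such maps are usually obtained from an annual exceedance rate under a Poisson assumption, P = 1 − exp(−λT); the sketch below illustrates the conversion with arbitrary example rates, not HERP values.

    ```python
    # Poisson conversion between annual exceedance rate and probability of
    # exceedance in an exposure window, as used in PSHA hazard maps.
    # Example rates are arbitrary, not taken from the HERP model.
    import math

    def prob_of_exceedance(annual_rate, window_years):
        """P(at least one exceedance in T years) = 1 - exp(-rate * T)."""
        return 1.0 - math.exp(-annual_rate * window_years)

    for annual_rate in (1e-4, 1e-3, 1e-2):
        p30 = prob_of_exceedance(annual_rate, 30.0)
        print(f"annual rate {annual_rate:.0e} -> 30-yr exceedance probability {p30:.3f}")
    ```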

  16. Influence of LOD variations on seismic energy release

    Science.gov (United States)

    Riguzzi, F.; Krumm, F.; Wang, K.; Kiszely, M.; Varga, P.

    2009-04-01

    Tidal friction causes significant time variations of geodynamical parameters, among them the geometrical flattening. The axial despinning of the Earth due to tidal friction, through the change of flattening, generates incremental meridional and azimuthal stresses. The stress pattern in an incompressible elastic upper mantle and crust is symmetric about the equator and has its inflection points at the critical latitude close to ±45°. Consequently, the distribution of seismic energy released by strong, shallow-focus earthquakes should also have sharp maxima at this latitude. To investigate the influence of length-of-day (LOD) variations on earthquake activity, an earthquake catalogue of the strongest seismic events (M>7.0) was compiled for the period 1900-2007. It is shown with the use of this catalogue that, for the studied time interval, the catalogue is complete and contains the seismic events responsible for more than 90% of the released seismic energy. Study of the catalogue for earthquakes M>7.0 shows that the seismic energy discharged by the strongest seismic events has significant maxima at ±45°, which renders it probable that the seismic activity of our planet is influenced by an external component, i.e., by tidal friction, acting through the variation of the hydrostatic figure of the Earth that it causes. The distribution along latitude of earthquake numbers and energies was also investigated for global linear tectonic structures, such as mid-ocean ridges and subduction zones. It can be shown that the number of shallow-focus shocks has a distribution along latitude similar to that of the linear tectonic structures. This means that the position of the foci of seismic events is mainly controlled by tectonic activity.
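    Converting catalog magnitudes to released seismic energy and binning by epicentral latitude, as done in this kind of study, can be sketched with the Gutenberg-Richter energy relation log10 E ≈ 1.5M + 4.8 (E in joules); the small catalog below is invented purely for illustration.

    ```python
    # Sketch: seismic energy release binned by latitude.
    # Energy from magnitude via log10(E[J]) ~= 1.5*M + 4.8 (Gutenberg-Richter).
    # The (latitude, magnitude) pairs are invented for illustration only.
    import numpy as np

    events = np.array([  # (latitude in degrees, magnitude)
        (44.0, 7.2), (-46.0, 7.5), (10.0, 7.1), (38.0, 7.8), (-44.5, 7.3), (0.5, 7.0),
    ])

    energies = 10.0 ** (1.5 * events[:, 1] + 4.8)          # joules
    bins = np.arange(-90, 91, 30)                          # 30-degree latitude bands
    band_energy, _ = np.histogram(events[:, 0], bins=bins, weights=energies)

    for lo, hi, e in zip(bins[:-1], bins[1:], band_energy):
        print(f"{lo:+4d}..{hi:+4d} deg: {e:.2e} J")
    ```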

  17. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    Science.gov (United States)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  18. Synthetic strong ground motions for engineering design utilizing empirical Green's functions

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L.J.; Jarpe, S.P.; Kasameyer, P.W.; Foxall, W.

    1996-04-11

    We present a methodology for developing realistic synthetic strong ground motions for specific sites from specific earthquakes. We analyzed the possible ground motion resulting from a M = 7.25 earthquake that ruptures 82 km of the Hayward fault for a site 1.4 km from the fault in the eastern San Francisco Bay area. We developed a suite of 100 rupture scenarios for the Hayward fault earthquake and computed the corresponding strong ground motion time histories. We synthesized strong ground motion with physics-based solutions of earthquake rupture and applied physical bounds on rupture parameters. By having a suite of rupture scenarios of hazardous earthquakes for a fixed magnitude and identifying the hazard to the site from the statistical distribution of engineering parameters, we introduce a probabilistic component into the deterministic hazard calculation. Engineering parameters of synthesized ground motions agree with those recorded from the 1995 Kobe, Japan and the 1992 Landers, California earthquakes at similar distances and site geologies.
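    Characterizing the hazard to a site from a suite of rupture scenarios by the statistical distribution of an engineering parameter, as described above, reduces to computing percentiles over the scenario set; a minimal sketch with synthetic peak-acceleration values (not the actual Hayward fault results) follows.

    ```python
    # Sketch: summarize an engineering parameter (e.g., PGA) over a suite of
    # rupture scenarios by its statistical distribution. Values are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in for PGA (g) computed from 100 rupture scenarios of a fixed magnitude.
    pga_scenarios = rng.lognormal(mean=np.log(0.4), sigma=0.5, size=100)

    for q in (16, 50, 84):
        print(f"{q}th percentile PGA: {np.percentile(pga_scenarios, q):.2f} g")
    print(f"fraction of scenarios exceeding 0.6 g: {np.mean(pga_scenarios > 0.6):.2f}")
    ```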

  19. Evaluating protein incorporation and release in electrospun composite scaffolds for bone tissue engineering applications.

    Science.gov (United States)

    Briggs, Tonye; Matos, Jeffrey; Collins, George; Arinzeh, Treena Livingston

    2015-10-01

    Electrospun polymer/ceramic composites have gained interest for use as scaffolds for bone tissue engineering applications. In this study, we investigated methods to incorporate Platelet Derived Growth Factor-BB (PDGF-BB) in electrospun polycaprolactone (PCL) or PCL prepared with polyethylene oxide (PEO), where both contained varying levels (up to 30 wt %) of ceramic composed of biphasic calcium phosphates, hydroxyapatite (HA)/β-tricalcium phosphate (TCP). Using a model protein, lysozyme, we compared two methods of protein incorporation, adsorption and emulsion electrospinning. Adsorption of lysozyme on scaffolds with ceramic resulted in minimal release of lysozyme over time. Using emulsion electrospinning, lysozyme released from scaffolds containing a high concentration of ceramic where the majority of the release occurred at later time points. We investigated the effect of reducing the electrostatic interaction between the protein and the ceramic on protein release with the addition of the cationic surfactant, cetyl trimethylammonium bromide (CTAB). In vitro release studies demonstrated that electrospun scaffolds prepared with CTAB released more lysozyme or PDGF-BB compared with scaffolds without the cationic surfactant. Human mesenchymal stem cells (MSCs) on composite scaffolds containing PDGF-BB incorporated through emulsion electrospinning expressed higher levels of osteogenic markers compared to scaffolds without PDGF-BB, indicating that the bioactivity of the growth factor was maintained. This study revealed methods for incorporating growth factors in polymer/ceramic scaffolds to promote osteoinduction and thereby facilitate bone regeneration. © 2015 Wiley Periodicals, Inc.

  20. Small Buildings in Earthquake Areas. Educational Building Digest 2.

    Science.gov (United States)

    Mooij, D.

    This booklet is intended for builders and others who actually construct small buildings in earthquake areas and not for professionally qualified architects or engineers. In outline form with sketches the following topics are discussed: general construction and design principles; foundations; earth walls; brick, block, and stone walls; timber frame…

  1. Coseismic and postseismic deformation associated with the 2016 Mw 7.8 Kaikoura earthquake, New Zealand: fault movement investigation and seismic hazard analysis

    Science.gov (United States)

    Jiang, Zhongshan; Huang, Dingfa; Yuan, Linguo; Hassan, Abubakr; Zhang, Lupeng; Yang, Zhongrong

    2018-04-01

    The 2016 moment magnitude (Mw) 7.8 Kaikoura earthquake demonstrated that multiple fault segments can rupture during a single seismic event. Here, we employ Global Positioning System (GPS) observations and geodetic modeling methods to create detailed images of the coseismic slip and postseismic afterslip associated with the Kaikoura earthquake. Our optimal geodetic coseismic model suggests that rupture occurred not only on shallow crustal faults but also, to some extent, at the Hikurangi subduction interface. The GPS-inverted moment release during the earthquake is equivalent to a Mw 7.9 event. The near-field postseismic deformation is mainly derived from right-lateral strike-slip motions on shallow crustal faults. The afterslip not only extended significantly northeastward on the Needles fault but also appeared at the plate interface, slowly releasing energy over the past 6 months, equivalent to a Mw 7.3 earthquake. Coulomb stress changes induced by the coseismic deformation exhibit complex patterns and diversity at different depths, undoubtedly reflecting the multi-fault rupture complexity of the earthquake. The Coulomb stress change can reach several MPa during coseismic deformation, which can explain the triggering of afterslip in the two high-slip regions and of the majority of aftershocks. Based on the deformation characteristics of the Kaikoura earthquake, interseismic plate coupling, and historical earthquakes, we conclude that Wellington is under higher seismic threat after the earthquake and that great attention should be paid to potential large earthquake disasters in the near future.
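    The Coulomb stress changes mentioned here follow the usual definition ΔCFS = Δτ + μ′Δσn (shear stress change in the slip direction plus effective friction times the normal stress change, with unclamping positive); the sketch below evaluates it for assumed stress-change values, not the inverted Kaikoura model.

    ```python
    # Coulomb failure stress change on a receiver fault:
    #   dCFS = d_tau + mu_eff * d_sigma_n
    # with d_tau the shear stress change in the slip direction and d_sigma_n the
    # normal stress change (positive = unclamping). Values below are assumptions.
    def coulomb_stress_change(d_tau_mpa, d_sigma_n_mpa, mu_effective=0.4):
        return d_tau_mpa + mu_effective * d_sigma_n_mpa

    # Hypothetical receiver patches (MPa), e.g., near two high-afterslip regions.
    patches = [("patch A", 0.8, 0.5), ("patch B", 0.3, -0.2)]
    for name, d_tau, d_sigma in patches:
        dcfs = coulomb_stress_change(d_tau, d_sigma)
        print(f"{name}: dCFS = {dcfs:+.2f} MPa "
              f"({'promotes' if dcfs > 0 else 'inhibits'} failure)")
    ```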

  2. Mental Health of Survivors of the 2010 Haitian Earthquake Living in the United States

    Centers for Disease Control (CDC) Podcasts

    2010-04-16

    Thousands of survivors of the 2010 Haitian Earthquake are currently living in the United States. This podcast features a brief non-disease-specific interview with Dr. Marc Safran, CDC's longest serving psychiatrist, about a few of the mental health challenges such survivors may face.  Created: 4/16/2010 by CDC Center of Attribution: Mental and Behavioral Health Team, 2010 CDC Haiti Earthquake Mission, CDC Emergency Operations Center.   Date Released: 5/6/2010.

  3. [Construction and evaluation of the tissue engineered nerve of bFGF-PLGA sustained release microspheres].

    Science.gov (United States)

    Wang, Guanglin; Lin, Wei; Gao, Weiqiang; Xiao, Yuhua; Dong, Changchao

    2008-12-01

    To study the outcomes of nerve defect repair with a tissue engineered nerve composed of a complex of SCs, 30% ECM gel, bFGF-PLGA sustained release microspheres, PLGA microfilaments, and permeable poly(D,L-lactic acid) (PDLLA) catheters. SCs were cultured and purified from the sciatic nerves of 1-day-old neonatal SD rats. The 1st passage cells were compounded with bFGF-PLGA sustained release microspheres and ECM gel, and then were injected into permeable PDLLA catheters with PLGA microfilaments inside. In this way, the tissue engineered nerve was constructed. Sixty SD rats were included. The model of 15-mm sciatic nerve defects was made, and then the rats were randomly divided into 5 groups, with 12 rats in each. In group A, autograft was adopted. In group B, the blank PDLLA catheters with PBS inside were used. In group C, PDLLA catheters, with PLGA microfilaments and 30% ECM gel inside, were used. In group D, PDLLA catheters, with PLGA microfilaments, SCs and 30% ECM gel inside, were used. In group E, the tissue engineered nerve was applied. After the operation, observations were made of the general condition of the rats. The sciatic function index (SFI) analysis was performed at 12, 16, 20 and 24 weeks after the operation, respectively. Electrophysiological detection and histological observation were performed at 12 and 24 weeks after the operation, respectively. All rats survived to the end of the experiment. At 12 and 16 weeks after the operation, group E was significantly different from group B in SFI (P fibers in group E were significantly different from those in groups A, B and C (P fibers in group E were smaller than those in group A (P fibers in group E was significantly different from those in groups A, B, C (P fibers in group E were bigger than those in groups B and C (P < 0.05). The tissue engineered nerve with the complex of SCs, ECM gel, bFGF-PLGA sustained release microspheres, PLGA microfilaments and permeable PDLLA catheters promote

  4. Chaotic behaviour in the non-linear optimal control of unilaterally contacting building systems during earthquakes

    CERN Document Server

    Liolios, A

    2003-01-01

    The paper presents a new numerical approach for a non-linear optimal control problem arising in earthquake civil engineering. This problem concerns the elastoplastic softening-fracturing unilateral contact between neighbouring buildings during earthquakes when Coulomb friction is taken into account under second-order instabilizing effects. The earthquake response of the adjacent structures can therefore exhibit instabilities and chaotic behaviour. The problem formulation presented here leads to a set of equations and inequalities, which is equivalent to a dynamic hemivariational inequality in the sense introduced by Panagiotopoulos [Hemivariational Inequalities. Applications in Mechanics and Engineering, Springer-Verlag, Berlin, 1993]. The numerical procedure is based on an incremental problem formulation and on a double discretization, in space by the finite element method and in time by the Wilson-theta method. The generally non-convex constitutive contact laws are piecewise linearized, and in each time-step a non-c...
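    The Wilson-theta time discretization named above is a standard implicit scheme; a minimal single-degree-of-freedom version (linear system, no contact or friction, so only the time-stepping machinery and not the paper's hemivariational formulation) is sketched below, following the usual textbook coefficients.

    ```python
    # Minimal Wilson-theta time integration for a linear SDOF system
    #   m*u'' + c*u' + k*u = f(t),
    # following the standard textbook formulation (theta >= 1.37 for unconditional
    # stability). This is only the time-stepping scheme; it does not include the
    # unilateral contact / friction conditions treated in the paper.
    import numpy as np

    def wilson_theta_sdof(m, c, k, f, dt, theta=1.4, u0=0.0, v0=0.0):
        n = len(f)
        u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
        u[0], v[0] = u0, v0
        a[0] = (f[0] - c * v0 - k * u0) / m
        td = theta * dt
        keff = k + 6.0 * m / td**2 + 3.0 * c / td
        for i in range(n - 1):
            # load extrapolated to t + theta*dt
            f_theta = f[i] + theta * (f[i + 1] - f[i])
            reff = (f_theta
                    + m * (6.0 / td**2 * u[i] + 6.0 / td * v[i] + 2.0 * a[i])
                    + c * (3.0 / td * u[i] + 2.0 * v[i] + td / 2.0 * a[i]))
            u_theta = reff / keff
            a[i + 1] = (6.0 / (theta * td**2) * (u_theta - u[i])
                        - 6.0 / (theta * td) * v[i] + (1.0 - 3.0 / theta) * a[i])
            v[i + 1] = v[i] + dt / 2.0 * (a[i + 1] + a[i])
            u[i + 1] = u[i] + dt * v[i] + dt**2 / 6.0 * (a[i + 1] + 2.0 * a[i])
        return u, v, a

    if __name__ == "__main__":
        dt, n = 0.01, 500
        t = np.arange(n) * dt
        load = 100.0 * np.sin(2.0 * np.pi * 2.0 * t)        # assumed forcing
        disp, _, _ = wilson_theta_sdof(m=10.0, c=2.0, k=4000.0, f=load, dt=dt)
        print(f"peak displacement: {np.max(np.abs(disp)):.4f} m")
    ```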

  5. Use of earthquake experience data

    International Nuclear Information System (INIS)

    Eder, S.J.; Eli, M.W.

    1991-01-01

    At many of the older existing US Department of Energy (DOE) facilities, the need has arisen for evaluation guidelines for natural phenomena hazard assessment. The effect of a design basis earthquake at most of these facilities is one of the main concerns. Earthquake experience data can provide a basis for the needed seismic evaluation guidelines, resulting in an efficient screening evaluation methodology for several of the items that are in the scope of the DOE facility reviews. The experience-based screening evaluation methodology, when properly established and implemented by trained engineers, has proven to result in sufficient safety margins and focuses on real concerns via facility walkdowns, usually at costs much less than the alternative options of analysis and testing. This paper summarizes a program that is being put into place to establish uniform seismic evaluation guidelines and criteria for evaluation of existing DOE facilities. The intent of the program is to maximize use of past experience, in conjunction with a walkdown screening evaluation process

  6. Ground motion following selection of SRS design basis earthquake and associated deterministic approach

    International Nuclear Information System (INIS)

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart

  7. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study, and fifteen large earthquake disasters in Iran were investigated and analyzed over a period of more than a century. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  8. TSUNAMIGENIC SOURCE MECHANISM AND EFFICIENCY OF THE MARCH 11, 2011 SANRIKU EARTHQUAKE IN JAPAN

    Directory of Open Access Journals (Sweden)

    George Pararas-Carayannis

    2011-01-01

    Full Text Available The great Tohoku earthquake of March 11, 2011 generated a very destructive and anomalously high tsunami. To understand its source mechanism, an examination was undertaken of the seismotectonics of the region and of the earthquake’s focal mechanism, energy release, rupture patterns and spatial and temporal sequencing and clustering of major aftershocks. It was determined that the great tsunami resulted from a combination of crustal deformations of the ocean floor due to up-thrust tectonic motions, augmented by additional uplift due to the quake’s slow and long rupturing process, as well as to large coseismic lateral movements which compressed and deformed the compacted sediments along the accretionary prism of the overriding plate. The deformation occurred randomly and non-uniformly along parallel normal faults and along oblique, en-echelon faults relative to the earthquake’s overall rupture direction – the latter failing in a sequential bookshelf manner with variable slip angles. As the 1992 Nicaragua and the 2004 Sumatra earthquakes demonstrated, such bookshelf failures of sedimentary layers could contribute to anomalously high tsunamis. As with the 1896 tsunami, additional ocean floor deformation and uplift of the sediments was responsible for the higher waves generated by the 2011 earthquake. The efficiency of tsunami generation was greater along the shallow eastern segment of the fault off the Miyagi Prefecture where most of the energy release of the earthquake and the deformations occurred, while the segment off the Ibaraki Prefecture – where the rupture process was rapid – released less seismic energy, resulted in less compaction and deformation of sedimentary layers and thus in a tsunami of lesser offshore height. The greater tsunamigenic efficiency of the 2011 earthquake and high degree of the tsunami’s destructiveness along Honshu’s coastlines resulted from vertical crustal displacements of more than 10 meters due to up

  9. Tectonic styles of future earthquakes in Italy as input data for seismic hazard

    Science.gov (United States)

    Pondrelli, S.; Meletti, C.; Rovida, A.; Visini, F.; D'Amico, V.; Pace, B.

    2017-12-01

    In a recent elaboration of a new seismogenic zonation and hazard model for Italy, we tried to understand how many indications we have on the tectonic style of future earthquake/rupture. Using all available or recomputed seismic moment tensors for relevant seismic events (Mw starting from 4.5) of the last 100 yrs, first arrival focal mechanisms for less recent earthquakes and also geological data on past activated faults, we collected a database gathering thousands of data points from all over the Italian peninsula and the regions around it. After several summations of seismic moment tensors, over regular grids of different dimensions and different thicknesses of the seismogenic layer, we applied the same procedure to each of the 50 area sources that were designed in the seismogenic zonation. The results for several seismic zones are very stable, e.g. along the southern Apennines we expect future earthquakes to be mostly extensional, although in the outer part of the chain strike-slip events are possible. In the Northern part of the Apennines we also expect different, opposite tectonic styles for different hypocentral depths. In several zones, characterized by a low seismic moment release, defined for the study region using 1000 yrs of catalog, the next possible tectonic style of future earthquakes is less clear. It is worth noting that for some zones the largest possible earthquake may not be represented in the available observations. We also add to our analysis the computation of the seismic release rate, computed using a distributed completeness, identified for single great events of the historical seismic catalog for Italy. All these information layers, overlapped and compared, may be used to characterize each new seismogenic zone.

  10. The earthquakes of stable continental regions. Volume 2: Appendices A to E. Final report

    International Nuclear Information System (INIS)

    Johnston, A.C.; Kanter, L.R.; Coppersmith, K.J.; Cornell, C.A.

    1994-12-01

    The objectives of the study were to develop a comprehensive database of earthquakes in stable continental regions (SCRs) and to statistically examine use of the database for the assessment of large earthquake potential. We identified nine major and several minor SCRs worldwide and compiled a database of geologic characteristics of tectonic domains within each SCR. We examined all available earthquake data from SCRs, from historical accounts of events with no instrumental ground-motion data to present-day instrumentally recorded events. In all, 1,385 events were analyzed. Using moment magnitude 4.5 as the lower bound threshold for inclusion in the database, 870 were assigned to an SCR, 124 were found to be transitional to an SCR, and 391 were examined, but rejected. We then performed a seismotectonic analysis to determine what distinguishes seismic activity in SCRs from other types of crust, such as active plate margins or active continental regions. General observations are: (1) SCRs comprise nearly two-thirds of all continental crust of which 25% is considered to be extended (i.e., rifted); (2) the majority of seismic energy release and the largest earthquakes in SCRs have occurred in extended crust; and (3) active plate margins release seismic energy at a rate per unit area approximately 7,000 times the average for non-extended SCRs. Finally, results of a statistical examination of distributions of historical maximum earthquakes between different crustal domain types indicated that additional information is needed in order to adequately constrain estimates of maximum earthquakes for any given region. Thus, a Bayesian approach was developed in which statistical constraints from the database were used to develop a prior distribution, which may then be combined with source-specific information to constrain maximum magnitude assessments for use in probabilistic seismic hazard analyses
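
    The Bayesian treatment of maximum magnitude mentioned above can be sketched generically. The following LaTeX fragment is an illustration of such an update, not the report's exact formulation: the prior f(m_max) is assumed to come from the global SCR database, and the likelihood encodes source-specific observations.

        % Generic Bayesian update for the maximum magnitude m_max of a source
        % (an illustration of the approach, not the report's exact formulation):
        % the prior f(m_max) is derived from the global SCR database, and the
        % likelihood L(data | m_max) encodes source-specific observations.
        \[
          f\left(m_{\max}\mid \mathrm{data}\right)
            = \frac{L\left(\mathrm{data}\mid m_{\max}\right)\, f\left(m_{\max}\right)}
                   {\int L\left(\mathrm{data}\mid m\right)\, f\left(m\right)\, dm}
        \]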

  11. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Science.gov (United States)

    2010-08-17

    ... accommodate Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov... of Technology, 365 Innovation Drive, Memphis, TN 38152-3115. Please note admittance instructions...: Trends and developments in the science and engineering of earthquake hazards reduction; The effectiveness...

  12. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes

    Science.gov (United States)

    Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai

    2018-01-01

    Casualty prediction in a building during earthquakes supports economic loss estimation in the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties, few current studies consider occupant movements in the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The occupant evacuation simulation is verified against a recorded evacuation process from a school classroom in the real-life 2013 Ya'an earthquake in China. The occupant casualties in the building under earthquakes are evaluated by coupling the building collapse process simulation by finite element method, the occupant evacuation simulation, and the casualty occurrence criteria with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction.
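
    A minimal Python sketch of the cellular-automaton idea is given below: each occupant moves one cell per time step down a breadth-first-search distance field toward the nearest exit. The grid, exits and movement rule are illustrative assumptions, not the refined model of the cited study.

        # Minimal sketch of a cellular-automaton evacuation step (illustrative only;
        # the cited study uses a refined CA with many more behavioural rules).
        from collections import deque

        def distance_field(walls, exits, nrows, ncols):
            """BFS distance (in cells) from every free cell to the nearest exit."""
            dist = {cell: 0 for cell in exits}
            queue = deque(exits)
            while queue:
                r, c = queue.popleft()
                for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                    nr, nc = nxt
                    if 0 <= nr < nrows and 0 <= nc < ncols and nxt not in walls and nxt not in dist:
                        dist[nxt] = dist[(r, c)] + 1
                        queue.append(nxt)
            return dist

        def step(occupants, dist, walls):
            """Move each occupant one cell down the distance gradient if the target cell is free."""
            occupied = set(occupants)
            new_positions = []
            for r, c in occupants:
                best = (r, c)
                for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                    if nxt in dist and nxt not in walls and nxt not in occupied and dist[nxt] < dist[best]:
                        best = nxt
                occupied.discard((r, c))
                occupied.add(best)
                new_positions.append(best)
            return new_positions

        # Tiny example: a 5x5 room with one exit at (0, 2) and two occupants.
        walls, exits = set(), [(0, 2)]
        dist = distance_field(walls, exits, 5, 5)
        people = [(4, 0), (4, 4)]
        for _ in range(10):
            people = [p for p in people if dist[p] > 0]   # occupants at an exit have left
            people = step(people, dist, walls)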

  13. Future of Earthquake Early Warning: Quantifying Uncertainty and Making Fast Automated Decisions for Applications

    Science.gov (United States)

    Wu, Stephen

    Earthquake early warning (EEW) systems have been rapidly developing over the past decade. Japan Meteorological Agency (JMA) has an EEW system that was operating during the 2011 M9 Tohoku earthquake in Japan, and this increased the awareness of EEW systems around the world. While longer-time earthquake prediction still faces many challenges to be practical, the availability of shorter-time EEW opens up a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system utilizes the first few seconds of recorded seismic waveform data to quickly predict the hypocenter location, magnitude, origin time and the expected shaking intensity level around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit human intervention to activate mitigation actions and this must be addressed for engineering applications of EEW. This study applies a Bayesian probabilistic approach along with machine learning techniques and decision theories from economics to improve different aspects of EEW operation, including extending it to engineering applications. Existing EEW systems are often based on a deterministic approach. Often, they assume that only a single event occurs within a short period of time, which led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm based on an existing deterministic model to extend the EEW system to the case of concurrent events, which are often observed during the aftershock sequence after a large earthquake. To overcome the challenge of uncertain information and short lead time of EEW, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decision for EEW mitigation applications. A cost-benefit model that
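
    The decision-theoretic core of such an automated EEW framework can be illustrated with a simple expected-cost comparison. The rule, parameter names and numbers below are assumptions for illustration, not the ePAD implementation.

        # Hedged sketch of a cost-benefit trigger rule for an EEW mitigation action
        # (illustrative; not the ePAD implementation). p_exceed is the probability,
        # given the uncertain EEW prediction, that shaking at the site exceeds the
        # damage threshold for the protected operation.
        def should_trigger(p_exceed: float,
                           cost_action: float,
                           loss_if_damage: float,
                           residual_loss_fraction: float = 0.1) -> bool:
            """Trigger when the expected cost with the action is lower than without it."""
            expected_cost_no_action = p_exceed * loss_if_damage
            expected_cost_action = cost_action + p_exceed * residual_loss_fraction * loss_if_damage
            return expected_cost_action < expected_cost_no_action

        # Example with made-up numbers: stopping a production line costs 50 units,
        # shaking-induced damage would cost 2000 units, mitigation removes 90% of it.
        print(should_trigger(p_exceed=0.08, cost_action=50.0, loss_if_damage=2000.0))  # True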

  14. Earthquake response analyses of soil-structure system considering kinematic interaction

    International Nuclear Information System (INIS)

    Murakami, H.; Yokono, K.; Miura, S.; Ishii, K.

    1985-01-01

    Improvement of soil-structure interaction analysis has been one of major concerns in earthquake engineering field, especially in nuclear industries, to evaluate the safety of structure accurately under earthquake events. This research aims to develop a rational analytical tool which considers effect of the 'kinematic interaction' satisfactory with a proposed simple low-pass filter. In this paper, first the effect of the kinematic interaction is investigated based on earthquake response analysis of a reactor building using the practical design models: the spring-mass-dashpot system and the 'lattice model', in which a building and soil medium are modeled by a system of lumped masses. Next, the filter is developed based on parametrical studies with various sizes of depth and width of foundations embedded in two-layers soil, which represents more general soil condition in practical designs compared with a homogeneous soil medium. (orig.)

  15. A preliminary census of engineering activities located in Sicily (Southern Italy) which may "potentially" induce seismicity

    Science.gov (United States)

    Aloisi, Marco; Briffa, Emanuela; Cannata, Andrea; Cannavò, Flavio; Gambino, Salvatore; Maiolino, Vincenza; Maugeri, Roberto; Palano, Mimmo; Privitera, Eugenio; Scaltrito, Antonio; Spampinato, Salvatore; Ursino, Andrea; Velardita, Rosanna

    2015-04-01

    The seismic events caused by human engineering activities are commonly termed "triggered" or "induced". This class of earthquakes, though characterized by low-to-moderate magnitude, has significant social and economic implications since they occur close to the engineering activity responsible for triggering/inducing them and can be felt by the inhabitants living nearby, and may even produce damage. One of the first well-documented examples of induced seismicity was observed in 1932 in Algeria, when a shallow magnitude 3.0 earthquake occurred close to the Oued Fodda Dam. With the continuous global improvement of seismic monitoring networks, numerous other examples of human-induced earthquakes have been identified. Induced earthquakes occur at shallow depths and are related to a number of human activities, such as fluid injection under high pressure (e.g. waste-water disposal in deep wells, hydrofracturing activities in enhanced geothermal systems and oil recovery, shale-gas fracking, natural and CO2 gas storage), hydrocarbon exploitation, groundwater extraction, deep underground mining, large water impoundments and underground nuclear tests. In Italy, induced/triggered seismicity is suspected to have contributed to the disaster of the Vajont dam in 1963. Despite this suspected case and the presence in the Italian territory of a large number of engineering activities "capable" of inducing seismicity, no extensive research on this topic has been conducted to date. Hence, in order to improve knowledge and correctly assess the potential hazard at a specific location in the future, here we started a preliminary study on the entire range of engineering activities currently located in Sicily (Southern Italy) which may "potentially" induce seismicity. To this end, we performed: • a preliminary census of all engineering activities located in the study area by collecting all the useful information coming from available on-line catalogues; • a detailed compilation

  16. Rupture distribution of the 1977 western Argentina earthquake

    Science.gov (United States)

    Langer, C.J.; Hartzell, S.

    1996-01-01

    Teleseismic P and SH body waves are used in a finite-fault, waveform inversion for the rupture history of the 23 November 1977 western Argentina earthquake. This double event consists of a smaller foreshock (M0 = 5.3 × 10^26 dyn-cm) followed about 20 s later by a larger main shock (M0 = 1.5 × 10^27 dyn-cm). Our analysis indicates that these two events occurred on different fault segments: with the foreshock having a strike, dip, and average rake of 345°, 45°E, and 50°, and the main shock 10°, 45°E, and 80°, respectively. The foreshock initiated at a depth of 17 km and propagated updip and to the north. The main shock initiated at the southern end of the foreshock zone at a depth of 25 to 30 km, and propagated updip and unilaterally to the south. The north-south separation of the centroids of the moment release for the foreshock and main shock is about 60 km. The apparent triggering of the main shock by the foreshock is similar to other earthquakes that have involved the failure of multiple fault segments, such as the 1992 Landers, California, earthquake. Such occurrences argue against the use of individual, mapped, surface fault or fault-segment lengths in the determination of the size and frequency of future earthquakes.
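
    For reference, the moment magnitudes implied by the quoted seismic moments follow from the standard Hanks-Kanamori relation, Mw = (2/3) log10 M0 - 10.7 for M0 in dyn-cm (a standard conversion, not stated in the abstract):

        import math

        def moment_magnitude_dyn_cm(m0_dyn_cm: float) -> float:
            """Hanks-Kanamori moment magnitude for a seismic moment given in dyn-cm."""
            return (2.0 / 3.0) * math.log10(m0_dyn_cm) - 10.7

        print(round(moment_magnitude_dyn_cm(5.3e26), 1))  # foreshock, roughly Mw 7.1
        print(round(moment_magnitude_dyn_cm(1.5e27), 1))  # main shock, roughly Mw 7.4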

  17. A study on generation of simulated earthquake ground motion for seismic design of nuclear power plant

    International Nuclear Information System (INIS)

    Ichiki, Tadaharu; Matsumoto, Takuji; Kitada, Yoshio; Osaki, Yorihiko; Kanda, Jun; Masao, Toru.

    1985-01-01

    The aseismatic design of nuclear power generation facilities carried out in Japan at present must conform to the "Guideline for aseismatic design examination regarding power reactor facilities" decided by the Atomic Energy Commission in 1978. In this guideline, the earthquake motion used for the analysis of dynamic earthquake response is to be given in the form of the magnitude determined on the basis of the investigation of historical earthquakes and active faults around construction sites and the response spectra corresponding to the distance from epicenters. Accordingly, when the analysis of dynamic earthquake response is actually carried out, the simulated earthquake motion made in conformity with these specified response spectra is used as the input earthquake motion for the design. To establish techniques for generating simulated earthquake motion that is more appropriate and rational from an engineering viewpoint, this research was carried out, and the results are summarized in this paper. The techniques for making simulated earthquake motion, the response of buildings and the response spectra of floors are described. (Kako, I.)

  18. A complex rupture image of the 2011 off the Pacific coast of Tohoku Earthquake revealed by the MeSO-net

    Science.gov (United States)

    Honda, Ryou; Yukutake, Yohei; Ito, Hiroshi; Harada, Masatake; Aketagawa, Tamotsu; Yoshida, Akio; Sakai, Shin'ichi; Nakagawa, Shigeki; Hirata, Naoshi; Obara, Kazushige; Kimura, Hisanori

    2011-07-01

    Strong ground motions from the 2011 off the Pacific coast of Tohoku Earthquake, the most powerful earthquake to have occurred in and around Japan after the installation of a modern seismic network, were recorded for more than 300 seconds by a dense and wide-span seismic network, the Metropolitan Seismic Observation Network (MeSO-net), installed around the Tokyo metropolitan area about 200 km away from the epicenter. We investigate the rupture process of the earthquake in space and time by performing semblance-enhanced stacking analysis of the waveforms in a frequency range of 0.05 to 0.5 Hz. By projecting the power of the stacked waveforms to an assumed fault plane, the rupture propagation image of the large and complex earthquake has been successfully obtained. The seismic energy was mainly generated from the off-shore areas of about 100 km away from the coast in Miyagi and Fukushima Prefectures. The shallow and eastern part of the fault along the Japan trench off Miyagi Prefecture released strong seismic energy which might have been related to the excitation of gigantic tsunami. In contrast, the southern shallow part of the fault plane, off Ibaraki Prefecture, released only minor seismic energy. Our analysis suggests that the focal areas combining both the officially-forecasted Miyagi-oki earthquake and those of historical earthquakes that occurred off the coast of Fukushima Prefecture in 1938 were broken, resulting in the 2011 great M 9 earthquake.

  19. Characterizing Aftershock Sequences of the Recent Strong Earthquakes in Central Italy

    Science.gov (United States)

    Kossobokov, Vladimir G.; Nekrasova, Anastasia K.

    2017-10-01

    The recent strong earthquakes in Central Italy allow for a comparative analysis of their aftershocks from the viewpoint of the Unified Scaling Law for Earthquakes, USLE, which generalizes the Gutenberg-Richter relationship making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. In particular, we consider aftershocks as a sequence of avalanches in the self-organized system of blocks and faults of the Earth's lithosphere, each aftershock series being characterized by the distribution of the USLE control parameter, η. We found the existence, in the long term, of different, intermittent levels of rather steady seismic activity characterized by a near-constant value of η, which switch, in the mid-term, at times of transition associated with catastrophic events. On such a transition, seismic activity may follow different scenarios with inter-event time scaling of different kinds, including constant, logarithmic, power law, exponential rise/decay or a mixture of those as observed in the case of the ongoing one associated with the three strong earthquakes in 2016. Evidently, our results do not support the presence of universality of seismic energy release, while providing constraints on modelling seismic sequences for earthquake physicists and supplying decision makers with information for improving local seismic hazard assessments.
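
    For orientation, the Gutenberg-Richter relation and the commonly published form of the USLE that generalizes it are sketched below in LaTeX; the USLE notation is given as usually presented by these authors and should be checked against the paper itself.

        % Gutenberg-Richter relation and the Unified Scaling Law for Earthquakes
        % (USLE) in its commonly published form (hedged; the paper's notation may
        % differ). N is the expected annual number of events with magnitude >= M,
        % and L is the linear size of the territory considered.
        \begin{align*}
          \log_{10} N(M)    &= a - b\,M \\
          \log_{10} N(M, L) &= A + B\,(5 - M) + C \log_{10} L
        \end{align*}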

  20. The Chiloé Mw 7.6 earthquake of 2016 December 25 in Southern Chile and its relation to the Mw 9.5 1960 Valdivia earthquake

    Science.gov (United States)

    Lange, Dietrich; Ruiz, Javier; Carrasco, Sebastián; Manríquez, Paula

    2018-04-01

    On 2016 December 25, an Mw 7.6 earthquake broke a portion of the Southern Chilean subduction zone south of Chiloé Island, located in the central part of the Mw 9.5 1960 Valdivia earthquake. This region is characterized by repeated earthquakes in 1960 and historical times with very sparse interseismic activity due to the subduction of a young (~15 Ma), and therefore hot, oceanic plate. We estimate the coseismic slip distribution based on a kinematic finite-fault source model, and through joint inversion of teleseismic body waves and strong motion data. The coseismic slip model yields a total seismic moment of 3.94 × 10^20 N·m that occurred over ~30 s, with the rupture propagating mainly downdip, reaching a peak slip of ~4.2 m. Regional moment tensor inversion of stronger aftershocks reveals thrust-type faulting at depths of the plate interface. The fore- and aftershock seismicity is mostly related to the subduction interface with sparse seismicity in the overriding crust. The 2016 Chiloé event broke a region with increased locking and most likely broke an asperity of the 1960 earthquake. The updip limit of the main event, aftershocks, foreshocks and interseismic activity are spatially similar, located ~15 km offshore and parallel to Chiloé Island's west coast. The coseismic slip model of the 2016 Chiloé earthquake suggests a peak slip of 4.2 m that locally exceeds the 3.38 m slip deficit that has accumulated since 1960. Therefore, the 2016 Chiloé earthquake possibly released strain that had built up prior to the 1960 Valdivia earthquake.

  1. Chaotic behaviour in the non-linear optimal control of unilaterally contacting building systems during earthquakes

    International Nuclear Information System (INIS)

    Liolios, A.A.; Boglou, A.K.

    2003-01-01

    The paper presents a new numerical approach for a non-linear optimal control problem arising in earthquake civil engineering. This problem concerns the elastoplastic softening-fracturing unilateral contact between neighbouring buildings during earthquakes when Coulomb friction is taken into account under second-order instabilizing effects. So, the earthquake response of the adjacent structures can exhibit instabilities and chaotic behaviour. The problem formulation presented here leads to a set of equations and inequalities, which is equivalent to a dynamic hemivariational inequality in the way introduced by Panagiotopoulos [Hemivariational Inequalities. Applications in Mechanics and Engineering, Springer-Verlag, Berlin, 1993]. The numerical procedure is based on an incremental problem formulation and on a double discretization, in space by the finite element method and in time by the Wilson-θ method. The generally non-convex constitutive contact laws are piecewise linearized, and in each time-step a non-convex linear complementarity problem is solved with a reduced number of unknowns
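
    For orientation, the generic linear complementarity problem (LCP) solved at each time step has the standard form below; the paper's problem is non-convex and arises from piecewise-linearized contact laws, so this is only the template, not the paper's exact formulation.

        % Standard (convex) linear complementarity problem solved at each time step
        % (a generic template; the paper's problem is non-convex and is obtained by
        % piecewise linearization of the contact laws): find z such that
        \[
          w = M z + q, \qquad w \ge 0, \qquad z \ge 0, \qquad z^{\mathsf{T}} w = 0 .
        \]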

  2. New streams and springs after the 2014 Mw6.0 South Napa earthquake.

    Science.gov (United States)

    Wang, Chi-Yuen; Manga, Michael

    2015-07-09

    Many streams and springs, which were dry or nearly dry before the 2014 Mw6.0 South Napa earthquake, started to flow after the earthquake. A United States Geological Survey stream gauge also registered a coseismic increase in discharge. Public interest was heightened by a state of extreme drought in California. Since the new flows were not contaminated by pre-existing surface water, their composition allowed unambiguous identification of their origin. Following the earthquake we repeatedly surveyed the new flows, collecting data to test hypotheses about their origin. We show that the new flows originated from groundwater in nearby mountains released by the earthquake. The estimated total amount of new water is ~10^6 m^3, about 1/40 of the annual water use in the Napa-Sonoma area. Our model also makes a testable prediction of a post-seismic decrease of seismic velocity in the shallow crust of the affected region.

  3. Research on Collection of Earthquake Disaster Information from the Crowd

    Science.gov (United States)

    Nian, Z.

    2017-12-01

    In China, the assessment of earthquake disaster information is mainly based on inversion of the seismic source mechanism and pre-calculated population data models; the actual earthquake disaster information is usually collected through government departments, and both the accuracy and the speed need to be improved. In a massive earthquake like the one in Mexico, the telecommunications infrastructure on the ground was damaged, the quake zone was difficult to observe by satellites and aircraft in the bad weather, and only a small amount of information was sent out through another country's maritime satellite. Thus, the timely and effective delivery of disaster relief was seriously affected. Now that Chinese communication satellites are in orbit, people no longer rely solely on ground telecom base stations to keep in communication with the outside world, open web pages, log in to social networking sites, release information, and transmit images and videos. This paper establishes an earthquake information collection system in which the public can participate. Through popular social platforms and other information sources, the public can take part in the collection of earthquake information and supply information from the quake zone, including photos, videos, etc., especially material captured by unmanned aerial vehicles (UAVs) after an earthquake; the public can use computers, portable terminals, or mobile text messages to participate in the earthquake information collection. In the system, the information is divided into basic information on the earthquake zone, earthquake disaster reduction information, earthquake site information, post-disaster reconstruction information, etc., and is processed and entered into a database. The quality of the data is analyzed using multi-source information and is controlled by local public feedback, to supplement the data collected by government departments in a timely manner and to calibrate the simulation results, which will better guide

  4. [Medical rescue of China National Earthquake Disaster Emergency Search and Rescue Team in Lushan earthquake].

    Science.gov (United States)

    Liu, Ya-hua; Yang, Hui-ning; Liu, Hui-liang; Wang, Fan; Hu, Li-bin; Zheng, Jing-chen

    2013-05-01

    To summarize and analyze the medical mission of China National Earthquake Disaster Emergency Search and Rescue Team (CNESAR) in the Lushan earthquake, in order to promote the effectiveness of medical rescue integrated with search and rescue. Retrospective analysis was made of the medical work data of CNESAR from April 21st to April 27th, 2013, during the Lushan earthquake rescue, including the medical staff dispatched and the wounded cases treated. The medical corps was composed of 22 members, including 2 administrators, 11 doctors [covering emergency medicine, orthopedics (joints and limbs, spinal), obstetrics and gynecology, gastroenterology, cardiology, ophthalmology, anesthesiology, medical rescue, health epidemic prevention, and clinical laboratory, 11 specialties in all], 1 ultrasound technician, 5 nurses, 1 pharmacist, 1 medical instrument engineer and 1 office worker for publicity. Two members held psychological consultant qualifications. The medical work was carried out in seven aspects, including medical care assurance for the CNESAR members, first aid cooperation with search and rescue on site, clinical work in the refugee camp, medical rounds for scattered villagers, evacuation of the wounded, mental health intervention, and sanitary and anti-epidemic work. The medical work covered 24 small towns, and medical staff established 3 medical clinics at Taiping Town, Shuangshi Town of Lushan County and Baoxing County. Medical rescue, mental health intervention for the elderly and children, and sanitary and anti-epidemic work were performed at the above sites. The medical corps successfully evacuated 2 severely wounded patients and treated over a thousand wounded. Most of the cases were soft tissue injuries, external trauma, respiratory tract infections, diarrhea, and heat stroke. Compared with the rescue action in the 2008 Wenchuan earthquake, the aggregation and departure of the rescue team in the Lushan earthquake, the traffic control order in the disaster area, the self-aid and buddy aid

  5. Sol-gel derived manganese-releasing bioactive glass as a therapeutical approach for bone tissue engineering

    Energy Technology Data Exchange (ETDEWEB)

    Barrioni, B.R.; Oliveira, A.C.; Leite, M.F.; Pereira, M.M. [Universidade Federal de Minas Gerais (UFMG), MG (Brazil)

    2016-07-01

    Full text: Bioactive glasses (BG) have been highlighted in tissue engineering, due to their high bioactivity and biocompatibility, being potential materials for bone tissue repair. Their composition is variable and quite flexible, allowing the incorporation of therapeutic metallic ions, which has been regarded as a promising approach in the development of BG with superior properties for tissue engineering. These ions can be released in a controlled manner during the dissolution process of the glass, having the advantage of being released exactly at the implant site where they are needed, thus optimizing the therapeutic efficacy and reducing undesired side effects in the patient. Among several ions that have been studied, Manganese (Mn) has been shown to favor osteogenic differentiation. In addition, this ion is a cofactor for several enzymes involved in the remodeling of the extracellular matrix and plays an important role in cell adhesion. Therefore, it is very important to study the role of Mn in the BG network and its influence on the glass bioactivity. In this context, new bioactive glass compositions derived from the 58S (60%SiO2-36%CaO-4%P2O5, mol%) were synthesized in this work, using the sol-gel method, by the incorporation of Mn into their structure. FTIR and Raman spectra showed the presence of typical BG chemical groups, whereas the amorphous structure typical of these materials was confirmed by XRD analysis, which also indicated that the Mn incorporation in the glass network was successful, as its precursor did not recrystallize. The role of Mn in the glass network was also evaluated by XPS. The influence of Mn on carbonated hydroxyapatite layer formation after different periods of immersion of the BG powder in Simulated Body Fluid was evaluated using zeta potential, SEM, EDS and FTIR, whereas the controlled ion release was measured through ICP-OES. MTT assay revealed that Mn-containing BG showed no cytotoxic effect on cell culture. All these results indicate

  6. Sol-gel derived manganese-releasing bioactive glass as a therapeutical approach for bone tissue engineering

    International Nuclear Information System (INIS)

    Barrioni, B.R.; Oliveira, A.C.; Leite, M.F.; Pereira, M.M.

    2016-01-01

    Full text: Bioactive glasses (BG) have been highlighted in tissue engineering, due to their high bioactivity and biocompatibility, being potential materials for bone tissue repair. Their composition is variable and quite flexible, allowing the incorporation of therapeutic metallic ions, which has been regarded as a promising approach in the development of BG with superior properties for tissue engineering. These ions can be released in a controlled manner during the dissolution process of the glass, having the advantage of being released exactly at the implant site where they are needed, thus optimizing the therapeutic efficacy and reducing undesired side effects in the patient. Among several ions that have been studied, Manganese (Mn) has been shown to favor osteogenic differentiation. In addition, this ion is a cofactor for several enzymes involved in the remodeling of the extracellular matrix and plays an important role in cell adhesion. Therefore, it is very important to study the role of Mn in the BG network and its influence on the glass bioactivity. In this context, new bioactive glass compositions derived from the 58S (60%SiO2-36%CaO-4%P2O5, mol%) were synthesized in this work, using the sol-gel method, by the incorporation of Mn into their structure. FTIR and Raman spectra showed the presence of typical BG chemical groups, whereas the amorphous structure typical of these materials was confirmed by XRD analysis, which also indicated that the Mn incorporation in the glass network was successful, as its precursor did not recrystallize. The role of Mn in the glass network was also evaluated by XPS. The influence of Mn on carbonated hydroxyapatite layer formation after different periods of immersion of the BG powder in Simulated Body Fluid was evaluated using zeta potential, SEM, EDS and FTIR, whereas the controlled ion release was measured through ICP-OES. MTT assay revealed that Mn-containing BG showed no cytotoxic effect on cell culture. All these results indicate

  7. The Manchester earthquake swarm of October 2002

    Science.gov (United States)

    Baptie, B.; Ottemoeller, L.

    2003-04-01

    An earthquake sequence started in the Greater Manchester area of the United Kingdom on October 19, 2002. This has continued to the time of writing and has consisted of more than 100 discrete earthquakes. Three temporary seismograph stations were installed to supplement existing permanent stations and to better understand the relationship between the seismicity and local geology. Due to the urban location, the earthquakes were felt by a large number of people. The largest event on October 21 had a magnitude ML 3.9. The activity appears to be an earthquake swarm, since there is no clear distinction between a main shock and aftershocks. However, most of the energy during the sequence was actually released in two earthquakes separated by a few seconds in time, on October 21 at 11:42. Other examples of swarm activity in the UK include Comrie (1788-1801, 1839-46), Glenalmond (1970-72), Doune (1997) and Blackford (1997-98, 2000-01) in central Scotland, Constantine (1981, 1986, 1992-4) in Cornwall, and Johnstonbridge (mid-1980s) and Dumfries (1991, 1999). The clustering of these events in time and space does suggest that there is a causal relationship between the events of the sequence. Joint hypocenter determination was used to simultaneously locate the swarm earthquakes, determine station corrections and improve the relative locations. It seems likely that all events in the sequence originate from a relatively small source volume. This is supported by the similarities in source mechanism and waveform signals between the various events. Focal depths were found to be very shallow and of the order of about 2-3 km. Source mechanisms determined for the largest of the events show strike-slip solutions along either northeast-southwest or northwest-southeast striking fault planes. The surface expression of faults in the epicentral area is generally northwest-southeast, suggesting that this is the more likely fault plane.

  8. Broadband Ground Motion Reconstruction for the Kanto Basin during the 1923 Kanto Earthquake

    Science.gov (United States)

    Sekiguchi, Haruko; Yoshimi, Masayuki

    2011-03-01

    Ground motions of the 1923 Kanto Earthquake inside the Kanto Basin are numerically simulated in a wide frequency range (0-10 Hz) based on new knowledge of the earthquake's source processes, the sedimentary structure of the basin, and techniques for generating broadband source models of great earthquakes. The Kanto Earthquake remains one of the most important exemplars for ground motion prediction in Japan due to its size, faulting geometry, and location beneath the densely populated Kanto sedimentary basin. We reconstruct a broadband source model of the 1923 Kanto Earthquake from inversion results by introducing small-scale heterogeneities. The corresponding ground motions are simulated using a hybrid technique comprising the following four calculations: (1) low-frequency ground motion of the engineering basement, modeled using a finite difference method; (2) high-frequency ground motion of the engineering basement, modeled using a stochastic Green's function method; (3) total ground motion of the engineering basement (i.e. 1 + 2); and (4) ground motion at the surface in response to the total basement ground motion. We employ a recently developed three-dimensional (3D) velocity structure model of the Kanto Basin that incorporates prospecting data, microtremor observations and measurements derived from strong ground motion records. Our calculations reveal peak ground velocities (PGV) exceeding 50 cm/s in the area above the fault plane: to the south, where the fault plane is shallowest, PGV reaches 150-200 cm/s at the engineering basement and 200-250 cm/s at the surface. Intensity 7, the maximum value in the Japan Meteorological Agency's intensity scale, is calculated to have occurred widely in Sagami Bay, which corresponds well with observed house-collapse rates due to the 1923 event. The modeling reveals a pronounced forward directivity effect for the area lying above the southern, shallow part of the fault plane. The high PGV and intensity seen above the
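
    A minimal Python sketch of the matched-filter step that hybrid broadband methods typically use to merge the deterministic low-frequency and stochastic high-frequency synthetics is given below; the 1 Hz crossover and Butterworth filters are illustrative assumptions, not values from the cited study.

        # Hedged sketch of combining low-frequency (deterministic) and high-frequency
        # (stochastic) synthetics into one broadband trace with matched filters.
        # The 1 Hz crossover and 4th-order Butterworth filters are illustrative
        # choices, not taken from the cited study.
        import numpy as np
        from scipy.signal import butter, filtfilt

        def combine_broadband(lf_trace, hf_trace, dt, f_cross=1.0, order=4):
            nyq = 0.5 / dt
            b_lo, a_lo = butter(order, f_cross / nyq, btype="lowpass")
            b_hi, a_hi = butter(order, f_cross / nyq, btype="highpass")
            return filtfilt(b_lo, a_lo, lf_trace) + filtfilt(b_hi, a_hi, hf_trace)

        # Example with synthetic placeholders sampled at 100 Hz.
        dt = 0.01
        t = np.arange(0, 40, dt)
        lf = np.sin(2 * np.pi * 0.3 * t)          # stands in for the FDM result
        hf = 0.2 * np.random.randn(t.size)        # stands in for the stochastic result
        broadband = combine_broadband(lf, hf, dt)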

  9. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    Science.gov (United States)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  10. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    Science.gov (United States)

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
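
    A minimal Python sketch is given below of how a fundamental-mode frequency can be picked from the Fourier amplitude spectrum of an ambient-vibration record, and of how the quoted 0.4 s shear-wave travel time over 17 stories translates into an average shear-wave speed through the frame; the storey height of 4 m is an assumed value, not one from the paper.

        # Hedged sketch: pick the fundamental-mode frequency from an ambient-vibration
        # record and estimate an average shear-wave speed through the building.
        # The 4 m storey height is an assumption, not a value from the paper.
        import numpy as np

        def fundamental_frequency(acc, dt, fmin=0.2, fmax=2.0):
            """Frequency of the largest Fourier amplitude in [fmin, fmax] Hz."""
            spec = np.abs(np.fft.rfft(acc * np.hanning(acc.size)))
            freqs = np.fft.rfftfreq(acc.size, d=dt)
            band = (freqs >= fmin) & (freqs <= fmax)
            return freqs[band][np.argmax(spec[band])]

        # Synthetic record dominated by a 0.57 Hz mode plus noise, sampled at 100 Hz.
        dt = 0.01
        t = np.arange(0, 600, dt)
        acc = np.sin(2 * np.pi * 0.57 * t) + 0.3 * np.random.randn(t.size)
        print(fundamental_frequency(acc, dt))          # close to 0.57 Hz

        # Average shear-wave speed implied by a 0.4 s travel time over 17 storeys,
        # assuming roughly 4 m per storey: about 170 m/s.
        print(17 * 4.0 / 0.4)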

  11. The November 23, 1980 Irpinia earthquake (Terremoto Campano Lucano) observations of soil and soil-structure interaction effects

    International Nuclear Information System (INIS)

    Guerpinar, A.; Vardanega, C.; Ries, E.R.

    1981-01-01

    A catastrophe of major dimensions such as the November 23, 1980 Irpinia Earthquake (Terremoto Campano Lucano) should be examined from different points of view, e.g. geological, engineering, architectural, rural and urban planning, and socio-economic, so that the effects of future events can be mitigated to a certain extent. This paper covers a portion of the engineering lessons to be drawn from this event. These efforts have been directed to bring out cases and observations which may have significance in the siting and design of nuclear power plants. The Irpinia Earthquake caused widespread damage in a region of southern Italy which is developing in terms of industrial and transportation facilities. It was, therefore, possible to observe damage (or the lack of it) on a wide variety of structures, such as buildings, bridges, tunnels, roads and chimneys ranging in age from very old to very new. The seven-day field trip took place at the end of January 1981, about two months after the earthquake. With few sporadic exceptions, such as the hospital building in S. Angelo dei Lombardi, the damaged structures were untouched and reliable engineering observations on the damage patterns were possible. One of the most striking aspects of the earthquake was the extent of the damage caused to structures by soil failures or soil-structure interaction effects. This aspect, in particular, is addressed in this paper. (orig.)

  12. Prevention of strong earthquakes: Goal or utopia?

    Science.gov (United States)

    Mukhamediev, Sh. A.

    2010-11-01

    In the present paper, we consider ideas suggesting various kinds of industrial impact on the close-to-failure block of the Earth’s crust in order to break a pending strong earthquake (PSE) into a number of smaller quakes or aseismic slips. Among the published proposals on the prevention of a forthcoming strong earthquake, methods based on water injection and vibro influence merit greater attention as they are based on field observations and the results of laboratory tests. In spite of this, the cited proofs are, for various reasons, insufficient to acknowledge the proposed techniques as highly substantiated; in addition, the physical essence of these methods has still not been fully understood. First, the key concept of the methods, namely, the release of the accumulated stresses (or excessive elastic energy) in the source region of a forthcoming strong earthquake, is open to objection. If we treat an earthquake as a phenomenon of a loss in stability, then, the heterogeneities of the physicomechanical properties and stresses along the existing fault or its future trajectory, rather than the absolute values of stresses, play the most important role. In the present paper, this statement is illustrated by the classical examples of stable and unstable fractures and by the examples of the calculated stress fields, which were realized in the source regions of the tsunamigenic earthquakes of December 26, 2004 near the Sumatra Island and of September 29, 2009 near the Samoa Island. Here, just before the earthquakes, there were no excessive stresses in the source regions. Quite the opposite, the maximum shear stresses τmax were close to their minimum value, compared to τmax in the adjacent territory. In the present paper, we provide quantitative examples that falsify the theory of the prevention of PSE in its current form. It is shown that the measures for the prevention of PSE, even when successful for an already existing fault, can trigger or accelerate a catastrophic

  13. "HOT Faults", Fault Organization, and the Occurrence of the Largest Earthquakes

    Science.gov (United States)

    Carlson, J. M.; Hillers, G.; Archuleta, R. J.

    2006-12-01

    We apply the concept of "Highly Optimized Tolerance" (HOT) for the investigation of spatio-temporal seismicity evolution, in particular mechanisms associated with the largest earthquakes. HOT provides a framework for investigating both qualitative and quantitative features of complex feedback systems that are far from equilibrium and punctuated by rare, catastrophic events. In HOT, robustness trade-offs lead to complexity and power laws in systems that are coupled to evolving environments. HOT was originally inspired by biology and engineering, where systems are internally very highly structured, through biological evolution or deliberate design, and perform in an optimum manner despite fluctuations in their surroundings. Though faults and fault systems are not designed in ways comparable to biological and engineered structures, feedback processes are responsible in a conceptually comparable way for the development, evolution and maintenance of younger fault structures and primary slip surfaces of mature faults, respectively. Hence, in geophysical applications the "optimization" approach is perhaps more aptly replaced by "organization", reflecting the distinction between HOT and random, disorganized configurations, and highlighting the importance of structured interdependencies that evolve via feedback among and between different spatial and temporal scales. Expressed in the terminology of the HOT concept, mature faults represent a configuration optimally organized for the release of strain energy, whereas immature, more heterogeneous fault networks represent intermittent, suboptimal systems that are regularized towards structural simplicity and the ability to generate large earthquakes more easily. We discuss fault structure and associated seismic response patterns within the HOT concept, and outline fundamental differences between this novel interpretation and more orthodox viewpoints such as the criticality concept. The discussion is flanked by numerical simulations of a

  14. Response and recovery lessons from the 2010-2011 earthquake sequence in Canterbury, New Zealand

    Science.gov (United States)

    Pierepiekarz, Mark; Johnston, David; Berryman, Kelvin; Hare, John; Gomberg, Joan S.; Williams, Robert A.; Weaver, Craig S.

    2014-01-01

    The impacts and opportunities that result when low-probability moderate earthquakes strike an urban area similar to many throughout the US were vividly conveyed in a one-day workshop in which social and Earth scientists, public officials, engineers, and an emergency manager shared their experiences of the earthquake sequence that struck the city of Christchurch and surrounding Canterbury region of New Zealand in 2010-2011. Without question, the earthquake sequence has had unprecedented impacts in all spheres on New Zealand society, locally to nationally--10% of the country's population was directly impacted and losses total 8-10% of their GDP. The following paragraphs present a few lessons from Christchurch.

  15. A Virtual Tour of the 1868 Hayward Earthquake in Google Earth™

    Science.gov (United States)

    Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.

    2007-12-01

    The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake drawing on scientific and historic information. We will use Google Earth™ software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America, to more specific information about the 1868 Hayward earthquake itself. Text and Google Earth™ layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.

  16. REGIONAL SEISMIC AMPLITUDE MODELING AND TOMOGRAPHY FOR EARTHQUAKE-EXPLOSION DISCRIMINATION

    Energy Technology Data Exchange (ETDEWEB)

    Walter, W R; Pasyanos, M E; Matzel, E; Gok, R; Sweeney, J; Ford, S R; Rodgers, A J

    2008-07-08

    We continue exploring methodologies to improve earthquake-explosion discrimination using regional amplitude ratios such as P/S in a variety of frequency bands. Empirically we demonstrate that such ratios separate explosions from earthquakes using closely located pairs of earthquakes and explosions recorded on common, publicly available stations at test sites around the world (e.g. Nevada, Novaya Zemlya, Semipalatinsk, Lop Nor, India, Pakistan, and North Korea). We are also examining if there is any relationship between the observed P/S and the point source variability revealed by longer period full waveform modeling (e. g. Ford et al 2008). For example, regional waveform modeling shows strong tectonic release from the May 1998 India test, in contrast with very little tectonic release in the October 2006 North Korea test, but the P/S discrimination behavior appears similar in both events using the limited regional data available. While regional amplitude ratios such as P/S can separate events in close proximity, it is also empirically well known that path effects can greatly distort observed amplitudes and make earthquakes appear very explosion-like. Previously we have shown that the MDAC (Magnitude Distance Amplitude Correction, Walter and Taylor, 2001) technique can account for simple 1-D attenuation and geometrical spreading corrections, as well as magnitude and site effects. However in some regions 1-D path corrections are a poor approximation and we need to develop 2-D path corrections. Here we demonstrate a new 2-D attenuation tomography technique using the MDAC earthquake source model applied to a set of events and stations in both the Middle East and the Yellow Sea Korean Peninsula regions. We believe this new 2-D MDAC tomography has the potential to greatly improve earthquake-explosion discrimination, particularly in tectonically complex regions such as the Middle East. Monitoring the world for potential nuclear explosions requires characterizing seismic
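
    A minimal Python sketch of the band-limited P/S amplitude ratio underlying such discriminants is given below; the window length and 6-8 Hz band are illustrative assumptions, and the MDAC corrections for distance, magnitude and site described in the abstract are not included.

        # Hedged sketch of a band-limited P/S amplitude ratio (illustrative; the
        # MDAC corrections for distance, magnitude and site are not included here).
        import numpy as np
        from scipy.signal import butter, filtfilt

        def band_rms(trace, dt, t_start, t_len, f_lo, f_hi, order=4):
            """RMS amplitude in a time window after band-pass filtering."""
            nyq = 0.5 / dt
            b, a = butter(order, [f_lo / nyq, f_hi / nyq], btype="bandpass")
            filtered = filtfilt(b, a, trace)
            i0, i1 = int(t_start / dt), int((t_start + t_len) / dt)
            return np.sqrt(np.mean(filtered[i0:i1] ** 2))

        def p_over_s(trace, dt, t_p, t_s, window=5.0, f_lo=6.0, f_hi=8.0):
            """Log10 P/S ratio in a single frequency band around picked arrivals."""
            p_amp = band_rms(trace, dt, t_p, window, f_lo, f_hi)
            s_amp = band_rms(trace, dt, t_s, window, f_lo, f_hi)
            return np.log10(p_amp / s_amp)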

  17. Hysteresis behavior of seismic isolators in earthquakes near a fault ...

    African Journals Online (AJOL)

    Seismic performance and appropriate design of structures located near faults have always been a major concern of design engineers, because during an earthquake the effects of plasticity alter the characteristics of near-field records. These pulse-like movements at the beginning of the records will increase the ...

  18. The 2012 August 27 Mw7.3 El Salvador earthquake: expression of weak coupling on the Middle America subduction zone

    Science.gov (United States)

    Geirsson, Halldor; LaFemina, Peter C.; DeMets, Charles; Hernandez, Douglas Antonio; Mattioli, Glen S.; Rogers, Robert; Rodriguez, Manuel; Marroquin, Griselda; Tenorio, Virginia

    2015-09-01

    Subduction zones exhibit variable degrees of interseismic coupling as resolved by inversions of geodetic data and analyses of seismic energy release. The degree to which a plate boundary fault is coupled can have profound effects on its seismogenic behaviour. Here we use GPS measurements to estimate co- and post-seismic deformation from the 2012 August 27, Mw7.3 megathrust earthquake offshore El Salvador, which was a tsunami earthquake. Inversions of estimated coseismic displacements are in agreement with published seismically derived source models, which indicate shallow rupture of the plate interface. The measured post-seismic deformation exceeds the coseismic deformation. Our analysis indicates that the post-seismic deformation is dominated by afterslip, as opposed to viscous relaxation, and we estimate a post-seismic moment release one to eight times greater than the coseismic moment during the first 500 d, depending on the relative location of coseismic versus post-seismic slip on the plate interface. We suggest that the excessive post-seismic motion is characteristic for the El Salvador-Nicaragua segment of the Central American margin and may be a characteristic of margins hosting tsunami earthquakes.

  19. Probabilistic earthquake risk assessment as a tool to improve safety and explanatory adequacy

    International Nuclear Information System (INIS)

    Itoi, Tatsuya

    2015-01-01

    This paper explains the concept of probabilistic earthquake risk assessment, mainly from the viewpoint of its use as a tool to improve safety and explanatory adequacy. In the engineering sense, risk is defined as the expected value of an undesirable effect that may occur in the future, and it is defined in risk management as the triplet of scenario (what can happen), frequency, and impact. As for the earthquake risk assessment of a nuclear power plant, the fragility of structures, systems, and components (SSCs) against earthquakes (so-called seismic fragility) is assessed, and by combining it with the separately obtained earthquake hazard, the occurrence frequency and impact of an accident are obtained. In the authors' view, earthquake risk assessment is for the purpose of decision-making, and is not intended to calculate the probability in a scientifically rigorous manner. For ensuring the quality of risk assessment, the table of 'Expert utilization standards for the evaluation of epistemological uncertainty' is used. Quantitative risk assessment alone is not necessarily sufficient for risk management. It is important to find out how to build the 'framework for comprehensive decision-making.' (A.O.)
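
    The combination of hazard and fragility described here is conventionally written as the seismic risk integral; the LaTeX sketch below gives the standard form used in probabilistic risk assessment, not a formulation specific to this paper.

        % Standard seismic risk integral (a sketch of the hazard-fragility
        % combination described in the abstract): H(a) is the annual frequency of
        % exceeding ground-motion level a (hazard), and P_f(a) is the conditional
        % failure probability of the SSC at that level (fragility).
        \[
          \nu_{\mathrm{fail}} = \int_0^{\infty} P_f(a)\,
          \left| \frac{dH(a)}{da} \right| da
        \]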

  20. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake"

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model the losses will not be indemnified but will be calculated directly on the basis of indexed ground-motion levels and damage. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  1. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments at the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public at large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  2. Correlation of pre-earthquake electromagnetic signals with laboratory and field rock experiments

    Directory of Open Access Journals (Sweden)

    T. Bleier

    2010-09-01

    Full Text Available Analysis of the 2007 M5.4 Alum Rock earthquake near San José, California showed that magnetic pulsations were present in large numbers and with significant amplitudes during the 2-week period leading up to the event. These pulsations were 1–30 s in duration, had unusual polarities (many with only positive or only negative polarities versus both polarities), and were different from other pulsations observed over 2 years of data in that the pulse sequence was sustained over a 2-week period prior to the quake, and then disappeared shortly after the quake. A search for the underlying physics process that might explain these pulses was undertaken, and one theory (Freund, 2002) demonstrated that charge carriers were released when various types of rocks were stressed in a laboratory environment. It was also significant that the observed charge carrier generation was transient, and resulted in pulsating current patterns. In an attempt to determine if this phenomenon occurred outside of the laboratory environment, the authors scaled up the physics experiment from a relatively small rock sample in a dry laboratory setting to a large 7-metric-tonne boulder composed of Yosemite granite. This boulder was located in a natural, humid (above-ground) setting at Bass Lake, CA. The boulder was instrumented with two Zonge Engineering Model ANT4 induction-type magnetometers, two Trifield Air Ion Counters, a surface charge detector, a geophone, a Bruker Model EM27 Fourier Transform Infrared (FTIR) spectrometer with Stirling-cycle cooler, and various temperature sensors. The boulder was stressed over about 8 h using expanding concrete (Bustar™), until it fractured into three major pieces. The recorded data showed surface charge build-up, magnetic pulsations, impulsive air conductivity changes, and acoustical cues starting about 5 h before the boulder actually broke. These magnetic and air conductivity pulse signatures resembled both the laboratory

  3. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with the tremor rate increasing exponentially with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatran earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This relationship is also physically reasonable, considering the well-known dependence of the b-value on stress. It suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
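
    The b-value behaviour described above can be sketched with the standard Aki (1965) maximum-likelihood estimator applied to events grouped by tidal-stress amplitude. The Python sketch below is not from the paper; the catalogue, completeness magnitude and tidal-stress values are synthetic placeholders that simply show how such a binned estimate is formed.

      import numpy as np

      def b_value(magnitudes, mc):
          """Aki (1965) maximum-likelihood b-value for events at or above the
          completeness magnitude mc (magnitude-binning correction omitted)."""
          m = np.asarray(magnitudes)
          m = m[m >= mc]
          return np.log10(np.e) / (m.mean() - mc)

      # Synthetic catalogue: Gutenberg-Richter magnitudes (b ~ 1) above Mc = 4.0
      # and an assumed tidal shear-stress amplitude (kPa) assigned to each event.
      rng = np.random.default_rng(0)
      mags = 4.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)
      tidal_stress = rng.uniform(0.0, 3.0, size=5000)

      # Estimate b separately in each tidal-stress bin.
      edges = np.linspace(0.0, 3.0, 4)
      for lo, hi in zip(edges[:-1], edges[1:]):
          sel = (tidal_stress >= lo) & (tidal_stress < hi)
          print(f"stress {lo:.1f}-{hi:.1f} kPa: b = {b_value(mags[sel], mc=4.0):.2f}")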

  4. Earthquake disaster simulation of civil infrastructures from tall buildings to urban areas

    CERN Document Server

    Lu, Xinzheng

    2017-01-01

    Based on more than 12 years of systematic investigation on earthquake disaster simulation of civil infrastructures, this book covers the major research outcomes, including a number of novel computational models, high performance computing methods and realistic visualization techniques for tall buildings and urban areas, with particular emphasis on collapse prevention and mitigation in extreme earthquakes, earthquake loss evaluation and seismic resilience. Typical engineering applications to several of the tallest buildings in the world (e.g., the 632 m tall Shanghai Tower and the 528 m tall Z15 Tower) and selected large cities in China (the Beijing Central Business District, Xi'an City, Taiyuan City and Tangshan City) are also introduced to demonstrate the advantages of the proposed computational models and techniques. The high-fidelity computational model developed in this book has proven to be the only feasible option to date for earthquake-induced collapse simulation of supertall buildings that are higher than 50...

  5. Investigation of the relationship between earthquakes and indoor radon concentrations at a building in Gyeongju, Korea

    Directory of Open Access Journals (Sweden)

    Jae Wook Kim

    2018-04-01

    Full Text Available This article reports measurements and analysis of indoor radon concentrations at a university building in Gyeongju, Republic of Korea, to investigate whether there is any relationship between earthquakes and indoor radon concentration. After two earthquakes of magnitude 5.1 and 5.8 occurred on 12 September 2016, hundreds of aftershocks affected Gyeongju until January 2017. The measurements were made on the ground floor of the Energy Engineering Hall of Dongguk University in Gyeongju between February 2016 and January 2017, with an RAD7 detector operated according to the US Environmental Protection Agency measurement protocol. Measurements were taken continuously at 30-minute intervals throughout each monthly measurement period. Among earthquakes with magnitude 2.0 or greater, those whose occurrence times fell within the measurement periods were screened for further analysis. We observed similar spike-like patterns between the indoor radon concentration distributions and the earthquakes: a sudden increase in the peak indoor radon concentration 1–4 days before an earthquake, a gradual decrease before the earthquake, and a sudden drop on the day of the earthquake, provided the interval between successive earthquakes was sufficiently long (for example, 3 days). Keywords: Earthquakes, Gyeongju, Indoor Radon Concentration, RAD7, Radon Anomaly

  6. Assessing the Utility of and Improving USGS Earthquake Hazards Program Products

    Science.gov (United States)

    Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.

    2010-12-01

    A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products and information, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities as well as participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and information is widely used by media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey more clearly the impact of earthquakes, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.

  7. Earthquake forewarning — A multidisciplinary challenge from the ground up to space

    Science.gov (United States)

    Freund, Friedemann

    2013-08-01

    Most destructive earthquakes nucleate at depths between about 5-7 km and 35-40 km. Before earthquakes, rocks are subjected to increasing stress. Not every stress increase leads to rupture. To understand pre-earthquake phenomena we note that igneous and high-grade metamorphic rocks contain defects which, upon stressing, release defect electrons in the oxygen anion sublattice, known as positive holes. These charge carriers are highly mobile and able to flow out of stressed rocks into surrounding unstressed rocks. They form electric currents, which emit electromagnetic radiation, sometimes in pulses, sometimes sustained. The arrival of positive holes at the ground-air interface can lead to air ionization, often exclusively positive. Ionized air rising upward can lead to cloud condensation. The upward flow of positive ions can lead to instabilities in the mesosphere, to mesospheric lightning, to changes in the Total Electron Content (TEC) at the lower edge of the ionosphere, and to electric field turbulences. Advances in deciphering the earthquake process can only be achieved in a broadly multidisciplinary spirit.

  8. 9th Structural Engineering Convention 2014

    CERN Document Server

    2015-01-01

    The book comprises research papers presented by academicians, researchers, and practicing structural engineers from India and abroad at the Structural Engineering Convention (SEC) 2014, held at the Indian Institute of Technology Delhi during 22-24 December 2014. The book is divided into three volumes and encompasses multidisciplinary areas within structural engineering, such as earthquake engineering and structural dynamics, structural mechanics, finite element methods, structural vibration control, advanced cementitious and composite materials, bridge engineering, and soil-structure interaction. Advances in Structural Engineering is useful reference material for the structural engineering fraternity, including undergraduate and postgraduate students, academicians, researchers and practicing engineers.

  9. Distinguishing megathrust from intraplate earthquakes using lacustrine turbidites (Laguna Lo Encañado, Central Chile)

    Science.gov (United States)

    Van Daele, Maarten; Araya-Cornejo, Cristian; Pille, Thomas; Meyer, Inka; Kempf, Philipp; Moernaut, Jasper; Cisternas, Marco

    2017-04-01

    triggered by megathrust earthquakes. These findings are an important step forward in the interpretation of lacustrine turbidites in subduction settings, and will eventually improve hazard assessments based on such paleoseismic records in the study area, and in other subduction zones. References Howarth et al., 2014. Lake sediments record high intensity shaking that provides insight into the location and rupture length of large earthquakes on the Alpine Fault, New Zealand. Earth and Planetary Science Letters 403, 340-351. Lomnitz, 1960. A study of the Maipo Valley earthquakes of September 4, 1958, Second World Conference on Earthquake Engineering, Tokyo and Kyoto, Japan, pp. 501-520. Sepulveda et al., 2008. New Findings on the 1958 Las Melosas Earthquake Sequence, Central Chile: Implications for Seismic Hazard Related to Shallow Crustal Earthquakes in Subduction Zones. Journal of Earthquake Engineering 12, 432-455. Van Daele et al., 2015. A comparison of the sedimentary records of the 1960 and 2010 great Chilean earthquakes in 17 lakes: Implications for quantitative lacustrine palaeoseismology. Sedimentology 62, 1466-1496.

  10. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    Science.gov (United States)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located off the coast of Sanriku (37.75°N ~ 41.00°N, 143.00°E ~ 144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spreading of the candlesticks prior to the occurrence of events greater than Mw 6.0. A second pattern shows convergence of the Bollinger Bands, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle, and agree with the characteristics of both candlestick chart and Bollinger Band analysis. These results show there is a high correlation between patterns in earthquake occurrence and the trend analysis produced by these two statistical methods, and they support the appropriateness of applying these financial analysis methods to the study of earthquake occurrence.
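
    A minimal sketch of the Bollinger Band half of such an analysis is given below, applied to a synthetic monthly event-count series. The window length, band width and "squeeze" (band-convergence) criterion are illustrative assumptions, not the parameters used in the study, and a real application would build the series from a catalogue query for the region above.

      import numpy as np
      import pandas as pd

      # Synthetic monthly count of Mw >= 4.0 events for the region of interest;
      # in practice this series would come from an earthquake catalogue query.
      rng = np.random.default_rng(1)
      counts = pd.Series(rng.poisson(lam=12, size=480),
                         index=pd.date_range("1973-02", periods=480, freq="MS"))

      # Bollinger Bands: rolling mean +/- 2 rolling standard deviations (20-month window).
      window, n_std = 20, 2
      mid = counts.rolling(window).mean()
      std = counts.rolling(window).std()
      upper, lower = mid + n_std * std, mid - n_std * std

      # Band-convergence flag: relative band width falling below its own 10th percentile,
      # analogous to the "squeeze" pattern the study associates with trend changes.
      width = (upper - lower) / mid
      squeeze = width < width.quantile(0.10)
      print(counts[squeeze].index[:5])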

  11. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory is being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and to promote public education about earthquakes. Producing the monographs, developed in ARC INFO and working under UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (it was the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socio-economic consequences it caused at such an important seaside resort.

  12. Quantification of social contributions to earthquake mortality

    Science.gov (United States)

    Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.

    2013-12-01

    Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and its enforcement, and governance that reduces corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However some low-GDP countries rival even the richest in resilience, showing that relatively low-cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
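
    The decoupling step described above can be sketched as a two-stage calculation: fit a purely physical mortality model from intensity-binned population exposure, then read the residuals as a socio-economic signal. The Python sketch below is illustrative only; the table, column names and the use of non-negative least squares are assumptions, not the authors' dataset or estimator.

      import numpy as np
      import pandas as pd
      from scipy.optimize import nnls

      # Hypothetical per-earthquake table: population exposed at each shaking
      # intensity level (columns exp_VI ... exp_IX) and the reported death toll.
      df = pd.DataFrame({
          "country":  ["A", "A", "B", "B", "C", "C"],
          "exp_VI":   [2e6, 5e5, 1e6, 8e5, 3e6, 1e6],
          "exp_VII":  [4e5, 2e5, 6e5, 3e5, 1e6, 5e5],
          "exp_VIII": [5e4, 1e4, 2e5, 9e4, 4e5, 2e5],
          "exp_IX":   [1e3, 0.0, 5e4, 2e4, 1e5, 6e4],
          "deaths":   [30, 2, 900, 350, 8000, 2500],
      })

      # Physical model: deaths ~ sum_i rate_i * exposure_i, one fatality rate per
      # intensity level, fitted over all events by non-negative least squares.
      X = df[["exp_VI", "exp_VII", "exp_VIII", "exp_IX"]].to_numpy()
      rates, _ = nnls(X, df["deaths"].to_numpy())
      df["predicted"] = X @ rates

      # Log-space residual as a crude vulnerability signal: positive values mean
      # more deaths than shaking and exposure alone would explain.
      df["log_residual"] = np.log10((df["deaths"] + 1) / (df["predicted"] + 1))
      print(df.groupby("country")["log_residual"].mean())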

  13. International Civil and Infrastructure Engineering Conference 2013

    CERN Document Server

    Yusoff, Marina; Ismail, Zulhabri; Amin, Norliyati; Fadzil, Mohd

    2014-01-01

    The special focus of these proceedings is to cover the areas of infrastructure engineering and sustainability management. The state-of-the-art information on infrastructure and sustainability issues in engineering covers earthquakes, bioremediation, synergistic management, timber engineering, flood management and intelligent transport systems. The proceedings provide precise information with regard to innovative research developments in construction materials and structures, in addition to a compilation of interdisciplinary findings combining nano-materials and engineering.

  14. International Civil and Infrastructure Engineering Conference 2014

    CERN Document Server

    Yusoff, Marina; Alisibramulisi, Anizahyati; Amin, Norliyati; Ismail, Zulhabri

    2015-01-01

    The special focus of these proceedings is to cover the areas of infrastructure engineering and sustainability management. The state-of-the-art information on infrastructure and sustainability issues in engineering covers earthquakes, bioremediation, synergistic management, timber engineering, flood management and intelligent transport systems. The proceedings provide precise information with regard to innovative research developments in construction materials and structures, in addition to a compilation of interdisciplinary findings combining nano-materials and engineering.

  15. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information

  16. Y-12 site-specific earthquake response analysis and soil liquefaction assessment

    International Nuclear Information System (INIS)

    Ahmed, S.B.; Hunt, R.J.; Manrod, W.E. III.

    1995-01-01

    A site-specific earthquake response analysis and soil liquefaction assessment were performed for the Oak Ridge Y-12 Plant. The main purpose of these studies was to use the results of the analyses in evaluating the safety of the performance category (PC) 1, 2, and 3 facilities against natural phenomena seismic hazards. Earthquake response was determined for seven (7) one-dimensional soil columns (Fig. 12) using two horizontal components of the PC-3 design basis 2000-year seismic event. The computer program SHAKE91 (Ref. 7) was used to calculate the absolute response accelerations at the top of the ground (soil/weathered shale) and at rock outcrop. The SHAKE program has been validated for horizontal response calculations at periods of less than 2.0 seconds at several sites and consequently is widely accepted in geotechnical earthquake engineering for site response analysis
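
    The underlying site-response concept can be sketched with the closed-form amplification of a single undamped elastic soil layer over rock. SHAKE91 itself handles many layers, damping and equivalent-linear iteration on strain-compatible properties, so the Python sketch below, with hypothetical layer properties, only illustrates the transfer-function idea behind such an analysis.

      import numpy as np

      # Hypothetical single-layer site: 30 m of soft soil (Vs = 250 m/s, rho = 1900 kg/m3)
      # over rock (Vs = 1500 m/s, rho = 2400 kg/m3).
      H, vs_soil, rho_soil = 30.0, 250.0, 1900.0
      vs_rock, rho_rock = 1500.0, 2400.0

      f = np.linspace(0.1, 20.0, 500)                     # frequency (Hz)
      k_h = 2.0 * np.pi * f * H / vs_soil                 # dimensionless kH
      imp = (rho_soil * vs_soil) / (rho_rock * vs_rock)   # impedance ratio

      # Surface / rock-outcrop amplification for an undamped elastic layer on
      # elastic bedrock (textbook closed form for 1D vertical SH-wave propagation).
      amp = 1.0 / np.sqrt(np.cos(k_h) ** 2 + (imp * np.sin(k_h)) ** 2)

      f0 = vs_soil / (4.0 * H)                            # fundamental site frequency
      print(f"fundamental frequency ~ {f0:.2f} Hz, peak amplification ~ {amp.max():.1f}")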

  17. Applications of human factors engineering to LNG release prevention and control

    Energy Technology Data Exchange (ETDEWEB)

    Shikiar, R.; Rankin, W.L.; Rideout, T.B.

    1982-06-01

    The results of an investigation of human factors engineering and human reliability applications to LNG release prevention and control are reported. The report includes a discussion of possible human error contributions to previous LNG accidents and incidents, and a discussion of generic HF considerations for peakshaving plants. More specific recommendations for improving HF practices at peakshaving plants are offered based on visits to six facilities. The HF aspects of the recently promulgated DOT regulations are reviewed, and recommendations are made concerning how these regulations can be implemented utilizing standard HF practices. Finally, the integration of HF considerations into overall system safety is illustrated by a presentation of human error probabilities applicable to LNG operations and by an expanded fault tree analysis which explicitly recognizes man-machine interfaces.

  18. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan's unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  19. Isolating social influences on vulnerability to earthquake shaking: identifying cost-effective mitigation strategies.

    Science.gov (United States)

    Bhloscaidh, Mairead Nic; McCloskey, John; Pelling, Mark; Naylor, Mark

    2013-04-01

    Until expensive engineering solutions become more universally available, the objective targeting of resources at demonstrably effective, low-cost interventions might help reverse the trend of increasing mortality in earthquakes. Death tolls in earthquakes are the result of complex interactions between physical effects, such as the exposure of the population to strong shaking, and the resilience of the exposed population along with supporting critical infrastructures and institutions. The identification of socio-economic factors that contribute to earthquake mortality is crucial to identifying and developing successful risk management strategies. Here we develop a quantitative methodology more objectively to assess the ability of communities to withstand earthquake shaking, focusing on, in particular, those cases where risk management performance appears to exceed or fall below expectations based on economic status. Using only published estimates of the shaking intensity and population exposure for each earthquake, data that is available for earthquakes in countries irrespective of their level of economic development, we develop a model for mortality based on the contribution of population exposure to shaking only. This represents an attempt to remove, as far as possible, the physical causes of mortality from our analysis (where we consider earthquake engineering to reduce building collapse among the socio-economic influences). The systematic part of the variance with respect to this model can therefore be expected to be dominated by socio-economic factors. We find, as expected, that this purely physical analysis partitions countries in terms of basic socio-economic measures, for example GDP, focusing analytical attention on the power of economic measures to explain variance in observed distributions of earthquake risk. The model allows the definition of a vulnerability index which, although broadly it demonstrates the expected income-dependence of vulnerability to

  20. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
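
    The short-term-average/long-term-average idea described above can be sketched directly on a tweet-count time series. The Python sketch below is illustrative only: the window lengths, threshold, synthetic data and the use of centered moving averages are assumptions, not the tuned configuration of the USGS detector (which would run causally in real time).

      import numpy as np

      def sta_lta_detect(counts, sta_len=60, lta_len=3600, threshold=8.0):
          """Flag samples where the short-term average of per-second tweet counts
          exceeds `threshold` times the long-term average (parameters hypothetical)."""
          counts = np.asarray(counts, dtype=float)
          box = lambda n: np.ones(n) / n
          sta = np.convolve(counts, box(sta_len), mode="same")
          lta = np.convolve(counts, box(lta_len), mode="same")
          ratio = sta / np.maximum(lta, 1e-3)   # avoid division by zero in quiet periods
          return np.flatnonzero(ratio > threshold)

      # Synthetic example: sparse background chatter, then a burst of "earthquake"
      # tweets starting 30 s after an assumed origin time at t = 7200 s.
      rng = np.random.default_rng(2)
      counts = rng.poisson(0.02, size=10800)
      counts[7230:7530] += rng.poisson(3.0, size=300)

      hits = sta_lta_detect(counts)
      if hits.size:
          print(f"first detection at t = {hits[0]} s")
      else:
          print("no detection")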

  1. Release, transport and toxicity of engineered nanoparticles.

    Science.gov (United States)

    Soni, Deepika; Naoghare, Pravin K; Saravanadevi, Sivanesan; Pandey, Ram Avatar

    2015-01-01

    Recent developments in nanotechnology have facilitated the synthesis of novel engineered nanoparticles (ENPs) that possess new and different physicochemical properties. These ENPs have been extensively used in various commercial sectors to achieve both social and economic benefits. However, the increasing production and consumption of ENPs by many different industries has raised concerns about their possible release and accumulation in the environment. Released ENPs may either remain suspended in the atmosphere for several years or may accumulate and eventually be modified into other substances. Settled nanoparticles can be easily washed away during rains, and therefore may easily enter the food chain via water and soil. Thus, ENPs can contaminate air, water and soil and can subsequently pose adverse risks to the health of different organisms. Studies to date indicate that ENP transport to and within the ecosystem depends on their chemical and physical properties (viz., size, shape and solubility). Therefore, ENPs display variable behavior in the environment because of their individual properties that affect their tendency for adsorption, absorption, and diffusional and colloidal interaction. The transport of ENPs also influences their fate and chemical transformation in ecosystems. The adsorption, absorption and colloidal interaction of ENPs affect their capacity to be degraded or transformed, whereas the tendency of ENPs to agglomerate fosters their sedimentation. How widely ENPs are transported, and their environmental fate, influence how toxic they may become to environmental organisms. One barrier to fully understanding how ENPs are transformed in the environment, and how best to characterize their toxicity, is related to the nature of their ultrafine structure. Experiments with different animals, plants, and cell lines have revealed that ENPs induce toxicity via several cellular pathways that are linked to the size, shape, surface area

  2. Geodetic constraints on afterslip characteristics following the March 9, 2011, Sanriku-oki earthquake, Japan

    Science.gov (United States)

    Ohta, Yusaku; Hino, Ryota; Inazu, Daisuke; Ohzono, Mako; Ito, Yoshihiro; Mishina, Masaaki; Iinuma, Takeshi; Nakajima, Junichi; Osada, Yukihito; Suzuki, Kensuke; Fujimoto, Hiromi; Tachibana, Kenji; Demachi, Tomotsugu; Miura, Satoshi

    2012-08-01

    A magnitude 7.3 foreshock occurred at the subducting Pacific plate interface on March 9, 2011, 51 h before the magnitude 9.0 Tohoku earthquake off the Pacific coast of Japan. We propose a coseismic and postseismic afterslip model of the magnitude 7.3 event based on a global positioning system network and ocean bottom pressure gauge sites. The estimated coseismic slip and afterslip areas show complementary spatial distributions; the afterslip distribution is located up-dip of the coseismic slip of the foreshock and northward of the hypocenter of the Tohoku earthquake. The slip amount for the afterslip is roughly consistent with that determined by the repeating earthquake analysis carried out in a previous study. The estimated moment release for the afterslip reached magnitude 6.8, even within the short time period of 51 h. A volumetric strainmeter time series also suggests that this event advanced with a rapid decay time constant compared with other typical large earthquakes.

  3. Awareness and understanding of earthquake hazards at school

    Science.gov (United States)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of the education activities we performed during recent years are presented here. We show our experience with primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to kids, and we also illustrate some teaching interventions for high school students. During the past years we lectured classes, led laboratory and field activities, and organized summer stages for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers and experts in safety procedures. To combine the objective of disseminating earthquake culture, also through knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab through inexpensive tools and instrumentation. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze recorded data. Within the same project we are going to train

  4. Who is Responsible for Human Suffering due to Earthquakes?

    Science.gov (United States)

    Wyss, M.

    2012-12-01

    A court in L'Aquila, Italy, convicted seven men, sentencing them to six years in prison and a combined fine of two million Euros, for not following their "obligation to avoid death, injury and damage, or at least to minimize them," as the prosecution alleged. These men lost their jobs and pensions, and are banned from holding public office. Meanwhile, the town of L'Aquila is teeming with furious citizens, who are preparing additional civil suits against the defendants, whom they hold responsible for the deaths of their loved ones, killed by collapsing buildings during the magnitude 6.3 earthquake of April 6, 2009. Before this shock, an earthquake swarm had scared the inhabitants for several weeks. To calm the population, the vice-director of the Department of Civil Protection (DCP) called a meeting of the Italian Commission of Great Risks (CGR) in L'Aquila to assess the situation on March 31. One hour before this meeting, the vice-director stated in a TV interview that the seismic situation in L'Aquila was "certainly normal" and posed "no danger", adding that "the scientific community continues to assure me that, to the contrary, it's a favorable situation because of the continuous discharge of energy." This statement is untrue in two ways. Firstly, small earthquakes do not release enough strain energy to reduce the potential for a large shock, and secondly, no seismologist would make such a statement because we know it is not true. However, the population clung to the idea: "the more tremors, the less danger". People who lost relatives allege that they would have left their homes had they not been falsely assured of their safety. The court treated all seven alike, although they had very different functions and obligations. Two were leaders in the DCP, four were members of the CGR, and one was a seismology expert, who brought the latest seismic data. The minutes of the meeting show that none of the experts said anything wrong. They all stated that the probability of a main shock to

  5. Complex rupture during the 12 January 2010 Haiti earthquake

    Science.gov (United States)

    Hayes, G.P.; Briggs, R.W.; Sladen, A.; Fielding, E.J.; Prentice, C.; Hudnut, K.; Mann, P.; Taylor, F.W.; Crone, A.J.; Gold, R.; Ito, T.; Simons, M.

    2010-01-01

    Initially, the devastating Mw 7.0, 12 January 2010 Haiti earthquake seemed to involve straightforward accommodation of oblique relative motion between the Caribbean and North American plates along the Enriquillo-Plantain Garden fault zone. Here, we combine seismological observations, geologic field data and space geodetic measurements to show that, instead, the rupture process may have involved slip on multiple faults. Primary surface deformation was driven by rupture on blind thrust faults with only minor, deep, lateral slip along or near the main Enriquillo-Plantain Garden fault zone; thus the event only partially relieved centuries of accumulated left-lateral strain on a small part of the plate-boundary system. Together with the predominance of shallow off-fault thrusting, the lack of surface deformation implies that remaining shallow shear strain will be released in future surface-rupturing earthquakes on the Enriquillo-Plantain Garden fault zone, as occurred in inferred Holocene and probable historic events. We suggest that the geological signature of this earthquake (broad warping and coastal deformation rather than surface rupture along the main fault zone) will not be easily recognized by standard palaeoseismic studies. We conclude that similarly complex earthquakes in tectonic environments that accommodate both translation and convergence, such as the San Andreas fault through the Transverse Ranges of California, may be missing from the prehistoric earthquake record. © 2010 Macmillan Publishers Limited. All rights reserved.

  6. Modeling subduction earthquake sources in the central-western region of Colombia using waveform inversion of body waves

    Science.gov (United States)

    Monsalve-Jaramillo, Hugo; Valencia-Mina, William; Cano-Saldaña, Leonardo; Vargas, Carlos A.

    2018-05-01

    Source parameters of four earthquakes located within the Wadati-Benioff zone of the Nazca plate subducting beneath the South American plate in Colombia were determined. The seismic moments for these events were recalculated and their approximate equivalent rupture area, slip distribution and stress drop were estimated. The source parameters for these earthquakes were obtained by deconvolving multiple events through teleseismic analysis of body waves recorded at long-period stations and by simultaneous inversion of P and SH waves. The calculated source time functions for these events showed different stages, suggesting that these earthquakes can reasonably be thought of as being composed of two subevents. Even though two of the overall focal mechanisms obtained yielded results similar to those reported in the CMT catalogue, the two other mechanisms showed a clear difference compared to those officially reported. Despite this, it is appropriate to mention that the mechanisms inverted in this work agree well with the expected orientation of faulting at that depth as well as with the waveforms they are expected to produce. In some of the solutions achieved, one of the two subevents exhibited a focal mechanism considerably different from the total earthquake mechanism; this could be interpreted as the result of a slight deviation from the overall motion due to the complex stress field, as well as the possibility of a combination of different sources of energy release analogous to those that may occur in deeper earthquakes. In those cases, the subevents with a focal mechanism very different from the total earthquake mechanism contributed little to the final solution and thus little to the total amount of energy released.

  7. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts'ai, T.H.

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  8. 3rd International Civil and Infrastructure Engineering Conference

    CERN Document Server

    Hamid, Nor; Arshad, Mohd; Arshad, Ahmad; Ridzuan, Ahmad; Awang, Haryati

    2016-01-01

    The special focus of these proceedings is on the areas of infrastructure engineering and sustainability management. They provide detailed information on innovative research developments in construction materials and structures, in addition to a compilation of interdisciplinary findings combining nano-materials and engineering. The coverage of cutting-edge infrastructure and sustainability issues in engineering includes earthquakes, bioremediation, synergistic management, timber engineering, flood management and intelligent transport systems.

  9. A cooperative NRC/CEA research project on earthquake ground motion on soil sites: overview

    International Nuclear Information System (INIS)

    Murphy, A.J.; Mohammadioun, B.

    1989-10-01

    This paper provides an overview of a multi-phase experiment being conducted jointly by the U.S. Nuclear Regulatory Commission and the French Commissariat a l'Energie Atomique. The objective of the experiment is to collect a comprehensive set of data on the propagation of earthquake ground motions vertically through a shallow soil column (on the order of several tens of meters). The data will be used to validate several of the available engineering computer codes for modeling earthquake ground motion. The data set will also be used to develop an improved understanding of the earthquake source function and the potential for non-linear effects controlling the propagation through the shallow soil column

  10. Proactive vs. reactive learning on buildings response and earthquake risks, in schools of Romania

    Directory of Open Access Journals (Sweden)

    Daniela DOBRE

    2015-07-01

    Full Text Available During the last 20 years, many specific earthquake education and preparedness activities were initiated and supported in Romania by drafting materials for citizens, students, professors, etc. (Georgescu et al., 2004, 2006). Education, training and information on earthquake disaster potential are important factors in mitigating earthquake effects. Such activities, however, need time to be developed and may take different forms of presentation in order to capture attention, increase interest, and develop the skills and attitudes that induce proper behavior towards safety preparedness. They must also build on an accumulation of concerns and knowledge, which are, in principle, a consequence of motivation, but which depend on the methods applied and the actions taken for efficient earthquake preparedness, assessed and updated following actual earthquakes (Masuda, Midorikawa, Miki and Ohmachi, 1988). We are now at a crossroads, and the proactive (anticipative and participative) attitude and behavior needs to be extended in learning within an institutional framework, correlated with the usual targets of schools and with the proactive engagement of teenagers (ROEDUSEIS-NET; Page and Page, 2003), by encouraging students in activities closer to earthquake engineering.

  11. Earthquake prediction rumors can help in building earthquake awareness: the case of May the 11th 2011 in Rome (Italy)

    Science.gov (United States)

    Amato, A.; Arcoraci, L.; Casarotti, E.; Cultrera, G.; Di Stefano, R.; Margheriti, L.; Nostro, C.; Selvaggi, G.; May-11 Team

    2012-04-01

    Banner headlines in an Italian newspaper read on May 11, 2011: "Absence boom in offices: the urban legend in Rome becomes psychosis". This was the effect of a large-magnitude earthquake prediction in Rome for May 11, 2011. This prediction was never officially released, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this increased the credibility of the earthquake prediction. Given the echo of this earthquake prediction, INGV decided to organize on May 11 (the same day the earthquake was predicted to happen) an Open Day in its headquarters in Rome to inform the public about Italian seismicity and earthquake physics. The Open Day was preceded by a press conference two days before, attended by about 40 journalists from newspapers, local and national TV, press agencies and web news magazines. Hundreds of articles appeared in the following two days, advertising the May 11 Open Day. On May 11 the INGV headquarters was peacefully invaded by over 3,000 visitors from 9 am to 9 pm: families, students, civil protection groups and many journalists. The program included conferences on a wide variety of subjects (from the social impact of rumors to seismic risk reduction) and distribution of books and brochures, in addition to several activities: meetings with INGV researchers to discuss scientific issues, visits to the seismic monitoring room (open 24h/7 all year), and guided tours through interactive exhibitions on earthquakes and the Earth's deep structure. During the same day, thirteen new videos were also posted on our youtube/INGVterremoti channel to explain the earthquake process and hazard, and to provide periodic real-time updates on seismicity in Italy. On May 11 no large earthquake happened in Italy. The initiative, built up in a few weeks, had a very large feedback

  12. Hazus® estimated annualized earthquake losses for the United States

    Science.gov (United States)

    Jaiswal, Kishor; Bausch, Doug; Rozelle, Jesse; Holub, John; McGowan, Sean

    2017-01-01

    Large earthquakes can cause social and economic disruption that can be unprecedented for any given community, and full recovery from these impacts may not always be achievable. In the United States (U.S.), the 1994 M6.7 Northridge earthquake in California remains the third costliest disaster in U.S. history, and it was one of the most expensive disasters for the federal government. Internationally, earthquakes in the last decade alone have claimed tens of thousands of lives and caused hundreds of billions of dollars of economic impact throughout the globe (~90 billion U.S. dollars (USD) from the 2008 M7.9 Wenchuan, China earthquake; ~20 billion USD from the 2010 M8.8 Maule earthquake in Chile; ~220 billion USD from the 2011 M9.0 Tohoku, Japan earthquake; ~25 billion USD from the 2011 M6.3 Christchurch, New Zealand earthquake; and ~22 billion USD from the 2016 M7.0 Kumamoto, Japan earthquake). Recent earthquakes show a pattern of steadily increasing damages and losses that are primarily due to three key factors: (1) significant growth in earthquake-prone urban areas, (2) vulnerability of the older building stock, including poorly engineered non-ductile concrete buildings, and (3) an increased interdependency in terms of supply and demand for the businesses that operate among different parts of the world. In the United States, earthquake risk continues to grow with increased exposure of population and development, even though the earthquake hazard has remained relatively stable except for regions of induced seismic activity. Understanding the seismic hazard requires studying earthquake characteristics and the locales in which they occur, while understanding the risk requires an assessment of the potential damage from earthquake shaking to the built environment and to the welfare of people, especially in high-risk areas. Estimating the varying degree of earthquake risk throughout the United States is critical for informed decision-making on mitigation policies, priorities, strategies, and funding levels in the
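
    The annualized-loss quantity such studies estimate can be illustrated with a toy scenario set: the annualized earthquake loss (AEL) is the rate-weighted sum of scenario losses, which is equivalent to the area under the loss exceedance curve. The Python sketch below uses placeholder losses and rates, not Hazus outputs.

      import numpy as np

      # Hypothetical scenario set: loss (million USD) if the event occurs and its
      # mean annual rate of occurrence; real Hazus runs use hazard-consistent suites.
      losses = np.array([50.0, 300.0, 1500.0, 9000.0])   # per-event loss (MUSD)
      rates = np.array([2e-2, 5e-3, 1e-3, 2e-4])         # events per year

      # Annualized earthquake loss: expected loss per year over the scenario set.
      ael = np.sum(losses * rates)
      print(f"annualized loss ~ {ael:.1f} MUSD/yr")

      # Loss exceedance curve: annual rate of exceeding each loss level.
      order = np.argsort(losses)
      exceedance = np.cumsum(rates[order][::-1])[::-1]
      for L, lam in zip(losses[order], exceedance):
          print(f"rate of exceeding {L:>6.0f} MUSD: {lam:.4f} /yr")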

  13. Transpressional rupture of an unmapped fault during the 2010 Haiti earthquake

    KAUST Repository

    Calais, Éric

    2010-10-24

    On 12 January 2010, a Mw7.0 earthquake struck the Port-au-Prince region of Haiti. The disaster killed more than 200,000 people and caused an estimated $8 billion in damages, about 100% of the country's gross domestic product. The earthquake was initially thought to have ruptured the Enriquillo-Plantain Garden fault of the southern peninsula of Haiti, which is one of two main strike-slip faults inferred to accommodate the 2 cm/yr relative motion between the Caribbean and North American plates. Here we use global positioning system and radar interferometry measurements of ground motion to show that the earthquake involved a combination of horizontal and contractional slip, causing transpressional motion. This result is consistent with the long-term pattern of strain accumulation in Hispaniola. The unexpected contractional deformation caused by the earthquake and by the pattern of strain accumulation indicates present activity on faults other than the Enriquillo-Plantain Garden fault. We show that the earthquake instead ruptured an unmapped north-dipping fault, called the Léogâne fault. The Léogâne fault lies subparallel to, but is distinct from, the Enriquillo-Plantain Garden fault. We suggest that the 2010 earthquake may have activated the southernmost front of the Haitian fold-and-thrust belt as it abuts against the Enriquillo-Plantain Garden fault. As the Enriquillo-Plantain Garden fault did not release any significant accumulated elastic strain, it remains a significant seismic threat for Haiti and for Port-au-Prince in particular. © 2010 Macmillan Publishers Limited. All rights reserved.

  14. Transpressional rupture of an unmapped fault during the 2010 Haiti earthquake

    KAUST Repository

    Calais, Éric; Freed, Andrew M.; Mattioli, Glen S.; Amelung, Falk; Jonsson, Sigurjon; Jansma, Pamela E.; Hong, Sanghoon; Dixon, Timothy H.; Prépetit, Claude; Momplaisir, Roberte

    2010-01-01

    On 12 January 2010, a Mw7.0 earthquake struck the Port-au-Prince region of Haiti. The disaster killed more than 200,000 people and caused an estimated $8 billion in damages, about 100% of the country's gross domestic product. The earthquake was initially thought to have ruptured the Enriquillo-Plantain Garden fault of the southern peninsula of Haiti, which is one of two main strike-slip faults inferred to accommodate the 2 cm/yr relative motion between the Caribbean and North American plates. Here we use global positioning system and radar interferometry measurements of ground motion to show that the earthquake involved a combination of horizontal and contractional slip, causing transpressional motion. This result is consistent with the long-term pattern of strain accumulation in Hispaniola. The unexpected contractional deformation caused by the earthquake and by the pattern of strain accumulation indicates present activity on faults other than the Enriquillo-Plantain Garden fault. We show that the earthquake instead ruptured an unmapped north-dipping fault, called the Léogâne fault. The Léogâne fault lies subparallel to, but is distinct from, the Enriquillo-Plantain Garden fault. We suggest that the 2010 earthquake may have activated the southernmost front of the Haitian fold-and-thrust belt as it abuts against the Enriquillo-Plantain Garden fault. As the Enriquillo-Plantain Garden fault did not release any significant accumulated elastic strain, it remains a significant seismic threat for Haiti and for Port-au-Prince in particular. © 2010 Macmillan Publishers Limited. All rights reserved.

  15. Emergency feature. Great east Japan earthquake disaster Fukushima Daiichi accident

    International Nuclear Information System (INIS)

    Kawata, Tomio; Tsujikura, Yonezo; Kitamura, Toshiro

    2011-01-01

    The Tohoku Pacific Ocean earthquake occurred on March 11, 2011. The ensuing tsunami struck the Fukushima Daiichi nuclear power plants after they had automatically shut down in response to the earthquake, and all motor-operated pumps became inoperable due to station blackout. Despite the strenuous efforts of operators, this caused a serious accident involving loss of cooling function, hydrogen explosions and the release of a large amount of radioactive materials into the environment, leading to a nuclear emergency in which residents were ordered to evacuate or remain indoors. This emergency feature consists of four articles. The first is an interview with the president of JAIF (Japan Atomic Industrial Forum) on how to fully identify the cause of the accident, intensify safety assurance measures and promote discussion of the role of nuclear power in the nation's overall energy policy toward reconstruction. The others cover the reactor states and event sequence after the accident, with trend data of radiation at the reactor site; a statement by the president of AESJ (Atomic Energy Society of Japan) on the nuclear crisis following the Tohoku Pacific Ocean earthquake and the society's response; and a personal account of life in evacuation. (T. Tanaka)

  16. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the Landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because non-foreshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
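
    A schematic of the reasoning above (not the paper's exact equation) is sketched below in Python: the chance that a candidate event is a foreshock is diluted while Omori-decaying aftershocks of a prior mainshock inflate the non-foreshock rate, and it recovers as those aftershocks die away. All rates and Omori-Utsu parameters are assumed for illustration.

      rate_foreshock = 0.02     # assumed long-term rate of true foreshocks (events/day)
      rate_background = 0.50    # assumed non-foreshock background rate (events/day)
      K, c, p = 50.0, 0.5, 1.1  # assumed Omori-Utsu parameters for the prior mainshock

      def p_foreshock(t_days):
          """Fraction of candidate events that are foreshocks, t_days after the
          prior mainshock, with its aftershocks added to the background rate."""
          aftershock_rate = K / (t_days + c) ** p
          return rate_foreshock / (rate_foreshock + rate_background + aftershock_rate)

      for t in (1, 10, 100, 1000):
          print(f"{t:>4d} days after prior mainshock: P(foreshock) ~ {p_foreshock(t):.4f}")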

  17. Recurrent slow slip events as a barrier to the northward rupture propagation of the 2016 Pedernales earthquake (Central Ecuador)

    Science.gov (United States)

    Vaca, Sandro; Vallée, Martin; Nocquet, Jean-Mathieu; Battaglia, Jean; Régnier, Marc

    2018-01-01

    The northern Ecuador segment of the Nazca/South America subduction zone shows spatially heterogeneous interseismic coupling. Two highly coupled zones (0.4° S-0.35° N and 0.8° N-4.0° N) are separated by a low-coupling area, hereafter referred to as the Punta Galera-Mompiche Zone (PGMZ). Large interplate earthquakes repeatedly occurred within the coupled zones, in 1958 (Mw 7.7) and 1979 (Mw 8.1) for the northern patch and in 1942 (Mw 7.8) and 2016 (Mw 7.8) for the southern patch, while the whole segment is thought to have ruptured during the 1906 Mw 8.4-8.8 great earthquake. We find that during the last decade the PGMZ has experienced regular and frequent seismic swarms. For the best documented sequence (December 2013-January 2014), a joint seismological and geodetic analysis reveals a six-week-long Slow Slip Event (SSE) associated with a seismic swarm. During this period, the microseismicity is organized into families of similar earthquakes spatially and temporally correlated with the evolution of the aseismic slip. The moment release (3.4 × 10^18 N m, Mw 6.3), over a 60 × 40 km area, is considerably larger than the moment released by earthquakes (5.8 × 10^15 N m, Mw 4.4) during the same time period. In 2007-2008, a similar seismic-aseismic episode occurred, with higher magnitudes for both the seismic and aseismic processes. Cross-correlation analyses of the seismic waveforms over a 15-year period further suggest a 2-year repeat time for the seismic swarms, which also implies that SSEs recurrently affect this area. Such SSEs contribute to releasing the accumulated stress, likely explaining why the 2016 Pedernales earthquake did not propagate northward into the PGMZ.
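
    The moment magnitudes quoted above can be reproduced from the stated seismic moments with the standard Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N m. A short check in Python (not part of the record):

      import math

      def moment_magnitude(m0_newton_meters):
          """Hanks & Kanamori (1979) moment magnitude from seismic moment in N m."""
          return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

      print(round(moment_magnitude(3.4e18), 1))  # aseismic moment release -> 6.3
      print(round(moment_magnitude(5.8e15), 1))  # cumulative earthquake moment -> 4.4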

  18. The Effect of Ethanol Addition to Gasoline on Low- and Intermediate-Temperature Heat Release under Boosted Conditions in Kinetically Controlled Engines

    Science.gov (United States)

    Vuilleumier, David Malcolm

    The detailed study of chemical kinetics in engines has become necessary to further advance engine efficiency while simultaneously lowering engine emissions. This push for higher efficiency engines is not caused by a lack of oil, but by efforts to reduce anthropogenic carbon dioxide emissions, which cause global warming. To operate in more efficient manners while reducing traditional pollutant emissions, modern internal combustion piston engines are forced to operate in regimes in which combustion is no longer fully transport limited, and instead is at least partially governed by chemical kinetics of combusting mixtures. Kinetically-controlled combustion allows the operation of piston engines at high compression ratios, with partially-premixed dilute charges; these operating conditions simultaneously provide high thermodynamic efficiency and low pollutant formation. The investigations presented in this dissertation study the effect of ethanol addition on the low-temperature chemistry of gasoline type fuels in engines. These investigations are carried out both in a simplified, fundamental engine experiment, named Homogeneous Charge Compression Ignition, as well as in more applied engine systems, named Gasoline Compression Ignition engines and Partial Fuel Stratification engines. These experimental investigations, and the accompanying modeling work, show that ethanol is an effective scavenger of radicals at low temperatures, and this inhibits the low temperature pathways of gasoline oxidation. Further, the investigations measure the sensitivity of gasoline auto-ignition to system pressure at conditions that are relevant to modern engines. It is shown that at pressures above 40 bar and temperatures below 850 Kelvin, gasoline begins to exhibit Low-Temperature Heat Release. However, the addition of 20% ethanol raises the pressure requirement to 60 bar, while the temperature requirement remains unchanged. These findings have major implications for a range of modern engines

  19. Energy budgets of mining-induced earthquakes and their interactions with nearby stopes

    Science.gov (United States)

    McGarr, A.

    2000-01-01

    In the early 1960's, N.G.W. Cook, using an underground network of geophones, demonstrated that most Witwatersrand tremors are closely associated with deep level gold mining operations. He also showed that the energy released by the closure of the tabular stopes at depths of the order of 2 km was more than sufficient to account for the mining-induced earthquakes. I report here updated versions of these two results based on more modern underground data acquired in the Witwatersrand gold fields. Firstly, an extensive suite of in situ stress data indicate that the ambient state of crustal stress here is close to the failure state in the absence of mining even though the tectonic setting is thoroughly stable. Mining initially stabilizes the rock mass by reducing the pore fluid pressure from its initial hydrostatic state to nearly zero. The extensive mine excavations, as Cook showed, concentrate the deviatoric stresses, in localized regions of the abutments, back into a failure state resulting in seismicity. Secondly, there appear to be two distinct types of mining-induced earthquakes: the first is strongly coupled to the mining and involves shear failure plus a coseismic volume reduction; the second type is not evidently coupled to any particular mine face, shows purely deviatoric failure, and is presumably caused by more regional changes in the state of stress due to mining. Thirdly, energy budgets for mining-induced earthquakes of both types indicate that, of the available released energy, only a few per cent is radiated by the seismic waves with the majority being consumed in overcoming fault friction. Published by Elsevier Science Ltd.

  20. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  1. 1983 Borah Peak earthquake and INEL structural performance

    International Nuclear Information System (INIS)

    Gorman, V.W.; Guenzler, R.C.

    1983-12-01

    At 8:06 a.m. Mountain Daylight Time on October 28, 1983, an earthquake registering 7.3 on the Richter magnitude scale occurred about 30 km northwest of the town of Mackay, in central Idaho. This report describes the event and associated effects and the responses of facilities at Idaho National Engineering Laboratory (INEL), located approximately 100 km from the epicenter, to ground motion. 21 references, 36 figures, 5 tables

  2. Earthquake protection of essential civil and industrial equipments

    International Nuclear Information System (INIS)

    Bourrier, P.; Le Breton, F.; Thevenot, A.

    1986-01-01

    This document presents the principal considerations concerning the application of seismic engineering to equipment, and how this differs from its application to building structures. The notion of essential equipment is then introduced, as well as the main particularities of equipment when treated as structures. Finally, the document illustrates a few pathological examples encountered after earthquakes and presents some items of nuclear power plant equipment designed to withstand an increased-safety earthquake [fr

  3. Benefits of multidisciplinary collaboration for earthquake casualty estimation models: recent case studies

    Science.gov (United States)

    So, E.

    2010-12-01

    Earthquake casualty loss estimation, which depends primarily on building-specific casualty rates, has long suffered from a lack of cross-disciplinary collaboration in post-earthquake data gathering. Improving our understanding of what contributes to casualties in earthquakes involves coordinated data-gathering efforts amongst disciplines; these are essential for improved global casualty estimation models. It is evident from examining past casualty loss models and reviewing field data collected from recent events that generalized casualty rates cannot be applied globally for different building types, even within individual countries. For a particular structure type, regional and topographic building design effects, combined with variable material and workmanship quality, all contribute to this multi-variant outcome. In addition, social factors affect building-specific casualty rates, including social status and education levels, and human behaviors in general, in that they modify egress and survivability rates. Without considering complex physical pathways, loss models purely based on historic casualty data, or even worse, rates derived from other countries, will be of very limited value. What's more, as the world's population, housing stock, and living and cultural environments change, methods of loss modeling must accommodate these variables, especially when considering casualties. To truly take advantage of observed earthquake losses, not only do damage surveys need better coordination of international and national reconnaissance teams, but these teams must integrate different areas of expertise, including engineering, public health and medicine. Research is needed to find methods to achieve consistent and practical ways of collecting and modeling casualties in earthquakes. International collaboration will also be necessary to transfer such expertise and resources to the communities in the cities which most need it. Coupling the theories and findings from

  4. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    A devastating earthquake had been predicted for May 11, 2011 in Rome. This prediction was never released officially by anyone, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions. Indeed, around May 11, 2011, a planetary alignment was really expected and this contributed to giving credibility to the earthquake prediction among people. During the previous months, INGV was overwhelmed with requests for information about this supposed prediction by Roman inhabitants and tourists. Given the considerable media impact of this expected earthquake, INGV decided to organize an Open Day at its headquarters in Rome for people who wanted to learn more about Italian seismicity and the earthquake as a natural phenomenon. The Open Day was preceded by a press conference two days before, in which we talked about this prediction, we presented the Open Day, and we had a scientific discussion with journalists about the earthquake prediction and, more generally, on the real problem of seismic risk in Italy. About 40 journalists from newspapers, local and national TV stations, press agencies and web news attended the Press Conference and hundreds of articles appeared in the following days, advertising the 11 May Open Day. The INGV opened to the public all day long (9am - 9pm) with the following program: i) meetings with INGV researchers to discuss scientific issues; ii) visits to the seismic monitoring room, open 24h/7 all year; iii) guided tours through interactive exhibitions on earthquakes and Earth's deep structure; iv) lectures on general topics from the social impact of rumors to seismic risk reduction; v) 13 new videos on channel YouTube.com/INGVterremoti to explain the earthquake process and give updates on various aspects of seismic monitoring in Italy; vi) distribution of books and brochures. Surprisingly, more than 3000 visitors came to visit INGV

  5. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental studies on ionospheric earthquake precursors, the development of a program on processes in the earthquake focus, and the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space-based system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  6. Engineered collagen hydrogels for the sustained release of biomolecules and imaging agents: promoting the growth of human gingival cells.

    Science.gov (United States)

    Choi, Jonghoon; Park, Hoyoung; Kim, Taeho; Jeong, Yoon; Oh, Myoung Hwan; Hyeon, Taeghwan; Gilad, Assaf A; Lee, Kwan Hyi

    2014-01-01

    We present here the in vitro release profiles of either fluorescently labeled biomolecules or computed tomography contrast nanoagents from engineered collagen hydrogels under physiological conditions. The collagen constructs were designed as potential biocompatible inserts into wounded human gingiva. The collagen hydrogels were fabricated under a variety of conditions in order to optimize the release profile of biomolecules and nanoparticles for the desired duration and amount. The collagen constructs containing biomolecules/nanoconstructs were incubated under physiological conditions (ie, 37°C and 5% CO2) for 24 hours, and the release profile was tuned from 20% to 70% of initially loaded materials by varying the gelation conditions of the collagen constructs. The amounts of released biomolecules and nanoparticles were quantified respectively by measuring the intensity of fluorescence and X-ray scattering. The collagen hydrogel we fabricated may serve as an efficient platform for the controlled release of biomolecules and imaging agents in human gingiva to facilitate the regeneration of oral tissues.

  7. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19th September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes. They are 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new method of evaluating earthquake vibration using a fault model and an evaluation of the probability of earthquake vibration. The remaining risk means that facilities and people are put in danger when an earthquake stronger than the design basis occurs; accordingly, the scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulses of the 1995 Hyogo-ken Nanbu earthquake, the relation between the length of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  8. Development of criteria for release of Idaho National Engineering Laboratory sites following decontamination and decommissioning

    International Nuclear Information System (INIS)

    Kirol, L.

    1986-08-01

    Criteria have been developed for release of Idaho National Engineering Laboratory (INEL) facilities and land areas following decontamination and decommissioning (D and D). Although these facilities and land areas are not currently being returned to the public domain, and no plans exist for doing so, criteria suitable for unrestricted release to the public were desired. Midway through this study, the implementation of Department of Energy (DOE) Order 5820.2, Radioactive Waste Management, required development of site specific release criteria for use on D and D projects. These criteria will help prevent remedial actions from being required if INEL reuse considerations change in the future. Development of criteria for release of INEL facilities following D and D comprised four study areas: pathways analysis, dose and concentration guidelines, sampling and instrumentation, and implementation procedures. Because of the complex and sensitive nature of the first three categories, a thorough review by experts in those respective fields was desired. Input and support in preparing or reviewing each part of the criteria development task was solicited from several DOE field offices. Experts were identified and contracted to assist in preparing portions of the release criteria, or to serve on a peer-review committee. Thus, the entire release criteria development task was thoroughly reviewed by recognized experts from contractors at several DOE field offices, to validate technical content of the document. Each of the above four study areas was developed originally as an individual task, and a report was generated from each. These reports are combined here to form this document. This release criteria document includes INEL-specific pathways analysis, instrumentation requirements, sampling procedures, the basis for selection of dose and concentration guidelines, and cost-risk-benefit procedures

  9. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  10. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines

    Science.gov (United States)

    Schiff, Anshel J.

    1998-01-01

    To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gasline. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outage, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.

  11. Seismic damage to structures in the Ms 6.5 Ludian earthquake

    Science.gov (United States)

    Chen, Hao; Xie, Quancai; Dai, Boyang; Zhang, Haoyu; Chen, Hongfu

    2016-03-01

    On 3 August 2014, the Ludian earthquake struck northwest Yunnan Province with a surface wave magnitude of 6.5. This moderate earthquake unexpectedly caused high fatalities and great economic loss. Four strong motion stations were located in the areas with intensity V, VI, VII and IX, near the epicentre. The characteristics of the ground motion are discussed herein, including 1) ground motion was strong at periods of less than 1.4 s, which covered the natural vibration period of a large number of structures; and 2) the released energy was concentrated geographically. Based on materials collected during emergency building inspections, the damage patterns of adobe, masonry, timber frame and reinforced concrete (RC) frame structures in areas with different intensities are summarised. Earthquake damage matrices of local buildings are also given for fragility evaluation and earthquake damage prediction. It is found that the collapse ratios of RC frame and confined masonry structures based on the new design code are significantly lower than those of non-seismic buildings. However, the RC frame structures still failed to achieve the 'strong column, weak beam' design target. Traditional timber frame structures with a light infill wall showed good aseismic performance.

  12. Earthquake focal mechanism forecasting in Italy for PSHA purposes

    Science.gov (United States)

    Roselli, Pamela; Marzocchi, Warner; Mariucci, Maria Teresa; Montone, Paola

    2018-01-01

    In this paper, we put forward a procedure that aims to forecast the focal mechanisms of future earthquakes. One of the primary uses of such forecasts is in probabilistic seismic hazard analysis (PSHA); in fact, aiming at reducing the epistemic uncertainty, most of the newer ground motion prediction equations consider, besides the seismicity rates, the forecast of the focal mechanism of the next large earthquakes as input data. The data set used for this purpose consists of focal mechanisms taken from the latest stress map release for Italy containing 392 well-constrained solutions of events, from 1908 to 2015, with Mw ≥ 4 and depths from 0 down to 40 km. The data set considers polarity focal mechanism solutions up to 1975 (23 events), whereas for 1976-2015, it takes into account only the Centroid Moment Tensor (CMT)-like earthquake focal solutions for data homogeneity. The forecasting model is rooted in the Total Weighted Moment Tensor concept that weighs information of past focal mechanisms evenly distributed in space, according to their distance from the spatial cells and magnitude. Specifically, for each cell of a regular 0.1° × 0.1° spatial grid, the model estimates the probability of observing a normal, reverse, or strike-slip fault plane solution for the next large earthquakes, the expected moment tensor and the related maximum horizontal stress orientation. These results will be available for the new PSHA model for Italy under development. Finally, to evaluate the reliability of the forecasts, we test them with an independent data set that consists of some of the strongest earthquakes with Mw ≥ 3.9 that occurred during 2016 in different Italian tectonic provinces.
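
    A minimal sketch of the distance- and magnitude-weighted averaging idea behind such style-of-faulting forecasts; this is not the authors' exact Total Weighted Moment Tensor formulation, and the kernel, weights and event list below are assumptions.

      import math

      # Past events: (lon, lat, Mw, style); all entries are hypothetical examples.
      past_events = [
          (13.2, 42.5, 6.1, "normal"),
          (13.4, 42.3, 5.4, "normal"),
          (15.0, 41.0, 5.9, "reverse"),
      ]

      def weight(d_km, mw, d0=50.0):
          """Assumed kernel: moment-proportional weight decaying with distance."""
          moment = 10 ** (1.5 * mw + 9.1)          # seismic moment in N·m
          return moment * math.exp(-d_km / d0)

      def style_probabilities(cell_lon, cell_lat):
          """Probability of each faulting style in a grid cell from weighted past events."""
          totals = {"normal": 0.0, "reverse": 0.0, "strike-slip": 0.0}
          for lon, lat, mw, style in past_events:
              d_km = 111.0 * math.hypot(lon - cell_lon, lat - cell_lat)  # rough flat-earth distance
              totals[style] += weight(d_km, mw)
          s = sum(totals.values())
          return {k: v / s for k, v in totals.items()}

      print(style_probabilities(13.3, 42.4))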

  13. Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3

    Science.gov (United States)

    Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.

    2017-12-01

    Measuring and predicting ground motion parameters including seismic intensities for earthquakes is crucial and is the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it in the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN Web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED and Fourier-, power- and response spectra. GMPEs are configurable. Supporting C++ and Python plug-ins, standard and customized GMPEs including the OpenQuake Hazard Library can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong motion seismology.
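
    The scalar parameters listed are standard integrals of the accelerogram; a minimal sketch in plain NumPy (not gempa's SIGMA implementation) of PGA, PGV and Arias intensity:

      import numpy as np

      def strong_motion_parameters(acc, dt):
          """PGA, PGV and Arias intensity from an acceleration record (m/s^2, sample step dt in s)."""
          acc = np.asarray(acc, dtype=float)
          vel = np.cumsum(acc) * dt                              # crude integration to velocity
          pga = np.max(np.abs(acc))
          pgv = np.max(np.abs(vel))
          arias = np.pi / (2 * 9.81) * np.sum(acc ** 2) * dt     # Arias intensity in m/s
          return pga, pgv, arias

      # Synthetic example: 5 s of a decaying 2 Hz oscillation
      dt = 0.01
      t = np.arange(0, 5, dt)
      acc = 2.0 * np.exp(-t) * np.sin(2 * np.pi * 2.0 * t)
      print(strong_motion_parameters(acc, dt))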

  14. Application of Incremental Dynamic Analysis (IDA) Method for Studying the Dynamic Behavior of Structures During Earthquakes

    OpenAIRE

    Javanpour, M.; Zarfam, P.

    2017-01-01

    Prediction of existing buildings' vulnerability to future earthquakes is one of the most essential topics in structural engineering. Modeling steel structures is a giant step in determining the damage caused by the earthquake, as such structures are increasingly being used in construction. Hence, two same-order steel structures with two types of structural systems were selected (coaxial moment frames and moment frames). In most cases, a specific structure needs to satisfy several functional l...

  15. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    Science.gov (United States)

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    , and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.

  16. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
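
    The detection logic described (a short-term-average over long-term-average ratio applied to a tweet-frequency time series) can be sketched as follows; the window lengths and threshold are assumptions, not the USGS settings.

      import numpy as np

      def sta_lta_detections(counts, sta_win=2, lta_win=60, threshold=5.0):
          """Indices where the STA/LTA ratio of a per-minute tweet count exceeds a threshold."""
          counts = np.asarray(counts, dtype=float)
          detections = []
          for i in range(lta_win, len(counts)):
              sta = counts[i - sta_win:i].mean()
              lta = counts[i - lta_win:i].mean()
              if lta > 0 and sta / lta >= threshold:
                  detections.append(i)
          return detections

      # Synthetic example: background chatter with a burst of "earthquake" tweets at minute 90
      counts = np.random.poisson(2, 180)
      counts[90:95] += 60
      print(sta_lta_detections(counts))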

  17. Martin Marietta Paducah Gaseous Diffusion Plant comprehensive earthquake emergency management program

    International Nuclear Information System (INIS)

    1990-01-01

    Recognizing the value of a proactive, integrated approach to earthquake preparedness planning, Martin Marietta Energy Systems, Inc. initiated a contract in June 1989 with Murray State University, Murray, Kentucky, to develop a comprehensive earthquake management program for their Gaseous Diffusion Plant in Paducah, Kentucky (PGDP -- Subcontract No. 19P-JV649V). The overall purpose of the program is to mitigate the loss of life and property in the event of a major destructive earthquake. The program includes four distinct (yet integrated) components: an emergency management plan, with emphasis on the catastrophic earthquake; an Emergency Operations Center Duty Roster Manual; an Integrated Automated Emergency Management Information System (IAEMIS); and a series of five training program modules. The PLAN itself comprises four separate volumes: Volume I -- Chapters 1--3; Volume II -- Chapters 4--6; Volume III -- Chapter 7; and Volume IV -- 23 Appendices. The EOC Manual (which includes 15 mutual aid agreements) is designated as Chapter 7 in the PLAN and is a 'stand-alone' document numbered as Volume III. This document, Volume II, discusses methodology, engineering and environmental analyses, and operational procedures

  18. Novel cobalt releasing sol-gel derived bioactive glass for bone tissue engineering

    International Nuclear Information System (INIS)

    Oliveira, Ana Celeste Ximenes; Barrioni, Breno Rocha; Leite, Maria de Fatima; Pereira, Marivalda Magalhaes

    2016-01-01

    Full text: Bone defects are caused by traumas, congenital disorders or infections, and bone grafts are the usual treatment. However, limitations of this therapy have led to the advance of tissue engineering approaches. Bioactive glasses (BG) are an attractive bioactive ceramic for bone repair [1], due to their osteogenic properties and capability of releasing different ions, inducing specific biological responses. Tissue repair also depends on blood vessel formation. Among angiogenic agents, the cobalt ion has been regarded as a strategic component to incorporate into ion releasing materials. In this study, 5% (molar) cobalt releasing BG was synthesized by the sol-gel method. To characterize the material, powder samples were evaluated by FTIR and XRD. To assess the cytotoxic effects, MTT and LIVE/DEAD tests were performed on osteoblasts exposed to the ionic product of the material (100 μg/mL) for 72h. FTIR analysis reveals typical absorption bands of the groups present in BG. The X-ray diffractogram confirmed the amorphous character of the BG, without the occurrence of recrystallization of the cobalt precursor, suggesting that cobalt incorporation was successful. The MTT test showed that cells exposed to the ionic product presented high levels of metabolic activity. The LIVE/DEAD assay evidenced that cell membrane integrity and intracellular esterase activity were preserved. Both cytotoxicity tests proved that the cobalt-BG material generated a cell-friendly environment. This work shows that BG with the cobalt agent presented proper structural features and non-cytotoxic behaviour. Reference: [1] Hench LL, J Mater Sci Mater Med 17(11), 967-78 (2006). (author)

  19. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
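
    Forecast rates per cell are commonly compared with the observed counts through a Poisson likelihood; a minimal sketch of that scoring idea with made-up numbers (not the RELM evaluation code):

      import math

      def poisson_log_likelihood(forecast_rates, observed_counts):
          """Joint Poisson log-likelihood of observed earthquake counts given forecast rates per cell."""
          ll = 0.0
          for lam, n in zip(forecast_rates, observed_counts):
              ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
          return ll

      # Hypothetical 5-cell forecasts from two models and the observed counts
      model_a = [0.2, 0.05, 1.0, 0.3, 0.1]
      model_b = [0.4, 0.10, 0.5, 0.2, 0.2]
      observed = [0, 0, 2, 1, 0]
      print(poisson_log_likelihood(model_a, observed), poisson_log_likelihood(model_b, observed))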

  20. Heat release and engine performance effects of soybean oil ethyl ester blending into diesel fuel

    International Nuclear Information System (INIS)

    Bueno, Andre Valente; Velasquez, Jose Antonio; Milanez, Luiz Fernando

    2011-01-01

    The engine performance impact of soybean oil ethyl ester blending into diesel fuel was analyzed employing heat release analysis, in-cylinder exergy balances and dynamometric tests. Blends with concentrations of up to 30% of soybean oil ethyl ester by volume were used in steady-state experiments conducted in a high-speed turbocharged direct injection engine. Modifications in fuel heat value, fuel-air equivalence ratio and combustion temperature were found to govern the impact resulting from the addition of biodiesel on engine performance. For the analyzed fuels, the 20% biodiesel blend presented the best results of brake thermal efficiency, while the 10% biodiesel blend presented the best results of brake power and sfc (specific fuel consumption). In relation to mineral diesel and in full load conditions, an average increase of 4.16% was observed in brake thermal efficiency with the B20 blend. In the same conditions, an average gain of 1.15% in brake power and a reduction of 1.73% in sfc were observed with the B10 blend.
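
    Brake thermal efficiency, brake power and sfc are linked through the fuel's lower heating value; a back-of-the-envelope check with illustrative numbers (not the measured data of this study):

      def brake_thermal_efficiency(sfc_g_per_kwh, lhv_mj_per_kg):
          """Brake thermal efficiency from specific fuel consumption and lower heating value."""
          energy_in_per_kwh = sfc_g_per_kwh * lhv_mj_per_kg * 1e3   # J of fuel energy per kWh of work
          return 3.6e6 / energy_in_per_kwh                          # 1 kWh of brake work = 3.6e6 J

      # Illustrative values: mineral diesel vs. a blend with a slightly lower heating value
      print(brake_thermal_efficiency(210.0, 42.5))   # roughly 0.40
      print(brake_thermal_efficiency(215.0, 41.0))   # roughly 0.41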

  1. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  2. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, introduced and fractured gravels and pillow structures in shallow lake sediments, and of pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep lake sediments. Drawing on previous studies, the earthquake-induced deformation structures were ordered according to their formation and the corresponding earthquake magnitudes. In this ordering, the structure recording the lowest earthquake magnitude is loop bedding and the highest is introduced and fractured gravels in lacustrine deposits.

  3. Size-fractionated characterization and quantification of nanoparticle release rates from a consumer spray product containing engineered nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Hagendorfer, Harald, E-mail: Harald.Hagendorfer@empa.ch [EMPA, Swiss Federal Laboratories for Materials Testing and Research (Switzerland); Lorenz, Christiane, E-mail: Christiane.Lorenz@chem.ethz.ch [ETHZ, Swiss Federal Institute of Technology Zurich (Switzerland); Kaegi, Ralf, E-mail: Ralf.Kaegi@eawag.ch; Sinnet, Brian, E-mail: Brian.Sinnet@eawag.ch [EAWAG, Swiss Federal Institute of Aquatic Science and Technology (Switzerland); Gehrig, Robert, E-mail: Robert.Gehrig@empa.ch [EMPA, Swiss Federal Laboratories for Materials Testing and Research (Switzerland); Goetz, Natalie V., E-mail: Natalie.vonGoetz@chem.ethz.ch; Scheringer, Martin, E-mail: Martin.Scheringer@chem.ethz.ch [ETHZ, Swiss Federal Institute of Technology Zurich (Switzerland); Ludwig, Christian, E-mail: Christian.Ludwig@psi.ch [PSI, Paul Scherrer Institute (Switzerland); Ulrich, Andrea, E-mail: Andrea.Ulrich@empa.ch [EMPA, Swiss Federal Laboratories for Materials Testing and Research (Switzerland)

    2010-09-15

    This study describes methods developed for reliable quantification of size- and element-specific release of engineered nanoparticles (ENP) from consumer spray products. A modified glove box setup was designed to allow controlled spray experiments in a particle-minimized environment. Time dependence of the particle size distribution in a size range of 10-500 nm and ENP release rates were studied using a scanning mobility particle sizer (SMPS). In parallel, the aerosol was transferred to a size-calibrated electrostatic TEM sampler. The deposited particles were investigated using electron microscopy techniques in combination with image processing software. This approach enables the chemical and morphological characterization as well as quantification of released nanoparticles from a spray product. The differentiation of solid ENP from the released nano-sized droplets was achieved by applying a thermo-desorbing unit. After optimization, the setup was applied to investigate different spray situations using both pump and gas propellant spray dispensers for a commercially available water-based nano-silver spray. The pump spray situation showed no measurable nanoparticle release, whereas in the case of the gas spray, a significant release was observed. From the results it can be assumed that the homogeneously distributed ENP from the original dispersion grow in size and change morphology during and after the spray process but still exist as nanometer particles of size <100 nm. Furthermore, it seems that the release of ENP correlates with the generated aerosol droplet size distribution produced by the spray vessel type used. This is the first study presenting results concerning the release of ENP from spray products.

  4. Size-fractionated characterization and quantification of nanoparticle release rates from a consumer spray product containing engineered nanoparticles

    International Nuclear Information System (INIS)

    Hagendorfer, Harald; Lorenz, Christiane; Kaegi, Ralf; Sinnet, Brian; Gehrig, Robert; Goetz, Natalie V.; Scheringer, Martin; Ludwig, Christian; Ulrich, Andrea

    2010-01-01

    This study describes methods developed for reliable quantification of size- and element-specific release of engineered nanoparticles (ENP) from consumer spray products. A modified glove box setup was designed to allow controlled spray experiments in a particle-minimized environment. Time dependence of the particle size distribution in a size range of 10-500 nm and ENP release rates were studied using a scanning mobility particle sizer (SMPS). In parallel, the aerosol was transferred to a size-calibrated electrostatic TEM sampler. The deposited particles were investigated using electron microscopy techniques in combination with image processing software. This approach enables the chemical and morphological characterization as well as quantification of released nanoparticles from a spray product. The differentiation of solid ENP from the released nano-sized droplets was achieved by applying a thermo-desorbing unit. After optimization, the setup was applied to investigate different spray situations using both pump and gas propellant spray dispensers for a commercially available water-based nano-silver spray. The pump spray situation showed no measurable nanoparticle release, whereas in the case of the gas spray, a significant release was observed. From the results it can be assumed that the homogeneously distributed ENP from the original dispersion grow in size and change morphology during and after the spray process but still exist as nanometer particles of size <100 nm. Furthermore, it seems that the release of ENP correlates with the generated aerosol droplet size distribution produced by the spray vessel type used. This is the first study presenting results concerning the release of ENP from spray products.

  5. Factors Contributing to the Catastrophe in Mexico City During the Earthquake of September 19, 1985

    OpenAIRE

    Beck, James L.; Hall, John F.

    1986-01-01

    The extensive damage to high‐rise buildings in Mexico City during the September 19, 1985 earthquake is primarily due to the intensity of the ground shaking exceeding what was previously considered credible for the city by Mexican engineers. There were two major factors contributing to the catastrophe, resonance in the sediments of an ancient lake that once existed in the Valley of Mexico, and the long duration of shaking compared with other coastal earthquakes in the last 50 years. Both of th...

  6. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously been broken by a similar earthquake in 1751, but several events in the magnitude 8 range occurred in the area, principally the 1835 event already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north is undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the region has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  7. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage

    Science.gov (United States)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.

    2016-12-01

    As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly-coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increases warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake, enabled through a UCMexus collaboration. The second is the 8-story Geisel Library at University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table. We present MEMS-based seismogeodetic observations from the 10 June
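
    A toy one-dimensional sketch of the seismogeodetic combination idea (a Kalman filter propagated with measured acceleration and corrected by GPS displacement); the matrices, noise levels and synthetic data are illustrative and not the authors' tightly-coupled formulation.

      import numpy as np

      def seismogeodetic_kf(acc, gps_disp, dt, q=1e-3, r=1e-4):
          """Toy 1-D Kalman filter: state [displacement, velocity] driven by acceleration,
          corrected by colocated GPS displacement samples."""
          F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
          B = np.array([0.5 * dt ** 2, dt])         # acceleration input
          H = np.array([[1.0, 0.0]])                # GPS observes displacement
          x, P = np.zeros(2), np.eye(2)
          est = []
          for a, d in zip(acc, gps_disp):
              x = F @ x + B * a                      # predict with accelerometer sample
              P = F @ P @ F.T + q * np.eye(2)
              y = d - H @ x                          # innovation from GPS displacement
              S = H @ P @ H.T + r
              K = P @ H.T / S                        # Kalman gain (scalar measurement)
              x = x + (K * y).ravel()
              P = (np.eye(2) - K @ H) @ P
              est.append(x[0])
          return np.array(est)

      # Synthetic colocated records: noisy 0.5 Hz ground motion sampled at 100 Hz
      dt = 0.01
      t = np.arange(0.0, 10.0, dt)
      disp_true = 0.05 * np.sin(np.pi * t)
      acc = -0.05 * np.pi ** 2 * np.sin(np.pi * t) + 0.01 * np.random.randn(t.size)
      gps = disp_true + 0.005 * np.random.randn(t.size)
      print(seismogeodetic_kf(acc, gps, dt)[-5:])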

  8. Permeability, storage and hydraulic diffusivity controlled by earthquakes

    Science.gov (United States)

    Brodsky, E. E.; Fulton, P. M.; Xue, L.

    2016-12-01

    Earthquakes can increase permeability in fractured rocks. In the farfield, such permeability increases are attributed to seismic waves and can last for months after the initial earthquake. Laboratory studies suggest that unclogging of fractures by the transient flow driven by seismic waves is a viable mechanism. These dynamic permeability increases may contribute to permeability enhancement in the seismic clouds accompanying hydraulic fracking. Permeability enhancement by seismic waves could potentially be engineered and the experiments suggest the process will be most effective at a preferred frequency. We have recently observed similar processes inside active fault zones after major earthquakes. A borehole observatory in the fault that generated the M9.0 2011 Tohoku earthquake reveals a sequence of temperature pulses during the secondary aftershock sequence of an M7.3 aftershock. The pulses are attributed to fluid advection by a flow through a zone of transiently increased permeability. Directly after the M7.3 earthquake, the newly damaged fault zone is highly susceptible to further permeability enhancement, but ultimately heals within a month and becomes no longer as sensitive. The observation suggests that the newly damaged fault zone is more prone to fluid pulsing than would be expected based on the long-term permeability structure. Even longer term healing is seen inside the fault zone of the 2008 M7.9 Wenchuan earthquake. The competition between damage and healing (or clogging and unclogging) results in dynamically controlled permeability, storage and hydraulic diffusivity. Recent measurements of in situ fault zone architecture at the 1-10 meter scale suggest that active fault zones often have hydraulic diffusivities near 10^-2 m^2/s. This uniformity is true even within the damage zone of the San Andreas fault where permeability and storage increases balance each other to achieve this value of diffusivity over a 400 m wide region. We speculate that fault zones

  9. One feature of the activated southern Ordos block: the Ziwuling small earthquake cluster

    Directory of Open Access Journals (Sweden)

    Li Yuhang

    2014-08-01

    Full Text Available Small earthquakes (Ms > 2.0) have been recorded from 1970 to the present day and reveal a significant difference in seismicity between the stable Ordos block and its active surrounding area. In the southern Ordos block there is a conspicuous small earthquake belt, clustered and isolated along the NNW direction, which extends into the stable interior of the Ordos block; no active fault can be matched to this small earthquake cluster. In this paper, we analyze the dynamic mechanism of this small earthquake cluster based on the GPS velocity field (from 1999 to 2007), which is mainly from the Crustal Movement Observation Network of China (CMONOC), with respect to the north and south China blocks. The principal direction of the strain rate field, the expansion rate field, the maximum shear strain rate, and the rotation rate were constrained using the GPS velocity field. The results show that the velocity field, which is bounded by the small earthquake cluster from Tongchuan to Weinan, differs from the strain rate field, and the crustal deformation is left-lateral shear. This left-lateral shear belt spatially coincides not only with the Neo-tectonic belt in the Weihe Basin but also with the NNW small earthquake cluster (the Ziwuling small earthquake cluster). Based on these studies, we speculate that the NNW small earthquake cluster is caused by left-lateral shear slip, which is prone to strain accumulation. When the strain is released along the structurally weak zone, small earthquakes diffuse within the upper crust. The maximum principal compression stress direction changed from NE-SW to NEE-SWW, and the former reverse faults in the southwestern margin of the Ordos block became left-lateral strike-slip faults due to readjustment of the tectonic stress field after the middle Pleistocene. The NNW Neo-tectonic belt in the Weihe Basin, the different movement character of the inner Weihe Basin (which was demonstrated through GPS measurements) and the small earthquake cluster belt reflect the activated

  10. Modeling of release of radionuclides from an engineered disposal facility for shallow-land disposal of low-level radioactive wastes

    International Nuclear Information System (INIS)

    Matsuzuru, H.; Suzuki, A.

    1989-01-01

    The computer code, ENBAR-1, for the simulation of radionuclide releases from an engineered disposal facility has been developed to evaluate the source term for subsequent migration of radionuclides in and through a natural barrier. The system considered here is that a waste package (waste form and container) is placed, together with backfill materials, into a concrete pit as a disposal unit for shallow-land disposal of low-level radioactive wastes. The code developed includes the following modules: water penetration into a concrete pit, corrosion of a drum as a container, leaching of radionuclides from a waste form, migration of radionuclides in backfill materials, release of radionuclides from the pit. The code has the advantage of its simplicity of operation and presentation while still allowing comprehensive evaluation of each element of an engineered disposal facility to be treated. The performance and source term of the facility might be readily estimated with a few key parameters to define the problem
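
    Not the ENBAR-1 code itself, but a minimal sketch of the kind of chained first-order model such a source-term code represents (leaching from the waste form into the backfill, then release from the pit); all rate constants are invented, and the decay constant roughly corresponds to a 30-year half-life nuclide.

      # Toy two-compartment source-term model: first-order leaching from the waste form
      # into backfill pore water, then first-order release from the concrete pit.
      # Rate constants (per year) are invented for illustration.

      def simulate_release(a0=1.0, lam=0.023, k_leach=0.05, k_release=0.10, years=100, dt=0.1):
          waste, backfill, released = a0, 0.0, 0.0
          t = 0.0
          history = []
          while t < years:
              leach = k_leach * waste * dt
              out = k_release * backfill * dt
              waste += -leach - lam * waste * dt
              backfill += leach - out - lam * backfill * dt
              released += out
              t += dt
              history.append((t, released))
          return history

      print(simulate_release()[-1])   # cumulative fraction released after 100 years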

  11. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
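
    A minimal sketch of the kind of tail fit described (Pareto fit to exceedances, Frechet fit to block maxima) using SciPy on synthetic data; this is not the authors' analysis of the Chinese catalogue.

      import numpy as np
      from scipy import stats

      # Synthetic "energies" with a power-law tail
      energies = stats.pareto.rvs(b=1.2, size=5000, random_state=0)

      # Pareto fit to exceedances over a high threshold
      threshold = np.quantile(energies, 0.95)
      exceedances = energies[energies > threshold]
      b_hat, loc, scale = stats.pareto.fit(exceedances, floc=0)

      # Frechet (inverse Weibull) fit to block maxima
      maxima = energies.reshape(50, 100).max(axis=1)
      c_hat, loc_m, scale_m = stats.invweibull.fit(maxima)

      print(b_hat, c_hat)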

  12. Activated Very Low Frequency Earthquakes By the Slow Slip Events in the Ryukyu Subduction Zone

    Science.gov (United States)

    Nakamura, M.; Sunagawa, N.

    2014-12-01

    The Ryukyu Trench (RT), where the Philippine Sea plate is subducting, has had no known thrust earthquakes with Mw > 8.0 in the last 300 years. However, the rupture source of the 1771 tsunami has been proposed as an Mw > 8.0 earthquake in the south RT. Based on the dating of tsunami boulders, it has been estimated that large tsunamis occur at intervals of 150-400 years in the south Ryukyu arc (RA) (Araoka et al., 2013), although they have not occurred for several thousand years in the central and northern Ryukyu areas (Goto et al., 2014). To address the discrepancy between recent low moment releases by earthquakes and the occurrence of paleo-tsunamis in the RT, we focus on the long-term activity of the very low frequency earthquakes (VLFEs), which are good indicators of stress release at the shallow plate interface. VLFEs have been detected along the RT (Ando et al., 2012), which occur on the plate interface or at the accretionary prism. We used broadband data from the F-net of NIED along the RT and from the IRIS network. We applied two filters to all the raw broadband seismograms: a 0.02-0.05 Hz band-pass filter and a 1 Hz high-pass filter. After identification of the low-frequency events from the band-pass-filtered seismograms, the local and teleseismic events were removed. Then we picked the arrival time of the maximum amplitude of the surface wave of the VLFEs and determined the epicenters. VLFEs occurred on the RA side within 100 km from the trench axis along the RT. The distribution of the 6670 VLFEs from 2002 to 2013 could be divided into several clusters. Principal large clusters were located at 27.1°-29.0°N, 25.5°-26.6°N, and 122.1°-122.4°E (YA). We found that the VLFEs of the YA are modulated by repeating slow slip events (SSEs) which occur beneath the south RA. The activity of the VLFEs increased to twice its ordinary rate within 15 days after the onset of the SSEs. Activation of the VLFEs could be generated by a low stress change of 0.02-20 kPa increase in
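
    The two-band screening described (a 0.02-0.05 Hz band-pass versus a 1 Hz high-pass) can be sketched with standard SciPy filters; the corner frequencies follow the abstract, everything else is illustrative.

      import numpy as np
      from scipy.signal import butter, sosfiltfilt

      fs = 20.0                                  # broadband sampling rate (Hz), illustrative
      t = np.arange(0, 600, 1 / fs)
      trace = np.random.randn(t.size)            # stand-in for a raw broadband seismogram

      # 0.02-0.05 Hz band-pass to bring out very low frequency earthquakes
      sos_vlf = butter(4, [0.02, 0.05], btype="bandpass", fs=fs, output="sos")
      vlf = sosfiltfilt(sos_vlf, trace)

      # 1 Hz high-pass used to identify (and discard) ordinary local and teleseismic events
      sos_hf = butter(4, 1.0, btype="highpass", fs=fs, output="sos")
      hf = sosfiltfilt(sos_hf, trace)

      print(vlf.std(), hf.std())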

  13. Assessment of Structural Resistance of building 4862 to Earthquake and Tornado Forces [SEC 1 and 2

    International Nuclear Information System (INIS)

    METCALF, I.L.

    1999-01-01

    This report presents the results of work done for Hanford Engineering Laboratory under contract Y213-544-12662. LATA performed an assessment of building 4862 resistance to earthquake and tornado forces

  14. Assessment of Structural Resistance of building 4862 to Earthquake and Tornado Forces [SEC 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    METCALF, I.L.

    1999-12-06

    This report presents the results of work done for Hanford Engineering Laboratory under contract Y213-544-12662. LATA performed an assessment of building 4862 resistance to earthquake and tornado forces.

  15. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system (GIS) for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of the GIS, the spatial distribution characteristics of the damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, the anomalous damage distribution of the earthquake is identified, and the relationship of this anomalous distribution to tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of underground structures near the source is also studied. The implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  16. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500 people.

  17. Protecting your family from earthquakes: The seven steps to earthquake safety

    Science.gov (United States)

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different than in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunami. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13 Chinese version p. 14-24 Vietnamese version p. 25-36 Korean version p. 37-48

  18. ELER software - a new tool for urban earthquake loss assessment

    Science.gov (United States)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is also equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software which makes use of the most detailed inventory databases of physical and social elements at risk in combination with the analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. Spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. Capacity Spectrum Method (ATC-40, 1996), Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), Reduction Factor Method (Fajfar, 2000) and Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from the other studies available in the literature, i.e. SELENA v4.0 (Molina et al., 2008) and
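
    For orientation, the conversion underlying the capacity-spectrum-type methods listed above, from the elastic acceleration response spectrum to the acceleration-displacement response spectrum (ADRS) format in which demand and capacity curves are intersected, is the standard textbook relation (not anything specific to ELER):

        % Spectral displacement from spectral acceleration at period T_i (ADRS format):
        S_d(T_i) = \frac{T_i^{2}}{4\pi^{2}}\, S_a(T_i)

    The performance point used for building damage estimation is then the intersection of the demand spectrum in this format with the capacity curve of the building type under consideration.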

  19. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition of aftershocks, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  20. Evidence for strong Holocene earthquake(s) in the Wabash Valley seismic zone

    International Nuclear Information System (INIS)

    Obermeier, S.

    1991-01-01

    Many small and slightly damaging earthquakes have taken place in the region of the lower Wabash River Valley of Indiana and Illinois during the 200 years of historic record. Seismologists have long suspected the Wabash Valley seismic zone to be capable of producing earthquakes much stronger than the largest of record (mb 5.8). The seismic zone contains the poorly defined Wabash Valley fault zone and also appears to contain other vaguely defined faults at depths from which the strongest earthquakes presently originate. Faults near the surface are generally covered with thick alluvium in lowlands and a veneer of loess in uplands, which make direct observations of faults difficult. Partly because of this difficulty, a search for paleoliquefaction features was begun in 1990. Conclusions of the study are as follows: (1) an earthquake much stronger than any historic earthquake struck the lower Wabash Valley between 1,500 and 7,500 years ago; (2) the epicentral region of the prehistoric strong earthquake was the Wabash Valley seismic zone; (3) apparent sites have been located where 1811-12 earthquake accelerations can be bracketed.

  1. Primary variables influencing generation of earthquake motions by a deconvolution process

    International Nuclear Information System (INIS)

    Idriss, I.M.; Akky, M.R.

    1979-01-01

    In many engineering problems, the analysis of the potential earthquake response of a soil deposit, a soil structure, or a soil-foundation-structure system requires knowledge of earthquake ground motions at some depth below the level at which the motions are recorded, specified, or estimated. A process by which such motions are commonly calculated is termed a deconvolution process. This paper presents the results of a parametric study which was conducted to examine the accuracy, convergence, and stability of a frequently used deconvolution process and the significant parameters that may influence the output of this process. Parameters studied included: soil profile characteristics, input motion characteristics, level of input motion, and frequency cut-off. (orig.)
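
    To make the idea concrete, a bare-bones sketch of one-dimensional frequency-domain deconvolution is given below: a surface record is transferred to depth through a single uniform damped layer for vertically propagating shear waves. This is a schematic illustration under simplifying assumptions (linear behavior, simple hysteretic damping), not the specific procedure evaluated in the paper; all input values are arbitrary.

        # Bare-bones 1-D frequency-domain deconvolution of a surface record to depth z
        # within a uniform damped layer (vertically propagating shear waves).  Schematic
        # illustration only; shear-wave speed, damping, depth and input motion are
        # arbitrary placeholder values.
        import numpy as np

        def deconvolve_to_depth(acc_surface, dt, z, vs, damping=0.05):
            n = len(acc_surface)
            freq = np.fft.rfftfreq(n, d=dt)            # Hz
            omega = 2.0 * np.pi * freq
            vs_complex = vs * (1.0 + 1j * damping)     # simple hysteretic damping
            H = np.cos(omega * z / vs_complex)         # within-layer motion / free-surface motion
            return np.fft.irfft(np.fft.rfft(acc_surface) * H, n=n)

        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        acc_surface = np.sin(2.0 * np.pi * 2.0 * t) * np.exp(-0.2 * t)   # toy surface record
        acc_at_30m = deconvolve_to_depth(acc_surface, dt, z=30.0, vs=300.0)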

  2. Sustained release of sphingosine 1-phosphate for therapeutic arteriogenesis and bone tissue engineering.

    Science.gov (United States)

    Sefcik, Lauren S; Petrie Aronin, Caren E; Wieghaus, Kristen A; Botchwey, Edward A

    2008-07-01

    Sphingosine 1-phosphate (S1P) is a bioactive phospholipid that impacts migration, proliferation, and survival in diverse cell types, including endothelial cells, smooth muscle cells, and osteoblast-like cells. In this study, we investigated the effects of sustained release of S1P on microvascular remodeling and associated bone defect healing in vivo. The murine dorsal skinfold window chamber model was used to evaluate the structural remodeling response of the microvasculature. Our results demonstrated that 1:400 (w/w) loading and subsequent sustained release of S1P from poly(lactic-co-glycolic acid) (PLAGA) significantly enhanced lumenal diameter expansion of arterioles and venules after 3 and 7 days. Incorporation of 5-bromo-2-deoxyuridine (BrdU) at day 7 revealed significant increases in mural cell proliferation in response to S1P delivery. Additionally, three-dimensional (3D) scaffolds loaded with S1P (1:400) were implanted into critical-size rat calvarial defects, and healing of bony defects was assessed by X-ray radiography, microcomputed tomography (μCT), and histology. Sustained release of S1P significantly increased the formation of new bone after 2 and 6 weeks of healing, and histological results suggest increased numbers of blood vessels in the defect site. Taken together, these experiments support the use of S1P delivery for promoting microvessel diameter expansion and improving the healing outcomes of tissue-engineered therapies.

  3. Strong motion modeling at the Paducah Diffusion Facility for a large New Madrid earthquake

    International Nuclear Information System (INIS)

    Herrmann, R.B.

    1991-01-01

    The Paducah Diffusion Facility is within 80 kilometers of the location of the very large New Madrid earthquakes which occurred during the winter of 1811-1812. Because of their size, seismic moment of 2.0 x 10^27 dyne-cm or moment magnitude Mw = 7.5, the possible recurrence of these earthquakes is a major element in the assessment of seismic hazard at the facility. Probabilistic hazard analysis can provide uniform hazard response spectra estimates for structure evaluation, but deterministic modeling of such a large earthquake can provide strong constraints on the expected duration of motion. The large earthquake is modeled by specifying the earthquake fault and its orientation with respect to the site, and by specifying the rupture process. Synthetic time histories, based on forward modeling of the wavefield, from each subelement are combined to yield a three-component time history at the site. Various simulations are performed to sufficiently exercise possible spatial and temporal distributions of energy release on the fault. Preliminary results demonstrate the sensitivity of the method to various assumptions, and also indicate strongly that the total duration of ground motion at the site is controlled primarily by the length of the rupture process on the fault.
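
    The combination step described above can be illustrated schematically: each subelement contributes a synthetic seismogram delayed by its rupture-initiation time plus the travel time to the site, and the delayed contributions are summed. The sketch below (one component only, with placeholder seismograms and assumed rupture and wave speeds) conveys the idea; it is not the modeling code used in the study.

        # Schematic summation of subelement (subfault) time histories into a single-
        # component site seismogram.  The subfault seismograms are placeholders standing
        # in for forward-modeled wavefields; rupture velocity, wave speed and geometry
        # are assumed values, not those of the study.
        import numpy as np

        dt, nt = 0.02, 4096
        rupture_velocity = 2.5                    # km/s, assumed
        wave_speed = 3.5                          # km/s, assumed
        site = np.array([60.0, 40.0])             # km, site position relative to hypocenter

        def delayed_sum(subfaults):
            total = np.zeros(nt)
            for sf in subfaults:
                t_rupture = sf["dist_from_hypocenter"] / rupture_velocity
                t_travel = np.linalg.norm(site - sf["position"]) / wave_speed
                shift = int(round((t_rupture + t_travel) / dt))
                n = min(nt - shift, len(sf["seismogram"]))
                if n > 0:
                    total[shift:shift + n] += sf["seismogram"][:n]
            return total

        t = np.arange(nt) * dt
        subfaults = [
            {"position": np.array([0.0, 0.0]), "dist_from_hypocenter": 0.0,
             "seismogram": np.exp(-t) * np.sin(2.0 * np.pi * 1.0 * t)},
            {"position": np.array([10.0, 0.0]), "dist_from_hypocenter": 10.0,
             "seismogram": np.exp(-t) * np.sin(2.0 * np.pi * 0.8 * t)},
        ]
        site_record = delayed_sum(subfaults)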

  4. Statistical distributions of earthquakes and related non-linear features in seismic waves

    International Nuclear Information System (INIS)

    Apostol, B.-F.

    2006-01-01

    A few basic facts in the science of earthquakes are briefly reviewed. An accumulation, or growth, model is put forward for the focal mechanisms and the critical focal zone of earthquakes, which relates the earthquake average recurrence time to the released seismic energy. The temporal statistical distribution for the average recurrence time is introduced for earthquakes, and, on this basis, the Omori-type distribution in energy is derived, as well as the distribution in magnitude, by making use of the semi-empirical Gutenberg-Richter law relating seismic energy to earthquake magnitude. On geometric grounds, the accumulation model suggests the value r = 1/3 for the Omori parameter in the power-law energy distribution, which leads to β = 1.17 for the coefficient in the Gutenberg-Richter recurrence law, in fair agreement with statistical analysis of the empirical data. Making use of this value, the empirical Båth's law is discussed for the average magnitude of the aftershocks (which is 1.2 less than the magnitude of the main seismic shock), by assuming that the aftershocks are relaxation events of the seismic zone. The time distribution of earthquakes with a fixed average recurrence time is also derived, the prediction of earthquake occurrence is discussed by means of the average recurrence time and the seismicity rate, and the application of this discussion to the seismic region Vrancea, Romania, is outlined. Finally, a special effect of the non-linear behaviour of seismic waves is discussed, by describing an exact solution derived recently for the elastic wave equation with cubic anharmonicities, its relevance, and its connection to the approximate quasi-plane-wave picture. The properties of the seismic activity accompanying a main seismic shock, both as foreshocks and as aftershocks, are relegated to forthcoming publications. (author)
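
    For the reader's convenience, the relations referred to above can be written in a standard form (common textbook statements with conventional coefficients, not the author's exact notation):

        % Gutenberg-Richter frequency-magnitude (recurrence) law:
        \log_{10} N(\geq M) \;=\; a - bM
        % Semi-empirical energy-magnitude relation (E in joules):
        \log_{10} E \;=\; \tfrac{3}{2} M + 4.8 ,
        \qquad\text{i.e.}\qquad
        \ln E = b_E\, M + \mathrm{const}, \quad b_E \simeq \tfrac{3}{2}\ln 10 \approx 3.45 .
        % A power-law energy distribution with exponent r then maps onto an exponential
        % magnitude distribution with coefficient
        \beta \;=\; r\, b_E \;\approx\; 1.15 \quad \text{for } r = 1/3 ,

    which is close to the value 1.17 quoted above; the small difference reflects the precise energy-magnitude coefficient adopted by the author.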

  5. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  6. How complete is the ISC-GEM Global Earthquake Catalog?

    Science.gov (United States)

    Michael, Andrew J.

    2014-01-01

    The International Seismological Centre, in collaboration with the Global Earthquake Model effort, has released a new global earthquake catalog, covering the time period from 1900 through the end of 2009. In order to use this catalog for global earthquake studies, I determined the magnitude of completeness (Mc) as a function of time by dividing the earthquakes shallower than 60 km into 7 time periods based on major changes in catalog processing and data availability and applying 4 objective methods to determine Mc, with uncertainties determined by non-parametric bootstrapping. Deeper events were divided into 2 time periods. Due to differences between the 4 methods, the final Mc was determined subjectively by examining the features that each method focused on in both the cumulative and binned magnitude frequency distributions. The time periods and Mc values for shallow events are: 1900-1917, Mc=7.7; 1918-1939, Mc=7.0; 1940-1954, Mc=6.8; 1955-1963, Mc=6.5; 1964-1975, Mc=6.0; 1976-2003, Mc=5.8; and 2004-2009, Mc=5.7. Using these Mc values for the longest time periods they are valid for (e.g. 1918-2009, 1940-2009, …), the shallow data fit a Gutenberg-Richter distribution with b=1.05 and a=8.3, within 1 standard deviation, with no declustering. The exception is for time periods that include 1900-1917, in which there are only 33 events with M≥ Mc, and for those few data b=2.15±0.46. That result calls for further investigation of this time period, ideally with a larger number of earthquakes. For deep events, the results are Mc=7.1 for 1900-1963, although the early data are problematic; and Mc=5.7 for 1964-2009. For that latter time period, b=0.99 and a=7.3.
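
    A compact way to reproduce the kind of b-value estimate quoted above is the Aki (1965) maximum-likelihood formula, applied to events at or above Mc with Utsu's correction for magnitude binning, together with a non-parametric bootstrap for the uncertainty. The snippet below is a generic illustration on a synthetic catalog, not the author's processing chain.

        # Maximum-likelihood b-value (Aki, 1965) with Utsu's correction for 0.1-unit
        # magnitude binning, plus a simple non-parametric bootstrap of its uncertainty.
        # Generic illustration on a synthetic catalog, not the paper's workflow.
        import numpy as np

        def b_value(mags, mc, dm=0.1):
            m = np.asarray(mags)
            m = m[m >= mc]
            return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

        def b_value_bootstrap(mags, mc, dm=0.1, n_boot=1000, seed=0):
            rng = np.random.default_rng(seed)
            m = np.asarray(mags)
            m = m[m >= mc]
            boots = [b_value(rng.choice(m, size=m.size, replace=True), mc, dm)
                     for _ in range(n_boot)]
            return b_value(m, mc, dm), float(np.std(boots))

        # Synthetic Gutenberg-Richter magnitudes with true b = 1 and Mc = 5.7:
        rng = np.random.default_rng(1)
        mags = np.round(5.7 + rng.exponential(scale=np.log10(np.e), size=2000), 1)
        b, b_err = b_value_bootstrap(mags, mc=5.7)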

  7. The Need for More Earthquake Science in Southeast Asia

    Science.gov (United States)

    Sieh, K.

    2015-12-01

    Many regions within SE Asia have as great a density of active seismic structures as does the western US - Sumatra, Myanmar, Bangladesh, New Guinea and the Philippines come first to mind. Much of Earth's release of seismic energy in the current millennium has, in fact, come from these regions, with great losses of life and livelihoods. Unfortunately, the scientific progress upon which seismic-risk reduction in SE Asia ultimately depends has been and continues to be slow. Last year at AGU, for example, I counted 57 talks about the M6 Napa earthquake. In contrast, I can't recall hearing any talk on a SE Asian M6 earthquake at any venue in the past many years. In fact, even M7+ earthquakes often go unstudied. Not uncommonly, the region's earthquake scientists face high financial and political impediments to conducting earthquake research. Their slow speed in the development of scientific knowledge doesn't bode well for speedy progress in the science of seismic hazards, the sine qua non for substantially reducing seismic risk. There are two basic necessities for the region to evolve significantly from the current state of affairs. Both involve the development of regional infrastructure: 1) Data: Robust and accessible geophysical monitoring systems would need to be installed, maintained and utilized by the region's earth scientists and their results shared internationally. Concomitantly, geological mapping (sensu lato) would need to be undertaken. 2) People: The training, employment, and enduring support of a new, young, international corps of earth scientists would need to accelerate markedly. The United States could play an important role in achieving the goal of significant seismic risk reduction in the most seismically active countries of SE Asia by taking the lead in establishing a coalition to robustly fund a multi-decadal program that supports scientists and their research institutions to work alongside local expertise.

  8. Geochemical variation of groundwater in the Abruzzi region: earthquakes related signals?

    Science.gov (United States)

    Cardellini, C.; Chiodini, G.; Caliro, S.; Frondini, F.; Avino, R.; Minopoli, C.; Morgantini, N.

    2009-12-01

    The presence of a deep and inorganic source of CO2 has recently been recognized in Italy on the basis of the deeply derived carbon dissolved in groundwater. In particular, the regional map of CO2 Earth degassing shows that two large degassing structures affect the Tyrrhenian side of the Italian peninsula. The northern degassing structure (TRDS, Tuscan Roman degassing structure) includes Tuscany, Latium and part of the Umbria region (~30000 km2) and releases > 6.1 Mt/y of deeply derived CO2. The southern degassing structure (CDS, Campanian degassing structure) affects the Campania region (~10000 km2) and releases > 3.1 Mt/y of deeply derived CO2. The total CO2 released by the TRDS and CDS (> 9.2 Mt/y) is globally significant, being ~10% of the estimated present-day total CO2 discharge from subaerial volcanoes of the Earth. The comparison between the map of CO2 Earth degassing and the locations of Italian earthquakes highlights that the anomalous CO2 flux suddenly disappears in the Apennines along a narrow band where most of the seismicity concentrates. A previous conceptual model proposed that in this area, at the eastern borders of the TRDS and CDS plumes, the CO2 from the mantle wedge intrudes the crust and accumulates in structural traps, generating over-pressurized reservoirs. These CO2 over-pressurized levels can play a major role in triggering the Apennine earthquakes, by reducing fault strength and potentially controlling the nucleation, arrest, and recurrence of both micro and major (M>5) earthquakes. The 2009 Abruzzo earthquakes, like previous seismic crises in the Northern Apennines, occurred at the border of the TRDS, suggesting also in this case a possible role played by deeply derived fluids in earthquake generation. In order to investigate this process, detailed hydro-geochemical campaigns were started immediately after the main shock of the 6th of April 2009. The surveys include the main springs of the area, which were previously studied in

  9. Risk evaluation method for faults by engineering approach. (1) Nuclear safety for accident scenario and measures for fault movement

    International Nuclear Information System (INIS)

    Narabayashi, Tadashi; Chiba, Go; Okamoto, Koji; Kameda, Hiroyuki; Ebisawa, Katsumi; Yamazaki, Haruo; Konagai, Kazuo; Kamiya, Masanobu; Nagasawa, Kazuyuki

    2016-01-01

    Japan, as a country with frequent earthquakes, has a responsibility to develop efficient measures to enhance nuclear safety and to continue utilizing nuclear power, based on risks and importance levels evaluated in a scientific and rational manner. This paper describes how to evaluate the risk of fault movement by an engineering approach. An open and fruitful discussion is needed among experts in the various areas of earthquake science, geology, geotechnical engineering, civil engineering, and seismic design, as well as other stakeholders such as university professors, nuclear reactor engineers, regulators, and licensees. The Atomic Energy Society established an Investigation Committee on Development of Activity and Risk Evaluation Method for Faults by Engineering Approach (IC-DAREFEA) on October 1st, 2014. The Investigation Committee utilizes the most advanced scientific and rational judgement, and continuous discussion and effort in the global field, in order to collect and organize this knowledge and reflect global standards and nuclear regulations, such as risk evaluation methods for fault movement and prevention of severe accidents, based on the accumulated worldwide database, including the Chuetsu-oki Earthquake, the North Nagano Earthquake and the Kumamoto Earthquake. (author)

  10. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    Science.gov (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  11. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    Science.gov (United States)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. It is concluded that researchers need to pay more attention to children and adolescents. The government should pay more attention to these people and provide more economic support.

  12. Long-Term Fault Memory: A New Time-Dependent Recurrence Model for Large Earthquake Clusters on Plate Boundaries

    Science.gov (United States)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.; Campbell, M. R.

    2017-12-01

    A challenge for earthquake hazard assessment is that geologic records often show large earthquakes occurring in temporal clusters separated by periods of quiescence. For example, in Cascadia, a paleoseismic record going back 10,000 years shows four to five clusters separated by approximately 1,000 year gaps. If we are still in the cluster that began 1700 years ago, a large earthquake is likely to happen soon. If the cluster has ended, a great earthquake is less likely. For a Gaussian distribution of recurrence times, the probability of an earthquake in the next 50 years is six times larger if we are still in the most recent cluster. Earthquake hazard assessments typically employ one of two recurrence models, neither of which directly incorporate clustering. In one, earthquake probability is time-independent and modeled as Poissonian, so an earthquake is equally likely at any time. The fault has no "memory" because when a prior earthquake occurred has no bearing on when the next will occur. The other common model is a time-dependent earthquake cycle in which the probability of an earthquake increases with time until one happens, after which the probability resets to zero. Because the probability is reset after each earthquake, the fault "remembers" only the last earthquake. This approach can be used with any assumed probability density function for recurrence times. We propose an alternative, Long-Term Fault Memory (LTFM), a modified earthquake cycle model where the probability of an earthquake increases with time until one happens, after which it decreases, but not necessarily to zero. Hence the probability of the next earthquake depends on the fault's history over multiple cycles, giving "long-term memory". Physically, this reflects an earthquake releasing only part of the elastic strain stored on the fault. We use the LTFM to simulate earthquake clustering along the San Andreas Fault and Cascadia. In some portions of the simulated earthquake history, events would
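
    The essential mechanism of the LTFM, in which each event releases only part of the accumulated strain so that the probability of the next earthquake does not reset to zero, can be caricatured in a few lines of simulation. The sketch below is a schematic reading of the model description with arbitrary parameter values, not the authors' implementation; clustering emerges as runs of short intervals separated by long quiescent gaps.

        # Caricature of a Long-Term-Fault-Memory-style recurrence model: strain loads
        # at a constant rate, the annual event probability grows with stored strain,
        # and each event releases only a random fraction of that strain, so probability
        # does not reset to zero.  All parameter values are arbitrary illustrations.
        import numpy as np

        rng = np.random.default_rng(42)
        loading_rate = 1.0            # strain units per year (arbitrary)
        prob_scale = 2.0e-4           # converts stored strain to annual probability
        years = 20000

        strain, event_years = 0.0, []
        for year in range(years):
            strain += loading_rate
            if rng.random() < min(1.0, prob_scale * strain):
                event_years.append(year)
                strain *= rng.uniform(0.2, 0.8)   # partial release -> long-term memory

        intervals = np.diff(event_years)
        # Clusters appear as runs of short intervals separated by long quiescent gaps.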

  13. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  14. Revisiting the November 27, 1945 Makran (Mw=8.2) interplate earthquake

    Science.gov (United States)

    Zarifi, Z.; Raeesi, M.

    2012-04-01

    The Makran Subduction Zone (MSZ) in southern Iran and southwestern Pakistan is a zone of convergence, where the remnant oceanic crust of the Arabian plate is subducting beneath the Eurasian plate at a rate of less than 30 mm/yr. The November 27, 1945 earthquake (Mw=8.2) in the eastern section of Makran was followed by a tsunami, at some points 15 meters high. More than 4000 victims and widespread devastation along the coastal areas of Pakistan, Iran, Oman and India are reported for this earthquake. We have collected the old seismograms of the 1945 earthquake and its largest following earthquake (August 5, 1947, Mw=7.3) from a number of stations around the globe. Using ISS data, we relocated these two events. We used the teleseismic body-waveform inversion code of Kikuchi and Kanamori to determine the slip distribution of these two earthquakes for the first time. The results show that the extent of rupture of the 1945 earthquake is larger than had previously been estimated in other studies. The slip distribution suggests two distinct sets of asperities with different behavior, in the west close to Pasni and in the east close to Ormara. The highest slip was obtained for an area between these two cities, which shows geological evidence of rapid uplift. To associate this behavior with the structure of the slab interface we studied the TPGA (Trench Parallel Free-air Gravity Anomaly) and TPBA (Trench Parallel Bouguer Anomaly) in the MSZ. The TPGA results do not show the expected phenomenon, namely the correlation of asperities with areas of highly negative TPGA. However, the TPBA can relate the observed slip distribution to the structure of the slab interface. Using topography and gravity profiles perpendicular to the trench and along the MSZ, we could observe the segmentation of the slab interface. This confirms that we can hardly expect the whole interface to release energy in one single megathrust earthquake. Current seismicity in the MSZ, although sparse, can fairly

  15. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
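
    One way to turn the completeness periods listed above into annual rates is to count the events of each intensity class only over the window for which that class is complete. The sketch below illustrates the bookkeeping; the completeness years follow the abstract, while the toy catalog and end year are hypothetical placeholders.

        # Annual rates of earthquakes at or above a given epicentral intensity, counting
        # each threshold only over its completeness window.  The completeness years
        # follow the abstract; the toy catalog of (year, epicentral intensity) pairs and
        # the end year are hypothetical placeholders.
        completeness_since = {5: 1878, 7: 1750, 9: 1600, 10: 1300}   # complete for I0 >= key
        catalog = [(1356, 9), (1855, 7), (1946, 7), (1964, 5), (1991, 5)]
        end_year = 1995

        rates = {}
        for i0_min, start in completeness_since.items():
            n = sum(1 for year, i0 in catalog if i0 >= i0_min and year >= start)
            rates[i0_min] = n / float(end_year - start + 1)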

  16. Effects of Fault Segmentation, Mechanical Interaction, and Structural Complexity on Earthquake-Generated Deformation

    Science.gov (United States)

    Haddad, David Elias

    2014-01-01

    Earth's topographic surface forms an interface across which the geodynamic and geomorphic engines interact. This interaction is best observed along crustal margins where topography is created by active faulting and sculpted by geomorphic processes. Crustal deformation manifests as earthquakes at centennial to millennial timescales. Given that…

  17. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    Science.gov (United States)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for us to investigate these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore effects of topographic loading and viscosity of middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on the neighboring faults or segments, accelerating the future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults, which are far apart, may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model synthetic seismic catalog and paleoseismic data, we analyze probability of earthquake migration between major faults in northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.
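
    For reference, the Coulomb stress change invoked above is conventionally defined as follows (standard definition, not specific to this model):

        \Delta\mathrm{CFS} \;=\; \Delta\tau \;+\; \mu'\,\Delta\sigma_{n}

    where Δτ is the change in shear stress resolved in the slip direction of the receiver fault, Δσn is the change in normal stress (positive for unclamping), and μ′ is the effective friction coefficient; a positive ΔCFS moves the receiver fault closer to failure.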

  18. LastQuake: a comprehensive strategy for rapid engagement of earthquake eyewitnesses, massive crowdsourcing and risk reduction

    Science.gov (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Steed, R.; Frobert, L.

    2015-12-01

    LastQuake is a smartphone app, browser add-on and the most sophisticated Twitter robot (quakebot) for earthquakes currently in operation. It fulfills eyewitnesses' needs by offering information on felt earthquakes and their effects within tens of seconds of their occurrence. Associated with an active presence on Facebook, Pinterest and on websites, this proves a very efficient engagement strategy. For example, the app was installed thousands of times after the Gorkha earthquake in Nepal. Language barriers have been erased by using visual communication; for example, felt reports are collected through a set of cartoons representing different shaking levels. Within 3 weeks of the magnitude 7.8 Gorkha earthquake, 7,000 felt reports with thousands of comments were collected related to the mainshock and tens of its aftershocks, as well as 100 informative geo-located pictures. The QuakeBot was essential in allowing us to be identified so well and to interact with those affected. LastQuake is also a risk reduction tool since it provides rapid information. Rapid information is similar to prevention, since when it does not exist, disasters can happen. When no information is available after a felt earthquake, the public blocks emergency lines by trying to find out the cause of the shaking, crowds form, potentially leading to unpredictable crowd movement, and rumors spread. In its next release LastQuake will also provide people with guidance immediately after a shaking through a number of pop-up cartoons illustrating "do/don't do" items (go to open places, do not phone emergency services except if people are injured…). LastQuake's app design is simple and intuitive and has a global audience. It benefited from a crowdfunding campaign (and the support of the Fondation MAIF) and more improvements have been planned after an online feedback campaign organized in early June with the Gorkha earthquake eyewitnesses. LastQuake is also a seismic risk reduction tool thanks to its very rapid

  19. Performance evaluation recommendations of nuclear power plants outdoor significant civil structures earthquake resistance. Performance evaluation examples

    International Nuclear Information System (INIS)

    2005-06-01

    The Japan Society of Civil Engineers updated its recommendations for the earthquake-resistance performance evaluation of significant outdoor civil structures at nuclear power plants in June 2005. Based on experimental and analytical considerations, analytical seismic models of soils for underground structures, the effects of vertical motions in time-history dynamic analysis, and the shear failure of reinforced concrete members under cyclic loading have been incorporated in the new recommendations. This document presents examples of earthquake-resistance and endurance performance evaluations of outdoor civil structures based on the revised recommendations. (T. Tanaka)

  20. Testing the accelerating moment release (AMR) hypothesis in areas of high stress

    Science.gov (United States)

    Guilhem, Aurélie; Bürgmann, Roland; Freed, Andrew M.; Ali, Syed Tabrez

    2013-11-01

    Several retrospective analyses have proposed that significant increases in moment release occurred prior to many large earthquakes of recent times. However, the finding of accelerating moment release (AMR) strongly depends on the choice of three parameters: (1) the magnitude range, (2) the area considered surrounding the events, and (3) the time period prior to the large earthquakes. Consequently, the AMR analysis has been criticized as being an a posteriori data-fitting exercise with no new predictive power. As AMR has been hypothesized to relate to changes in the state of stress around the eventual epicentre, we compare here AMR results to models of stress accumulation in California. Instead of assuming a complete stress drop on all surrounding fault segments implied by a back-slip stress lobe method, we consider that stress evolves dynamically, punctuated by the occurrence of earthquakes, and governed by the elastic and viscous properties of the lithosphere. We study the seismicity of southern California and extract events for AMR calculations following the systematic approach employed in previous studies. We present several sensitivity tests of the method, as well as grid-search analyses over the region between 1955 and 2005 using a fixed magnitude range, radius of the search area and period of time. The results are compared to the occurrence of large events and to maps of Coulomb stress changes. The Coulomb stress maps are compiled using the coseismic stress from all M > 7.0 earthquakes since 1812, their subsequent post-seismic relaxation, and the interseismic strain accumulation. We find no convincing correlation of seismicity rate changes in recent decades with areas of high stress that would support the AMR hypothesis. Furthermore, this indicates limited utility for practical earthquake hazard analysis in southern California, and possibly other regions.
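
    In AMR analyses of the kind revisited here, the quantity usually examined is the cumulative Benioff strain, fitted with a power law of the time to failure, ε(t) = A + B (tf − t)^m. The sketch below is a generic illustration of such a fit on a synthetic catalog (the main-shock time tf is assumed known, as in retrospective studies); it is not the authors' grid-search code, and all values are illustrative.

        # Cumulative Benioff strain and a power-law time-to-failure fit of the form
        # eps(t) = A + B * (t_f - t)**m, the quantity examined in AMR studies.  Generic
        # sketch on a synthetic catalog with the main-shock time t_f assumed known;
        # all values are illustrative and this is not the authors' grid-search code.
        import numpy as np

        def benioff_strain(mags):
            energy = 10.0 ** (1.5 * np.asarray(mags) + 4.8)   # energy-magnitude relation (J)
            return np.cumsum(np.sqrt(energy))

        rng = np.random.default_rng(0)
        t_f = 50.0                                            # assumed failure time (yr)
        times = np.sort(t_f * (1.0 - rng.uniform(0.0, 1.0, 60) ** 2)) * 0.98
        mags = rng.uniform(5.0, 6.5, times.size)
        eps = benioff_strain(mags)
        eps = eps / eps[-1]                                   # normalize for a stable fit

        # Grid search over the exponent m; A and B follow from linear least squares.
        best = None
        for m in np.linspace(0.1, 0.9, 81):
            X = np.column_stack([np.ones_like(times), (t_f - times) ** m])
            coeffs, *_ = np.linalg.lstsq(X, eps, rcond=None)
            misfit = float(np.sum((X @ coeffs - eps) ** 2))
            if best is None or misfit < best[0]:
                best = (misfit, m, coeffs)

        misfit, m_best, (A, B) = best
        # B < 0 with m_best < 1 indicates acceleration of the moment release toward t_f.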

  1. Study on vibration behaviors of engineered barrier system

    Energy Technology Data Exchange (ETDEWEB)

    Mikoshiba, Tadashi; Ogawa, Nobuyuki; Minowa, Chikahiro [National Research Inst. for Earth Science and Disaster Prevention, Tsukuba, Ibaraki (Japan)

    1999-02-01

    A small engineered barrier model was made and tested by vibrating it with a random wave and with real earthquake waves. The waves observed at Kamaishi (N-S, N-W), Iwate Prefecture, on September 6, 1993, and at Kobe (N-S), etc., were used as the real earthquake waves. The trial overpack showed non-linear characteristics (soft spring) when vibrated with the random wave. The pressure and acceleration of the trial overpack and the constraint container increased with increasing vibration level of the real earthquake wave. The trial overpack showed a maximum displacement of 1.7 mm and a subsidence of 16 mm. The results showed that both waves rocked the trial overpack. (S.Y.)

  2. Deterministic earthquake scenarios for the city of Sofia

    International Nuclear Information System (INIS)

    Slavov, S.; Paskaleva, I.; Kouteva, M.; Vaccari, P.; Panza, G.F.

    2002-08-01

    The city of Sofia is exposed to a high seismic risk. Macroseismic intensities in the range of VIII-X (MSK) can be expected in the city. The earthquakes that can influence the hazard at Sofia either originate beneath the city or are caused by seismic sources located within a radius of 40 km. The city of Sofia is also exposed to the remote Vrancea seismic zone in Romania, to which the long-period elements of the built environment are particularly vulnerable. The high seismic risk and the lack of instrumental recordings of the regional seismicity make it necessary to use appropriate credible earthquake scenarios and ground motion modelling approaches for defining the seismic input for the city of Sofia. Complete synthetic seismic signals, due to several earthquake scenarios, were computed along chosen geological profiles crossing the city, applying a hybrid technique based on the modal summation technique and finite differences. The modelling takes into account simultaneously the geotechnical properties of the site, the position and geometry of the seismic source and the mechanical properties of the propagation medium. Acceleration, velocity and displacement time histories and related quantities of earthquake engineering interest (e.g. response spectra, ground motion amplification along the profiles) have been supplied. The approach applied in this study allows us to obtain the definition of the seismic input at low cost, exploiting large quantities of existing data (e.g. geotechnical, geological, seismological). It may be efficiently used to estimate the ground motion for the purposes of microzonation, urban planning, retrofitting or insurance of the built environment, etc. (author)

  3. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    Science.gov (United States)

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive the risk of earthquake and the relationship of past earthquake experience and gender to risk perception. Participants (n= 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquake. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on and intervention program with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  4. Neural tissue engineering scaffold with sustained RAPA release relieves neuropathic pain in rats.

    Science.gov (United States)

    Ding, Tan; Zhu, Chao; Kou, Zhen-Zhen; Yin, Jun-Bin; Zhang, Ting; Lu, Ya-Cheng; Wang, Li-Ying; Luo, Zhuo-Jing; Li, Yun-Qing

    2014-09-01

    The aim of this study was to investigate whether locally slow-released rapamycin (RAPA) from a bionic peripheral nerve stent reduces the incidence of neuropathic pain or mitigates the degree of pain after nerve injury. We constructed a neural tissue engineering scaffold with sustained release of RAPA to repair 20 mm defects in rat sciatic nerves. Four presurgical and postsurgical time windows were selected to monitor the changes in the expression of pain-related dorsal root ganglion (DRG) voltage-gated sodium channels 1.3 (Nav1.3), 1.7 (Nav1.7), and 1.8 (Nav1.8) through immunohistochemistry (IHC) and Western blot, along with the observation of postsurgical pathological pain in rats by pain-related behavioral approaches. Relatively small upregulation of DRG sodium channels was observed in the experimental group (RAPA+poly(lactic-co-glycolic acid) (PLGA)+stent) after surgery, along with low degrees of neuropathic pain and anxiety, which were similar to those in the autologous nerve graft group. The results suggest that the autoimmune inflammatory response plays a leading role in the occurrence of post-traumatic neuropathic pain, and that RAPA significantly inhibits the abnormal upregulation of sodium channels to reduce pain by alleviating the inflammatory response. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Expanding the Delivery of Rapid Earthquake Information and Warnings for Response and Recovery

    Science.gov (United States)

    Blanpied, M. L.; McBride, S.; Hardebeck, J.; Michael, A. J.; van der Elst, N.

    2017-12-01

    Scientific organizations like the United States Geological Survey (USGS) release information to support effective responses during an earthquake crisis. Information is delivered to the White House, the National Command Center, the Departments of Defense, Homeland Security (including FEMA), Transportation, Energy, and Interior. Other crucial stakeholders include state officials and decision makers, emergency responders, numerous public and private infrastructure management centers (e.g., highways, railroads and pipelines), the media, and the public. To meet the diverse information requirements of these users, rapid earthquake notifications have been developed to be delivered by e-mail and text message, as well as a suite of earthquake information resources such as ShakeMaps, Did You Feel It?, PAGER impact estimates, and data are delivered via the web. The ShakeAlert earthquake early warning system being developed for the U.S. West Coast will identify and characterize an earthquake a few seconds after it begins, estimate the likely intensity of ground shaking, and deliver brief but critically important warnings to people and infrastructure in harm's way. Currently the USGS is also developing a capability to deliver Operational Earthquake Forecasts (OEF). These provide estimates of potential seismic behavior after large earthquakes and during evolving aftershock sequences. Similar work is underway in New Zealand, Japan, and Italy. In the development of OEF forecasts, social science research conducted during these sequences indicates that aftershock forecasts are valued for a variety of reasons, from informing critical response and recovery decisions to psychologically preparing for more earthquakes. New tools will allow users to customize map-based, spatiotemporal forecasts to their specific needs. Hazard curves and other advanced information will also be available. For such authoritative information to be understood and used during the pressures of an earthquake

  6. An Earthquake Swarm Search Implemented at Major Convergent Margins to Test for Associated Aseismic Slip

    Science.gov (United States)

    Holtkamp, S. G.; Pritchard, M. E.; Lohman, R. B.; Brudzinski, M. R.

    2009-12-01

    Recent geodetic analysis indicates earthquake swarms may be associated with slow slip such that earthquakes may only represent a fraction of the moment release. To investigate this potential relationship, we have developed a manual search approach to identify earthquake swarms from a seismicity catalog. Our technique is designed to be insensitive to spatial and temporal scales and the total number of events, as seismicity rates vary in different fault zones. Our first application of this technique on globally recorded earthquakes in South America detects 35 possible swarms of varying spatial scale, with 18 in the megathrust region and 8 along the volcanic arc. Three swarms in the vicinity of the arc appear to be triggered by the Mw=8.5 2001 Peru earthquake, and are examined for possible triggering mechanisms. Coulomb stress modeling suggests that static stress changes due to the earthquake are insufficient to trigger activity, so a dynamic or secondary triggering mechanism is more likely. Volcanic swarms are often associated with ground deformation, either associated with fluid movement (e.g. dike intrusion or chamber inflation or deflation) or fault movement, although these processes are sometimes difficult to differentiate. The only swarm along the arc with sufficient geodetic data that we can process and model is near Ticsani Volcano in Peru. In this case, a swarm of events southeast of the volcano precedes a more typical earthquake sequence beneath the volcano, and evidence for deformation is found in the location of the swarm, but there is no evidence for aseismic slip. Rather, we favor a model where the swarm is associated with deflation of a magma body to the southeast that triggered the earthquake sequence by promoting movement on a fault beneath Ticsani. Since swarms on the subduction interface may indicate aseismic moment release, with a direct impact on hazard, we examine potential relations between swarms and megathrust ruptures. We find evidence that

  7. Seismogenic structures of the 2006 ML4.0 Dangan Island earthquake offshore Hong Kong

    Science.gov (United States)

    Xia, Shaohong; Cao, Jinghe; Sun, Jinlong; Lv, Jinshui; Xu, Huilong; Zhang, Xiang; Wan, Kuiyuan; Fan, Chaoyan; Zhou, Pengxiang

    2018-02-01

    The northern margin of the South China Sea, as a typical extensional continental margin, has relatively strong intraplate seismicity. Compared with the active zones of Nanao Island, Yangjiang, and Heyuan, seismicity in the Pearl River Estuary is relatively low. However, a ML4.0 earthquake in 2006 occurred near Dangan Island (DI) offshore Hong Kong, and this site was adjacent to the source of the historical M5.8 earthquake in 1874. To reveal the seismogenic mechanism of intraplate earthquakes in DI, we systematically analyzed the structural characteristics in the source area of the 2006 DI earthquake using integrated 24-channel seismic profiles, onshore-offshore wide-angle seismic tomography, and natural earthquake parameters. We ascertained the locations of NW- and NE-trending faults in the DI sea and found that the NE-trending DI fault mainly dipped southeast at a high angle and cut through the crust with an obvious low-velocity anomaly. The NW-trending fault dipped southwest with a similar high angle. The 2006 DI earthquake was adjacent to the intersection of the NE- and NW-trending faults, which suggested that the intersection of the two faults with different strikes could provide a favorable condition for the generation and triggering of intraplate earthquakes. Crustal velocity model showed that the high-velocity anomaly was imaged in the west of DI, but a distinct entity with low-velocity anomaly in the upper crust and high-velocity anomaly in the lower crust was found in the south of DI. Both the 1874 and 2006 DI earthquakes occurred along the edge of the distinct entity. Two vertical cross-sections nearly perpendicular to the strikes of the intersecting faults revealed good spatial correlations between the 2006 DI earthquake and the low to high speed transition in the distinct entity. This result indicated that the transitional zone might be a weakly structural body that can store strain energy and release it as a brittle failure, resulting in an earthquake

  8. Long‐term creep rates on the Hayward Fault: evidence for controls on the size and frequency of large earthquakes

    Science.gov (United States)

    Lienkaemper, James J.; McFarland, Forrest S.; Simpson, Robert W.; Bilham, Roger; Ponce, David A.; Boatwright, John; Caskey, S. John

    2012-01-01

    The Hayward fault (HF) in California exhibits large (Mw 6.5–7.1) earthquakes with short recurrence times (161±65 yr), probably kept short by a 26%–78% aseismic release rate (including postseismic). Its interseismic release rate varies locally over time, as we infer from many decades of surface creep data. Earliest estimates of creep rate, primarily from infrequent surveys of offset cultural features, revealed distinct spatial variation in rates along the fault, but no detectable temporal variation. Since the 1989 Mw 6.9 Loma Prieta earthquake (LPE), monitoring on 32 alinement arrays and 5 creepmeters has greatly improved the spatial and temporal resolution of creep rate. We now identify significant temporal variations, mostly associated with local and regional earthquakes. The largest rate change was a 6‐yr cessation of creep along a 5‐km length near the south end of the HF, attributed to a regional stress drop from the LPE, ending in 1996 with a 2‐cm creep event. North of there near Union City starting in 1991, rates apparently increased by 25% above pre‐LPE levels on a 16‐km‐long reach of the fault. Near Oakland in 2007 an Mw 4.2 earthquake initiated a 1–2 cm creep event extending 10–15 km along the fault. Using new better‐constrained long‐term creep rates, we updated earlier estimates of depth to locking along the HF. The locking depths outline a single, ∼50‐km‐long locked or retarded patch with the potential for an Mw∼6.8 event equaling the 1868 HF earthquake. We propose that this inferred patch regulates the size and frequency of large earthquakes on HF.
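
    The magnitude quoted for the inferred locked patch can be checked with the standard seismic-moment relations M0 = μAD and Mw = (2/3)(log10 M0 − 9.05) for M0 in N·m. The numbers in the sketch below are rough illustrative assumptions (nominal rigidity, down-dip width, and slip), not values from the study; only the ~50 km patch length comes from the abstract.

        # Back-of-envelope moment magnitude for a locked patch: M0 = mu * A * D and
        # Mw = (2/3) * (log10(M0) - 9.05), with M0 in N*m.  Rigidity, down-dip width
        # and slip are rough illustrative assumptions; only the ~50 km patch length
        # comes from the abstract.
        import math

        mu = 3.0e10        # Pa, nominal crustal rigidity (assumed)
        length = 50.0e3    # m, locked-patch length from the abstract
        width = 10.0e3     # m, assumed down-dip width
        slip = 1.0         # m, assumed coseismic slip

        M0 = mu * length * width * slip
        Mw = (2.0 / 3.0) * (math.log10(M0) - 9.05)
        # These assumptions give M0 ~ 1.5e19 N*m and Mw ~ 6.8, comparable to the
        # magnitude quoted for the 1868 Hayward earthquake.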

  9. A Geodynamic Study of Active Crustal Deformation and Earthquakes in North China

    Science.gov (United States)

    Yang, Y.; Liu, M.

    2005-12-01

    North China is part of the Archaean Sino-Korean craton, yet today it is a region of intense crustal deformation and earthquakes, including 21 M >=7.0 events since 512 AD. More than half of the large events occurred within the Fen-Wei rift system surrounding the stable Ordos plateau; the largest events (M >=7.3) show a sequential southward migration along the rift. However, since 1695 the Fen-Wei rift has become seismically dormant, while seismicity seems to have shifted eastward to the North China plain, marked by the 1976 Tangshan earthquake (M=7.8). We have developed a 3D viscoelastic geodynamic model to study the cause of seismicity and its spatial-temporal pattern in North China. Constrained by crustal kinematics from GPS and neotectonic data, the model shows high deviatoric stress in the North China crust, resulting mainly from compression by the expanding Tibetan Plateau and resistance from the stable Siberian block. Within North China, seismicity is largely controlled by the lateral heterogeneity of lithospheric structures, which explains the concentration of seismicity in the Fen-Wei rift. Our results show that stress triggering may have contributed to the sequential migration of large events along the rift, and the release and migration of stress and strain energy from these large events may partially explain the intense seismicity in the North China plain in the past 300 years. Comparing the predicted long-term spatial pattern of strain energy with seismic energy release provides some insights into potential earthquake risks in North China.

  10. The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment

    International Nuclear Information System (INIS)

    Coppari, S.; Di Pasquale, G.; Goretti, A.; Papa, F.; Papa, S.; Paoli, G.; Pizza, A. G.; Severino, M.

    2008-01-01

    The paper summarizes the results of the in-progress EU project titled TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo Da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitable chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad hoc manner, under stressful conditions, at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology and seismicity. This also holds for the partners participating in the project (Greece, Italy, Turkey, Cyprus), all of which come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of upgrading different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems. Finally, through wide

  11. Lessons of L'Aquila for Operational Earthquake Forecasting

    Science.gov (United States)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  12. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes

    Science.gov (United States)

    Yamada, T.; Ide, S.

    2007-12-01

    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. This relates to the question of whether the initial rupture of an earthquake carries information about its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS). It calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P-wave arrival from the value of τpmax, the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori (2003)). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) in the τpmax calculation, we analyze high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes whose source durations are longer than TW, the values of τpmax have an upper limit which depends on TW. On the other hand, the values for smaller earthquakes have a lower limit which is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final-size dependence of τpmax over a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not imply that the earthquake rupture is deterministic, because τpmax does not always have a direct relation to the physical quantities of an earthquake.
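    The predominant-period parameter discussed above is commonly computed recursively from a single record. The sketch below shows one widely used recursive form, with a smoothing constant α standing in for the fixed time window TW of the abstract; the α value, sampling rate and test signal are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def predominant_period(x, dt, alpha=0.99):
        """Recursive predominant-period estimate tau_p.

        x     : ground-motion time series (e.g. vertical velocity)
        dt    : sampling interval in seconds
        alpha : smoothing constant (a stand-in for the time window TW)
        Returns tau_p(t) and its running maximum tau_p_max.
        """
        dxdt = np.gradient(x, dt)
        X = D = 0.0
        tau_p = np.zeros_like(x)
        for i in range(len(x)):
            X = alpha * X + x[i] ** 2          # smoothed signal power
            D = alpha * D + dxdt[i] ** 2       # smoothed derivative power
            tau_p[i] = 2.0 * np.pi * np.sqrt(X / D) if D > 0 else 0.0
        return tau_p, np.maximum.accumulate(tau_p)

    # Toy check: a 2 Hz sinusoid should give tau_p close to its 0.5 s period.
    dt = 0.01
    t = np.arange(0.0, 10.0, dt)
    tau_p, tau_p_max = predominant_period(np.sin(2 * np.pi * 2.0 * t), dt)
    print(round(tau_p_max[-1], 2))
    ```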

  13. Earthquake prediction in Japan and natural time analysis of seismicity

    Science.gov (United States)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake and its huge tsunami devastated East Japan on 11 March, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither short-term nor long-term. Seismologists were shocked because it was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake. In fact, throughout several decades of the National Earthquake Prediction Project, not even a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important issue, short-term prediction. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best way to pursue short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of their history of no success, they defiantly changed their policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which meant obtaining "more funding for no-prediction research". The public were not, and are not, informed about this change. Obviously earthquake prediction will be possible only when reliable precursory phenomena are caught, and we have insisted that this would most likely be done through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake has an adverse effect on our case, although its epicenter was far offshore, out of the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, from ordinary seismological catalogs. In the framework of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced. This is the variance of natural time χ weighted by the normalised energy release of each event. In the case that Seismic Electric Signals
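    For readers unfamiliar with the order parameter mentioned above, the sketch below shows the standard natural-time bookkeeping: the k-th of N events is assigned natural time χ_k = k/N and weight p_k = E_k/ΣE, and κ1 is the p-weighted variance of χ (a value of κ1 ≈ 0.070 is often quoted as the critical value in this literature). The synthetic Gutenberg-Richter catalog below is an illustrative assumption, not a result from the presentation.

    ```python
    import numpy as np

    def kappa1(energies):
        """Order parameter kappa_1 of seismicity in natural time.

        energies : released energies (or seismic moments) of successive events
                   in chronological order.
        chi_k = k/N is the natural time of the k-th event and p_k = E_k/sum(E)
        its weight; kappa_1 = <chi^2> - <chi>^2 under the weights p_k.
        """
        e = np.asarray(energies, dtype=float)
        n = len(e)
        chi = np.arange(1, n + 1) / n
        p = e / e.sum()
        return np.sum(p * chi ** 2) - np.sum(p * chi) ** 2

    # Toy catalog: magnitudes from a Gutenberg-Richter law with b = 1,
    # converted to energies via E ~ 10**(1.5*M).
    rng = np.random.default_rng(0)
    mags = 2.0 + rng.exponential(1.0 / np.log(10), size=200)
    print(round(kappa1(10 ** (1.5 * mags)), 3))
    ```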

  14. Global life cycle releases of engineered nanomaterials

    International Nuclear Information System (INIS)

    Keller, Arturo A.; McFerran, Suzanne; Lazareva, Anastasiya; Suh, Sangwon

    2013-01-01

    Engineered nanomaterials (ENMs) are now becoming a significant fraction of the material flows in the global economy. We are already reaping the benefits of improved energy efficiency, material use reduction, and better performance in many existing and new applications that have been enabled by these technological advances. As ENMs pervade the global economy, however, it becomes important to understand their environmental implications. As a first step, we combined ENM market information and material flow modeling to produce the first global assessment of the likely ENM emissions to the environment and landfills. The top ten most produced ENMs by mass were analyzed in a dozen major applications. Emissions during the manufacturing, use, and disposal stages were estimated, including intermediate steps through wastewater treatment plants and waste incineration plants. In 2010, silica, titania, alumina, and iron and zinc oxides dominate the ENM market in terms of mass flow through the global economy, used mostly in coatings/paints/pigments, electronics and optics, cosmetics, energy and environmental applications, and as catalysts. We estimate that 63–91 % of over 260,000–309,000 metric tons of global ENM production in 2010 ended up in landfills, with the balance released into soils (8–28 %), water bodies (0.4–7 %), and atmosphere (0.1–1.5 %). While there are considerable uncertainties in the estimates, the framework for estimating emissions can be easily improved as better data become available. The material flow estimates can be used to quantify emissions at the local level, as inputs for fate and transport models to estimate concentrations in different environmental compartments.

  15. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location − longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves, the authors argue, how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events including the main event faithfully follow the straight-line curve.

  16. Tomography of the 2011 Iwaki earthquake (M 7.0) and Fukushima nuclear power plant area

    Energy Technology Data Exchange (ETDEWEB)

    Tong, P. [Tohoku Univ., Sendai (Japan). Dept. of Geophysics; Tsinghua Univ., Beijing (China). Dept. of Mathematical Sciences; Zhao, D. [Tohoku Univ., Sendai (Japan). Dept. of Geophysics; Yang, D. [Tsinghua Univ., Beijing (China). Dept. of Mathematical Sciences

    2012-07-01

    High-resolution tomographic images of the crust and upper mantle in and around the area of the 2011 Iwaki earthquake (M 7.0) and the Fukushima nuclear power plant are determined by inverting a large number of high-quality arrival times with both the finite-frequency and ray tomography methods. The Iwaki earthquake and its aftershocks mainly occurred in a boundary zone with strong variations in seismic velocity and Poisson's ratio. Prominent low-velocity and high Poisson's ratio zones are revealed under the Iwaki source area and the Fukushima nuclear power plant, which may reflect fluids released from the dehydration of the subducting Pacific slab under Northeast Japan. The 2011 Tohoku-oki earthquake (Mw 9.0) caused static stress transfer in the overriding Okhotsk plate, resulting in the seismicity in the Iwaki source area that significantly increased immediately following the Tohoku-oki main-shock. Our results suggest that the Iwaki earthquake was triggered by the ascending fluids from the Pacific slab dehydration and the stress variation induced by the Tohoku-oki main-shock. The similar structures under the Iwaki source area and the Fukushima nuclear power plant suggest that the security of the nuclear power plant site should be strengthened to withstand potential large earthquakes in the future. (orig.)

  17. Tomography of the 2011 Iwaki earthquake (M 7.0) and Fukushima nuclear power plant area

    Directory of Open Access Journals (Sweden)

    P. Tong

    2012-02-01

    Full Text Available High-resolution tomographic images of the crust and upper mantle in and around the area of the 2011 Iwaki earthquake (M 7.0) and the Fukushima nuclear power plant are determined by inverting a large number of high-quality arrival times with both the finite-frequency and ray tomography methods. The Iwaki earthquake and its aftershocks mainly occurred in a boundary zone with strong variations in seismic velocity and Poisson's ratio. Prominent low-velocity and high Poisson's ratio zones are revealed under the Iwaki source area and the Fukushima nuclear power plant, which may reflect fluids released from the dehydration of the subducting Pacific slab under Northeast Japan. The 2011 Tohoku-oki earthquake (Mw 9.0) caused static stress transfer in the overriding Okhotsk plate, resulting in the seismicity in the Iwaki source area that significantly increased immediately following the Tohoku-oki mainshock. Our results suggest that the Iwaki earthquake was triggered by the ascending fluids from the Pacific slab dehydration and the stress variation induced by the Tohoku-oki mainshock. The similar structures under the Iwaki source area and the Fukushima nuclear power plant suggest that the security of the nuclear power plant site should be strengthened to withstand potential large earthquakes in the future.

  18. Comparison of the sand liquefaction estimated based on codes and practical earthquake damage phenomena

    Science.gov (United States)

    Fang, Yi; Huang, Yahong

    2017-12-01

    Estimating sand liquefaction on the basis of design codes is an important part of geotechnical design. However, the results sometimes fail to conform to the damage observed in actual earthquakes. Based on the damage from the Tangshan earthquake and the engineering geological conditions, three typical sites were chosen. The sand liquefaction probability was then evaluated at the three sites using the method in the Code for Seismic Design of Buildings, and the results were compared with the sand liquefaction phenomena observed in the earthquake. The result shows that the difference between the code-based liquefaction estimate and the actual earthquake damage is mainly attributable to two aspects. The primary reasons include the disparity between the seismic fortification intensity and the actual seismic shaking, changes in the groundwater level, the thickness of the overlying non-liquefied soil layer, local site effects and personal error. In addition, although the judgment methods in the codes exhibit a certain universality, they are a further cause of the above difference, owing to the limitations of the basic data and qualitative anomalies in the judgment formulas.

  19. Modified n-HA/PA66 scaffolds with chitosan coating for bone tissue engineering: cell stimulation and drug release.

    Science.gov (United States)

    Zou, Qin; Li, Junfeng; Niu, Lulu; Zuo, Yi; Li, Jidong; Li, Yubao

    2017-09-01

    A dipping-drying procedure and a cross-linking method were used to apply a drug-loaded chitosan (CS) coating to a nano-hydroxyapatite/polyamide66 (nHA/PA66) composite porous scaffold, endowing the scaffold with controlled drug-release functionality. The prefabricated scaffold was immersed in an aqueous drug/CS solution under vacuum and then crosslinked with vanillin. The structure, porosity, composition, compressive strength, swelling ratio, drug release and cytocompatibility of the pristine and coated scaffolds were investigated. After coating, the scaffold porosity and pore interconnection were slightly decreased. Cytocompatibility was assessed in vitro by cell attachment and the MTT assay with MG63 cells, which revealed positive cell viability and increasing proliferation over an 11-day period. The drug could effectively release from the coated scaffold in a controlled fashion; the release rate was sustained for a long period and was highly dependent on coating swelling, suggesting the possibility of controlled drug release. Our results demonstrate that the scaffold with a drug-loaded crosslinked CS coating can be used as a simple technique to render the surfaces of synthetic scaffolds active, thus enabling them to be a promising high-performance biomaterial in bone tissue engineering.

  20. Maximum magnitude of injection-induced earthquakes: A criterion to assess the influence of pressure migration along faults

    Science.gov (United States)

    Norbeck, Jack H.; Horne, Roland N.

    2018-05-01

    The maximum expected earthquake magnitude is an important parameter in seismic hazard and risk analysis because of its strong influence on ground motion. In the context of injection-induced seismicity, the processes that control how large an earthquake will grow may be influenced by operational factors under engineering control as well as natural tectonic factors. Determining the relative influence of these effects on maximum magnitude will impact the design and implementation of induced seismicity management strategies. In this work, we apply a numerical model that considers the coupled interactions of fluid flow in faulted porous media and quasidynamic elasticity to investigate the earthquake nucleation, rupture, and arrest processes for cases of induced seismicity. We find that under certain conditions, earthquake ruptures are confined to a pressurized region along the fault with a length-scale that is set by injection operations. However, earthquakes are sometimes able to propagate as sustained ruptures outside of the zone that experienced a pressure perturbation. We propose a faulting criterion that depends primarily on the state of stress and the earthquake stress drop to characterize the transition between pressure-constrained and runaway rupture behavior.
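    The faulting criterion proposed in the abstract is not reproduced here, but the bookkeeping it builds on is the standard Coulomb failure stress change with an explicit pore-pressure term. The sketch below is a minimal illustration of that bookkeeping under stated assumptions; the friction coefficient and the stress and pressure values are hypothetical.

    ```python
    def coulomb_stress_change(d_tau, d_sigma_n, d_pore_pressure, friction=0.6):
        """Coulomb failure stress change on a fault patch (values in Pa).

        d_tau           : shear stress change in the slip direction
                          (positive promotes slip)
        d_sigma_n       : normal stress change, tension positive
                          (positive unclamps the fault)
        d_pore_pressure : pore-pressure change, e.g. from injection
                          (positive reduces effective normal stress)
        A positive result moves the patch toward failure.
        """
        return d_tau + friction * (d_sigma_n + d_pore_pressure)

    # Hypothetical patch: 0.1 MPa of shear loading, 0.05 MPa of clamping,
    # and a 0.5 MPa injection-induced pressure rise.
    print(round(coulomb_stress_change(0.1e6, -0.05e6, 0.5e6) / 1e6, 2), "MPa")
    ```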

  1. Reflections on Communicating Science during the Canterbury Earthquake Sequence of 2010-2011, New Zealand

    Science.gov (United States)

    Wein, A. M.; Berryman, K. R.; Jolly, G. E.; Brackley, H. L.; Gledhill, K. R.

    2015-12-01

    The 2010-2011 Canterbury Earthquake Sequence began with the 4th September 2010 Darfield earthquake (Mw 7.1). Perhaps because there were no deaths, the mood of the city and the government was that high standards of earthquake engineering in New Zealand protected us, and there was a confident attitude to response and recovery. The demand for science and engineering information was of interest but not seen as crucial to policy, business or the public. The 22nd February 2011 Christchurch earthquake (Mw 6.2) changed all that; there was a significant death toll and many injuries. There was widespread collapse of older unreinforced and two relatively modern multi-storey buildings, and major disruption to infrastructure. The contrast in the interest and relevance of the science could not have been greater compared to 5 months previously. Magnitude 5+ aftershocks over a 20 month period resulted in confusion, stress, an inability to define a recovery trajectory, major concerns about whether insurers and reinsurers would continue to provide cover, very high levels of media interest from New Zealand and around the world, and high levels of political risk. As the aftershocks continued there was widespread speculation as to what the future held. During the sequence, the science and engineering sector sought to coordinate and offer timely and integrated advice. However, other than GeoNet, the national geophysical monitoring network, there were few resources devoted to communication, with the result that it was almost always reactive. With hindsight we have identified the need to resource information gathering and synthesis, execute strategic assessments of stakeholder needs, undertake proactive communication, and develop specific information packages for the diversity of users. Overall this means substantially increased resources. Planning is now underway for the science sector to adopt the New Zealand standardised CIMS (Coordinated Incident Management System) structure for

  2. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  3. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  4. Urgent Safety Measures in Japan after Great East Japan Earthquake

    International Nuclear Information System (INIS)

    Taniura, Wataru; Otani, Hiroyasu

    2012-01-01

    Due to the tsunami triggered by the Great East Japan Earthquake, the operating and refueling reactor facilities at the Fukushima Dai-ichi and Dai-ni Nuclear Power Plants suffered a nuclear hazard. In response, Japanese electric power companies voluntarily began to compile various urgent measures against tsunami. The Nuclear and Industrial Safety Agency (NISA) then ordered the licensees to put the voluntarily compiled urgent safety measures into practice, in order to ensure the effectiveness of the means for recovering cooling functions and to keep the release of radioactive substances to the minimum possible, even if a huge tsunami following a severe earthquake hits a nuclear power plant. The following describes the state and the effect of the urgent safety measures implemented for 44 reactors (under operation) and 1 reactor (under construction) in Japan, and also describes the measures to be implemented by the licensees of reactor operation in the future.

  5. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  6. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    Science.gov (United States)

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and

  7. Recent applications for rapid estimation of earthquake shaking and losses with ELER Software

    International Nuclear Information System (INIS)

    Demircioglu, M.B.; Erdik, M.; Kamer, Y.; Sesetyan, K.; Tuzun, C.

    2012-01-01

    A methodology and software package entitled Earthquake Loss Estimation Routine (ELER) was developed for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region. The work was carried out under the Joint Research Activity-3 (JRA3) of the EC FP6 project entitled Network of Research Infrastructures for European Seismology (NERIES). The ELER methodology involves: 1) finding the most likely location of the source of the earthquake using a regional seismo-tectonic database; 2) estimating the spatial distribution of selected ground motion parameters at engineering bedrock through region-specific ground motion prediction models, bias-correcting the ground motion estimates with strong ground motion data, if available; 3) estimating the spatial distribution of site-corrected ground motion parameters using a regional geology database and appropriate amplification models; and 4) estimating the losses and uncertainties at various orders of sophistication (buildings, casualties). The multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and the sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships coded into ELER. The present paper provides brief information on the methodology of ELER and an example application to the recent major earthquake that hit the Van province in the east of Turkey on 23 October 2011 with moment magnitude (Mw) 7.2. For this earthquake, Kandilli Observatory and Earthquake Research Institute (KOERI) provided almost real-time estimates of building damage and casualty distribution using ELER. (author)
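    The chain of steps 2)–4) above can be illustrated with a toy calculation: predict bedrock shaking, apply a site amplification factor, and pass the result through a fragility curve. The coefficients, amplification factor and fragility parameters below are placeholders chosen for illustration only; they are not the region-specific models or vulnerability relationships actually coded into ELER.

    ```python
    import math

    def bedrock_pga(magnitude, distance_km):
        """Toy ground-motion prediction equation (illustrative coefficients)."""
        log_pga = -2.5 + 0.5 * magnitude - 1.2 * math.log10(distance_km + 10.0)
        return 10 ** log_pga                       # PGA in g

    def site_corrected(pga_rock, amplification=1.4):
        """Apply a site amplification factor taken from a geology database."""
        return pga_rock * amplification

    def damage_probability(pga, median_g=0.35, beta=0.6):
        """Lognormal fragility curve: P(damage state exceeded | PGA)."""
        z = (math.log(pga) - math.log(median_g)) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    # Illustrative run for an Mw 7.2 event at 30 km on amplifying soil.
    pga = site_corrected(bedrock_pga(7.2, 30.0))
    buildings = 10_000
    print(f"PGA ~ {pga:.2f} g, expected damaged buildings ~ "
          f"{buildings * damage_probability(pga):.0f}")
    ```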

  8. Causes of earthquake spatial distribution beneath the Izu-Bonin-Mariana Arc

    Science.gov (United States)

    Kong, Xiangchao; Li, Sanzhong; Wang, Yongming; Suo, Yanhui; Dai, Liming; Géli, Louis; Zhang, Yong; Guo, Lingli; Wang, Pengcheng

    2018-01-01

    Statistics on the occurrence frequency of earthquakes (1973-2015) at shallow, intermediate and great depths along the Izu-Bonin-Mariana (IBM) Arc are presented, and a percent perturbation relative to the P-wave mean value (LLNL-G3Dv3) is adopted to show the deep structure. The correlation coefficient between the subduction rate and the frequency of shallow seismic events along the IBM is 0.605, indicating that the subduction rate is an important factor for shallow seismic events. The relationship between relief amplitudes of the seafloor and earthquake occurrences implies that some seamount chains riding on the Pacific seafloor may have an effect on intermediate-depth seismic events along the IBM. A plausible hypothesis is proposed: seamounts, or surrounding seafloor with a high degree of fracturing, may carry abundant hydrous minerals to depth and may result in a thermal structure different from that of seafloor where no seamounts are subducted. Fluids from the seamounts or surrounding seafloor are released and trigger earthquakes at intermediate depth. Deep events in the northern and southern Mariana arc are likely affected by a horizontally propagating tear parallel to the trench.

  9. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    Science.gov (United States)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that the apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw 5.8 assuming an average stress drop; the data are also consistent with Mw 6.4 if the stress drop was a factor of ≈3 lower than average for California earthquakes. I

  10. International Aftershock Forecasting: Lessons from the Gorkha Earthquake

    Science.gov (United States)

    Michael, A. J.; Blanpied, M. L.; Brady, S. R.; van der Elst, N.; Hardebeck, J.; Mayberry, G. C.; Page, M. T.; Smoczyk, G. M.; Wein, A. M.

    2015-12-01

    Following the M7.8 Gorkha, Nepal, earthquake of April 25, 2015, the USGS issued a series of aftershock forecasts. The initial impetus for these forecasts was a request from the USAID Office of US Foreign Disaster Assistance to support their Disaster Assistance Response Team (DART), which coordinated US Government disaster response, including search and rescue, with the Government of Nepal. Because of the possible utility of the forecasts to people in the region and other response teams, the USGS released these forecasts publicly through the USGS Earthquake Program web site. The initial forecast used the Reasenberg and Jones (Science, 1989) model with generic parameters developed for active deep continental regions based on the Garcia et al. (BSSA, 2012) tectonic regionalization. These were then updated to reflect a lower productivity and higher decay rate based on the observed aftershocks, although relying on teleseismic observations, with a high magnitude-of-completeness, limited the amount of data. After the 12 May M7.3 aftershock, the forecasts used an Epidemic Type Aftershock Sequence model to better characterize the multiple sources of earthquake clustering. This model provided better estimates of aftershock uncertainty. These forecast messages were crafted based on lessons learned from the Christchurch earthquake along with input from the U.S. Embassy staff in Kathmandu. Challenges included how to balance simple messaging with forecasts over a variety of time periods (week, month, and year), whether to characterize probabilities with words such as those suggested by the IPCC (IPCC, 2010), how to word the messages in a way that would translate accurately into Nepali and not alarm the public, and how to present the probabilities of unlikely but possible large and potentially damaging aftershocks, such as the M7.3 event, which had an estimated probability of only 1-in-200 for the week in which it occurred.
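    The Reasenberg and Jones (1989) model mentioned above combines Gutenberg-Richter and Omori-Utsu scaling into an aftershock rate that can be integrated to give the probability of at least one aftershock above a chosen magnitude in a forecast window. The sketch below uses commonly cited generic parameter values purely as placeholders; the Gorkha forecasts used regionalised parameters that were later updated to the observed sequence.

    ```python
    import math

    def expected_count(t1, t2, mainshock_mag, mag,
                       a=-1.67, b=0.91, c=0.05, p=1.08):
        """Expected number of aftershocks with M >= mag in [t1, t2] days after
        the mainshock, from the Reasenberg-Jones rate
            lambda(t, M) = 10**(a + b*(M_main - M)) * (t + c)**(-p)
        integrated in closed form (valid for p != 1)."""
        k = 10 ** (a + b * (mainshock_mag - mag))
        return k * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)

    def prob_at_least_one(t1, t2, mainshock_mag, mag, **params):
        """P(at least one such aftershock), treating the sequence as a
        non-stationary Poisson process."""
        return 1.0 - math.exp(-expected_count(t1, t2, mainshock_mag, mag, **params))

    # Illustrative one-week forecast of M >= 7 aftershocks after an M7.8 mainshock.
    print(round(prob_at_least_one(0.01, 7.0, 7.8, 7.0), 3))
    ```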

  11. Characterization of Aftershock Sequences from Large Strike-Slip Earthquakes Along Geometrically Complex Faults

    Science.gov (United States)

    Sexton, E.; Thomas, A.; Delbridge, B. G.

    2017-12-01

    Large earthquakes often exhibit complex slip distributions and occur along non-planar fault geometries, resulting in variable stress changes throughout the region of the fault hosting aftershocks. To better discern the role of geometric discontinuities in aftershock sequences, we compare areas of enhanced and reduced Coulomb failure stress and mean stress for systematic differences in the time dependence and productivity of these aftershock sequences. In strike-slip faults, releasing structures, including stepovers and bends, experience an increase in both Coulomb failure stress and mean stress during an earthquake, promoting fluid diffusion into the region and further failure. Conversely, Coulomb failure stress and mean stress decrease in restraining bends and stepovers in strike-slip faults, and fluids diffuse away from these areas, discouraging failure. We examine spatial differences in seismicity patterns along structurally complex strike-slip faults which have hosted large earthquakes, such as the 1992 Mw 7.3 Landers, the 2010 Mw 7.2 El Mayor-Cucapah, the 2014 Mw 6.0 South Napa, and the 2016 Mw 7.0 Kumamoto events. We characterize the behavior of these aftershock sequences with the Epidemic Type Aftershock-Sequence (ETAS) model. In this statistical model, the total occurrence rate of aftershocks induced by an earthquake is λ(t) = λ_0 + \sum_{i:t_i
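    The intensity expression above is cut off in the record; the standard temporal ETAS form it corresponds to is λ(t) = λ_0 + Σ_{i: t_i < t} K·exp(α(M_i − M_ref))·(t − t_i + c)^(−p). The sketch below evaluates that standard form; the parameter values, and the assumption that this matches the study's exact parameterisation, are mine rather than the authors'.

    ```python
    import numpy as np

    def etas_rate(t, event_times, event_mags, mu=0.1, K=0.05,
                  alpha=1.0, c=0.01, p=1.1, m_ref=3.0):
        """Temporal ETAS conditional intensity (events/day):
            lambda(t) = mu + sum_{i: t_i < t} K * exp(alpha*(M_i - m_ref))
                                                * (t - t_i + c)**(-p)
        Times are in days; the parameters here are illustrative placeholders."""
        t_i = np.asarray(event_times, dtype=float)
        m_i = np.asarray(event_mags, dtype=float)
        past = t_i < t
        dt = t - t_i[past]
        return mu + np.sum(K * np.exp(alpha * (m_i[past] - m_ref)) * (dt + c) ** (-p))

    # Toy catalog: a mainshock at t = 0 days and one large aftershock at t = 2 days.
    print(round(etas_rate(3.0, [0.0, 2.0], [7.3, 6.0]), 2))
    ```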

  12. Conventional estimating method of earthquake response of mechanical appendage system

    International Nuclear Information System (INIS)

    Aoki, Shigeru; Suzuki, Kohei

    1981-01-01

    Generally, the earthquake response of an appendage system installed in a main structure has been estimated by floor response analysis using the response spectrum at the point where the appendage system is installed. Research has also been reported on estimating the earthquake response of appendage systems by statistical procedures based on stochastic process theory. The development of a practical method for simply estimating the response is an important subject in aseismatic engineering. This study investigated a method for estimating the earthquake response of an appendage system in the general case where the natural frequencies of the two structural systems differ. First, it was shown that the floor response amplification factor (FRAF) can be estimated simply from the ratio of the natural frequencies of the two systems, and its statistical properties were clarified. Next, it was shown that the procedure of expressing acceleration, velocity and displacement responses simultaneously with tri-axial response spectra can also be applied to the FRAF. The applicability of this procedure to nonlinear systems was examined. (Kako, I.)

  13. Earthquakes, May-June 1991

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period: a magnitude 7.1 event in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  14. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  15. The role of INGVterremoti blog in information management during the earthquake sequence in central Italy

    Directory of Open Access Journals (Sweden)

    Maurizio Pignone

    2017-01-01

    Full Text Available In this paper, we describe the role of the INGVterremoti blog in information management during the first part of the earthquake sequence in central Italy (August 24 to September 30). In the last four years, we have been working on the INGVterremoti blog in order to provide quick updates on ongoing seismic activity in Italy and in-depth scientific information. These include articles on specific historical earthquakes, seismic hazard, geological interpretations, source models from different types of data, effects at the surface, and so on. We have also delivered information in quasi-real-time about all the recent magnitude M≥4.0 earthquakes in Italy and the strongest events in the Mediterranean and in the world. During the 2016 central Italy sequence, the INGVterremoti blog has continuously released information about the seismic activity with three types of posts: (i) updates on the ongoing seismic activity; (ii) reports on the activities carried out by the INGV teams in the field and any other working groups; (iii) in-depth scientific articles describing specific analyses and results. All the blog posts have been shared automatically and in real time on the other social media of the INGVterremoti platform, also to counter misinformation and fight rumors. These include Facebook, Twitter and the INGVterremoti App on iOS and Android. As well, both the main INGV home page (http://www.ingv.it) and the INGV earthquake portal (http://terremoti.ingv.it) have published the contents of the blog on dedicated pages that were fed automatically. The work done day by day on the INGVterremoti blog has been coordinated with the INGV Press Office, which has written several press releases based on the contents of the blog. Since August 24, 53 articles have been published on the blog; they have had more than 1.9 million views and 1 million visitors. The peak in the number of views, more than 800,000 in a single day, was registered on August 24, 2016, following the M 6

  16. Dealing with uncertainty in Earthquake Engineering: a discussion on the application of the Theory of Open Dynamical Systems

    OpenAIRE

    Quintana-Gallo, Patricio; Rebolledo, Rolando; Allan, George

    2013-01-01

    Earthquakes, as a natural phenomenon, and their consequences for structures have been addressed from deterministic, pseudo-empirical and primary statistical-probabilistic points of view. In the latter approach, 'primary' is meant to suggest that randomness has been artificially introduced into the variables of investigation. An alternative view has been advanced by a number of researchers who have classified earthquakes as chaotic from an ontological perspective. Their arguments are founded ...

  17. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved the locations of the events and recalculated moment magnitudes in order to obtain a unified magnitude

  18. Gambling scores for earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, namely the gambling score, for evaluating the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
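    A minimal sketch of the scoring idea for binary alarms is given below. It assumes the "fair rule" takes the usual form in which a correct bet of r points against a reference probability p0 pays r(1 − p0)/p0 and a wrong bet loses the r points wagered; this reading of the payoff, and the numbers in the example, are assumptions for illustration rather than the paper's exact formulation.

    ```python
    def gambling_score(bets):
        """Cumulative gambling score for a set of binary earthquake forecasts.

        Each bet is (r, p0, occurred):
          r        : reputation points wagered that the target event WILL occur
          p0       : probability of the event under the reference model
                     (the 'house', e.g. a Poisson model)
          occurred : True if the target earthquake actually happened
        A correct bet pays r*(1 - p0)/p0, so success against an unlikely
        reference event is rewarded strongly; a wrong bet loses the r points.
        """
        score = 0.0
        for r, p0, occurred in bets:
            score += r * (1.0 - p0) / p0 if occurred else -r
        return score

    # Three hypothetical alarms with reference probabilities 10%, 20% and 5%;
    # only the first two verified.
    print(round(gambling_score([(1.0, 0.10, True),
                                (1.0, 0.20, True),
                                (1.0, 0.05, False)]), 2))
    ```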

  19. Estimated airborne release of plutonium from Atomics International's Nuclear Materials Development Facility in the Santa Susana site, California, as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.

    1981-09-01

    The potential mass of airborne releases of plutonium (source term) that could result from wind and seismic damage is estimated for the Atomics International Company's Nuclear Materials Development Facility (NMDF) at the Santa Susana site in California. The postulated source terms will be useful as the basis for estimating the potential dose, by inhalation, to the maximum exposed individual and to the total population living within a prescribed radius of the site. The respirable fraction of airborne particles is thus the principal concern. The estimated source terms are based on the damage ratio and on the potential airborne releases if all enclosures suffer particular levels of damage. In an attempt to provide a realistic range of potential source terms that includes most normal processing conditions, a best estimate bounded by upper and lower limits is provided. The range of source terms is calculated by combining high, best-estimate, and low damage ratios, based on the fraction of enclosures suffering crush or perforation, with the airborne release from enclosures based upon upper-limit, average, and lower-limit inventories of dispersible materials at risk. Two throughput levels are considered. The factors used to evaluate the fractional airborne release of materials and the exchange rates between enclosed and exterior atmospheres are discussed. The postulated damage and source terms are discussed for wind and earthquake hazard scenarios in order of their increasing severity.
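    The kind of bookkeeping described above, in which damage ratios are applied to inventories of dispersible material and reduced to an airborne and then respirable fraction, is commonly written as a simple product of factors. The sketch below uses the generic five-factor form; both the form and the numbers are illustrative assumptions, not values taken from the report.

    ```python
    def respirable_source_term(mar_g, damage_ratio, airborne_release_fraction,
                               respirable_fraction, leak_path_factor=1.0):
        """Source term ST = MAR * DR * ARF * RF * LPF.

        mar_g : mass of dispersible plutonium at risk in the enclosures (grams)
        The remaining factors are dimensionless fractions: the fraction of
        enclosures damaged, the fraction of material made airborne, the
        respirable fraction, and the fraction escaping to the atmosphere.
        """
        return (mar_g * damage_ratio * airborne_release_fraction *
                respirable_fraction * leak_path_factor)

    # Hypothetical case: 1 kg at risk, 10% of enclosures breached, 1e-3 of the
    # contents made airborne, half of that in the respirable size range.
    print(respirable_source_term(1000.0, 0.10, 1e-3, 0.5), "g respirable release")
    ```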

  20. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of Mλ ≥ 7.0 and Mλ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be Mσ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best-fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with Mλ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with Mλ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 Mλ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 Mλ ≥ 7.0 in each catalog and
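    The interevent-count statistic and the Weibull shape exponent β described above can be reproduced with a few lines of code. The sketch below builds a synthetic random Gutenberg-Richter catalog and checks that β comes out near 1; the catalog parameters and the use of scipy's Weibull fit are illustrative choices, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    def natural_time_counts(mags, times, m_small, m_large):
        """Number of m_small <= M < m_large shocks between successive
        M >= m_large earthquakes, in chronological order."""
        mags = np.asarray(mags)[np.argsort(times)]
        counts, n = [], 0
        for m in mags:
            if m >= m_large:
                counts.append(n)
                n = 0
            elif m >= m_small:
                n += 1
        return np.array(counts[1:])           # drop the partial first interval

    def weibull_beta(counts):
        """Shape exponent of the best-fitting Weibull; beta = 1 is random."""
        beta, _, _ = weibull_min.fit(counts[counts > 0], floc=0)
        return beta

    # Synthetic random catalog with Gutenberg-Richter magnitudes (b = 1, Mc = 5.1):
    # counts between M >= 7.0 events should give beta close to 1.
    rng = np.random.default_rng(1)
    mags = 5.1 + rng.exponential(1.0 / np.log(10), size=50_000)
    times = np.sort(rng.uniform(0.0, 4745.0, size=mags.size))   # ~13 yr in days
    print(round(weibull_beta(natural_time_counts(mags, times, 5.1, 7.0)), 2))
    ```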

  1. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed

  2. How controlled release technology can aid gene delivery.

    Science.gov (United States)

    Jo, Jun-Ichiro; Tabata, Yasuhiko

    2015-01-01

    Many types of gene delivery systems have been developed to enhance the level of gene expression. Controlled release technology is a feasible gene delivery approach which extends the duration of gene expression by maintaining and releasing genes at the injection site in a controlled manner. This technology can reduce the adverse effects of bolus-dose administration and avoid repeated administration. Biodegradable biomaterials are useful as materials for controlled release-based gene delivery, and various biodegradable biomaterials have been developed. Controlled release-based gene delivery plays a critical role in conventional gene therapy and in genetic engineering. In gene therapy, the therapeutic gene is released from biodegradable biomaterial matrices around the tissue to be treated. On the other hand, intracellular controlled release of genes from sub-micro-sized matrices is required for genetic engineering. Genetic engineering is feasible for cell transplantation as well as for research on stem cell biology and medicine. DNA hydrogels containing a therapeutic gene sequence, and exosomes carrying individual-specific nucleic acids, may become candidates for controlled release carriers. Technologies to deliver genes to cell aggregates will play an important role in the promotion of regenerative research and therapy.

  3. High-frequency source radiation during the 2011 Tohoku-Oki earthquake, Japan, inferred from KiK-net strong-motion seismograms

    Science.gov (United States)

    Kumagai, Hiroyuki; Pulido, Nelson; Fukuyama, Eiichi; Aoi, Shin

    2013-01-01

    To investigate source processes of the 2011 Tohoku-Oki earthquake, we utilized a source location method using high-frequency (5-10 Hz) seismic amplitudes. In this method, we assumed far-field isotropic radiation of S waves, and conducted a spatial grid search to find the best-fitting source locations along the subducted slab in each successive time window. Our application of the method to the Tohoku-Oki earthquake resulted in artifact source locations at shallow depths near the trench caused by limited station coverage and noise effects. We then assumed various source node distributions along the plate, and found that the observed seismograms were most reasonably explained when assuming deep source nodes. This result suggests that the high-frequency seismic waves were radiated at deeper depths during the earthquake, a feature which is consistent with results obtained from teleseismic back-projection and strong-motion source model studies. We identified three high-frequency subevents, and compared them with the moment-rate function estimated from low-frequency seismograms. Our comparison indicated that no significant moment release occurred during the first high-frequency subevent and the largest moment-release pulse occurred almost simultaneously with the second high-frequency subevent. We speculated that the initial slow rupture propagated bilaterally from the hypocenter toward the land and trench. The landward subshear rupture propagation consisted of three successive high-frequency subevents. The trenchward propagation ruptured the strong asperity and released the largest moment near the trench.
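
    As a toy illustration of the grid-search idea (not the authors' implementation), the sketch below assumes isotropic radiation with simple 1/r geometric decay, generates synthetic amplitudes for a set of assumed station locations, and scans a 2-D grid of candidate nodes for the one whose predicted relative amplitudes best match the observations.

```python
import numpy as np

rng = np.random.default_rng(1)
stations = rng.uniform(0.0, 100.0, size=(8, 2))     # assumed station coordinates (km)
true_source = np.array([60.0, 35.0])                # "unknown" source to be recovered

def predicted_amplitudes(src, sta):
    r = np.maximum(np.linalg.norm(sta - src, axis=1), 1e-3)
    return 1.0 / r                                   # isotropic radiation with 1/r decay

observed = predicted_amplitudes(true_source, stations) * (1.0 + 0.05 * rng.standard_normal(8))

# grid search over candidate source nodes, comparing amplitude patterns only
xs = ys = np.linspace(0.0, 100.0, 201)
best_node, best_misfit = None, np.inf
for x in xs:
    for y in ys:
        pred = predicted_amplitudes(np.array([x, y]), stations)
        # normalise both vectors to remove the unknown source strength
        misfit = np.sum((observed / observed.sum() - pred / pred.sum()) ** 2)
        if misfit < best_misfit:
            best_node, best_misfit = (x, y), misfit

print(f"best-fitting node: ({best_node[0]:.1f}, {best_node[1]:.1f}), true source: (60.0, 35.0)")
```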

  4. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes: not all plants, however, have been explicitly designed to the same level of earthquake induced forces. Understanding the nature of strong ground motion near the source of the earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPP's near-field strong motion must be explicitly considered in design

  5. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  6. Intermediate-term earthquake prediction and seismic zoning in Northern Italy

    International Nuclear Information System (INIS)

    Panza, G.F.; Orozova Stanishkova, I.; Costa, G.; Vaccari, F.

    1993-12-01

    The algorithm CN for intermediate-term earthquake prediction has been applied to an area in Northern Italy, which has been chosen according to a recently proposed seismotectonic model. Earthquakes with magnitude ≥ 5.4 occur in the area with significant frequency and their occurrence is predicted by the algorithm CN. Therefore a seismic hazard analysis has been performed using a deterministic procedure, based on the computation of complete synthetic seismograms. The results are summarized in a map giving the distribution of peak ground acceleration, but the complete time series are also available, and these can be used by civil engineers in the design of new seismo-resistant constructions and in the retrofitting of existing ones. This risk reduction action should be intensified in connection with warnings issued on the basis of the forward predictions made by CN. (author). Refs, 7 figs, 1 tab

  7. Cloud-based systems for monitoring earthquakes and other environmental quantities

    Science.gov (United States)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

    There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust and meteorological parameters. These advantages include robustness and dynamic scalability, and also reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the S. California region that send data directly to the Google App Engine where they are analyzed. The system is capable of handling many other types of sensor data and generating a situation-awareness analysis as a product. Other advantages of the cloud-based system are integration with other peer networks, and being able to deploy anywhere in the world without having to build additional computing infrastructure.

  8. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, as natural disasters, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training about earthquakes received in primary schools is considered…

  9. Application of a linked stress release model in Corinth Gulf and Central Ionian Islands (Greece)

    Science.gov (United States)

    Mangira, Ourania; Vasiliadis, Georgios; Papadimitriou, Eleftheria

    2017-06-01

    Spatio-temporal stress changes and interactions between adjacent fault segments constitute one of the most important components in seismic hazard assessment, as they can alter the occurrence probability of strong earthquakes on these segments. The investigation of the interactions between adjacent areas by means of the linked stress release model is attempted for moderate earthquakes (M ≥ 5.2) in the Corinth Gulf and the Central Ionian Islands (Greece). The study areas were divided into two subareas, based on seismotectonic criteria. The seismicity of each subarea is investigated by means of a stochastic point process, and its behavior is determined by the conditional intensity function, which usually takes an exponential form. A conditional intensity function of Weibull form is used to identify the most appropriate among the models (simple, independent and linked stress release model) for the interpretation of the earthquake generation process. The appropriateness of the models was decided after evaluation via the Akaike information criterion. Although the curves of the conditional intensity functions exhibit similar behavior, the exponential-type conditional intensity function seems to fit the data better.
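
    The model-selection step can be illustrated with a minimal renewal-style example (not the authors' stress release formulation): fit exponential and Weibull densities to a set of interevent times by maximum likelihood and compare them with the Akaike information criterion. All data below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
interevent = rng.weibull(1.4, size=300) * 5.0        # synthetic interevent times (years)

def aic(log_likelihood, n_params):
    return 2 * n_params - 2 * log_likelihood

# exponential model (one parameter: the rate; its MLE scale is the sample mean)
ll_exp = np.sum(stats.expon.logpdf(interevent, scale=interevent.mean()))

# Weibull model (two parameters: shape and scale, location fixed at zero)
shape, _, scale = stats.weibull_min.fit(interevent, floc=0)
ll_wei = np.sum(stats.weibull_min.logpdf(interevent, shape, loc=0, scale=scale))

print(f"AIC, exponential form: {aic(ll_exp, 1):.1f}")
print(f"AIC, Weibull form:     {aic(ll_wei, 2):.1f}   (lower AIC is preferred)")
```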

  10. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
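
    A short script captures the same exercise the spreadsheets are meant to convey (the recurrence intervals, aperiodicity and elapsed time below are illustrative stand-ins, not the authors' data): the estimated 50-year probability changes substantially depending on whether the full record or the recent cluster sets the mean recurrence interval, and on whether a time-independent (Poisson) or time-dependent (renewal) model is assumed.

```python
import numpy as np
from scipy import stats

t_elapsed = 319.0     # years since the last full-margin event (January 1700)
horizon = 50.0        # forecast window (years)
sigma = 0.5           # assumed aperiodicity of the lognormal renewal model

for label, mean_recurrence in [("full 10,000-yr record", 530.0), ("recent cluster", 330.0)]:
    # time-independent (Poisson) estimate
    p_poisson = 1.0 - np.exp(-horizon / mean_recurrence)

    # time-dependent (lognormal renewal) estimate, conditional on the quiet interval so far
    mu = np.log(mean_recurrence) - 0.5 * sigma**2          # so the model mean equals mean_recurrence
    surv = stats.lognorm.sf([t_elapsed, t_elapsed + horizon], s=sigma, scale=np.exp(mu))
    p_renewal = 1.0 - surv[1] / surv[0]

    print(f"{label:>21}:  Poisson {p_poisson:.0%},  renewal {p_renewal:.0%}")
```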

  11. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  12. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high resolution SEM mesh model is developed for the whole Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  13. Economic consequences of earthquakes: bridging research and practice with HayWired

    Science.gov (United States)

    Wein, A. M.; Kroll, C.

    2016-12-01

    The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.

  14. Seismogenic Structure Beneath Décollement Inferred from 2009/11/5 ML 6.2 Mingjian Earthquake in Central Taiwan

    Directory of Open Access Journals (Sweden)

    Che-Min Lin

    2014-01-01

    Full Text Available One decade after the 1999 Chi-Chi earthquake, central Taiwan experienced more strong ground shaking [Central Weather Bureau (CWB) intensity VII] induced by an ML 6.2 earthquake on 5th November 2009. This earthquake occurred in the Mingjian Township of Nantou County, only 12 km southwest of the Chi-Chi earthquake epicenter. The broadband microearthquake monitoring network operated by the National Center for Research on Earthquake Engineering (NCREE) observed numerous aftershocks in the five days following the mainshock. The relocated aftershocks and the mainshock focal mechanism indicated a NE-SW striking fault dipping 60° toward the northwest. This fault plane is inside the pre-Miocene basement and the rupture extends from the lower crust to 10 km depth, just beneath the basal décollement of the thin-skinned model that is generally used to explain the regional tectonics in Taiwan. The fault plane is vertically symmetrical with the Chelungpu fault about the basal décollement. The NW-SE compressive stress of plate collision in Taiwan, as well as the deep tectonic background, resulted in the seismogenic structure of the Mingjian earthquake at this location.

  15. Rupture processes of the 2013-2014 Minab earthquake sequence, Iran

    Science.gov (United States)

    Kintner, Jonas A.; Ammon, Charles J.; Cleveland, K. Michael; Herman, Matthew

    2018-06-01

    We constrain epicentroid locations, magnitudes and depths of moderate-magnitude earthquakes in the 2013-2014 Minab sequence using surface-wave cross-correlations, surface-wave spectra and teleseismic body-wave modelling. We estimate precise relative locations of 54 Mw ≥ 3.8 earthquakes using 48 409 teleseismic, intermediate-period Rayleigh and Love-wave cross-correlation measurements. To reduce significant regional biases in our relative locations, we shift the relative locations to align the Mw 6.2 main-shock centroid to a location derived from an independent InSAR fault model. Our relocations suggest that the events lie along a roughly east-west trend that is consistent with the faulting geometry in the GCMT catalogue. The results support previous studies that suggest the sequence consists of left-lateral strain release, but better defines the main-shock fault length and shows that most of the Mw ≥ 5.0 aftershocks occurred on one or two similarly oriented structures. We also show that aftershock activity migrated westwards along strike, away from the main shock, suggesting that Coulomb stress transfer played a role in the fault failure. We estimate the magnitudes of the relocated events using surface-wave cross-correlation amplitudes and find good agreement with the GCMT moment magnitudes for the larger events and underestimation of small-event size by catalogue MS. In addition to clarifying details of the Minab sequence, the results demonstrate that even in tectonically complex regions, relative relocation using teleseismic surface waves greatly improves the precision of relative earthquake epicentroid locations and can facilitate detailed tectonic analyses of remote earthquake sequences.
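
    The basic measurement behind this kind of relative relocation is the cross-correlation lag between similar surface-wave records of two events at a common station. The toy example below (synthetic ~30 s Rayleigh-like wavelets, not the authors' processing chain) recovers an imposed differential traveltime from the peak of the cross-correlation.

```python
import numpy as np

dt = 0.5                                   # sample interval (s)
t = np.arange(0.0, 600.0, dt)
wavelet = np.exp(-((t - 300.0) / 40.0) ** 2) * np.sin(2.0 * np.pi * t / 30.0)   # ~30 s period

true_shift = 7.5                           # imposed differential traveltime (s)
shifted = np.interp(t - true_shift, t, wavelet)   # second event's record: a delayed copy

corr = np.correlate(shifted, wavelet, mode="full")
lags = (np.arange(corr.size) - (t.size - 1)) * dt   # lag axis in seconds
measured = lags[np.argmax(corr)]
print(f"measured lag = {measured:.1f} s (imposed shift = {true_shift} s)")
```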

  16. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    Science.gov (United States)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record by a wide range of processes that depend on both environment and earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes, and short-lived radioisotopes offer invaluable tools to identify sediments they remobilized. In the 2011 Mw9 Japan earthquake they document the spatial extent of remobilized sediment from water depths of 626 m in the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m), but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both sediment failures and tectonics. Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

  17. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    Directory of Open Access Journals (Sweden)

    W. F. Peng

    2012-03-01

    Full Text Available The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0–2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions deviating from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugate region. However, the probability of anomalies appearing, and their extent, in the magnetically conjugate region are smaller than near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.

  18. ShakeCast: Automating and improving the use of ShakeMap for post-earthquake decision-making and response

    Science.gov (United States)

    Wald, D.; Lin, K.-W.; Porter, K.; Turner, Loren

    2008-01-01

    When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, on its own can be useful for emergency response, loss estimation, and public information. However, to take full advantage of the potential of ShakeMap, we introduce ShakeCast. ShakeCast facilitates the complicated assessment of potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory to provide a simple, hierarchical list and maps of structures or facilities most likely impacted. ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for both public and private emergency managers and responders. © 2008, Earthquake Engineering Research Institute.
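
    The core comparison ShakeCast performs can be caricatured in a few lines. The sketch below is purely conceptual (hypothetical facility names, intensity values and damage-state thresholds, not the ShakeCast code or its data model): each facility's interpolated shaking value is checked against its own thresholds, and the inventory is returned as a ranked list.

```python
from dataclasses import dataclass

@dataclass
class Facility:
    name: str
    mmi: float          # ShakeMap-style intensity interpolated at the site (hypothetical)
    thresholds: dict    # hypothetical damage-state thresholds in MMI units

facilities = [
    Facility("Pump station A", 7.8, {"yellow": 6.0, "orange": 7.0, "red": 8.0}),
    Facility("Substation B",   6.4, {"yellow": 6.5, "orange": 7.5, "red": 8.5}),
    Facility("Bridge C",       8.2, {"yellow": 5.5, "orange": 6.5, "red": 7.5}),
]

def alert_state(fac):
    state = "green"
    for level in ("yellow", "orange", "red"):    # ordered from least to most severe
        if fac.mmi >= fac.thresholds[level]:
            state = level                        # keep the highest threshold exceeded
    return state

for fac in sorted(facilities, key=lambda f: f.mmi, reverse=True):
    print(f"{fac.name:15s}  MMI {fac.mmi:.1f}  ->  {alert_state(fac)}")
```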

  19. The Global Earthquake Model and Disaster Risk Reduction

    Science.gov (United States)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  20. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  1. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    Energy Technology Data Exchange (ETDEWEB)

    na

    2001-02-08

    preceded by foreshocks. The monitoring area of the SGBDSN has been in a long period of very low moment release rate since February of 1999. The seismicity catalog to date suggests that the next significant (M > 4) earthquake within the SGBDSN will be preceded by foreshocks.

  2. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    International Nuclear Information System (INIS)

    2001-01-01

    monitoring area of the SGBDSN has been in a long period of very low moment release rate since February of 1999. The seismicity catalog to date suggests that the next significant (M > 4) earthquake within the SGBDSN will be preceded by foreshocks

  3. Rapid estimation of the economic consequences of global earthquakes

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid 2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensities. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimation of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is
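
    The loss-estimation step described above boils down to combining shaking exposure with vulnerability. The fragment below is a deliberately simplified sketch (the exposure counts and fatality rates are invented for illustration, not PAGER's calibrated, country-specific models): expected fatalities are the sum over intensity bins of the exposed population times an intensity-dependent fatality rate.

```python
# hypothetical population exposed at each shaking-intensity (MMI) level
exposure = {6: 2_000_000, 7: 800_000, 8: 150_000, 9: 20_000}
# hypothetical fatality rates per exposed person at each intensity level
fatality_rate = {6: 1e-6, 7: 1e-5, 8: 1e-4, 9: 1e-3}

expected_fatalities = sum(exposure[mmi] * fatality_rate[mmi] for mmi in exposure)
print(f"Expected fatalities: {expected_fatalities:.0f}")
```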

  4. Contribution of Satellite Gravimetry to Understanding Seismic Source Processes of the 2011 Tohoku-Oki Earthquake

    Science.gov (United States)

    Han, Shin-Chan; Sauber, Jeanne; Riva, Riccardo

    2011-01-01

    The 2011 great Tohoku-Oki earthquake, apart from shaking the ground, perturbed the motions of satellites orbiting some hundreds of kilometers above the ground, such as GRACE, owing to the coseismic change in the gravity field. Significant changes in inter-satellite distance were observed after the earthquake. These unconventional satellite measurements were inverted to examine the earthquake source processes from a radically different perspective that complements the analyses of seismic and geodetic ground recordings. We found the average slip to be located up-dip of the hypocenter but within the lower crust, as characterized by a limited range of bulk and shear moduli. The GRACE data constrained a group of earthquake source parameters that yield increasing dip (7-16° ± 2°) and, simultaneously, decreasing moment magnitude (9.17-9.02 ± 0.04) with increasing source depth (15-24 kilometers). The GRACE solution includes the cumulative moment released over a month and provides a unique view of the long-wavelength gravimetric response to all mass redistribution processes associated with the dynamic rupture and short-term postseismic mechanisms, improving our understanding of the physics of megathrusts.

  5. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f type, problems can be constrained from scaling relations and finite extents.; Record-breaking hazard map of southern California, 2012-08-06. "Warm" colors indicate local acceleration (elevated hazard
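
    Two of the scaling ingredients referred to above can be written down compactly. The snippet below is only an illustration with made-up parameter values (not the authors' calibration): a Gutenberg-Richter rate of events above a threshold magnitude, and a modified-Omori aftershock rate whose productivity scales with the parent magnitude m as 10^(α·m).

```python
import numpy as np

def gr_annual_rate(m, a=5.0, b=1.0):
    """Annual rate of events with magnitude >= m (illustrative a- and b-values)."""
    return 10 ** (a - b * m)

def omori_rate(t_days, m_parent, alpha=1.0, k0=1e-4, c=0.05, p=1.1):
    """Modified-Omori aftershock rate (events/day) t days after a magnitude m_parent event."""
    return k0 * 10 ** (alpha * m_parent) / (c + t_days) ** p

print(f"GR: annual rate of M >= 7 events = {gr_annual_rate(7.0):.3f}")
for t in (1, 10, 100):
    print(f"Omori (M7 parent): rate at t = {t:>3d} d is {omori_rate(t, 7.0):.2f} events/day")
```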

  6. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
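
    As a small numerical illustration of the block-maxima side of this analysis (synthetic data, not the Chinese catalogue used by the authors), the sketch below fits the Gumbel distribution, which the abstract associates with magnitudes after the energy-magnitude transformation, to a series of annual maximum magnitudes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# synthetic annual maxima: the largest of ~1000 Gutenberg-Richter magnitudes per year
annual_max = np.array([
    (4.0 + rng.exponential(1.0 / np.log(10), size=1000)).max() for _ in range(60)
])

loc, scale = stats.gumbel_r.fit(annual_max)
m = 7.5
print(f"Gumbel fit: location = {loc:.2f}, scale = {scale:.2f}")
print(f"estimated P(annual maximum magnitude > {m}) = {stats.gumbel_r.sf(m, loc, scale):.3f}")
```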

  7. Tissue-engineered matrices as functional delivery systems: adsorption and release of bioactive proteins from degradable composite scaffolds.

    Science.gov (United States)

    Cushnie, Emily K; Khan, Yusuf M; Laurencin, Cato T

    2010-08-01

    A tissue-engineered bone graft should imitate the ideal autograft in both form and function. However, biomaterials that have appropriate chemical and mechanical properties for grafting applications often lack biological components that may enhance regeneration. The concept of adding proteins such as growth factors to scaffolds has therefore emerged as a possible solution to improve overall graft design. In this study, we investigated this concept by loading porous hydroxyapatite-poly(lactide-co-glycolide) (HA-PLAGA) scaffolds with a model protein, cytochrome c, and then studying its release in a phosphate-buffered saline solution. The HA-PLAGA scaffold has previously been shown to be bioactive, osteoconductive, and to have appropriate physical properties for tissue engineering applications. The loading experiments demonstrated that the HA-PLAGA scaffold could also function effectively as a substrate for protein adsorption and release. Scaffold protein adsorptive loading (as opposed to physical entrapment within the matrix) was directly related to the scaffold's HA content. The HA phase of the scaffold facilitated protein retention in the matrix following incubation in aqueous buffer for periods up to 8 weeks. Longer protein retention times may improve the protein's effective activity by increasing the probability of protein-cell interactions. The ability to control protein loading and delivery simply via the composition of the HA-PLAGA scaffold offers the potential of forming robust functionalized bone grafts. © 2010 Wiley Periodicals, Inc.

  8. Engineering risk assessment for hydro facilities

    International Nuclear Information System (INIS)

    Laurence, K.G.

    1991-01-01

    Faced with escalating property insurance premiums, the Alaska Energy Authority decided to evaluate what losses may realistically be expected due to catastrophic events at their hydroelectric generation and transmission facilities. Ideally insurance rates are established using historic loss statistics. Where these statistics are non-existent, other means must be employed to estimate expected losses so that appropriate steps may be taken to protect investments in facilities. The natural perils of earthquake, flood, tidal wave (tsunami), wind, snow and internal failure potentially can cause catastrophic damage, but due to their infrequency in the higher magnitudes, meaningful statistics are as yet insufficient to be of value in estimating losses from these events. In order to overcome this deficiency a quasi-engineering approach can be adopted as distinct from the actuarial approach preferred and most often used by the insurance industry. This paper describes the quasi-engineering approach used for this assessment with a specific example worked through for earthquake peril

  9. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity is of a special nature. In the design of nuclear power plants, therefore, safety has been regarded as paramount, and in Japan, where earthquakes occur frequently, countermeasures to earthquakes are naturally incorporated in the examination of safety. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and earthquake disasters tend to be remarkably large. In nuclear plants, the prevention of damage to the facilities and the maintenance of their functions are required at the time of earthquakes. In siting nuclear plants, the earthquake history, the possible magnitude of earthquakes, the properties of the ground and the position of the plant should be examined. After the place of installation has been decided, the earthquake used for design is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads and the allowable stresses are explained. (Kako, I.)

  10. Time-history simulation of civil architecture earthquake disaster relief- based on the three-dimensional dynamic finite element method

    Directory of Open Access Journals (Sweden)

    Liu Bing

    2014-10-01

    Full Text Available Earthquake action is the main external factor influencing the long-term safe operation of civil construction, especially of high-rise buildings. Applying the time-history method to simulate the earthquake response of the foundation and surrounding rock of civil constructions is an effective approach to the study of the earthquake resistance of civil buildings. Therefore, this paper develops a three-dimensional dynamic finite element numerical simulation system for civil building earthquake disasters. The system adopts the explicit central difference method. The strengthening characteristics of materials under high strain rates and the damage characteristics of the surrounding rock under cyclic loading are considered. On this basis, a dynamic constitutive model of the rock mass suitable for the aseismic analysis of civil buildings is put forward. Finally, through a time-history simulation of the earthquake response of the Shenzhen Children's Palace, the reliability and practicability of the system are verified in the analysis of practical engineering problems.
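
    The explicit central difference update mentioned above can be illustrated on the simplest possible case. The sketch below (a damped single-degree-of-freedom oscillator under an assumed ground acceleration, not the paper's 3-D finite element system) advances the standard central-difference recurrence and reports the peak relative displacement.

```python
import numpy as np

m, c, k = 1.0, 0.4, 400.0        # assumed mass (kg), damping (N·s/m), stiffness (N/m)
dt, n = 0.001, 5000              # time step (s), well below the stability limit, and step count
t = np.arange(n) * dt
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.5 * t)   # toy ground acceleration (m/s^2)

u = np.zeros(n)                  # relative displacement history
# start-up value u(-dt) from the initial conditions u0 = v0 = 0
a0 = (-m * ag[0] - c * 0.0 - k * 0.0) / m
u_prev = 0.0 - dt * 0.0 + 0.5 * dt**2 * a0

k_eff = m / dt**2 + c / (2.0 * dt)           # effective stiffness of the explicit scheme
for i in range(1, n):
    p_eff = (-m * ag[i - 1]
             - (k - 2.0 * m / dt**2) * u[i - 1]
             - (m / dt**2 - c / (2.0 * dt)) * u_prev)
    u_prev, u[i] = u[i - 1], p_eff / k_eff   # central-difference update

print(f"peak relative displacement: {np.abs(u).max():.4f} m")
```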

  11. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction
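
    The test can be mimicked with synthetic data (this is only a sketch of the procedure, not the Southern California catalogue analysis): compute the first-return times of events above a threshold magnitude in the original ordering, shuffle the magnitude sequence over the same occurrence times, and compare the two return-time distributions. For the artificially clustered sequence below the two differ; for a genuinely order-invariant process they would not.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50_000
times = np.cumsum(rng.exponential(1.0, size=n))            # event occurrence times (days)
mags = 2.0 + rng.exponential(1.0 / np.log(10), size=n)     # Gutenberg-Richter magnitudes (b = 1)
mags[: n // 2].sort()    # artificially impose an ordering in the first half to create clustering

def first_return_times(times, mags, m_thresh=4.0):
    return np.diff(times[mags >= m_thresh])

original = first_return_times(times, mags)
shuffled = first_return_times(times, rng.permutation(mags))

for label, rt in (("original ordering", original), ("shuffled ordering", shuffled)):
    cv = rt.std() / rt.mean()
    print(f"{label}: mean return time {rt.mean():.1f} d, coefficient of variation {cv:.2f}")
```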

  12. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    Science.gov (United States)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  13. Systems engineering and integration as a foundation for mission engineering

    OpenAIRE

    Beam, David F.

    2015-01-01

    Approved for public release; distribution is unlimited. This paper investigates the emerging term mission engineering through the framework of systems engineering and systems integration. Systems engineering concepts, processes, and methodologies are extrapolated for use in conjunction with a systems integration, life-cycle based framework to effect mission engineering. The specific systems engineering concepts of measures of effectiveness, performance and suitability are recommended as fou...

  14. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    Science.gov (United States)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), 1934 Bihar-Nepal earthquake (Mw 8.2), 1950 Assam earthquake (Mw 8.4), 2005 Kashmir (Mw 7.6), and 2015 Gorkha (Mw 7.8), is testimony to the ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return periods. Some of the large magnitude earthquakes produced surface rupture, while some remained blind. Furthermore, due to the incompleteness of the earthquake catalogue, only a few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, it is certainly difficult to precisely determine the extent of surface rupture of these earthquakes, and also of those events that occurred during historic times. In this paper, we have compiled the paleoseismological data, recalibrated the radiocarbon ages from the trenches excavated by previous workers along the entire Himalaya, and compared the present earthquake scenario with the past. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches with an average rupture length of 300 km, limiting Mw to 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. Notably, large magnitude Himalayan earthquakes such as 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam occurred within a time frame of 45 years. If such events were dated with an uncertainty of ±50 years, there is a high possibility that they would be regarded as the remnants of one giant earthquake rupturing the entire Himalayan arc, leading to an overestimation of the seismic hazard scenario in the Himalaya.

  15. Soft computing analysis of the possible correlation between temporal and energy release patterns in seismic activity

    Science.gov (United States)

    Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin

    2010-05-01

    This paper is a preliminary investigation of the possible correlation of temporal and energy release patterns of seismic activity involving the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post- seismic events to the main earthquake following on research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique along with energy release equations dependent on Richter's scale [8,9] allow for an estimate to be drawn regarding the amount of the energy being released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time-intervals among consecutive sizeable earthquakes [1,15]. Anticipated successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable to dynamically approximate the time-interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals References [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: ‘Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [2] Eneva M. and
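
    The energy bookkeeping behind this approach can be written down directly. The fragment below is a simple illustration (the small catalogue and the M ≥ 6 threshold are invented for the example): magnitudes are converted to radiated energy with the standard Gutenberg-Richter relation log10 E = 1.5 M + 4.8 (E in joules), and the release accumulated between consecutive sizeable events is reported along with the intervening time interval.

```python
# (time in days, magnitude) for a small illustrative sequence
catalog = [(0, 4.1), (3, 5.0), (9, 4.4), (15, 6.2), (20, 4.8), (31, 5.3), (44, 6.4)]

def energy_joules(m):
    # standard Gutenberg-Richter energy-magnitude relation
    return 10 ** (1.5 * m + 4.8)

cum_e, last_t = 0.0, None
for t, m in catalog:
    cum_e += energy_joules(m)
    if m >= 6.0:   # a "sizeable" event closes the current accumulation window
        interval = "first sizeable event" if last_t is None else f"{t - last_t} days since previous M>=6"
        print(f"t = {t:3d} d  M{m:.1f}  cumulative energy {cum_e:.2e} J  ({interval})")
        cum_e, last_t = 0.0, t
```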

  16. MODEL OF TECTONIC EARTHQUAKE PREPARATION AND OCCURRENCE AND ITS PRECURSORS IN CONDITIONS OF CRUSTAL STRETCHING

    Directory of Open Access Journals (Sweden)

    R. M. Semenov

    2018-01-01

    Full Text Available In connection with changes in the stress-strain state of the Earth's crust, various physical and mechanical processes, including destruction, take place in the rocks and are accompanied by tectonic earthquakes. Different models have been proposed to describe earthquake preparation and occurrence, depending on the mechanisms and the rates of geodynamic processes. One of the models considers crustal stretching, which is characteristic of the formation of rift structures. The model uses data on rock samples that are stretched until destruction in a special laboratory installation. Based on the laboratory modeling, it is established that the samples are destroyed in stages, which are interpreted as stages of preparation and occurrence of an earthquake source. The preparation stage of underground tremors is generally manifested by a variety of temporal (long-, medium- and short-term) precursors. The main shortcoming of micro-modeling is that, considering the small sizes of the investigated samples, it is impossible to reveal a link between the plastic extension of rocks (taking place in the earthquake hypocenter) and the rock rupture. Plasticity is the ability of certain rocks to change shape and size irreversibly, while the rock continuity is maintained, in response to applied external forces. In order to take into account the effect of plastic deformation of rocks on earthquake preparation and occurrence, we propose not to refer to the diagrams showing stretching of the rock samples, but to use a typical diagram of metal stretching, which can be obtained when testing a metal rod for breakage (Fig. 1). The diagram of metal stretching as a function of the relative elongation (to some degree of approximation, and taking into account the coefficient of plasticity) can be considered as a model of preparation and occurrence of an earthquake source in the case of rifting. The energy released in the period immediately preceding the earthquake contributes to the emergence of

  17. Failure analysis of pebble bed reactors during earthquake by discrete element method

    International Nuclear Information System (INIS)

    Keppler, Istvan

    2013-01-01

    Highlights: ► We evaluated the load acting on the central reflector beam of a pebble bed reactor. ► The load acting on the reflector beam highly depends on fuel element distribution. ► The contact force values do not show high dependence on fuel element distribution. ► Earthquake increases the load of the reflector, not the contact forces. -- Abstract: Pebble bed reactors (PBR) are graphite-moderated, gas-cooled nuclear reactors. PBR reactors use a large number of spherical fuel elements called pebbles. From mechanical point of view, the arrangement of “small” spherical fuel elements in a container poses the same problem, as the so-called silo problem in powder technology and agricultural engineering. To get more exact information about the contact forces arising between the fuel elements in static and dynamic case, we simulated the static case and the effects of an earthquake on a model reactor by using discrete element method. We determined the maximal contact forces acting between the individual fuel elements. We found that the value of the maximal bending moment in the central reflector beam has a high deviation from the average value even in static case, and it can significantly increase in case of an earthquake. Our results can help the engineers working on the design of such types of reactors to get information about the contact forces, to determine the dust production and the crush probability of fuel elements within the reactor, and to model different accident scenarios

  18. Failure analysis of pebble bed reactors during earthquake by discrete element method

    Energy Technology Data Exchange (ETDEWEB)

    Keppler, Istvan, E-mail: keppler.istvan@gek.szie.hu [Department of Mechanics and Engineering Design, Szent István University, Páter K.u.1., Gödöllő H-2103 (Hungary)

    2013-05-15

    Highlights: ► We evaluated the load acting on the central reflector beam of a pebble bed reactor. ► The load acting on the reflector beam highly depends on fuel element distribution. ► The contact force values do not show high dependence on fuel element distribution. ► Earthquake increases the load of the reflector, not the contact forces. -- Abstract: Pebble bed reactors (PBR) are graphite-moderated, gas-cooled nuclear reactors. PBR reactors use a large number of spherical fuel elements called pebbles. From mechanical point of view, the arrangement of “small” spherical fuel elements in a container poses the same problem, as the so-called silo problem in powder technology and agricultural engineering. To get more exact information about the contact forces arising between the fuel elements in static and dynamic case, we simulated the static case and the effects of an earthquake on a model reactor by using discrete element method. We determined the maximal contact forces acting between the individual fuel elements. We found that the value of the maximal bending moment in the central reflector beam has a high deviation from the average value even in static case, and it can significantly increase in case of an earthquake. Our results can help the engineers working on the design of such types of reactors to get information about the contact forces, to determine the dust production and the crush probability of fuel elements within the reactor, and to model different accident scenarios.

  19. Stress triggering of the Lushan M7. 0 earthquake by the Wenchuan Ms8. 0 earthquake

    Directory of Open Access Journals (Sweden)

    Wu Jianchao

    2013-08-01

    Full Text Available The Wenchuan Ms8.0 earthquake and the Lushan M7.0 earthquake occurred in the north and south segments of the Longmenshan nappe tectonic belt, respectively. Based on the focal mechanism and finite fault model of the Wenchuan Ms8.0 earthquake, we calculated the Coulomb failure stress change. The inverted Coulomb stress changes based on both the Nishimura and Chenji models show that the Lushan M7.0 earthquake occurred in the area of increased Coulomb failure stress induced by the Wenchuan Ms8.0 earthquake. The Coulomb failure stress increased by approximately 0.135–0.152 bar at the source of the Lushan M7.0 earthquake, which is far more than the stress triggering threshold. Therefore, the Lushan M7.0 earthquake was most likely triggered by the Coulomb failure stress change.
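
    For orientation, the quantity being mapped is commonly evaluated as ΔCFS = Δτ + μ′·Δσn, with Δτ the shear stress change resolved in the slip direction and Δσn the normal stress change (positive for unclamping). The snippet below is a back-of-the-envelope illustration with hypothetical resolved stress values, not the paper's finite-fault calculation.

```python
def coulomb_stress_change(d_tau_bar, d_sigma_n_bar, mu_eff=0.4):
    """Coulomb failure stress change (bar); d_sigma_n is positive for unclamping."""
    return d_tau_bar + mu_eff * d_sigma_n_bar

# hypothetical resolved stress changes at the receiver fault (not the paper's values)
d_cfs = coulomb_stress_change(d_tau_bar=0.10, d_sigma_n_bar=0.09)
threshold = 0.1   # bar, a commonly cited triggering threshold
verdict = "above" if d_cfs >= threshold else "below"
print(f"dCFS = {d_cfs:.3f} bar ({verdict} the {threshold} bar triggering threshold)")
```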

  20. Space Geodetic Observations and Modeling of 2016 Mw 5.9 Menyuan Earthquake: Implications on Seismogenic Tectonic Motion

    Directory of Open Access Journals (Sweden)

    Yongsheng Li

    2016-06-01

    Full Text Available Determining the relationship between crustal movement and faulting in thrust belts is essential for understanding the growth of geological structures and addressing the proposed models of a potential earthquake hazard. A Mw 5.9 earthquake occurred on 21 January 2016 in Menyuan, NE Qinghai-Tibetan plateau. We combined satellite interferometry from Sentinel-1A Terrain Observation with Progressive Scans (TOPS) images, historical earthquake records, aftershock relocations and geological data to determine the seismogenic fault's structural geometry and its relationship with the Lenglongling faults. The results indicate that the reverse slip of the 2016 earthquake is distributed on a southwest-dipping, shovel-shaped fault segment. The main shock rupture initiated at the deeper part of the fault plane. The focal mechanism of the 2016 earthquake is quite different from that of a previous Ms 6.5 earthquake which occurred in 1986. Both earthquakes occurred at the two ends of a secondary fault. Joint analysis of the 1986 and 2016 earthquakes and the aftershock distribution of the 2016 event reveals a close connection with the tectonic deformation of the Lenglongling faults. Both earthquakes resulted from the left-lateral strike-slip of the Lenglongling fault zone and show distinct focal mechanism characteristics. Under this shearing, a normal component forms at the releasing bend at the western end of the secondary fault, owing to the left-stepping alignment of the fault zone, while a thrust component forms at the restraining bend at the eastern end, owing to its right-stepping alignment. Seismic activity in this region suggests that the left-lateral strike-slip of the Lenglongling fault zone plays a significant role in adjusting the tectonic deformation of the NE Tibetan plateau.

  1. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" earthquakes. We try to incorporate these results into a choreographic model with the aim of converting earthquake sound into a visual dance system, which could provide a transmedia representation of the earthquake process. In particular, we focus on a possible method for translating and transferring the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and the perception of movement (the neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  2. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    Science.gov (United States)

    Dong, Zhi-hui; Yang, Zhi-gang; Chen, Tian-wu; Chu, Zhi-gang; Deng, Wen; Shao, Heng

    2011-01-01

    PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the differences between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic crush-trauma victims of the Sichuan earthquake with a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries of the thoracic cage, the pulmonary parenchyma and the pleura on multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, Risk Ratio (RR) = 2.2; p …). … chest (45/143 vs. 11/66 patients, RR = 1.9; p …). Traumas resulting from the earthquake were life threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures, pulmonary parenchymal and pleural injuries. PMID:21789386
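
    The reported RR = 2.2 for rib fractures follows directly from the cohort counts given above (143 of 215 exposed vs. 66 of 215 unexposed patients). A minimal check of that arithmetic:

```python
def risk_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk ratio (relative risk) between an exposed and an unexposed cohort."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Rib fractures: 143 of 215 earthquake patients vs. 66 of 215 control patients
print(round(risk_ratio(143, 215, 66, 215), 1))   # 2.2, matching the reported RR
```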

  3. Reflections from the interface between seismological research and earthquake risk reduction

    Science.gov (United States)

    Sargeant, S.

    2012-04-01

    Scientific understanding of earthquakes and their attendant hazards is vital for the development of effective earthquake risk reduction strategies. Within the global disaster reduction policy framework (the Hyogo Framework for Action, overseen by the UN International Strategy for Disaster Reduction), the anticipated role of science and scientists is clear with respect to risk assessment, loss estimation, space-based observation, early warning and forecasting. The importance of information sharing and cooperation, cross-disciplinary networks and developing technical and institutional capacity for effective disaster management is also highlighted. In practice, the degree to which seismological information is successfully delivered to and applied by individuals, groups or organisations working to manage or reduce the risk from earthquakes is variable. The challenge for scientists is to provide fit-for-purpose information that can be integrated simply into decision-making and risk reduction activities at all levels of governance and at different geographic scales, often by a non-technical audience (i.e. people without any seismological or earthquake engineering training). The interface between seismological research and earthquake risk reduction (defined here in terms of both the relationship between the science and its application, and between the scientist and other risk stakeholders) is complex. This complexity is a function of a range of issues relating to communication, multidisciplinary working, politics, organisational practices, inter-organisational collaboration, working practices, sectoral cultures, individual and organisational values, worldviews and expectations. These factors can present significant obstacles to scientific information being incorporated into the decision-making process. The purpose of this paper is to present some personal reflections on the nature of the interface between the worlds of seismological research and risk reduction, and the

  4. Consideration for standard earthquake vibration (1). The Niigataken Chuetsu-oki Earthquake in 2007

    International Nuclear Information System (INIS)

    Ishibashi, Katsuhiko

    2007-01-01

    An outline of the new guideline for the earthquake-resistant design standard of nuclear power plants and of the standard earthquake vibration is given. The points on which the new guideline improves are discussed on the basis of the Kashiwazaki-Kariwa Nuclear Power Plant incidents, and the fundamental limits of the new guideline are pointed out. The positioning of the earthquake-resistant design standard for nuclear power plants, JEAG4601 of the Japan Electric Association, the new guideline, the standard earthquake vibration of the new guideline, the Niigataken Chuetsu-oki Earthquake in 2007, and the damage to the Kashiwazaki-Kariwa Nuclear Power Plant are discussed. The safety criteria of the safety review system, its organization, standards and guidelines should be improved on the basis of this earthquake and the nuclear plant accident. The general principle that a nuclear power plant should not be constructed in an area where a large earthquake is expected has to be put into practice. A precondition for all nuclear power plants should be that they cause no damage whatsoever. (S.Y.)

  5. Tweeting Earthquakes using TensorFlow

    Science.gov (United States)

    Casarotti, E.; Comunello, F.; Magnoni, F.

    2016-12-01

    The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). It currently keeps more than 150,000 followers updated. Nevertheless, since it publishes only manually revised seismic parameters, its timing (approximately 10 to 20 minutes after an event) has come under evaluation. Undeniably, the mobile internet, social network sites and Twitter in particular call for a more rapid, "real-time" reaction. During the last 36 months, INGV has tested tweeting the automatic detections of M3+ earthquakes, studying the reliability of the information both in terms of seismological accuracy and from the point of view of communication and social research. A set of quality parameters (i.e. number of seismic stations, gap, relative error of the location) has been identified to reduce false alarms and the uncertainty of the automatic detection. We present an experiment to further improve the reliability of this process using TensorFlow™ (an open source software library originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization).
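
    A minimal sketch of how such quality parameters might feed a learned filter is given below. The synthetic data, thresholds and network architecture are invented for illustration and are not the INGV implementation; only the three input features are taken from the abstract.

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in data: each automatic detection is described by the three
# quality parameters named in the abstract (station count, azimuthal gap,
# relative location error). Labels and thresholds below are invented for
# illustration; the real training data and criteria belong to the INGV team.
rng = np.random.default_rng(0)
n = 2000
stations = rng.integers(4, 60, n).astype("float32")
gap = rng.uniform(30.0, 330.0, n).astype("float32")
rel_err = rng.uniform(0.0, 1.0, n).astype("float32")
X = np.column_stack([stations, gap, rel_err])
y = ((stations > 15) & (gap < 200) & (rel_err < 0.5)).astype("float32")

norm = tf.keras.layers.Normalization()
norm.adapt(X)
model = tf.keras.Sequential([
    norm,                                            # feature scaling
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(detection is reliable)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Score one new automatic detection before deciding whether to tweet it:
print(model.predict(np.array([[30.0, 120.0, 0.2]], dtype="float32"))[0, 0])
```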

  6. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  7. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of megacities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and is undertaken with insufficient knowledge of the peculiarities of the regional seismicity and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event that occurred in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake, with a higher probability of occurrence, that could affect the city with intensity less than or equal to VIII.

  8. Analysis of the Earthquake-Resistant Design Approach for Buildings in Mexico

    Directory of Open Access Journals (Sweden)

    Carrillo Julián

    2014-01-01

    Full Text Available The development of new codes for earthquake-resistant structures has made it possible to guarantee better performance of buildings when they are subjected to seismic actions. It is therefore desirable that current codes for building design become conceptually transparent when defining the strength modification factors and assessing maximum lateral displacements, so that the design process can be clearly understood by structural engineers. The aim of this study is to analyze the transparency of the earthquake-resistant design approach for buildings in Mexico by means of a critical review of the factors for strength modification and displacement amplification. The approach of building design codes in the US is also analyzed. It is concluded that earthquake-resistant design in Mexico has evolved in refinement and complexity. It is also demonstrated that the procedure prescribed by such design codes allows the assessment of design strengths and displacements in a more rational way, in accordance not only with the present state of knowledge but also with contemporary tendencies in building codes. In contrast, the procedures used in US codes may not provide a clear view of the seismic response assessment of buildings.

  9. Tremors behind the power outlet - where earthquakes appear on our monthly bill

    Science.gov (United States)

    Baisch, Stefan

    2013-04-01

    The world's appetite for energy has increased significantly over the last decades, not least due to the rapid growth of Asian economies. In parallel, the Fukushima shock raised widespread concerns about nuclear power generation and an increasing desire for clean energy technologies. To resolve the conflict between higher demand, limited resources and a growing level of green consciousness, both the up-scaling of conventional and the development of renewable energy technologies are required. This is where the phenomenon of man-made earthquakes appears on the radar screen. Several of our energy production technologies have the potential to cause small, moderate, or sometimes even larger magnitude earthquakes. There is a general awareness that coal mining activities can produce moderate-sized earthquakes. Similarly, long-term production from hydrocarbon reservoirs can lead to subsurface deformations accompanied by even larger magnitude earthquakes. Even the "renewables" are not necessarily earthquake-free. Several of the largest man-made earthquakes have been caused by water impoundment for hydropower plants. On a much smaller scale, micro-earthquakes can occur in enhanced geothermal systems (EGS). Although still in its infancy, the EGS technology has an enormous potential to supply base-load electricity, and its technical feasibility for large-scale application is currently being investigated in about a dozen pilot projects. The principal concept of heat extraction by circulating water through a subsurface reservoir is fairly simple; the technical implementation of EGS, however, presents several challenges, not all of which have yet been solved. As the hydraulic conductivity at depth is usually extremely low at EGS sites, technical stimulation of hydraulic pathways is required to create an artificial heat exchanger. By injecting fluid under high pressure into the subsurface, tectonic stress on existing fractures can be released and the associated shearing of the fractures

  10. Safety Aspects of Sustainable Storage Dams and Earthquake Safety of Existing Dams

    Directory of Open Access Journals (Sweden)

    Martin Wieland

    2016-09-01

    Full Text Available The basic element in any sustainable dam project is safety, which includes the following safety elements: ① structural safety, ② dam safety monitoring, ③ operational safety and maintenance, and ④ emergency planning. Long-term safety primarily includes the analysis of all hazards affecting the project; that is, hazards from the natural environment, hazards from the man-made environment, and project-specific and site-specific hazards. The special features of the seismic safety of dams are discussed. Large dams were the first structures to be systematically designed against earthquakes, starting in the 1930s. However, the seismic safety of older dams is unknown, as most were designed using seismic design criteria and methods of dynamic analysis that are considered obsolete today. Therefore, we need to reevaluate the seismic safety of existing dams based on current state-of-the-art practices and rehabilitate deficient dams. For large dams, a site-specific seismic hazard analysis is usually recommended. Today, large dams and the safety-relevant elements used for controlling the reservoir after a strong earthquake must be able to withstand the ground motions of a safety evaluation earthquake. The ground motion parameters can be determined either by a probabilistic or a deterministic seismic hazard analysis. During strong earthquakes, inelastic deformations may occur in a dam; therefore, the seismic analysis has to be carried out in the time domain. Furthermore, earthquakes create multiple seismic hazards for dams such as ground shaking, fault movements, mass movements, and others. The ground motions needed by the dam engineer are not real earthquake ground motions but models of the ground motion, which allow the safe design of dams. It must also be kept in mind that dam safety evaluations must be carried out several times during the long life of large storage dams. These features are discussed in this paper.

  11. Seismic fragility analyses of nuclear power plant structures based on the recorded earthquake data in Korea

    International Nuclear Information System (INIS)

    Joe, Yang Hee; Cho, Sung Gook

    2003-01-01

    This paper briefly introduces an improved method for evaluating the seismic fragilities of components of nuclear power plants in Korea. The engineering characteristics of the small-magnitude earthquake spectra recorded on the Korean peninsula during the last several years are also discussed. For the purpose of evaluating the effects of the recorded earthquakes on the seismic fragilities of Korean nuclear power plant structures, several comparative case studies have been performed. The results show that seismic fragility analysis in Korea based on Newmark's spectra might over-estimate the seismic capacities of Korean facilities. (author)
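
    Seismic fragility evaluations of this kind are commonly expressed as a lognormal fragility curve, P(failure | a) = Φ(ln(a/Am)/β). The sketch below evaluates such a curve for an assumed median capacity Am and composite logarithmic standard deviation β; the values are placeholders, not results from the study.

```python
from math import erf, log, sqrt

def fragility(pga_g, median_capacity_g, beta):
    """Lognormal fragility curve: P(failure | peak ground acceleration).

    median_capacity_g and beta are illustrative placeholders, not values
    taken from the cited study.
    """
    z = log(pga_g / median_capacity_g) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))      # standard normal CDF

# A component with an assumed 0.8 g median capacity and beta = 0.4:
for a in (0.2, 0.4, 0.8):
    print(a, round(fragility(a, 0.8, 0.4), 3))   # 0.0, 0.042, 0.5
```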

  12. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    Science.gov (United States)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change detection technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, demonstrates the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise for automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.
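
    A sketch of two of the object properties named above, computed for toy polygon outlines, is given below. The shapes and the specific metric definitions (area over bounding-box area, area over convex-hull area) are assumptions for illustration, not the authors' exact discriminant functions.

```python
import numpy as np
from scipy.spatial import ConvexHull

def shoelace_area(poly):
    """Area of a simple polygon given as an (N, 2) array of vertices."""
    x, y = poly[:, 0], poly[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def shape_metrics(poly):
    """Two object properties: rectangularity and convexity (both 1.0 for a box)."""
    area = shoelace_area(poly)
    bbox_area = np.ptp(poly[:, 0]) * np.ptp(poly[:, 1])
    hull_area = ConvexHull(poly).volume    # in 2-D, .volume is the hull area
    return {"rectangularity": area / bbox_area, "convexity": area / hull_area}

# A compact building footprint vs. an irregular liquefaction patch (toy shapes)
building = np.array([[0, 0], [10, 0], [10, 6], [0, 6]], float)
patch = np.array([[0, 0], [8, 1], [5, 3], [9, 6], [2, 5], [1, 8]], float)
print(shape_metrics(building))   # both metrics 1.0
print(shape_metrics(patch))      # clearly lower values
```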

  13. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not yet been fully understood. Instead, much former investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. Countless valuable research outcomes have been obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

  14. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the view of the authors, the lack of consistency and the errors in other earthquake loss databases frequently cited and used in analyses were a major shortcoming that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars), compared with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes, shows the increasing concern for economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  15. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    In the view of the authors, the lack of consistency and the errors in other earthquake loss databases frequently cited and used in analyses were a major shortcoming that needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto earthquake ($214 billion USD damage; 2011 HNDECI-adjusted dollars), compared with the 2011 Tohoku (>$300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes, shows the increasing concern for economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  16. Comparison of aftershock sequences between 1975 Haicheng earthquake and 1976 Tangshan earthquake

    Science.gov (United States)

    Liu, B.

    2017-12-01

    The 1975 ML 7.3 Haicheng earthquake and the 1976 ML 7.8 Tangshan earthquake occurred in the same tectonic unit. There are significant differences in the spatial-temporal distribution, the number of aftershocks and the time duration of the aftershock sequences that followed these two main shocks. Aftershocks can be triggered by the change in regional seismicity brought about by the main shock, which is caused by the Coulomb stress perturbation. Based on the rate- and state-dependent friction law, we quantitatively estimated the possible aftershock time durations using a combination of seismicity data, and compared the results from different approaches. The results indicate that the aftershock time duration of the Tangshan main shock is several times that of the Haicheng main shock. This can be explained by the strong relationship between aftershock time duration and the earthquake nucleation history, the normal stress and the shear stress loading rate on the fault. In fact, the obvious difference in nucleation history between these two main shocks lies in the foreshocks: the 1975 Haicheng earthquake had clear and long-lasting foreshocks, while the 1976 Tangshan earthquake did not. Abundant foreshocks may indicate a long and active nucleation process that changed (weakened) the rocks in the source region, which would favour a shorter aftershock sequence because stress in weak rocks decays faster.
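
    In the rate- and state-dependent friction framework, the characteristic aftershock duration scales as t_a = Aσ/τ̇ (Dieterich, 1994), which is why the normal stress and the shear stressing rate on the fault control how long a sequence lasts. A minimal sketch with illustrative parameter values, not those inverted for Haicheng or Tangshan:

```python
def aftershock_duration_years(A, sigma_mpa, stressing_rate_mpa_per_yr):
    """Characteristic aftershock duration t_a = A * sigma / tau_dot
    from rate- and state-dependent friction (Dieterich, 1994).

    A, sigma and the stressing rate below are illustrative values only.
    """
    return A * sigma_mpa / stressing_rate_mpa_per_yr

# Same fault-normal stress, two assumed background shear stressing rates:
print(aftershock_duration_years(A=0.01, sigma_mpa=100.0,
                                stressing_rate_mpa_per_yr=0.05))  # 20 yr
print(aftershock_duration_years(A=0.01, sigma_mpa=100.0,
                                stressing_rate_mpa_per_yr=0.2))   # 5 yr
```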

  17. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    Full Text Available A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data from nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle-latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency, which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes and does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intense. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere operate for "deep" and "crust" events.
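
    The stated proportionality between the critical frequencies and the square root of the ionization density is the standard plasma-frequency relation, f_p [Hz] ≈ 8.98 √(N_e [m⁻³]). A small sketch, using an illustrative (not observed) 10 % pre-earthquake drop in foF2:

```python
def peak_electron_density(fo_mhz):
    """Peak electron density (m^-3) from a critical frequency (MHz).

    Uses the standard plasma-frequency relation f_p [Hz] ~= 8.98 * sqrt(N_e [m^-3]),
    which is why foF2 and fbEs scale with the square root of the ionization density.
    """
    f_hz = fo_mhz * 1.0e6
    return (f_hz / 8.98) ** 2

# A ~10 % drop in foF2 (illustrative) implies a ~19 % drop in peak density:
n1, n2 = peak_electron_density(6.0), peak_electron_density(5.4)
print(f"{n1:.2e}  {n2:.2e}  {(1 - n2 / n1):.0%}")
```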

  18. Large LOCA-earthquake combination probability assessment - Load combination program. Project 1 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, S; Streit, R D; Chou, C K

    1980-01-01

    This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)

  19. Large LOCA-earthquake combination probability assessment - Load combination program. Project 1 summary report

    International Nuclear Information System (INIS)

    Lu, S.; Streit, R.D.; Chou, C.K.

    1980-01-01

    This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)
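
    A toy version of the simulation strategy described above, Paris-law fatigue crack growth driven from a sampled initial crack-size distribution, is sketched below. The Paris constants, stress range, cycle count and critical depth are illustrative assumptions, not the plant-specific inputs of the report.

```python
import numpy as np

rng = np.random.default_rng(1)

def grow_cracks(a0, cycles, d_stress_mpa, C=1.0e-11, m=3.0):
    """Paris-law fatigue crack growth, da/dN = C*(dK)^m with dK = dS*sqrt(pi*a).

    A toy stand-in for the report's fracture-mechanics model; C, m, the stress
    range, the cycle count and the critical size below are illustrative only.
    """
    a = a0.copy()
    for _ in range(cycles):
        dk = d_stress_mpa * np.sqrt(np.pi * a)   # stress intensity range, MPa*sqrt(m)
        a += C * dk ** m                          # crack growth per cycle (m)
    return a

a0 = rng.lognormal(mean=np.log(2e-3), sigma=0.5, size=5_000)   # initial depths (m)
a_final = grow_cracks(a0, cycles=5_000, d_stress_mpa=150.0)
critical = 0.02                                                 # assumed 20 mm
print("fraction exceeding critical size:", np.mean(a_final > critical))
```

    With such benign assumptions the sampled fraction is essentially zero, in line with the report's conclusion that the break probability is extremely small; resolving probabilities near 10⁻¹² would require variance-reduction techniques rather than direct sampling of this kind.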

  20. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living